AN ADVANCED STUDENT MONITORING SYSTEM
FOR PHYSICAL CLASSROOMS

Project Scope

Literature Survey


Samridhi Dev and Tushar Patnaik have proposed a face recognition-based attendance system that is more accurate and efficient. The system uses Haar classifiers together with KNN, CNN, and SVM algorithms to achieve high accuracy in face recognition, and has been tested under various conditions. It is also non-invasive, easy to use, and requires minimal installation [1]. It is important to note that this system is designed purely for attendance purposes and is not intended for classroom monitoring or assessing attentiveness levels.

Weiqing Wang and Kunliang Xu's research focused on classroom emotion detection. They designed an algorithm that uses MTCNN for face detection and image segmentation, creating a database for visual emotion classification. MTCNN outperformed VGG16 and ResNet18, reaching an accuracy of 93.53% in recognizing students' facial emotions [2]. It is worth noting that this system is not designed for use in physical classrooms. It also lacks the ability to track multiple students simultaneously and cannot detect behaviors, setting it apart from our proposed system.

Recent research highlights machine learning's effectiveness in measuring student engagement through methods like keystroke analysis for programming activity and facial expression and body language analysis for presentations. These techniques, including eye-tracking and EEG, prove applicable in both face-to-face and online learning settings[3]. It's important to note that this system is specifically designed for online environments and is not intended for physical classrooms. Unlike our proposed system, it cannot detect multiple students simultaneously.

Jialu Wang et al. explored video super-resolution through generative adversarial networks (GANs) and edge enhancement in 2021 [4]. Their model, applied to low-resolution and blurred videos such as closed-circuit television footage, achieved realistic and detailed results through adversarial loss. An edge enhancement function built on a Laplacian edge module further improved the outcomes, and the inclusion of perceptual loss enhanced the visual quality. The approach was effective across datasets, with notable benefits on the Vid4 dataset and other low-resolution videos. In our proposed research we use a more advanced GAN variant known as ESRGAN [10] for better enhancement, so that students can be detected from the video.

References

  1. S. Dev and T. Patnaik, "Student Attendance System using Face Recognition," in 2020 International Conference on Smart Electronics and Communication (ICOSEC), India, 2020.
  2. W. Wang, K. Xu, H. Niu and X. Miao, "Emotion Recognition of Students Based on Facial Expressions in Online Education Based on the Perspective of Computer Simulation," Complexity, vol. 2020, p. 9, 11 September 2020.
  3. K. Fahd, S. J. Miah and K. Ahmed, "Predicting student performance in a blended learning environment using learning management system interaction data," Applied Computing and Informatics, Vols. ahead-of-print, no. ahead-of-print, 2021.
  4. J. Wang, G. Teng and P. An, "Video Super-Resolution Based on Generative Adversarial Network and Edge Enhancement," Electronics, vol. 10, no. 4, February 2021.

Research Gap

The following are the research gaps identified in most recent studies.

Classroom-Centric Focus

Our system is tailored for the dynamic environment of physical classrooms, designed to address the specific challenges encountered in this context.

Resolution Enhancement Capability

It has the capability to enhance the quality of low-resolution video footage, a feature especially advantageous in classroom settings.

Multi-Camera Array Utilization

Going beyond conventional methods, our system can work with multiple camera arrays to not only record attendance but also gauge students' attentiveness levels, enhancing the depth of insights it can provide.

Research Problem & Solution

Proposed Problem

Our biggest challenge

In traditional classroom settings, taking attendance through manual methods such as pen-and-paper registers has become outdated and inefficient. The process is time-consuming and can easily be manipulated by students, leading to inaccurate attendance records. Beyond attendance tracking, lecturers often find it difficult to assess whether students are genuinely engaged in their lectures. This is particularly challenging in physical classrooms, where lecturers cannot realistically pay attention to each individual student's emotions and behaviors.



Proposed Solution

The system proposed in this research project addresses these challenges by developing an intelligent classroom system that integrates multiple components: multi-facial recognition, emotion and head position detection, student attentiveness measurement, and video enhancement. This system automates attendance, analyzes student behavior, and provides educators with valuable insights to enhance teaching practices and identify students who may require additional support.

Research Objectives

Create a component for attendance marking via multi-student face recognition and time tracking in classroom settings.

The attendance marking component aims to automate the attendance process in classrooms using multi-student face recognition. It not only identifies students but also tracks the time they spend in class and tracks eye gaze to measure focus levels. This facilitates efficient attendance management, reduces manual work, and minimizes the chances of inaccurate or fraudulent recording.
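As a rough illustration of the time-tracking idea above, the sketch below derives each student's time-in-class from periodic face-recognition sightings. The function name, sampling interval, and presence threshold are all assumptions for illustration, not the project's actual values.

```python
from collections import defaultdict

SAMPLE_INTERVAL_S = 5        # assumed gap between recognition passes
MIN_PRESENCE_RATIO = 0.8     # assumed threshold to mark a student present

def summarize_attendance(detections, lecture_length_s):
    """detections: iterable of (student_id, timestamp_s) tuples."""
    seen = defaultdict(set)
    for student_id, ts in detections:
        # Bucket each sighting into a sampling slot so duplicate
        # detections within the same pass are not double-counted.
        seen[student_id].add(int(ts // SAMPLE_INTERVAL_S))
    total_slots = lecture_length_s // SAMPLE_INTERVAL_S
    summary = {}
    for student_id, slots in seen.items():
        time_in_class = len(slots) * SAMPLE_INTERVAL_S
        summary[student_id] = {
            "time_in_class_s": time_in_class,
            "present": len(slots) / total_slots >= MIN_PRESENCE_RATIO,
        }
    return summary
```

Bucketing sightings into fixed slots keeps the duration estimate robust even when the recognizer fires several times per pass.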


Develop emotion and head position identification for multiple students.

This component focuses on recognizing the emotions and head positions of multiple students in a classroom setting. By employing deep learning techniques, it enhances the system's ability to gauge student reactions, thus providing insights into their engagement levels. It contributes to a more comprehensive understanding of students' classroom experiences.
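To make the head-position idea concrete, a minimal sketch of mapping estimated head-pose angles (in degrees) to a coarse label follows. The angle thresholds and label names are illustrative assumptions, not values from the project.

```python
YAW_LIMIT = 30.0    # assumed: looking left/right beyond this counts as away
PITCH_LIMIT = 20.0  # assumed: looking up/down beyond this counts as away

def head_pose_label(yaw, pitch):
    """Classify a head pose given yaw/pitch angles in degrees."""
    if abs(yaw) <= YAW_LIMIT and abs(pitch) <= PITCH_LIMIT:
        return "facing_front"
    if pitch < -PITCH_LIMIT:
        return "looking_down"
    return "looking_away"
```

In practice the yaw/pitch inputs would come from a pose-estimation model; the coarse labels are what the attentiveness stage consumes.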


Establish a system to capture student attentiveness through emotions, head positions, and student profiles.

The student attentiveness measurement component integrates the data collected from emotion and head position identification with student profiles. This approach evaluates student engagement and attentiveness during lectures. It enables educators to tailor their teaching methods according to individual needs, enhancing the overall learning experience.
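The fusion described above can be sketched as a weighted combination of an emotion label and a head-pose label into a single attentiveness score. The weight values and label sets below are illustrative assumptions only.

```python
# Assumed per-label weights; a real system would tune these per profile.
EMOTION_WEIGHTS = {"happy": 1.0, "neutral": 0.8, "surprise": 0.7,
                   "sad": 0.4, "angry": 0.3, "fear": 0.3, "disgust": 0.2}
POSE_WEIGHTS = {"facing_front": 1.0, "looking_down": 0.5, "looking_away": 0.3}

def attentiveness(emotion, pose, w_emotion=0.4, w_pose=0.6):
    """Blend emotion and head-pose cues into a 0..1 attentiveness score."""
    score = (w_emotion * EMOTION_WEIGHTS.get(emotion, 0.5)
             + w_pose * POSE_WEIGHTS.get(pose, 0.5))
    return round(score, 2)
```

Weighting pose more heavily than emotion reflects the intuition that orientation toward the lecturer is a stronger engagement cue than facial expression alone; that trade-off is an assumption here.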


Enhance low-resolution classroom videos, with optional seamless integration with the other components.

This component addresses the challenge of working with low-resolution classroom videos. It employs techniques to upscale and improve the quality of video frames and is designed for seamless integration with the other system components. By enhancing video quality, this module aims to capture students who are hard to detect in normal footage, typically those seated in the back rows of a classroom.
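The enhancement step's interface can be sketched as a function that takes a frame and returns an upscaled one. The project uses an ESRGAN model for this; since that needs trained weights, a nearest-neighbour resize stands in below purely so the interface can be shown, and is not the actual enhancement method.

```python
import numpy as np

def enhance_frame(frame: np.ndarray, scale: int = 4) -> np.ndarray:
    """Upscale an H x W x C frame by an integer factor.

    Stand-in for the ESRGAN upscaler: nearest-neighbour repetition
    preserves the shape contract (H*scale, W*scale, C) without weights.
    """
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)
```

Downstream components only depend on receiving a larger frame, so the real ESRGAN model can be swapped in behind the same signature.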


Methodology

Figure 2. High Level Architecture of the system.

The proposed student monitoring system consists of four main components. They are:

  1. Enhance low resolution footage
  2. Student Attendance Marking with Face Recognition and Eye Gaze Tracker
  3. Emotion and Head Pose capture
  4. Monitoring Student's Concentration levels

Fig. 2 illustrates the overall system diagram of the proposed solution, which is intended to provide a smart approach for stakeholders, researchers, and institute administrators to analyze students in a physical classroom environment. As shown in the diagram, the processing flow of the proposed system follows a logical sequence designed to comprehensively monitor student engagement in a classroom setting. The low-resolution video footage of the classroom is first enhanced to improve frame quality, ensuring clearer analysis. The system then runs the multi-student face recognition and eye gaze tracking component, which identifies and records the presence of each student and measures focus levels from eye gaze. Concurrently, the system employs emotion and head position identification techniques to analyze students' facial expressions and head orientations. This data is then used by the student attentiveness measurement component, which assesses individual engagement levels based on emotions, head positions, and student profiles. Finally, the integrated data, combining attendance information and engagement metrics, can be accessed through a user interface or exported to a CSV file so that educators can make informed decisions about their teaching methods and students' needs. This holistic approach to classroom monitoring offers valuable insights into student performance and attendance while facilitating an improved teaching and learning environment.
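The final export step of the flow above can be sketched with Python's standard csv module: per-student attendance and engagement metrics are written to a file educators can open in a spreadsheet. The field names are illustrative assumptions.

```python
import csv

REPORT_FIELDS = ["student_id", "present", "time_in_class_s", "avg_attentiveness"]

def export_report(rows, path):
    """Write combined attendance/engagement rows (dicts) to a CSV file."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=REPORT_FIELDS)
        writer.writeheader()
        writer.writerows(rows)
```

Using DictWriter keeps the column order fixed regardless of dict ordering, so reports stay consistent across lectures.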

Technologies Used

Python

Flask

TensorFlow

Keras

DeepFace

OpenCV

NumPy

PyTorch

PyCharm

Chart.js

Bootstrap

Nvidia CUDA

Milestones

Timeline in Brief

  • February 2023

    Project Topic Assessment

    Initial topic assessment document of the proposed research was submitted for evaluation. The submitted document included a brief overview on key aspects of the proposed research along with the research problem, research objectives, overall solution as well as member task breakdown.

    Marks Allocated : No marks allocated

    0%
  • February 2023

    Project Charter

    Once the topic was finalized, the project charter document was submitted for evaluation. This document also included a brief overview on key aspects of the proposed research along with the research problem, research objectives, overall solution as well as member task breakdown of the proposed research.

    Marks Allocated : No marks allocated

    0%
  • March 2023

    Project Proposal Presentation

    Presented to a panel of judges in order to provide an overview of the proposed research.

    Marks Allocated : 6

    6%
  • March 2023

    Project Proposal Report

The submission of a report which provides an in-depth analysis of key aspects of the proposed research, including the research problem, objectives, and the overall proposed solution.

    Marks Allocated : 6

    6%
  • May 2023

    Progress Presentation I

Progress Presentation I reviews the 50% completion status of the project. This reveals any gaps or inconsistencies in the design/requirements.

    Marks Allocated : 15

    15%
  • May 2023

    Status Document I

    The submission of a document that provides an overview of key tasks conducted by members during the implementation PPI phase of the research.

    Marks Allocated : 1

    1%
  • June 2023

    Research Paper

Describes our contribution to existing knowledge, giving due recognition to all work referred to in creating that new knowledge.

    Marks Allocated : 10

    10%
  • September 2023

    Progress Presentation II

Progress Presentation II reviews the 90% completion status of the project through a demonstration, along with a poster presentation that describes the project as a whole.

    Marks Allocated : 18

    18%
  • September 2023

    Status Document II

    The submission of a document that provides an overview of key tasks conducted by members during the implementation PPII phase of the research.

    Marks Allocated : 1

    1%
  • November 2023

    Final Report

The Final Report evaluates the completed project carried out throughout the year. The marks below include the individual (15%) and group (4%) report marks as well as the final report itself.

    Marks Allocated : 19

    19%
  • November 2023

    Logbook

The status of the project is validated through the Logbook. This also includes Status Documents I & II.

    Marks Allocated : 1

    1%
  • November 2023

    Final Presentation & Viva

Final evaluation of the completed product. The viva is held individually to assess each member's contribution to the project.

    Marks Allocated : 20

    20%
  • November 2023

    Website Assessment

The website helps to promote our research project and presents all details related to the project.

    Marks Allocated : 2

    2%
Downloads

Documents

Please find all documents related to this project below.

Topic Assessment

Submitted on 2023/02/13

Project Charter

Submitted on 2023/03/14

Project Proposal

Submitted on 2023/03/29

Status Documents I

Submitted on 2023/05/23

Status Documents II

Submitted on 2023/09/05

Research Paper

Submitted on 2023/09/15

Final Report

Submitted on 2023/10/13

Poster

Submitted on 2023/10/13

Presentations

Please find all presentations related to this project below.

Project Proposal

Submitted on 2023/03/29

Progress Presentation I

Submitted on 2023/05/23

Progress Presentation II

Submitted on 2023/09/05

Final Presentation

Submitted on 2023/11/02

About Us

Meet Our Team!

Mr. Jeewaka Perera

Supervisor

Sri Lanka Institute of Information Technology

Department

Faculty of Computing | Computer Science & Software Engineering

Ms. Pradeepa Bandara

Co-Supervisor

Sri Lanka Institute of Information Technology

Department

Faculty of Computing | Information Technology

Rathnayaka R.K.A.R.

Group Leader

Undergraduate

Sri Lanka Institute of Information Technology

Department

Information Technology

Mallawarachchi S.M.A.

Group Member

Undergraduate

Sri Lanka Institute of Information Technology

Department

Information Technology

Wijesiriwardana H.G.N.D.

Group Member

Undergraduate

Sri Lanka Institute of Information Technology

Department

Information Technology

Perera S.S.A.

Group Member

Undergraduate

Sri Lanka Institute of Information Technology

Department

Information Technology

Contact Us

Get in Touch

Contact Details

For further queries please reach us at eduwatch227@gmail.com

We hope this project helped you in some way. Thank you!

-Team EduWatch