2. Materials and Methods
This study adopted an exploratory observational design to monitor the emotions of secondary school students in six class groups over four weeks.
The secondary school is located in Cambrils, a coastal town in northeastern Spain. It has three class groups in each of the four years of secondary education and two class groups in each of the two bachelor’s degree years. The average number of students per class is 32; the number of students who participated in the experiment is shown in
Table 1. Classes are evenly mixed by gender.
The experiment took place during the first term of the school year. Most students had already studied the subject technology in previous years, but it was a new subject for the first-year students in secondary education, as was the optional subject, the Green project. Six class groups were recorded during the experiment. The first two groups consisted of 16 students from the first year of secondary school, aged 12–13, who were enrolled in technology and the Green project. The third group consisted of 24 students from the fourth year of secondary school, aged 15–16, who were enrolled in robotics. The fourth and fifth groups consisted of 24 students from the first year of the bachelor’s degree, aged 16–17; those in the fourth group were studying technology, while those in the fifth were studying robotics. The sixth group comprised 12 students from the second year of the bachelor’s degree, aged 17–18, who were enrolled in technology. Every student group had a female teacher assigned to them.
Figure 1 illustrates the arrangement of the students in the classroom, and
Figure 2 shows some students attending class.
2.1. Materials
We used an HP ProBook 640 G2 notebook with an Intel Core i5-6200U CPU @ 2.3 GHz and 8 GB of RAM, equipped with a webcam.
The laptop was placed in an elevated position at the front of the class, with the webcam directed towards the students in order to capture their faces.
The experiment yielded 47 videos. The dataset was annotated using a semi-automated procedure combining manual and automatic annotations.
2.2. Experiment’s Procedure
In previous work, we monitored the emotions of two students attending classes in two different subjects for several hours. Building on that work, in this experiment we used an improved configuration and a scenario with more students. The aim was to develop and apply code capable of detecting faces and performing emotion recognition (ER), then transferring the collected data into a database for further analysis. On this basis, we explored initial links between students’ emotions, subject, time of day, and academic performance [
40].
Python was used as the programming environment in which we developed the code for acquiring and processing images (face detection, identification, and ER). Beyond the language itself, Python offers many libraries that simplify complex tasks with just a few lines of code. To facilitate programming, a code editor was needed to create and execute the code; in this case, we used Visual Studio Code. Furthermore, Py-Feat (the Python Facial Expression Analysis Toolbox), available on GitHub, was used to promptly process, analyse, and visualise the facial expression data.
In each class, the students were recorded using the laptop’s camera; 50 min to 1 h of video was recorded and saved as an .mp4 file for each class. The video included as many students as fell within the webcam’s field of view. The camera kept both the front and the back of the room clearly in focus, which allowed us to obtain the full effect of using such an imaging tool in a classroom. The videos were then uploaded and stored in Google Drive. After that, code was applied to split the videos into consecutive frames, one every 10 s, which were saved as .png images for further data analysis.
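The frame-sampling step described above can be sketched in a few lines of Python. The function name, frame rate, and class duration below are illustrative assumptions, not the exact code used in the study.

```python
def frames_to_keep(fps: float, duration_s: float, interval_s: float = 10.0) -> list[int]:
    """Indices of the video frames to save when keeping one frame every interval_s seconds."""
    total_frames = int(fps * duration_s)  # frames in the whole recording
    step = int(fps * interval_s)          # frames between consecutive saved images
    return list(range(0, total_frames, step))

# A 50 min class recorded at an assumed 25 fps, sampled every 10 s,
# yields 300 images, consistent with the 300-360 images per class reported for this study.
print(len(frames_to_keep(fps=25, duration_s=50 * 60)))  # 300
```

In practice, such indices would drive a video reader (for example, OpenCV’s `VideoCapture`) to write the selected frames out as .png files.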
Code capable of detecting and identifying faces and analysing facial expressions was developed, with Py-Feat chosen as the tool for obtaining the emotions of the students attending class. The results were written to a .csv file containing all the emotions.
Before discerning the emotions, the code outputs action units (AUs) for every detected face; AUs are a quantitative method for describing facial movements. We extracted the AUs of each facial region from the videos, and we added intensity estimation to the code to obtain a value for each emotion (continuous values from 0 to 1). This processing of AUs was performed independently for each face.
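Once each face has a continuous score in [0, 1] per emotion, labelling the face reduces to taking the highest-scoring emotion. The snippet below is a minimal sketch of that final step, assuming the seven labels used in this study (six basic emotions plus neutral); the function name and example scores are ours, not the study’s code.

```python
EMOTIONS = ("anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral")

def dominant_emotion(scores: dict[str, float]) -> str:
    """Label a detected face with its highest-intensity emotion."""
    return max(EMOTIONS, key=lambda e: scores.get(e, 0.0))

# Hypothetical intensity scores for one face in one frame:
face = {"anger": 0.05, "fear": 0.10, "happiness": 0.72, "surprise": 0.08, "neutral": 0.05}
print(dominant_emotion(face))  # happiness
```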
It should be noted that, to be included, a student had to be within the focus of the webcam and looking straight ahead, or at least enough of their face had to be detected to extract analysable data.
In the AU detection task, we utilised the units defined by the Facial Action Coding System [
44] to capture and interpret facial muscle movements associated with different expressions [
45]. In this research, we focused particularly on emotion detection.
2.3. Computer Code
The code used for the experiment was run from the Windows command-line interpreter, Command Prompt (
Figure 3); this allowed us to convert the recorded videos of 50–60 min duration into images, keeping one image every 10 s, thus obtaining an average of 300–360 images per recorded class.
These images were then processed from the same command line, with the help of the Visual Studio Code editor (
Figure 4), to be converted into AUs and emotions. Once all the data were available, the images saved for ER were deleted, since they were no longer needed.
The code protected the privacy of the students, making it impossible to track a particular student.
As shown in
Table 2, a .csv file was generated containing all the emotions detected for the students in the images.
Ethical approval was granted by the Ethics Committee of Rovira i Virgili University before data collection, because this experiment involved contact with humans, and more precisely with minors (under the age of 18). The committee reviewed and approved the experiment under reference number CEIPSA-2021-TD-0019. Written informed consent was obtained from all students and their parents. In addition, at the beginning of the academic year, the parents of all students in the school signed a consent form for or against their children being photographed or recorded. Students without parental consent sat out of the camera’s reach.
4. Discussion
In this research, it was observed that students experience many emotions throughout a class [
46]. Emotions significantly influence our cognitive functions [
47], which are linked to cognitive skills such as attention, working memory, planning, decision-making, critical thinking, problem-solving, and reasoning [
48]. However, we did not find a clear pattern for associating emotions with a subject, generating results that may initially seem contradictory. For instance, a student may be frustrated by their lack of understanding of a subject, and stress can either enhance or hinder learning and memory, depending on its intensity and duration. This study did not consider non-academic factors that could affect the emotional experience in the classroom. These factors include physical or mental conditions, events before class, influence from close peers, and expectations of meeting someone.
In terms of learning performance, pleasant emotions, such as enjoyment of learning, have been correlated with better performance on placement tests [
49]. In addition, research has shown that emotional–psychological satisfaction is a determinant variable in students’ academic performance. Regarding the presence of emotions in different school subjects, in technology we observed stronger emotional responses, including positive emotions that would promote learning, such as surprise and happiness. Although these results are not conclusive, it is important to note that emotions affect learning in science depending on the subject matter. In terms of negative emotions, it has been found that in physics and chemistry subjects, similar to technology, students show low interest, likely because they consider these subjects difficult, boring, or useless [
50]. In secondary education, emotions were more positive towards natural sciences and more negative towards physics and chemistry [
51]. Similarly, Dávila [
52] pointed out that students in compulsory secondary education often experience negative emotions such as boredom, nervousness, and worry when learning physics and chemistry. However, it is important to consider that embracing pessimism or seriousness can be beneficial for analytical and quantitative tasks [
27]. In the Green project, which required a more practical and holistic learning approach than technology, there was greater fear and sadness, which theoretically would not encourage learning. However, in order to fully interpret this outcome, we need to consider the context, external factors, and personal variables, rather than just the subject. Nonetheless, it seems that the ability to detect and understand emotions in the classroom context offers the potential to improve pedagogical practices, especially in subjects like the Green project, which may mean education on sustainability can become more effective.
In regard to the moment an emotion arose during the lesson, we found more fear at the beginning of the class than at the end. We propose that this was due to students adapting to their school environment and their initial lack of knowledge about what to expect from the class and teacher. Likewise, we found greater happiness at the beginning of class compared to the end, which may be due to the excitement of joining peers and the positive expectation of learning new things, which may diminish as the demands of academic tasks progress.
In terms of the relationship between emotions and academic years, no definite pattern was evident, with pleasant and unpleasant emotions existing simultaneously in different academic years. In the first year of secondary education, we observed increased anger and sadness, but also more joy; in the bachelor’s degree, we observed greater disgust and surprise. Based on these findings, we propose that emotions may be more connected to the subject than the academic year.
Nevertheless, the school and classroom environment are important factors influencing achievement emotions [
53]; furthermore, it can be assumed that classmates play an important role in affecting students’ achievement emotions. Similarly, it can be expected that students’ valuation of subjects is influenced by parents who value a subject highly or like a particular subject, and by teachers who teach a subject with enthusiasm. In this regard, our results show individual differences in emotional experiences in teaching and subjects, but these results are sample-specific; therefore, more research is needed.
This study is connected to goals 3 and 4 of the Sustainable Development Goals (SDGs) due to its interdisciplinary nature. Health and Well-being Goal 3 comprises ensuring healthy lives and promoting well-being for everyone, at all ages. In this sense, our developed tools have the potential to be used to prevent mental health problems, as they allow for the early identification of negative and positive emotions. Our research also contributes to promoting a healthy school environment by considering students’ emotions. Meanwhile, Goal 4, Quality Education, focuses on achieving inclusive, equitable, and quality education for all individuals and encouraging lifelong learning opportunities. In this regard, understanding emotions can make it easier to support the participation of all students, which is essential to ensuring that everyone has equal opportunities to engage in and contribute to learning.
There are several implications of this study, which could affect the entire educational community: students, teachers, administrators, families, and mental health professionals. First, our findings extend current understanding of how emotions affect the learning experience of each student in different disciplines. Accordingly, they may help educators adapt their teaching methods to address students’ emotional needs. The findings also identify factors in each subject, or moments during the lesson, that relate to certain emotions, which may facilitate the implementation of strategies to mitigate emotions considered unpleasant and encourage pleasant ones. An additional implication is that this study may affect how the classroom’s emotional climate is assessed, potentially informing future monitoring tools. Finally, the findings presented herein demonstrate that the tools we utilised could be applied with precision in other secondary schools, as the system is based on a simple and successfully tested code. We wish to highlight, though, that the assessment of their advantages in terms of ethical considerations rests with the appropriate experts, despite our evidence of privacy preservation.
This study’s limitations include how the precision of emotion monitoring via camera can be influenced by environmental conditions, camera quality, and student movement. Furthermore, emotions are complex phenomena that can be produced not only by the classroom context but also by subject content or by prior, external, or personal factors. Related to this, emotional responses to an academic situation differ for each individual. However, here, we have tried to find general tendencies. A final limitation to note is that the algorithm categorises emotions but does not assign them to a specific individual. Although this aspect fully protects the students’ privacy, it does not allow for monitoring emotions throughout the sessions or relating them to other variables, such as academic performance.
5. Conclusions
This study’s purpose has been achieved, as we have provided evidence that technology can serve as a valuable tool to support the teaching–learning process while prioritising the emotional well-being of students. Furthermore, the concrete aims of this study—(1) to analyse students’ manifestations of emotions in the classroom, (2) to compare the emotions at the beginning and at the end of the class, and (3) to relate the different emotions to the subject and academic year—have also been fulfilled, at least partially. However, the results for the second and third objectives are not conclusive, as some appear contradictory. This may be because we have evidenced the complexity of emotional phenomena, and/or because more data are needed to find patterns that more clearly relate emotions to the moment in the lesson, the academic year, and the subject.
From a technological point of view, this study makes a useful contribution in that we developed and applied an innovative code system to detect students’ emotions during class. The system uses a vision-based model, with a webcam that records the class and then a developed code that detects and analyses the students’ facial expressions, categorising them into one of six basic emotions or a neutral emotion. The developed emotional expression recognition software is sufficiently accurate to identify the emotions of the learners, though it is only possible to obtain suitable images of the participants in close-up and when they are looking fully or partially at the camera.
Future research should focus on specific ways to improve the integration and effectiveness of emotion monitoring in the classroom, considering ethics. It would be useful to develop an interactive tool for teachers, such as by investigating how to design interfaces and tools that enable teachers to interpret and use emotional information effectively in the classroom, with the aim of improving students’ well-being and performance, or developing systems that provide specific suggestions (change in methodology, personalised attention) for how to address identified emotional needs.
Emotion monitoring is set to continue to grow as part of a holistic approach to education that considers academic development but also the well-being of students and their readiness to contribute to a more sustainable society, while empowering the students themselves.