Developing a Model to Predict Self-Reported Student Performance during Online Education Based on the Acoustic Environment
Abstract
1. Introduction
- The following research questions were addressed and answered in the study:
- Can a predictive model perform well with a reduced number of predictors, most of them acoustic?
- Do the variables selected for the predictive model avoid repeating acoustic information?
- Are acoustic variables important for the prediction of perceived academic performance?
2. Materials and Methods
2.1. Online Survey
2.2. Dataset
2.3. Evaluating the Effects of Dependent Variables’ Imbalanced Distribution
- (a) The following dataset scenarios were generated:
  - (1) Data with its original class distribution, with the classifier considering equal (=1) weights for each class.
  - (2) Random over-sampling in a 1:1 ratio;
    - (2.1) only over-sampling the training dataset;
    - (2.2) over-sampling all the data.
  - (3) Random under-sampling in a 1:1 ratio;
    - (3.1) only under-sampling the training dataset;
    - (3.2) under-sampling all the data.
  - (4) Data with its original distribution, with the classifier considering adjusted (“balanced”) weights for each class.
  - (5) Combination of over- and under-sampling to obtain a less imbalanced dataset;
    - (5.1) only on the training dataset;
    - (5.2) on all the data.
- (b) A Random Forest algorithm was applied to the imbalanced and resampled dataset scenarios described above, to explore which dataset to use in the feature-selection process: the original class distribution or the resampled data after the ‘handling-imbalance’ procedure. Five repetitions of 10 iterations were applied to each scenario. In each iteration: (1) the data were split into train and test sets using the function cross_val_score(); only for scenarios 4, 5.1, and 5.2 was the parameter class_weight set to “balanced”, which automatically adjusts weights inversely proportional to class frequencies. (2) A Random Forest classifier was fitted to each dataset (according to the scenario configuration, the train set, the test set, or both could keep their original distribution, or be under-sampled, over-sampled, or “balanced”).
- (c) With enough imbalance, some of the metrics used to assess model performance can be misleading, given the bias towards the majority class during the training stage [43]. For the current assessment, the AUC score was calculated, a metric generally considered to perform well with imbalanced data [44]. Consequently, to obtain an objective insight, the predictions were evaluated by calculating the mean AUC score on the test set of each scenario.
2.4. Recursive Feature Elimination
2.5. Feature Selection and Feature Importance
2.6. Implementation of Classifiers
3. Results and Discussion
3.1. Descriptive Statistics
3.2. Evaluating the Effects of Dependent Variables’ Imbalanced Distribution
3.3. Recursive Feature Elimination
Selection of the Number of Features
3.4. Feature Selection and Feature Importance
3.5. Implementation of Classifiers
3.5.1. Random Forest: Tuned Parameters
3.5.2. Gradient Boosting: Tuned Parameters
3.5.3. Support Vector Machine: Tuned Parameters
3.5.4. Model Metrics and Comparison
3.6. Research Limitations and Future Lines of Research
4. Conclusions
- The model with the best overall performance, a Random Forest built on fourteen variables (nine of them acoustic), achieved a success rate of 78%. While this value may seem modest, it is a very good result, since most of the variables relate to the perceived acoustic environment. Furthermore, reducing the forty initial variables to fourteen makes it easier to interpret the importance of each variable in the construction of the prediction model.
- It is noteworthy that four of the five noise sources queried in the survey (traffic, animals, voices, and TV/Radio/Household appliances), each addressed from different angles (interference with autonomous and synchronous activities, loudness, etc.), appear only once among the most relevant variables. Furthermore, the nine selected acoustic variables cover interference with both autonomous (voices and traffic) and synchronous activities (animals), as well as perceived loudness (of TV/Radio/Household appliances). Consequently, the acoustic variables that make up the model, selected by the Recursive Feature Elimination method, avoid duplicating information.
- A model with only nine variables (all acoustic), out of the fourteen used for the previous model, obtained a success rate of 73%, only 5% lower than that of a model which also included variables as important as the actual academic performance (GPA) or the perceived academic quality. These outcomes therefore show the important role of the acoustic variables in the perception of students’ academic performance.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Correction Statement
Appendix A
Questions | Measurement Scale |
---|---|
| Very bad (1), Excellent (7) |
| I did not hear them (0), Very low (1), Neither high nor low (3), Very high (5) |
| I did not hear them (0), They did not interfere (1), Extremely (5) |
| I did not hear them (0), They did not interfere (1), Extremely (5) |
| Never (1), Very often (5) |
| They did not interfere (1), Extremely (5) |
| They did not interfere (1), Extremely (5) |
| They did not interfere (1), Extremely (5) |
| They did not interfere (1), Extremely (5) |
| They did not interfere (1), Extremely (5) |
| I did not have problems (0), They did not disturb me (1), Extremely (5) |
| Just a little (1), A lot (5) |
| |
| Very bad (1), Neither bad nor good (3), Very good (5) |
Academic Data—Marks | Measurement Scale |
---|---|
| 1 to 10 |
References
- Palacios Hidalgo, F.J.; Huertas Abril, C.A.; Gómez Parra, M.E. MOOCs: Origins, Concept and Didactic Applications: A Systematic Review of the Literature (2012–2019). Technol. Knowl. Learn. 2020, 25, 853–879. [Google Scholar] [CrossRef]
- Metz, K. Benefits of Online Courses in Career and Technical Education. Tech. Connect. Educ. Careers 2010, 85, 20–23. [Google Scholar]
- Bozkurt, A.; Zawacki-Richter, O. Trends and Patterns in Massive Open Online Courses: Review and Content Analysis of Research on MOOCs. Int. Rev. Res. Open Distrib. Learn. 2017, 18, 119–147. [Google Scholar]
- Stansfield, M.; Mclellan, E.; Connolly, T. Enhancing Student Performance in Online Learning and Traditional Face-to-Face Class Delivery. J. Inf. Technol. Educ. Res. 2004, 3, 173–188. [Google Scholar] [CrossRef] [PubMed]
- Daymont, T.; Blau, G. Student Performance in Online and Traditional Sections of an Undergraduate Management Course. J. Behav. Appl. Manag. 2008, 9, 275–294. [Google Scholar] [CrossRef]
- Estelami, H. Determining the Drivers of Student Performance in Online Business Courses. Am. J. Bus. Educ. 2013, 7, 79–92. [Google Scholar] [CrossRef]
- Bir, D.D. Comparison of Academic Performance of Students in Online vs. Traditional Engineering Course. Eur. J. Open, Distance E-Learn. 2019, 22, 1–13. [Google Scholar] [CrossRef]
- Rebucas-Estacio, R.; Callanta-Raga, R. Analyzing Students Online Learning Behavior in Blended Courses Using Moodle. Asian Assoc. Open Univ. J. 2017, 12, 52–68. [Google Scholar] [CrossRef]
- Krause, A.E.; Dimmock, J.; Rebar, A.L.; Jackson, B. Music Listening Predicted Improved Life Satisfaction in University Students During Early Stages of the COVID-19 Pandemic. Front. Psychol. 2021, 11, 631033. [Google Scholar] [CrossRef] [PubMed]
- Gopal, R.; Singh, V.; Aggarwal, A. Impact of Online Classes on the Satisfaction and Performance of Students during the Pandemic Period. Educ. Inf. Technol. 2021, 26, 6923–6947. [Google Scholar] [CrossRef]
- Puyana-Romero, V.; Díaz-Marquez, Á.M.; Ciaburro, G.; Hernandez-Molina, R. The Acoustic Environment and University Students’ Satisfaction with the Online Education Method during the COVID-19 Lockdown. Int. J. Environ. Res. Public Health 2023, 20, 709. [Google Scholar] [CrossRef] [PubMed]
- Logan, E.; Augustyniak, R.; Rees, A. Distance Education as Different Education: A Student-Centered Investigation of Distance Learning Experience. J. Educ. Libr. Inf. Sci. 2002, 43, 32–42. [Google Scholar] [CrossRef]
- Papadakis, S. MOOCs 2012–2022: An Overview Methods Study Design. Adv. Mob. Learn. Educ. Res. 2023, 3, 682–693. [Google Scholar] [CrossRef]
- Damián-Chávez, M.M.; Ledesma-Coronado, P.E.; Drexel-Romo, M.; Ibarra-Zárate, D.I.; Alonso-Valerdi, L.M. Physiology & Behavior Environmental Noise at Library Learning Commons Affects Student Performance and Electrophysiological Functioning. Physiol. Behav. 2021, 241, 113563. [Google Scholar] [CrossRef] [PubMed]
- Buchari, B.; Matondang, N. The Impact of Noise Level on Students’ Learning Performance at State Elementary School in Medan. AIP Conf. Proc. 2017, 1855, 040002. [Google Scholar] [CrossRef]
- Nelson, P.B.; Soli, S. Acoustical Barriers to Learning: Children at Risk in Every Classroom. Lang. Speech Hear. Serv. Sch. 2000, 31, 356–361. [Google Scholar] [CrossRef] [PubMed]
- Choi, Y. The Intelligibility of Speech in University Classrooms during Lectures. Appl. Acoust. 2020, 162, 107211. [Google Scholar] [CrossRef]
- Shield, B.M.; Dockrell, J.E. The Effects of Environmental and Classroom Noise on the Academic Attainments of Primary School Children. J. Acoust. Soc. Am. 2008, 123, 133–144. [Google Scholar] [CrossRef] [PubMed]
- World Health Organization; European Union. Guidelines for Community Noise. Document References MNB-1Q DOC2; World Health Organization: Geneva, Switzerland, 1999; Volume 5(2)(a). [Google Scholar]
- Carroll, N. European Journal of Higher Education E-Learning—The McDonaldization of Education. Eur. J. High. Educ. 2014, 3, 342–356. [Google Scholar] [CrossRef]
- Wladis, C.; Conway, K.M.; Hachey, A.C. The Online STEM Classroom—Who Succeeds? An Exploration of the Impact of Ethnicity, Gender, and Non-Traditional Student Characteristics in the Community College Context. Community Coll. Rev. 2015, 43, 142–164. [Google Scholar] [CrossRef]
- Clark-Ibáñez, M.; Scott, L. Learning to Teach Online. Teach. Sociol. 2008, 36, 34–41. [Google Scholar] [CrossRef]
- Driscoll, A.; Jicha, K.; Hunt, A.N.; Tichavsky, L.; Thompson, G. Can Online Courses Deliver In-Class Results? A Comparison of Student Performance and Satisfaction in an Online versus a Face-to-Face Introductory Sociology Course. Teach. Sociol. 2012, 40, 312–331. [Google Scholar] [CrossRef]
- Paul, J.; Jefferson, F. A Comparative Analysis of Student Performance in an Online vs. Face-to-Face Environmental Science Course From 2009 to 2016. Front. Comput. Sci. 2019, 1, 472525. [Google Scholar] [CrossRef]
- González-Gómez, D.; Jeong, J.S.; Airado Rodríguez, D.; Cañada-Cañada, F. Performance and Perception in the Flipped Learning Model: An Initial Approach to Evaluate the Effectiveness of a New Teaching Methodology in a General Science Classroom. J. Sci. Educ. Technol. 2016, 25, 450–459. [Google Scholar] [CrossRef]
- Pei, L.; Wu, H. Does Online Learning Work Better than Offline Learning in Undergraduate Medical Education? A Systematic Review and Meta-Analysis. Med. Educ. Online 2019, 24, 1666538. [Google Scholar] [CrossRef] [PubMed]
- Lockman, A.S.; Schirmer, B.R. Online Instruction in Higher Education: Promising, Research-Based, and Evidence Based Practices 3. Themes in the Research Literature on Online Learning. J. Educ. e-Learn. Res. 2020, 7, 130–152. [Google Scholar] [CrossRef]
- Qiu, F.; Zhang, G.; Sheng, X.; Jiang, L.; Zhu, L.; Xiang, Q.; Jiang, B.; Chen, P.-k. Predicting Students’ Performance in e-Learning Using Learning Process and Behaviour Data. Sci. Rep. 2022, 12, 453. [Google Scholar] [CrossRef]
- Aydoğdu, Ş. Predicting Student Final Performance Using Artificial Neural Networks in Online Learning Environments. Educ. Inf. Technol. 2020, 25, 1913–1927. [Google Scholar] [CrossRef]
- Alshabandar, R.; Hussain, A.; Keight, R.; Khan, W. Students Performance Prediction in Online Courses Using Machine Learning Algorithms. In Proceedings of the 2020 International Joint Conference on Neural Networks, Glasgow, UK, 19–24 July 2020. [Google Scholar] [CrossRef]
- Segura, M.; Mello, J. Machine Learning Prediction of University Student Dropout: Does Preference Play a Key Role? Mathematics 2022, 10, 3359. [Google Scholar] [CrossRef]
- Regnier, J.; Shafer, E.; Sobiesk, E.; Stave, N.; Haynes, M. From Crisis to Opportunity: Practices and Technologies for a More Effective Post-COVID Classroom. Educ. Inf. Technol. 2023, 29, 5981–6003. [Google Scholar] [CrossRef]
- Bashir, A.; Bashir, S.; Rana, K.; Lambert, P.; Vernallis, A. Post-COVID-19 Adaptations; the Shifts Towards Online Learning, Hybrid Course Delivery and the Implications for Biosciences Courses in the Higher Education Setting. Front. Educ. 2021, 6, 711619. [Google Scholar] [CrossRef]
- Arday, J. COVID-19 and Higher Education: The Times They Are A’Changin. Educ. Rev. 2022, 74, 365–377. [Google Scholar] [CrossRef]
- Akoglu, H. User’s Guide to Correlation Coefficients. Turk. J. Emerg. Med. 2018, 18, 91–93. [Google Scholar] [CrossRef] [PubMed]
- Miot, H.A. Analysis of Ordinal Data in Clinical and Experimental Studies. J. Vasc. Bras. 2020, 19, e20200185. [Google Scholar] [CrossRef] [PubMed]
- Kulkarni, A.; Chong, D.; Batarseh, F.A. 5—Foundations of Data Imbalance and Solutions for a Data Democracy. In Data Democracy; Batarseh, F.A., Yang, R., Eds.; Academic Press: Cambridge, MA, USA, 2020; pp. 83–106. [Google Scholar] [CrossRef]
- He, H.; Ma, Y. Imbalanced Learning: Foundations, Algorithms, and Applications, 1st ed.; Wiley-IEEE Press: Hoboken, NJ, USA, 2013. [Google Scholar]
- Google for Developer. Imbalance Data. Available online: https://developers.google.com/machine-learning/data-prep/construct/sampling-splitting/imbalanced-data?hl=es-419 (accessed on 7 April 2023).
- Brownlee, J. (Ed.) Machine Learning Mastering with R, v. 1.12.; Guiding Tech Media: Melbourne, Australia, 2016. [Google Scholar]
- Baranwal, A.; Bagwe, B.R.; Vanitha, M. Machine Learning in Python. J. Mach. Learn. Res. 2019, 12, 128–154. [Google Scholar] [CrossRef]
- Lemaître, G.; Nogueira, F.; Aridas, C.K. Imbalanced-Learn: A Python Toolbox to Tackle the Curse of Imbalanced Datasets in Machine Learning. J. Mach. Learn. Res. 2017, 18, 1–5. [Google Scholar]
- Kuhn, M.; Johnson, K. Measuring Performance in Classification Models. In Applied Predictive Modeling; Kuhn, M., Johnson, K., Eds.; Springer: New York, NY, USA, 2013; pp. 247–273. [Google Scholar] [CrossRef]
- Liu, Y.; Li, Y.; Xie, D. Implications of Imbalanced Datasets for Empirical ROC-AUC Estimation in Binary Classification Tasks. J. Stat. Comput. Simul. 2024, 94, 183–203. [Google Scholar] [CrossRef]
- Priyatno, A.M.; Widiyaningtyas, T. A Systematic Literature Review: Recursive Feature Elimination Algorithms. JITK (Jurnal Ilmu Pengetah. Teknol. Komputer) 2024, 9, 196–207. [Google Scholar] [CrossRef]
- Kuhn, M.; Johnson, K. An Introduction to Feature Selection. In Applied Predictive Modeling; Kuhn, M., Johnson, K., Eds.; Springer: New York, NY, USA, 2013; pp. 487–519. [Google Scholar] [CrossRef]
- Butcher, B.; Smith, B.J. Feature Engineering and Selection: A Practical Approach for Predictive Models. Am. Stat. 2020, 74, 308–309. [Google Scholar] [CrossRef]
- Raschka, S. Machine Learning Q and AI; No Starch Press: San Francisco, CA, USA, 2024. [Google Scholar]
- Luor, D.-C. A Comparative Assessment of Data Standardization on Support Vector Machine for Classification Problem. Intell. Data Anal. 2015, 19, 529–546. [Google Scholar] [CrossRef]
- Ganganwar, V. An Overview of Classification Algorithms for Imbalanced Datasets. Int. J. Emerg. Technol. Adv. Eng. 2012, 2, 42–47. [Google Scholar]
- Breiman, L.; Friedman, J.; Olshen, R.A.; Stone, C.J. Classification and Regression Trees, 1st ed.; Chapman and Hall/CRC: New York, NY, USA, 1984; Volume 5. [Google Scholar] [CrossRef]
- Rainio, O. Evaluation Metrics and Statistical Tests for Machine Learning. Sci. Rep. 2024, 14, 6086. [Google Scholar] [CrossRef] [PubMed]
- García-Balboa, J.L.; Alba-Fernández, M.V.; Ariza-López, F.J.; Rodríguez-Avi, J. Analysis of Thematic Similarity Using Confusion Matrices. ISPRS Int. J. Geo-Inf. 2018, 7, 233. [Google Scholar] [CrossRef]
- Das, C.; Sahoo, A.K.; Pradhan, C. Chapter 12—Multicriteria Recommender System Using Different Approaches. In Cognitive Data Science in Sustainable Computing; Mishra, S., Tripathy, H.K., Mallick, P.K., Sangaiah, A.K., Chae, G.-S., Eds.; Academic Press: Cambridge, MA, USA, 2022; pp. 259–277. [Google Scholar] [CrossRef]
- Pembury Smith, M.Q.R.; Ruxton, G.D. Effective Use of the McNemar Test. Behav. Ecol. Sociobiol. 2020, 74, 133. [Google Scholar] [CrossRef]
- Safari, S.; Baratloo, A.; Elfil, M.; Negida, A. Evidence Based Emergency Medicine Part 2: Positive and Negative Predictive Values of Diagnostic Tests. Emergency 2015, 3, 87–88. [Google Scholar]
- Fawcett, T. ROC Graphs: Notes and Practical Considerations for Researchers. Pattern Recognit. Lett. 2004, 31, 1–38. [Google Scholar]
- Zweig, M.H.; Campbell, G. Receiver-Operating Characteristic (ROC) Plots: A Fundamental Evaluation Tool in Clinical Medicine. Clin. Chem. 1993, 39, 561–577. [Google Scholar] [CrossRef] [PubMed]
- Jones, C.M.; Athanasiou, T. Summary Receiver Operating Characteristic Curve Analysis Techniques in the Evaluation of Diagnostic Tests. Ann. Thorac. Surg. 2005, 79, 16–20. [Google Scholar] [CrossRef] [PubMed]
- de Hond, A.A.H.; Steyerberg, E.W.; van Calster, B. Interpreting Area under the Receiver Operating Characteristic Curve. Lancet Digit. Health 2022, 4, e853–e855. [Google Scholar] [CrossRef]
- Nassar, A.P.; Mocelin, A.O.; Nunes, A.L.B.; Giannini, F.P.; Brauer, L.; Andrade, F.M.; Dias, C.A. Caution When Using Prognostic Models: A Prospective Comparison of 3 Recent Prognostic Models. J. Crit. Care 2012, 27, e1–e423. [Google Scholar] [CrossRef]
- Kuckartz, U.; Rädiker, S.; Ebert, T.; Schehl, J. Statistik, Eine Verständliche Einführung; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar] [CrossRef]
- Torresin, S.; Ratcliffe, E.; Aletta, F.; Albatici, R.; Babich, F.; Oberman, T.; Kang, J. The Actual and Ideal Indoor Soundscape for Work, Relaxation, Physical and Sexual Activity at Home: A Case Study during the COVID-19 Lockdown in London. Front. Psychol. 2022, 13, 1038303. [Google Scholar] [CrossRef]
- Krause, A.E.; Scott, W.G.; Flynn, S.; Foong, B.; Goh, K.; Wake, S.; Miller, D.; Garvey, D. Listening to Music to Cope with Everyday Stressors. Music. Sci. 2021, 27, 176–192. [Google Scholar] [CrossRef]
- Lesiuk, T. The Effect of Music Listening on Work Performance. Psychol. Music 2005, 33, 173–191. [Google Scholar] [CrossRef]
- Nilsson, M.E.; Alvarsson, J.; Rådsten-Ekman, M.; Bolin, K. Auditory Masking of Wanted and Unwanted Sounds in a City Park. Noise Control Eng. J. 2010, 58, 524–531. [Google Scholar] [CrossRef]
- Ciaburro, G. MATLAB for Machine Learning; Packt Publishing: Birmingham, UK, 2017. [Google Scholar]
- Pudjihartono, N.; Fadason, T.; Kempa-Liehr, A.W.; O’Sullivan, J.M. A Review of Feature Selection Methods for Machine Learning-Based Disease Risk Prediction. Front. Bioinform. 2022, 2, 927312. [Google Scholar] [CrossRef] [PubMed]
- Noroozi, Z.; Orooji, A.; Erfannia, L. Analyzing the Impact of Feature Selection Methods on Machine Learning Algorithms for Heart Disease Prediction. Sci. Rep. 2023, 13, 22588. [Google Scholar] [CrossRef]
- Puyana-Romero, V.; Maffei, L.; Brambilla, G.; Ciaburro, G. Modelling the Soundscape Quality of Urban Waterfronts by Artificial Neural Networks. Appl. Acoust. 2016, 111, 121–128. [Google Scholar] [CrossRef]
- Hétu, R.; Truchon-Gagnon, C.; Bilodeau, S.A. Problems of Noise in School Settings: A Review of Literature and the Results of an Exploratory Study. J. Speech-Lang. Pathol. Audiol. 1990, 14, 31–39. [Google Scholar]
- Shield, B.; Dockrell, J.E. External and Internal Noise Surveys of London Primary Schools. J. Acoust. Soc. Am. 2004, 115, 730–738. [Google Scholar] [CrossRef]
- Caviola, S.; Visentin, C.; Borella, E.; Mammarella, I.; Prodi, N. Out of the Noise: Effects of Sound Environment on Maths Performance in Middle-School Students. J. Environ. Psychol. 2021, 73, 101552. [Google Scholar] [CrossRef]
- Nagaraj, N.K. Effect of Auditory Distraction on Working Memory, Attention Switching, and Listening Comprehension. Audiol. Res. 2021, 11, 227–243. [Google Scholar] [CrossRef] [PubMed]
- Liu, C.; Zang, Q.; Li, J.; Pan, X.; Dai, H.; Gao, W. The Effect of the Acoustic Environment of Learning Spaces on Students’ Learning Efficiency: A Review. J. Build. Eng. 2023, 79, 107911. [Google Scholar] [CrossRef]
- Doctora, A.L.S.; Perez, W.D.D.; Vasquez, A.B.; Gumasing, M.J.J. Relationship of Noise Level to the Mental Fatigue Level of Students: A Case Study during Online Classes. In Proceedings of the International Conference on Industrial Engineering and Operations Management, Rome, Italy, 2–5 August 2021; pp. 1378–1386. [Google Scholar]
- Khan, A.; Ghosh, S.K. Student Performance Analysis and Prediction in Classroom Learning: A Review of Educational Data Mining Studies. In Education and Information Technologies; Springer: Berlin/Heidelberg, Germany, 2021; Volume 26. [Google Scholar] [CrossRef]
- Shou, Z.; Xie, M.; Mo, J.; Zhang, H. Predicting Student Performance in Online Learning: A Multidimensional Time-Series Data Analysis Approach. Appl. Sci. 2024, 14, 2522. [Google Scholar] [CrossRef]
- Ismail, N.H.; Ahmad, F.; Aziz, A.A. Implementing WEKA as a Data Mining Tool to Analyze Students’ Academic Performances Using Naïve Bayes Classifier. In Proceedings of the UniSZA Postgraduate Research Conference, Kuala Terengganu, Malaysia, 7–8 September 2013; pp. 855–863. [Google Scholar] [CrossRef]
- Pandey, M.; Kumar Sharma, V. A Decision Tree Algorithm Pertaining to the Student Performance Analysis and Prediction. Int. J. Comput. Appl. 2013, 61, 1–5. [Google Scholar] [CrossRef]
- Yang, S.J.H.; Lu, O.H.T.; Huang, A.Y.Q.; Huang, J.C.H.; Ogata, H.; Lin, A.J.Q. Predicting Students’ Academic Performance Using Multiple Linear Regression and Principal Component Analysis. J. Inf. Process. 2018, 26, 170–176. [Google Scholar] [CrossRef]
- Nedeva, V.; Pehlivanova, T. Students’ Performance Analyses Using Machine Learning Algorithms in WEKA. IOP Conf. Ser. Mater. Sci. Eng. 2021, 1031, 012061. [Google Scholar] [CrossRef]
- Al-Barrak, M.A.; Al-Razgan, M. Predicting Students Final GPA Using Decision Trees: A Case Study. Int. J. Inf. Educ. Technol. 2016, 6, 528–533. [Google Scholar] [CrossRef]
- Folorunso, S.O.; Farhaoui, Y.; Adigun, I.P.; Imoize, A.L.; Awotunde, J.B. Prediction of Student’s Academic Performance Using Learning Analytics. In Artificial Intelligence, Data Science and Applications; Farhaoui, Y., Hussain, A., Saba, T., Taherdoost, H., Verma, A., Eds.; Springer Nature: Cham, Switzerland, 2024; pp. 314–325. [Google Scholar]
- Farooq, U.; Naseem, S.; Mahmood, T.; Li, J.; Rehman, A.; Saba, T.; Mustafa, L. Transforming Educational Insights: Strategic Integration of Federated Learning for Enhanced Prediction of Student Learning Outcomes. J. Supercomput. 2024. [Google Scholar] [CrossRef]
- Monteverde-Suárez, D.; González-Flores, P.; Santos-Solórzano, R.; García-Minjares, M.; Zavala-Sierra, I.; de la Luz, V.L.; Sánchez-Mendiola, M. Predicting Students’ Academic Progress and Related Attributes in First-Year Medical Students: An Analysis with Artificial Neural Networks and Naïve Bayes. BMC Med. Educ. 2024, 24, 12909. [Google Scholar] [CrossRef]
- Erdem, C.; Kaya, M. Socioeconomic Status and Wellbeing as Predictors of Students’ Academic Achievement: Evidence from a Developing Country. J. Psychol. Couns. Sch. 2023, 33, 202–220. [Google Scholar] [CrossRef]
- Dombkowski, R.; Sullivan, S.; Widenhoefer, T.; Buckland, A.; Almonroeder, T.G. Predicting First-Time National Physical Therapy Examination Performance for Graduates of an Entry-Level Physical Therapist Education Program. J. Phys. Ther. Educ. 2023, 37, 325–331. [Google Scholar] [CrossRef] [PubMed]
- Liang, G.; Jiang, C.; Ping, Q.; Jiang, X. Academic Performance Prediction Associated with Synchronous Online Interactive Learning Behaviors Based on the Machine Learning Approach. Interact. Learn. Environ. 2023, 1–16. [Google Scholar] [CrossRef]
- Akçapinar, G.; Altun, A.; Aşkar, P. Modeling Students’ Academic Performance Based on Their Interactions in an Online Learning Environment. Elem. Educ. Online 2015, 14, 815–824. [Google Scholar] [CrossRef]
- Bergen, H.A.; Martin, G.; Roeger, L.; Allison, S. Perceived Academic Performance and Alcohol, Tobacco and Marijuana Use: Longitudinal Relationships in Young Community Adolescents. Addict. Behav. 2005, 30, 1563–1573. [Google Scholar] [CrossRef] [PubMed]
- Teuber, M.; Leyhr, D.; Sudeck, G. Physical Activity Improves Stress Load, Recovery, and Academic Performance-Related Parameters among University Students: A Longitudinal Study on Daily Level. BMC Public Health 2024, 24, 598. [Google Scholar] [CrossRef] [PubMed]
- Azpiazu, L.; Antonio-Aguirre, I.; Izar-de-la-Fuente, I.; Fernández-Lasarte, O. School Adjustment in Adolescence Explained by Social Support, Resilience and Positive Affect. Eur. J. Psychol. Educ. 2024. [Google Scholar] [CrossRef]
- Odermatt, S.D.; Weidmann, R.; Schweizer, F.; Grob, A. Academic Performance through Multiple Lenses: Intelligence, Conscientiousness, and Achievement Striving Motivation as Differential Predictors of Objective and Subjective Measures of Academic Achievement in Two Studies of Adolescents. J. Res. Pers. 2024, 109, 104461. [Google Scholar] [CrossRef]
- Jussila, J.J.; Pulakka, A.; Halonen, J.I.; Salo, P.; Allaouat, S.; Mikkonen, S.; Lanki, T. Are Active School Transport and Leisure-Time Physical Activity Associated with Performance and Wellbeing at Secondary School? A Population-Based Study. Eur. J. Public Health 2023, 33, 884–890. [Google Scholar] [CrossRef]
- Ahmed, A.; Rashidi, M.Z. Predicting Perceived Academic Performance through Interplay of Self-Efficacy and Trait Emotional Intelligence. Glob. Manag. J. Acad. Corp. Stud. 2017, 6, 152–161. [Google Scholar]
- Petrides, K.V. Technical Manual for the Trait Emotional Intelligence Questionnaires (TEIQue), 1st ed.; London Psychometric Laboratory: London, UK, 2009. [Google Scholar] [CrossRef]
- Petrides, K.V.; Furnham, A. Trait Emotional Intelligence: Psychometric Investigation with Reference to Established Trait Taxonomies. Eur. J. Pers. 2001, 15, 425–448. [Google Scholar] [CrossRef]
- Jerusalem, M.; Schwarzer, R. Self-Efficacy as a Resource Factor in Stress Appraisal Processes. In Self-Efficacy: Thought Control of Action; Hemisphere Publishing Corp: Washington, DC, USA, 1992; pp. 195–213. [Google Scholar]
- Torres, R.A.O.; Ortega-Dela Cruz, R.A. Remote Learning: Challenges and Opportunities for Educators and Students in the New Normal. Anatol. J. Educ. 2022, 7, 83–92. [Google Scholar] [CrossRef]
- Corral, L.; Fronza, I. It’s Great to Be Back: An Experience Report Comparing Course Satisfaction Surveys Before, During and After Pandemic. In Proceedings of the SIGITE 2022—Proceedings of the 23rd Annual Conference on Information Technology Education, Chicago, IL, USA, 21–24 September 2022; pp. 66–72. [Google Scholar] [CrossRef]
- Al-Ansi, A.M.; Jaboob, M.; Garad, A.; Al-Ansi, A. Analyzing Augmented Reality (AR) and Virtual Reality (VR) Recent Development in Education. Soc. Sci. Humanit. Open 2023, 8, 100532. [Google Scholar] [CrossRef]
- Hu Au, E.; Lee, J.J. Virtual Reality in Education: A Tool for Learning in the Experience Age. Int. J. Innov. Educ. 2017, 4, 215. [Google Scholar] [CrossRef]
- Liddell, T.M.; Kruschke, J.K. Analyzing Ordinal Data with Metric Models: What Could Possibly Go Wrong? J. Exp. Soc. Psychol. 2018, 79, 328–348. [Google Scholar] [CrossRef]
- Cohen, M.A.; Horowitz, T.S.; Wolfe, J.M. Auditory Recognition Memory Is Inferior to Visual Recognition Memory. Proc. Natl. Acad. Sci. USA 2009, 106, 6008–6010. [Google Scholar] [CrossRef]
Scenario | Mean AUC Score | Sd |
---|---|---|
Scenario #1 | 0.7634 | 0.0381 |
Scenario #2.1 | 0.6655 | 0.0114 |
Scenario #2.2 | 0.7602 | 0.0381 |
Scenario #3.1 | 0.7011 | 0.0116 |
Scenario #3.2 | 0.7631 | 0.0378 |
Scenario #4 | 0.7628 | 0.0377 |
Scenario #5.1 | 0.6616 | 0.0121 |
Scenario #5.2 | 0.7612 | 0.0369 |
Algorithm | Mean AUC Score | Sd | Number of Variables |
---|---|---|---|
Random Forest | 0.7466 | 0.0387 | 34 |
Gradient Boosting | 0.7538 | 0.0225 | 18 |
Random Forest (balanced) | 0.7451 | 0.0383 | 32 |
SVM (linear) | 0.7506 | 0.0182 | 28 |
Algorithm | Mean AUC Score | Sd | Number of Variables |
---|---|---|---|
Random Forest | 0.7467 | 0.0381 | 33 |
Gradient Boosting | 0.7537 | 0.0227 | 14 |
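The feature-count selection reported above can be sketched with scikit-learn's cross-validated recursive feature elimination (RFECV). This is an illustrative sketch under assumptions: the data are synthetic stand-ins for the forty survey variables, and the estimator and fold counts are reduced for speed, not the authors' exact configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import StratifiedKFold

# Synthetic stand-in: 40 candidate predictors, as in the study's initial set.
X, y = make_classification(n_samples=600, n_features=40, n_informative=10,
                           random_state=0)

# RFECV drops the weakest features at each step and keeps the subset with
# the best cross-validated AUC, mirroring the number-of-features selection.
selector = RFECV(
    estimator=RandomForestClassifier(n_estimators=50, random_state=0),
    step=2,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    scoring="roc_auc",
    min_features_to_select=1,
)
selector.fit(X, y)

print("Optimal number of features:", selector.n_features_)
print("Selected feature mask:", selector.support_)
print("Feature ranking (1 = kept):", selector.ranking_)
```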
Run | Mean AUC Score | Sd |
---|---|---|
1 | 0.7827 | 0.0368 |
2 | 0.7823 | 0.0354 |
3 | 0.7819 | 0.0361 |
4 | 0.7828 | 0.0366 |
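The parameter tuning behind sections 3.5.1–3.5.3 is commonly done with an exhaustive cross-validated grid search; a minimal sketch follows. The parameter grid here is illustrative only, not the grid or the tuned values reported in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Synthetic stand-in for the fourteen-variable dataset.
X, y = make_classification(n_samples=400, n_features=14, random_state=0)

# Illustrative hyperparameter grid (assumed, not the authors' grid).
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [None, 5, 10],
    "min_samples_leaf": [1, 3],
}

# Each candidate is scored by cross-validated AUC; the best is refitted.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid,
    scoring="roc_auc",
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
)
search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Best mean AUC:", round(search.best_score_, 4))
```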
Algorithm | Accuracy | McNemar p-Value | Positive Pred Value | Negative Pred Value | AUC Score |
---|---|---|---|---|---|
Random Forest | 0.7794 | 7.36 × 10^−11 | 0.7901 | 0.7143 | 0.7614 |
Gradient Boosting | 0.7672 | 0.0015 | 0.8077 | 0.6145 | 0.7466 |
SVM (Linear) | 0.7672 | 0.0051 | 0.8109 | 0.6111 | 0.7596 |
SVM (Polynomial) | 0.7085 | 0.9335 | 0.8000 | 0.4822 | 0.7091 |
SVM (Radial) | 0.7874 | 1.36 × 10^−7 | 0.8049 | 0.7023 | 0.7462 |
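The metrics in the table above can all be derived from a confusion matrix. The sketch below uses hypothetical predictions, not the study's data, and follows one common convention (e.g., R's caret package) of applying McNemar's test to the classifier's own confusion matrix; the paper's exact test setup may differ.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical test-set labels and predictions (~80% agreement).
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)
y_pred = np.where(rng.random(500) < 0.8, y_true, 1 - y_true)

# Confusion-matrix cells for a binary {0, 1} problem.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value

# McNemar's chi-square test on the off-diagonal (disagreement) cells.
p_value = mcnemar(confusion_matrix(y_true, y_pred), exact=False).pvalue

print("Accuracy:", accuracy_score(y_true, y_pred))
print("PPV:", round(ppv, 4), "NPV:", round(npv, 4))
print("AUC:", round(roc_auc_score(y_true, y_pred), 4))
print("McNemar p-value:", p_value)
```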
Algorithm | Accuracy | McNemar p-Value | Positive Pred Value | Negative Pred Value | AUC Score |
---|---|---|---|---|---|
Random Forest | 0.7287 | 2.38 × 10^−20 | 0.7386 | 0.5806 | 0.6770 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Puyana-Romero, V.; Larrea-Álvarez, C.M.; Díaz-Márquez, A.M.; Hernández-Molina, R.; Ciaburro, G. Developing a Model to Predict Self-Reported Student Performance during Online Education Based on the Acoustic Environment. Sustainability 2024, 16, 4411. https://doi.org/10.3390/su16114411