Using Artificial Intelligence to Predict Students’ Academic Performance in Blended Learning
Abstract
1. Introduction
2. Literature Review
3. Mathematical Modeling
3.1. Firefly Algorithm
3.2. ANN
4. Research Methodology and Framework
5. Results and Discussions
- Multicollinearity tests: Table 2 shows the correlations between the input variables and the output variable. Here, “no multicollinearity” refers to the absence of perfect multicollinearity, i.e., an exact (non-stochastic) linear relationship among the input variables. According to the conducted tests, input variables that were strongly related to other input variables were removed from the set of inputs; the remaining input variables show only weak correlations with each other, since all of them are below 50% (a correlation-check sketch is given after this list).
- Furthermore, there is a relatively strong positive correlation between the assignments and the mid-exam, and a strong negative correlation between attendance and the final exam.
- The first plot in Figure 4, “Residuals vs. Fitted,” helps assess linearity and homoscedasticity. The x-axis shows the fitted values and the y-axis shows the residuals, i.e., the differences between the observed values and the fitted values on the red line. If the residuals are spread roughly evenly around the zero line and the red trend line is approximately flat and close to zero, the linearity assumption holds; homoscedasticity corresponds to the absence of any clear pattern in the spread of the residuals. The second plot, the Q-Q plot (quantile–quantile plot), is a scatterplot of two sets of quantiles against one another: the theoretical quantiles of the normal distribution on the x-axis and the quantiles of the residual distribution on the y-axis. If the points fall approximately along the diagonal line, it is reasonable to infer that the residuals are normally distributed, which is generally true for the observed values. The third plot, the scale-location (or spread-location) plot, checks whether the residuals are spread equally across the range of the predictors; ideally, the line is horizontal with evenly scattered points, which is the case in the scenario we present. The final plot, residuals vs. leverage, is a diagnostic plot that identifies the most influential observations in the regression model. Each point represents one observation from the dataset, with its leverage on the x-axis and its standardized residual on the y-axis. Leverage measures how much the regression coefficients would change if that observation were left out, so high-leverage observations have a strong influence on the fitted coefficients; the standardized residual is the scaled difference between the predicted value and the observation’s actual value. Any point lying outside Cook’s distance (the red dashed lines) is regarded as an influential observation; in our model, there are no influential points. Overall, the assumptions underlying the least-squares estimation are satisfied (a diagnostics sketch is given after this list).
- OLS, fixed-effects, and random-effects models for the input variables: Multiple regression is a statistical approach used to analyze the relationship between a single output variable and several input variables. Ordinary least squares (OLS) uses the known values of the input variables to predict the value of the single output variable; each predictor is assigned a weight that denotes its relative contribution to the overall prediction (a minimal panel-model sketch is given after this list).
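To make the multicollinearity screening concrete, the following Python sketch computes the Pearson correlation matrix of the input variables and flags any pair whose absolute correlation exceeds the 50% threshold mentioned above. The file name students.csv and the column labels (virtual_face, mid_exam, assignment, attendance) are hypothetical placeholders, since the study’s dataset and variable coding are not given here; the variance inflation factors at the end are a complementary check that the text itself does not report.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical file and column names; the study's dataset is not public,
# so these labels are placeholders for illustration only.
df = pd.read_csv("students.csv")
inputs = df[["virtual_face", "mid_exam", "assignment", "attendance"]]

# Pearson correlations among the input variables (the same idea as Table 2).
corr = inputs.corr(method="pearson")
print(corr.round(3))

# Flag any pair of inputs whose absolute correlation exceeds the 50% threshold
# used in the text as a rough multicollinearity screen.
threshold = 0.5
for i, a in enumerate(corr.columns):
    for b in corr.columns[i + 1:]:
        if abs(corr.loc[a, b]) > threshold:
            print(f"Possible multicollinearity: {a} vs. {b} = {corr.loc[a, b]:.3f}")

# Complementary check: variance inflation factors (not reported in the paper,
# but a common way to quantify multicollinearity).
X = sm.add_constant(inputs)
for j, name in enumerate(X.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X.values, j), 2))
```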
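The four diagnostic plots described for Figure 4 can be reproduced for any fitted OLS model. The sketch below uses statsmodels and matplotlib, again with hypothetical file and column names; it illustrates the standard residual diagnostics rather than the authors’ own plotting code.

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Placeholder data loading and column names (the original dataset is not public).
df = pd.read_csv("students.csv")
model = smf.ols("final_exam ~ virtual_face + mid_exam + assignment + attendance",
                data=df).fit()

influence = model.get_influence()
fitted = model.fittedvalues
resid = model.resid
std_resid = influence.resid_studentized_internal
leverage = influence.hat_matrix_diag

fig, axes = plt.subplots(2, 2, figsize=(10, 8))

# 1) Residuals vs. fitted: checks linearity and homoscedasticity.
axes[0, 0].scatter(fitted, resid, s=10)
axes[0, 0].axhline(0, color="red", linestyle="--")
axes[0, 0].set(title="Residuals vs. Fitted", xlabel="Fitted values", ylabel="Residuals")

# 2) Normal Q-Q plot: checks normality of the residuals.
sm.qqplot(std_resid, line="45", ax=axes[0, 1])
axes[0, 1].set_title("Normal Q-Q")

# 3) Scale-location: sqrt(|standardized residuals|) vs. fitted values.
axes[1, 0].scatter(fitted, np.sqrt(np.abs(std_resid)), s=10)
axes[1, 0].set(title="Scale-Location", xlabel="Fitted values",
               ylabel="sqrt(|standardized residuals|)")

# 4) Residuals vs. leverage: highlights influential observations.
axes[1, 1].scatter(leverage, std_resid, s=10)
axes[1, 1].set(title="Residuals vs. Leverage", xlabel="Leverage",
               ylabel="Standardized residuals")

plt.tight_layout()
plt.show()
```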
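As a rough illustration of the OLS, fixed-effects, and random-effects comparison, the sketch below fits a pooled OLS model, a fixed-effects approximation using group dummy variables, and a random-intercept mixed model with statsmodels. The grouping column group is an assumption (the panel identifier used in the study is not specified in this excerpt), and the mixed model is only one common way to obtain random-effects estimates, not necessarily the estimator the authors used.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder dataset and column names; "group" stands for whatever panel or
# grouping identifier the study used, which is not specified here.
df = pd.read_csv("students.csv")

# Pooled OLS: all observations treated alike.
ols = smf.ols("final_exam ~ virtual_face + mid_exam + assignment + attendance",
              data=df).fit()

# Fixed effects approximated with group dummy variables (within-group intercepts).
fe = smf.ols("final_exam ~ virtual_face + mid_exam + assignment + attendance + C(group)",
             data=df).fit()

# Random effects approximated with a random intercept per group (mixed linear model).
re = smf.mixedlm("final_exam ~ virtual_face + mid_exam + assignment + attendance",
                 data=df, groups=df["group"]).fit()

for name, res in [("OLS", ols), ("Fixed effects", fe), ("Random effects", re)]:
    print(name)
    print(res.params.round(4))
```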
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
Table 1. Descriptive statistics of the study variables.

| Variables | Mean | Std. Dev. | Min. | Max. | Skewness (Statistic) | Skewness (Std. Error) | Kurtosis (Statistic) | Kurtosis (Std. Error) |
|---|---|---|---|---|---|---|---|---|
| Final exams | 35 | 15.111 | 0 | 50 | −1.051 | 0.159 | 0.141 | 0.317 |
| Mid-exams | 16.06 | 6.227 | 0 | 25 | −0.674 | 0.159 | 1.389 | 0.317 |
| Assignments | 21.2 | 6.253 | 0 | 25 | −2.349 | 0.159 | 4.729 | 0.317 |
| Attendance | 2.71 | 2.942 | 0 | 8 | 1.867 | 0.159 | 4.557 | 0.317 |
| Virtual/Face | 0.49 | 0.501 | 0 | 1 | 0.034 | 0.159 | −2.016 | 0.317 |
Table 2. Correlations between the output variable (final exam) and the input variables.

| | Final Exam | Virtual/Face | Mid-Exam | Assignment | Attendance |
|---|---|---|---|---|---|
| Final exam | 1 | 0.474 ** | 0.573 ** | 0.633 ** | −0.646 ** |
| Virtual/Face | | 1 | 0.132 * | 0.059 | −0.06 |
| Mid-exam | | | 1 | 0.494 ** | −0.476 ** |
| Assignment | | | | 1 | −0.484 ** |
| Attendance | | | | | 1 |
Table 3. OLS, fixed-effects, and random-effects estimates.

| Variables | OLS Coefficient | OLS Std. Err. | Fixed Effect Coefficient | Fixed Effect Std. Err. | Random Effect Coefficient | Random Effect Std. Err. |
|---|---|---|---|---|---|---|
| (Intercept) | 10.1592 * | 3.3144 | | | 9.7678 * | 3.3421 |
| Virtual/Face | 12.4691 * | 1.0986 | 9.3633 * | 11.0740 | 12.4444 * | 1.1963 |
| Mid-exam | 0.4303 * | 0.1104 | 0.4163 * | 0.1518 | 0.4298 * | 0.1096 |
| Assignment | 0.7639 * | 0.1167 | 0.898 * | 0.1700 | 0.7847 * | 0.1169 |
| Attendance | −1.6207 * | 0.2622 | −1.6325 | 0.4080 | −1.6334 * | 0.2643 |
| Observations | 234 | | 234 | | 234 | |
| R-square | 0.7015 | | 0.6420 | | 0.6923 | |
| Adjusted R-square | 0.6963 | | 0.2485 | | 0.6869 | |
| F-statistic/Chisq | 134.6000 * | | 49.7612 * | | 518.514 * | |