Mobile Learning and Its Effect on Learning Outcomes and Critical Thinking: A Systematic Review
Abstract
1. Introduction
2. Background
2.1. Learning Outcomes
2.2. Critical Thinking
3. Materials and Methods
3.1. Document Selection and Search Approach
- Type of Document: Articles, as they are the primary means of scientific communication.
- WoS Index: Social Sciences Citation Index (SSCI), Science Citation Index-Expanded (SCI-E), and Arts & Humanities Citation Index (A&HCI). Emerging Sources Citation Index (ESCI) is excluded as it mainly incorporates journals of regional importance [38].
- Language: English, as it is recognized as the language of international science and continues to dominate scientific activities to this day [39].
- Time period: 2015–2024. M-learning is a continuously and rapidly evolving field, and findings from older work may no longer be applicable today; restricting the time window ensures the relevance and currency of our review [5].
- Study type and participants: Empirical (primary/participatory) research focusing on undergraduate students.
- Context: Studies conducted during the COVID-19 pandemic are excluded because, in general, the use of technologies for remote course delivery was sudden and mandatory [40,41], which may affect the design of initiatives, their acceptance by students, and the evaluation of their real effects (this exclusion does not apply to distance education universities).
- Concerning RQ1, studies that objectively measure the achievement of learning outcomes are included (self-perceived achievements are excluded). Studies with experimental or quasi-experimental designs must also include a control group (from the same cohort) so that the real effects of the m-learning initiatives can be measured.
- Concerning RQ2, studies that measure changes in the levels of critical thinking either objectively (e.g., tests) or subjectively (opinions, self-reflection, and others) are included. A screening sketch applying the formal criteria above is shown after this list.
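To make the formal filters concrete, the following is a minimal, hypothetical screening sketch rather than the authors' actual workflow. It assumes a CSV export of Web of Science records with columns named Document Type, Language, Publication Year, and WoS Index (these names are assumptions); the content-level criteria (study type, participants, COVID-19 context, and the RQ1/RQ2 measurement requirements) would still require manual full-text review.

```python
# Hypothetical screening sketch; column names are assumptions, not taken from the review.
import pandas as pd

ALLOWED_INDEXES = {"SSCI", "SCI-E", "A&HCI"}  # ESCI is excluded per the criteria


def screen_records(path: str) -> pd.DataFrame:
    """Keep only records that satisfy the formal inclusion criteria:
    English-language articles, published 2015-2024, indexed in SSCI, SCI-E, or A&HCI."""
    records = pd.read_csv(path)
    mask = (
        records["Document Type"].eq("Article")
        & records["Language"].eq("English")
        & records["Publication Year"].between(2015, 2024)
        & records["WoS Index"].isin(ALLOWED_INDEXES)
    )
    return records[mask]


# Usage (assuming a file named wos_export.csv):
# candidates = screen_records("wos_export.csv")
```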
3.2. Concepts Associated with the Analysis
3.3. Data Analysis
4. Results
4.1. Learning Outcomes
4.2. Critical Thinking
5. Discussion
5.1. M-Learning and Learning Outcomes
5.2. M-Learning and Critical Thinking
6. Conclusions and Recommendations
7. Limitations and Future Studies
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Main Focus | Author(s) | Participants * | Data Collection ** | Main Findings: Pre-Test | Main Findings: Post-Test *** | Conclusions
---|---|---|---|---|---|---
Adaptive mobile learning systems | Ennouamani et al. [44] | 64 second-year students in the field of Informatics at Ibn Zohr University (Morocco) | Pre-tests, intermediate tests and final tests to evaluate the students’ knowledge of Java programming. | CG 0 ≤ score < 40%: 43.75% 40 ≤ score < 80%: 56.25% score ≥ 80%: 0% EG 0 ≤ score < 40%: 53.13% 40 ≤ score < 80%: 46.88% score ≥ 80%: 0% | CG 0 ≤ score < 40%: 31.25% 40 ≤ score < 80%: 62.50% score ≥ 80%: 6.25% EG 0 ≤ score < 40%: 6.25% 40 ≤ score < 80%: 25.00% score ≥ 80%: 68.75% | There is a noticeable difference between the two groups after the final test, compared to the beginning of the experiment. The initiative contributes to improving the students’ level of knowledge.
Chatbot | Vázquez-Cano et al. [45] | 103 students at the Universidad Nacional de Educación a Distancia (Spain) | Initial scoring test in Spanish similar to the entrance exam and final scoring test with more complex content. | Mean CG: 23.57 Mean EG: 22.90 | Mean CG: 28.47 Mean EG: 32.13 | Significant differences (p < 0.01) in final test scores between CG and EG. |
Cloud-based m-learning | Chang et al. [46] | 123 students from four different classes at a university in Taipei (Taiwan) | Pre-test (scores obtained from a previous task) and post-test. | - | Usefulness Mean CG: 5.02 Mean EG: 6.79 Novelty Mean CG: 4.00 Mean EG: 5.25 Creative performance Mean CG: 6.42 Mean EG: 7.24 | The EG scored significantly higher (p < 0.01) than the CG in terms of overall creative performance, as well as the usefulness and novelty of the products evaluated. |
Chang et al. [47] | 62 students from two different classes at a university in Taipei (Taiwan) | Pre-test (scores obtained from a previous assignment) and post-test. | - | Total creativity Mean CG: 23.14 Mean EG: 29.49 Novelty Mean CG: 4.12 Mean EG: 5.63 Value Mean CG: 4.62 Mean EG: 4.83 Elaboration Mean CG: 3.67 Mean EG: 5.32 | A significant difference in overall creative performance was observed between the groups (p < 0.01). The use of cloud-based mobile learning had a positive impact on students. | |
Educational application/platform with multiple functionalities | Arain et al. [25] | 212 students on a Communication Skills course (Pakistan) | Pre-test and post-test to evaluate learning achievements. | Mean CG: 7.83 Mean EG: 7.75 | Mean CG: 13.75 Mean EG: 16.69 | Learning achievement improved in both groups. However, post-test averages indicate that the EG performed better compared to the CG (p < 0.01). |
Chung et al. [48] | 119 students on the Electric Welding Practice course, Department of Engineering Science. | Pre-test and post-test on knowledge of electric welding technology and safety. | - | Technology Mean CG: 11.08 Mean EG: 11.91 Safety Mean CG: 11.20 Mean EG: 12.15 | The EG achieved greater learning effects than the CG in knowledge of electric welding technology and safety after the intervention (p < 0.05). | |
Fonseca et al. [49] | 84 students who sat for the final exam of the organic chemistry course at the University of Algarve (Portugal). | Final exam grades. | - | - | The scores of students who used the app were higher than the scores of those who did not use it (p < 0.05). | |
Hu & Wang [50] | 106 students on an ethnomusicology course at Zhejiang University (China) | Final evaluation and formative assessment to measure students’ knowledge. | - | 2018/2019 cohort Mean CG: 58.1 Mean EG: 71.9 2019/2020 cohort Mean CG: 54.5 Mean EG: 62.5 | EG students showed a higher mean score on the final evaluation compared to CG in both cohorts. | |
Kayaalp & Dinc [51] | 30 students from the Computer Engineering Department of Muş Alparslan University (Turkey) | Average final exam grades. | - | Mean CG: 51.91 Mean EG: 68.71 | The application positively impacted students’ knowledge of algorithms. | |
Mergany et al. [52] | 67 dental students from two universities (Sudan) | Pre-test and post-test to assess knowledge of dental surgical forceps. | Mean CG: 5.87 Mean EG: 5.94 | Mean CG: 5.42 Mean EG: 8.34 | The mean scores of the pre-test and post-test of the EG revealed significant differences (p < 0.01). | |
Oyelere et al. [53] | 142 students from the computer science program of the Modibbo Adama University of Technology Yola (Nigeria) | Pre-quiz and post-quiz to assess knowledge of systems analysis and design. | Mean CG: 13.83 Mean EG: 13.80 | Mean CG: 44.75 Mean EG: 50.65 | The EG outperformed the CG in post-quiz performance scores (p < 0.01). | |
Tezer & Çimşir [54] | 70 students from Giresun University (Turkey) | Pre-test and post-test to assess web design knowledge. | Mean CG: 41.35 Mean EG: 40.21 | Mean CG: 60.10 Mean EG: 68.30 | A significant difference (p < 0.01) is observed in the means of the final test. The application is effective in terms of increasing academic performance. | |
Zhonggen et al. [55] | 340 students enrolled in College English IV in the university (China) | TOEFL test. Four testing items: reading comprehension, listening comprehension, speaking, and writing. | - | Mean CG: 86.31 Mean EG: 87.49 | Proficiency in English as a foreign language in the EG is significantly higher than in the CG (p < 0.05). | |
Zhu [56] | 119 Offshore Oil and Gas Engineering students (China) | Final test aimed at measuring theoretical knowledge and basic applications. | - | The average score and excellence rate of students who used the application are 14% and 600% higher, respectively, than those of students who did not use it. | The application helps students understand theoretical concepts more clearly. |
Immersive experiences | Chin et al. [13] | 63 students from a liberal arts course at Aletheia University (Taiwan) | Pre-test and post-test to measure the level of knowledge about historic buildings. | No significant difference (p > 0.05) between the two groups. | Comprehension Mean CG: 14.69 Mean EG: 27.93 Average score Mean CG: 58.12 Mean EG: 73.98 | Augmented reality (AR) technology improved the learning performance of students in the EG, especially in terms of comprehension and assimilation of the instructional content (p < 0.01). |
Chin & Wang [57] | 72 students from two different classes of the cultural heritage course of the Aletheia University (Taiwan) | Pre-test and post-test to gauge knowledge levels regarding historic buildings. | Retention Mean CG: 28.34 Mean EG: 26.81 Total score Mean CG: 38.60 Mean EG: 36.86 | Retention Mean CG: 40.68 Mean EG: 45.17 Total score Mean CG: 70.05 Mean EG: 76.54 | EG students achieved better learning outcomes on the post-test compared to CG (p < 0.05). The analysis also revealed significant differences in retention questions (p < 0.01) between the two groups. | |
Moro et al. [58] | 38 students from a biomedical or health sciences program (Australia) | Pre-test and post-test to measure students’ level of knowledge of brain anatomy and physiology. | Mobile-based AR Mean: 3.84 HoloLens Mean: 4.16 | Mobile-based AR Mean: 11.05 HoloLens Mean: 11.89 | There was no significant difference observed in the test scores between the two groups (p > 0.05). The use of mobile device-based AR can be just as good at helping to improve outcomes as other more complex technological equipment. | |
Briz-Ponce et al. [59] | 30 students from the Faculty of Medicine at the University of Salamanca (Spain) | Pre-test and post-test to measure participants’ knowledge of anatomy. | Mean CG: 2.67 Mean EG: 2.20 | Mean CG: 2.4 Mean EG: 3.6 | The performance of students using the interactive 3D application was superior to those using traditional methods (p < 0.05). | |
Chang et al. [60] | 100 students from a nursing school (Taiwan) | Pre-test and post-tests were conducted on the nursing skills required for nasotracheal suctioning and medication administration, in addition to psychomotor assessments. | Knowledge Mean CG: 69.3 Mean EG: 69.4 | Knowledge Mean CG: 74.4 Mean EG: 80.9 Skill: Medication administration Mean CG: 5.29 Mean EG: 6.33 Skill: Nasotracheal suctioning Mean CG: 5.79 Mean EG: 6.70 | Learning clinical practice through a virtual simulation environment in a mobile application would allow students to obtain higher scores in knowledge and skills (p < 0.01). |
Hanson et al. [61] | 225 students taking a pharmacology course (Australia) | Pre-test and post-test to measure students’ knowledge. | CAVE2 Mean: 2.43 Handheld device Mean: 2.26 | CAVE2 Mean: 3.83 Handheld device Mean: 3.75 | No evidence was found that students experienced disadvantages in knowledge acquisition when using either method. In both cases, the average increase in scores was statistically significant (p < 0.01). | |
Interactive tool versus traditional text-books | Jeno et al. [15] | 71 students on a mandatory biology course (Norway) | Test to measure student achievement levels in species identification. | - | Mean CG: 5.95 Mean EG: 7.78 | Students who used the app achieved higher test scores compared to those who used the textbook, with significant differences observed between the groups (p < 0.05). |
Arauz et al. [62] | 300 students enrolled in a systemic pathology course at Ross University (Saint Kitts and Nevis) | Neuropathology test scores, where questions were classified as RQ, NRQ, and NNQ according to their linkage to iBook content. The final overall grade for the course was also evaluated. | - | RQ (neuropathology iBook related questions) Mean non-users: 69.88 Mean users: 66.95 Course final grade Mean non-users: 83.38 Mean users: 81.55 | Students’ use of the interactive iBook did not have a significant impact on their neuropathology test scores or their final course grade. | |
Game-based learning | de la Peña Esteban et al. [63] | 90 engineering students (2 cohorts) from Madrid Open University (Spain) | Overall academic performance of the Industrial Systems Optimization Techniques course. | - | 2017/2018 cohort (available game) Mean of those who did not use the game: 57.7 Mean of those who did use the game: 68.7 | Students who used the game scored 11% higher on the test. |
Ramírez-Donoso et al. [64] | 294 students on the online course “Systems Programming” at Universidad Carlos III de Madrid (Spain) | Final course grades. | - | Number of videos watched Average of those who did not use the app: 51.2 Average of those who use the app: 127.75 Number of exercise-problems solved Average of those who did not use the app: 36.46 Average of those who use the app: 88.87 | The use of the mobile application correlates with a higher consumption of didactic resources and a higher pass rate in the course. | |
Wilkinson et al. [65] | 246 students (2 cohorts) of anatomy at Middlesex University (UK) | Four evaluations throughout the course. A1: Anatomical microstructure A2: Lower limb A3: Upper limb A4: Trunk and nervous system | - | They did not use the game (NG) Mean A3: 46.9 Mean A3-A2: −6.3 They used the game (G) Mean A3: 57.2 Mean A3-A2: 3.6 | There were statistically significant differences in the mean A3 test score between the groups (p < 0.01). In addition, in group G there was a significant increase in A2 to A3 scores (p < 0.05), while in group NG there was a significant decrease (p < 0.05). | |
Troussas et al. [66] | 80 students of Computer Science (Greece) | Average grades obtained in C# programming course. | - | Class A (app with peer collaboration and advice generator) Mean: 6.23 Class B (Conventional app) Mean: 7.75 | A significant difference was found between the average grades of the groups (p < 0.01). However, students who used the enhanced app with new features did not outperform those who used the conventional app. | |
Mobile instant mes-saging | Andújar-Vaca & Cruz-Martínez [67] | 80 students on a B1 level English course at the University of Almeria (Spain) | English speaking test at the beginning and end of the course. | Pronunciation Mean CG: 1.23 Mean EG: 1.10 Grammar Mean CG: 14.25 Mean EG: 21 Vocabulary Mean CG: 7.70 Mean EG: 9.85 Fluency Mean CG: 4.70 Mean EG: 4.88 Comprehension Mean CG: 10.65 Mean EG: 13.9 | Pronunciation Mean CG: 1.80 Mean EG: 2.25 Grammar Mean CG: 15 Mean EG: 23.40 Vocabulary Mean CG: 12.4 Mean EG: 16.8 Fluency Mean CG: 5.45 Mean EG: 8.65 Comprehension Mean CG: 10.4 Mean EG: 16.47 | The post-test revealed statistically significant differences (p < 0.05) in the five dimensions evaluated between the groups. |
So [68] | 61 students at a teacher training institute (Hong Kong) | Pre-test and post-test to evaluate students’ knowledge of the course. | Mean CG: 18.23 Mean EG: 18.16 | Mean CG: 27.45 Mean EG: 31.01 | Participants in the EG with the WhatsApp intervention performed better than those in the CG (p < 0.05). | |
Wang et al. [69] | 55 freshman students (China) | Pre-test, post-test and delayed post-test to measure students’ English vocabulary level. | Mean CG: 29.00 Mean EG: 20.11 | Post-test Mean CG: 45.30 Mean EG: 52.64 Delayed post-test Mean CG: 42.70 Mean EG: 46.96 | The EG had better results (p < 0.05) in the post-test. There were no significant differences in the delayed post-test. | |
Mobile peer assessment | Liu et al. [70] | 44 students (China) | Pre-test and post-test to measure students’ vocal music ability. | - | Mean CG: 16.60 Mean EG: 17.36 | There were significant differences (p < 0.01) in vocal musical performance between the two groups. |
Mobile-based word cards | Li & Hafner [71] | 85 medical students (China) | Pre-test and post-test to measure the level of English vocabulary knowledge. | Total scores Mean CG: 28.62 Mean EG: 28.43 | Total scores Mean CG: 60.51 Mean EG: 69.57 | Significant difference (p < 0.05) in mean gain scores between the two groups. |
Mobile-supported learning analytics interventions | Cavus Ezin & Yilmaz [72] | 49 students on a Basic Information Technology I course (Europe) | Pre-test and post-test to measure students’ knowledge of the fundamental contents of the course. | Mean CG: 22.37 Mean EG: 25.44 | Mean CG: 38.25 Mean EG: 43.16 | EG obtained higher scores on the post-test compared to CG students (p < 0.01). |
Mobile-supported reflective learning | Martin & Ertzberger [73] | 103 students on pre-service teacher preparation courses (USA) | Pre-test and post-test to measure students’ knowledge of artistic content. | Mean No reflection: 2.61 Self-guided reflection: 2.63 Reflection with virtual expert: 3.15 | Mean No reflection: 5.13 Self-guided reflection: 6.42 Reflection with virtual expert: 7.85 | Significant differences in post-test scores were observed among the three groups (p < 0.05). The importance of a virtual expert in a "here and now" m-learning environment is highlighted.
Mobile-supported task-based teaching | Zheng et al. [74] | 60 freshman students majoring in educational technology, psychology, environment, chemistry, or management science. | Pre-test (university entrance exam) and post-test (developed by teachers) to assess students’ English proficiency. | Mean CG: 137.16 Mean EG: 138.40 | Mean CG: 70.33 Mean EG: 82.83 | The learning achievements of the EG were significantly higher than those of the CG (p < 0.01). |
Fang et al. [75] | 66 students of English as a foreign language (Taiwan) | Midterm scores and written post-test to measure students’ English language achievement. | - | Vocabulary Mean CG: 4.00 Mean EG: 6.17 Conversation comprehension Mean CG: 6.67 Mean EG: 8.47 | The EG outperformed the CG on vocabulary and conversation comprehension tests (p < 0.01). | |
Location-based contextual learning systems | Chin et al. [12] | 62 students in two concurrent classes at the College of Humanities, Aletheia University (Taiwan) | Pre-test and post-test to assess students’ knowledge of cultural heritage. | Retention Mean CG: 28.40 Mean EG: 29.25 Comprehension Mean CG: 12.27 Mean EG: 12.31 | Retention Mean CG: 40.99 Mean EG: 45.32 Comprehension Mean CG: 28.27 Mean EG: 31.53 | Significant differences in post-test scores between the two groups (p < 0.05). |
Chang et al. [76] | 137 students on an English as a foreign language course (Taiwan) | Pre-test and post-test of four parts. | Vocabulary Mean CG: 10.87 Mean EG: 11.09 Grammar Mean CG: 6.37 Mean EG: 8.00 Reading Mean CG: 6.57 Mean EG: 7.14 Writing Mean CG: 5.50 Mean EG: 6.01 | Vocabulary Mean CG: 12.76 Mean EG: 17.02 Grammar Mean CG: 11.38 Mean EG: 12.12 Reading Mean CG: 10.41 Mean EG: 13.92 Writing Mean CG: 13.34 Mean EG: 17.88 | The initiative improved academic performance in English, showing significant differences in both groups in the post-test (p < 0.01). | |
M-learning (general) | Chu & Kong [77] | 102 students (China) | Final test on environmental knowledge | - | - | Students with m-learning had better scores on all 3 dimensions of the test (natural knowledge, problem knowledge and action strategy) compared to those who received traditional instruction (p < 0.01). |
Ehsanpur & Razavi [32] | 54 primary school students from Islamic Azad University who chose the language course (Iran) | Pre-test, post-test and retention test (conducted 8 weeks after the end of the course). | - | Learning post-test Mean CG: 5.25 Mean EG: 5.97 Retention test Mean CG: 4.61 Mean EG: 6.02 | The learning and retention rate of the EG is higher than that of the CG. | |
Technology-enhanced sports instruction | Hung et al. [14] | 225 students on a badminton course (Taiwan) | Pre-test and post-test to assess students’ badminton skills. | Conventional Course Group Mean: 4.60 Tablet Course Group Mean: 4.47 | Conventional Course Group Mean: 5.34 Tablet Course Group Mean: 5.65 | A statistically significant difference was found in the post-test scores between the two groups (p < 0.05). |
Chiang et al. [78] | 326 university students taking a Physical Education–Basketball course (Taiwan) | Pre-test and post-test. The aspects evaluated were correctness of moves, maneuverability, teamwork, sense of balance, and adaptability of the students. | Learning performance Mean FCA: 8.31 Mean PT: 7.96 Mean TT: 8.39 | Learning performance Mean FCA: 20.44 Mean PT: 18.07 Mean TT: 13.65 | The learning performance of students in the FCA (Flipped Classroom with app) group was better than those in the PT (Projecting Teaching) group and those in the TT (Traditional Teaching) group. |
References
- Hu, Y.; Hwang, G.J. Promoting students’ higher order thinking in virtual museum contexts: A self-adapted mobile concept mapping-based problem posing approach. Educ. Inf. Technol. 2024, 29, 2741–2765. [Google Scholar] [CrossRef]
- Sung, Y.T.; Chang, K.E.; Liu, T.C. The effects of integrating mobile devices with teaching and learning on students’ learning performance: A meta-analysis and research synthesis. Comput. Educ. 2016, 94, 252–275. [Google Scholar] [CrossRef]
- Goundar, M.S.; Kumar, B.A. The use of mobile learning applications in higher education institutes. Educ. Inf. Technol. 2022, 27, 1213–1236. [Google Scholar] [CrossRef]
- Criollo-C, S.; Guerrero-Arias, A.; Vidal, J.; Jaramillo-Alcazar, Á.; Luján-Mora, S. A hybrid methodology to improve speaking skills in English language learning using mobile applications. Appl. Sci. 2022, 12, 9311. [Google Scholar] [CrossRef]
- Sophonhiranrak, S. Features, barriers, and influencing factors of mobile learning in higher education: A systematic review. Heliyon 2021, 7, e06696. [Google Scholar] [CrossRef] [PubMed]
- Jumaat, N.F.; Tasir, Z.; Lah, N.H.C.; Ashari, Z.M. Students’ preferences of m-learning applications in higher education: A review. Adv. Sci. Lett. 2018, 24, 2858–2861. [Google Scholar] [CrossRef]
- Mital, D.; Dupláková, D.; Duplák, J.; Mitaľová, Z.; Radchenko, S. Implementation of Industry 4.0 using e-learning and m-learning approaches in technically-oriented education. TEM J. 2021, 10, 368–375. [Google Scholar] [CrossRef]
- Santos, A.P.; Silva, E.B.D.; Candeias, A.L.B.; Costa, M.A.T.D. Educação crítica: Uma aliança entre educação ambiental e m-learning. Educ. UFSM 2019, 44, 1–24. [Google Scholar] [CrossRef]
- Inel-Ekici, D.; Ekici, M. Mobile inquiry and inquiry-based science learning in higher education: Advantages, challenges, and attitudes. Asia Pac. Educ. Rev. 2022, 23, 427–444. [Google Scholar] [CrossRef]
- Organisation for Economic Co-operation and Development—OECD. Making the Most of Technology for Learning and Training in Latin America; OECD: Paris, France, 2020. [Google Scholar]
- Alioon, Y.; Delialioğlu, Ö. The effect of authentic m-learning activities on student engagement and motivation. Br. J. Educ. Technol. 2019, 50, 655–668. [Google Scholar] [CrossRef]
- Chin, K.Y.; Lee, K.F.; Chen, Y.L. Effects of a ubiquitous guide-learning system on cultural heritage course students’ performance and motivation. IEEE Trans. Learn. Technol. 2019, 13, 52–62. [Google Scholar] [CrossRef]
- Chin, K.Y.; Wang, C.S.; Chen, Y.L. Effects of an augmented reality-based mobile system on students’ learning achievements and motivation for a liberal arts course. Interact. Learn. Environ. 2019, 27, 927–941. [Google Scholar] [CrossRef]
- Hung, H.-C.; Shwu-Ching Young, S.; Lin, K.C. Exploring the effects of integrating the iPad to improve students’ motivation and badminton skills: A WISER model for physical education. Technol. Pedagogy Educ. 2018, 27, 265–278. [Google Scholar] [CrossRef]
- Jeno, L.M.; Grytnes, J.A.; Vandvik, V. The effect of a mobile-application tool on biology students’ motivation and achievement in species identification: A Self-Determination Theory perspective. Comput. Educ. 2017, 107, 1–12. [Google Scholar] [CrossRef]
- Alrasheedi, M.; Capretz, L.F.; Raza, A. A systematic review of the critical factors for success of mobile learning in higher education (university students’ perspective). J. Educ. Comput. Res. 2015, 52, 257–276. [Google Scholar] [CrossRef]
- Alsswey, A.; Al-Samarraie, H. M-learning adoption in the Arab gulf countries: A systematic review of factors and challenges. Educ. Inf. Technol. 2019, 24, 3163–3176. [Google Scholar] [CrossRef]
- Crompton, H.; Burke, D. The use of mobile learning in higher education: A systematic review. Comput. Educ. 2018, 123, 53–64. [Google Scholar] [CrossRef]
- Naveed, Q.N.; Choudhary, H.; Ahmad, N.; Alqahtani, J.; Qahmash, A.I. Mobile Learning in Higher Education: A Systematic Literature Review. Sustainability 2023, 15, 13566. [Google Scholar] [CrossRef]
- Wahono, B.; Lin, P.L.; Chang, C.Y. Evidence of STEM enactment effectiveness in Asian student learning outcomes. Int. J. STEM Educ. 2020, 7, 36. [Google Scholar] [CrossRef]
- Garnjost, P.; Lawter, L. Undergraduates’ satisfaction and perceptions of learning outcomes across teacher-and learner-focused pedagogies. Int. J. Manag. Educ. 2019, 17, 267–275. [Google Scholar] [CrossRef]
- Pedraja-Rejas, L.; Rodríguez-Ponce, E.; Ganga-Contreras, F. Pensamiento crítico en las carreras de pedagogía. Techno Rev. 2023, 13, 1–15. [Google Scholar] [CrossRef]
- Christ, A.A.; Capon-Sieber, V.; Grob, U.; Praetorius, A.K. Learning processes and their mediating role between teaching quality and student achievement: A systematic review. Stud. Educ. Eval. 2022, 75, 101209. [Google Scholar] [CrossRef]
- Feng, Y.; Liao, Y.; Ren, Y. Effects of m-learning on students’ learning outcome: A meta-analysis. In New Media for Educational Change. Educational Communications and Technology Yearbook; Deng, L., Ma, W., Fong, C., Eds.; Springer: Singapore, 2018; pp. 115–123. [Google Scholar]
- Arain, A.A.; Hussain, Z.; Rizvi, W.H.; Vighio, M.S. An analysis of the influence of a mobile learning application on the learning outcomes of higher education students. Univers. Access Inf. Soc. 2018, 17, 325–334. [Google Scholar] [CrossRef]
- Pedraja-Rejas, L.; Rodríguez-Cisterna, C. Habilidades del pensamiento crítico y liderazgo docente: Propuesta con perspectiva de género para la formación inicial. Rev. Venez. Gerenc. 2023, 28, 1667–1684. [Google Scholar] [CrossRef]
- Facione, P.A. Critical Thinking: What It Is and Why It Counts; Insight Assessment: Hermosa Beach, CA, USA, 2020. [Google Scholar]
- Guzmán-Valenzuela, C.; Chiappa, R.; Tagle, A.R.M.; Ismail, N.; Pedraja-Rejas, L. Investigating critical thinking in higher education in Latin America: Acknowledging an epistemic disjuncture. CriStal 2023, 11, 71–99. [Google Scholar] [CrossRef]
- Lorencová, H.; Jarošová, E.; Avgitidou, S.; Dimitriadou, C. Critical thinking practices in teacher education programmes: A systematic review. Stud. High. Educ. 2019, 44, 844–859. [Google Scholar] [CrossRef]
- Haleem, A.; Javaid, M.; Qadri, M.A.; Suman, R. Understanding the role of digital technologies in education: A review. Sustain. Oper. Comput. 2022, 3, 275–285. [Google Scholar] [CrossRef]
- Asiri, Y.A.; Millard, D.E.; Weal, M.J. Assessing the impact of engagement and real-time feedback in a mobile behavior change intervention for supporting critical thinking in engineering research projects. IEEE Trans. Learn. Technol. 2021, 14, 445–459. [Google Scholar] [CrossRef]
- Ehsanpur, S.; Razavi, M.R. A Comparative analysis of learning, retention, learning and study strategies in the traditional and M-learning systems. Eur. Rev. Appl. Psychol. 2020, 70, 100605. [Google Scholar] [CrossRef]
- Wang, Y.H. Design-based research on integrating learning technology tools into higher education classes to achieve active learning. Comput. Educ. 2020, 156, 103935. [Google Scholar] [CrossRef]
- Pollock, A.; Berge, E. How to do a systematic review. Int. J. Stroke 2018, 13, 138–156. [Google Scholar] [CrossRef] [PubMed]
- Page, M.J.; Moher, D.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ 2021, 372, n160. [Google Scholar] [CrossRef] [PubMed]
- Pedraja-Rejas, L.; Rodríguez-Ponce, E.; Muñoz-Fritis, C.; Laroze, D. Sustainable development goals and education: A bibliometric review—The case of Latin America. Sustainability 2023, 15, 9833. [Google Scholar] [CrossRef]
- Birkle, C.; Pendlebury, D.A.; Schnell, J.; Adams, J. Web of Science as a data source for research on scientific and scholarly activity. Quant. Sci. Stud. 2020, 1, 363–376. [Google Scholar] [CrossRef]
- Huang, Y.; Zhu, D.; Lv, Q.; Porter, A.L.; Robinson, D.K.; Wang, X. Early insights on the Emerging Sources Citation Index (ESCI): An overlay map-based bibliometric study. Scientometrics 2017, 111, 2041–2057. [Google Scholar] [CrossRef]
- Márquez, M.C.; Porras, A.M. Science communication in multiple languages is critical to its effectiveness. Front. Commun. 2020, 5, 31. [Google Scholar] [CrossRef]
- Fernández-Arias, P.; Antón-Sancho, Á.; Sánchez-Calvo, M.; Vergara, D. Teaching experience as a key factor in dealing with digital teaching stress. Educ. Sci. 2024, 14, 809. [Google Scholar] [CrossRef]
- Pedraja-Rejas, L.; Rodríguez-Ponce, E.; Muñoz-Fritis, C.; Laroze, D. Online Learning and Experiences in Higher Education during COVID-19: A Systematic Review. Sustainability 2023, 15, 15583. [Google Scholar] [CrossRef]
- Kampenes, V.B.; Dybå, T.; Hannay, J.E.; Sjøberg, D.I. A systematic review of quasi-experiments in software engineering. Inf. Softw. Technol. 2009, 51, 71–82. [Google Scholar] [CrossRef]
- Rogers, J.; Revesz, A. Experimental and quasi-experimental designs. In The Routledge Handbook of Research Methods in Applied Linguistics; McKinley, J., Rose, H., Eds.; Routledge: London, UK, 2019; pp. 133–143. [Google Scholar]
- Ennouamani, S.; Mahani, Z.; Akharraz, L. A context-aware mobile learning system for adapting learning content and format of presentation: Design, validation and evaluation. Educ. Inf. Technol. 2020, 25, 3919–3955. [Google Scholar] [CrossRef]
- Vázquez-Cano, E.; Mengual-Andrés, S.; López-Meneses, E. Chatbot to improve learning punctuation in Spanish and to enhance open and flexible learning environments. Int. J. Educ. Technol. High. Educ. 2021, 18, 33. [Google Scholar] [CrossRef]
- Chang, Y.S.; Chien, Y.H.; Yu, K.C.; Lin, H.C.; Chen, M.Y.C. Students’ innovative environmental perceptions and creative performances in cloud-based m-learning. Comput. Hum. Behav. 2016, 63, 988–994. [Google Scholar] [CrossRef]
- Chang, Y.S.; Chen, S.Y.; Yu, K.C.; Chu, Y.H.; Chien, Y.H. Effects of cloud-based m-learning on student creative performance in engineering design. Br. J. Educ. Technol. 2017, 48, 101–112. [Google Scholar] [CrossRef]
- Chung, C.C.; Dzan, W.Y.; Cheng, Y.M.; Lou, S.J. On the push-pull mobile learning of electric welding. Eurasia J. Math. Sci. Technol. Educ. 2017, 13, 3235–3260. [Google Scholar] [CrossRef]
- Fonseca, C.S.; Zacarias, M.; Figueiredo, M. MILAGE LEARN+: A mobile learning app to aid the students in the study of organic chemistry. J. Chem. Educ. 2021, 98, 1017–1023. [Google Scholar] [CrossRef]
- Hu, M.; Wang, J. Ethnomusicology in modern classroom: Opportunities for using mobile online learning. Música Hodie 2021, 21, e68982. [Google Scholar] [CrossRef]
- Kayaalp, F.; Dinc, F. A mobile app for algorithms learning in engineering education: Drag and drop approach. Comput. Appl. Eng. Educ. 2022, 30, 235–250. [Google Scholar] [CrossRef]
- Mergany, N.N.; Dafalla, A.E.; Awooda, E. Effect of mobile learning on academic achievement and attitude of Sudanese dental students: A preliminary study. BMC Med. Educ. 2021, 21, 121. [Google Scholar] [CrossRef]
- Oyelere, S.S.; Suhonen, J.; Wajiga, G.M.; Sutinen, E. Design, development, and evaluation of a mobile learning application for computing education. Educ. Inf. Technol. 2018, 23, 467–495. [Google Scholar] [CrossRef]
- Tezer, M.; Çimşir, B.T. The impact of using mobile-supported learning management systems in teaching web design on the academic success of students and their opinions on the course. Interact. Learn. Environ. 2018, 26, 402–410. [Google Scholar] [CrossRef]
- Zhonggen, Y.; Ying, Z.; Zhichun, Y.; Wentao, C. Student satisfaction, learning outcomes, and cognitive loads with a mobile learning platform. Comput. Assist. Lang. Learn. 2019, 32, 323–341. [Google Scholar] [CrossRef]
- Zhu, H. Application of Rain classroom in formal classroom learning in the teaching of offshore engineering environment and loads. Comput. Appl. Eng. Educ. 2021, 29, 603–612. [Google Scholar] [CrossRef]
- Chin, K.Y.; Wang, C.S. Effects of augmented reality technology in a mobile touring system on university students’ learning performance and interest. Australas. J. Educ. Technol. 2021, 37, 27–42. [Google Scholar] [CrossRef]
- Moro, C.; Phelps, C.; Redmond, P.; Stromberga, Z. HoloLens and mobile augmented reality in medical and health science education: A randomised controlled trial. Br. J. Educ. Technol. 2021, 52, 680–694. [Google Scholar] [CrossRef]
- Briz-Ponce, L.; Juanes-Méndez, J.A.; García-Peñalvo, F.J.; Pereira, A. Effects of mobile learning in medical education: A counterfactual evaluation. J. Med. Syst. 2016, 40, 136. [Google Scholar] [CrossRef]
- Chang, H.Y.; Wu, H.F.; Chang, Y.C.; Tseng, Y.S.; Wang, Y.C. The effects of a virtual simulation-based, mobile technology application on nursing students’ learning achievement and cognitive load: Randomized controlled trial. Int. J. Nurs. Stud. 2021, 120, 103948. [Google Scholar] [CrossRef] [PubMed]
- Hanson, J.; Andersen, P.; Dunn, P.K. The effects of a virtual learning environment compared with an individual handheld device on pharmacology knowledge acquisition, satisfaction and comfort ratings. Nurse Educ. Today 2020, 92, 104518. [Google Scholar] [CrossRef]
- Arauz, M.; Fuentealba, C.; Vanderstichel, R.; Bolfa, P.; Sithole, F.; Laws, A.; Illanes, O. Development and application of an interactive neuropathology iBook as a complementary learning tool for veterinary medicine students. J. Vet. Med. Educ. 2022, 49, 353–362. [Google Scholar] [CrossRef]
- de la Peña Esteban, F.D.; Lara Torralbo, J.A.; Lizcano Casas, D.; Burgos García, M.C. Web gamification with problem simulators for teaching engineering. J. Comput. High. Educ. 2020, 32, 135–161. [Google Scholar] [CrossRef]
- Ramírez-Donoso, L.; Pérez-Sanagustín, M.; Neyem, A.; Alario-Hoyos, C.; Hilliger, I.; Rojos, F. Fostering the use of online learning resources: Results of using a mobile collaboration tool based on gamification in a blended course. Interact. Learn. Environ. 2023, 31, 1564–1578. [Google Scholar] [CrossRef]
- Wilkinson, K.; Dafoulas, G.; Garelick, H.; Huyck, C. Are quiz-games an effective revision tool in Anatomical Sciences for Higher Education and what do students think of them? Br. J. Educ. Technol. 2020, 51, 761–777. [Google Scholar] [CrossRef]
- Troussas, C.; Krouska, A.; Sgouropoulou, C. Collaboration and fuzzy-modeled personalization for mobile game-based learning in higher education. Comput. Educ. 2020, 144, 103698. [Google Scholar] [CrossRef]
- Andújar-Vaca, A.; Cruz-Martínez, M.S. Mobile instant messaging: WhatsApp and its potential to develop oral skills. Comunicar 2017, 25, 43–52. [Google Scholar] [CrossRef]
- So, S. Mobile instant messaging support for teaching and learning in higher education. Internet High. Educ. 2016, 31, 32–42. [Google Scholar] [CrossRef]
- Wang, Z.; Hwang, G.J.; Yin, Z.; Ma, Y. A contribution-oriented self-directed mobile learning ecology approach to improving EFL students’ vocabulary retention and second language motivation. Educ. Technol. Soc. 2020, 23, 16–29. [Google Scholar]
- Liu, C.; Wan, P.; Tu, Y.F.; Chen, K.; Wang, Y. A WSQ-based mobile peer assessment approach to enhancing university students’ vocal music skills and learning perceptions. Australas. J. Educ. Technol. 2021, 37, 1–17. [Google Scholar] [CrossRef]
- Li, Y.; Hafner, C.A. Mobile-assisted vocabulary learning: Investigating receptive and productive vocabulary knowledge of Chinese EFL learners. ReCALL 2022, 34, 66–80. [Google Scholar] [CrossRef]
- Cavus Ezin, C.; Yilmaz, R. The effect of learning analytics-based interventions in mobile learning on students’ academic achievements, self-regulated learning skills, and motivations. Univers. Access Inf. Soc. 2023, 22, 967–982. [Google Scholar] [CrossRef]
- Martin, F.; Ertzberger, J. Effects of reflection type in the here and now mobile learning environment. Br. J. Educ. Technol. 2016, 47, 932–944. [Google Scholar] [CrossRef]
- Zheng, L.; Li, X.; Chen, F. Effects of a mobile self-regulated learning approach on students’ learning achievements and self-regulated learning skills. Innov. Educ. Teach. Int. 2018, 55, 616–624. [Google Scholar] [CrossRef]
- Fang, W.C.; Yeh, H.C.; Luo, B.R.; Chen, N.S. Effects of mobile-supported task-based language teaching on EFL students’ linguistic achievement and conversational interaction. ReCALL 2021, 33, 71–87. [Google Scholar] [CrossRef]
- Chang, C.; Shih, J.L.; Chang, C.K. A mobile instructional pervasive game method for language learning. Univers. Access Inf. Soc. 2017, 16, 653–665. [Google Scholar] [CrossRef]
- Chu, Z.; Kong, L. Effect of environmental education on students’ environmental knowledge and learning satisfaction–case on mobile learning. J. Environ. Prot. Ecol. 2020, 21, 2389–2396. [Google Scholar]
- Chiang, T.H.C.; Yang, S.J.; Yin, C. Effect of gender differences on 3-on-3 basketball games taught in a mobile flipped classroom. Interact. Learn. Environ. 2019, 27, 1093–1105. [Google Scholar] [CrossRef]
- Hwang, G.J.; Wu, Y.J.; Chang, C.Y. A mobile-assisted peer assessment approach for evidence-based nursing education. Comput. Inform. Nurs. 2021, 39, 935–942. [Google Scholar] [CrossRef]
- Jodoi, K.; Takenaka, N.; Uchida, S.; Nakagawa, S.; Inoue, N. Developing an active-learning app to improve critical thinking: Item selection and gamification effects. Heliyon 2021, 7, e08256. [Google Scholar] [CrossRef]
- Khachan, A.M.; Özmen, A. IMSSAP: After-school interactive mobile learning student support application. Comput. Appl. Eng. Educ. 2019, 27, 543–552. [Google Scholar] [CrossRef]
- Parsazadeh, N.; Ali, R.; Rezaei, M. A framework for cooperative and interactive mobile learning to improve online information evaluation skills. Comput. Educ. 2018, 120, 75–89. [Google Scholar] [CrossRef]
- Song, H.; Cai, L. Interactive learning environment as a source of critical thinking skills for college students. BMC Med. Educ. 2024, 24, 270. [Google Scholar] [CrossRef]
- Wu, Y.L. Gamification design: A comparison of four m-learning courses. Innov. Educ. Teach. Int. 2018, 55, 470–478. [Google Scholar] [CrossRef]
- Cheng, Y.M. Towards an understanding of the factors affecting m-learning acceptance: Roles of technological characteristics and compatibility. Asia Pac. Manag. Rev. 2015, 20, 109–119. [Google Scholar] [CrossRef]
- Al-Emran, M.; Mezhuyev, V.; Kamaludin, A. Technology acceptance model in m-learning context: A systematic review. Comput. Educ. 2018, 125, 389–412. [Google Scholar] [CrossRef]
- Zheng, L.; Long, M.; Zhong, L.; Gyasi, J.F. The effectiveness of technology-facilitated personalized learning on learning achievements and learning perceptions: A meta-analysis. Educ. Inf. Technol. 2022, 27, 11807–11830. [Google Scholar] [CrossRef]
- Wolters, C.A.; Brady, A.C. College students’ time management: A self-regulated learning perspective. Educ. Psychol. Rev. 2021, 33, 1319–1351. [Google Scholar] [CrossRef]
- Palalas, A.; Wark, N. The relationship between mobile learning and self-regulated learning: A systematic review. Australas. J. Educ. Technol. 2020, 36, 151–172. [Google Scholar] [CrossRef]
- Xie, H.; Chu, H.C.; Hwang, G.J.; Wang, C.C. Trends and development in technology-enhanced adaptive/personalized learning: A systematic review of journal publications from 2007 to 2017. Comput. Educ. 2019, 140, 103599. [Google Scholar] [CrossRef]
- Pathak-Shelat, M.; Bhatia, K.V. Reconceptualizing ICTD: Prioritizing place-based learning experiences, socio-economic realities, and individual aspirations of young students in India. Soc. Sci. 2024, 13, 379. [Google Scholar] [CrossRef]
- Kuo, Y.C.; Kuo, Y.T.; Abi-El-Mona, I. Mobile learning: Pre-service teachers’ perceptions of integrating iPads into future teaching. Educ. Inf. Technol. 2023, 28, 6209–6230. [Google Scholar] [CrossRef]
Categories | % |
---|---|
Year | |
2016 | 8.0 |
2017 | 10.0 |
2018 | 14.0 |
2019 | 10.0 |
2020 | 16.0 |
2021 | 28.0 |
2022 | 6.0 |
2023 | 4.0 |
2024 | 4.0 |
Journal | |
British Journal of Educational Technology | 8.0 |
Interactive Learning Environments | 8.0 |
Computer Applications in Engineering Education | 6.0 |
Computers & Education | 6.0 |
Education and Information Technologies | 6.0 |
Universal Access in the Information Society | 6.0 |
Australasian Journal of Educational Technology | 4.0 |
BMC Medical Education | 4.0 |
IEEE Transactions on Learning Technologies | 4.0 |
Innovations In Education and Teaching International | 4.0 |
ReCALL | 4.0 |
Others | 40.0 |
Categories | % |
---|---|
Region | |
Asia | 53.7 |
Europe | 24.4 |
Africa | 7.3 |
America | 4.9 |
Oceania | 4.9 |
Not specified | 4.9 |
Area | |
Foreign language | 17.1 |
Health sciences and human/animal biology | 17.1 |
Computer science and programming | 17.1 |
Cultural heritage/historic buildings | 7.3 |
Engineering (industrial, electric, offshore oil and gas) | 7.3 |
Language, grammar and communication | 7.3 |
Chemistry/biology | 4.9 |
Music | 4.9 |
Education (pre-service) | 4.9 |
Sports | 4.9 |
Creative performance | 4.9 |
Environment | 2.4 |
Number of Participants | |
Less than 50 | 12.2 |
Between 50 and 100 | 46.3 |
More than 100 | 41.5 |
Main Focus | Relevant Findings |
---|---|
Adaptive mobile learning systems | Adapting the content and format (video or text) according to the students’ learning style had the potential to improve their knowledge of the programming language [44]. |
Chatbot | Using this tool improved the scores in the final test. Significant differences exist between the scores obtained by the CG and EG [45]. |
Cloud-based m-learning | It had a positive effect on students’ creative performance [46,47]. It improved both the creative process (problem presentation and answer generation and validation) and the development of creative products. |
Educational application/platform with multiple functionalities | Using educational applications with multiple functionalities (e.g., including class material, forum, chat room, multimedia material, interactive exercises, games, and others) had the potential to positively influence student learning outcomes [25,48,49,50,51,52,53,54,55,56]. |
Immersive experiences | Using augmented reality (AR) technology on mobile devices improved students’ learning performance, especially in the areas of observation, including a deeper understanding of physical objects [13], and retention [57]. The use of mobile device-based augmented reality can be just as good at helping to improve outcomes as other more complex technological equipment (e.g., Microsoft HoloLens) [58]. Using 3D image-based mobile learning [59] and virtual simulation applications [60,61] also helped to improve final test scores. |
Interactive tool versus traditional textbooks | Employing an interactive app to identify sedge species was found to be more effective in improving student test scores than using a conventional textbook [15]. Meanwhile, no statistically significant differences in mean test scores were found between users and non-users of an interactive iBook, although a low student use of the initiative was highlighted [62]. |
Game-based learning | Students who used the game/quiz app obtained higher pass rates and improved their test scores, generally outperforming those who did not use the tool [63,64,65]. However, one study [66] revealed that the conventional version of the game, without the advanced functionalities, showed better results in terms of learning achievement, suggesting that the benefits of these improvements may depend on the context and specific needs of the students. |
Mobile instant messaging | EG participants who used instant messaging applications (WhatsApp, WeChat) to view and share multimedia material, check doubts, and interact with peers, among others, obtained better results in the final tests than the CG [67,68,69]. However, this difference was not observed in the delayed test [69]. |
Mobile peer assessment | WSQ (watch–summary–question)-based peer assessment via a mobile app improved students’ vocal music skill [70]. |
Mobile-based word cards | Both EG and CG improved vocabulary learning; however, EG had better test scores [71]. |
Mobile-supported learning analytics interventions | Customized interventions (feedback, hints, and others) based on learning analytics helped students achieve better final test scores [72]. |
Mobile-supported reflective learning | Mobile device-supported learning with a significant component of reflection (self-guided or facilitated by a virtual expert) contributed to improved student achievement [73]. |
Mobile-supported task-based teaching | Task completion via mobile devices improved students’ reading comprehension [74], conversational comprehension, and vocabulary [75] in English. |
Location-based contextual learning systems | A mobile instructional game positively affected language learners’ learning outcomes (reading, writing, and vocabulary) [76]. Correspondingly, a ubiquitous guidance-learning system improved students’ learning achievements in the Cultural Heritage course [12]. |
M-learning (general) | M-learning positively influenced students’ knowledge acquisition [77] and content retention [32]. |
Technology-enhanced sports instruction | Using mobile applications and multimedia content improved students’ sports skills and knowledge [14,78]. |
Author(s) | Participants | Context | Data Collection | Main Findings | Conclusions |
---|---|---|---|---|---|
Asiri et al. [31] | 60 third-year students from the Department of Electronics and Computer Science at the University of Southampton (UK). | Using mBCI (mobile-based behavior change intervention) technology to deliver real-time feedback to engineering students in the context of a research project. | Online questionnaires before and after the intervention to obtain students’ opinions. Evaluations of the work by academic supervisors and experts. | 57% of the students found the activities useful for learning about critical thinking. Participants in both groups (CG and EG) received low ratings from academics for all nine critical thinking standards. | Students’ self-perceptions of their improvements in critical thinking skills did not match with the results of their teachers’ evaluations. |
Ehsanpur & Razavi [32] | 54 primary school students from Islamic Azad University who chose the language course (Iran) | Assessing the effect of traditional and mobile education. | Learning and study strategies are measured using the LASSI questionnaire, which is based on students’ opinions. | Self-regulation Mean CG: 94.73 Mean EG: 95.66 Skill Mean CG: 66.50 Mean EG: 70.50 | The EG’s level of self-regulation is superior to that of the CG (p < 0.05), particularly in the areas of time management, self-testing, and concentration. Additionally, students who used m-learning perceived better information processing skills (p < 0.05).
Hu & Hwang [1] | 56 first and second-year students on a Museum Introduction Course (China). | Using the self-adaptive mobile CMPP (concept mapping-based problem-posing) approach within the context of a virtual museum. The students took notes and screenshots to complete the teacher’s task sheet, then formulated a question and attempted to answer it using a self-adaptive concept map on their mobile devices. | Student self-perception questionnaires were used. | Critical thinking Mean CG: 3.66 Mean EG: 3.88 Metacognition Mean CG: 3.49 Mean EG: 4.05 Problem-solving Mean CG: 3.56 Mean EG: 4.00 Computational thinking Mean CG: 3.19 Mean EG: 3.58 | The proposed strategy could significantly improve students’ higher order thinking skills such as critical thinking (p < 0.05), metacognition (p < 0.01), problem solving (p < 0.01) and computational tendency (p < 0.01). |
Hwang et al. [79] | 36 second-year students from a nursing school. | Proposing a problem, intervention, comparison, and outcome approach based on mobile device-assisted peer assessment. Nursing students can use the app to read, organize, and critique scientific articles. | Questionnaire for critical thinking ability. | Mean CG: 4.12 Mean EG: 4.22 | The EG who learned through the proposed approach significantly improved self-perceived critical thinking scores (p < 0.05).
Inel-Ekici & Ekici [9] | 80 students from the Science Education and Primary Education departments of Usak University (Turkey). | Implementing and comparing the effects of face-to-face inquiry-based learning (IBL) and mobile inquiry-based learning (m-IBL). | Participants’ experiences were measured through a questionnaire with open-ended questions. | Students who participated in the activity perceived an increase in their awareness of scientific inquiry and learning retention. They further highlighted that it helped them learn meaningfully and improved their thinking skills. | The mobile app augmented the effects of the traditional method. |
Jodoi et al. [80] | Two studies were conducted. The first study included 73 students from two Japanese universities, while the second study involved 114 students divided into 3 main groups (Group A = University A, Group B = University B, Group C = People who had experience in debate activities). | Developing a mobile application and testing its effects (with and without gamification) on developing critical thinking skills. | Study 1 Questions on reasoning and critical thinking. Study 2 Pre-test and post-test to measure critical thinking skills. | Study 1 Average answer rates University A: 0.76 University B: 0.85 Study 2 With gamification Group A Pre-test: 0.80 Post-test: 0.89 Group B Pre-test: 0.77 Post-test: 0.91 Without gamification Group A Pre-test: 0.77 Post-test: 0.88 Group B Pre-test: 0.82 Post-test: 0.96 | The app helped improve students’ pre-and post-test scores (p < 0.05), which included questions related to verbal reasoning, logical reasoning, and judgment skills. However, no apparent effect of gamification was observed. |
Khachan & Özmen [81] | 50 students from Introduction to Computer Networks, Discrete Mathematics, Data Mining, and English Language courses (Turkey). | Introducing the IMSSAP application (interactive mobile-learning student support app) designed to integrate social interaction and education. | Questionnaire to measure student perception. | Critical thinking Mean: 3.72 | Students felt that the app helped them improve their critical thinking and analytical skills. |
Parsazadeh et al. [82] | 67 second-year students from the Computer Science Department of a university in Kuala Lumpur (Malaysia). | Featuring a mobile application that seeks to improve online information evaluation skills. | A pre-test and post-test (multiple choice and writing task) on online information evaluation skills were conducted. | The Mann-Whitney U test revealed significant differences (p < 0.01) in post-tests between EG and CG students in both multiple-choice tasks and written essays. | The app improves students’ online information evaluation skills more effectively than the traditional method does.
Song & Cai [83] | 60 first-year students of Philology from Qiqihar University (China). | Using the game application Lumosity: Brain Training in the students’ learning process and evaluating its effects. | A pre-test and a post-test based on the Critical Thinking Skills Success methodology were conducted. | Pre-test Mean CG: 14.92 Mean EG: 22.13 Post-test Mean CG: 15.12 Mean EG: 24.50 | The EG improved their critical thinking skills compared to their initial skills. Additionally, they significantly outperformed the CG (p < 0.05).
Wu [84] | 228 students on an Information Ethics and Law Course. | Assessing the effects of using a mobile application to gamify learning activities. | Objective evaluation of reviewers. | Classroom with gamified app More than 70% of the students showed a level of application or analysis of knowledge in their exercises. More than 60% performed a satisfactory analysis of the evidence. More than 70% presented a logical organization with good connections between ideas, and more than 80% achieved a logical and interesting sequence in their text. | The class that used games generally presented good levels of performance in content knowledge, analysis, synthesis and organizational skills. |
Zheng et al. [74] | 60 first-year students majoring in educational technology, psychology, environment, chemistry, or management science. | Examining the effects of using a mobile self-regulated learning approach. This system allows students to set goals and plans, monitor learning processes, reflect on themselves, and self-assess. | Self-regulated learning questionnaire. | Mean CG: 4.67 Mean EG: 5.13 | The students who used this new system improved their self-regulated learning skills, showing higher levels than the CG (p < 0.01). |
Zhu [56] | 119 Offshore Oil and Gas Engineering students (China). | Exploring the Rain Classroom mobile application which incorporates real-time communication, records the teaching process during classes, integrates learning activities, and assesses student perceptions. | Student perception questionnaire. | Understanding of the theoretical knowledge Mean: 4.65 Memory and logic thinking abilities Mean: 4.62 | Students consider that this tool improved their comprehension, memory, and logical thinking skills. |