Article

Exploring the Impact of Mobile Exams on Saudi Arabian Students: Unveiling Anxiety and Behavioural Changes across Majors and Gender

by Mostafa Aboulnour Salem 1,* and Ali Saleh Alshebami 2,*
1 Development and Quality Assurance, King Faisal University, Al-Ahsa 31982, Saudi Arabia
2 Applied College, King Faisal University, Al-Ahsa 31982, Saudi Arabia
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(17), 12868; https://doi.org/10.3390/su151712868
Submission received: 25 July 2023 / Revised: 17 August 2023 / Accepted: 24 August 2023 / Published: 25 August 2023
(This article belongs to the Special Issue Teaching Methods in Sustainable Education)

Abstract

Students’ anxiety and behavioural changes while using different examination methods (paper, PC, and mobile exam platforms (MEPs)) were estimated. The influence of academic majors and gender was also determined by answering the following questions: How do anxiety and behavioural changes vary among students using different exam methods? How do students’ anxiety levels vary according to academic majors and gender while using different exam methods? A survey was conducted with 826 students enrolled in eight colleges at King Faisal University in Saudi Arabia. The results revealed less anxiety and fewer harmful behavioural changes among students using MEPs compared with other methods. Furthermore, less anxiety and fewer behavioural changes were observed among health and science majors than humanities and social science majors and among female students than male students while using MEPs. Therefore, MEPs should be gradually adopted by higher education institutions in Saudi Arabia, especially for humanities and social science majors and for male students. Furthermore, researchers and decision-makers should find unique solutions to reduce the positive correlation between anxiety and behavioural changes among health and science majors and female students towards MEPs. This can be achieved by identifying obstacles and introducing modern solutions, such as AI-generated exams, among others.

1. Introduction

The SARS-CoV-2 (COVID-19) outbreak has led to a significant global crisis, owing to the rapid spread of the virus and the increased incidence of sickness and mortality. To control and reduce COVID-19 transmission, several countries, including the Kingdom of Saudi Arabia (KSA), implemented necessary measures and precautions to flatten the pandemic curve [1]. The number of learners affected by the closure of academic institutions reached 1.3 billion across almost 150 countries worldwide in April 2020, dropping to about 100 million affected by roughly 20 country-wide closures in September 2021 [2]. Likewise, higher educational institutions (HEIs) experienced massive disruptions in their teaching and assessment processes, accelerating the adoption of digital and online technologies to open promising learning prospects [3]. A recent global survey of HEIs conducted by the International Association of Universities (IAU) (N = 424, 109 countries) showed that more than 90% of surveyed institutions had already replaced classroom instruction with online teaching and assessment, or were in the process of developing solutions to continue operations amid the COVID-19 pandemic [4,5].
This shift to online education raised fundamental challenges for educational institutions, especially in organising evaluations and ensuring that students progressed in their assessments [6]. In recent years, online exams have been introduced as a practical and critical element of online education [7]. Several articles have shown that students favoured online exams because of their many advantages in accurately measuring cognitive achievements, skills and competencies [8,9]. According to Dikmen [10], online exams increased higher education students’ digital abilities in relation to the 21st-century skills that students should acquire and develop. Therefore, numerous universities have sustained online exams as a pillar of online learning post-COVID-19. In the KSA, HEIs have acknowledged online exam platforms as the best and most sustainable mode of assessment alongside paper-based exams post-COVID-19 [11,12,13]. In the same vein, several articles have investigated the reactions of HEIs towards mobile learning platforms beyond COVID-19, emphasising the benefits, opportunities and significance of embracing mobile learning to ensure the effectiveness of education [14,15]. Recent studies have shown that many students prefer the mobile learning approach as the most helpful option for maintaining the quality of digital learning and assessment beyond COVID-19 [16,17]. Other articles have reported that mobile exam platforms (MEPs) supported the success of online exams and helped manage changes in the assessment process in ways that reflected knowledge structure shifts and educational competencies beyond COVID-19 [14,18].
According to Stöber [19], most higher education students experience anxiety, stress and fear before and during assessments (also known as assessment or exam anxiety). Exam anxiety manifests in three ways: cognitive, psychological and emotional [20]. According to Beck et al. [21], ‘assessment anxiety’ refers to a group of psychological, phenomenological and physical responses that arise from the potential adverse effects of obtaining low scores or being unable to answer exam questions correctly. Similarly, Cassady et al. [22] reported poor academic performance among students exhibiting assessment anxiety. Maloney et al. [20] indicated that assessment anxiety causes distorted behavioural changes among higher education students. Recent studies have reported significant behavioural changes, such as insomnia and difficulty sleeping at night, dry mouth, nausea, diarrhoea, constipation, sensing discomfort and wanting to finish the assessment quickly, among higher education students taking exams [23].
Beyond COVID-19, higher education students had diverse impressions about anxiety and behavioural changes while using different exam methods (paper, PC and MEPs). For example, although online MEPs helped these students overcome the negative impacts of COVID-19, they also caused stress and anxiety compared with paper exams [14]. According to Salem and Elshaer [15] and Melgaard et al. [9], low levels of anxiety while taking mobile exams during the COVID-19 pandemic contributed to educators and students preferring MEPs over paper exams. Surprisingly, students’ anxiety about mobile exams was not caused by their fear of making mistakes or being unable to pass exams, as in paper exams [16]. In fact, Dikmen [10] and Abdelwahed et al. [13] showed that during online exams, many students were anxious about losing self-confidence and about being unable to write answers correctly, especially for essay questions. Moreover, during online exams, students felt constant anxiety about the possibility of power outages, poor Internet connections and device failures (PC, mobile and others).
Various studies in emerging countries have investigated the multiple benefits of MEPs. In China, online exams have been used appropriately in online learning programmes. The results revealed a stronger relationship between students’ academic achievement and online exam marks than with formal assessment marks, as well as positive perceptions among students [24,25]. At the same time, other studies have reported shortcomings related to assessment duration, along with concerns about potential technical problems that may occur during the implementation of MEPs [26].
In India, a study used quantitative analysis methods and reported students’ positive attitudes towards MEPs [11]. Meanwhile, no statistically significant difference was found in the levels of satisfaction between online and traditional assessments [27]. Other studies have indicated shortcomings related to hardware devices and Internet accessibility while using MEPs [28]. In Thailand, the outcomes of adopting MEPs showed no disparity in students’ perspectives according to GPA or gender; the findings also indicated students’ positive outlooks towards MEPs [29]. Moreover, students’ satisfaction with MEPs confirmed that the online mode of examination competed with paper exams in terms of evaluating intended outcomes [30].
Some articles [31,32,33,34] reported that higher education students in the United Arab Emirates held a favourable outlook towards MEPs in terms of their ability to edit their answers efficiently and regarded them as a secure assessment system. The studies further revealed that students had no difficulties using MEPs, although they showed anxiety and behavioural changes. In Egypt, recent articles on students’ negative perceptions of MEPs indicated that they had an anxious outlook towards online exam methods and consistently felt that they were a non-secure assessment system [35,36]. Other articles showed that during MEPs, students were worried about being unable to write the correct answers to essay-type questions [35,37]. In the KSA, higher education students have an overall positive perception of MEPs [2]. Similarly, recent articles have reported that most higher education students in the KSA hold positive attitudes towards MEPs [11,12]. In contrast, other studies found that students were anxious about making mistakes in choosing answers, being unable to pass assessments and losing self-confidence during MEPs [13,38]. They also showed behavioural changes during MEPs (e.g., dry mouth and nausea, diarrhoea and constipation) [2,39].
Additionally, anxiety during online exams has been shown to vary depending on students’ academic majors [40,41]. For instance, according to Elsalem et al. [42] and Eltahir et al. [43], health and science majors always feared making mistakes in entering answers, especially mathematical calculations and science equations, which required more time to enter than simply deriving the solution on paper. Hence, they preferred paper assessments to online ones. The degree of anxiety has also been shown to vary according to gender, as females experienced more pressure than males, tending to lose their sense of comfort and reassurance and perceiving online exams as unable to accurately measure their skills [40]. Female students were also more worried about the harmful effects of online exams on their GPAs [44].
The investigators assessed and analysed over 500 articles examining different exam methods (i.e., paper, PC and MEPs) and the associated anxiety and behavioural changes among students in HEIs in the KSA. This review revealed a lack of studies (in Arabic and English) assessing students’ anxiety or behavioural changes while using online MEPs and how these are affected by students’ majors and gender. Instead, most previous reports have focused on anxiety and on cognitive, affective, physiological and social measures during exams in general. Hence, given the widespread use of MEPs in HEIs in the KSA post-COVID-19, several articles have highlighted the need to examine students’ anxiety and behavioural changes while using MEPs and to compare their anxiety levels while using different exam methods (paper, PC and MEPs).
Therefore, the main objectives of the current research were to understand the anxiety and behavioural changes associated with MEPs among university students in the KSA and to determine how these changes are influenced by students’ majors and gender. Hence, the current study endeavours to answer the following research questions:
Q1. How do higher education students’ anxiety and behavioural changes vary while using different exam methods (paper, PC and MEPs)?
Q2. How do students’ anxiety levels differ according to their academic majors (health and sciences and humanities/social sciences) while using different exam methods?
Q3. How do students’ anxiety levels differ according to their gender (female and male) while using different exam methods?
Accordingly, the current research proposes the following hypotheses:
H1. 
Negative anxiety and behavioural changes occur among students during exams while using MEPs compared with other methods.
H2. 
Positive anxiety and behavioural changes during MEPs differ between health and science majors and humanities and social science majors, in favour of the former.
H3. 
Positive anxiety and behavioural changes during MEPs differ between female and male students, in favour of the former.

2. Research Methodology

2.1. Research Problem and Challenges Encountered

After the COVID-19 pandemic and before initiating the study, the investigators observed that some students enrolled in King Faisal University (KFU), KSA, had negative attitudes towards using online exams, especially MEPs. Furthermore, they were reluctant to take online exams. The investigators reviewed several articles (e.g., [1,2,3,4,5,6,7]) regarding the causes of negative attitudes towards exams among higher education students. The reports indicated that some of these reasons correlate with assessment anxiety and behavioural changes associated with exam methods or devices. Therefore, the investigators surveyed a random sample of KFU students and asked them about the following items:
Which exam method (paper, PC, or smartphone) do you prefer?
Which device do you always use to access online exam platforms (PC or smartphone)?
Which exam method (paper, PC, or smartphone) do you feel less anxious about?
With which exam method (paper, PC, or smartphone) do you experience fewer harmful behavioural changes?
The students’ responses revealed that 96.8% preferred exam methods based on smartphones and used smartphones to access online exam platforms. Furthermore, 90.18% self-reported experiencing anxiety and behavioural changes correlated with both the paper and PC exam methods. About 46.04% of the respondents felt that paper exams caused more significant anxiety, while 31.85% reported that PC exams caused more significant anxiety. Meanwhile, 22.11% believed smartphone exams caused less anxiety and fewer behavioural changes. However, 9.82% of the respondents had no concerns about exams.
Likewise, academic majors were significantly associated with self-reported students’ experiences of anxiety during assessments. Among the health and science students, 66.96% reported anxiety while doing online exams, whereas 40.74% of humanities and social science students reported the same outcomes. In addition, gender was significantly associated with self-reported anxiety during online exams. Among those surveyed, 43% of female students reported anxiety with online exams, while 29.17% of male students reported the same anxiety. Hence, these results motivated the investigators to initiate the current study.

2.2. Research Objective

The present research aimed to investigate the anxiety and behavioural changes associated with using different exam methods (paper, PC and MEPs) among higher education students in the KSA to identify the best exam method to use. The study also aimed to determine which method resulted in the lowest degrees of anxiety and behavioural changes in relation to students’ academic majors and gender. Accordingly, an instrument was developed to measure anxiety and behavioural changes while using different exam methods (paper, PC and MEPs), and its reliability and validity were assessed before implementation.

2.3. Research Sample and Population

The research population included bachelor’s degree students enrolled in all colleges of KFU located in Al-Ahsa, Eastern Region, KSA. The colleges relied considerably on MEPs (e.g., Blackboard and Question Mark) to manage online exams after the COVID-19 pandemic. According to statistics, in 2021, over 7000 first-year students registered in 14 colleges at KFU. The research targeted 826 students in eight KFU colleges in the areas of health and science (Agriculture and Food Sciences, Science, Clinical Pharmacy and Applied Medical Sciences) and humanities and social science (Arts, Law, Education and Business Administration). The minimum sample size required for this research was determined based on three factors: (1) the students’ population size; (2) the margin of error, which was set at ±5%; and (3) the confidence level, which was set at 95% for this study design.
According to Hill [45], the sample size should be based on the total number of items, with at least five responses per item; thirty-three items were used in the current study. Muthén [46] concluded that an appropriate sample should include more than 150 respondents. Hence, the sample size for the current study (826 valid responses for analysis) can be considered appropriate (Table 1).
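For illustration, the following minimal sketch (ours, not part of the original study) reproduces the thresholds described above, assuming a standard Cochran-type calculation with p = 0.5 and a finite-population correction; the function name and parameters are hypothetical.

```python
import math

def cochran_min_sample(population: int, margin: float = 0.05,
                       z: float = 1.96, p: float = 0.5) -> int:
    """Minimum sample size for estimating a proportion at the given margin of
    error and confidence level (z = 1.96 for 95%), with finite-population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# Design parameters stated in the text: ~7000 first-year students, +/-5% margin, 95% confidence.
print(cochran_min_sample(7000))   # about 365 respondents

# Rule-of-thumb checks cited in the text: five responses per item (33 items) and n > 150.
print(33 * 5)                     # 165 respondents
# The 826 valid responses comfortably exceed all of these thresholds.
```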

2.4. Instrument

To develop a draft of the research instrument (survey) examining students’ anxiety and behavioural changes while using different exam methods (paper, PC and MEPs), and to answer the first research question, the investigators reviewed previous studies related to anxiety and behavioural changes associated with different exam methods (paper, PC, online, mobile and others), focusing on their dimensions, components, symptoms and measurements. The researchers also reviewed and analysed the literature related to learning theories, such as Taylor and Spence’s theory of anxiety, information processing theory and Skinner’s behaviour theory. Furthermore, the investigators reviewed surveys that measured exam anxiety and associated behavioural changes, such as the scales introduced by Sarason and Spielberger, as well as the Behavioural Characteristics Progression and the Vineland Adaptive Behaviour tests. After examining and analysing the literature, the first draft of the research survey was developed (Figure 1).
Notably, the investigators aimed to establish a survey specifically designed for the current research rather than use any other anxiety survey. The reason was that we wanted to ensure that the final form of the survey matched the objectives of the research, the ages of the sample (18–22 years), the research population (university education), sample characteristics (adults) and data gathering environment (KSA).
The research instrument (survey) included 33 items representing two dimensions: exam anxiety (20 items, α = 0.912) and behavioural changes during online exams (13 items, α = 0.904). The exam anxiety variables were operationalised on a two-point scale (‘Yes’ or ‘No’), in which the students selected one of two options to indicate whether they experienced anxiety. The behavioural change variables were operationalised on a five-point Likert scale (‘Strongly agree’, ‘Agree’, ‘Undecided’, ‘Disagree’ and ‘Strongly disagree’). The students selected one of five options to indicate the degree to which they experienced anxiety and behavioural changes while using different exam methods (paper, PC and MEPs).
The research instrument (survey) was sent to 30 experts specialising in instructional technology, digital learning, information technology, psychology and medicine in Middle Eastern and other countries. This step aimed to assess the validity of the research instrument (survey). Responses were collected from only 24 experts (six did not respond; see Table 2). The experts were identified through personal networks and recommendations from colleagues.
Accordingly, Cronbach’s alpha coefficient and McDonald’s omega were employed to assess the survey’s reliability. Confirmatory factor analysis (CFA) was conducted to determine the survey’s convergent and discriminant validity. The instrument included three sections: students’ sociodemographic data, anxiety levels while using different exam methods (paper, PC and MEPs) and their behavioural changes (See Supplementary Materials).
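As an illustrative sketch of the reliability step (not the authors’ actual analysis script), Cronbach’s alpha can be computed directly from an item-response matrix; the data frame, column names and simulated responses below are hypothetical. McDonald’s omega and the CFA-based validity checks would additionally require a latent-variable model and are not sketched here.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of numeric scores."""
    items = items.dropna()
    k = items.shape[1]                              # number of items in the dimension
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 13 behavioural-change items coded 1-5 (five-point Likert scale).
rng = np.random.default_rng(0)
demo = pd.DataFrame(rng.integers(1, 6, size=(826, 13)),
                    columns=[f"bc_{i}" for i in range(1, 14)])
print(round(cronbach_alpha(demo), 3))  # random data, so alpha will be close to zero
```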

2.5. Survey Implementation and Quality Control

This study’s population comprised all students enrolled in KFU. According to Statista, more than 10,000 students were enrolled in 14 colleges in 2021. This study examined students in colleges that decided to continue using mobile exam platforms in most courses post-COVID-19 (see Table 1). To ensure that students were neither over- nor underrepresented, 800 paper surveys were delivered to each college. Then, an online survey was sent out through links that generated 1000 questionnaires. A total of 826 questionnaires with relevant data for analysis were received, with an overall response rate of approximately 82%. The research team delivered the paper questionnaires to students through their educators working in their colleges. The same educators were asked to send the questionnaire link to their students through personal networks, such as WhatsApp groups or emails, in February 2022. No pressure was exerted on the students; they were told that the study was only for research purposes and that their answers would be kept anonymous. Participation was optional and anonymous, and all the required precautions were taken to ensure the confidentiality of the data generated. To guarantee the respondents’ anonymity, all personally identifiable information was removed from the publicly available analyses. In addition, the provision of sensitive items (e.g., name, age and names of their colleges) was optional. Replies were checked daily and monitored for eight weeks, during which the investigators reviewed the responses, addressed any obstacles faced by the students and answered all their enquiries to ensure proper survey implementation and quality control. Finally, the link was closed, and data analysis was initiated.

2.6. Data Analysis Methods

The statistical package SPSS version 25 (SPSS Inc., Chicago, IL, USA) was employed to analyse the data. Descriptive statistics were used to describe the essential characteristics of the participants, such as academic major and gender. Cross-tabulation and Chi-square tests were conducted to identify the association between students’ experiences with different exam methods (paper, PC and MEPs), as well as their anxiety and behavioural changes.
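As a scriptable illustration of this cross-tabulation and chi-square workflow (the study itself used SPSS), the sketch below applies scipy’s Pearson chi-square test to a gender-by-method contingency table; the cell counts are invented for demonstration and are not the study’s data.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation: gender vs. the exam method reported as most anxiety-inducing.
# Cell counts are illustrative only (row totals merely mirror the sample sizes in Table 1).
crosstab = pd.DataFrame(
    {"Paper/PC": [211, 134], "MEPs": [98, 118], "All methods": [89, 74], "None": [26, 76]},
    index=["Female", "Male"],
)

chi2, p_value, dof, expected = chi2_contingency(crosstab)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```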

3. Results

3.1. Anxiety among Majors and Genders While Using Different Exam Methods

Analysis was conducted using the Pearson chi-square test to investigate respondents’ experiences with different exam methods (paper, PC and MEPs). More than 40% of respondents reported that paper and PC exams induced more anxiety, compared with more than 25% who reported this for online MEPs. Likewise, more than 15% of the respondents reported that all three exam methods produced anxiety, whereas only 12% did not express any anxiety about taking exams.
Students’ academic major was significantly associated with their self-reported experience of anxiety while using different exam methods (paper, PC and MEPs). The students in health and science majors (Clinical Pharmacy, Applied Medical Sciences, Sciences and Agriculture and Food Science) experienced more positive anxiety during MEPs than those in humanities and social science majors (Arts, Law, Education and Business Administration). Furthermore, students’ gender was significantly associated with self-reported anxiety while using different exam methods (paper, PC and MEPs). In particular, female students from all faculties reported more positive anxiety about MEPs than male respondents (see Table 3).

3.2. Anxiety Levels While Using Different Exam Methods

Table 4 compares students’ anxiety levels while using different exam methods (paper, PC and MEPs). The comparison included four response categories: paper/PC exams, MEPs, all three exam methods and no anxiety. Moreover, students were asked about 20 items that might have contributed to their anxiety while using different exam methods. The Pearson chi-square analysis of anxiety factors and their link with students’ experiences using different exam methods showed a significant association between students’ experiences and anxiety. The results revealed students’ positive anxiety while using paper and PC exams compared with MEPs. In particular, the students seemed to have highly positive anxiety due to their fear of making mistakes in entering answers (83.13%), fear of not passing because of using this specific method (81.28%) and being anxious and uncomfortable while taking the exam (81.02%). Meanwhile, the students appeared to have low positive anxiety during paper and PC exams: they were not afraid of running out of time (39.95%), had confidence in the accuracy of paper and PC exams in measuring their cognitive achievements (37.67%) and generally considered the paper and PC exam methods better than others (26.92%).
In comparison, the students reported having positive anxiety while taking exams via online MEPs: they felt more challenged during exams (32.30%), felt anxious and uncomfortable (31.71%) and felt like a failure every time (31.25%). At the same time, the students appeared to have low positive anxiety while taking exams via online MEPs: they were anxious about technical issues that might arise during the exam (21.90%), felt that this exam method could not accurately measure their proficiency (20.94%) and were unable to write the correct answers to essay questions using this method (19.78%).

3.3. Behavioural Changes While Using Different Exam Methods

Table 5 compares students’ behavioural change levels while using different exam methods (paper, PC and MEPs). The comparison included four response categories: paper/PC exams, MEPs, all three exam methods and no behavioural changes. Moreover, students were asked about 13 items that might have contributed to their behavioural changes while using the different exam methods. The Pearson chi-square analysis of behavioural change factors and their link with students’ experiences while using different exam methods showed a significant association between students’ experiences and behavioural changes.
The results showed that students experienced negative behavioural changes while using paper and PC exams compared with MEPs. Such negative behavioural changes included feeling sick and having a stomach ache before exams (57.55%); heart palpitations during exams (51.34%); dry mouth, nausea, diarrhoea and constipation during exams (51.34%); and exhaustion and thinking excessively during exams (51.21%). The students also appeared to have low negative behavioural changes during paper and PC exams, such as wanting to eat more food and junk food during exams (21.23%), to drink caffeine and high-energy drinks during exams (18.87%) and to smoke excessively during exams (11.12%).
Similarly, the students appeared to have negative behavioural changes while taking exams via MEPs, such as distraction and lack of focus during exams (38.78%); having heart palpitations during exams (23.58%); dry mouth, nausea, diarrhoea and constipation during exams (21.30%); and feeling uncomfortable during exams and wanting to finish the assessment quickly (20.59%). In contrast, the students appeared to have low negative behavioural changes while taking exams via online MEPs, such as sweating or feeling cold during exams (16.32%), wanting to smoke excessively during exams (10.14%) and feeling sick and having a stomach ache before exams (8.12%).

4. Discussion

The COVID-19 outbreak has caused a significant transformation in the learning life cycle implemented in HEIs worldwide. The pandemic has also expanded the adoption of online exam platforms (OEPs), especially online MEPs [47,48,49,50]. In the KSA, the sustainability of OEP adoption in HEIs is manifested by the fact that it continues to be used as an evaluation method beyond COVID-19 [2,12,51]. Related to this phenomenon, the current research aimed to understand students’ anxiety and behavioural changes associated with the use of online MEPs among KFU students in the KSA. The study also aimed to determine how students’ majors and gender influenced such anxiety and behavioural changes.
The results revealed that 85% of students self-reported experiencing anxiety and behavioural changes while using different exam methods (paper, PC and MEPs). About 77.42% preferred online MEPs over paper and PC exams, which might be explained by students’ fear of making mistakes in entering answers during the MEPs, fear of being unable to pass the assessment online, losing self-confidence during exams, incompetence in writing the answer correctly in essay questions, feeling anxious and uncomfortable during exams via MEPs, the challenges associated with the online review and feeling constantly anxious about the possibility of a computer failure, power outage or Internet connection issues. Given that the students’ primary concerns were passing the final assessments and completing their courses with high grades, they were worried about the harmful effects of MEPs on their GPAs. Apart from the fact that most of the students were unfamiliar with the MEPs taken via online platforms, such as Blackboard, they were also anxious and worried about whether they would do well, similar to the findings of past studies [2,12,47,48,49,51].
More than one-third of the respondents (39%) thought that the paper and PC exam methods generated more anxiety than the MEPs (reported by 30%). These results can be attributed to students’ sense of motivation, comfort and reassurance while using MEPs, along with the multiple benefits of using a smartphone, such as ease of use and the ability to auto-correct words when answering essay questions [3,12,52,53]. More than 13% of the respondents thought that all three exam methods made them anxious. In particular, these students felt no sense of comfort and reassurance in any exam because they thought that the exams did not accurately measure their skills. These students also frequently used social media before exams in all three exam methods and thought that none of the methods accurately measured their proficiency, similar to past studies [54,55,56].
Meanwhile, over 18% of the respondents reported that none of the exam methods (paper, PC and MEPs) caused them anxiety, although they preferred MEPs because of their many advantages, such as accurately measuring their cognitive achievements and skills. These students also felt that using a smartphone during exams increased their motivation and sense of self-confidence, comfort and reassurance. Furthermore, this exam method allowed them to quickly answer the online assessment questions, focus on the exam without fear of running out of time, do an excellent job and obtain better grades, similar to the findings of past studies [57,58,59,60].
The results further revealed that students’ experiences of anxiety while using different exam methods (paper, PC and MEPs) were associated with their academic majors. In particular, students in health and science majors reported having more positive anxiety about MEPs than students in humanities and social science majors. The students in health and science majors also preferred paper and PC exams. This might be explained by the fact that they feared making mistakes in entering answers, especially mathematical calculations and scientific equations, as entering these on MEPs required more time than simply deriving the solution on paper. They also feared being unable to pass the assessment online due to their diminished self-confidence and constant anxiety about the possibility of a smartphone failure, power outage, or Internet connection issues during online learning, similar to the findings of other studies [61,62,63,64]. Conversely, students in humanities and social science majors preferred MEPs because they required less time to complete than paper-based assessments. They also thought that MEPs accurately measured their cognitive achievements, skills and competencies and reported that using MEPs increased their motivation. They were also able to answer the online assessment questions quickly and focus on exams, similar to past studies [55,56,65,66].
Furthermore, the experience of anxiety while using different exam methods (paper, PC and MEPs) among participants was found to be associated with the students’ gender. In particular, the results indicated higher levels of anxiety about MEPs among female students than among male students. Female students generally preferred paper and PC exams, which might be attributed to how they responded to anxiety events, whereas male students were less expressive of their worries. Disparities in emotional intelligence and academic anxiety among females might also have contributed to these observations, similar to other studies [67,68,69,70,71,72].
The results showed a positive relationship between anxiety and behavioural changes among students while using different exam methods (paper, PC and MEPs), as with other studies [42,73,74,75]. The positive relationship between anxiety resulting from paper and PC exams and behavioural changes can be due to students’ exhaustion and excessive thinking, distraction and lack of focus, heart palpitations and feeling very sick and having a stomach ache while using online MEPs. These findings are in accordance with those reported in previous studies [42,76,77].
The significant behavioural changes experienced by students while taking paper and PC exams included insomnia and difficulty sleeping at night; enjoying exercise activities; dry mouth, nausea, diarrhoea and constipation; feeling uncomfortable and wanting to finish the assessment quickly; feeling nervousness and anger; and sweating and feeling cold, similar to past studies [2,78,79]. At the same time, the results indicated a low correlation between students’ behavioural changes and anxiety while taking exams via MEPs, as manifested by their desire to eat food and junk food, drink caffeine and high-energy drinks and smoke excessively before exams, similar to past studies [42,77,78,80].

5. Conclusions

In this study, more than 40% of the students reported anxiety during paper and PC exams, more than 25% felt anxiety during exams taken via online MEPs, almost 20% reported experiencing anxiety during all three exam methods, and more than 10% did not report any form of anxiety. The main factors related to students’ anxiety included worrying about making mistakes in writing answers, fear of being unable to pass, losing self-confidence, difficulty writing answers correctly to essay questions and feeling anxious and uncomfortable. Thus, we recommend that decision-makers in HEIs adopt MEPs to reduce exam anxiety, utilise mobile adaptive assessment platforms and adopt AI technology to help develop sustainable, anxiety-free assessment environments.
Additionally, the results revealed that behavioural changes centred on heart palpitations; insomnia and difficulty sleeping at night; exhaustion and excessive thinking; distraction and a lack of focus; and feeling sick and having a stomach ache before online exams. Accordingly, awareness and coaching are needed to encourage students to control and manage such behavioural changes during periods of anxiety, which can also improve their sleep quality and increase their physical activity. The article concludes that there is a need for collaboration among social scientists, psychologists, psychosocial specialists, educators and humanities scholars to create better educational policies and pedagogical practices to develop a robust and sustainable online examination system that can be utilised in HEIs in the KSA.

6. Limitations and Future Works

As with other studies, the current article has some limitations that must be tackled in future works to better investigate the relationship between anxiety and behavioural changes during MEPs among higher education students in the KSA. First, the data were collected from a small sample of students enrolled in HEIs in the KSA. Hence, the generalisability of the findings to other universities in the KSA, the Gulf, other Middle Eastern countries or other geographical locations should be treated cautiously. Second, this article only applied the quantitative analysis method. However, future research could integrate qualitative and quantitative approaches to discover further reasons and associations, such as students’ motives, emotions and feelings during exams. Finally, other mediating and moderating variables, such as adaptive exams or AI-based exams, could be incorporated into future research.

Supplementary Materials

Click https://cutt.us/uRQWy to view and download the research instrument (survey) used in the current work.

Author Contributions

The authors M.A.S. and A.S.A. contributed to conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, visualization, supervision, project administration and funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia [Grant No. 3988].

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Deanship of Scientific Research Ethical Committee, King Faisal University (date of approval: 1 January 2022).

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy concerns and the need to maintain respondent anonymity.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alrasheed, H.; Althnian, A.; Kurdi, H.; Al-Mgren, H.; Alharbi, S. COVID-19 spread in Saudi Arabia: Modeling, simulation and analysis. Int. J. Environ. Res. Public Health 2020, 17, 7744. [Google Scholar] [CrossRef] [PubMed]
  2. Alshammari, T.; Alseraye, S.; Alqasim, R.; Rogowska, A.; Alrasheed, N.; Alshammari, M. Examining anxiety and stress regarding virtual learning in colleges of health sciences: A cross-sectional study in the era of the COVID-19 pandemic in Saudi Arabia. Saudi Pharm. J. 2022, 30, 256–264. [Google Scholar] [CrossRef] [PubMed]
  3. Salem, M.A.; Alsyed, W.H.; Elshaer, I.A. Before and Amid COVID-19 Pandemic, Self-Perception of Digital Skills in Saudi Arabia Higher Education: A Longitudinal Study. Int. J. Environ. Res. Public Health 2022, 19, 9886. [Google Scholar] [CrossRef]
  4. Marinoni, G.; Van’t Land, H.; Jensen, T. The impact of COVID-19 on higher education around the world. IAU Glob. Surv. Rep. 2020, 23, 1–17. [Google Scholar]
  5. Reimers, F.M.; Marmolejo, F. Leading Learning During a Time of Crisis. Higher Education Responses to the Global Pandemic of 2020. In University and School Collaborations during a Pandemic. Knowledge Studies in Higher Education. 2022, pp. 1–41. Available online: https://link.springer.com/chapter/10.1007/978-3-030-82159-3_1 (accessed on 1 July 2023).
  6. Montenegro-Rueda, M.; Luque-de la Rosa, A.; Sarasola Sánchez-Serrano, J.L.; Fernández-Cerero, J. Assessment in higher education during the COVID-19 pandemic: A systematic review. Sustainability 2021, 13, 10509. [Google Scholar] [CrossRef]
  7. Gamage, K.A.; Wijesuriya, D.I.; Ekanayake, S.Y.; Rennie, A.E.; Lambert, C.G.; Gunawardhana, N. Online delivery of teaching and laboratory practices: Continuity of university programmes during COVID-19 pandemic. Educ. Sci. 2020, 10, 291. [Google Scholar] [CrossRef]
  8. Gamage, K.A.; Silva, E.K.d.; Gunawardhana, N. Online delivery and assessment during COVID-19: Safeguarding academic integrity. Educ. Sci. 2020, 10, 301. [Google Scholar] [CrossRef]
  9. Melgaard, J.; Monir, R.; Lasrado, L.A.; Fagerstrøm, A. Academic procrastination and online learning during the COVID-19 pandemic. Procedia Comput. Sci. 2022, 196, 117–124. [Google Scholar] [CrossRef]
  10. Dikmen, M. Test anxiety in online exams: Scale development and validity. Curr. Psychol. 2022, 1–13. [Google Scholar] [CrossRef]
  11. Khan, M.A.; Vivek, V.; Khojah, M.; Nabi, M.K.; Paul, M.; Minhaj, S.M. Learners’ perspective towards e-exams during COVID-19 outbreak: Evidence from higher educational institutions of India and Saudi Arabia. Int. J. Environ. Res. Public Health 2021, 18, 6534. [Google Scholar] [CrossRef]
  12. Al-Jarf, R. Online Exams in Language, Linguistics and Translation Courses during the Pandemic in Saudi Arabia. Online Submiss. 2022, 4, 14–25. [Google Scholar] [CrossRef]
  13. Abdelwahed, N.A.A.; Aldoghan, M.A.; Moustafa, M.A.; Soomro, B.A. Factors affecting online learning, stress and anxiety during the COVID-19 pandemic in Saudi Arabia. Int. J. Hum. Rights Healthc. 2022. ahead-of-print. [Google Scholar] [CrossRef]
  14. Alshurideh, M.T.; Al Kurdi, B.; AlHamad, A.Q.; Salloum, S.A.; Alkurdi, S.; Dehghan, A.; Abuhashesh, M.; Masa’deh, R.E. Factors Affecting the Use of Smart Mobile Examination Platforms by Universities’ Postgraduate Students during the COVID-19 Pandemic: An Empirical Study. Informatics 2021, 8, 32. [Google Scholar] [CrossRef]
  15. Alzain, E. Examining Saudi Students’ Perceptions on the Use of the Blackboard Platform during the COVID-19 Pandemic. Int. J. Learn. Teach. Educ. Res. 2021, 20, 109–125. [Google Scholar]
  16. Yilmaz, F.G.K.; Ustun, A.B.; Yilmaz, R. Investigation of pre-service teachers’ opinions on advantages and disadvantages of online formative assessment: An example of online multiple-choice exam. J. Teach. Educ. Lifelong Learn. 2020, 2, 1–8. [Google Scholar]
  17. Heitkötter, H.; Hanschke, S.; Majchrzak, T.A. Evaluating cross-platform development approaches for mobile applications. In Proceedings of the Web Information Systems and Technologies: 8th International Conference, WEBIST 2012, Porto, Portugal, 18–21 April 2012; Revised Selected Papers 8. pp. 120–138. [Google Scholar]
  18. Tawir, K.M.O.; Baharum, H.I.B. The Use of Mobile Learning in English Foreign Language Classroom: Challenges, Advantages and Disadvantages, Applications and Implications to Foreign Language Learners; Faculty of Social Sciences and Humanities UTM: Skudai, Malaysia, 2022; p. 138. ISBN 978-629-97531-3-1. [Google Scholar]
  19. Stöber, J. Dimensions of test anxiety: Relations to ways of coping with pre-exam anxiety and uncertainty. Anxiety Stress Coping 2004, 17, 213–226. [Google Scholar] [CrossRef]
  20. Maloney, E.A.; Sattizahn, J.R.; Beilock, S.L. Anxiety and cognition. Wiley Interdiscip. Rev. Cogn. Sci. 2014, 5, 403–411. [Google Scholar] [CrossRef]
  21. Davis, W.B.; Thaut, M.H. The influence of preferred relaxing music on measures of state anxiety, relaxation, and physiological responses. J. Music Ther. 1989, 26, 168–187. [Google Scholar] [CrossRef]
  22. Jacobs, S.E. Anxiety and Test Performance: An Emotion Regulation Perspective; Stanford University: Stanford, CA, USA, 2013. [Google Scholar]
  23. Rogers, E. Keep a Breast: A Qualitative Study of Motivations for Selecting, Downloading and Using a Breast Cancer Self-Exam Mobile App. Master’s Thesis, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA, 2014. [Google Scholar]
  24. Jiang, Q.; Horta, H.; Yuen, M. International medical students’ perspectives on factors affecting their academic success in China: A qualitative study. BMC Med. Educ. 2022, 22, 574. [Google Scholar] [CrossRef]
  25. Yu, P.; Lu, Y.; Hanes, E.; Gu, H. Digitized Education: Enhancement in Teaching and Learning–China Case Study. In Technology Training for Educators From Past to Present; IGI Global: Hershey, PA, USA, 2022; pp. 1–35. [Google Scholar]
  26. Huang, J. Successes and challenges: Online teaching and learning of chemistry in higher education in China in the time of COVID-19. J. Chem. Educ. 2020, 97, 2810–2814. [Google Scholar] [CrossRef]
  27. Bishnoi, M.M.; Suraj, S. Challenges and Implications of Technological Transitions: The Case of Online Examinations in India. In Proceedings of the 2020 IEEE 15th International Conference on Industrial and Information Systems (ICIIS), Rupnagar, India, 26–28 November 2020; pp. 540–545. [Google Scholar]
  28. Giri, S.; Dutta, P. Identifying challenges and opportunities in teaching chemistry online in India amid COVID-19. J. Chem. Educ. 2020, 98, 694–699. [Google Scholar] [CrossRef]
  29. Habib, S.; Parthornratt, T. Students’ Perception of Online Classes and Exams Held During COVID-19 Pandemic: The Engineering Faculty’s Experience at Assumption University. In Proceedings of the 2021 6th International STEM Education Conference (iSTEM-Ed), Pattaya, Thailand, 10–12 November 2021; pp. 1–4. [Google Scholar]
  30. Joyce, P. The effectiveness of online and paper-based formative assessment in the learning of English as a second language. PASAA J. Lang. Teach. Learn. Thail. 2018, 55, 126–146. [Google Scholar]
  31. Khan, S.; Khan, R.A. Online assessments: Exploring perspectives of university students. Educ. Inf. Technol. 2019, 24, 661–677. [Google Scholar] [CrossRef]
  32. Fernandez, A.I.; Al Radaideh, A.; Singh Sisodia, G.; Mathew, A.; Jimber del Río, J.A. Managing university e-learning environments and academic achievement in the United Arab Emirates: An instructor and student perspective. PLoS ONE 2022, 17, e0268338. [Google Scholar] [CrossRef] [PubMed]
  33. Rahman, R.M.A.; Alsalhi, N.R.; Eltahir, M.E.; Al-Qatawneh, S.S.; Alzoubi, A.M.A. Applying Online Learning in Higher Education Institutions During the COVID-19 Pandemic: A Field Study in the United Arab Emirates (UAE). Inf. Sci. Lett. 2022, 11, 639–656. [Google Scholar]
  34. Khan, S.; Kambris, M.E.K.; Alfalahi, H. Perspectives of University Students and Faculty on remote education experiences during COVID-19-a qualitative study. Educ. Inf. Technol. 2022, 27, 4141–4169. [Google Scholar] [CrossRef] [PubMed]
  35. Abd Elgalil, H.M.; Abd El-Hakam, F.E.-Z.; Farrag, I.M.; Abdelmohsen, S.R.; Elkolaly, H. Undergraduate Students’ perceptions of online assessment during COVID-19 pandemic at faculty of medicine for girls, Al-Azhar University, Cairo, Egypt. Innov. Educ. Teach. Int. 2022, 60, 185–195. [Google Scholar] [CrossRef]
  36. Ashry, A.H.; Soffar, H.M.; Alsawy, M.F. Neurosurgical education during COVID-19: Challenges and lessons learned in Egypt. Egypt. J. Neurol. Psychiatry Neurosurg. 2020, 56, 110. [Google Scholar] [CrossRef]
  37. Hussien, R.M.; Elkayal, M.M.; Shahin, M.A.H. Emotional intelligence and uncertainty among undergraduate nursing students during the COVID-19 pandemic outbreak: A comparative study. Open Nurs. J. 2020, 14, 220–231. [Google Scholar] [CrossRef]
  38. Alsaady, I.; Gattan, H.; Zawawi, A.; Alghanmi, M.; Zakai, H. Impact of COVID-19 crisis on exam anxiety levels among bachelor level university students. Mediterr. J. Soc. Sci. 2020, 11, 33–39. [Google Scholar] [CrossRef]
  39. Alhusseini, N.; Alqahtani, A. COVID-19 pandemic’s impact on eating habits in Saudi Arabia. J. Public Health Res. 2020, 9. [Google Scholar] [CrossRef] [PubMed]
  40. Marín García, P.-J.; Arnau-Bonachera, A.; Llobat, L. Preferences and scores of different types of exams during COVID-19 pandemic in Faculty of Veterinary Medicine in Spain: A cross-sectional study of paper and e-exams. Educ. Sci. 2021, 11, 386. [Google Scholar] [CrossRef]
  41. Patel, A.A.; Amanullah, M.; Mohanna, K.; Afaq, S. E-exams under e-learning system: Evaluation of on-screen distraction by first year medical students in relation to on-paper exams. In Proceedings of the Third International Conference on e-Technologies and Networks for Development (ICeND2014), Beirut, Lebanon, 29 April–1 May 2014; pp. 116–126. [Google Scholar]
  42. Elsalem, L.; Al-Azzam, N.; Jum’ah, A.A.; Obeidat, N.; Sindiani, A.M.; Kheirallah, K.A. Stress and behavioral changes with remote E-exams during the COVID-19 pandemic: A cross-sectional study among undergraduates of medical sciences. Ann. Med. Surg. 2020, 60, 271–279. [Google Scholar] [CrossRef]
  43. Eltahir, M.E.; Alsalhi, N.R.; Al-Qatawneh, S.S. Implementation of E-exams during the COVID-19 pandemic: A quantitative study in higher education. PLoS ONE 2022, 17, e0266940. [Google Scholar] [CrossRef]
  44. Yilmaz, O. Preservice Science Teachers’ Opinions on E-Exams. Indones. J. Sci. Educ. 2020, 4, 152–159. [Google Scholar] [CrossRef]
  45. Hill, R. What sample size is “enough” in internet survey research. Interpers. Comput. Technol. Electron. J. 21st Century 1998, 6, 1–12. [Google Scholar]
  46. Muthén, B.; Asparouhov, T. Recent methods for the study of measurement invariance with many groups: Alignment and random effects. Sociol. Methods Res. 2018, 47, 637–664. [Google Scholar] [CrossRef]
  47. Selwyn, N.; O’Neill, C.; Smith, G.; Andrejevic, M.; Gu, X. A necessary evil? The rise of online exam proctoring in Australian universities. Media Int. Aust. 2023, 186, 149–164. [Google Scholar] [CrossRef]
  48. Azis, A.; Abou-Samra, R.; Aprilianto, A. Online Assessment of Islamic Religious Education Learning. Tafkir: Interdiscip. J. Islam. Educ. 2022, 3, 60–76. [Google Scholar] [CrossRef]
  49. Gupta, R.; Aggarwal, A.; Sable, D.; Chahar, P.; Sharma, A.; Kumari, A.; Maji, R. COVID-19 pandemic and online education: Impact on students, parents and teachers. J. Hum. Behav. Soc. Environ. 2022, 32, 426–449. [Google Scholar] [CrossRef]
  50. Salem, M.A.; Sobaih, A.E.E. A Quadruple “E” Approach for Effective Cyber-Hygiene Behaviour and Attitude toward Online Learning among Higher-Education Students in Saudi Arabia amid COVID-19 Pandemic. Electronics 2023, 12, 2268. [Google Scholar]
  51. AlJhani, S.; Alateeq, D.; Alwabili, A.; Alamro, A. Mental health and online learning among medical students during the COVID-19 pandemic: A Saudi national study. J. Ment. Health Train. Educ. Pract. 2022, 17, 323–334. [Google Scholar]
  52. Al-Bargi, A. Exploring online writing assessment amid COVID-19: Challenges and opportunities from teachers’ perspectives. Arab World Engl. J. (AWEJ) 2nd Spec. Issue Covid 2022, 19. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4035928 (accessed on 24 June 2022).
  53. Elfirdoussi, S.; Lachgar, M.; Kabaili, H.; Rochdi, A.; Goujdami, D.; El Firdoussi, L. Assessing distance learning in higher education during the COVID-19 pandemic. Educ. Res. Int. 2020, 2020, 8890633. [Google Scholar]
  54. Salem, M.A.; Sobaih, A.E.E. ADIDAS: An Examined Approach for Enhancing Cognitive Load and Attitudes towards Synchronous Digital Learning Amid and Post COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2022, 19, 16972. [Google Scholar]
  55. Pál, Á.; Koris, R. LSP teacher perspectives on alternative assessment practices at European Universities amid the COVID-19 crisis and beyond. In Emergency Remote Teaching and Beyond: Voices from World Language Teachers and Researchers; Springer: Berlin/Heidelberg, Germany, 2022; pp. 535–555. [Google Scholar]
  56. Daminda Kuruppu, K.A.D. Education reform as a platform to improve interactions of the engineering students during online teaching and learning at higher education amidst COVID-19 pandemic. Int. J. Educ. Reform 2022, 31, 202–217. [Google Scholar]
  57. Alenezi, A.M.; Salem, M.A. Implementation of smartphones, tablets and their applications in the educational process management at Northern Border University. Int. J. Educ. Sci. 2017, 18, 56–64. [Google Scholar]
  58. Almutawa, A.M.; Sruthi, S. Students’ perspective towards online proctoring in exams during COVID-19. J. Eng. Res. 2022, 10. [Google Scholar] [CrossRef]
  59. Yu, Z. Sustaining student roles, digital literacy, learning achievements, and motivation in online learning environments during the COVID-19 pandemic. Sustainability 2022, 14, 4388. [Google Scholar]
  60. Iqbal, S.A.; Ashiq, M.; Rehman, S.U.; Rashid, S.; Tayyab, N. Students’ perceptions and experiences of online education in Pakistani Universities and Higher Education Institutes during COVID-19. Educ. Sci. 2022, 12, 166. [Google Scholar]
  61. Ebaid, I.E.-S. Cheating among accounting students in online exams during COVID-19 pandemic: Exploratory evidence from Saudi Arabia. Asian J. Econ. Financ. Manag. 2021, 3, 211–221. [Google Scholar]
  62. Almetwazi, M.; Alzoman, N.; Al-Massarani, S.; Alshamsan, A. COVID-19 impact on pharmacy education in Saudi Arabia: Challenges and opportunities. Saudi Pharm. J. 2020, 28, 1431–1434. [Google Scholar] [PubMed]
  63. Al-Qdah, M.; Ababneh, I. Comparing online and paper exams: Performances and perceptions of Saudi students. Int. J. Inf. Educ. Technol. 2017, 7, 106. [Google Scholar]
  64. Alghamdi, S.; Ali, M. Pharmacy students’ perceptions and attitudes towards online education during COVID-19 lockdown in Saudi Arabia. Pharmacy 2021, 9, 169. [Google Scholar]
  65. Ma, K.; Chutiyami, M.; Zhang, Y.; Nicoll, S. Online teaching self-efficacy during COVID-19: Changes, its associated factors and moderators. Educ. Inf. Technol. 2021, 26, 6675–6697. [Google Scholar]
  66. Alshebami, A.; Al-Jubari, I.; Alyoussef, I.; Raza, M. Entrepreneurial education as a predicator of community college of Abqaiq students’ entrepreneurial intention. Manag. Sci. Lett. 2020, 10, 3605–3612. [Google Scholar]
  67. Levy, Y.; Ramim, M.M. A study of online exams procrastination using data analytics techniques. Interdiscip. J. E-Learn. Learn. Objects 2012, 8, 97–113. [Google Scholar] [CrossRef]
  68. Ilgaz, H.; Afacan Adanır, G. Providing online exams for online learners: Does it really matter for them? Educ. Inf. Technol. 2020, 25, 1255–1269. [Google Scholar]
  69. Akkad, A.; Bonas, S.; Stark, P. Gender differences in final year medical students’ experience of teaching of intimate examinations: A questionnaire study. BJOG: Int. J. Obstet. Gynaecol. 2008, 115, 625–632. [Google Scholar]
  70. Hillier, M. The very idea of e-Exams: Student (pre) conceptions. In Proceedings of the ASCILITE 2014-Annual Conference of the Australian Society for Computers in Tertiary Education, Dunedin, New Zealand, 24–26 November 2014; pp. 77–88. [Google Scholar]
  71. Ali, M.; Allihyani, M.; Abdulaziz, A.; Alansari, S.; Faqeh, S.; Kurdi, A.; Alhajjaji, A. What just happened? Impact of on-campus activities suspension on pharmacy education during COVID-19 lockdown—A students’ perspective. Saudi Pharm. J. 2021, 29, 59–66. [Google Scholar]
  72. Al-Qadasi, N.; Zhang, G.; Al-Awlaqi, M.A.; Alshebami, A.S.; Aamer, A. Factors influencing entrepreneurial intention of university students in Yemen: The mediating role of entrepreneurial self-efficacy. Front. Psychol. 2023, 14, 1111934. [Google Scholar] [PubMed]
  73. Prathish, S.; Bijlani, K. An intelligent system for online exam monitoring. In Proceedings of the 2016 International Conference on Information Science (ICIS), Kochi, India, 12–13 August 2016; pp. 138–143. [Google Scholar]
  74. Clark, T.M.; Callam, C.S.; Paul, N.M.; Stoltzfus, M.W.; Turner, D. Testing in the time of COVID-19: A sudden transition to unproctored online exams. J. Chem. Educ. 2020, 97, 3413–3417. [Google Scholar]
  75. Blakemore, S.J.; Burnett, S.; Dahl, R.E. The role of puberty in the developing adolescent brain. Hum. Brain Mapp. 2010, 31, 926–933. [Google Scholar] [PubMed]
  76. Moawad, R.A. Online learning during the COVID-19 pandemic and academic stress in university students. Rev. Românească Pentru Educ. Multidimens. 2020, 12, 100–107. [Google Scholar]
  77. Mushtaque, I.; Awais-E-Yazdan, M.; Waqas, H. Technostress and medical students’ intention to use online learning during the COVID-19 pandemic in Pakistan: The moderating effect of computer self-efficacy. Cogent Educ. 2022, 9, 2102118. [Google Scholar]
  78. Butler-Henderson, K.; Crawford, J. A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Comput. Educ. 2020, 159, 104024. [Google Scholar]
  79. Meccawy, Z.; Meccawy, M.; Alsobhi, A. Assessment in ‘survival mode’: Student and faculty perceptions of online assessment practices in HE during COVID-19 pandemic. Int. J. Educ. Integr. 2021, 17, 16. [Google Scholar]
  80. Alshebami, A.S.; Seraj, A.H.A.; Alzain, E. Lecturers’ Creativity and Students’ Entrepreneurial Intention in Saudi Arabia. Vision 2022. [Google Scholar] [CrossRef]
Figure 1. The process of literature review to develop the research survey.
Table 1. Sample demographic characteristics.
Values are reported as frequency (percentage).

Gender
   Female: 424 (51.33%)
   Male: 402 (48.67%)
Colleges
   Health and science colleges: 360 (43.58%)
      Clinical Pharmacy: 14 (1.69%)
      Applied Medical Sciences: 62 (7.51%)
      Science: 126 (15.25%)
      Agriculture and Food Sciences: 158 (19.13%)
   Humanities and social science colleges: 466 (56.42%)
      Arts: 344 (41.65%)
      Law: 46 (5.57%)
      Education: 24 (2.91%)
      Business Administration: 52 (6.30%)
Sum: 826 (100.00%)
Table 2. Experts' demographic characteristics.
Values are reported as frequency (percentage).

Gender
   Male: 10 (42%)
   Female: 14 (58%)
Country
   Saudi Arabia: 9 (37.50%)
   Egypt: 8 (33.33%)
   Jordan: 5 (20.83%)
   Spain: 1 (4.17%)
   China: 1 (4.17%)
Experts' minors
   Instructional technology: 6 (25.00%)
   Digital learning: 3 (12.50%)
   Information technology: 3 (12.50%)
   Psychology: 5 (20.83%)
   Medicine: 7 (29.17%)
Table 3. Anxiety among majors and genders while using different exam methods (n = 826).
Percentages in each row are listed in the column order Paper/PC Exams, MEPs, All Exam Methods, No Matching, Total (%); the p-value a is given in parentheses after each group.

Health and science colleges (p = 0.000)
   Clinical Pharmacy: 48.92%, 31.83%, 17.48%, 1.37%, 100%
   Applied Medical Sciences: 42.67%, 37.98%, 14.03%, 5.79%, 100%
   Science: 46.38%, 39.45%, 12.64%, 1.79%, 100%
   Agriculture and Food Sciences: 41.15%, 35.58%, 13.33%, 9.98%, 100%
Humanities and social science colleges (p = 0.000)
   Arts: 31.28%, 31.98%, 6.79%, 29.64%, 100%
   Law: 33.85%, 23.58%, 13.35%, 29.33%, 100%
   Education: 29.85%, 28.89%, 9.97%, 31.03%, 100%
   Business Administration: 27.43%, 23.74%, 11.37%, 37.48%, 100%
Gender (p = 0.000)
   Female: 49.78%, 23.03%, 21.09%, 6.09%, 100%
   Male: 33.22%, 29.36%, 18.38%, 19.09%, 100%
a Analysis was performed using the Pearson chi-square test.
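The footnote above names the Pearson chi-square test, but Tables 3-5 report only percentages and p-values. As a minimal, purely illustrative sketch (not the authors' analysis script), such a test can be run on a contingency table of raw response counts with scipy.stats.chi2_contingency; the gender-by-exam-method counts below are hypothetical placeholders chosen only to demonstrate the call.

```python
# Illustrative sketch of a Pearson chi-square test of independence,
# the test reported in Tables 3-5. Counts are hypothetical, not the study data.
from scipy.stats import chi2_contingency

# Rows: gender (Female, Male); columns: exam method associated with anxiety
# (Paper/PC Exams, MEPs, All Exam Methods, No Matching).
observed = [
    [211, 98, 89, 26],   # Female (hypothetical counts)
    [134, 118, 74, 76],  # Male (hypothetical counts)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.4f}")
```

The same call would apply to the per-item Yes/No counts underlying Table 4 and the five-point response counts underlying Table 5, using one contingency table per item.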
Table 4. Factors that induce anxiety (n = 826).
Yes/No percentages for each item are listed in the column order Paper/PC Exams, MEPs, All Exam Methods, No Matching; the overall mean percentage of Yes responses and the p-value a are given in parentheses after each item.

1. I think this exam method takes longer than others. (Mean % of Yes = 38.44%; p = 0.000)
   Yes: 59.33%, 31.00%, 24.98%, 28.09%
   No: 40.67%, 69.00%, 75.02%, 71.91%
2. I am unable to focus during exams. (Mean % of Yes = 39.93%; p = 0.000)
   Yes: 59.13%, 29.23%, 31.42%, 40.45%
   No: 40.87%, 70.77%, 68.58%, 59.55%
3. I feel more challenged during exams. (Mean % of Yes = 42.89%; p = 0.000)
   Yes: 67.19%, 32.30%, 29.18%, 51.69%
   No: 32.81%, 67.70%, 70.82%, 48.31%
4. This exam method does not accurately measure my skills. (Mean % of Yes = 34.53%; p = 0.000)
   Yes: 53.49%, 26.36%, 23.74%, 30.34%
   No: 46.51%, 73.64%, 76.26%, 69.66%
5. This exam method does not accurately measure my cognitive achievement. (Mean % of Yes = 25.11%; p = 0.006)
   Yes: 37.67%, 29.30%, 8.37%, 20.22%
   No: 62.33%, 70.70%, 91.63%, 79.78%
6. I am anxious about technical issues going wrong during exams. (Mean % of Yes = 39.63%; p = 0.000)
   Yes: 77.35%, 21.90%, 19.64%, 42.70%
   No: 22.65%, 78.10%, 80.36%, 57.30%
7. I sense that using this exam method increases my motivation. (Mean % of Yes = 29.71%; p = 0.003)
   Yes: 44.57%, 24.36%, 20.21%, 13.48%
   No: 55.43%, 75.64%, 79.79%, 86.52%
8. I frequently want to use social media before exams. (Mean % of Yes = 36.25%; p = 0.000)
   Yes: 54.38%, 30.12%, 24.26%, 20.22%
   No: 45.62%, 69.88%, 75.74%, 79.78%
9. I prefer this exam method because of its multiple advantages. (Mean % of Yes = 29.87%; p = 0.000)
   Yes: 49.35%, 22.31%, 17.94%, 19.10%
   No: 50.65%, 77.69%, 82.06%, 80.90%
10. I lose a sense of comfort and reassurance during exams. (Mean % of Yes = 36.67%; p = 0.001)
   Yes: 57.95%, 23.30%, 28.75%, 20.22%
   No: 42.05%, 76.70%, 71.25%, 79.78%
11. This assessment method is much better than others. (Mean % of Yes = 24.25%; p = 0.000)
   Yes: 26.92%, 27.36%, 18.47%, 42.70%
   No: 73.08%, 72.64%, 81.53%, 57.30%
12. I feel anxious and uncomfortable during exams. (Mean % of Yes = 45.89%; p = 0.000)
   Yes: 81.02%, 31.71%, 24.95%, 51.69%
   No: 18.98%, 68.29%, 75.05%, 48.31%
13. I lose self-confidence while using this exam method. (Mean % of Yes = 45.02%; p = 0.000)
   Yes: 79.76%, 27.33%, 27.98%, 30.34%
   No: 20.24%, 72.67%, 72.02%, 69.66%
14. I fear making mistakes in entering answers during exams. (Mean % of Yes = 44.40%; p = 0.913)
   Yes: 83.13%, 26.48%, 23.58%, 20.22%
   No: 16.87%, 73.52%, 76.42%, 79.78%
15. I see that this exam method does not accurately measure my competencies. (Mean % of Yes = 28.34%; p = 0.000)
   Yes: 46.32%, 20.94%, 17.77%, 42.70%
   No: 53.68%, 79.06%, 82.23%, 57.30%
16. I feel like a failure every time I use this exam method. (Mean % of Yes = 34.99%; p = 0.196)
   Yes: 49.79%, 31.25%, 23.94%, 13.48%
   No: 50.21%, 68.75%, 76.06%, 86.52%
17. I answer quickly without focusing for fear of running out of time during exams. (Mean % of Yes = 30.29%; p = 0.000)
   Yes: 39.95%, 29.58%, 21.35%, 20.22%
   No: 60.05%, 70.42%, 78.65%, 79.78%
18. I can do an excellent job during exams and get a better score. (Mean % of Yes = 31.78%; p = 0.000)
   Yes: 43.45%, 29.35%, 22.54%, 28.09%
   No: 56.55%, 70.65%, 77.46%, 71.91%
19. I fear I will not pass the exams. (Mean % of Yes = 45.12%; p = 0.000)
   Yes: 81.28%, 27.23%, 26.84%, 40.45%
   No: 18.72%, 72.77%, 73.16%, 59.55%
20. I believe that I cannot write the answer correctly to essay questions on this exam method. (Mean % of Yes = 40.27%; p = 0.000)
   Yes: 79.57%, 19.78%, 21.45%, 51.69%
   No: 20.43%, 80.22%, 78.55%, 48.31%
a Analysis was performed using the Pearson chi-square test.
Table 5. Behavioural changes while using different exam methods (n = 826).
Percentages for each response option are listed in the column order Paper/PC Exams, MEPs, All Exam Methods, No Matching; the p-value a is given in parentheses after each item.

1. I feel uncomfortable during exams and want to finish quickly. (p = 0.000)
   Strongly Agree: 47.37%, 20.59%, 9.56%, 18.68%
   Agree: 29.89%, 19.84%, 24.23%, 27.23%
   Undecided: 11.25%, 12.76%, 17.86%, 39.78%
   Disagree: 9.21%, 8.45%, 41.35%, 7.33%
   Strongly Disagree: 2.28%, 38.36%, 7.00%, 6.98%
2. I feel heart palpitations during exams. (p = 0.000)
   Strongly Agree: 51.34%, 23.58%, 13.33%, 16.36%
   Agree: 21.35%, 19.53%, 29.33%, 26.42%
   Undecided: 12.60%, 21.25%, 49.89%, 38.78%
   Disagree: 7.30%, 19.78%, 5.58%, 7.33%
   Strongly Disagree: 7.41%, 15.86%, 1.87%, 11.11%
3. I feel sleepless and have difficulty sleeping at night before exams. (p = 0.000)
   Strongly Agree: 49.75%, 17.84%, 12.36%, 3.15%
   Agree: 34.21%, 16.35%, 31.57%, 33.25%
   Undecided: 7.58%, 25.98%, 43.86%, 12.44%
   Disagree: 6.62%, 12.45%, 4.98%, 47.96%
   Strongly Disagree: 1.84%, 27.38%, 7.23%, 3.20%
4. I feel exhausted and think a lot before and during exams. (p = 0.000)
   Strongly Agree: 51.21%, 19.46%, 15.48%, 3.32%
   Agree: 23.59%, 27.25%, 21.27%, 24.87%
   Undecided: 10.36%, 19.77%, 14.84%, 47.35%
   Disagree: 6.37%, 1.23%, 46.84%, 4.68%
   Strongly Disagree: 8.47%, 32.29%, 1.57%, 19.78%
5. I lack focus and feel distracted during exams. (p = 0.000)
   Strongly Agree: 41.35%, 38.78%, 12.07%, 6.47%
   Agree: 24.23%, 26.42%, 32.15%, 31.57%
   Undecided: 17.86%, 16.37%, 9.99%, 43.86%
   Disagree: 9.45%, 7.33%, 42.94%, 4.98%
   Strongly Disagree: 7.11%, 11.10%, 2.85%, 13.12%
6. I sense nervousness and anger during exams. (p = 0.000)
   Strongly Agree: 47.37%, 19.94%, 6.99%, 11.23%
   Agree: 29.89%, 12.78%, 24.23%, 27.23%
   Undecided: 11.25%, 14.59%, 17.86%, 19.25%
   Disagree: 9.21%, 10.11%, 41.35%, 39.78%
   Strongly Disagree: 2.28%, 42.58%, 9.57%, 2.51%
7. I sweat and feel cold during exams. (p = 0.000)
   Strongly Agree: 42.86%, 16.32%, 13.25%, 7.96%
   Agree: 31.54%, 14.25%, 13.85%, 26.42%
   Undecided: 8.43%, 23.22%, 21.10%, 16.37%
   Disagree: 10.26%, 11.25%, 36.89%, 38.78%
   Strongly Disagree: 6.91%, 34.96%, 14.91%, 10.47%
8. I feel dry mouth, nausea, diarrhoea and constipation during exams. (p = 0.000)
   Strongly Agree: 51.34%, 21.30%, 8.21%, 7.23%
   Agree: 21.35%, 24.33%, 26.36%, 12.36%
   Undecided: 12.60%, 17.45%, 12.22%, 14.52%
   Disagree: 7.30%, 36.04%, 39.88%, 7.66%
   Strongly Disagree: 7.41%, 0.88%, 13.33%, 58.23%
9. I want to drink caffeine and high-energy drinks during exams. (p = 0.012)
   Strongly Agree: 18.87%, 17.33%, 14.52%, 10.75%
   Agree: 12.25%, 12.18%, 13.65%, 8.54%
   Undecided: 10.20%, 10.09%, 11.23%, 13.95%
   Disagree: 7.30%, 5.88%, 12.12%, 7.88%
   Strongly Disagree: 51.38%, 54.52%, 48.48%, 58.88%
10. I want to eat a lot of food and junk food before and during exams. (p = 0.009)
   Strongly Agree: 21.23%, 20.18%, 19.88%, 9.57%
   Agree: 14.18%, 13.33%, 15.79%, 9.76%
   Undecided: 1.11%, 2.33%, 13.17%, 18.54%
   Disagree: 5.12%, 7.47%, 6.47%, 3.45%
   Strongly Disagree: 58.36%, 56.69%, 44.69%, 58.68%
11. I would like to smoke a lot during exams. (p = 0.011)
   Strongly Agree: 11.12%, 10.14%, 9.98%, 9.01%
   Agree: 9.08%, 8.25%, 7.18%, 6.17%
   Undecided: 9.12%, 12.24%, 16.37%, 12.87%
   Disagree: 5.84%, 2.25%, 7.33%, 6.79%
   Strongly Disagree: 64.84%, 67.12%, 59.14%, 65.16%
12. I enjoy exercising during exams. (p = 0.000)
   Strongly Agree: 49.75%, 17.02%, 11.13%, 10.25%
   Agree: 34.21%, 34.44%, 29.27%, 21.22%
   Undecided: 7.58%, 38.78%, 15.54%, 8.07%
   Disagree: 6.62%, 3.22%, 40.12%, 9.19%
   Strongly Disagree: 1.84%, 6.54%, 3.94%, 51.27%
13. I feel sick and have a stomach ache before exams. (p = 0.000)
   Strongly Agree: 57.55%, 8.12%, 6.09%, 5.32%
   Agree: 22.60%, 21.53%, 32.12%, 16.46%
   Undecided: 12.24%, 17.74%, 14.77%, 14.78%
   Disagree: 4.68%, 43.21%, 40.99%, 12.12%
   Strongly Disagree: 2.93%, 9.40%, 6.03%, 51.32%
a Analysis was performed using the Pearson chi-square test.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
