Article

Student Satisfaction with Online Learning during the COVID-19 Pandemic: A Study at State Universities in Sri Lanka

by
Sujeewa Hettiarachchi
1,*,
BWR Damayanthi
1,
Shirantha Heenkenda
1,
DMSLB Dissanayake
2,
Manjula Ranagalage
2 and
Lalith Ananda
1
1
International Center for Multidisciplinary Studies, Faculty of Humanities and Social Sciences, University of Sri Jayewardenepura, Nugegoda 10250, Sri Lanka
2
Department of Environmental Management, Faculty of Social Sciences and Humanities, Rajarata University of Sri Lanka, Mihintale 50300, Sri Lanka
*
Author to whom correspondence should be addressed.
Sustainability 2021, 13(21), 11749; https://doi.org/10.3390/su132111749
Submission received: 21 September 2021 / Revised: 14 October 2021 / Accepted: 19 October 2021 / Published: 25 October 2021

Abstract
This quantitative study investigated the determinants of students’ satisfaction with their online learning experience at Sri Lankan universities during the COVID-19 pandemic. Data were collected from 1376 undergraduates enrolled in various courses in the humanities and social sciences at three state-owned universities in the country. The results of Structural Equation Modeling (SEM) revealed that the independent variables of the model, namely perceived learner motivation, perceived challenges of e-learning, and interaction, significantly affected students’ satisfaction with their new online learning experience. Of the three variables, learner motivation exerted the strongest effect on students’ satisfaction, implying the crucial role that self-regulated learning, characterized by motivation, plays in online learning environments. The study has several implications for both creating and ensuring the long-term sustainability of productive and student-friendly online learning spaces in higher education.

1. Introduction

Online learning is a buzzword in contemporary discourse on education, as it has become the only viable option for providing uninterrupted education in a world that values seclusion over socialization to curb the spread of COVID-19. However, the term has been used in general education since the 1990s. It is also known by alternative terms such as e-learning, blended learning, online education, web-based education, web-based instruction and online courses [1,2], though with some subtle terminological differences which are not obvious to those outside the field of educational technology [3]. For example, even though ‘e-learning’ and ‘online learning’ have commonly been used interchangeably, the two terms have also been distinguished based on how education is provided in a given context. The term ‘online learning’ is used only in those contexts where education is provided through the Internet, whereas e-learning refers to education provided through the Internet as well as other media such as television, radio, and digital versatile discs (DVDs) [2,4]. Nevertheless, the modern use of the term ‘online learning’ is mostly ambiguous, as it can encompass both synchronous (e.g., videoconferencing, live chat, and instant messaging) and asynchronous (e.g., web-based course materials) modes of teaching and learning [1,3]. Thus, in a nutshell, online learning can currently mean anything from uploading materials to an online platform to teaching live using software applications such as Zoom which facilitate “the bridging of the space between the teacher and the student through the use of web-based technologies” [5,6].
The terms e-learning and online learning have commonly been used in the recent burgeoning literature on higher education to describe the type of education provided in different contexts around the world during the COVID-19 pandemic [7,8]. However, the type of e-learning adopted during an emergency is not necessarily synonymous with the high-quality, carefully designed, web-based online learning that has been practiced in the field of higher education for decades [3,7]. Due to this, Hodges et al. [3] propose Emergency Remote Teaching (ERT) as a more accurate term for the mode of teaching many education institutions have adopted during the COVID-19 pandemic. ERT, unlike pre-planned online education, is “a temporary shift of instructional delivery” whose goal “is not to re-create a robust educational ecosystem but rather to provide temporary access to instruction and instructional supports in a manner that is quick to set up and is reliably available during an emergency or crisis” [3]. One context in which such ERT has successfully been implemented in Asia during the pandemic is universities and other higher education institutions in Sri Lanka [9]. The successful implementation of ERT in the country’s higher education is especially noteworthy since online learning was not a term commonly associated with Sri Lankan universities before the pandemic hit the country in March 2020. Until then, in the 15 public universities and other institutions involved in tertiary-level education in the country, online learning had largely been limited to the exchange of materials and the conduct of selective assessment tasks via Learning Management Systems (LMS) [9]. This is also evident from the fact that, before the pandemic, no public university in Sri Lanka had a single program of study (except some distance education programs offered by the Open University of Sri Lanka) that had been conducted even partially online [10].
However, in response to their sudden closure due to the pandemic, many universities and other higher education institutions in Sri Lanka managed to establish an effective system of ERT in their respective institutions [9].
As in other contexts around the world, the transition to online learning in Sri Lankan universities during the pandemic was abrupt: both lecturers and students had very limited time to prepare for the new mode of teaching and learning [9]. However, this transition was greatly facilitated by several initiatives taken by the respective universities as well as by the University Grants Commission (UGC) of Sri Lanka. The main initiative taken by the UGC was to connect the Learning Management Systems of state universities with the Lanka Education and Research Network (LEARN), an association that provides Internet access for education and research in the country. As a result of this initiative, most universities (90% of state and non-state institutions) could provide their students with free access to online education during the pandemic [9]. This was also made possible by the agreement the government reached with Internet providers in the country to provide free access to the learning management systems of Sri Lankan universities: “This has been instrumental in promoting online learning for students in Sri Lanka” [9], while reducing the magnitude of the equity gap that is reportedly a characteristic of online learning during the pandemic in many contexts around the world [7]. The popularity of online learning in Sri Lankan universities during the pandemic is evident from the fact that LEARN observed 13 million online activities on its system (e.g., accessing reading materials, following lecture slides, attending online quizzes) within a week in May 2020, two months after Sri Lankan universities completely shifted to online learning. Furthermore, 540,000 students participated in synchronous teaching and learning within a week in July 2020, while 91% of faculty members also reported using learning management systems for their teaching [9].
This transition to online learning was also facilitated by training programs conducted by the respective universities to train their staff in online teaching and assessment, software for online teaching, and the video/audio recording and editing of digital resources. For example, the Center for Digital Education and Professional Development of the Faculty of Humanities and Social Sciences of the University of Sri Jayewardenepura reported on its website that it had conducted ten staff training programs for the faculty between March 2020 and July 2020.
In their survey conducted in June 2020 at forty-six state and ten non-state higher education institutes in Sri Lanka, involving students, lecturers, and administrators, Hayashi et al. (2020) [9] report that 94% of the country’s state higher education institutions had shifted to online education in response to the pandemic. Furthermore, 79% of online learning was Internet-based. The survey also identified various challenges associated with online education in Sri Lanka, including poor Internet connections, the stressful nature of e-learning, difficulties with online assessments and/or exams, inadequate faculty-student interaction, the poor quality of video collaboration software, and inadequate access to devices, all of which are reported as common challenges associated with online learning around the world [7,8]. Poor Internet connectivity was the most common challenge among Sri Lankan students, with 70% of the students identifying it as such. However, Hayashi et al. [9] also found that despite the many reported challenges, 90% of the 16,521 survey respondents were satisfied with their online learning experience (moderately satisfied: 66%; satisfied: 24%). Still, what exactly determined students’ satisfaction with their online learning during the pandemic remains unexplored in their study, and this, we believe, warrants further investigation. Meanwhile, recent research on online education during the pandemic reports that the level of student satisfaction in online learning is determined by a variety of factors which can be broadly categorized under challenges of e-learning, learner motivation, and interaction [7,11]. Thus, the goal of this study is to determine how these three phenomena are related to student satisfaction in online learning in the Humanities and Social Sciences at state universities in Sri Lanka. Using Structural Equation Modeling, the study answers the following research questions:
  • Research Question 1: To what extent are the students satisfied with their online learning experience?
  • Research Question 2: To what extent do the challenges of e-learning, motivation, and interaction impact students’ satisfaction with their online learning experience?
The rest of this paper is structured as follows. Section 2 provides a review of literature on the phenomena investigated in this study. Meanwhile, Section 3 describes the materials and methods of the study. The results of the study are presented in Section 4. Next, Section 5 provides the discussion and conclusions. Finally, Section 6 discusses the implications of the study.

2. Literature Review

2.1. Student Satisfaction

Whether learning happens online or face-to-face in a physical classroom, one of the measures of the effectiveness of education is student satisfaction [8,12,13]. It is an important construct in higher education, the systematic study of which can lead to better student performance, improvements in online teaching practices, and the retention of students in their academic programs [14]. Furthermore, it is a crucial element that can be used to measure the effectiveness of online learning [15]. While many definitions of student satisfaction are available in the literature, in this paper, following Sanchez-Franco (2009) [16], we define student satisfaction as the extent to which a student perceives that his or her needs, goals, and desires have been completely met.
Many studies have investigated the determinants of student satisfaction with online learning [13,17,18,19,20]. Accordingly, some of the key determinants of student satisfaction include the role of the instructor [17,21], teacher-student interaction [22], the nature of the course structure [23], course content, the role of technology [24], learner motivation [25], learner efficacy [15], self-regulated learning, the learning environment and methods of assessment [24]. In their review of the literature on student satisfaction in e-learning during the last decade, Yunusa and Umar (2021) categorize various determinants of student satisfaction in e-learning under four dimensions: communication dynamics (e.g., interaction, informational quality), e-learning environmental factors (e.g., course structure, content), organizational factors (e.g., technological support, service quality), and personality and situational factors (e.g., autonomy, self-efficacy, motivation). Based on a multitude of studies, they show how student satisfaction can be a complex phenomenon that integrates these dimensions. Meanwhile, Zeng and Wang (2021) [8] provide a comprehensive review of studies on online learning during the COVID-19 pandemic and report that in Emergency Remote Teaching the same factors can determine the level of student satisfaction. In the present study, based on prior literature, we categorize the different determinants of student satisfaction under three headings: challenges of e-learning, learner motivation, and interaction.

2.2. Challenges of E-Learning

Compared to face-to-face learning, online learning during the pandemic presents a multitude of challenges. These include technical difficulties in attending lectures [26], staying focused during a lecture [6], insufficient IT literacy, limited opportunities for collaboration, which can result in a feeling of isolation [27], and the absence of opportunities for the development of practical skills, which some subjects demand for student success [7]. One of the main technical difficulties students in many online learning environments report encountering is poor Internet connectivity, which makes it impossible for them to regularly attend synchronous online teaching sessions (e.g., [7,24,28]). Even in the Sri Lankan study conducted by Hayashi et al. (2020) [9], 70% of the students identified poor Internet connectivity as a challenge for their online learning. This could be why some studies report that courses which integrate both synchronous and asynchronous modes of teaching result in greater student satisfaction than others: the asynchronous mode (e.g., recorded lectures) ensures that students have access to course content even if their synchronous learning is interrupted by poor Internet connectivity or electricity failures.
Students’ online learning experience is made worse by the software and hardware issues they are likely to face on their devices [28], particularly when the mobile devices that many students rely on for online learning [9] may not be compatible with some of the software (e.g., Word, Excel, PowerPoint) required for their active and participatory learning. It is a common finding that the lack of suitable devices to adequately facilitate online learning can impact student satisfaction with e-learning [26]. Furthermore, in a face-to-face learning space, students find it easy to maintain their focus and interest during a lesson due to the physical presence of the instructor, eye contact, the tools used for teaching, and the presence of peers [6]. However, many studies have reported this as a challenge in online education. For instance, in Means and Neisler’s (2020) [7] study with American undergraduates, 57% of the participants rated their ability to remain focused during an online session as worse or much worse compared to face-to-face learning. Likewise, in Yeung and Yau’s (2021) [26] qualitative study on online learning by undergraduates in Hong Kong universities, the participants identified maintaining concentration during a session as a major challenge: “My home is too small and I can hear my mum, TV, street noise…I just can’t focus on the lecture” (p. 11).
One other challenge that learners have been reported to face in online learning, in contrast to face-to-face learning, is the limited opportunities that they have for collaboration [7]. For example, Means and Neisler (2020) report that 65% of the participants in their study in American universities thought that the opportunity for collaboration was worse in an online learning environment. This may not only result in a feeling of isolation but also impact the level of student satisfaction [6]. The feeling of isolation that learners develop in an online learning environment can be detrimental in the sense that it can make learners feel that they do not belong to a scholarly community [29]. This is why Huang et al. (2020) [30] also identify isolation as a major challenge that learners face in online learning.
Thus, in summary, there are diverse challenges associated with online learning, especially as it is practiced during the pandemic, and those challenges can greatly impact students’ overall satisfaction in a course of study [7]. Therefore, in this study, we hypothesize that
Hypothesis 1 (H1).
The challenges that students encounter during their online learning negatively affect their overall satisfaction.

2.3. Learner Motivation

Motivation is a key construct that determines the amount of learning that happens in any learning environment. While motivation can be either extrinsic (associated with external rewards) or intrinsic (associated with self-satisfaction) [31], internally driven self-motivation can be defined as self-generated energy that directs the behavior of an individual towards achieving a particular goal. Unlike face-to-face learning, which is characterized by teacher presence and peer pressure, online education leaves learners with more responsibility for managing their learning, often identified as an inherent challenge of the online learning experience. Due to this, Self-Regulated Learning (i.e., planning, monitoring and adapting one’s thoughts, feelings and actions in a cyclical process to attain a personal goal [32]) becomes more crucial for success in an online learning environment [33]. The centerpiece of self-regulated learning is self-motivation [34,35,36]. Self-motivated adult learners, in contrast to others, tend to develop an independent learning style, display self-directed behavior, and have an internal locus of control.
A frequent finding in education research is that self-motivation is a major determinant of student success and satisfaction with online learning [37]. For example, Threlkeld and Brzoska (1994) [38] describe maturity, high motivation levels, and self-discipline as ‘necessary characteristics’ of more successful and satisfied online learners (p. 53). Meanwhile, Oxford, Young, Ito, and Sumrall (1993) [39] identify it as the most important determinant of student success in online learning. In an online learning environment, learner motivation is closely tied to learners’ interest in participating in a lesson when the instructor and peers are physically absent [33]. Furthermore, it is connected to the lecturer’s pedagogical approach to online teaching. This mainly concerns whether the lecturer organizes teaching in short sessions, uses breakout rooms for group work, employs technology to enhance the quality of student learning, and allows students to ask questions as needed; these are considered recommended practices for effective online instruction [7]. However, while a lecturer’s learner-friendly pedagogy can motivate students to regularly attend their online sessions, frequent disturbances due to poor Internet connectivity and the absence of a learner-friendly environment at home can increase learner demotivation. Means and Neisler (2020) [7] report the difficulty of staying motivated during an online session to be the most common challenge for their participants: 79% identified it as a problem. Finally, while self-motivation leads to greater academic success and satisfaction in online learning, insufficient learner motivation may produce several detrimental outcomes. First, it can increase the number of students dropping out of an online program [29]: dropout rates in online classes can range between 40% and 80% [40].
Second, learners may practice passive procrastination (e.g., delaying school-related tasks even when faced with negative consequences [33]), which in turn can result in poor academic success and low satisfaction with online learning. Thus, based on this review of learner motivation in online learning, we devise the following hypothesis:
Hypothesis 2 (H2).
Students with a higher level of self-motivation will experience more satisfaction with their online learning experience.

2.4. Interaction

Interaction is a multi-faceted construct that determines how well learning takes place in any educational context. As Garrison and Shale (1990) [41] state, education is inherently characterized by interaction: “in its most fundamental form, education is an interaction among instructor, student and subject content” (p.1). In addition to the above, in online learning, interaction can also comprise learners’ engagement with the technological medium used in a course [42]. While interaction is important in any mode of education, many studies have emphasized its extreme importance in online education in enhancing its quality and effectiveness [43,44,45,46]. Considering its significance, Williams, Karen, and Rosewell (2012) [47] even go on to suggest that interaction should be a principle of curriculum design in higher education.
According to Moore’s (1989) [48] famous classification, interaction is of three different types: learner-content interaction, learner-instructor interaction and learner-learner interaction. Learner-content interaction refers to students’ perceptual and cognitive contact with the materials that they are supposed to study in a given course of study. Such materials can include prescribed textbooks, course readings, lecture notes, audio-video materials and computer software. In online education, especially during the COVID-19 pandemic, learners in higher education commonly interact with different forms of e-content which can include e-books, e-journals, simulations, presentations, animations, databases, websites, audio-video productions, discussion forums and immersive content [49]. Students’ easy access to e-content is a major determinant of student satisfaction in online learning [50,51]. Bervell et al. (2019) [52] even report student-content interaction to be the most crucial factor among all forms of interaction that leads to student satisfaction in online learning. Due to this, the development of interactive e-content comprising infographics, video clips, forums and quizzes is essential in creating a quality online learning experience for learners [51].
The second type of interaction that leads to academic success in any learning context is learner-instructor interaction. As elaborated in the literature, learner-instructor interaction is characterized by the provision of prompt feedback [53], the availability of the instructor for help [54], the instructor’s presence [22] and his/her understanding of learner needs [55]. In Moore’s (1989) [48] terms, learner-instructor interaction can facilitate the maintenance of learners’ interest in reading materials, the motivation of learners, and the enhancement of overall learner interest in education. Learners also perceive learner-instructor interaction to be a facilitator of online learning [56]. As reviewed in Zeng and Wang (2021) [8], learner-instructor interaction is a major determinant of student satisfaction in online learning. The general claim in the literature is that improved student-teacher interaction leads to greater student satisfaction [7,13,24]. In the absence of adequate teacher-student interaction, learners may feel that they do not belong to a scholarly community, which can in turn lead to student dissatisfaction [29].
Finally, learner-learner interaction, or peer interaction, refers to the interaction that takes place among learners with or without the presence of the instructor, which Moore describes as a great resource for learning. Peer interaction can impact both learner cognition and motivation in online learning [57,58]. However, results regarding the importance of student-student interaction in online learning are mixed. For example, even though Sher (2009) [59] reports that learner-learner interaction significantly impacts student satisfaction, Kuo et al. (2013) [60] found otherwise. Due to such conflicting findings, some researchers even assume that learner-learner interaction is not as important as the other two forms of interaction in online learning [46]. However, many studies report that higher interaction in online learning leads to higher student satisfaction [7,13,24,61]. Thus, based on our discussion of interaction in online learning, we devise the following hypothesis for testing:
Hypothesis 3 (H3).
Online learning characterized by inadequate interaction (teacher-student; student-student; student-content) leads to poor student satisfaction.
The model depicted in Figure 1 shows a causal sequence whereby perceived challenges of e-learning, learner motivation and interaction are hypothesized to impact students’ satisfaction with online learning.

3. Materials and Methods

3.1. Research Design

This study was designed using quantitative techniques, in particular Confirmatory Factor Analysis (CFA) and Structural Equation Modeling (SEM). Perceived Satisfaction (PS) was used as the dependent variable while Perceived Challenges of E-learning (PCE), Perceived Learner Motivation (PLM) and Interaction (INT) were treated as independent variables.

3.2. Questionnaire Design

This cross-sectional survey used a questionnaire for data collection, which consisted of two main sections. Section A collected demographic data, viz. gender, academic year, level of IT literacy and type of academic program. Section B included measures of the dependent variable (perceived student satisfaction) and the three independent variables of the study (perceived challenges of e-learning, perceived learner motivation and interaction). The dependent variable, Perceived Satisfaction (PS), was measured using eight indicators adapted from Baker (2010) [22], Dinh and Nguyen (2020) [24], Eom and Wen (2006) [13], and Khan, Nabi, Khojah and Tahir (2021) [62]; these relate to different dimensions of online learning: the lecturer’s use of technology, the lecturer’s preparedness for online teaching, opportunities for lecturer-student interaction, peer interaction, the conduct of continuous and final assessments, the method of sharing e-resources and how feedback is provided to students. As far as the independent variables are concerned, Perceived Learner Motivation (PLM) was measured using five indicators: the learner’s overall interest in attending an online session, the lecturer’s teaching methodology, poor Internet connectivity, the home environment and the absence of peer pressure in online learning. These indicators were adapted from Means and Neisler (2020) [7], Pelikan et al. (2021) [33], and Rovai et al. (2007) [63]. Next, the Perceived Challenges of E-learning (PCE) were measured using five indicators adapted from Eom and Ashill (2016) [27], Means and Neisler (2020) [7], Richardson et al. (2017) [6], and Zielinski (2000) [29]: technical difficulties in learning, staying focused during an online session, the level of IT literacy, limited opportunities for collaboration and practical training.
Finally, Interaction (INT) was measured using three indicators designed based on Moore’s (1989) [48] classification of interaction in distance learning into learner-content, learner-instructor, and learner-learner interaction: access to e-resources, opportunities to interact with the lecturer, and opportunities to interact with course mates. Each variable was assessed using a five-point Likert scale. For satisfaction, motivation and interaction, the scale included strongly disagree (1), disagree (2), neutral (3), agree (4), and strongly agree (5). For Challenges of Online Learning, it included extremely challenging (1), challenging (2), neutral (3), not challenging (4), and not challenging at all (5). To ensure that language did not become a barrier to comprehension for participants, the questionnaire was translated into Sinhala, the dominant language of the student population at the three universities concerned. While the translation was conducted by the researchers themselves, it was verified by a Sinhala-English translator.
According to Hair et al. (2013) [64], for viable SEM, three or more indicators are required for each latent construct measured in a study. Accordingly, in this study, each construct was measured using at least three indicators: Perceived Satisfaction (PS: eight items), Perceived Challenges of E-learning (PCE: five items), Perceived Learner Motivation (PLM: five items), and Interaction (INT: three items).
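The measurement and structural model described above can be sketched in lavaan-style model syntax, as used by SEM software such as R’s lavaan or the Python package semopy. This is an illustrative sketch only: the indicator labels (ps1…ps8, pce1…pce5, plm1…plm5, int1…int3) are hypothetical placeholders, not the study’s actual item names.

```python
# Illustrative SEM specification in lavaan-style syntax; semopy, for
# example, accepts this format via semopy.Model(MODEL_SPEC).
# Indicator names below are hypothetical placeholders for the items.
MODEL_SPEC = """
# measurement model: each latent construct with its indicators
PS  =~ ps1 + ps2 + ps3 + ps4 + ps5 + ps6 + ps7 + ps8
PCE =~ pce1 + pce2 + pce3 + pce4 + pce5
PLM =~ plm1 + plm2 + plm3 + plm4 + plm5
INT =~ int1 + int2 + int3
# structural model: satisfaction regressed on the three predictors
PS ~ PCE + PLM + INT
"""
```

Fitting such a model to the item-level data yields the factor loadings examined in the CFA and the path coefficients of the structural model.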

3.3. Data Collection and Analysis

The survey used in this study was designed on Google Forms and circulated via email. The study collected data from a sample of 1376 undergraduates enrolled in different study programs in the Humanities and Social Sciences at three public universities in Sri Lanka: the University of Sri Jayewardenepura (n = 564; 41%), the University of Ruhuna (n = 320; 23%) and the Rajarata University of Sri Lanka (n = 492; 35.8%). The method of sampling employed was simple random sampling: the invitation to the survey was initially emailed to 3000 randomly selected undergraduates from the three universities. A total of 1376 valid and unduplicated responses were received, giving a response rate of 45.8%. Participation in the survey was completely voluntary, and participants could withdraw from the study at any point without consequence. Furthermore, the participants were not rewarded with any incentives. To ensure that language did not impose a barrier to comprehension, the questionnaire was provided in the participants’ native language.
Data gathered through the questionnaires were coded and initially entered into a matrix in Microsoft Excel, which allowed the data to be analyzed using the Statistical Package for the Social Sciences (SPSS) and AMOS. The data screening process included cleaning and transforming the data into a usable form. In this process, missing value analysis and outlier analysis were conducted, but no missing values or outliers were found. Following this, the data were subjected to Cronbach’s alpha and Confirmatory Factor Analysis (CFA) to examine the internal reliability and validity of each scale, after which Structural Equation Modeling (SEM) was conducted to explore the path coefficients. SEM was considered the most appropriate and efficient estimation method for interdependent relationships among multiple scaled variables. Data were processed using SPSS 24.0 and AMOS 24.0.
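As a minimal sketch of the internal-reliability step, Cronbach’s alpha for a scale can be computed directly from the item scores. The function and toy data below are illustrative (using population variances throughout), not the study’s data or software.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale. `items` is a list of item-score
    lists, one per item, each covering the same respondents in the
    same order. Population variances are used throughout."""
    k = len(items)  # number of items in the scale
    sum_item_vars = sum(pvariance(item) for item in items)
    # each respondent's total score across all items
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1 - sum_item_vars / pvariance(totals))

# Toy example: a three-item scale answered by four respondents.
print(round(cronbach_alpha([[4, 3, 5, 2], [5, 3, 4, 2], [4, 4, 5, 3]]), 3))  # prints 0.9
```

An alpha of 0.7 or above is conventionally taken to indicate acceptable internal consistency for a scale.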

4. Results

4.1. Participants

The demographic profile of the 1376 undergraduates who took part in the study is given in Table 1. As the table shows, the majority of the survey participants were female (88.3%); male participants accounted for around 11.7%. This is reflective of the proportions of male and female students in the Humanities and Social Sciences at Sri Lankan universities. In terms of age, the participants ranged from 20 to 30 years. The dominant group was aged between 21 and 22 and comprised 55.6% of the sample; the second largest group, aged between 23 and 24, made up 29.3%. In terms of the year of study, first-year students made up the highest percentage (58.4%), while the rest were second- (15.6%), third- (17.6%), and fourth-year (8.4%) students. A majority of the participants self-rated their IT literacy as moderate (86.1%), while the rest rated it either very high (5.4%) or low (8.4%). The geographic distribution of the survey participants is depicted in Figure 2. As the map shows, the participants are located in different regions of the country.

4.2. Latent Variables: Descriptive Statistics

As stated elsewhere, all latent variables of this study, viz. satisfaction, motivation, and interaction, were measured on a five-point Likert scale ranging from strongly disagree (1) to strongly agree (5). In the analysis, however, the points strongly agree and agree were combined into the single response Agree, and strongly disagree and disagree were combined into the single response Disagree; the response Neutral was left as is. In the case of Challenges of Online Learning, extremely challenging and challenging were combined to form the response Challenging, while not challenging and not challenging at all were combined to form the single response Not challenging. Table 2 shows the percentages of responses for each indicator variable measured under the four latent constructs; only responses for Agree or Challenging are recorded.
As far as the dependent variable satisfaction is concerned, the table shows that the highest percentage of student responses (73%) is recorded for the lecturer's preparation for online teaching, indicating that most students are satisfied with this dimension. Meanwhile, the responses received for lecturers' use of technology during online lessons are the lowest of all (28%), implying that a large number of students are not satisfied with how lecturers manage technology (i.e., using breakout rooms for group discussions, sharing videos, etc.) during an online lesson. Relatively high percentages are also reported for dimensions such as the opportunities for lecturer-student interaction during sessions (43%), the lecturer's provision of student feedback (42%), and how continuous assessments are conducted (42%). However, student satisfaction is generally low with aspects such as the way summative assessment is conducted (39%), how peer interaction is promoted (39%), and the method of sharing learning materials (38%).
In terms of motivation, the highest response rate (53%) is reported for students' interest in attending online lectures, implying that the majority of the respondents were generally motivated to attend their online sessions. However, the responses received for the lecturer's method of teaching (25%) and the students' home environment (24%) imply that these factors demotivated a majority of students from attending their online sessions. In contrast, poor internet connectivity (12%) and the physical absence of peers (11%) demotivated only a very small percentage of participants, which could be due to the asynchronous mode of teaching, in which both dimensions can be less relevant. In Perceived Challenges of Online Learning, the most responses are recorded for technical difficulties (81%), implying that software issues and the unavailability of appropriate devices for online learning and assessment activities pose a challenge for most learners. The absence of opportunities for practical training (62%) is also a challenge for a majority of students. The third highest response (46%) is recorded for feeling isolated in online learning, a commonly reported challenge for students in many online learning contexts. However, staying focused during a lesson received the fewest responses of all, indicating that it may not be as challenging as the other dimensions for the participants of this study. As far as interaction is concerned, a majority of the participants reported having limited opportunities for lecturer-student interaction (56%) in their sessions. A high percentage of students (45%) also reported that they had inadequate opportunities to interact with their peers during online sessions. However, only 35% of the students stated that they had limited e-resources for learning, indicating that opportunities for student-content interaction must have been comparatively greater in their online learning.

4.3. Assessment of the Measurement Model

4.3.1. Reliability of Latent Constructs

The reliability of all structural measurements was estimated using Cronbach's alpha, which assesses the internal consistency of a measuring scale. Table 3 provides a summary of Cronbach's alpha for each of the constructs measured. According to George and Mallery (2003) [65], an alpha value of 0.7 indicates an acceptable level of reliability, while a value greater than 0.8 indicates a high level of reliability. As the table shows, the alpha coefficients for PS and INT are above 0.7, indicating adequate internal consistency for those variables. Meanwhile, the coefficients for PLM and PCE are 0.61 and 0.68, respectively, which can be considered acceptable. In general, it can be concluded that all the latent constructs were characterized by good internal consistency, allowing further analyses.
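For readers unfamiliar with the statistic, Cronbach's alpha can be computed directly from an item-response matrix using the standard formula α = k/(k − 1) · (1 − Σσ²ᵢ/σ²ₜ). The following minimal Python sketch is illustrative only (the study used SPSS, and the variable names here are assumptions, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Perfectly parallel items yield alpha = 1; uncorrelated items drive it toward 0.
```

In practice one would pass the Likert responses for the items of a single construct (e.g., the PS indicators) and compare the result against the 0.7/0.8 benchmarks cited above.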

4.3.2. Model Requirements

Univariate and multivariate normality requirements of the data for SEM in the current study were assessed using two distributional measures, skewness and kurtosis. Skewness values for all the indicator variables ranged from −0.71 to 1.94, of which only three indicators reported values greater than 1. Meanwhile, kurtosis values are less than 7 for all the indicator variables, indicating univariate normality. Multivariate normality was assessed using Mardia's coefficient. The Mardia value recorded for this study is 22.66, well below the recommended cut-off of 483 (computed as p(p + 2) for the 21 observed variables), thus meeting the multivariate normality requirement. Accordingly, the assumptions of univariate and multivariate normality are satisfied in this study (refer to Table 4).
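The screening described here can be illustrated with a short sketch. The Python code below is an illustrative approximation (not the AMOS computation used in the study): it computes sample skewness and kurtosis for one indicator, and the conventional Mardia cut-off p(p + 2) mentioned in the text:

```python
import numpy as np

def univariate_screen(x: np.ndarray):
    """Sample skewness and kurtosis for one indicator variable.
    Kurtosis is reported on the 'raw' scale (normal distribution = 3),
    matching the < 7 benchmark used in the text."""
    d = x - x.mean()
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    skewness = m3 / m2**1.5
    kurtosis = m4 / m2**2
    return skewness, kurtosis

def mardia_cutoff(p: int) -> int:
    """Conventional cut-off p(p + 2) for Mardia's multivariate kurtosis."""
    return p * (p + 2)

# With the study's 21 observed variables: 21 * 23 = 483.
```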
Meanwhile, the linearity among variables was tested using the regression method. The dependent variable, Perceived Satisfaction, was regressed on each independent variable, and curve fittings were tested as reported in Table 5. PLM and PS were linearly related, with an F-value of 161.399 significant at the one percent level. PCE was linearly related to PS, recording a significant F-value of 82.548, while the curve fitting between PS and INT yielded an F-value of 16.765 for the linear relationship. For all variables, the other functional forms, such as the quadratic and cubic forms, recorded lower F-values than the linear form. Hence, each pair of independent latent construct and dependent variable exhibited a satisfactory level of linearity, satisfying this study's linearity assumption.
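The linearity check amounts to comparing overall regression F-statistics across functional forms. A hedged Python sketch (illustrative only; the study used SPSS curve fitting, and the example data below are invented):

```python
import numpy as np

def regression_f(y: np.ndarray, X_cols: list) -> float:
    """Overall F-statistic for an OLS fit of y on the given design columns:
    F = (R^2 / k) / ((1 - R^2) / (n - k - 1))."""
    n = len(y)
    Z = np.column_stack([np.ones(n)] + X_cols)  # add intercept
    k = Z.shape[1] - 1
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return (r2 / k) / ((1 - r2) / (n - k - 1))

# Compare linear vs. quadratic fits of satisfaction on a predictor x:
#   f_lin  = regression_f(y, [x])
#   f_quad = regression_f(y, [x, x**2])
```

A higher F for the linear form than for the quadratic and cubic forms, as reported in Table 5, is what supports the linearity assumption.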
The present study used Pearson correlation, the variance inflation factor, and tolerance for collinearity diagnosis [64]. The highest Pearson correlation value reported was 0.497, between INT and PLM, indicating no serious multicollinearity among the independent variables. As reported in Table 6, the variance inflation factor (VIF), which assesses the extent to which the variance of an estimated regression weight increases when predictors are correlated, ranges from 1 to 3. This confirmed that there were no serious collinearity issues among the predictors of the model. Tolerance values for all the observed variables, shown in the second column of the same table, are greater than 0.10, indicating the absence of multicollinearity.
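The VIF and tolerance diagnostics can be reproduced from the predictor matrix alone: VIF_j = 1/(1 − R²_j), where R²_j comes from regressing predictor j on the remaining predictors, and tolerance is the reciprocal of the VIF. The sketch below is an illustrative Python version (the column layout is assumed, not taken from the study's data):

```python
import numpy as np

def vif_and_tolerance(X: np.ndarray):
    """VIF and tolerance for each column of an (n x p) predictor matrix."""
    n, p = X.shape
    vifs = []
    for j in range(p):
        y = X[:, j]
        # Regress predictor j on the remaining predictors (with intercept)
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - resid.var() / y.var()
        vifs.append(1 / (1 - r2))
    tolerances = [1 / v for v in vifs]
    return vifs, tolerances

# Uncorrelated predictors give VIF = tolerance = 1; the usual alarm
# thresholds are VIF > 10 or tolerance < 0.10.
```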
One of the basic requirements for structural equation modelling is a satisfactory level of correlation among variables, an essential condition for further analysis. Table 7 reports the correlations among all the variables of the present study. As expected, they are correlated with each other with the expected size and statistical significance at conventional levels. The Pearson correlation coefficients ranged from 0.11 (the lowest, between PS and INT) to 0.463 (the highest, between PS and PLM), indicating a satisfactory level of expected relationships among all the variables. Thus, the results of the bivariate analysis provided a sound basis for further analysis.

4.3.3. Model Fit Indices

The estimated measurement model is illustrated in Figure 3. In terms of overall fit, the model recorded χ2 = 754.263, df = 180, and CMIN/DF = 4.19, making the measurement model acceptable. As shown in Table 8, the Root Mean Square Error of Approximation (RMSEA), which assesses how well the hypothesized model fits the population covariance matrix, is 0.048 for the estimated model, and the PCLOSE value of 0.140 (> 0.05) indicates that the null hypothesis of close fit (RMSEA ≤ 0.05) cannot be rejected. The root mean square residual (RMR) value for the current study (RMR = 0.020) is less than the critical value of 0.05, while the Goodness of Fit Index (GFI) and adjusted GFI (AGFI), which represent the overall amount of covariation among the observed variables that can be accounted for by the model, are 0.949 and 0.934, respectively. Both are greater than 0.9, providing evidence of a well-fitting measurement model. The Comparative Fit Index (CFI) value for the model is greater than 0.9 (CFI = 0.931), indicating a good overall fit of the measurement model. The Normed Fit Index (NFI) value of 0.912 is also greater than 0.9, indicating a good incremental fit. Moreover, TLI = 0.920 and IFI = 0.931 exceed the cut-off of 0.9. Accordingly, all the model fit indices meet the requirements for a good-fitting measurement model.

4.3.4. Validity of the Measurement Model

Convergent validity was verified mainly by computing the Average Variance Extracted (AVE), the standardized loadings, and the construct reliability (CR) for all variables. The AVE values reported for PS and INT are above 0.5, while the other two constructs, PLM and PCE, record AVE values of 0.318 and 0.387, respectively. CR values for PS, PCE, and INT are above the cut-off of 0.7, while the value for PLM is almost 0.7 (0.698). In addition, all the factor loadings are significant. The results of the three indicators presented in Table 9 provide evidence for a satisfactory level of convergent validity for the measurement model.
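AVE and CR are simple functions of the standardized factor loadings: AVE is the mean squared loading, and CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)). The following sketch mirrors the computations behind Table 9, using made-up loadings for illustration rather than the study's estimates:

```python
def ave_and_cr(loadings):
    """Average Variance Extracted and composite (construct) reliability
    from a list of standardized factor loadings."""
    lam2 = [l * l for l in loadings]
    ave = sum(lam2) / len(loadings)          # mean squared loading
    s = sum(loadings)
    cr = s * s / (s * s + sum(1 - x for x in lam2))
    return ave, cr

# Example with hypothetical loadings: three items loading at 0.7
# give AVE = 0.49 (just under the 0.5 benchmark) and CR ~ 0.74.
```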
In Table 10, the diagonal values indicate the square roots of the AVE for the relevant variables, while the values below the diagonal show correlations. As the table shows, all the inter-variable correlations are less than the corresponding AVE square roots, supporting the discriminant validity of the measurement model of the current study. Further, the heterotrait-monotrait ratio of correlations (HTMT) was examined for discriminant validity following Henseler et al. (2015) [66], to avoid the caveats of the Fornell-Larcker criterion [67]. The HTMT value was 0.167, well below the threshold of 0.85, confirming the discriminant validity of the model.
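For two constructs, the HTMT ratio divides the average heterotrait (between-construct) item correlation by the geometric mean of the average monotrait (within-construct) correlations. A minimal Python sketch, using an assumed item correlation matrix rather than the study's data:

```python
import numpy as np

def htmt(R: np.ndarray, idx_a, idx_b) -> float:
    """HTMT ratio for constructs A and B given an item correlation matrix R
    and the item index lists of each construct."""
    hetero = R[np.ix_(idx_a, idx_b)].mean()   # between-construct correlations

    def mono(idx):
        # average within-construct correlation (off-diagonal entries only)
        sub = R[np.ix_(idx, idx)]
        return sub[np.triu_indices_from(sub, k=1)].mean()

    return hetero / np.sqrt(mono(idx_a) * mono(idx_b))

# Values well below 0.85 (such as the study's 0.167) support
# discriminant validity.
```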
As Table 10 shows, all correlations among the constructs of the measurement model are in the expected directions and are statistically significant. Therefore, the nomological validity of the model is supported.

4.4. Assessment of the Structural Model

Standard model fit indices were used to assess the goodness of fit of the structural model, viz. the discrepancy ratio (χ2/df; df = degrees of freedom), the adjusted goodness-of-fit index (AGFI), the comparative fit index (CFI), the normed fit index (NFI), and the root mean square error of approximation (RMSEA). As given in Table 11, χ2 = 568.519, df = 176, CMIN/df = 3.230, TLI = 0.944, CFI = 0.953, IFI = 0.953, RMR = 0.021, and RMSEA = 0.040. For a good model fit, the discrepancy ratio should be smaller than 5; the AGFI should be higher than 0.8, while the CFI and NFI should be greater than 0.9. Meanwhile, the RMSEA should be at or below 0.08 for a good fit and below 0.05 for an excellent fit. The results show that the model is a good fit for testing the direct-effect hypotheses established in this study.
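The cut-offs listed here are easy to encode as a single check. The sketch below is illustrative Python only; note that the AGFI and NFI values plugged into the example call are assumptions (taken from the measurement-model figures, since the text does not report them for the structural model):

```python
def fit_acceptable(chi2: float, df: int, cfi: float, nfi: float,
                   agfi: float, rmsea: float) -> bool:
    """Apply the conventional cut-offs stated in the text:
    chi2/df < 5, AGFI > 0.8, CFI > 0.9, NFI > 0.9, RMSEA <= 0.08."""
    return (chi2 / df < 5 and agfi > 0.8 and cfi > 0.9
            and nfi > 0.9 and rmsea <= 0.08)

# Structural-model figures from the text: chi2 = 568.519, df = 176,
# CFI = 0.953, RMSEA = 0.040; the NFI (0.912) and AGFI (0.934) here
# are assumed from the measurement model for illustration.
ok = fit_acceptable(568.519, 176, 0.953, 0.912, 0.934, 0.040)
```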
All the model fit indices of the structural model improved compared to the measurement model. The Chi-square (χ2) improved by 185.744 for the SEM model. The Chi-square difference test shows that the χ2 of the measurement model is significantly different from that of the direct structural model at the 0.005 level of significance. In the SEM, df and CMIN/df were reduced by 4 and 0.96, respectively, compared to the measurement model. The CFI improved by 0.022, while the RMSEA decreased slightly from 0.048 to 0.040. The results of this comparison indicate that the structural model achieved a better overall fit than the measurement model, providing evidence for the good fit of the SEM.
The graphical output of the structural equation model is presented in Figure 4, while Table 12 reports the standardized structural path estimates of the main model and the factor loadings for each item on its latent factor, together with the SEs, CRs, and p-values of the SEM model. All the path estimates are significant (at the 1 percent level) and are in the expected directions. Accordingly, the three hypotheses of the study can be tested through the path coefficients (β), critical ratios, and related p-values. The results show that Perceived Learner Motivation (PLM) has the strongest effect on students' perceived satisfaction: PLM has a direct and positive relationship with students' perceived satisfaction (β = 0.484; CR = 10.13; p < 0.001), which supports hypothesis one. Meanwhile, Perceived Challenges of E-Learning (PCE) has a direct and negative relationship with students' perceived satisfaction (β = −0.149; CR = −4.456; p < 0.001), supporting hypothesis two. Finally, Interaction (INT) has a direct and negative relationship with students' perceived satisfaction (β = −0.112; CR = −3.612; p < 0.001), which supports hypothesis three.

5. Discussion and Conclusions

This study started with the goal of investigating the determinants of student satisfaction with online learning in Sri Lankan universities during the COVID-19 pandemic. Based on the extensive literature on online learning and student satisfaction, we hypothesized that students’ satisfaction with online learning can be determined by three key variables: Perceived Challenges of E-learning (PCE), Perceived Learner Motivation (PLM), and Interaction (INT). The hypothesized model was tested using Factor Analysis and Structural Equation Modelling (SEM). The results revealed that all three independent variables have a significant impact on student satisfaction, a finding that is consistent with the literature reviewed on online learning and student satisfaction in this study [7,13,24,37].
Among the three independent variables, PLM has the strongest significant impact on students' satisfaction. The positive relationship between these two constructs implies that students' higher motivation in online learning leads to increased satisfaction with the task, a finding consistent with empirical studies across different contexts [7,37]. This finding is not surprising given that in online learning, unlike in face-to-face learning, learners bear additional responsibility for their own learning. As a result, self-regulated learning plays an important role in its success [33,36,68]. Motivation holds the key to self-regulated learning [34], and self-regulated learners, in contrast to others, tend to develop an independent learning style, display self-directed behavior, and have an internal locus of control over their learning [35,69,70,71,72]. Thus, online learners are generally assumed to be self-motivated [73], which naturally makes them more satisfied with their learning [27,37]. It is for this reason that learner motivation has been identified as the most important determinant of student satisfaction and success in online learning [33]. Our results may also account for the finding of Hayashi et al. (2020) [9] that 90% of the students they surveyed at Sri Lankan universities were satisfied with their online learning experience despite the various challenges they had encountered. This satisfaction may stem from the learners' motivation to continue their education even during the pandemic, a plausible assumption consistent with the prior literature [33]. Furthermore, as reviewed in Rovai et al. (2007) [63], it is a common finding in the literature that various factors related to online learning, viz. the novelty effect of the technology, little or no travel to the instruction site, curiosity, and the demand for knowledge, can increase learner motivation.
As reported in many studies on online learning, student demotivation is tied to poor Internet connectivity and household environments that are not as learner-friendly as a classroom [7,68,74]. This study also found that only 24% of the survey respondents were satisfied with their home environment, which may imply that it was not conducive to their online learning. This number is not surprising, as additional survey questions revealed that 24.4% of the participants attended online sessions from places other than their own homes due to poor Internet connectivity. Surprisingly, however, only 12% of the participants identified poor Internet connectivity as a demotivator for attending their online sessions. This may be because all three universities where this study was conducted had provided access to recorded lectures and other materials via the Learning Management System, so that even students without a reliable Internet connection for synchronous sessions could engage in online learning. Even though poor Internet connectivity can be a challenge, the mere existence of challenges may not always hamper student motivation in online education, as distance learners are naturally more self-resilient and motivated [75]. In a recent study, Dhingra, Pasricha, Sthapak and Bhatnagar (2021) [68] report that most of the medical students they surveyed in India were motivated to receive online education despite the various challenges they had encountered. Thus, despite challenges, online learners can be motivated to continue their education, but those challenges can negatively affect their satisfaction with the learning experience, a common finding in the literature [7,9,24]. This finding is echoed in this study too.
The relationship observed between students’ challenges of online learning and satisfaction was negative and significant, implying that the challenges that students face in their online learning decrease their satisfaction.
Another finding of this study that is worth further discussion is the observed relationship between student satisfaction and interaction. The study found that poor interaction (teacher-student; student-student; student-content) leads to decreased student satisfaction. This supports a common finding in the literature that overall interaction is a major determinant of student satisfaction in online learning [7,13,24,61]. As elsewhere stated, interaction can be a complex phenomenon that integrates student-lecturer, student-student and student-content dimensions. Even though this study found the interaction to have a significant impact on student satisfaction, how each dimension of interaction contributes to student satisfaction in online learning, in particular in Emergency Remote Teaching, is less known in the literature. This is something that future research could investigate.
In conclusion, this study found that student satisfaction in online learning, better described as Emergency Remote Teaching in some recent literature [3], is closely related to student motivation, challenges of e-learning, and interaction. Motivation was found to be the strongest predictor of student satisfaction in online learning, and students may derive motivation from various aspects associated with it. However, future research is needed to further explore the different predictors of student motivation that can ultimately lead to satisfaction with the learning experience. Meanwhile, even though the study found that perceived challenges of e-learning negatively affect student satisfaction, the challenges that learners encounter in online learning can be diverse: inherent challenges of learning such as isolation, challenges of the new learning environment, and challenges imposed by technology are some examples. Therefore, the relationship between these different types of challenges and student satisfaction with online learning deserves attention in future research. Finally, this study replicated the common finding in the literature that poor interaction in online learning environments leads to decreased student satisfaction [7,13,24]. Even though interaction is a determinant of student success and satisfaction in any mode of learning, it seems to have extra significance in online learning. This may be because rich student-student and lecturer-student interactions can alleviate the feeling of isolation that many students are likely to experience [6] in an online learning space.
Finally, this study has several limitations. First, the data were collected from only three state universities in the country, which may limit the generalizability of its findings. Hence, an extensive study involving a representative sample from the other universities in the country is recommended for future research. A study of that nature may have important implications for online learning in higher education in post-COVID-19 Sri Lanka. Second, among the many factors that can affect student satisfaction in online learning [13], this study was limited to three variables: perceived challenges, perceived learner motivation, and interaction. We leave it for future research to investigate how other variables, such as course structure, technology, learner efficacy, learner autonomy, students' learning styles, and self-regulated learning, are related to student satisfaction. Finally, this study found no significant impact of demographic variables such as gender, age, year of study, and level of IT literacy on student satisfaction. However, this might have been affected by the size of the sample used in the study. Given this, a similar study with a larger sample is recommended for future research.

6. Implications

Online education, more appropriately referred to as Emergency Remote Teaching, has become the only viable option for providing higher education in many countries in the wake of the COVID-19 pandemic. In contrast to face-to-face learning, such education carries the stigma of being of lower quality [3]. Hence, it is important that online learning practiced in higher education during the pandemic be subject to constant evaluation to ensure its quality and standards. Our findings have several implications in this regard.
First and foremost, the positive relationship between student motivation and satisfaction has several practical implications for online teaching in higher education. Even though learners involved in online learning are generally assumed to be self-reliant and motivated [73], not all learners necessarily have these characteristics, particularly when online education is imposed on them as the only option for receiving uninterrupted education during the pandemic. Those characteristics may generally be associated with learners who choose the online option over face-to-face learning under normal circumstances. Hence, a simple transition from face-to-face to online learning during the pandemic does not necessarily guarantee that learners can be self-reliant and take charge of their learning. To do so, learners may require constant guidance, encouragement, and training [76]. Thus, one additional role of lecturers in online teaching is to empower their learners by providing the necessary encouragement and guidance so that they can become self-reliant and self-motivated and take charge of their own learning. To motivate learners in an online environment, lecturers may also need to consider learner needs and preferences when designing their online teaching activities. For example, in this survey, the participants (n = 1376) indicated their preferences for online learning as follows:
As Figure 5 shows, of the four modes of receiving learning input, 75% of the students prefer live sessions facilitating student-lecturer interaction over pre-recorded video lectures, pre-recorded audio lectures, and instruction via lecture notes and handouts. This emphasizes that interaction between the lecturer and the student is a decisive factor in online learning too. Therefore, it is not surprising that it is a major determinant of student satisfaction in online learning [45,46]. This could also reflect learners' long-habituated dependency on the lecturer, a dependency that they may find difficult to overcome when adapting to a new learning environment. Hence, more effort is needed to enhance student-student and lecturer-student interaction in online teaching. The active use of social media applications, chat rooms, and breakout rooms during online sessions can facilitate this interaction, which would ultimately result in greater student satisfaction with online learning. Taking learner needs into consideration when designing teaching activities may also alleviate any anxiety students may have about the new learning experience: "For those who have never taken an online course or who have little computer experience, an online course may be frightening" [77]. This can be supported by training provided to students by the institution concerned to enhance their online learning skills. Such training can help them not only explore the possibilities available in the online mode but also overcome at least some of the challenges they encounter in their online learning, in particular technical challenges that may hinder their progress.

Author Contributions

Conceptualization, S.H. (Sujeewa Hettiarachchi), S.H. (Shirantha Heenkenda), M.R., D.D., B.D.; methodology, B.D., S.H. (Shirantha Heenkenda), S.H. (Sujeewa Hettiarachchi), M.R. and D.D.; validation, formal analysis and investigation, B.D., S.H. (Shirantha Heenkenda), D.D.; writing—original draft preparation, S.H. (Sujeewa Hettiarachchi), B.D.; writing—review and editing, L.A., S.H. (Sujeewa Hettiarachchi), B.D., S.H. (Shirantha Heenkenda) and M.R.; supervision, S.H. (Shirantha Heenkenda); project administration, S.H. (Sujeewa Hettiarachchi) and M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data presented in this study are available on request from the corresponding author.

Acknowledgments

We thank all the undergraduates at the three universities who volunteered to take part in this study and all our colleagues who helped us with the data collection. The authors are grateful to the anonymous reviewers and editors for their helpful comments and suggestions to improve the quality of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Curtain, R. Online Delivery in the Vocational Education and Training Sector: Improving Cost Effectiveness; Australian National Training Authority: Leabrook, Australia, 2002. [Google Scholar]
  2. Singh, V.; Thurman, A. How many ways can we define online learning? A systematic literature review of definitions of online learning (1988–2018). Am. J. Distance Educ. 2019, 33, 289–306. [Google Scholar] [CrossRef]
  3. Hodges, C.B.; Moore, S.; Lockee, B.B.; Trust, T.; Bond, M.A. The difference between emergency remote teaching and online learning. Educ. Rev. 2020, 27, 1–12. [Google Scholar]
  4. Lee, K. Rethinking the accessibility of online higher education: A historical review. Internet High. Educ. 2017, 33, 15–23. [Google Scholar] [CrossRef] [Green Version]
  5. Miller, A.; Topper, A.M.; Richardson, S. Suggestions for Improving IPEDS Distance Education Data Collection; NPEC, U.S. Department of Education: Washington, DC, USA, 2017.
  6. Richardson, J.C.; Maeda, Y.; Lv, J.; Caskurlu, S. Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Comput. Hum. Behav. 2017, 71, 402–417. [Google Scholar] [CrossRef]
  7. Means, B.; Neisler, J. Suddenly Online: A National Survey of Undergraduates During the COVID-19 Pandemic; Digital Promise: San Mateo, CA, USA, 2020. [Google Scholar]
  8. Zeng, X.; Wang, T. College student satisfaction with online learning during COVID-19: A review and implications. Int. J. Multidiscip. Perspect. High. Educ. 2021, 6, 182–195. [Google Scholar]
  9. Hayashi, R.; Garcia, M.; Maddawin, A.; Hewagamage, K.P. Online Learning in Sri Lanka’s Higher Education Institutions during the COVID-19 Pandemic. Asian Dev. Bank 2020, 5, 12. [Google Scholar] [CrossRef]
  10. University Grants Commission. Undergraduate Handbook; University Grants Commission: Colombo, Sri Lanka, 2019/2020. [Google Scholar]
  11. Coman, C.; Țîru, L.G.; Meseșan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online teaching and learning in higher education during the coronavirus pandemic: Students’ perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  12. Alavi, M.; Wheeler, B.C.; Valacich, J.S. Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. MIS Q. Manag. Inf. Syst. 1995, 19, 293–311. [Google Scholar] [CrossRef]
  13. Eom, S.B.; Wen, H.J.; Ashill, N. The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decis. Sci. J. Innov. Educ. 2006, 4, 215–235. [Google Scholar] [CrossRef]
  14. Cole, M.T.; Shelley, D.J.; Swartz, L.B. Academic integrity and student satisfaction in an online environment. In Cases Online Learning Communities and Beyond: Investigations and Applications; IGI GLOBAL: Hershey, PA, USA, 2013; pp. 1–19. [Google Scholar]
Figure 1. The proposed conceptual model.
Figure 2. The geographic distribution of the survey participants.
Figure 3. Graphical Representation of Measurement Model. Model fit indices: χ2 = 754.263, df = 180, CFI = 0.931, RMSEA = 0.048, TLI = 0.920, IFI = 0.931, RMR = 0.020.
Figure 4. Graphical output of SEM. Model fit indices: χ2 = 568.519, df = 176, CMIN/df = 3.230, TLI = 0.944, CFI = 0.953, IFI = 0.953, RMR = 0.021, RMSEA = 0.040.
Figure 5. Learner preferences for online learning.
Table 1. Descriptive Statistics of Sample Respondents (N = 1376).
Gender | Number of Responses | Rate (%)
Female | 1215 | 88.3
Male | 161 | 11.7
Total | 1376 | 100.0

Academic Year | Number of Responses | Rate (%)
First-year | 803 | 58.4
Second-year | 215 | 15.6
Third-year | 242 | 17.6
Fourth-year | 116 | 8.4
Total | 1376 | 100.0

University | Number of Responses | Rate (%)
Rajarata University of Sri Lanka | 492 | 35.8
University of Ruhuna | 320 | 23.3
University of Sri Jayewardenepura | 564 | 41.0
Total | 1376 | 100.0

Age (Years) | Number of Responses | Rate (%)
20 | 84 | 6.1
21 | 378 | 27.5
22 | 387 | 28.1
23 | 257 | 18.7
24 | 146 | 10.6
25 | 97 | 7.0
26 | 15 | 1.1
27 | 7 | 0.5
28 | 0 | 0.0
29 | 1 | 0.1
30 | 4 | 0.3
Total | 1376 | 100.0
Table 2. Descriptive Statistics.
Satisfaction | Statement | Count | Agree (%)
PS1 | The lecturer used technology effectively in class | 390 | 28
PS2 | The lecturer was well prepared for online teaching | 1013 | 73
PS3 | My online sessions provided adequate opportunities for lecturer-student interaction | 598 | 43
PS4 | My online sessions provided adequate opportunities for peer interaction | 545 | 39
PS5 | Continuous assessment was fair and practical | 579 | 42
PS6 | The final assessment was fair and practical | 538 | 39
PS7 | The method of sharing learning resources was appropriate | 527 | 38
PS8 | The lecturer provided student feedback effectively | 576 | 42

Learner Motivation | Statement | Count | Agree (%)
PLM1 | I was always motivated to attend my online sessions | 728 | 53
PLM2 | The lecturer's teaching method motivated me to attend online sessions | 345 | 25
PLM3 | Poor internet connectivity demotivated me from attending online sessions | 164 | 12
PLM4 | My home environment motivated me to attend online sessions | 325 | 24
PLM5 | The physical absence of my classmates demotivated me from attending online sessions | 156 | 11

Perceived Challenges of E-Learning | Challenge | Count | Challenging (%)
PCE1 | Technical difficulties in learning | 1109 | 81
PCE2 | Staying focused during a lecture | 414 | 30
PCE3 | The level of IT literacy | 556 | 40
PCE4 | Feeling of isolation | 639 | 46
PCE5 | Absence of practical experience | 859 | 62

Interaction | Statement | Count | Agree (%)
INT1 | In online sessions, I had limited opportunities to interact with lecturers | 768 | 56
INT2 | In online learning, I had limited e-resources for learning | 525 | 38
INT3 | In online learning, I had limited opportunities to interact with my coursemates | 615 | 45
Table 3. Reliability values: Cronbach Alpha.
Instrument | Cronbach's Alpha | Status
Peer Satisfaction (PS) | 0.876 | Good
Perceived Learner Motivation (PLM) | 0.609 | Acceptable
Perceived Challenges of E-Learning (PCE) | 0.677 | Acceptable
Interaction (INT) | 0.712 | Good
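The alpha values in Table 3 follow the standard item-variance formula, α = k/(k−1) · (1 − Σσ²ᵢ / σ²ₜ). As an illustration only (the survey responses themselves are not reproduced here), a minimal NumPy sketch applied to a tiny made-up response matrix:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 3-respondent, 2-item example (not the study's data):
demo = np.array([[1, 2],
                 [2, 4],
                 [3, 6]])
print(round(cronbach_alpha(demo), 3))  # perfectly correlated items -> 0.889
```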
Table 4. Normality Measures for Indicator Variables.
Variable | Skew | CR | Kurtosis | CR
PS1 | 0.18 | 2.71 | 0.14 | 1.07
PS2 | 0.83 | 12.56 | 0.68 | 5.16
PS3 | 0.29 | 4.38 | −0.07 | −0.56
PS4 | 0.26 | 3.87 | −0.46 | −3.48
PS5 | 0.30 | 4.60 | −0.22 | −1.68
PS6 | 0.31 | 4.66 | −0.38 | −2.90
PS7 | 0.20 | 2.95 | −0.41 | −3.07
PS8 | 0.23 | 3.48 | 0.05 | 0.39
PCE1 | 1.54 | 23.43 | 0.39 | 2.99
PCE2 | −0.71 | −10.74 | −1.50 | −11.33
PCE3 | −0.22 | −3.28 | −1.95 | −14.79
PCE4 | 0.12 | 1.85 | −1.99 | −15.03
PCE5 | 0.91 | 13.73 | −1.18 | −8.92
PLM1 | −0.12 | −1.76 | −1.99 | −15.04
PLM2 | 0.84 | 12.64 | −1.30 | −9.87
PLM3 | 1.92 | 29.02 | 1.67 | 12.67
PLM4 | 0.54 | 8.16 | −1.71 | −12.95
PLM5 | 1.94 | 29.38 | 1.76 | 13.35
INT1 | −0.23 | −3.55 | −1.95 | −14.73
INT2 | 0.33 | 5.00 | −1.89 | −14.32
INT3 | −0.52 | −7.82 | −1.73 | −13.13
Multivariate | | | 22.66 | 13.52
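The CR columns in Table 4 are the skewness and kurtosis statistics divided by their approximate large-sample standard errors, which are conventionally taken as √(6/N) and √(24/N) respectively (the convention used by common SEM software; an assumption here, since the paper does not state the formula). A short sketch that approximately reproduces PS1's critical ratios from its reported skew (0.18), kurtosis (0.14), and N = 1376 — small discrepancies arise because the tabulated skew and kurtosis are themselves rounded:

```python
import math

def critical_ratios(skew: float, kurt: float, n: int) -> tuple[float, float]:
    """Large-sample critical ratios for skewness and kurtosis."""
    se_skew = math.sqrt(6 / n)   # approximate SE of skewness
    se_kurt = math.sqrt(24 / n)  # approximate SE of kurtosis
    return skew / se_skew, kurt / se_kurt

cr_skew, cr_kurt = critical_ratios(0.18, 0.14, 1376)  # PS1 row of Table 4
print(round(cr_skew, 2), round(cr_kurt, 2))  # ~2.73 and ~1.06 (table: 2.71 and 1.07)
```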
Table 5. Functional Forms between Dependent and Independent Variables: Linearity.
Equation * | PLM (F) ** | PCE (F) ** | INT (F) **
Linear | 161.399 | 82.548 | 16.765
Quadratic | 81.715 | 46.106 | 8.975
Cubic | 54.546 | 31.042 | 7.660
* Dependent variable: PS; ** p < 0.001.
Table 6. Collinearity Diagnosis: Tolerance and VIF Values.
Observed Variable | Tolerance | VIF
PS1 | 0.682 | 1.465
PS2 | 0.630 | 1.586
PS3 | 0.449 | 2.227
PS4 | 0.603 | 1.658
PS5 | 0.430 | 2.323
PS6 | 0.497 | 2.011
PS7 | 0.485 | 2.063
PS8 | 0.447 | 2.239
PLM1 | 0.767 | 1.304
PLM2 | 0.749 | 1.334
PLM3 | 0.764 | 1.309
PLM4 | 0.838 | 1.193
PLM5 | 0.818 | 1.223
PCE1 | 0.851 | 1.175
PCE2 | 0.722 | 1.385
PCE3 | 0.688 | 1.454
PCE4 | 0.731 | 1.368
PCE5 | 0.681 | 1.469
INT1 | 0.566 | 1.766
INT2 | 0.597 | 1.674
INT3 | 0.722 | 1.386
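Each VIF in Table 6 equals 1/(1 − R²), where R² comes from regressing that indicator on all the remaining indicators (and Tolerance is simply 1/VIF). A self-contained NumPy sketch of the computation, run on toy orthogonal columns for which there is no collinearity at all:

```python
import numpy as np

def vif(X: np.ndarray, j: int) -> float:
    """VIF of column j: regress it on the remaining columns (with intercept)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])  # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    return 1 / (1 - r2)

# Toy data: two orthogonal (uncorrelated) indicators -> VIF of 1 for each.
X = np.array([[1.0, 1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, -1.0]])
print(vif(X, 0), vif(X, 1))  # both ~1.0 (no collinearity)
```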
Table 7. Estimated Pearson Correlations among Latent Constructs.
| PS | PLM | PCE | INT
PS | 1 | | |
PLM | 0.448 | 1 | |
PCE | −0.309 | −0.497 | 1 |
INT | −0.110 | 0.551 | −0.401 | 1
Table 8. Model fit Indices of the Measurement Model.
Category | Model Fit Index | Index Value | Threshold | Comment
1. Absolute fit | RMSEA | 0.048 | <0.05 good fit; 0.05–0.10 mediocre fit | Satisfied
1. Absolute fit | GFI | 0.949 | >0.90 | Satisfied
1. Absolute fit | RMR | 0.020 | <0.05 | Satisfied
2. Incremental fit | AGFI | 0.934 | >0.80 | Satisfied
2. Incremental fit | CFI | 0.931 | >0.90 | Satisfied
2. Incremental fit | NFI | 0.912 | >0.90 | Satisfied
2. Incremental fit | TLI | 0.920 | >0.90 | Satisfied
3. Parsimonious fit | CMIN/df | 4.190 | <3 good; <5 acceptable | Satisfied
Table 9. Standardized loadings, AVE, and CR Values.
Indicator | PS | PLM | PCE | INT
PS1 | 0.600 | | |
PS2 | 0.665 | | |
PS3 | 0.775 | | |
PS4 | 0.651 | | |
PS5 | 0.784 | | |
PS6 | 0.735 | | |
PS7 | 0.762 | | |
PS8 | 0.786 | | |
PLM1 | | 0.572 | |
PLM2 | | 0.620 | |
PLM3 | | 0.611 | |
PLM4 | | 0.492 | |
PLM5 | | 0.517 | |
PCE1 | | | 0.622 |
PCE2 | | | 0.489 |
PCE3 | | | 0.632 |
PCE4 | | | 0.624 |
PCE5 | | | 0.723 |
INT1 | | | | 0.813
INT2 | | | | 0.778
INT3 | | | | 0.601
AVE | 0.522 | 0.318 | 0.387 | 0.542
CR | 0.896 | 0.698 | 0.870 | 0.777
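The AVE and CR rows in Table 9 follow directly from the standardized loadings: AVE is the mean squared loading, and composite reliability is (Σλ)² / ((Σλ)² + Σ(1 − λ²)). A short sketch that reproduces the INT column's reported values from its three loadings (the CR differs from the table only in the last rounded digit):

```python
def ave_and_cr(loadings):
    """Average variance extracted and composite reliability
    from a construct's standardized loadings."""
    squared = [l ** 2 for l in loadings]
    ave = sum(squared) / len(loadings)          # mean squared loading
    s = sum(loadings)
    cr = s ** 2 / (s ** 2 + sum(1 - q for q in squared))
    return ave, cr

ave, cr = ave_and_cr([0.813, 0.778, 0.601])  # INT loadings from Table 9
print(round(ave, 3), round(cr, 3))  # 0.542 and 0.778 (table reports 0.542 and 0.777)
```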
Table 10. Comparison of Square Root AVE values and Correlations.
| PS | PLM | PCE | INT
PS | 0.723 | | |
PLM | 0.448 | 0.565 | |
PCE | 0.309 | 0.497 | 0.623 |
INT | 0.110 | 0.551 | 0.401 | 0.737
Diagonal entries are the square roots of AVE; off-diagonal entries are inter-construct correlations.
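Table 10 applies the Fornell–Larcker criterion: discriminant validity holds when each construct's √AVE (the diagonal) exceeds its correlation with every other construct. A small sketch that checks the reported matrix programmatically, using the AVE values from Table 9 and the absolute correlations from Table 10:

```python
import math

ave = {"PS": 0.522, "PLM": 0.318, "PCE": 0.387, "INT": 0.542}  # Table 9
corr = {  # absolute inter-construct correlations (off-diagonals of Table 10)
    ("PS", "PLM"): 0.448, ("PS", "PCE"): 0.309, ("PS", "INT"): 0.110,
    ("PLM", "PCE"): 0.497, ("PLM", "INT"): 0.551, ("PCE", "INT"): 0.401,
}

def fornell_larcker_ok(ave, corr):
    """True if sqrt(AVE) of both constructs exceeds every pairwise correlation."""
    return all(min(math.sqrt(ave[a]), math.sqrt(ave[b])) > r
               for (a, b), r in corr.items())

print(fornell_larcker_ok(ave, corr))  # True; tightest margin is PLM/INT (0.565 vs. 0.551)
```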
Table 11. The Fitness Indexes of the Structural Model.
Category | Model Fit Index | Index Value | Threshold | Comment
1. Absolute fit | RMSEA | 0.040 | <0.05 good fit; 0.05–0.10 mediocre fit | Satisfied
1. Absolute fit | GFI | 0.962 | >0.90 | Satisfied
1. Absolute fit | RMR | 0.021 | <0.05 | Satisfied
2. Incremental fit | AGFI | 0.950 | >0.80 | Satisfied
2. Incremental fit | CFI | 0.953 | >0.90 | Satisfied
2. Incremental fit | NFI | 0.933 | >0.90 | Satisfied
2. Incremental fit | TLI | 0.944 | >0.90 | Satisfied
3. Parsimonious fit | CMIN/df | 3.230 | <3 good; <5 acceptable | Satisfied
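The χ²-derived fit values reported for both models can be recovered from χ², the degrees of freedom, and the sample size (N = 1376): CMIN/df is simply χ²/df, and RMSEA = √(max(χ² − df, 0) / (df · (N − 1))). A quick check against the structural model in Table 11:

```python
import math

def cmin_df_and_rmsea(chi2: float, df: int, n: int) -> tuple[float, float]:
    """Relative chi-square and RMSEA from chi-square, df, and sample size."""
    cmin_df = chi2 / df
    rmsea = math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    return cmin_df, rmsea

cmin_df, rmsea = cmin_df_and_rmsea(568.519, 176, 1376)  # structural model
print(f"{cmin_df:.3f} {rmsea:.3f}")  # 3.230 0.040, matching Table 11
```

The same function applied to the measurement model (χ² = 754.263, df = 180) returns the 4.190 and 0.048 reported in Table 8.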
Table 12. Path coefficients estimated through structural equation modeling (SEM).
Path | Estimate | S.E. | C.R. | p
PLM → PS | 0.484 | 0.192 | 10.132 | 0.001
PCE → PS | −0.149 | 0.074 | −4.456 | 0.001
INT → PS | −0.112 | 0.095 | −3.612 | 0.001
PS → PS1 | 0.569 | 0.039 | 19.618 | 0.001
PS → PS2 | 0.635 | 0.037 | 21.924 | 0.001
PS → PS3 | 0.754 | 0.041 | 25.751 | 0.001
PS → PS4 | 0.584 | 0.045 | 19.803 | 0.001
PS → PS5 | 0.743 | 0.042 | 25.339 | 0.001
PS → PS6 | 0.663 | 0.045 | 22.551 | 0.001
PS → PS7 | 0.705 | 0.036 | 29.616 | 0.001
PS → PS8 | 0.744 | 0.032 | 29.082 | 0.001
PLM → PLM1 | 0.331 | 0.112 | 8.328 | 0.001
PLM → PLM2 | 0.468 | 0.115 | 10.600 | 0.001
PLM → PLM3 | 0.638 | 0.110 | 11.812 | 0.001
PLM → PLM4 | 0.435 | 0.116 | 10.224 | 0.001
PLM → PLM5 | 0.494 | 0.083 | 9.869 | 0.001
PCE → PCE1 | 0.397 | 0.044 | 11.012 | 0.001
PCE → PCE2 | 0.364 | 0.053 | 10.004 | 0.001
PCE → PCE3 | 0.494 | 0.060 | 12.802 | 0.001
PCE → PCE4 | 0.628 | 0.069 | 14.091 | 0.001
PCE → PCE5 | 0.705 | 0.073 | 12.161 | 0.001
INT → INT1 | 0.852 | 0.134 | 13.682 | 0.001
INT → INT2 | 0.710 | 0.100 | 15.130 | 0.001
INT → INT3 | 0.478 | 0.078 | 15.126 | 0.001