Article

Understanding Students’ Acceptance and Usage Behaviors of Online Learning in Mandatory Contexts: A Three-Wave Longitudinal Study during the COVID-19 Pandemic

1 Institute of Human Factors and Ergonomics, College of Mechatronics and Control Engineering, Shenzhen University, Shenzhen 518060, China
2 Department of Music Education, School of Primary Education, Hunan Vocational College for Nationalities, Yueyang 414000, China
3 Department of Educational Technology, Faculty of Education and Institute of KEEP Collaborative Innovation, Shenzhen University, Shenzhen 518060, China
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(13), 7830; https://doi.org/10.3390/su14137830
Submission received: 26 May 2022 / Revised: 23 June 2022 / Accepted: 24 June 2022 / Published: 27 June 2022

Abstract
Online learning has been mandatorily adopted in many countries due to the closure of educational institutions caused by the COVID-19 pandemic. However, the antecedents of the acceptance and continued use of online learning in such a situation, and their changing roles over time, have not been well understood. This study proposed and empirically tested a longitudinal acceptance model of online learning by integrating the technology acceptance model (TAM) with task–technology fit (TTF). Data were collected using a three-wave longitudinal survey of 251 Chinese college students after the outbreak of the COVID-19 pandemic. The results showed that most hypothesized relationships in the proposed model were supported and remained stable across the three time stages, while the effects of perceived ease of use on perceived usefulness and behavioral intention changed over time. In addition, students’ perceptions at previous stages had little impact on perceptions at subsequent stages, except for perceived usefulness and behavioral intention. Our study demonstrates that the integrated model of TAM and TTF can be an effective tool for understanding students’ acceptance of online learning across different time stages in a mandatory setting, and that a longitudinal design is applicable for examining the changing mechanism of the acceptance and continued use of online learning over time.

1. Introduction

The rapid development of information technology and the large-scale popularization of intelligent mobile devices have given rise to online learning as a worldwide trend for learning activities at varying levels of education in the past decade [1,2,3]. The trend has even been accelerated by the closure of educational institutions due to the disruption of the COVID-19 pandemic, as an increasing number of educational institutions have adopted online learning as an alternative to normal classroom teaching so that learning would not be interrupted [4]. Online learning generally refers to Internet-based learning approaches in which courses are offered synchronously and/or asynchronously to students remotely through multiple terminals (e.g., computers or mobile devices) [5,6]. Nowadays, online learning has been endorsed as a major advancement in higher education. It even appears indispensable, especially during the spread of COVID-19, when normal classroom teaching often became unavailable [4]. For example, in China, online learning became a mandatory option for educational institutions during the COVID-19 pandemic to meet the national educational policy, which advocates that “learning continues while classrooms are closed”. Educational institutions in China have been using Tencent Meeting, Tencent Classroom, Chaoxing Learning, and DingDing as online learning tools to deliver teaching activities. Other commonly used online learning solutions worldwide include Zoom, Skype, and MOOCs.
Compared with traditional educational approaches, online learning has a number of advantages in terms of accessibility, affordability, flexibility, learning pedagogy, and life-long learning [4]. First, online learning can deliver both real-time and video-recorded “face-to-face” lectures to a large number of geographically distributed students, thus breaking location restrictions [7,8]. Second, online learning allows a flexible time schedule adjustable to different contexts, thereby enabling students to plan their learning process and develop self-control skills for lifelong learning practice [9]. Third, online learning can utilize a variety of features (such as video conferencing, chat rooms, discussion forums, online quizzes, and blackboards) to facilitate mutual communication between teachers and students. Finally, online learning is characterized by digitization and allows multiple media channels to present educational resources, thereby providing productive and convenient approaches for learning purposes.
Despite its impressive advantages, online learning has long been criticized for its high non-completion rates, in both voluntary and mandatory contexts. This indicates that online learning is not always accepted or sustainably used by users. For example, Massive Open Online Courses (MOOCs), one of the most widely used voluntary online learning systems, have been criticized for their high non-completion rate (approximately 80–95%) [10,11], representing low acceptance. Moreover, when online learning is mandatory, students’ participation and performance can still be lower than expected, as a seemingly high usage rate does not necessarily represent acceptance [12]. For example, some students remain logged into the system while being absent from the screen, an unauthentic case of engaging in online learning. The low acceptance of online learning may result in failure to achieve learning goals and a waste of educational resources. In addition, some scholars have pointed out that individuals’ perceptions of a new technology fluctuate over time [13]. However, previous studies have largely examined the voluntary use of online learning; few have examined how students’ perceptions and behaviors may change over time during the use of online learning in a mandatory context. Therefore, insight into user acceptance and the continued use of online learning in mandatory settings appears to be lacking, especially in the context of the long-term application of online learning [12,14]. In light of this, determining how to improve students’ acceptance and motivate their continued use of online learning has been a great concern in online learning research.
Many rigorous and comprehensive models have been applied to explain user acceptance of online learning, among which the Technology Acceptance Model (TAM) and its extensions are the most widely applied, given their parsimony and robustness [6,7,15,16,17]. However, the TAM focuses on users’ perceptions of the ease of use and usefulness of a technology while appearing to neglect the fit between task requirements and the characteristics of the technology used to meet them [8,18]. In practice, users may well fail to utilize the functions of online learning systems to fulfil their task requirements in learning. A mismatch between technology characteristics and task requirements may result in failure to achieve satisfactory acceptance and sustainable use [18]. As indicated by task–technology fit (TTF) [19], we should consider the fit between task requirements and technology characteristics to understand users’ acceptance and usage of online learning more effectively [17,20,21].
Consequently, drawing on the TAM and TTF, the present study aims to examine the factors that influence students’ acceptance and continued use of online learning in a mandatory setting through a longitudinal approach. Data were collected from students who were engaged in the mandatory use of online learning across an academic semester at three stages (i.e., the beginning (T1), middle (T2), and end of the semester (T3)), with two months between each time stage. We focused on the changing roles of the antecedents in determining students’ acceptance and continued use of online learning. The next section offers an overview of related work and the development of the proposed model.

2. Literature Review and Research Hypotheses

2.1. Technology Acceptance Model (TAM)

The TAM is one of the most widely recognized theories for examining users’ acceptance of technology [22,23]. Based on theories from social-psychological and behavioral research, the TAM proposes that perceived ease of use and perceived usefulness are the two direct determinants of behavioral intention, which is usually regarded as an indicator of the actual usage of technology [24] and is also considered a proxy for acceptance [25,26]. Perceived ease of use refers to the degree to which individuals believe that using a particular technology would be free of effort, and perceived usefulness is defined as the degree to which individuals believe that using a particular technology would enhance performance in their tasks [27]. Moreover, perceived ease of use has a positive influence on perceived usefulness. The relationships between perceived ease of use, perceived usefulness, and behavioral intention have been consistently verified in varied contexts of information technologies, such as automated vehicles [15], health information technologies [25,28], mobile health [29,30], and online community applications [31]. These relationships have also been proven in online learning or e-learning applications [7,9,32]. For example, Al-Rahmi et al. [9] confirmed the validity of the relationships between perceived ease of use, perceived usefulness, and behavioral intention when examining students’ intention to use e-learning systems. Similar results were reported by Tao et al. [7] when they investigated key characteristics of user acceptance of MOOCs in an extended TAM model. Dečman [33] examined the acceptance of e-learning in mandatory environments and found that perceived usefulness was the most significant factor in the acceptance of e-learning. Based on the above evidence, the following hypotheses were proposed:
H1. Perceived usefulness will positively affect behavioral intention to use online learning systems at each time stage (H1a for T1, H1b for T2, and H1c for T3).
H2. Perceived ease of use will positively affect behavioral intention to use online learning systems at each time stage (H2a for T1, H2b for T2, and H2c for T3).
H3. Perceived ease of use will positively affect perceived usefulness of online learning systems at each time stage (H3a for T1, H3b for T2, and H3c for T3).

2.2. Actual Usage

Prior studies have stated that individuals’ behaviors are influenced by their beliefs and attitudes [34]. According to the TAM, behavioral intentions can predict subsequent actual usage behaviors [27]. In particular, the relationship between behavioral intention and actual usage has proven robust in e-learning contexts [2,35,36]. Therefore, the following hypotheses were developed:
H4. Behavioral intention to use online learning systems will positively affect subsequent actual usage of online learning systems (H4a for T2, H4b for T3).
Prior research has suggested that the actual experience of using a technology affects users’ beliefs and attitudes towards it [34]. In particular, Cheng and Yuen [37] investigated junior students’ acceptance and continuance of e-learning system use and found that actual usage would indirectly affect behavioral intention. The actual usage of online learning systems is likely to shape users’ experience with the technology, which, in turn, exerts influence on their behavioral intention to use it. For example, Raes and Depaepe [38] investigated students’ acceptance of technological reform and found that experience with a technology positively influences the intention to use it. Hence, the following hypotheses were developed:
H5. Actual usage of online learning systems will positively affect behavioral intention to use online learning systems (H5a for T2, H5b for T3).

2.3. Task–Technology Fit (TTF)

TTF was initially proposed by Goodhue and Thompson [19], who defined it as the degree to which a technology assists an individual in performing his or her portfolio of tasks. TTF has been widely applied in various contexts to evaluate the fit between task requirements and technology characteristics, and its subsequent impact on the performance gains of information technology [39,40,41]. It has also been commonly used to evaluate the usage intention and acceptance of varied technologies, such as mobile commerce and e-learning applications [17,18,42,43,44]. For example, Dishaw and Strong’s [18] study revealed that the integrated model of TTF and TAM offered better explanatory power than either model alone; more specifically, both perceived ease of use and perceived usefulness were indirectly influenced by TTF. Thus, the following hypotheses were developed:
H6. Task–technology fit will positively affect perceived ease of use at each time stage (H6a for T1, H6b for T2, and H6c for T3).
H7. Task–technology fit will positively affect perceived usefulness at each time stage (H7a for T1, H7b for T2, and H7c for T3).

2.4. The Longitudinal Component

A review of the existing literature indicates that most previous studies on technology acceptance were based on cross-sectional designs, while few adopted a longitudinal approach. However, cross-sectional designs might not be able to detect potential dynamic changes in individuals’ perceptions and acceptance of technology during its long-term use. A few empirical studies have demonstrated the changing nature of perceptions regarding technology use over time [8,45,46,47,48]. For example, Venkatesh and Davis [47] tested an extended TAM model longitudinally. They found that the effect of perceived usefulness on behavioral intention remained over time, while the role of perceived ease of use in influencing behavioral intention became less significant over time. Giger et al. [46] reported that users’ behavioral intentions to use a remote patient monitoring system declined over a three-month period. Recently, Vladova et al. [8] also reported that students’ perceived ease of use, perceived usefulness, and behavioral intention in their acceptance of technology-mediated teaching are likely to change over time during the COVID-19 pandemic.
The changes in perceptions over time may be explained by belief-updating theory [49] and the repeated behavioral pattern [50]. On the one hand, belief-updating theory [49] argues that users dynamically evaluate their experience with a technology and adjust (either enhance or weaken) their subsequent perceptions of it based on that evaluation. Such a belief-updating process is likely to affect users’ technology perceptions and usage behavior over time [26]. For example, Kesharwani [51] explored the acceptance of new technology between digital natives and digital immigrants using a longitudinal approach, and the results confirmed the sequential belief-updating mechanism of technology acceptance over time. On the other hand, the repeated behavioral pattern suggests that users tend to repeat their previous behavioral pattern, which is likely to have a cumulative and enhancing impact on subsequent behavior [50]. Thus, it is very likely that students’ perceptions of online learning developed in the initial stage would be reinforced in subsequent stages, as they have repeated interactions with the online learning system. In addition, actual usage behaviors would be reinforced accordingly. Thus, the following hypotheses were proposed:
H8. Current perceived ease of use will positively affect subsequent perceived ease of use (H8a for T1→T2, H8b for T2→T3).
H9. Current perceived usefulness will positively affect subsequent perceived usefulness (H9a for T1→T2, H9b for T2→T3).
H10. Current behavioral intention will positively affect subsequent behavioral intention (H10a for T1→T2, H10b for T2→T3).
H11. Current task–technology fit will positively affect subsequent task–technology fit (H11a for T1→T2, H11b for T2→T3).
H12. Current actual usage behaviors will positively affect subsequent actual usage behaviors.
Figure 1 illustrates the proposed longitudinal online learning acceptance model based on the above-mentioned hypotheses.

3. Methods

3.1. Participants and Procedures

This study collected data through a web-based questionnaire survey published on Sojump (www.sojump.com, accessed on 23 June 2022), one of the largest online survey companies in China. The survey was conducted over the semester that followed the outbreak of the COVID-19 pandemic in 2020. At that time, China launched a nationwide initiative for online learning to meet the educational policy that advocates “learning continues while classrooms are closed” during the COVID-19 pandemic. Most students in China had minimal experience with online learning at that time. The survey was administered at three stages: the beginning (T1), middle (T2), and end of the semester (T3), with a two-month interval between stages. College students from representative universities in different regions of China were invited to participate in the survey. All of them used online learning for school courses in their universities in a mandatory context. Only those who completed the survey at all three stages were included in the data analysis. The final sample consisted of 251 students (175 male and 76 female), with an average age of 20.7 years (SD = 1.2). Approximately 90% of them had previous online learning experience. The demographic characteristics of the study sample are shown in Table 1.

3.2. Questionnaire Development

The questionnaire had three parts. The first part provided a brief introduction to the online learning systems currently employed in China. The second part collected students’ demographic information, including gender, age, and experience with online learning. The unique student number allocated to each student was also collected to identify whether the student had completed multiple rounds of the survey. The third part included scales to measure the constructs in the proposed research model. The scale items were adapted from prior research and modified to fit the online learning context of this study. In particular, the scales were translated from English to Chinese by two bilingual experts with extensive questionnaire design experience. Modifications were then made based on feedback from two rounds of cognitive interviews with five undergraduate students and four postgraduate students (potential online learning users). It is widely recognized that a cognitive interview is an effective way to identify and remove sources of confusion in questionnaires, as it can diagnose cognitive problems that participants may encounter when answering scale items [52]. After that, the two bilingual experts were consulted to ensure the quality of the scales. Task–technology fit was assessed with four items adapted from Goodhue and Thompson [19]. Perceived usefulness and perceived ease of use were each measured with four items adapted from Davis et al. [27]. Behavioral intention was assessed with three items drawn from Venkatesh et al. [26]. All items were rated on 5-point Likert-type scales, ranging from “strongly disagree” (1) to “strongly agree” (5). Actual usage was assessed with one item asking about the frequency of effective use of an online learning system. The details of the questionnaire items are shown in Table 2.

3.3. Data Analysis

One-way repeated-measures analyses of variance (ANOVAs) were used to assess how perceptions of online learning changed over time. The sphericity assumption was assessed; where it was violated, the Greenhouse–Geisser correction was applied. Paired t-tests were performed to examine the change in actual usage between the two time points. Unpaired t-tests were performed to examine gender differences in behavioral intention and actual usage across the three stages.
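The paired comparison described above reduces to simple arithmetic on pairwise differences. The sketch below, using hypothetical usage-frequency ratings (not the study’s data), shows how the paired t statistic is computed; a full analysis would use a statistics package to obtain p-values from the t distribution.

```python
import math
import statistics

def paired_t(before, after):
    """Paired t-test statistic for two repeated measurements.

    Returns (t, df), where t = mean(d) / (sd(d) / sqrt(n)), d are the
    pairwise differences, and sd is the sample standard deviation.
    """
    if len(before) != len(after):
        raise ValueError("samples must be the same length")
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD (n - 1 denominator)
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1

# Hypothetical usage-frequency ratings for five students at T2 and T3
t, df = paired_t([3, 4, 4, 5, 3], [4, 4, 5, 5, 4])
print(round(t, 3), df)  # → 2.449 4
```

The resulting t value is then compared against the t distribution with n − 1 degrees of freedom to obtain the significance level reported in the text.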
The convergent validity of the measurement model was assessed using three commonly used criteria: Cronbach’s alpha, composite reliability, and average variance extracted (AVE) [53]. To establish convergent validity, the Cronbach’s alpha and composite reliability values of each construct should be greater than 0.70 [6,54] and the AVE should be greater than 0.50 [53]. Discriminant validity is achieved if the square root of the AVE for a given construct is greater than its correlations with all other constructs in the model [53]. The fit between the research model (Figure 1) and the data was tested using structural equation modeling. Several goodness-of-fit indices were applied to assess model fitness. The model was considered to fit the data well if the ratio of χ2 to the degrees of freedom was less than 5 (χ2/df < 5), the root mean square residual was less than 0.05 (RMR < 0.05), the incremental fit index was at least 0.90 (IFI ≥ 0.90), the Tucker–Lewis index was at least 0.90 (TLI ≥ 0.90), the comparative fit index was at least 0.90 (CFI ≥ 0.90), and the root mean square error of approximation was less than 0.08 (RMSEA < 0.08) [55,56]. The model was estimated with AMOS 24.
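As a concrete illustration of the convergent- and discriminant-validity criteria above, the following sketch computes composite reliability and AVE from standardized factor loadings and applies the square-root-of-AVE (Fornell–Larcker) check; the loadings and correlations here are hypothetical examples, not values from this study.

```python
import math

def composite_reliability(loadings):
    """CR = (sum λ)^2 / ((sum λ)^2 + sum(1 - λ^2)) for standardized loadings."""
    s = sum(loadings)
    error_var = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + error_var)

def average_variance_extracted(loadings):
    """AVE = mean of squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def fornell_larcker_ok(ave, correlations):
    """Discriminant validity holds if sqrt(AVE) exceeds every correlation
    of the construct with the other constructs in the model."""
    return all(math.sqrt(ave) > abs(r) for r in correlations)

# Hypothetical standardized loadings for a four-item construct
loadings = [0.78, 0.82, 0.75, 0.80]
cr = composite_reliability(loadings)       # should exceed 0.70
ave = average_variance_extracted(loadings)  # should exceed 0.50
print(round(cr, 3), round(ave, 3), fornell_larcker_ok(ave, [0.55, 0.48]))
```

Both thresholds (CR > 0.70, AVE > 0.50) are met for these example loadings, and sqrt(AVE) exceeds the example inter-construct correlations, which is exactly the pattern Tables 3 and 4 report for the study’s constructs.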

4. Results

4.1. Convergent and Discriminant Validity

All factor loadings were significant and larger than the recommended value of 0.50 (see Table 2), indicating adequate convergent validity at the item level. The Cronbach’s alpha and composite reliability values of all constructs were greater than the threshold of 0.70 (see Table 3), and the AVE for all constructs exceeded 0.50; thus, convergent validity was achieved. Table 4 shows that the square root of the AVE for each construct was larger than any of the bivariate correlations involving that construct in the model. Therefore, the discriminant validity of the constructs was achieved.

4.2. The Results of ANOVAs and Paired t-Test

As shown in Table 5, there were significant differences in TTF (F (2, 500) = 3.929, p < 0.05), perceived ease of use (F (2, 500) = 4.810, p < 0.01), perceived usefulness (F (2, 500) = 10.196, p < 0.001), and behavioral intention (F (2, 500) = 6.183, p < 0.01) over time. The post hoc tests revealed significant decreases in TTF from T2 (M = 2.38) to T3 (M = 2.25); in perceived ease of use from T1 (M = 2.13) to T3 (M = 2.01) and from T2 (M = 2.10) to T3 (M = 2.01); in perceived usefulness from T1 (M = 2.43) to T3 (M = 2.26) and from T2 (M = 2.45) to T3 (M = 2.26); and in behavioral intention from T1 (M = 2.45) to T3 (M = 2.33) and from T2 (M = 2.51) to T3 (M = 2.33). In addition, there were no significant gender differences in behavioral intention or actual usage across the three stages (p > 0.05).

4.3. Model Testing

The fit statistics indicated that the fit between the research model and data was good (Table 6). The results of the hypothesis testing are demonstrated in Table 7, and the results of the estimated model are shown in Figure 2.
As shown in Figure 2, TTF emerged as an essential construct in predicting perceived ease of use and perceived usefulness at all three time points. Specifically, TTF had a significant effect on perceived ease of use at T1 (β = 0.62, p < 0.001), T2 (β = 0.62, p < 0.001), and T3 (β = 0.80, p < 0.001). Similarly, TTF yielded a significant effect on perceived usefulness at T1 (β = 0.77, p < 0.001), T2 (β = 0.84, p < 0.001), and T3 (β = 0.85, p < 0.001). The effect of perceived ease of use on perceived usefulness was significant at T1 (β = 0.18, p < 0.01) and T2 (β = 0.11, p < 0.05), but not at T3 (β = 0.10, p = 0.112). The effect of perceived ease of use on behavioral intention was non-significant at T1 and T2 (p > 0.05), while the effect became significant at T3 (β = 0.20, p < 0.01). In contrast, perceived usefulness was strongly and significantly related to behavioral intention throughout the three stages: T1 (β = 0.79, p < 0.001), T2 (β = 0.77, p < 0.001), and T3 (β = 0.73, p < 0.001). The effect of behavioral intention on actual usage was significant (β = 0.13, p < 0.05) during the T1–T2 period but became non-significant (β = −0.05, p = 0.447) during the T2–T3 period. In addition, the effect of actual usage on behavioral intention was not significant at either T2 or T3 (p > 0.05). Finally, most of the relationships in the longitudinal component were non-significant, except that perceived usefulness at T3 was significantly affected by perceived usefulness at T2 (β = 0.09, p < 0.01). Overall, the amount of variance in behavioral intention accounted for by its antecedents was 63% at T1, 69% at T2, and 80% at T3.

5. Discussion

Online learning has become mandatory in many countries and areas after the outbreak of the COVID-19 pandemic. While previous studies mainly focused on voluntary online learning and examined students’ acceptance with cross-sectional approaches, few have paid attention to the mandatory usage contexts and the changing mechanism of the acceptance and continued use of online learning over time. To fill this gap, this study has integrated TTF into TAM with a longitudinal design to explain the changing mechanism of online learning acceptance.

5.1. Primary Findings

Our findings show that the values of actual usage at T2 and T3 were at high levels, indicating that students used online learning systems frequently during the COVID-19 pandemic. This echoes the mandatory usage policy employed in universities. However, the values of TTF, perceived ease of use, perceived usefulness, and behavioral intention at all three stages were at mid-to-low levels. This indicates that existing online learning systems might fail to tailor their functions to the requirements of learning activities. Moreover, users appeared dissatisfied with the usability and ease of use of online learning systems and considered current online learning platforms less useful for meeting their learning purposes. Altogether, students had developed relatively weak behavioral intentions to continue using the online learning systems.
Intriguingly, we found that, in general, students’ perceptions of TTF, perceived ease of use, perceived usefulness, and behavioral intention remained stable from T1 to T2 but decreased significantly from T2 to T3 (p < 0.05). These results are in line with belief-updating theory [49], indicating that the students dynamically evaluated their experience with the online learning systems and adjusted their subsequent perceptions of the technology based on that experience. A similar decline in perceived ease of use has also been reported in previous studies [47]. One explanation may be that the adoption of online learning systems during the COVID-19 pandemic was a hasty decision, meaning that the systems were not compatible with students’ previous learning patterns and habits [57,58]. Such incompatibility between online learning systems and students’ learning patterns and habits might become even more salient as students gain usage experience with the systems. Therefore, students were likely to perceive lower levels of TTF, perceived ease of use, perceived usefulness, and behavioral intention at the last time stage.
An important contribution of this study is that it supports the integration of TAM and TTF as an effective theoretical model for understanding students’ acceptance of online learning across different time stages. Our results showed that most of the relationships in the examined integrated model were supported and remained stable across the three stages. More specifically, the predictive effects of TTF on perceived ease of use and perceived usefulness, and the predictive effect of perceived usefulness on behavioral intention, were well supported across the three stages. These results are consistent with findings from previous cross-sectional online learning acceptance studies [7,33,59]. For example, previous studies also showed that TTF can positively and strongly influence perceived ease of use and perceived usefulness [17,43]. As expected in our study, if the functions of online learning systems fit students’ needs in their learning activities, their perceptions of ease of use and usefulness improve. In addition, the influence of TTF on perceived ease of use and perceived usefulness appeared to become even stronger over time in our study, as indicated by the increased path coefficients from T1 to T3. Overall, the results strongly suggest that, during both initial and continued usage, online learning systems should remain useful in facilitating students’ learning process and be compatible with their learning patterns and habits. The results also emphasize the importance of TTF in enhancing students’ acceptance and continued usage via the mediating role of perceived usefulness.
Another contribution of this study is that it demonstrates whether the examined model relationships change over time. While the effects of TTF on perceived usefulness and perceived ease of use and the effect of perceived usefulness on behavioral intention remained over time, our results showed that the effects of perceived ease of use on perceived usefulness and behavioral intention varied in magnitude and changed over time. First, as shown in Figure 2, the effects of perceived ease of use on behavioral intention were not significant at T1 and T2, whereas the effect was significant at T3. Similarly, Cheng and Yuen [6] examined the acceptance and continued usage of a learning management system by Hong Kong teenagers and found that the effect of perceived ease of use on behavioral intention increased over time. The growing effect of perceived ease of use on behavioral intention can be explained as follows. In our study, students’ perceptions of ease of use decreased over time. This means that students considered online learning systems to have low ease of use and became more concerned with this issue, likely because low ease of use caused difficulty in using the systems. Thus, they may have allocated more weight to perceived ease of use in their behavioral intention to use the systems, consistent with what was found in previous studies [60,61,62]. In contrast, the effect of perceived ease of use on perceived usefulness decreased over time. Specifically, perceived ease of use had a significant effect on perceived usefulness at T1 and T2, whereas the effect became non-significant at T3. This result appears inconsistent with previous studies, which found that perceived ease of use was a direct determinant of perceived usefulness [27,37,63].
In general, our results suggest that perceived ease of use would affect behavioral intention through the mediating role of perceived usefulness at earlier usage stages, while it would directly affect behavioral intention, rather than indirectly through perceived usefulness, at later usage stages, especially when students have low perceptions of perceived ease of use.
Our study also took an important step in extending the acceptance model from widely adopted cross-sectional approaches (a static view) to a longitudinal approach (a dynamic view). In particular, we examined whether students’ previous perceptions of TTF, perceived ease of use, perceived usefulness, and behavioral intention would strengthen the corresponding perceptions in the subsequent stage. Our results showed that students’ perceptions at previous stages had no notable effect on perceptions at subsequent stages, except that perceived usefulness at T2 had a significant effect on perceived usefulness at T3 and behavioral intention at T1 had a significant effect on actual usage at T2. Generally, this result appears to contradict the repeated behavioral pattern, which suggests that users tend to repeat previous behavioral patterns, thereby reinforcing subsequent behavior [50]. The reason may be that, in our study, the students generally had relatively negative perceptions of TTF, perceived ease of use, perceived usefulness, and behavioral intention (indicated by below-midpoint scores on a 5-point Likert scale), so they might have had little willingness to repeat their experience with the online learning systems. Thus, their perceptions were unlikely to be reinforced by previous experience. However, our results do suggest that behavioral intention at T1 could reinforce usage behaviors at T2, though the relationship failed to hold from T2 to T3. The reason could be that, as the use of online learning systems in our study was mandatory, students had to use them even if their behavioral intention to do so had decreased. This is likely to change the significance of the relationship between behavioral intention and actual usage.
In contrast, Cheng and Yuen found that behavioral intention significantly affected junior secondary students’ subsequent use of online learning systems in a relatively voluntary context during their 6-month longitudinal study [6]. Finally, together with the results from each time stage, our study suggests that users’ perceptions of online learning are more likely to be predicted by antecedents within the same time stage than by their previous perceptions. Thus, more attention should be paid to the influence of concurrent antecedents when promoting students’ acceptance of online learning.

5.2. Implications

The findings from this study provide both theoretical and practical implications for online learning acceptance. Theoretically, our study demonstrates the applicability of the integrated model of TAM and TTF for understanding students’ acceptance of online learning across different time stages. For example, the findings showed that TTF is an essential antecedent in developing behavioral intention to use online learning, and its role appears to grow even stronger over time. In addition, the present study represents one of the first longitudinal investigations of online learning acceptance in a mandatory setting after the outbreak of the COVID-19 pandemic. The integrated model of online learning acceptance and usage behaviors was empirically examined in a four-month longitudinal study of college students. The findings demonstrate the changing mechanism of the acceptance and continued use of online learning over time. In particular, our study suggests that TAM relationships that are significant in cross-sectional studies may not hold over time. For example, our findings indicate that the effects of perceived ease of use on perceived usefulness and behavioral intention varied in magnitude and changed over time. Thus, a longitudinal design may be adopted in future studies to verify whether the key factors for acceptance reliably play their roles over time. Finally, it should be noted that the relationships in the examined model might change once online learning systems are no longer mandatorily used. Previous studies suggest that variables that are largely determined by user experience (e.g., perceived ease of use and TTF) would have weakened impacts on behavioral intention to use online learning systems [33]. Thus, these variables are likely to exert more of an impact on usage behaviors in voluntary contexts.
Practically, our findings could benefit practitioners in promoting the use of online learning among students in mandatory contexts. Specifically, to increase students’ acceptance of online learning, the system should provide useful educational services and effective functions to facilitate students’ learning activities (usefulness), and do so in a way that elicits favorable perceptions of the user experience (ease of use). For example, technology developers could collect information on obstacles and problems encountered by students and teachers in actual use and provide solutions to improve perceived ease of use. System developers could also hold training programs and publish instructional videos to support students’ usage and thereby create a positive user experience. In addition, given the mid-to-low levels of students’ perceptions of TTF, perceived ease of use, perceived usefulness, and behavioral intention, practitioners should recognize the difference between online and offline learning and keep in mind that designing an online learning system that is useful and easy to use may not be sufficient. The system should also fulfill students’ learning requirements by matching online learning functions with students’ learning patterns and habits to improve task–technology fit. Finally, practitioners should be aware that the relatively high frequency of use of online learning systems may stem from mandatory use of the system rather than students’ voluntary behaviors, as students had mid-to-low levels of perceptions of the ease of use and usefulness of the systems. Thus, system usage is likely to decrease when the systems become voluntarily used, and evidence from voluntary usage contexts should then be adopted to develop appropriate measures for promoting user acceptance.

6. Limitations and Future Research

Some limitations should be noted in our study. First, our participants were college students, and the sample was relatively small. This may limit the generalizability of our findings. Future research could recruit larger samples with more diverse sociodemographic backgrounds and examine the possible influence of sociodemographic characteristics (e.g., gender, age, and course participation). Similarly, contextual variations, such as access or bandwidth conditions when using online learning systems, were not considered in this study and could be examined further, as these variables might affect the relationships within the model. Second, the survey was conducted during the early period of the COVID-19 pandemic, when online learning was a new educational mode for most students. They might have had unstable affect and perceptions during that time, whether due to curiosity about online learning or to low tolerance for its shortcomings. Thus, students’ perceptions of online learning may differ from those in later periods of the COVID-19 pandemic. Follow-up studies could examine students’ acceptance of online learning in later periods and draw comparisons with the findings from our study.

7. Conclusions

This study proposed a longitudinal acceptance model of online learning in a mandatory setting and empirically tested the research model using survey data collected from students in different regions of China after the outbreak of the COVID-19 pandemic. Our study demonstrates that the integrated model of TAM and TTF can be an effective tool for understanding students’ acceptance of online learning across different time stages. TTF played an important role in determining perceived ease of use and perceived usefulness, and perceived usefulness had a strong effect on behavioral intention to use online learning systems at all three stages. In addition, our study demonstrates the changing mechanism of the acceptance and continued use of online learning over time. Students generally had less favorable perceptions of the mandatorily used online learning systems, and their perception scores tended to decrease over time. In particular, perceived ease of use had an indirect effect on behavioral intention through the mediating role of perceived usefulness at the first two stages, while it yielded a direct effect on behavioral intention at the third stage. Practitioners could base their design refinements and implementation strategies on our findings to promote more effective use of varied online learning systems. Future studies could examine students’ acceptance and continuance of online learning with samples from diverse educational backgrounds and in periods when online learning is adopted as a routine learning tool.

Author Contributions

Conceptualization and methodology, D.T., W.L., M.Q. and M.C.; formal analysis, W.L. and D.T.; investigation, D.T. and M.Q.; data curation, D.T. and M.Q.; writing—original draft preparation, W.L. and D.T.; writing—review and editing, D.T., W.L., M.Q. and M.C.; supervision, D.T. and M.C.; funding acquisition, D.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Shenzhen Education Science Planning Project (grant no. ybzz20002), the Graduate Educational Reform Programme of Shenzhen University (grant no. SZUGS2020JG08), the Foundation of Shenzhen Science and Technology Committee (grant no. 20200813225029002), and the Guangdong Basic and Applied Basic Research Foundation (grant no. 2021A1515110081).

Institutional Review Board Statement

Ethical review and approval were waived for this study. Respondents were informed of the purpose of the study before the investigation. All respondents participated voluntarily and responded anonymously.

Data Availability Statement

Data are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Kizilcec, R.F.; Saltarelli, A.J.; Reich, J.; Cohen, G.L. Closing global achievement gaps in MOOCs. Science 2017, 355, 251–252.
2. Lo, M.C.; Ramayah, T.; Mohamad, A.A. Does intention really lead to actual use of technology? A study of an E-learning system among university students in Malaysia. Croat. J. Educ. Hrvat. Časopis Za Odgoj. I Obraz. 2015, 17, 835–863.
3. Jung, Y.; Lee, J. Learning Engagement and Persistence in Massive Open Online Courses (MOOCS). Comput. Educ. 2018, 122, 9–22.
4. Adedoyin, O.B.; Soykan, E. COVID-19 pandemic and online learning: The challenges and opportunities. Interact. Learn. Environ. 2020, 1–13.
5. Singh, V.; Thurman, A. How Many Ways Can We Define Online Learning? A Systematic Literature Review of Definitions of Online Learning (1988–2018). Am. J. Distance Educ. 2019, 33, 289–306.
6. Cheng, M.; Yuen, A.H.K. Student continuance of learning management system use: A longitudinal exploration. Comput. Educ. 2018, 120, 241–253.
7. Tao, D.; Fu, P.; Wang, Y.; Zhang, T.; Qu, X. Key characteristics in designing massive open online courses (MOOCs) for user acceptance: An application of the extended technology acceptance model. Interact. Learn. Environ. 2019, 30, 882–895.
8. Vladova, G.; Ullrich, A.; Bender, B.; Gronau, N. Students’ acceptance of technology-mediated teaching—how it was influenced during the COVID-19 Pandemic in 2020: A Study from Germany. Front. Psychol. 2021, 12, 69.
9. Al-Rahmi, W.M.; Yahaya, N.; Aldraiweesh, A.A.; Alamri, M.M.; Aljarboa, N.A.; Alturki, U.; Aljeraiwi, A.A. Integrating technology acceptance model with innovation diffusion theory: An empirical investigation on students’ intention to use E-learning systems. IEEE Access 2019, 7, 26797–26809.
10. Zhang, C.; Chen, H.; Phang, C.W. Role of Instructors’ Forum Interactions with Students in Promoting MOOC Continuance. J. Glob. Inf. Manag. 2018, 26, 105–120.
11. Bartolome, A.; Steffens, K. Are MOOCs Promising Learning Environments? Comunicar. Media Educ. Res. J. 2015, 22, 91–100.
12. Zhang, Z.; Cao, T.; Shu, J.; Liu, H. Identifying key factors affecting college students’ adoption of the e-learning system in mandatory blended learning environments. Interact. Learn. Environ. 2020, 1–14.
13. Sun, Y.; Jeyaraj, A. Information technology adoption and continuance: A longitudinal study of individuals’ behavioral intentions. Inf. Manag. 2013, 50, 457–465.
14. Back, D.A.; Haberstroh, N.; Sostmann, K.; Schmidmaier, G.; Putzier, M.; Perka, C.; Hoff, E. High efficacy and students’ satisfaction after voluntary vs mandatory use of an e-learning program in traumatology and orthopedics—A follow-up study. J. Surg. Educ. 2014, 71, 353–359.
15. Zhang, T.; Tao, D.; Qu, X.; Zhang, X.; Zeng, J.; Zhu, H.; Zhu, H. Automated vehicle acceptance in China: Social influence and initial trust are key determinants. Transp. Res. Part C Emerg. Technol. 2020, 112, 220–233.
16. Costa, A.; Costa, A.; Olsson, I.A.S. Students’ acceptance of e-learning approaches in Laboratory Animal Science Training. Lab. Anim. 2020, 54, 487–497.
17. Wu, B.; Chen, X. Continuance intention to use MOOCs: Integrating the technology acceptance model (TAM) and task technology fit (TTF) model. Comput. Hum. Behav. 2017, 67, 221–232.
18. Dishaw, M.T.; Strong, D.M. Extending the technology acceptance model with task-technology fit constructs. Inf. Manag. 1999, 36, 9–21.
19. Goodhue, D.L.; Thompson, R.L. Task-Technology Fit and Individual Performance. MIS Q. 1995, 19, 213–236.
20. Chang, H.H. Task-technology fit and user acceptance of online auction. Int. J. Human-Comput. Stud. 2010, 68, 69–89.
21. Vanduhe, V.Z.; Nat, M.; Hasan, H.F. Continuance intentions to use gamification for training in higher education: Integrating the technology acceptance model (TAM), social motivation, and task technology fit (TTF). IEEE Access 2020, 8, 21473–21484.
22. Tao, D.; Wang, T.; Wang, T.; Zhang, T.; Zhang, X.; Qu, X. A systematic review and meta-analysis of user acceptance of consumer-oriented health information technologies. Comput. Hum. Behav. 2020, 104, 106147.
23. Granić, A.; Marangunić, N. Technology acceptance model in educational context: A systematic literature review. Br. J. Educ. Technol. 2019, 50, 2572–2593.
24. Venkatesh, V.; Thong, J.Y.; Xu, X. Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Q. 2012, 36, 157–178.
25. Tao, D.; Yuan, J.; Shao, F.; Li, D.; Zhou, Q.; Qu, X. Factors Affecting Consumer Acceptance of an Online Health Information Portal Among Young Internet Users. CIN Comput. Inform. Nurs. 2018, 36, 530–539.
26. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478.
27. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340.
28. Tao, D.; Shao, F.; Wang, H.; Yan, M.; Qu, X. Integrating usability and social cognitive theories with the technology acceptance model to understand young users’ acceptance of a health information portal. Health Inform. J. 2020, 26, 1347–1362.
29. Wang, H.; Tao, D.; Yu, N.; Qu, X. Understanding consumer acceptance of healthcare wearable devices: An integrated model of UTAUT and TTF. Int. J. Med. Inform. 2020, 139, 104156.
30. Liu, K.; Tao, D. The roles of trust, personalization, loss of privacy, and anthropomorphism in public acceptance of smart healthcare services. Comput. Hum. Behav. 2021, 127, 107026.
31. Chen, X.; Tao, D.; Zhou, Z. Factors affecting reposting behaviour using a mobile phone-based user-generated-content online community application among Chinese young adults. Behav. Inf. Technol. 2018, 38, 120–131.
32. Salloum, S.A.; Alhamad, A.Q.M.; Al-Emran, M.; Monem, A.A.; Shaalan, K. Exploring Students’ Acceptance of E-Learning Through the Development of a Comprehensive Technology Acceptance Model. IEEE Access 2019, 7, 128445–128462.
33. Decman, M. Modeling the acceptance of e-learning in mandatory environments of higher education: The influence of previous education and gender. Comput. Hum. Behav. 2015, 49, 272–281.
34. Melone, N.P. A Theoretical Assessment of the User-Satisfaction Construct in Information Systems Research. Manag. Sci. 1990, 36, 76–91.
35. Akman, I.; Turhan, C. User acceptance of social learning systems in higher education: An application of the extended Technology Acceptance Model. Innov. Educ. Teach. Int. 2015, 54, 229–237.
36. Kaewsaiha, P.; Chanchalor, S. Factors affecting the usage of learning management systems in higher education. Educ. Inf. Technol. 2020, 26, 2919–2939.
37. Cheng, M.; Yuen, A.H.K. Junior secondary students’ acceptance and continuance of e-learning system use: A multi-group analysis across social backgrounds. Behav. Inf. Technol. 2022, 41, 324–347.
38. Raes, A.; Depaepe, F. A longitudinal study to understand students’ acceptance of technological reform. When experiences exceed expectations. Educ. Inf. Technol. 2020, 25, 533–552.
39. Tam, C.; Oliveira, T. Understanding the impact of m-banking on individual performance: DeLone & McLean and TTF perspective. Comput. Hum. Behav. 2016, 61, 233–244.
40. D’Ambra, J.; Wilson, C.S.; Akter, S. Application of the task-technology fit model to structure and evaluate the adoption of E-books by Academics. J. Am. Soc. Inf. Sci. Technol. 2012, 64, 48–64.
41. Lee, C.-C.; Cheng, H.K.; Cheng, H.-H. An empirical study of mobile commerce in insurance industry: Task–technology fit and individual differences. Decis. Support Syst. 2007, 43, 95–110.
42. Yen, D.C.; Wu, C.-S.; Cheng, F.-F.; Huang, Y.-W. Determinants of users’ intention to adopt wireless technology: An empirical study by integrating TTF with TAM. Comput. Hum. Behav. 2010, 26, 906–915.
43. Pal, D.; Patra, S. University Students’ Perception of Video-Based Learning in Times of COVID-19: A TAM/TTF Perspective. Int. J. Human-Comput. Interact. 2020, 37, 903–921.
44. Mokhtar, S.A.; Katan, H.; Hidayat-ur-Rehman, I. Instructors’ behavioural intention to use learning management system: An integrated TAM perspective. TEM J. 2018, 7, 513.
45. Bhattacherjee, A. Understanding Information Systems Continuance: An Expectation-Confirmation Model. MIS Q. 2001, 25, 351–370.
46. Giger, J.T.; Pope, N.D.; Vogt, H.B.; Gutierrez, C.; Newland, L.A.; Lemke, J.; Lawler, M.J. Remote patient monitoring acceptance trends among older adults residing in a frontier state. Comput. Hum. Behav. 2015, 44, 174–182.
47. Venkatesh, V.; Davis, F.D. A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies. Manag. Sci. 2000, 46, 186–204.
48. Hart, J.; Sutcliffe, A. Is it all about the Apps or the Device? User experience and technology acceptance among iPad users. Int. J. Human-Comput. Stud. 2019, 130, 93–112.
49. Hogarth, R.; Einhorn, H.J. Order effects in belief updating: The belief-adjustment model. Cogn. Psychol. 1992, 24, 1–55.
50. Conner, M.; Armitage, C. Extending the Theory of Planned Behavior: A Review and Avenues for Further Research. J. Appl. Soc. Psychol. 1998, 28, 1429–1464.
51. Kesharwani, A. Do (how) digital natives adopt a new technology differently than digital immigrants? A longitudinal study. Inf. Manag. 2019, 57, 103170.
52. Willis, G. Cognitive Interviewing: A Tool for Improving Questionnaire Design; Sage Publications: Thousand Oaks, CA, USA, 2005.
53. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
54. DeVellis, R. Scale Development: Theory and Applications, 2nd ed.; Applied Social Research Methods Series; Sage Publications: Thousand Oaks, CA, USA, 2003; Volume 26.
55. Cangur, S.; Ercan, I. Comparison of model fit indices used in structural equation modeling under multivariate normality. J. Mod. Appl. Stat. Methods 2015, 14, 152–167.
56. Hu, L.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. Multidiscip. J. 1999, 6, 1–55.
57. Lockee, B.B. Online education in the post-COVID era. Nat. Electron. 2021, 4, 5–6.
58. Lin, C.-L.; Jin, Y.Q.; Zhao, Q.; Yu, S.-W.; Su, Y.-S. Factors influence students’ switching behavior to online learning under COVID-19 pandemic: A push–pull–mooring model perspective. Asia-Pac. Educ. Res. 2021, 30, 229–245.
59. Mohammadi, H. Factors affecting the e-learning outcomes: An integration of TAM and IS success model (Retraction of vol 32, pg 701, 2015). Telemat. Inform. 2015, 32, R1.
60. Drennan, J.; Kennedy, J.; Pisarski, A. Factors Affecting Student Attitudes Toward Flexible Online Learning in Management Education. J. Educ. Res. 2005, 98, 331–338.
61. Venkatesh, V.; Davis, F.D. A model of the antecedents of perceived ease of use: Development and test. Decis. Sci. 1996, 27, 451–481.
62. Metallo, C.; Agrifoglio, R. The effects of generational differences on use continuance of Twitter: An investigation of digital natives and digital immigrants. Behav. Inf. Technol. 2015, 34, 869–881.
63. Lee, M.-C. Explaining and predicting users’ continuance intention toward e-learning: An extension of the expectation–confirmation model. Comput. Educ. 2010, 54, 506–516.
Figure 1. The longitudinal online learning acceptance model proposed in this study. TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage.
Figure 2. The final model and significant standardized path coefficients. * p < 0.05 ** p < 0.01; *** p < 0.001. Solid lines represent significant relationships while dotted lines represent non-significant relationships. TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage.
Table 1. Summary of demographic characteristics of the study sample.
Characteristics | Categories | n (%)
Gender | Male | 175 (69.7%)
 | Female | 76 (30.3%)
Type of student identity | Junior college students | 14 (5.6%)
 | Undergraduate students | 224 (89.2%)
 | Postgraduate students | 13 (5.2%)
Students’ major | Arts and humanities | 3 (1.2%)
 | Science | 20 (8%)
 | Engineering | 183 (72.9%)
 | Business | 16 (6.4%)
 | Other major | 29 (11.6%)
Frequency of using online learning before the semester | None | 29 (11.6%)
 | Several times or less per year | 84 (33.5%)
 | Several times or less per month | 75 (29.9%)
 | Several times or less per week | 43 (17.1%)
 | Several times or less per day | 20 (8.0%)
Table 2. Items for the constructs and their factor loadings for the three stages.
Constructs | Items | T1 | T2 | T3
TTF | The online learning system is fit for the requirements of my learning. | 0.87 | 0.90 | 0.91
 | Using online learning system fits with my e-learning practice. | 0.90 | 0.90 | 0.86
 | The functions in online learning system fit with my learning needs. | 0.88 | 0.88 | 0.95
 | The online learning system is suitable for helping me with my learning. | 0.86 | 0.92 | 0.91
PEOU | Learning to use online learning system is easy for me. | 0.75 | 0.71 | 0.81
 | I find it easy to use online learning system to do what I want it to do. | 0.69 | 0.74 | 0.88
 | It will be easy for me to become skillful at using online learning system. | 0.87 | 0.89 | 0.87
 | I will find online learning system easy to use. | 0.88 | 0.87 | 0.88
PU | Using online learning system is useful in meeting my learning needs. | 0.83 | 0.89 | 0.88
 | Using online learning system enables me to accomplish learning goals more quickly. | 0.77 | 0.70 | 0.80
 | Using online learning system improves the quality of my learning. | 0.86 | 0.87 | 0.93
 | I find online learning system useful in my learning. | 0.76 | 0.81 | 0.84
BI | I plan to use online learning system in the future. | 0.78 | 0.86 | 0.93
 | I am willing to use online learning system more in my future learning. | 0.91 | 0.92 | 0.93
 | I want to continuously use online learning system in my future learning. | 0.90 | 0.91 | 0.95
USE | In the past two months, the frequency of effective use of online learning system | - | - | -
TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage.
Table 3. Summary of Cronbach’s alpha, composite reliability, and AVE for the three stages.
Time Stages | Constructs | Number of Items | Cronbach’s Alpha | Composite Reliability | AVE
T1 | TTF | 4 | 0.93 | 0.93 | 0.77
 | PEOU | 4 | 0.87 | 0.87 | 0.63
 | PU | 4 | 0.88 | 0.88 | 0.65
 | BI | 3 | 0.90 | 0.90 | 0.75
T2 | TTF | 4 | 0.94 | 0.94 | 0.81
 | PEOU | 4 | 0.87 | 0.88 | 0.65
 | PU | 4 | 0.88 | 0.89 | 0.67
 | BI | 3 | 0.92 | 0.93 | 0.81
T3 | TTF | 4 | 0.95 | 0.95 | 0.82
 | PEOU | 4 | 0.92 | 0.92 | 0.74
 | PU | 4 | 0.92 | 0.92 | 0.75
 | BI | 3 | 0.95 | 0.96 | 0.88
AVE: Average variance extracted; TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention.
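The composite reliability and AVE values in Table 3 follow directly from the standardized loadings in Table 2 via the standard formulas of Fornell and Larcker [53]. The minimal Python sketch below is illustrative only (it is not the SEM software used for the reported analysis); it reproduces the TTF values at T1:

```python
# Composite reliability (CR) and average variance extracted (AVE) from
# standardized factor loadings, following Fornell and Larcker (1981).

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    s = sum(loadings)
    error = sum(1 - l**2 for l in loadings)
    return s**2 / (s**2 + error)

def ave(loadings):
    # AVE = mean of squared standardized loadings
    return sum(l**2 for l in loadings) / len(loadings)

ttf_t1 = [0.87, 0.90, 0.88, 0.86]  # TTF loadings at T1 (Table 2)
print(round(composite_reliability(ttf_t1), 2))  # 0.93, matching Table 3
print(round(ave(ttf_t1), 2))                    # 0.77, matching Table 3
```

Applying the same two functions to the other construct-by-stage loading sets in Table 2 reproduces the remaining CR and AVE entries of Table 3.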
Table 4. Discriminant validity of the measured constructs.
Factors | T1 TTF | T1 PEOU | T1 PU | T1 BI | T2 TTF | T2 PEOU | T2 PU | T2 BI | T2 USE | T3 TTF | T3 PEOU | T3 PU | T3 BI | T3 USE
T1 TTF | 0.88
T1 PEOU | 0.62 *** | 0.80
T1 PU | 0.79 *** | 0.63 *** | 0.81
T1 BI | 0.70 *** | 0.53 *** | 0.69 *** | 0.87
T2 TTF | 0.04 | 0.01 | 0.06 | 0.06 | 0.90
T2 PEOU | −0.06 | −0.02 | 0.00 | −0.01 | 0.62 *** | 0.81
T2 PU | −0.03 | −0.04 | 0.00 | −0.01 | 0.82 *** | 0.63 *** | 0.82
T2 BI | 0.00 | −0.04 | 0.04 | 0.01 | 0.72 *** | 0.58 *** | 0.74 *** | 0.90
T2 USE | 0.130 * | 0.155 * | 0.09 | 0.12 | −0.07 | −0.06 | −0.02 | −0.03 | -
T3 TTF | 0.00 | −0.05 | −0.03 | −0.10 | −0.04 | 0.00 | 0.01 | 0.03 | 0.03 | 0.91
T3 PEOU | −0.03 | −0.12 | −0.05 | −0.08 | −0.03 | 0.05 | −0.01 | 0.05 | 0.04 | 0.75 *** | 0.86
T3 PU | −0.01 | −0.05 | −0.01 | −0.11 | 0.04 | 0.08 | 0.08 | 0.11 | 0.02 | 0.86 *** | 0.74 *** | 0.86
T3 BI | −0.05 | −0.08 | −0.03 | −0.12 | −0.02 | 0.03 | 0.03 | 0.06 | −0.04 | 0.81 *** | 0.73 *** | 0.83 *** | 0.94
T3 USE | 0.09 | 0.06 | 0.05 | 0.02 | 0.05 | −0.07 | −0.01 | −0.06 | 0.06 | 0.02 | 0.00 | 0.02 | 0.02 | -
* p < 0.05; *** p < 0.001. TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage. The numbers in the diagonal row are square root of AVE.
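The discriminant validity check summarized in Table 4 is the Fornell–Larcker criterion [53]: the square root of each construct’s AVE (the diagonal of Table 4) should exceed its correlations with all other constructs. A minimal illustrative sketch (not the original analysis code) using the T1 values from Tables 3 and 4:

```python
import math

# Fornell-Larcker criterion at T1: sqrt(AVE) must exceed every
# inter-construct correlation involving that construct.

ave_t1 = {"TTF": 0.77, "PEOU": 0.63, "PU": 0.65, "BI": 0.75}  # Table 3
corr_t1 = {  # inter-construct correlations at T1 (lower triangle of Table 4)
    ("PEOU", "TTF"): 0.62, ("PU", "TTF"): 0.79, ("PU", "PEOU"): 0.63,
    ("BI", "TTF"): 0.70, ("BI", "PEOU"): 0.53, ("BI", "PU"): 0.69,
}

def discriminant_valid(construct):
    """True if sqrt(AVE) exceeds every correlation involving the construct."""
    sqrt_ave = math.sqrt(ave_t1[construct])
    return all(r < sqrt_ave for pair, r in corr_t1.items() if construct in pair)

for c in ave_t1:
    print(f"{c}: sqrt(AVE) = {math.sqrt(ave_t1[c]):.2f}, "
          f"valid = {discriminant_valid(c)}")
```

The printed square roots (0.88, 0.80, 0.81, 0.87) match the T1 diagonal of Table 4, and the criterion holds for all four constructs.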
Table 5. Main effects of time on constructs.
Factors | Stages | Mean | SD | F/t | p-Value
TTF | T1 | 2.36 | 0.75 | 3.929 | 0.020
 | T2 | 2.38 | 0.74
 | T3 | 2.25 | 0.78
PEOU | T1 | 2.13 | 0.63 | 4.81 | 0.009
 | T2 | 2.10 | 0.60
 | T3 | 2.01 | 0.64
PU | T1 | 2.43 | 0.69 | 10.196 | <0.001
 | T2 | 2.45 | 0.73
 | T3 | 2.26 | 0.79
BI | T1 | 2.45 | 0.79 | 6.183 | 0.002
 | T2 | 2.51 | 0.79
 | T3 | 2.33 | 0.88
USE | T1 | 4.52 | 0.68 | 3.375 | 0.001
 | T2 | 4.35 | 0.83
TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage.
Table 6. Fit indices for the tested model.
Fit Indices | Recommended Values | Tested Model
χ2/df | <3 | 1.642
RMR | <0.05 | 0.036
IFI | >0.9 | 0.941
TLI | >0.9 | 0.936
CFI | >0.9 | 0.941
RMSEA | <0.08 | 0.051
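The comparison in Table 6 can be expressed as a simple threshold check, with cutoffs following common recommendations [55,56]. A minimal sketch (illustrative only) using the values reported above:

```python
# Check each fit index in Table 6 against its recommended cutoff.

fit_indices = {  # name: (observed value, cutoff, comparison direction)
    "chi2/df": (1.642, 3.0,  "<"),
    "RMR":     (0.036, 0.05, "<"),
    "IFI":     (0.941, 0.9,  ">"),
    "TLI":     (0.936, 0.9,  ">"),
    "CFI":     (0.941, 0.9,  ">"),
    "RMSEA":   (0.051, 0.08, "<"),
}

def acceptable(value, cutoff, direction):
    # "<" means smaller is better; ">" means larger is better
    return value < cutoff if direction == "<" else value > cutoff

results = {name: acceptable(*spec) for name, spec in fit_indices.items()}
print(results)  # every index meets its recommended cutoff
```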
Table 7. Results of hypothesis testing.
Hypotheses | Path Coefficients | p-Values | Hypothesis Supported?
H1a: PU (T1)→BI (T1) | 0.79 | <0.001 *** | Yes
H1b: PU (T2)→BI (T2) | 0.77 | <0.001 *** | Yes
H1c: PU (T3)→BI (T3) | 0.73 | <0.001 *** | Yes
H2a: PEOU (T1)→BI (T1) | 0.01 | 0.922 | No
H2b: PEOU (T2)→BI (T2) | 0.09 | 0.127 | No
H2c: PEOU (T3)→BI (T3) | 0.20 | 0.002 ** | Yes
H3a: PEOU (T1)→PU (T1) | 0.18 | 0.002 ** | Yes
H3b: PEOU (T2)→PU (T2) | 0.11 | 0.017 * | Yes
H3c: PEOU (T3)→PU (T3) | 0.10 | 0.112 | No
H4a: BI (T1)→USE (T2) | 0.13 | 0.045 * | Yes
H4b: BI (T2)→USE (T3) | −0.05 | 0.447 | No
H5a: USE (T2)→BI (T2) | 0.02 | 0.592 | No
H5b: USE (T3)→BI (T3) | 0.00 | 0.968 | No
H6a: TTF (T1)→PEOU (T1) | 0.62 | <0.001 *** | Yes
H6b: TTF (T2)→PEOU (T2) | 0.62 | <0.001 *** | Yes
H6c: TTF (T3)→PEOU (T3) | 0.80 | <0.001 *** | Yes
H7a: TTF (T1)→PU (T1) | 0.77 | <0.001 *** | Yes
H7b: TTF (T2)→PU (T2) | 0.84 | <0.001 *** | Yes
H7c: TTF (T3)→PU (T3) | 0.85 | <0.001 *** | Yes
H8a: PEOU (T1)→PEOU (T2) | −0.03 | 0.567 | No
H8b: PEOU (T2)→PEOU (T3) | 0.04 | 0.338 | No
H9a: PU (T1)→PU (T2) | −0.07 | 0.052 | No
H9b: PU (T2)→PU (T3) | 0.09 | 0.007 ** | Yes
H10a: BI (T1)→BI (T2) | 0.03 | 0.48 | No
H10b: BI (T2)→BI (T3) | −0.04 | 0.318 | No
H11a: TTF (T1)→TTF (T2) | 0.04 | 0.511 | No
H11b: TTF (T2)→TTF (T3) | −0.03 | 0.618 | No
H12: USE (T2)→USE (T3) | 0.06 | 0.314 | No
* p < 0.05; ** p < 0.01; *** p < 0.001. TTF: Task–technology fit; PEOU: Perceived ease of use; PU: Perceived usefulness; BI: Behavioral intention; USE: Actual usage.
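As Table 7 shows, the indirect path from PEOU to BI runs through PU at the earlier stages. Under the usual product-of-coefficients approximation (a rough sketch only; a formal mediation test would use bootstrapped confidence intervals), the standardized indirect effect at each stage can be computed from the Table 7 coefficients:

```python
# Approximate indirect effect of PEOU on BI via PU as the product of the
# two standardized path coefficients (Table 7). Illustrative sketch only.

paths = {  # stage: (PEOU->PU, PU->BI) from Table 7
    "T1": (0.18, 0.79),
    "T2": (0.11, 0.77),
    "T3": (0.10, 0.73),  # note: PEOU->PU is non-significant at T3
}

indirect = {stage: round(a * b, 3) for stage, (a, b) in paths.items()}
print(indirect)  # {'T1': 0.142, 'T2': 0.085, 'T3': 0.073}
```

The shrinking products are consistent with the discussion in Section 5: the mediated route from perceived ease of use to behavioral intention weakens over time, while its direct effect emerges at T3.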
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
