Article

University Accounting Students and Faculty Members Using the Blackboard Platform during COVID-19; Proposed Modification of the UTAUT Model and an Empirical Study

1 Accounting Department, College of Business Administration, Jazan University, Jazan 45142, Saudi Arabia
2 School of Accounting, Information System and Supply Chain, College of Business and Law, RMIT University, Melbourne 3000, Australia
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(4), 2360; https://doi.org/10.3390/su14042360
Submission received: 13 January 2022 / Revised: 5 February 2022 / Accepted: 9 February 2022 / Published: 18 February 2022

Abstract

The current COVID-19 pandemic has changed education systems in most countries: some have shut down, whilst others, especially in the higher education sector, have introduced electronic/distance learning systems, such as Blackboard platforms. The current study aimed to identify and test the factors that influence accounting students’ and faculty members’ Blackboard platform use during the pandemic. The Unified Theory of Acceptance and Use of Technology (UTAUT) model was extended and modified by adding four new variables: perceived risk, mobility, self-efficacy and self-managed learning. This was done to generate an understanding of people’s usage traits. The main contribution of the article is this extension of UTAUT and the consideration of online learning in a pandemic environment. Pandemics undermine sustainability in numerous respects. Measures which counteract sustainability risks associated with pandemics, such as the use of technology, are critical risk management instruments and are, thus, important to consider. Through an online survey, data were gathered from accounting students and faculty staff in the accounting department at Jazan University, Saudi Arabia. Structural equation modelling (SEM) was used to analyse the data and examine the hypotheses. The study confirmed the hypotheses concerning the influence of mobility, self-efficacy and self-managed learning on the behavioural intention to use the Blackboard platform. This article contributes to the existing UTAUT model by extending our understanding of the factors that influence the use of Blackboard platforms. Moreover, the results have practical implications for policymakers, practitioners, online learning product providers and teaching staff looking to develop efficient strategies concerning learning-related information technologies.

1. Introduction

In today’s world, the use of technology in education is ubiquitous. As online technology improves and changes, for instance learning management systems (LMSs), we see the rise of robust software systems devised to manage education activities, with a focus on assisting instructors to impart knowledge to their students. LMSs can assist learning institutions in storing, managing, and sharing subject and program content [1]. The COVID-19 pandemic has greatly disrupted traditional forms of learning and teaching [2,3] and, furthermore, triggered a major global crisis in how the tertiary education system functions [3]. The impacts are numerous and have required emergency response thinking. One such example was identified in a survey carried out by Times Higher Education in 2020, which revealed that universities’ finances continue to suffer and that, as an emergency response, their management teams have been required to make urgent decisions.
These decisions have included various cost-cutting initiatives, such as people being laid off or forced to work from home without infrastructure support [4]. A direct result of this crisis response by higher education authorities has been the complete transfer to online learning. Guri-Rosenblit (2005) [5] defines electronic learning as the utilisation of electronic media to augment traditional classrooms and, to some extent, to replace physical meetings with online encounters. E-learning is technology-based learning involving the electronic delivery of materials to remote students through a computer network. E-learning mechanisms became widespread and popular applications in universities that supported course learning digitally during the pandemic [4,6].
Numerous factors influence online learning; there is a feeling of engagement and reward for students when they attend courses electronically [7]. According to Rovai and Downey (2010) [8], adequate training is mandatory for teachers, as is the availability of resources to promote meaningful outcomes. If there is no confidence in fully online learning, then both learning and teaching risk failing [9]. Universities that use Blackboard, Zoom, and Google Classroom to produce learning systems can experience serious technical issues that create hurdles for students and faculty members [10]. In the process of utilising online learning technologies, such as Blackboard, unsuitable infrastructure as well as the absence of technical support can create issues relating to the delivery of quality education.
As far as online teaching and learning platforms are concerned, Blackboard holds the highest market share [11]. Blackboard is more popular in developed countries than in developing ones. Recently, Blackboard has offered a variety of opportunities and features. Research and marketing studies find that usage of online learning platforms is accelerating: the market was valued at $13 billion in 2020 and could potentially be worth $25 billion by 2025. It is acknowledged that a high level of usability is critical for online learning systems, and they need to be designed and implemented successfully for program and course delivery [12].
Hence, institutions are expected to carry out a careful evaluation of the usability of online learning systems beforehand. As suggested earlier, the COVID-19 pandemic has led to much greater reliance on online systems for education delivery as an emergency response. The purpose of this article is to obtain feedback from users of Blackboard in this crisis environment. The study seeks to identify and test the factors that have impacted accounting students’ and faculty members’ use of this platform throughout the COVID-19 crisis. The Accounting Department at Jazan University in the Kingdom of Saudi Arabia has been selected for this purpose; the main justification for this is that the deployment of e-learning platforms was in its infancy in Saudi universities prior to the pandemic. The pandemic subsequently created a crisis situation. Crises typically require emergency responses and actions. Although emergency responses, such as the rapid transfer to e-learning, have been required at numerous HE institutions around the world, an environment such as a Saudi university helps to identify additional critical factors which exert an influence on rapid technology adoption.
Students and faculty members in the accounting discipline were selected because they typically comprise large cohorts, and because technology use has recently been promoted in the discipline from an education perspective. The key contribution of this article is to evaluate user experience of the Blackboard platform during the pandemic. The following section discusses the relevance of the Blackboard platform, as a risk management tool for sustainability, during the pandemic. In addition, this paper contributes to the existing UTAUT model by extending our understanding of the factors that influence the use of Blackboard platforms.
The article is organised as follows. After the introduction, the second section reviews the relevant literature. The third section presents the classical UTAUT model and develops the research model by proposing ten hypotheses. The fourth section explains the methods used to conduct the analysis. The major findings are discussed in the fifth section. Lastly, the findings are summarised, implications are drawn, and future research and limitations are explained.

2. Literature Review

Sustainability, COVID-19 Pandemic and Technology as a Risk Management Tool

Sustainable Development Goal (SDG) 4 promotes inclusive and equitable education for all [13]. In addition, according to the United Nations (2022) [13], the COVID-19 pandemic has wiped out 20 years of education gains. The severe lack of access to necessary infrastructure and facilities, not only in schools but also at higher education institutes, has posed a major risk to the sustainability of quality education. Information and communication technologies (ICTs), including online learning technologies, play a critical role in pursuing the SDGs [14], including SDG 4, especially during the pandemic, when conventional infrastructure and resource access has been denied due to COVID-19 related restrictions such as lockdowns [15].
Thus, during the pandemic, universities throughout the world started offering distance learning programs and, hence, ICT usage became critical. ICTs have, thus, become critical risk management instruments in the pandemic environment. These institutions have, by and large, been able to develop models incorporating ICT into their education systems, programs and courses [6,7,16]. Such e-learning frameworks enhance learning and reduce the unfavourable outcomes of traditional teaching techniques that have been severely restricted due to the pandemic. Al-araibi et al. (2019) [17] identified that e-learning initiatives in developing nations are either completely unsuccessful (45%), partially unsuccessful (40%), or successful (only 15%). Other researchers in the IS/IT discipline have covered the factors that shape the successful use of e-learning systems [17,18].
E-learning is a method of learning that offers resources through internet browsers, with which students are already familiar. Conventional instruction segments (evaluation, support, cooperation, philosophy, explicit content, and development) are repeated and adjusted in e-learning innovations [10]. Additionally, e-learning frameworks motivate the learning cycle and can be adapted to suit individuals’ personal learning styles [19,20]. According to Sawaftah and Aljeraiwi (2018) [10], there is a remarkable difference between conventional educational techniques and internet teaching strategies, thereby necessitating careful development, checking, and control. Likewise, according to Sawaftah and Aljeraiwi (2018) [10], the term “blended” stands for merging several instructional strategies: non-organised and organised, shared and individual, online and offline, on-location and off-location via a website, synchronous and asynchronous. Being able to adjust instructional strategies to a personal learning style is the best way of applying the blended learning concept.
However, according to others [17,18], technological issues such as the absence of security, inadequate infrastructure and confidentiality concerns have been identified as leading problems with e-learning [17,18,21]. Naveed et al. (2017) [22] found that a lack of student awareness was responsible for unsuccessful e-learning. Furthermore, Al-araibi et al. (2019) [17] stated that universities’ readiness can determine the success or otherwise of e-learning. A low level of implementation was a key factor prior to the pandemic and still occurs due to the reluctance of learners to embrace new technology in developing nations [18,23,24]. Consequently, empirical research is required to understand and test the factors that affect HE students’ and faculty members’ behavioural intention to use e-learning systems, specifically the Blackboard platform, during the COVID-19 pandemic.

3. Theoretical Framework and Hypotheses Development

A number of studies have suggested technology acceptance theories and models to explain and predict users’ acceptance of a particular hardware or software. It has become important to consider the extent of users’ experiences [25]. Numerous researchers have proposed their own acceptance models to explain people’s actions and behaviours [26,27]. Some theoretical frameworks need to be highlighted, such as TAM—the technology acceptance model [27], MM—the motivational model [28], IDT—innovation diffusion theory [29], TPB—the theory of planned behaviour [30], C-TAM-TPB—the combined TAM and TPB model [31], SCT—social cognitive theory [32], TAM 2 [33], UTAUT [34] and, more recently, TAM 3 [35]. Several studies conducted extensive comparative analyses of models and theories pertaining to information technology use [25,36,37].
Among these, UTAUT and the TAMs are very popular in the technology acceptance literature. The UTAUT model assesses two dependent variables (DVs), behavioural intention (BI) and use behaviour (UB), by estimating the impact of four main independent variables (IVs): performance expectancy (PE), social influence (SI), effort expectancy (EE), and facilitating conditions (FC) [34]. Likewise, UTAUT pays attention to the impact of four moderators (M): voluntariness of use, past experience with technologies, gender, and age. Some researchers have modified the classical form of the UTAUT model by adding further determinants and independent variables and by removing older aspects of it. These modifications pay attention to novel variables such as system flexibility [38], system interactivity [39], system enjoyment (SE) [40], or results demonstrability [41].
UTAUT is useful when it comes to surveys devised for a variety of technology adoption contexts, such as e-government [42], e-commerce [43], e-learning or m-learning [44,45,46], internet banking [47], or social networking [48]. Moreover, UTAUT makes it possible for organisational research to study various viewpoints, such as gender [49] and cross-cultural and cultural differences [50]. E-learning is a critical theme in the latest developments of IS within the novel technological platforms of virtual reality and virtual learning environments (VLEs) [25]. The aim of a VLE, as a computerised system, is to enhance learning and teaching tasks within a conventional setting [51,52]. Such e-learning contexts operate online and assist instructors in tasks such as assessing, communicating, managing class materials, and collecting and organising grades [51]. Diffusion of innovation refers to the communication of an innovation via particular channels amongst members of a social system and considers the spreading of novel ideas and concepts [53]. In numerous higher education organisations, communicating e-learning techniques has helped to spread e-learning as an innovation.
When an e-learning strategy is clear and communicated properly, any concerns or doubts about e-learning are minimised [20,54]. According to McLean (2005) [55], academics may be reluctant to engage in e-learning tasks regardless of their interest in technology. Further, what is often absent is a demonstrable institutional e-learning technique that can encourage academic teachers to embrace e-learning. With this in mind, TAM can predict the success of technology adoption by 30% and TAM2 by 40% [36], while the prediction rises to around 70% when the moderators and variables in the UTAUT model are combined [56]. Other reasons for choosing UTAUT include agreement with [57], who found that UTAUT is ideal for further investigating the IS context. An empirical test was carried out by [47] on five TAMs, and it showed that UTAUT is a suitable model for assessing e-textbook acceptance and other aspects of e-learning technology that are covered in this paper.
Thus, the current study utilises UTAUT in order to determine the factors that impact accounting students’ and faculty members’ use of Blackboard platforms. The UTAUT model was developed by examining eight models: the “motivational model (MM)”, “TAM”, “TRA”, “model of PC utilization (MPCU)”, “TPB”, “combined TAM and TPB (C-TAM-TPB)”, “social cognitive theory (SCT)”, and “innovation diffusion theory (IDT)” [34]. Currently, scholars deem the UTAUT model to be the best one for investigating different technology applications. It is widely applicable in that it possesses satisfactory power to explain technology use behaviour (more than 70%) [56,58]. The development of the UTAUT 2 model meant adding more variables to the basic UTAUT model. Since the institution under consideration utilises the Blackboard platform for e-learning, users do not have to pay for access to it. It should be noted that users might not be experienced at all, or only minimally, in the use of the Blackboard platform. This study considers the UTAUT model’s four major factors (PE, EE, SI, and FC), which guide technology use behaviour (UB) and behavioural intention (BI), as shown in Figure 1. PE, EE, SI, and FC are the four determining constituents of usage behaviour and BI [34]. The moderators that shape technology use include voluntariness of use, experience, age, and gender (see Figure 1).
A suitable model must capture all factors affecting the intention of Blackboard platform users in the context of the selected institution. This research chose the UTAUT model as the theoretical foundation for creating the conceptual model. The literature shows that the dominant additional factors affecting the adoption and usage of Blackboard are perceived risk (PR), mobility (M), self-efficacy (SE) and self-managed learning (SML). This study attempted to expand the UTAUT model with new variables that could explain Blackboard platform use in a particular setting. The proposition of this research is that the behavioural intention to use the platform is guided by PR, SE, PE, EE, SI, SML, M, and FC. The suggested conceptual model is displayed in Figure 2, and a sketch of how such a model could be specified for estimation is given below.
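To make the extended model concrete, the sketch below shows how such a structural model could be specified in lavaan-style syntax and estimated with the Python package semopy. This is an illustrative sketch only, not the authors’ analysis code: the item names (e.g., PR1–PR3), the data file name and the choice of semopy are assumptions.

```python
# Illustrative sketch of the extended UTAUT model in lavaan-style syntax,
# estimated with semopy. Item names and the CSV file are hypothetical.
import pandas as pd
import semopy

MODEL_DESC = """
# measurement part: each latent construct measured by its survey items
PR  =~ PR1 + PR2 + PR3
SE  =~ SE1 + SE2
PE  =~ PE1 + PE2 + PE3
EE  =~ EE1 + EE2 + EE3
SI  =~ SI1 + SI2 + SI3
Mob =~ Mob1 + Mob2 + Mob3
SML =~ SML1 + SML2 + SML3
FC  =~ FC1 + FC2 + FC3
BI  =~ BI1 + BI2
USE =~ USE1 + USE2

# structural part: H1-H8 predict behavioural intention, H9-H10 predict use
BI  ~ PR + SE + PE + EE + SI + Mob + SML + FC
USE ~ FC + BI
"""

survey = pd.read_csv("blackboard_survey.csv")  # hypothetical item-level Likert data
model = semopy.Model(MODEL_DESC)
model.fit(survey)
print(model.inspect())  # parameter estimates, including the structural paths
```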

3.1. Perceived Risk

Pavlou (2003) [59] defines perceived risk (PR) as the unknown direct impact of an electronic transaction on users. When it comes to the Blackboard platform, the perceived risk concerns how this technology will affect the behavioural intention of users [60]. The literature regards perceived risk as an important factor that could wield a negative impact on the use of the technology [61,62]. Hence, this study posits the following hypothesis:
Hypothesis 1 (H1).
Perceived risk negatively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.2. Self-Efficacy

Self-efficacy (SE) refers to the degree to which an individual believes they can use a specific technology to achieve a particular task [63,64]. Prior studies pointed out that SE exerts a significant impact on behavioural intention in the e-learning context [47,65]. On this basis, the following hypothesis is suggested:
Hypothesis 2 (H2).
Self-efficacy positively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.3. Performance Expectancy

Performance expectancy (PE) is defined as the level to which someone believes that a certain technology will improve their efficiency and performance [34]. PE indicates the efficacy of learning and retrieving essential information through e-learning anytime and anywhere [66]. Prior studies pointed out that PE significantly impacts BI towards UB in the e-learning context [11,67]. On this basis, the following hypothesis is proposed:
Hypothesis 3 (H3).
Performance expectancy positively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.4. Effort Expectancy

Effort expectancy (EE) refers to the ease of use of a certain system [34]. The literature states that EE is one of the most effective factors affecting e-learning adoption [11,34,68]. An improved ease of using e-learning is expected to heighten users’ BI [66]. Prior studies pointed out that EE wields a significant impact on BI regarding UB in the e-learning context [66,68]. Therefore, this study puts forward the subsequent hypothesis:
Hypothesis 4 (H4).
Effort expectancy positively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.5. Social Influence

Social influence (SI) is defined as the level to which someone’s technology usage is influenced by the opinions of others [34]. Social influence corresponds to TAM2’s subjective norm and the social norms of the theory of reasoned action (TRA). SI is expected to be among the most critical and dominant factors in predicting the acceptance of a novel technology. According to the literature, SI positively affects BI concerning the use of Blackboard [33]. Consequently, this study posits the hypothesis:
Hypothesis 5 (H5).
Social influence positively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.6. Mobility

Mobility (Mob) refers to flexible access to the Blackboard platform, unconstrained by time and location [69]. The literature on mobile learning shows that mobility is a key factor that impacts the BI to utilise a system or technology [70]. With reference to the Blackboard platform, it is also possible for mobility to directly affect the behavioural intention to utilise Blackboard. Hence, the following hypothesis is proposed:
Hypothesis 6 (H6).
Mobility positively impacts on the behavioural intention of the learner to utilise the Blackboard platform.

3.7. Self-Managed Learning

Self-managed learning (SML) refers to people managing their learning by themselves; in effect, they are independent learners [71]. The literature shows that SML is a critical factor in digital learning [11,72]. Al-Adwan et al. (2018) [72] discovered that SML does not significantly influence BI. Yet, SML is investigated here because other studies concentrated on different platforms rather than the Blackboard platform. Hence, the following hypothesis is considered:
Hypothesis 7 (H7).
Self-Managed Learning positively impacts on the behavioural intention to utilise the Blackboard platform.

3.8. Facilitating Conditions

Facilitating conditions (FC) comprise the organisational support offered in using systems and technology [34]. When it comes to Blackboard, FC is the support offered by the university, including the provision of remote access, resources, training on platform usage, management support, etc. According to Cheong et al. (2004) [61], FC is critical to BI. In the opinion of Venkatesh et al. (2003) [34], FC can directly affect a technology’s UB. Sultana (2020) [11] noted the possibility of FC having a direct effect on both a technology’s UB and BI. These considerations lead to the hypotheses below:
Hypothesis 8 (H8).
Facilitating conditions positively impact on the behavioural intention of the learner to utilise the Blackboard platform.
Hypothesis 9 (H9).
Facilitating conditions positively impact on the use behaviour of the learner in utilising the Blackboard platform.

3.9. Behavioural Intention

Behavioural intention (BI) deals with the willingness to embrace a certain technology [73,74,75,76]. Venkatesh and Davis (2000) [33] argue that BI influences technology usage behaviour. According to the literature, BI greatly influences a technology’s UB [11,34,68,72,77]. On this basis, the following hypothesis is proposed:
Hypothesis 10 (H10).
Behavioural intention positively impacts on the use behaviour of the Blackboard platform.

4. Research Methodology

4.1. Measurement Instrument

Each factor’s measurement items are based on what is documented in the literature to assess the multiple factors mentioned above and on the predictive validity of the survey instrument [78,79]. Table 1 summarises all the factors’ measurement items.

4.2. Questionnaire Design and Data Collection

A survey in the form of a structured questionnaire served to collect information. Users of Blackboard in the accounting department at Jazan University, Saudi Arabia, participated in the survey. Jazan University is a fairly new institution that was established in the mid-2000s as part of the government’s concerted effort to expand the nation’s higher education system. It is now one of the largest public sector universities in the Kingdom of Saudi Arabia (KSA). The questionnaire consisted of two parts (Part A and Part B). The introduction to the questionnaire noted the purpose, respondents’ rights, ethical standards and time requirements. The aim of Part A was to gather demographic information on respondents’ engagement with and understanding of the technology. Part B contained self-explanatory statements related to the factors’ measurement items (Table 1) to collect people’s opinions on Blackboard platform usage with a 5-point Likert scale. This scale ranged from 1 (strongly disagree) to 5 (strongly agree). A 5-point Likert scale offers good quality data and is able to generate very clear responses [79].
In order to determine whether the questionnaire was appropriate and whether respondents understood it, the researchers carried out a pilot study. The choice of respondents for the pilot study was based on their educational background, skills, and present position. The groups of users included students, administration personnel and academic staff. The authors obtained ten responses, and modifications were made based on the respondents’ feedback. Researchers can differ on the issue of appropriate sample size; ideally, it should be based on margin of error, variance, level of confidence, and population [80]. Roscoe (1975) [81] suggested that the calculation of sample size should be based on the aggregate number of items. Muthén and Muthén (2002) [82] argue that a sample should be more than 150. This study obtained an aggregate of 222 responses (out of 479 people who were emailed at Jazan University) and then carried out a statistical analysis. The data collection was conducted between July and August 2021. Such a timeframe, as well as the study population representation, meant that the sample size was suitable for executing the statistical analysis. Prior to the survey, the researchers informed all respondents about the aim of the project and their right to participate and to withdraw at any time.

4.3. Demographic Characteristics of the Respondents

Table 2 summarises the demographic characteristics of respondents. In terms of gender, there were 58.10% women and 41.90% men, while most participants were in the 18–25 age range. Accounting students typically responded readily, as they had simple access to the questionnaire. About 89.1% of them were bachelor’s degree students. Regarding ethnicity, there was some variety, comprising Saudi nationals and people from overseas, which made it possible to capture various kinds of perceptions and answers.

4.4. Measurement Model

Structural equation modelling (SEM) was utilised in this study. SEM of the survey data consists of two major phases: confirmatory factor analysis (the measurement model) and path analysis (the structural model) [85,86,87,88]. In order to thoroughly analyse the data, the authors first measured overall model fitness using a variety of statistical methods, namely the chi-square, root mean square error of approximation (RMSEA), goodness-of-fit index (GFI), comparative fit index (CFI), Tucker–Lewis index (TLI), and adjusted goodness-of-fit index (AGFI) [86,87,88,89]. For the confirmatory factor analysis, tests were employed to validate the measurement model: (1) goodness-of-fit indices, (2) discriminant validity, and (3) convergent validity [85,86,87,88,89]. This study’s measurement model attained good levels of goodness-of-fit indices, as revealed in Table 3 below; a sketch of how such indices could be computed follows.
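For readers who wish to reproduce this kind of check, the sketch below illustrates how the CFA fit indices named above could be obtained. It assumes the Python package semopy and its calc_stats helper (with column names as in the semopy documentation), abbreviated construct and item names, and a hypothetical data file; it is not the authors’ actual code.

```python
# Minimal CFA sketch: fit the measurement model and print global fit indices.
import pandas as pd
import semopy

CFA_DESC = """
PE =~ PE1 + PE2 + PE3
EE =~ EE1 + EE2 + EE3
BI =~ BI1 + BI2
"""  # abbreviated; the full model would list every construct and its items

survey = pd.read_csv("blackboard_survey.csv")   # hypothetical item-level data
cfa = semopy.Model(CFA_DESC)
cfa.fit(survey)

stats = semopy.calc_stats(cfa)                  # DataFrame of global fit statistics
print(stats[["chi2", "RMSEA", "CFI", "TLI", "GFI", "AGFI"]].T)
```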
The test of reliability was carried out with composite reliability (CR) and Cronbach’s alpha, with a minimum value of 0.70 required for each. CR provides a measure of construct internal consistency and is a suitable replacement for Cronbach’s alpha [86,87]. The aforementioned reliability conditions were met in this study: the Cronbach’s alpha values ranged between 0.716 and 0.874, and the CR values between 0.72 and 0.87, as shown in Table 4. The current study also involved evaluation of discriminant validity and convergent validity in the model. All standardised factor loadings (SFLs) must be ≥0.60 [86,87]. Furthermore, every construct’s CR value must be at least 0.70 [86,87,88,89], while a minimum of 0.50 is necessary for the average variance extracted (AVE) value [89]. After eliminating items (SE3, BI3, and USE3) whose SFLs fell below the threshold, all remaining constructs and items satisfied the required conditions for good convergent validity. The items’ values of CR, AVE, and SFL, which ranged between 0.72 and 0.87, between 0.52 and 0.78, and between 0.61 and 0.93, respectively, exhibited good convergent validity. Each item’s squared multiple correlation (SMC) should be ≥0.4, as this expresses the extent to which an item measures its construct [88]. Table 4 below summarises the convergent validity and reliability figures; a small sketch of the underlying formulas follows.
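The reliability and convergent-validity statistics reported above follow simple formulas, sketched below with illustrative numbers (not the study’s data): Cronbach’s alpha from raw item responses, and CR and AVE from standardised factor loadings.

```python
# Sketch of the reliability/convergent-validity formulas used above.
# Item names and loading values are illustrative, not the study's data.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    s = loadings.sum()
    return s**2 / (s**2 + (1 - loadings**2).sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean(lambda^2)."""
    return (loadings**2).mean()

pe_loadings = np.array([0.78, 0.81, 0.69])       # illustrative SFLs (all >= 0.60)
print(composite_reliability(pe_loadings))        # should be >= 0.70
print(average_variance_extracted(pe_loadings))   # should be >= 0.50
```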
In the test for discriminant validity, Fornell and Larcker (1981) [87] and Byrne (2013) [86] state that the square root of the AVE value for each latent variable must be greater than its correlation estimates with the other constructs [86,87,88]. For this reason, the researchers compared the square root of the AVE of every construct with its correlation coefficients with the other constructs (see Table 5 below and the sketch that follows).
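The Fornell–Larcker comparison can be expressed directly, as in the sketch below; the AVE values and correlation matrix shown are illustrative placeholders rather than the values in Table 5.

```python
# Fornell-Larcker discriminant-validity check: sqrt(AVE) of each construct
# must exceed its correlations with every other construct. Values are illustrative.
import numpy as np
import pandas as pd

ave = pd.Series({"PE": 0.62, "EE": 0.58, "BI": 0.66})          # hypothetical AVEs
corr = pd.DataFrame([[1.00, 0.41, 0.47],
                     [0.41, 1.00, 0.39],
                     [0.47, 0.39, 1.00]],
                    index=ave.index, columns=ave.index)        # hypothetical correlations

sqrt_ave = np.sqrt(ave)
for c in ave.index:
    others = corr.loc[c].drop(c)
    ok = (sqrt_ave[c] > others).all()
    print(f"{c}: sqrt(AVE)={sqrt_ave[c]:.2f}, max corr={others.max():.2f}, valid={ok}")
```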

4.5. Structural Model/Path Analysis

Path analysis involves testing a structural model for the level of dependence between the set of control and independent variables on one side, and the dependent variables on the other [86,87,88,89]. The goodness-of-fit indices met the cut-off thresholds for χ2/d.f., RMSEA, CFI, AGFI, GFI, and TLI; they are illustrated in Table 6 immediately below.
Table 7 illustrates that PR exerted an insignificant influence on BI (β = 0.072); thus, H1 was unsupported. SE positively and significantly influenced BI (β = 0.24, p < 0.05); therefore, H2 was supported. H3, which posits that PE positively and significantly influences BI (β = 0.16, p < 0.05), was also supported. H4, which concerns EE, was supported, as EE exerted a significant influence on BI (β = 0.23, p < 0.01). According to H5, SI positively and significantly influenced BI (β = 0.25, p < 0.05); thus, it is supported. H6, which contends that SML has a significant and direct influence on BI (β = 0.32, p < 0.01), was supported. H7, which assumes that M wields a significant influence on BI (β = 0.17, p < 0.05), was supported. Conversely, H8, which is concerned with the influence of FC on BI (β = −0.13), was not supported, because this influence was insignificant. Yet, FC is also part of H9, which posits that FC significantly influences USE (β = 0.19, p < 0.05); this hypothesis was supported. Finally, H10 was supported, because there was a positive and significant relationship between BI and USE (β = 0.26, p < 0.001) (see Table 7). A sketch of how such path estimates can be extracted and screened is given below.
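As an illustration of how such path estimates can be extracted and screened for significance, the sketch below reuses the semopy assumption from earlier; the inspect() column names ('lval', 'op', 'rval', 'Estimate', 'p-value'), the data file and the use of composite construct scores are assumptions, and the output would not reproduce Table 7.

```python
# Sketch: read the structural path estimates and flag hypothesis support at p < 0.05.
import pandas as pd
import semopy

DESC = """
BI  ~ PR + SE + PE + EE + SI + Mob + SML + FC
USE ~ FC + BI
"""  # structural part only, estimated on composite scores per construct

scores = pd.read_csv("construct_scores.csv")     # hypothetical averaged item scores
model = semopy.Model(DESC)
model.fit(scores)

est = model.inspect()
paths = est[est["op"] == "~"]                    # keep the regression (structural) paths
for _, row in paths.iterrows():
    print(f"{row['lval']} <- {row['rval']}: beta = {row['Estimate']:.2f}, "
          f"p = {row['p-value']:.3f}, supported = {row['p-value'] < 0.05}")
```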

5. Discussion

The findings documented here show that perceived risk has only an insignificant negative impact on the behavioural intention to utilise the Blackboard platform, which contradicts other studies [61,62,90,91]. Those studies reported that an increase in the perceived risk of using a platform such as Blackboard reduces the behavioural intention to use it. Hence, Blackboard users at Jazan University need to know the risks of such systems affecting their online learning methods, and this issue needs to be examined in more detail at the university. Regarding the impact of self-efficacy on the behavioural intention to employ Blackboard, the outcomes revealed that self-efficacy exhibited a significant positive influence on behavioural intention, which agrees with other analyses, such as that of Alshammari (2020) [92], who asserted that users are more likely to adopt the Blackboard platform if it proves to be effective.
Hence, the creators and vendors of Blackboard must update the platform so that it works efficiently at Jazan University and ensure that there are no problems with it. Moreover, users who are technologically very adept are more likely to exhibit a behavioural intention to use Blackboard platforms than those with low technology self-efficacy. Consequently, self-efficacy is deemed to be a critical predictor of the behavioural intention to use Blackboard. Nevertheless, other studies do contradict this finding, because they claim there is no association between effort expectancy and behavioural intention [93,94,95,96]. Nonetheless, Venkatesh et al. (2003) [34], who developed the original UTAUT model, confirmed in their study that self-efficacy was a significant predictor of behavioural intention when using technology; their finding backs up what is reported here.
This study revealed that performance expectancy positively impacted the behavioural intention to use Blackboard, indicating that the performance of the technology is vital to people’s perceptions, which positively impacts use. This finding agrees with other studies, for example Sultana (2020) [11], Cheong et al. (2004) [61], and Abu-Al-Aish and Love (2013) [68]. However, others differ, e.g., Šumak et al. (2010) [90]. When students and faculty members perceive that the Blackboard platform functions excellently, they will embrace it. Moreover, an increase in the perceived efficiency of Blackboard encourages the behavioural intention to use it, which can improve the systems involved in e-learning at Jazan University. Another finding is that effort expectancy exhibited a significant and positive impact on the behavioural intention to use the Blackboard platform, which is in line with two studies, namely Sultana (2020) [11] and Abu-Al-Aish and Love [68]. Nonetheless, it can be debated whether effort expectancy is a vital predictor of behavioural intention leading to increased use of Blackboard. Those in charge of developing and updating the platform need to focus on devising easy and user-friendly aspects of the technology for staff and students at Jazan University.
Social influence exhibited a significant and positive influence on the behavioural intention to use Blackboard, which echoes what other research has found; for instance, Raman (2014) [95] and El-Masri and Tarhini (2017) [93] noted that important peers and social pressure positively influence the behavioural intention to access Blackboard. Their finding is echoed in this paper. The finding for self-managed learning is that it has a significant and positive effect on the behavioural intention to use the Blackboard platform, which agrees with what other studies, such as Sultana (2020) [11], reported, yet contradicts the research of Al-Adwan et al. (2018) [72]. Nevertheless, self-managed learning does not appear to produce any issues concerning validity and reliability. Thus, it can be claimed that self-managed learning is a critical aspect of Blackboard. Another finding is that the mobility factor turned out to be significant and positive regarding the behavioural intention to use Blackboard, which is consistent with the findings of Mallat et al. (2008) [70], but not with the more recent study by Sultana (2020) [11]. Therefore, the findings for self-efficacy, performance expectancy, effort expectancy, social influence, self-managed learning, and mobility indicate that if students and faculty members perceive a technology to be very beneficial for their work, they will not hesitate to embrace it. Moreover, if prospective users realise and understand that their peers employ Blackboard, they will take it up as well.
The finding concerning facilitating conditions is that they do not impact on behavioural intention, which contradicts the conclusions of other research, for example Mallat et al. (2008) [70], although there is some agreement with more recent work [90,93,94,95,97]. Facilitating conditions are concerned with technical support, provision of technology, etc., but, alone, they might not be enough to motivate people to accept the technology. Other considerations were access to computers, organisational support, training, etc., which all play a role in the use of new technology. Our result aligns with other studies documented, for instance, Buabeng-Andoh and Baah (2020) [95], who argued that facilitating conditions do not have a positive influence on behavioural intention when it comes to technology usage. Nonetheless, facilitating conditions significantly influenced the use of the Blackboard platform; this finding agrees with Venkatesh et al. (2003) [34]. It is suggested that if facilitating conditions provide people with access to Blackboard and other user-friendly infrastructure, they might be inspired to employ the technology. Furthermore, behavioural intention positively influenced the UB of Blackboard, which agrees with Sultana (2020) [11], Venkatesh et al. (2003) [34], Wang et al. (2009) [66], and Raman and Don (2013) [98], who concluded that there is a positively significant association between behavioural intention and actual use.

6. Conclusions

This article set out to identify and test the factors that impact accounting students’ and faculty members’ use of the online learning platform, Blackboard, during the recent COVID-19 pandemic. This involved modifying the current UTAUT model by adding four extra variables: perceived risk, mobility, self-efficacy and self-managed learning. A structural research model was devised and validated using a survey questionnaire given to respondents. Empirical findings indicated that self-efficacy, performance expectancy, effort expectancy, social influence, mobility and self-managed learning are significant factors behind the behavioural intention to utilise Blackboard, while perceived risk and facilitating conditions are not. Furthermore, facilitating conditions represent a significant factor encouraging the actual use of the platform. The findings reported here contribute to our knowledge of the strength of the pandemic’s impact, specifically on online learning and education. This article serves as a stepping stone for further research on the topic of e-learning in developing countries with different factors, such as attitudes towards online technologies.

6.1. Contributions and Implications

This study’s findings contribute theoretically, methodologically and practically to e-learning research and to the understanding of practice. Theoretically, the findings add to the literature relating to e-learning and its associated features. Prior studies utilised the basic UTAUT model or another version of it to assess the spread and depth of e-learning. To the best of the researchers’ knowledge, this is the first study to extend and modify the UTAUT model in this way and apply it to the Blackboard platform. Regarding the methodological contribution, this study illustrates that, of the four added constructs, three (self-efficacy, self-managed learning and mobility) confirmed the validity and reliability requirements.
Regarding the practical implications of this study, it is the first to report the factors influencing the adoption of the Blackboard system in Saudi Arabia’s higher education sector. This means decision-makers will have a better comprehension of its functions and the feedback given on aspects of its features. Further, this study’s outcomes will help Blackboard application developers understand what must be changed or refined, and to pay particular attention to socially sound, user-friendly and easy-to-use applications on various devices. Students and their teachers must complete tasks much more quickly these days in the Saudi university sector. Additionally, universities’ managers need to devise and promote awareness campaigns to inform people about e-learning and the possible risks if Blackboard-type platforms are not used properly. Further, those who offer this service need to guarantee the ability of Blackboard to function properly at all times, wherever and whenever it is needed. Moreover, this study’s results will also aid the Blackboard platform provider in terms of improving user experience and reducing the switching rate. The institutions utilising the Blackboard platform can, likewise, enhance user experience with a focus on the key factors identified here.

6.2. Limitations and Future Research

This paper possesses some limitations that need to be addressed in future endeavours. It should be noted that data were gathered from only a small sample of accounting students and faculty members from Jazan University; thus, the generalisability of the findings to elsewhere in Saudi Arabia or a wider geographical area must be treated with caution. Furthermore, this study applied a quantitative research approach, and future studies could combine qualitative and quantitative methods to find further explanations for the relationships between the suggested constructs. Other moderating and mediating variables (age, experience and gender) can be incorporated in future studies.

Author Contributions

Theoretical framework and hypotheses development, A.M., A.A. and T.K.; Research Methods, A.M. and A.A.; Analysis, A.M.; Discussion, A.M., A.A. and T.K.; Conclusion, A.M., A.A. and T.K.; Revisions, T.K.; Introduction, A.M., A.A. and T.K.; Literature Review, A.M. and A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research did not receive external funding.

Institutional Review Board Statement

The research study was approved by Jazan University Research Ethics Committee on 25 October 2021 with the approval number REC-43/03/046.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available in Excel format upon request. Please contact the authors directly.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Adelsberger, H.H.; Collis, B.; Pawlowski, J.M. Handbook on Information Technologies for Education and Training; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
  2. Khan, T.; Siriwardhane, P. Barriers to Career Progression in the Higher Education Sector: Perceptions of Australian Academics. Sustainability 2021, 13, 6255. [Google Scholar] [CrossRef]
  3. Karalis, T. Planning and evaluation during educational disruption: Lessons learned from COVID-19 pandemic for treatment of emergencies in education. Eur. J. Educ. Stud. 2020, 7, 125–142. [Google Scholar]
  4. Mohapatra, A.K. Impact of COVID-19 on higher education. J. Manag. Public Policy 2020, 11, 4–6. [Google Scholar]
  5. Guri-Rosenblit, S. ‘Distance education’and ‘e-learning’: Not the same thing. High. Educ. 2005, 49, 467–493. [Google Scholar] [CrossRef]
  6. Findik-Coşkunçay, D.; Alkiş, N.; Özkan-Yildirim, S. A structural model for students’ adoption of learning management systems: An empirical investigation in the higher education context. J. Educ. Technol. Soc. 2018, 21, 13–27. [Google Scholar]
  7. Moawad, R.A. Online learning during the COVID-19 pandemic and academic stress in university students. Rev. Românească Educ. Multidimens. 2020, 12 (Suppl. S2), 100–107. [Google Scholar] [CrossRef]
  8. Rovai, A.P.; Downey, J.R. Why some distance education programs fail while others succeed in a global environment. Internet High. Educ. 2010, 13, 141–147. [Google Scholar] [CrossRef]
  9. Aljaber, A. E-learning policy in Saudi Arabia: Challenges and successes. Res. Comp. Int. Educ. 2018, 13, 176–194. [Google Scholar] [CrossRef] [Green Version]
  10. Sawaftah, W.A.; Aljeraiwi, A.A. The Quality of Blended Learning Based on the Use of Blackboard in Teaching Physics at King Saud University: Students’ Perceptions. J. Educ. Psychol. Sci. 2018, 19, 616–646. [Google Scholar]
  11. Sultana, J. Determining the factors that affect the uses of Mobile Cloud Learning (MCL) platform Blackboard—A modification of the UTAUT model. Educ. Inf. Technol. 2020, 25, 223–238. [Google Scholar] [CrossRef]
  12. Alturki, U.T.; Aldraiweesh, A. Evaluating the usability and accessibility of LMS ‘Blackboard’ at King Saud University. Contemp. Issues Educ. Res. 2016, 9, 33–44. [Google Scholar] [CrossRef]
  13. United Nations. Ensure Inclusive and Equitable Quality Education and Promote Lifelong Learning Opportunities for All; UN: New York, NY, USA, 2022. [Google Scholar]
  14. Wu, J.; Guo, S.; Huang, H.; Liu, W.; Xiang, Y. Information and communications technologies for sustainable development goals: State-of-the-art, needs and perspectives. IEEE Commun. Surv. Tutorials 2018, 20, 2389–2406. [Google Scholar] [CrossRef] [Green Version]
  15. Lorente, L.M.L.; Arrabal, A.A.; Pulido-Montes, C. The right to education and ICT during COVID-19: An international perspective. Sustainability 2020, 12, 9091. [Google Scholar] [CrossRef]
  16. Alruwaie, M.; El-Haddadeh, R.; Weerakkody, V. Citizens’ continuous use of eGovernment services: The role of self-efficacy, outcome expectations and satisfaction. Gov. Inf. Q. 2020, 37, 101485. [Google Scholar] [CrossRef]
  17. Al-araibi, A.A.M.; Naz’ri bin Mahrin, M.; Yusoff, R.C.M. Technological aspect factors of E-learning readiness in higher education institutions: Delphi technique. Educ. Inf. Technol. 2019, 24, 567–590. [Google Scholar] [CrossRef]
  18. Almaiah, M.A.; Al-Khasawneh, A.; Althunibat, A. Exploring the critical challenges and factors influencing the E-learning system usage during COVID-19 pandemic. Educ. Inf. Technol. 2020, 25, 5261–5280. [Google Scholar] [CrossRef] [PubMed]
  19. Ferdousi, B.J. A Study of Factors that Affect Instructors’ Intention to Use E-Learning Systems in Two-Year Colleges. Ph.D. Dissertation, Nova Southeastern University, Fort Lauderdale, FL, USA, 2009. [Google Scholar]
  20. Alzahrani, L.; Seth, K.P. Factors influencing students’ satisfaction with continuous use of learning management systems during the COVID-19 pandemic: An empirical study. Educ. Inf. Technol. 2021, 26, 6787–6805. [Google Scholar] [CrossRef] [PubMed]
  21. Al Gamdi, M.A.; Samarji, A. Perceived barriers towards e-Learning by faculty members at a recently established university in Saudi Arabia. Int. J. Inf. Educ. Technol. 2016, 6, 23–28. [Google Scholar] [CrossRef] [Green Version]
  22. Naveed, Q.N.; Qureshi, M.R.N.; Alsayed, A.O.; Muhammad, A.; Sanober, S.; Shah, A. Prioritizing barriers of E-Learning for effective teaching-learning using fuzzy analytic hierarchy process (FAHP). In Proceedings of the 2017 4th IEEE International Conference on Engineering Technologies and Applied Sciences (ICETAS), Salmabad, Bahrain, 29 November–1 December 2017; pp. 1–8. [Google Scholar]
  23. Al-Khasawneh, A.M.; Obeidallah, R. E-learning in the Hashemite University: Success factors for implementation in Jordan. In Advanced Online Education and Training Technologies; IGI Global: Hershey, PA, USA, 2019; pp. 135–145. [Google Scholar]
  24. Almaiah, M.A.; Al Mulhem, A. Analysis of the essential factors affecting of intention to use of mobile learning applications: A comparison between universities adopters and non-adopters. Educ. Inf. Technol. 2019, 24, 1433–1468. [Google Scholar] [CrossRef]
  25. Shin, D.-H. The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telemat. Inform. 2017, 34, 1826–1836. [Google Scholar] [CrossRef]
  26. Ajzen, I.; Fishbein, M. A Bayesian analysis of attribution processes. Psychol. Bull. 1975, 82, 261–277. [Google Scholar] [CrossRef]
  27. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef] [Green Version]
  28. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. Extrinsic and intrinsic motivation to use computers in the workplace 1. J. Appl. Soc. Psychol. 1992, 22, 1111–1132. [Google Scholar] [CrossRef]
  29. Moore, G.C.; Benbasat, I. Development of an instrument to measure the perceptions of adopting an information technology innovation. Inf. Syst. Res. 1991, 2, 192–222. [Google Scholar] [CrossRef] [Green Version]
  30. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  31. Taylor, S.; Todd, P. Decomposition and crossover effects in the theory of planned behavior: A study of consumer adoption intentions. Int. J. Res. Mark. 1995, 12, 137–155. [Google Scholar] [CrossRef]
  32. Compeau, D.R.; Higgins, C.A. Computer self-efficacy: Development of a measure and initial test. MIS Q. 1995, 19, 189–211. [Google Scholar] [CrossRef] [Green Version]
  33. Venkatesh, V.; Davis, F.D. A theoretical extension of the technology acceptance model: Four longitudinal field studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef] [Green Version]
  34. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  35. Venkatesh, V.; Bala, H. Technology acceptance model 3 and a research agenda on interventions. Decis. Sci. 2008, 39, 273–315. [Google Scholar] [CrossRef] [Green Version]
  36. Oye, N.D.; Iahad, A.N.; Rahim, N.A.; Zairah, N. A comparative study of acceptance and use of ICT among university academic staff of ADSU and LASU: Nigeria. Int. J. Sci. Technol. 2012, 1, 40–52. [Google Scholar]
  37. Roca, J.C.; Gagné, M. Understanding e-learning continuance intention in the workplace: A self-determination theory perspective. Comput. Hum. Behav. 2008, 24, 1585–1604. [Google Scholar] [CrossRef]
  38. Cody-Allen, E.; Kishore, R. An extension of the UTAUT model with e-quality, trust, and satisfaction constructs. In Proceedings of the 2006 ACM SIGMIS CPR Conference on Computer Personnel Research: Forty Four Years of Computer Personnel Research: Achievements, Challenges & the Future, Claremont, CA, USA, 13–15 April 2006; pp. 82–89. [Google Scholar]
  39. Alrawashdeh, T.A.; Muhairat, M.I.; Alqatawnah, S.M. Factors affecting acceptance of web-based training system: Using extended UTAUT and structural equation modeling. arXiv 2012, arXiv:1205.1904. [Google Scholar] [CrossRef]
  40. Alwahaishi, S.; Snásel, V. Consumers’ acceptance and use of information and communications technology: A UTAUT and flow based theoretical model. J. Technol. Manag. Innov. 2013, 8, 61–73. [Google Scholar] [CrossRef] [Green Version]
  41. Godin, J.; Goette, T. A pilot study of virtual teamwork training. Commun. IIMA 2013, 13, 3. [Google Scholar]
  42. Alzahrani, M.E.; Goodwin, R.D. Towards a UTAUT-based model for the study of E-Government citizen acceptance in Saudi Arabia. World Acad. Sci. Eng. Technol. 2012, 6, 376–382. [Google Scholar]
  43. Pahnila, S.; Siponen, M.; Zheng, X. Integrating habit into UTAUT: The Chinese eBay case. Pac. Asia J. Assoc. Inf. Syst. 2011, 3, 2. [Google Scholar] [CrossRef]
  44. Badwelan, A.; Drew, S.; Bahaddad, A.A. Towards acceptance m-learning approach in higher education in Saudi Arabia. Int. J. Bus. Manag. 2016, 11, 12. [Google Scholar] [CrossRef]
  45. Mtebe, J.S.; Raisamo, R. Investigating perceived barriers to the use of open educational resources in higher education in Tanzania. Int. Rev. Res. Open Distrib. Learn. 2014, 15, 43–66. [Google Scholar] [CrossRef] [Green Version]
  46. Shorfuzzaman, M.; Alhussein, M. Modeling learners’ readiness to adopt mobile learning: A perspective from a GCC higher education institution. Mob. Inf. Syst. 2016, 2016, 6982824. [Google Scholar] [CrossRef] [Green Version]
  47. Hsiao, C.-H.; Tang, K.-Y. Explaining undergraduates’ behavior intention of e-textbook adoption: Empirical assessment of five theoretical models. Libr. Hi Tech 2014, 32, 139–163. [Google Scholar] [CrossRef]
  48. Pardamean, B.; Susanto, M. Assessing user acceptance toward blog technology using the UTAUT model. Int. J. Math. Comput. Simul. 2012, 6, 203–212. [Google Scholar]
  49. Alfarani, L.A. Influences on the adoption of mobile learning in Saudi women teachers in higher education. In Proceedings of the 2014 International Conference on Interactive Mobile Communication Technologies and Learning (IMCL2014), Thessaloniki, Greece, 13–14 November 2014; pp. 30–34. [Google Scholar]
  50. Iqbal, S.; Qureshi, I.A. M-learning adoption: A perspective from a developing country. Int. Rev. Res. Open Distrib. Learn. 2012, 13, 147–164. [Google Scholar] [CrossRef] [Green Version]
  51. Shin, D.-H.; Biocca, F.; Choo, H. Exploring the user experience of three-dimensional virtual learning environments. Behav. Inf. Technol. 2013, 32, 203–214. [Google Scholar] [CrossRef]
  52. Uğur, N.G.; Turan, A.H. E-learning adoption of academicians: A proposal for an extended model. Behav. Inf. Technol. 2018, 37, 393–405. [Google Scholar] [CrossRef]
  53. Rogers, E.M. Diffusion of Innovations; Simon & Shuster. Inc.: New York, NY, USA, 2003. [Google Scholar]
  54. Chen, M.; Wang, X.; Wang, J.; Zuo, C.; Tian, J.; Cui, Y. Factors Affecting College Students’ Continuous Intention to Use Online Course Platform. SN Comput. Sci. 2021, 2, 114. [Google Scholar] [CrossRef] [PubMed]
  55. McLean, L.D. Organizational culture’s influence on creativity and innovation: A review of the literature and implications for human resource development. Adv. Dev. Hum. Resour. 2005, 7, 226–246.
  56. Schaper, L.K.; Pervan, G.P. ICT and OTs: A model of information and communication technology acceptance and utilisation by occupational therapists. Int. J. Med. Inform. 2007, 76, S212–S221.
  57. Marchewka, J.T.; Kostiwa, K. An application of the UTAUT model for understanding student perceptions using course management software. Commun. IIMA 2007, 7, 10.
  58. Bradley, J. The technology acceptance model and other user acceptance theories. In Handbook of Research on Contemporary Theoretical Models in Information Systems; IGI Global: Hershey, PA, USA, 2009; pp. 277–294.
  59. Pavlou, P.A. Consumer acceptance of electronic commerce: Integrating trust and risk with the technology acceptance model. Int. J. Electron. Commer. 2003, 7, 101–134.
  60. Mallat, N. Exploring consumer adoption of mobile payments—A qualitative study. J. Strateg. Inf. Syst. 2007, 16, 413–432.
  61. Al-Saedi, K.; Al-Emran, M.; Ramayah, T.; Abusham, E. Developing a general extended UTAUT model for M-payment adoption. Technol. Soc. 2020, 62, 101293.
  62. Slade, E.; Williams, M.; Dwivedi, Y.; Piercy, N. Exploring consumer adoption of proximity mobile payments. J. Strateg. Mark. 2015, 23, 209–223.
  63. Wu, J.-H.; Wang, S.-C. What drives mobile commerce?: An empirical evaluation of the revised technology acceptance model. Inf. Manag. 2005, 42, 719–729.
  64. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Q. 2012, 36, 157–178.
  65. Terzis, V.; Economides, A.A. The acceptance and use of computer based assessment. Comput. Educ. 2011, 56, 1032–1044.
  66. Wang, Y.; Wu, M.; Wang, H. Investigating the determinants and age and gender differences in the acceptance of mobile learning. Br. J. Educ. Technol. 2009, 40, 92–118.
  67. Chiu, C.-M.; Wang, E.T.G. Understanding Web-based learning continuance intention: The role of subjective task value. Inf. Manag. 2008, 45, 194–201.
  68. Abu-Al-Aish, A.; Love, S. Factors influencing students’ acceptance of m-learning: An investigation in higher education. Int. Rev. Res. Open Distrib. Learn. 2013, 14, 82–107.
  69. Peters, K. m-Learning: Positioning educators for a mobile, connected future. Int. Rev. Res. Open Distrib. Learn. 2007, 8.
  70. Mallat, N.; Rossi, M.; Tuunainen, V.K.; Öörni, A. An empirical investigation of mobile ticketing service adoption in public transportation. Pers. Ubiquitous Comput. 2008, 12, 57–65.
  71. Conner, M.; Smith, N.; McMillan, B. Examining normative pressure in the theory of planned behaviour: Impact of gender and passengers on intentions to break the speed limit. Curr. Psychol. 2003, 22, 252–263.
  72. Al-Adwan, A.S.; Al-Madadha, A.; Zvirzdinaite, Z. Modeling students’ readiness to adopt mobile learning in higher education: An empirical study. Int. Rev. Res. Open Distrib. Learn. 2018, 19.
  73. Yi, M.Y.; Jackson, J.D.; Park, J.S.; Probst, J.C. Understanding information technology acceptance by individual professionals: Toward an integrative view. Inf. Manag. 2006, 43, 350–363.
  74. Madden, T.J.; Ellen, P.S.; Ajzen, I. A comparison of the theory of planned behavior and the theory of reasoned action. Personal. Soc. Psychol. Bull. 1992, 18, 3–9.
  75. Ekblom, Ö.; Ekblom-Bak, E.; Bolam, K.A.; Ekblom, B.; Schmidt, C.; Söderberg, S.; Bergström, G.; Börjesson, M. Concurrent and predictive validity of physical activity measurement items commonly used in clinical settings-data from SCAPIS pilot study. BMC Public Health 2015, 15, 978.
  76. Revilla, M.A.; Saris, W.E.; Krosnick, J.A. Choosing the number of categories in agree-disagree scales. Sociol. Methods Res. 2014, 43, 73–97.
  77. Featherman, M.S.; Pavlou, P.A. Predicting e-services adoption: A perceived risk facets perspective. Int. J. Hum. Comput. Stud. 2003, 59, 451–474.
  78. Zhang, L.; Zhu, J.; Liu, Q. A meta-analysis of mobile commerce adoption and the moderating effect of culture. Comput. Hum. Behav. 2012, 28, 1902–1911.
  79. Lai, C.; Wang, Q.; Lei, J. What factors predict undergraduate students’ use of technology for learning? A case from Hong Kong. Comput. Educ. 2012, 59, 569–579.
  80. Austin, H.W. Sample size: How much is enough? Qual. Quant. 1983, 17, 239–245.
  81. Roscoe, J.T. Fundamental Research Statistics for the Behavioral Sciences; Holt, Rinehart and Winston: New York, NY, USA, 1975.
  82. Muthén, L.K.; Muthén, B.O. How to use a Monte Carlo study to decide on sample size and determine power. Struct. Equ. Model. 2002, 9, 599–620.
  83. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis. Always Learning; Pearson Education Limited: London, UK, 2013.
  84. Kline, R.B. Principles and Practice of Structural Equation Modeling, 3rd ed.; Guilford: New York, NY, USA, 2011.
  85. Mujalli, A.; Almgrashi, A. A Conceptual Framework for Generalised Audit Software Adoption in Saudi Arabia by Government Internal Auditing Departments using an Integrated Institutional Theory-TOE Model. In Proceedings of the 2020 IEEE Asia-Pacific Conference on Computer Science and Data Engineering (CSDE), Gold Coast, Australia, 16–18 December 2020; pp. 1–8.
  86. Byrne, B.M. Structural Equation Modeling with LISREL, PRELIS, and SIMPLIS: Basic Concepts, Applications, and Programming; Psychology Press: Abingdon, UK, 2013.
  87. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
  88. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135.
  89. Chin, W.W. The partial least squares approach to structural equation modeling. In Modern Methods for Business Research; Psychology Press: Abingdon, UK, 1998; pp. 295–336.
  90. Šumak, B.; Polancic, G.; Hericko, M. An empirical study of virtual learning environment adoption using UTAUT. In Proceedings of the 2010 Second International Conference on Mobile, Hybrid, and On-Line Learning, Saint Maarten, The Netherlands, 10–16 February 2010; pp. 17–22.
  91. Taiwo, A.A.; Downe, A.G. The theory of user acceptance and use of technology (UTAUT): A meta-analytic review of empirical findings. J. Theor. Appl. Inf. Technol. 2013, 49, 48–58.
  92. Alshammari, S.H. Determining the Factors That Affect the Use of Virtual Classrooms: A Modification of UTAUT Model. J. Inf. Technol. Educ. Res. 2021, 20, 117–135.
  93. El-Masri, M.; Tarhini, A. Factors affecting the adoption of e-learning systems in Qatar and USA: Extending the Unified Theory of Acceptance and Use of Technology 2 (UTAUT2). Educ. Technol. Res. Dev. 2017, 65, 743–763.
  94. Buabeng-Andoh, C.; Baah, C. Pre-service teachers’ intention to use learning management system: An integration of UTAUT and TAM. Interact. Technol. Smart Educ. 2020, 17, 455–474.
  95. Raman, V.; Horner, H.T.; Khan, I.A. New and unusual forms of calcium oxalate raphide crystals in the plant kingdom. J. Plant Res. 2014, 127, 721–730.
  96. Kabra, G.; Ramesh, A.; Akhtar, P.; Dash, M.K. Understanding behavioural intention to use information technology: Insights from humanitarian practitioners. Telemat. Inform. 2017, 34, 1250–1261.
  97. Siswanto, T.; Shofiati, R.; Hartini, H. Acceptance and utilization of technology (UTAUT) as a method of technology acceptance model of mitigation disaster website. IOP Conf. Ser. Earth Environ. Sci. 2018, 106, 012011.
  98. Raman, A.; Don, Y. Preservice teachers’ acceptance of learning management software: An application of the UTAUT2 model. Int. Educ. Stud. 2013, 6, 157–164.
Figure 1. The unified theory of acceptance and use of technology (UTAUT) model.
Figure 2. Conceptualized extended UTAUT model.
Table 1. Constructs, measurements and their sources.
Construct | Code | Indicator | Source
Perceived Risk | PR1 | “I wouldn’t feel protected when providing personal information through the Blackboard platform”. | [80,81]
Perceived Risk | PR2 | “I wouldn’t feel comfortable about the use of the Blackboard platform because other people might be able to access my data”. |
Perceived Risk | PR3 | “There is a high chance that something wrong would occur when using the Blackboard system”. |
Self-Efficacy | SE1 | “I would use the Blackboard platform if I had a built-in guide for assistance”. | [34]
Self-Efficacy | SE2 | “I would use the Blackboard platform if someone showed me how to use it”. |
Self-Efficacy | SE3 | “I would use the Blackboard platform if it would be used by others”. |
Performance Expectancy | PE1 | “I found Blackboard is useful for learning or teaching”. | [34]
Performance Expectancy | PE2 | “I think through Blackboard I can do my work more quickly”. |
Performance Expectancy | PE3 | “I think Blackboard makes learning and obtaining information more effective”. |
Effort Expectancy | EE1 | “Learning how to use Blackboard is easy”. | [34]
Effort Expectancy | EE2 | “My interaction and navigation with Blackboard is clear and understandable”. |
Effort Expectancy | EE3 | “Overall I found that Blackboard is easy to use”. |
Social Influence | SI1 | “I use Blackboard because my university has introduced it”. | [34,82]
Social Influence | SI2 | “I use Blackboard because all teachers and students use it”. |
Facilitating Condition | FC1 | “IT dept. provides support and assistance for using Blackboard”. | [34]
Facilitating Condition | FC2 | “I have necessary resources and knowledge to use Blackboard”. |
Facilitating Condition | FC3 | “Use of Blackboard is suitable for my work”. |
Mobility | Mob1 | “I can access Blackboard from anywhere”. | [11,46]
Mobility | Mob2 | “I can access Blackboard through mobile devices”. |
Self-Managed Learning | SML1 | “Blackboard increases learner autonomy”. | [11]
Self-Managed Learning | SML2 | “It is possible to do self-directed learning through Blackboard”. |
Table 2. Demographic characteristics of respondents.
Demographic Item | Frequency | Percentage
Gender
Male | 129 | 58.10%
Female | 93 | 41.90%
Age
18–25 | 198 | 89.1%
26–35 | 16 | 7.2%
36–45 | 6 | 2.7%
46+ | 2 | 0.9%
Occupation
Student | 198 | 89.18%
Faculty members | 24 | 10.8%
Level of education
Bachelor | 198 | 89.18%
Masters | 21 | 9.45%
PhD | 3 | 1.35%
Ethnicity
Saudi | 205 | 92.3%
International | 17 | 7.6%
Table 3. A Comparison of Goodness-of-Fit Statistics of Full Measurement Models.
Fit Index | χ2/d.f. | CFI | AGFI | TLI | GFI | RMSEA
Threshold Value | <2 | >0.9 | >0.8 | >0.9 | >0.9 | <0.08
Full Measurement Structural Model Fit Indices | 1.61 | 0.930 | 0.841 | 0.911 | 0.90 | 0.053
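As an illustration of how the reported indices compare with the cut-offs listed in the table header, the sketch below (Python, with values re-typed from Table 3; not part of the original analysis) flags whether each index satisfies its guideline. Cut-offs are treated as inclusive at the reported precision, so the GFI of 0.90 is counted as meeting the “>0.9” guideline, as the text interprets it.

```python
# Illustrative only: compare the fit indices reported in Table 3 with the cut-offs
# given in the table header. Cut-offs are treated as inclusive at the reported
# precision, so GFI = 0.90 counts as meeting the ">0.9" guideline.
reported = {"chi2/df": 1.61, "CFI": 0.930, "AGFI": 0.841, "TLI": 0.911, "GFI": 0.90, "RMSEA": 0.053}

# (cut-off value, True if the index should be at most the cut-off, False if at least)
cutoffs = {
    "chi2/df": (2.00, True),
    "CFI":     (0.90, False),
    "AGFI":    (0.80, False),
    "TLI":     (0.90, False),
    "GFI":     (0.90, False),
    "RMSEA":   (0.08, True),
}

for name, value in reported.items():
    cutoff, smaller_is_better = cutoffs[name]
    ok = value <= cutoff if smaller_is_better else value >= cutoff
    print(f"{name}: {value} ({'meets' if ok else 'misses'} the {cutoff} guideline)")
```

Running this check confirms that all six reported indices fall on the acceptable side of their guidelines, which is the basis for treating the measurement model fit as adequate.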
Table 4. Outcomes of the Measurement Model’s Convergent Validity and Reliability.
Constructs & Items | Factor Loading (>0.7) | SMC | CR | Cronbach’s α | AVE
(PR) |  |  | 0.765 | 0.760 | 0.52
PR1 | 0.64 | 0.41 |  |  |
PR2 | 0.80 | 0.64 |  |  |
PR3 | 0.72 | 0.51 |  |  |
(SE) |  |  | 0.87 | 0.860 | 0.77
SE1 | 0.93 | 0.86 |  |  |
SE2 | 0.83 | 0.67 |  |  |
(PE) |  |  | 0.848 | 0.844 | 0.65
PE1 | 0.90 | 0.80 |  |  |
PE2 | 0.77 | 0.59 |  |  |
PE3 | 0.75 | 0.56 |  |  |
(EE) |  |  | 0.82 | 0.819 | 0.61
EE1 | 0.85 | 0.72 |  |  |
EE2 | 0.71 | 0.50 |  |  |
EE3 | 0.77 | 0.60 |  |  |
(SI) |  |  | 0.720 | 0.716 | 0.57
SI1 | 0.62 | 0.40 |  |  |
SI2 | 0.87 | 0.76 |  |  |
(SML) |  |  | 0.73 | 0.724 | 0.59
SML1 | 0.85 | 0.73 |  |  |
SML2 | 0.67 | 0.45 |  |  |
(M) |  |  | 0.76 | 0.755 | 0.62
M1 | 0.70 | 0.50 |  |  |
M2 | 0.88 | 0.77 |  |  |
(FC) |  |  | 0.83 | 0.822 | 0.63
FC1 | 0.61 | 0.40 |  |  |
FC2 | 0.93 | 0.86 |  |  |
FC3 | 0.81 | 0.65 |  |  |
(BI) |  |  | 0.87 | 0.874 | 0.78
BI1 | 0.84 | 0.70 |  |  |
BI2 | 0.93 | 0.86 |  |  |
(UB) |  |  | 0.80 | 0.800 | 0.67
UB1 | 0.80 | 0.64 |  |  |
UB2 | 0.83 | 0.69 |  |  |
Notes: The values of convergent and discriminant validity are excellent. Also, internal consistency values are excellent (Cronbach’s alpha test of reliability); PR, Perceived Risk; SE, Self-Efficacy; PE, Performance Expectancy; EE, Effort Expectancy; SI, Social Influence; SML, Self-Managed Learning; M, Mobility; FC, Facilitating Condition; BI, Behavioural Intention; UB, Use Behaviour.
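For readers unfamiliar with how composite reliability (CR) and average variance extracted (AVE) relate to the standardised factor loadings above, the short sketch below (illustrative Python, not the authors’ analysis code) applies the standard formulas to the Perceived Risk loadings from Table 4 and reproduces the tabled values of approximately 0.765 and 0.52.

```python
# Illustrative only: composite reliability (CR) and average variance extracted (AVE)
# computed from standardised factor loadings, using the usual formulas:
#   CR  = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
#   AVE = mean of squared loadings
# The loadings below are the Perceived Risk (PR) items reported in Table 4.

def composite_reliability(loadings):
    total = sum(loadings)
    error_variance = sum(1 - l**2 for l in loadings)
    return total**2 / (total**2 + error_variance)

def average_variance_extracted(loadings):
    return sum(l**2 for l in loadings) / len(loadings)

pr_loadings = [0.64, 0.80, 0.72]
print(round(composite_reliability(pr_loadings), 3))       # ~0.765, matching Table 4
print(round(average_variance_extracted(pr_loadings), 2))  # ~0.52, matching Table 4
```

The same calculation applied to the other constructs yields the remaining CR and AVE columns; the SMC column is simply the squared standardised loading of each item.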
Table 5. Discriminant Validity Assessment: An Overview.
Construct | AVE | PR | SE | PE | EE | SI | SML | M | FC | BI | USE
PR | 0.52 | 0.723
SE | 0.77 | 0.188 † | 0.879
PE | 0.65 | 0.350 ** | 0.002 | 0.807
EE | 0.61 | 0.344 *** | 0.103 | 0.181 * | 0.778
SI | 0.57 | 0.222 * | 0.211 * | 0.192 * | −0.034 | 0.883
SML | 0.59 | 0.079 | 0.084 | 0.044 | 0.190 * | 0.264 ** | 0.767
M | 0.62 | 0.014 | −0.058 | −0.065 | −0.113 | 0.261 ** | 0.268 ** | 0.791
FC | 0.63 | −0.353 ** | 0.290 ** | 0.089 | 0.047 | 0.098 | 0.067 | 0.112 | 0.794
BI | 0.78 | 0.380 *** | 0.535 *** | 0.174 † | 0.173 * | 0.004 | 0.125 | 0.144 † | 0.323 *** | 0.883
USE | 0.67 | 0.050 | 0.080 | 0.111 | 0.060 | 0.190 * | 0.159 † | 0.111 | 0.181 * | 0.108 | 0.817
Note: Significance thresholds: † p < 0.100, * p < 0.050, ** p < 0.010, *** p < 0.001. Diagonal values are the square roots of the average variance extracted [86]; off-diagonal entries are inter-construct correlations; PR, Perceived Risk; SE, Self-Efficacy; PE, Performance Expectancy; EE, Effort Expectancy; SI, Social Influence; SML, Self-Managed Learning; M, Mobility; FC, Facilitating Condition; BI, Behavioural Intention; USE, Use Behaviour.
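To make the Fornell–Larcker criterion behind Table 5 concrete, the sketch below (illustrative Python, with the AVEs and correlations re-typed from the table for two constructs) checks the condition for discriminant validity: the square root of a construct’s AVE should exceed its correlations with every other construct.

```python
import math

# Illustrative Fornell-Larcker check using two constructs from Table 5.
# Discriminant validity holds when sqrt(AVE) of a construct exceeds the absolute
# value of its correlations with all other constructs.
ave = {"PR": 0.52, "SE": 0.77}
correlations = {
    "PR": [0.188, 0.350, 0.344, 0.222, 0.079, 0.014, -0.353, 0.380, 0.050],
    "SE": [0.188, 0.002, 0.103, 0.211, 0.084, -0.058, 0.290, 0.535, 0.080],
}

for construct, ave_value in ave.items():
    sqrt_ave = math.sqrt(ave_value)
    ok = all(sqrt_ave > abs(r) for r in correlations[construct])
    print(f"{construct}: sqrt(AVE) = {sqrt_ave:.3f}, discriminant validity {'met' if ok else 'not met'}")
```

For PR, the square root of the AVE (about 0.72) exceeds its largest correlation (0.380 with BI), and for SE (about 0.88) it exceeds 0.535; the remaining constructs in the table can be checked in the same way.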
Table 6. Measurement Model fit indices.
Fit Index | χ2/d.f. | RMSEA | CFI | AGFI | GFI | TLI
Threshold Value | <2 | <0.08 | >0.9 | >0.8 | >0.9 | >0.9
Structural Model Fit Indices | 1.61 | 0.053 | 0.930 | 0.841 | 0.90 | 0.911
Table 7. An Overview of the Structural Model Analysis Results.
Hypothesis | Relationship | C.R. (t-Value) | p | Standardised Structural Coefficient | Result
H1 | PR→BI | 0.743 | 0.457 | 0.07 | Unsupported
H2 | SE→BI | 2.067 | 0.039 | 0.24 * | Supported
H3 | PE→BI | 2.047 | 0.041 | 0.16 * | Supported
H4 | EE→BI | 2.931 | 0.003 | 0.23 ** | Supported
H5 | SI→BI | 2.292 | 0.022 | 0.25 * | Supported
H6 | SML→BI | 2.903 | 0.004 | 0.32 ** | Supported
H7 | M→BI | 1.875 | 0.048 | 0.17 * | Supported
H8 | FC→BI | −1.441 | 0.150 | −0.13 | Unsupported
H9 | FC→USE | 2.420 | 0.016 | 0.19 * | Supported
H10 | BI→USE | 3.366 | 0.0001 | 0.26 *** | Supported
Notes: Dependent variables: BI (H1–H8) and USE (H9 and H10); *** p < 0.001, ** p < 0.01, * p < 0.05; p-values are two-tailed.
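The p-values in Table 7 are consistent with a two-tailed test on the critical ratio (C.R.). Assuming a standard normal reference distribution (the usual large-sample case; the authors’ software output is not reproduced here), the illustrative sketch below recovers, for example, the reported p of about 0.039 for H2 and 0.003 for H4.

```python
from math import erfc, sqrt

# Illustrative only: convert a critical ratio (C.R., a t/z value) into a two-tailed
# p-value under a standard normal reference distribution, P(|Z| > cr) = erfc(|cr| / sqrt(2)).
def two_tailed_p(critical_ratio: float) -> float:
    return erfc(abs(critical_ratio) / sqrt(2))

print(round(two_tailed_p(2.067), 3))  # ~0.039, as reported for H2 (SE -> BI)
print(round(two_tailed_p(2.931), 3))  # ~0.003, as reported for H4 (EE -> BI)
print(round(two_tailed_p(0.743), 3))  # ~0.457, as reported for H1 (PR -> BI), unsupported
```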
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
