Article

Blended Learning Delivery Methods for a Sustainable Learning Environment: A Delphi Study

by
Ali Saleh Alammary
College of Computing and Informatics, Saudi Electronic University, Jeddah 393453, Saudi Arabia
Sustainability 2024, 16(8), 3269; https://doi.org/10.3390/su16083269
Submission received: 10 March 2024 / Revised: 28 March 2024 / Accepted: 12 April 2024 / Published: 14 April 2024
(This article belongs to the Special Issue Intelligent Systems and Technology for Education Sustainability)

Abstract

The outbreak of COVID-19 necessitated social distancing. Universities around the world were left with two options: a total suspension or a major reduction in students' attendance. Because the nature of many courses made them very difficult to teach fully online, blended learning was the most suitable approach for teaching such courses. Academics and educational institutions have realized the significance of blended learning not only as a response to immediate disruptions but also as a pivotal element in fostering a sustainable learning environment. However, designing successful blended learning courses requires making challenging decisions about selecting the most appropriate delivery methods to achieve the learning outcomes. To support the adoption of blended learning in higher education, this study identifies and rates the importance of the delivery methods that need consideration when designing a blended learning course. The aim is to assist academics in prioritizing their delivery options and to provide adaptable and resilient educational models. A two-round Delphi study was conducted to identify and rate the delivery methods. An expert panel of 19 academics with extensive experience in course design and online delivery was recruited. The findings indicate that online collaborative work and face-to-face collaborative work should be academics' first delivery choices. Interestingly, face-to-face instructor-led learning was at the bottom of the list and was rated well below all the other delivery methods.

1. Introduction

Blending online learning with traditional face-to-face instruction has proven to be an effective approach for enhancing student outcomes and providing a sustainable learning environment. The importance of blended learning has increased following the COVID-19 outbreak. The flexibility inherent in blended learning mitigated the impact of the suspension of in-class activities, ensuring continued learning experiences for students while fostering resilience in the face of unforeseen challenges [1,2]. Therefore, various scholars advocate for the adoption of blended learning [3]. Research indicates that in the post-pandemic era, blended learning is likely to be the new traditional approach for teaching in higher education [4,5,6,7].
A large number of blended learning delivery methods (e.g., face-to-face lectures, virtual sessions, self-paced activities, or collaborative activities) needs to be considered to construct a successful blended learning experience [8,9]. Oliver and Stallings [10] explained that in order for blended learning to be effective, a combination of different online and traditional delivery methods must be integrated. Alammary [11] pointed out that academics, when designing their blended learning courses, should try to use each delivery method for what it does best. Li et al. [12] stated that a critical design decision that academics face when designing their blended learning courses is deciding on the most appropriate delivery method for delivering particular content. Versteijlen and Wals [1] argued that selecting the right blended learning components can create a sustainable learning environment by promoting inclusivity, reducing environmental impact, and fostering student-centric education.
To assist academics in higher education to better select delivery methods for their blended courses, and to help educational institutions accelerate the adoption of blended learning and provide a sustainable learning environment for their students, the current study aims to identify blended learning delivery methods, rate the importance of each, and understand what could motivate the selection of one method over another. Without a list of possible blended learning delivery methods, academics would simply rely on their own limited experience to design their blended courses [13].
The remainder of this paper is organized as follows: Section 2 introduces the background of the research. Section 3 describes the Delphi method adopted to rate and validate these delivery methods. Section 4 presents the results obtained. In Section 5, the results are discussed, while Section 6 concludes this paper and proposes future work.

2. Research Background

While interest in blended learning is on the rise, the definition of the concept remains a topic of debate. Picciano [14] emphasizes that blended learning encompasses a diverse range of forms, approaches, and implementations, signifying different aspects to different individuals. Kintu et al. [15] explain that the term “blended learning” may seem conceptually vague, yet this ambiguity reveals its vast potential and flexibility. The lack of a fixed definition allows academics to tailor the concept to fit their specific course or institutional needs [16]. This research broadly defines a blended learning course as one that thoughtfully combines various delivery methods, both face-to-face and online, enabling a spectrum of implementations, from adding simple online elements into a traditional course to designing an entirely new blended course from scratch [11]. Stating that the integration should be thoughtful gives a strong implication that the process of blending the course components necessitates significant planning and deliberate consideration.
The right selection of delivery methods plays a pivotal role in cultivating a sustainable learning environment that fosters students' long-term success. By integrating technology thoughtfully, academics can cater to diverse learning styles, offering students the opportunity to engage with content through multimedia resources, collaborative activities, and self-paced learning. This diversity in delivery methods not only accommodates varying learning preferences but also promotes a more inclusive educational experience [17]. Moreover, the thoughtful use of blended learning components supports resource optimization and reduces environmental impact, contributing to the sustainability of educational practices. In addition to enhancing accessibility and inclusivity, the right selection of blended learning components addresses the evolving needs of students and teachers alike. Through careful consideration of instructional design principles, academics can create seamless transitions between online and face-to-face activities, fostering a cohesive learning experience. Moreover, the integration of real-time feedback mechanisms and personalized learning pathways enables students to take ownership of their learning. This empowerment not only enhances motivation and engagement but also contributes to the development of critical digital literacy skills [18,19]. Ultimately, the proper use of blended learning components aligns with the principles of sustainability by creating an adaptable and resilient educational framework that prepares students for the challenges of the future while minimizing the environmental footprint associated with traditional teaching models [1].
Blended learning provides academics with a diverse range of delivery methods to select from. Elsenheimer [20] suggests a catalogue of around 50 different delivery methods for blended learning courses. Bath and Bourke [21] also list 51 blended learning “possibilities” (p. 4). Westerlaken et al. [22] outline 11 online and face-to-face components. Farooq et al. [23] discuss 10 different fundamental components of blended learning along with their advantages. Studying such a large number of components is tedious, complex, and time-consuming. Therefore, it is important to classify delivery methods into a few categories. Two key criteria for classifying these methods are the type of interaction that each of them supports and the location of this interaction, i.e., online or face-to-face.
Bolliger and Martin [24] highlight the importance of interaction to students' learning. Bonk and Graham [25] refer to interaction as the binding element in blended courses. Wagner [26] identifies interaction as a key quality indicator in learning. Constructivist theory indicates that learners construct knowledge through their experiences and interactions [27].
Moore [28] identifies three different types of interactions in the learning process: interactions between the learner and the instructor, among learners themselves, and between the learner and the content. Learner–content interaction can be described as the intellectual engagement with the material that leads to shifts in the learner’s understanding, viewpoint, or cognitive frameworks. This interaction is central to the educational process [29]. Constructivist theory posits that knowledge is not a standalone entity but is constructed through engagement with content or through social interactions [30]. In such constructivist environments, learners are expected to take charge of their learning, equipped with the necessary resources, tools, and support to guide their educational journey [31]. Blended learning facilitates this interaction by offering online self-directed activities such as interactive videos, podcasts, and digital readings. These resources enable students to learn at their convenience, pace, and location [32].
Learner–instructor interaction seeks to inspire and engage the learner, providing opportunities to clear up any confusion related to the content [33]. In fact, cognitive development originates from the interaction between a student and a more knowledgeable individual [34]. Research by authors such as Iaconelli and Anderman [35] and Liu [36] indicates that both nonverbal and verbal cues from a teacher can enhance students’ emotional and cognitive learning. In the context of blended learning, this interaction between learner and instructor is facilitated through both direct, in-person teaching methods (like lectures and laboratory sessions) and digital methods (such as online classes, webcasts, and scheduled internet-based teaching). These methods enable educators to effectively oversee and customize their students’ learning journey [13].
Learner–learner interaction can be defined as the communication that happens between one learner and another, either individually or in groups. This interaction may occur with or without an instructor being actively present [37]. Echoing this, social constructivism posits that learning emerges from the group interactions of students. Accordingly, social constructivism emphasizes the importance of collaborative learning strategies, including peer collaboration, problem-based learning, and other group-based educational approaches [38]. In the realm of blended learning, such collaboration is facilitated through a mix of in-person and digital formats, encompassing activities like writing groups, peer teaching, online discussions, and virtual learning communities. These diverse collaborative opportunities enable instructors to more effectively engage their students, foster critical thinking skills, and facilitate a more profound comprehension of the subject matter being studied [39].
Learner–interface interaction can be described as the act of utilizing tools to complete a specific task. This type of interaction emphasizes the necessity for learners to employ technologies in order to engage with the material, understand and agree upon its meaning, and confirm this understanding with both the instructor and fellow learners [40]. Therefore, students need to possess essential skills to use learning technologies and should be comfortable with an online learning environment [41]. This also highlights the need for interfaces that provide easy access to rich content resources and enable dialogue [42].
Based on the type and the location of interaction, blended learning components can be classified into five categories:
  • Face-to-face instructor-led: This component involves students attending in-person classes where an instructor delivers the content. It often includes some opportunities for hands-on learning and interaction [43]. This format is commonly used in university education. It offers the advantages of efficiently presenting extensive material to large groups of students and gives instructors greater control over the learning process, allowing them to adjust their teaching methods as needed [44]. However, this component might not be the most effective for long-term material retention post-class, and instructors might face a significant challenge in maintaining student engagement. Furthermore, the traditional classroom setup is limited by geographical and temporal constraints [43].
  • Online instructor-led: In this component, students participate in real-time virtual classes led by an instructor who guides the learning pace and engages in interaction [45]. This component shares many of the same advantages and disadvantages as face-to-face instructor-led, with the notable difference being its freedom from geographical constraints. However, students might face technical issues in this mode of learning [46].
  • Face-to-face collaborative: This component involves students working together physically in the classroom on designated tasks. Activities typical of this method include workshops, peer teaching, group discussions, collaborative writing, and problem-solving [43]. Research by Xu et al. [47] indicates that such collaborative work in a face-to-face setting can enhance student engagement, promote critical thinking, deepen understanding of the subject matter, and motivate students to take control of their learning process. However, successful collaboration does not always occur. Some students might struggle with self-directed learning in these group scenarios. Implementing collaborative work in large class settings is also challenging. Additionally, like face-to-face instructor-led learning, this method is limited by physical location and timing constraints [48].
  • Online collaborative: This component involves students collaborating online to complete specific tasks. Examples include participating in online learning communities, using online whiteboards, and engaging in online discussion groups [49]. This component offers advantages over face-to-face collaboration, such as not being bound by location and time. However, it also presents certain challenges. For instance, some students might struggle with managing their workload and time, as observed by Monteiro and Morrison [50]. Additionally, a lack of direct teacher guidance and interaction can make it difficult for students to coordinate their collaborative efforts effectively [51].
  • Online self-paced learning: This component offers students the opportunity to access learning materials online, allowing them to study at their convenience from any location, at any time, and at their preferred pace. This method of learning is characterized by its flexibility and autonomy, allowing students to tailor their learning experiences to their personal schedules, learning styles, and speed [52]. However, this method also presents certain risks, such as (i) the possibility of students selecting unsuitable environments for learning; (ii) a lack of peer pressure and motivation; (iii) the risk of procrastination; and (iv) the chance that students might opt for ineffective learning processes [53].

3. Methods

In order to address the complex problem of rating and validating the identified delivery methods, a two-round modified Delphi survey was used. The Delphi method is an interactive and systematic technique for collecting responses from subject matter experts through multiple rounds of surveys. The aim of these multiple rounds is to allow experts to reach consensus on an issue under consideration [54].
While a traditional Delphi method has three to four rounds of questionnaires, a modified Delphi could have as few as two rounds. This is because the first round of a modified Delphi is developed based on data derived from the literature. Penciner et al. [55] and Pickard [56] indicated that the number of rounds can be reduced to as few as two if an initial list of preselected items is developed from the literature. In this study, two rounds were found to be sufficient for two reasons. First, the content of the Round 1 questionnaire was developed from the literature: as discussed in the previous section, the list of delivery methods that the participants were requested to rate was derived from the literature. Second, the target population consisted of busy academics with high workloads, and conducting more rounds would have discouraged some of them from participating in this study.
The Delphi method was chosen due to the complexity of the research problem and the need to reach consensus among a diverse group of experts drawn from different academic disciplines and different universities. The Delphi method has proved to be a suitable method to generate consensus or convergence of viewpoints on complex issues, providing valuable insights for decision-making and problem-solving [54]. The method has the ability to build consensus among geographically dispersed experts, provide anonymous responses, and reduce the influence of dominant individuals. Its iterative process allows experts to express, refine, and revise their opinions based on collective feedback. This method also minimizes “noise” or irrelevant group discussion, focusing instead on problem-solving and insight development [57,58]. While other methods, including focus groups and expert panels, could have been considered, employing such approaches would have necessitated direct interactions among panel members in structured meetings, which proved challenging to organize given the geographical dispersion of the experts. Before conducting this study, approval was granted by the Monash University Human Research Ethics Committee (Reference number: CF15/273—2015000132).

3.1. Developing and Piloting the Survey

The Delphi questionnaires were developed using an open-source online web application called LimeSurvey. After developing the Delphi survey, a pilot study was initiated to ensure the validity of the survey instrument, assess the difficulty of its items, and obtain an estimate of the time and effort involved in the Delphi process [59,60]. Validity was assessed in terms of content validity and face validity. Burton and Mazerolle [61] stated that the purpose of content validity is “establishing an instrument’s credibility, accuracy, relevance, and breadth of knowledge regarding the domain”, while face validity is used to assess the survey instruments in terms of “ease of use, clarity, and readability” (p. 29).
To conduct the pilot study, three academics with more than ten years of experience in educational technology were asked to validate the survey instruments. The pilot survey was developed on LimeSurvey and the survey link was sent to the three participants. Next to each survey item, participants were provided with an empty textbox to comment on the relevance, readability, and clarity of that item. Participants were also requested to note whether the item needed to be revised or removed from the survey. If a participant felt a revision was necessary, they were requested to suggest a revision. Participants’ comments were reviewed and many changes to the survey items were applied.

3.2. Expert Panel Recruitment

An important step when conducting a Delphi study is to recruit a group of experts who possess both knowledge of and experience in the area under investigation. They also need to be representative of the experts in their field [62]. Therefore, a group of academics with substantial experience and practical knowledge of both traditional and online instruction methods was asked to take part in this study. To select this group of experts, a purposive approach was utilized. First, 26 academics known to the researcher were selected to form an initial list of potential participants. These academics are members of different academic groups, such as the Monash Education Academy (MEA) and the Australasian Computing Education Conference (ACE) committee. They all have six or more years of experience in online delivery methods and instructional design. Next, the researcher searched the websites of several universities to identify more experts, using the following three inclusion criteria:
  • The academic should have at least three publications in top-tier publication venues in the area of educational technology. Publication record, according to Mirza et al. [63], is an indicator of the professional expertise of academics.
  • The academic should have at least one year of experience with online delivery methods such as online discussion boards, wikis, webcasts, and blogs.
  • The academic should have at least one year of experience in instructional design.
While searching the universities’ websites, the researcher tried to find experts from different disciplines and academic levels. The aim was to assess the impact of discipline on the selection and rating of delivery methods.
By the end of the recruitment process, 48 academics were found eligible to participate in this study. An invitation e-mail was sent to each one of them with information about this study and how it would be conducted. The e-mail also contained consent information. Around 40% of them (19 participants) agreed to participate and completed the first-round questionnaire.
The panel of 19 participants was deemed sufficient for this study. Vogel et al. [64] explained that an expert panel has the largest confidence when it has at least 12 members and that a larger number can cause diminishing returns regarding the validity of the findings. De Villiers et al. [65] pointed out that a larger panel of more than 30 has seldom been found to improve the results, as it can make the Delphi process more challenging to manage and can lead to low response rates. Veugelers et al. [66] discuss the ideal number of panelists for a Delphi study, suggesting that 10 to 18 panelists are recommended for developing productive group dynamics and reaching consensus among experts. Armstrong [67] suggests that Delphi panels in general should comprise between 5 and 20 members. Ziglio [68] argues that high-quality results can be obtained with a panel of 10 to 15 members.
No distinctive characteristics differentiated academics who agreed to participate from those who did not. As can be seen in Table 1, academics who agreed to participate in this study came from a wide range of academic disciplines. The vast majority of them had a minimum of four years’ experience in online delivery, had taught both postgraduate and undergraduate courses, and had a minimum of six years’ experience in instructional design. They all had a minimum of five publications in top-tier publication venues in the area of educational technology. The list of participants included only academics from Australia and New Zealand, and participants from other countries could perhaps have been invited. Despite this, academics who took part in this study provided a wide range of opinions and adequately represented their academic disciplines [54,69].

3.3. Round 1 Delphi Survey

At the beginning of the first round, e-mails containing a link to the survey were sent to the 19 academics who agreed to participate. The survey began with a short section describing the aim of the Delphi study and outlining the delivery methods that they were required to rate. Academics were asked to rate the importance of each delivery method by using an ordinal scale of 1 to 5, where 5 indicates “very important” and 1 indicates “very unimportant”. Academics were also asked to explain the rationale of their ratings and suggest additional delivery methods to be added to the list.
To analyze the results of this round, quantitative and qualitative analysis methods were used. The standard deviation (level of dispersion) and the mean (central tendency) were calculated to present the delivery methods’ ratings. The mean was calculated to indicate the participants’ opinions, while the standard deviation was calculated to represent the spread of these opinions [70]. These are the most widely used statistics in Delphi studies and have been found to be valid methods for analyzing ratings from Delphi studies [71].
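The per-method summary described above can be sketched in a few lines. This is a minimal illustration only: the ratings below are hypothetical, not the study's data, and the study itself does not specify its software.

```python
# Sketch of the Round 1 rating analysis: mean (central tendency) and
# standard deviation (dispersion) per delivery method.
from statistics import mean, stdev

ratings = {  # hypothetical 1-5 importance ratings from a small panel
    "face-to-face collaborative": [4, 5, 3, 4, 4],
    "online collaborative":       [4, 4, 5, 3, 4],
}

def summarize(scores):
    """Return (mean, sample standard deviation) for one method's ratings."""
    return mean(scores), stdev(scores)

for method, scores in ratings.items():
    m, sd = summarize(scores)
    print(f"{method}: mean={m:.2f}, SD={sd:.2f}")
```

Whether the sample or population standard deviation is used is a design choice; the sketch uses the sample form, which is common when a panel is treated as a sample of the expert population.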
The open-ended questions were analyzed using the content analysis technique described by Neuendorf and Kumar [72,73]. All the answers provided by the participants were organized into a single Word document. Each answer was examined to decide whether it was a new delivery method suggested by a participant or a comment related to one of the methods on the list. The anonymized analysis data were shared with two academics with years of experience in qualitative analysis. The aim was to make sure that the analysis was a valid and reliable interpretation of the original data [74].

3.4. Round 2 Delphi Survey

The 19 academics who completed the Round 1 survey were sent e-mails containing a link to the second round of the Delphi survey. This survey contained the list of delivery methods that they had rated in the first round. Academics were requested to reconsider their ratings while taking into consideration the Round 1 mean rating, which was shown in red text next to each delivery method. In order to achieve higher consensus, the academics were encouraged to adjust their rating toward the group mean rating (see Figure 1). They were also notified that if their new rating was more than one point away from the group mean rating, they should justify their response. As with the Round 1 survey, the standard deviation and mean were calculated to present academics’ ratings. Content analysis was used to analyze the academics’ comments.
To determine whether there were significant differences in the ratings by academic level and discipline, the Kruskal–Wallis H-test was performed. The Kruskal–Wallis H-test is a non-parametric technique used to assess whether two or more independent groups differ significantly in their ranks on a continuous or ordinal variable [75]. The test was performed for each blended learning delivery method, and the resulting p-values were examined. A p-value threshold (alpha) of 0.05 was used to determine significance: if the p-value was less than 0.05, the null hypothesis was rejected, and it was concluded that significant differences existed [76].
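To make the mechanics of the test concrete, the H statistic can be computed by hand as below. This is an illustrative sketch only (average ranks for ties, no tie correction); in practice a statistics package such as `scipy.stats.kruskal` would be used, which also returns the p-value from the chi-square distribution with k − 1 degrees of freedom.

```python
# Hand-rolled Kruskal-Wallis H statistic for k independent groups of
# ordinal ratings. Simplified: ties get average ranks, and the tie
# correction factor is omitted.
def kruskal_wallis_h(*groups):
    """Return (H, degrees of freedom) for the given groups."""
    pooled = [(v, gi) for gi, g in enumerate(groups) for v in g]
    pooled.sort(key=lambda t: t[0])
    n = len(pooled)
    # Assign 1-based ranks, averaging ranks across tied values.
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and pooled[j + 1][0] == pooled[i][0]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    # Sum of ranks per group.
    rank_sums = [0.0] * len(groups)
    for (value, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    # H = 12 / (N (N+1)) * sum(R_i^2 / n_i) - 3 (N+1)
    h = 12 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)
    return h, len(groups) - 1
```

The p-value is then the chi-square survival function of H at k − 1 degrees of freedom; if it falls below 0.05, the groups' rating distributions are judged to differ.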
To gain detailed insights into which specific groups differed from each other, a post-hoc test was performed following the Kruskal–Wallis H-test. A common post-hoc approach is the Mann–Whitney U test, which is used for pairwise comparisons [77]. The p-values from the pairwise Mann–Whitney U tests were adjusted for multiple comparisons using the Bonferroni correction, with an adjusted p-value of less than 0.05 considered statistically significant [78].
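The Bonferroni step is simple enough to spell out: each raw pairwise p-value is multiplied by the number of comparisons and capped at 1. The raw p-values below are hypothetical, not the study's figures.

```python
# Bonferroni adjustment for a set of pairwise comparison p-values.
def bonferroni(p_values):
    """Return Bonferroni-adjusted p-values, capped at 1.0."""
    m = len(p_values)
    return {pair: min(1.0, p * m) for pair, p in p_values.items()}

raw = {  # hypothetical raw p-values from three pairwise Mann-Whitney U tests
    ("both", "postgraduate"):          0.039,
    ("both", "undergraduate"):         0.870,
    ("postgraduate", "undergraduate"): 0.025,
}
adjusted = bonferroni(raw)
# A pair is declared significant only if its adjusted p-value is below 0.05,
# which is why a raw p-value under 0.05 can still fail after correction.
```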

4. Results

Nineteen participants (out of forty-eight invited) completed the Round 1 survey, a response rate of around 40%. For Round 2, fifteen of the nineteen Round 1 participants completed the survey, a response rate of slightly less than 80%.

4.1. Round 1 Results

Of the five delivery method categories that participants were requested to rate on a five-point ordinal scale, one method (20%) scored a mean importance rating equal to 4, and the other four (80%) scored mean importance ratings between 3 and 4. Overall, face-to-face collaborative work scored the highest mean (4), followed by online collaborative work (3.95). One delivery method achieved consensus; consensus was considered achieved when the standard deviation of an item was less than or equal to 1 [79]. Two delivery methods were near consensus, while the other two were somewhat further from it. The data in Table 2 present the panel mean score and the standard deviation (SD) for each delivery method category.
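The consensus rule above is a simple threshold check, sketched here for clarity; the SD values are illustrative, not the study's Table 2 figures.

```python
# Consensus rule used in this study: an item reaches consensus when the
# standard deviation of its ratings is at most 1.
def reached_consensus(sd, threshold=1.0):
    return sd <= threshold

example_sds = {  # illustrative standard deviations, not the study's data
    "face-to-face collaborative": 0.88,
    "online self-paced":          1.12,
}
status = {method: reached_consensus(sd) for method, sd in example_sds.items()}
```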
Eight academics responded to the open-ended question that required them to suggest additional delivery method categories. However, no new categories were identified. All the statements generated were comments such as “Regardless of the mode of delivery the role of instructor and peer collaboration are essential for the delivery of any course. Yes, self-paced activities are also highly important”, “These are comprehensive”, and “Instructor-led, collaboration and self-paced delivery methods are all applicable on-line. More important is if the course is synchronous or asynchronous”.

4.2. Round 2 Results

Of the five delivery method categories that respondents were asked to rate, one category (20%) scored a mean importance rating over 4, while the remainder (80%) scored mean importance ratings between 3 and 4. The experts reached consensus on three categories; the other two were near consensus (see Table 3).
The rating of face-to-face instructor-led learning moved slightly further from consensus compared to its Round 1 score; it was the only item in the Round 2 survey to show lower consensus, with its standard deviation increasing from 0.91 to 1.07. The ratings of the other four categories moved closer to consensus, especially online instructor-led and online collaborative work, whose standard deviations dropped markedly from 1.04 and 1.05 to 0.65 and 0.57, respectively.

4.3. Measuring the Impact of Disciplines and Academic Levels

The result of using the Kruskal–Wallis H-test to examine if there are significant differences in ratings among the different disciplines for each delivery method is shown in Table 4. As can be seen, the p-values for all the delivery methods are above 0.05, suggesting that there are no significant differences in ratings based on disciplines. Therefore, it was not necessary to perform post-hoc tests since the Kruskal–Wallis test did not indicate significant differences across the disciplines for any of the delivery methods.
For examining the impact of academic levels, the p-values obtained from the Kruskal–Wallis H-test for each blended learning delivery method are shown in Table 5. For online collaborative work, the p-value is 0.034, which is less than 0.05, indicating that there is a statistically significant difference in ratings for this method among different academic levels. For the other methods (face-to-face instructor-led, online instructor-led, face-to-face collaborative work, and online self-paced), the p-values are higher than 0.05, suggesting that there is no significant difference in ratings based on academic level.
To gain detailed insights into which specific groups differ from each other in rating online collaborative work, the Mann–Whitney U test was used to compare each pair of academic levels. The Bonferroni correction was also used to adjust the p-values for the multiple comparisons. The p-values from the pairwise Mann–Whitney U tests with Bonferroni correction for the “online collaboration” method are presented in Table 6.
The pairwise results for online collaborative work are as follows:
  • “both” vs. “postgraduate”: no statistically significant difference (p = 0.117).
  • “both” vs. “undergraduate”: no statistically significant difference (p = 1.0).
  • “postgraduate” vs. “undergraduate”: no statistically significant difference (p = 0.074).
In summary, the pairwise comparisons suggest that there are no significant differences in the ratings for online collaborative work among the participants teaching at different academic levels after adjusting for multiple comparisons.
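The pairwise procedure above can be sketched as follows, again with hypothetical rating vectors for the three academic-level groups. The Mann–Whitney U statistic is computed from rank sums, a two-sided p-value is obtained via the normal approximation (without tie correction, a simplification), and the Bonferroni correction multiplies each p-value by the number of comparisons, capped at 1.

```python
import math
from itertools import chain

def rank_sum_u(x, y):
    """Mann-Whitney U for sample x vs. y via rank sums (average ranks for ties)."""
    data = list(chain(x, y))
    order = sorted(range(len(data)), key=lambda i: data[i])
    r = [0.0] * len(data)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and data[order[j + 1]] == data[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1  # average 1-based rank for ties
        i = j + 1
    r1 = sum(r[:len(x)])                   # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2

def mwu_p_two_sided(x, y):
    """Two-sided p-value via the normal approximation (no tie correction)."""
    n1, n2 = len(x), len(y)
    u = rank_sum_u(x, y)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    return math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

def bonferroni(pvals):
    """Multiply each p-value by the number of comparisons, capping at 1."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

# Hypothetical ratings for the "both", "postgraduate", and "undergraduate" groups
both, post, under = [5, 4, 5, 4, 5], [3, 4, 3, 4, 3], [5, 4, 4, 5, 4]
raw = [mwu_p_two_sided(a, b) for a, b in [(both, post), (both, under), (post, under)]]
adj = bonferroni(raw)
print([round(p, 3) for p in adj])
```

For real analyses, `scipy.stats.mannwhitneyu` with an exact or tie-corrected method would be preferable for samples this small; the sketch only illustrates why Bonferroni-adjusted p-values can only grow relative to the raw ones.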

5. Discussion

This study identified the different delivery methods that need consideration when designing a blended learning course and rated the importance of each. The aim is to help academics prioritize their delivery options and construct a sustainable learning environment that supports students’ learning. Five delivery methods were identified and rated: face-to-face instructor-led, online instructor-led, face-to-face collaborative work, online collaborative work, and online self-paced.
Overall, online collaborative work and face-to-face collaborative work were the experts’ favorite delivery options, scoring the highest importance ratings. This demonstrates that students’ active engagement with course material through collaborative activities is vital for blended learning courses. It reinforces the findings of other research that learning is enriched when students go beyond the passive tasks of viewing, reading, or listening to engage in higher-order thinking, such as analysis, synthesis, and evaluation [80,81]. It also aligns with social constructivism learning theory, which stresses that learning takes place through communication, collaborative activity, and interactions with others [38,82], and which argues that collaboration enhances students’ ability to test their own ideas, synthesize their peers’ ideas, and construct a deeper understanding of the material being learned [83,84]. This, in turn, supports sustainable development and improves students’ capacity for self-employment [17].
Of the two, online collaborative work received a markedly higher importance rating than face-to-face collaborative work. This is consistent with previous studies that have recognized the need to shift towards a more active and group-oriented online learning approach [85,86]. Such a shift is essential in the post-COVID-19 era, considering the online learning expertise acquired by academics and students during the pandemic, and it should be achieved through new learning strategies such as gamification and group problem solving, which can motivate students to learn more, spend more time on tasks, become more engaged, and perform better [87,88]. Online technologies can also overcome geographical barriers to learners’ interactions. As noted by Cakula [89], the assurance of sustainable education lies in the accessibility, openness, and diversity of the learning environment. One expert explained: “I just finished a course where I was a student in a group with two people on the other side of the planet. Apart from the difficulty of scheduling real-time meetings, this worked fine. We used e-mail Skype and Google Docs”. However, experts noted some challenges associated with online collaboration. One of them explained: “Facilitation of the learning space is often overlooked in online courses. There is a certain level of student proficiency that needs to be considered. If they don’t have the skills, we can’t expect them to deliver the material we want them to”.
Online instructor-led was the experts’ third option, with a score very close to that of face-to-face collaborative work. This score, together with the high score of online collaborative work, highlights the need to design blended courses with a high proportion of online components in order to offer students greater flexibility and convenience. A larger online proportion can also minimize the environmental impact associated with traditional teaching and allow for greater scalability and adaptability in response to changing educational needs. Additionally, the reduced need for travel to physical classrooms decreases the carbon footprint associated with commuting, aligning with broader sustainability goals [90]. In fact, online tools such as video conferencing proved to play a significant role in enabling effective learning and teaching during and after the COVID-19 pandemic [91]. What might limit the proportion of online components in a blended course, according to the experts, is the absence of such online tools. An expert remarked: “If the course requires equipment the student will not have access to or them to work in a group, then that will dictate more on-campus work. Ideally I want to design courses which can be taken entirely on-line, with optional face-to-face activities”.
In fourth place was online self-paced. The experts appear to believe that active and group-oriented learning approaches are more appropriate for the blended experience. This aligns with other studies, which suggest that student–teacher and student–student interactions are crucial for learning [33,92]. This is not to say that online self-paced is inappropriate for blended learning; rather, it indicates how academics should prioritize their delivery options. An expert noted: “The role of instructor and peer collaboration are essential for the delivery of any course. Yes, self-paced activities are also important”. In fact, new strategies that utilize AI-powered technologies, such as adaptive learning, have been shown to provide personalized, efficient, and self-paced learning experiences [93,94]. They can personalize the educational journey, empowering students to progress at their own pace and according to their individual needs. This approach not only enhances student motivation and engagement but also promotes the development of critical digital literacy skills, preparing students for the demands of the modern workforce and contributing to the sustainability of the overall educational system [90].
Surprisingly, face-to-face instructor-led sat at the bottom of the list, rated well below the other delivery methods. The experts’ comments provided some justification for this low rating. In part, it seems to be related to the fewer face-to-face class hours that blended courses offer compared to traditional courses: with fewer face-to-face classes, a teacher can better use class time to engage students in collaborative activities. One expert commented: “online self-paced would be used to identify ‘gross’ or foundation knowledge, with face-to-face used to implement knowledge/skills and foster collaboration/discussion between student and educator”.
Another possible explanation for the low rating of face-to-face instructor-led might be the experts’ perception that blended courses need to adopt new teaching approaches. An expert explained: “When designing a blended learning course, it is important that teachers have an open mind to renew the course from a different perspective. Certainly the staff can incorporate traditional aspects but not be hampered by clinging to the traditional approaches”. In traditional face-to-face lectures, academics deliver instruction for most of the lecture time and give students little opportunity to ask questions, interact, and exchange knowledge [43]. A number of studies have found that students retain less information after a traditional lecture than after a lecture taught using an active learning approach [44,95]. Active learning, in contrast, is based on the premise that students who are more engaged learn more effectively and that their learning experience is more intense and permanent [96]. Active learning can be facilitated through collaborative activities [97].
An important point to mention is that discipline and academic level were not found to have an impact on participants’ ratings of the different delivery methods; the one significant Kruskal–Wallis result, for online collaborative work across academic levels, did not survive the Bonferroni-corrected pairwise comparisons. This indicates that the findings of this study are applicable across different disciplines and academic levels.
Overall, the Delphi method was an appropriate technique for examining the different delivery methods and rating their importance to blended learning courses. It made it possible to build consensus among a panel of academics drawn from different disciplines and geographically spread across New Zealand and Australia, for whom meeting face-to-face would not have been possible. By using the Delphi method, these academics were able to express their views privately without being intimidated by other participants [54]. The use of statistical analysis to analyze participants’ feedback helped ensure that all opinions were well represented in the final results [98,99].
While this research has enhanced the understanding of blended learning course design, it is important to acknowledge its limitations. The study’s participants were exclusively academics from universities in New Zealand and Australia, so the findings may be region-specific and of limited applicability elsewhere. Another limitation stems from the use of the Delphi method, particularly the challenge of getting all participants to answer the open-ended questions justifying their responses. Although many experts provided justifications for their ratings, not all did. As noted by Lim et al. [100] and Glastonbury and MacKean [101], the format of open-ended questions in surveys might not elicit the in-depth explanations typically found in individual interviews. A third limitation is that the participants were all selected for their expertise in online and blended learning; there might therefore be some bias towards online components.

6. Conclusions

This study identified and rated five delivery methods that need consideration when designing a blended course: face-to-face instructor-led, online instructor-led, face-to-face collaborative work, online collaborative work, and online self-paced. It contributes several findings regarding how academics should prioritize their delivery options and how sustainable learning can be built. From the outcomes of this study, it is possible to conclude that active engagement with course content through collaborative activities is the most preferred approach for blended learning courses, while traditional face-to-face instructor-led delivery appears least appropriate for 21st-century students. Moreover, teachers should incorporate more online components into their blended courses to give students more flexibility and to minimize the environmental footprint associated with traditional educational methods.
Academics’ exposure to e-learning during the COVID-19 pandemic has certainly accelerated the adoption of blended learning, so universities around the world would do well to consider the outcomes of this study. Traditional courses need to be redesigned with learning outcomes and available technology in mind. Online self-paced activities are appropriate for introducing students to new content. Class time is valuable and should be used to engage students in collaborative activities that make their learning intense and permanent, and that collaboration should continue online through activities designed to keep students actively engaged. Using each delivery method for what it does best can create a sustainable learning environment by promoting inclusivity, reducing environmental impact, and fostering student-centric learning. By harnessing the power of technology to complement traditional methods, educational institutions can build a resilient and adaptable framework that not only meets the diverse needs of students but also contributes to a more sustainable and environmentally conscious future.
Even though this study revealed several meaningful research findings, future studies are needed to investigate students’ preferences for the five delivery methods. This investigation would look at criteria that influence students’ preference for one method over another. It would also look at the relationship between blended learning design and the nature of the physical spaces in which it is delivered. It is also important to conduct the same study in a different region, e.g., the Middle East. This can be helpful in determining whether the results are context-dependent or not.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Human Research Ethics Committee of Monash University (Reference number: CF15/273—2015000132, date: 27 June 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

Data are unavailable due to privacy restrictions; no consent was sought from the subjects (participants) to make their responses publicly available.

Acknowledgments

I would like to thank all the esteemed academic participants who generously contributed their expertise and insights to enrich this Delphi study.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Versteijlen, M.; Wals, A.E.J. Developing Design Principles for Sustainability-Oriented Blended Learning in Higher Education. Sustainability 2023, 15, 8150. [Google Scholar] [CrossRef]
  2. Alammary, A.; Alshaikh, M.; Pratama, A.R. Awareness of security and privacy settings in video conferencing apps among faculty during the COVID-19 pandemic. PeerJ Comput. Sci. 2022, 8, e1021. [Google Scholar] [CrossRef] [PubMed]
  3. Theoret, C.; Ming, X. Our Education, Our Concerns: Medical Student Education Impact due to COVID-19. Med. Educ. 2022, 8, e1021. [Google Scholar] [CrossRef] [PubMed]
  4. Ehrlich, H.; McKenney, M.; Elkbuli, A. We Asked the Experts: Virtual Learning in Surgical Education During the COVID-19 Pandemic-Shaping the Future of Surgical Education and Training. World J. Surg. 2020, 44, 2053–2055. [Google Scholar] [CrossRef] [PubMed]
  5. Jones, K.; Sharma, R. On Reimagining a Future for Online Learning in the Post-COVID Era. 2020. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3578310 (accessed on 25 January 2024).
  6. Yu, T.; Dai, J.; Wang, C. Adoption of blended learning: Chinese university students’ perspectives. Humanit. Soc. Sci. Commun. 2023, 10, 390. [Google Scholar] [CrossRef]
  7. Alammary, A. Blended learning models for introductory programming courses: A systematic review. PLoS ONE 2019, 14, e0221765. [Google Scholar] [CrossRef] [PubMed]
  8. Yusoff, S.; Yusoff, R.; Md Noh, N.H. Blended learning approach for less proficient students. SAGE Open 2017, 7, 2158244017723051. [Google Scholar] [CrossRef]
  9. Alammary, A.S. A Toolkit to Support the Design of Blended Learning Courses. IEEE Access 2022, 10, 85530–85548. [Google Scholar] [CrossRef]
  10. Oliver, K.; Stallings, D. Preparing teachers for emerging blended learning environments. J. Technol. Teach. Educ. 2014, 22, 57–81. [Google Scholar]
  11. Alammary, A.S. How to Decide the Proportion of Online to Face-to-Face Components of a Blended Course? A Delphi Study. SAGE Open 2022, 12, 21582440221138448. [Google Scholar] [CrossRef]
  12. Li, C.; Cheung, S.K.S.; Wang, F.L.; Lu, A.; Kwok, L.F. Blended Learning: Lessons Learned and Ways Forward: 16th International Conference on Blended Learning; Springer Nature Switzerland: Hong Kong, China, 2023; p. 286. [Google Scholar]
  13. Müller, C.; Mildenberger, T.; Steingruber, D. Learning effectiveness of a flexible learning study programme in a blended learning design: Why are some courses more effective than others? Int. J. Educ. Technol. High. Educ. 2023, 20, 10. [Google Scholar] [CrossRef] [PubMed]
  14. Picciano, A.G. Blending with purpose: The multimodal model. J. Res. Cent. Educ. Technol. (RCET) 2009, 5, 4–14. [Google Scholar] [CrossRef]
  15. Kintu, M.J.; Zhu, C.; Kagambe, E. Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. Int. J. Educ. Technol. High. Educ. 2017, 14, 7. [Google Scholar] [CrossRef]
  16. Dziuban, C.; Graham, C.R.; Moskal, P.D.; Norberg, A.; Sicilia, N. Blended learning: The new normal and emerging technologies. Int. J. Educ. Technol. High. Educ. 2018, 15, 3. [Google Scholar] [CrossRef]
  17. Míguez-Álvarez, C.; Crespo, B.; Arce, E.; Cuevas, M.; Regueiro, A. Blending learning as an approach in teaching sustainability. Interact. Learn. Environ. 2022, 30, 1577–1592. [Google Scholar] [CrossRef]
  18. Ramalingam, S.; Yunus, M.M.; Hashim, H. Blended Learning Strategies for Sustainable English as a Second Language Education: A Systematic Review. Sustainability 2022, 14, 8051. [Google Scholar] [CrossRef]
  19. Alammary, A.S. LOsMonitor: A Machine Learning Tool for Analyzing and Monitoring Cognitive Levels of Assessment Questions. IEEE Trans. Learn. Technol. 2021, 14, 640–652. [Google Scholar] [CrossRef]
  20. Elsenheimer, J. Got tools? The blended learning analysis and design expediter. Perform. Improv. 2006, 45, 26–30. [Google Scholar] [CrossRef]
  21. Bath, D.; Bourke, J. Getting Started with Blended Learning; 2010, Volume 73. Available online: https://onlinelibrary.wiley.com/doi/abs/10.1002/pfi.4930450806 (accessed on 4 September 2022).
  22. Westerlaken, M.; Christiaans-Dingelhoff, I.; Filius, R.M.; de Vries, B.; de Bruijne, M.; van Dam, M. Blended learning for postgraduates; an interactive experience. BMC Med. Educ. 2019, 19, 289. [Google Scholar] [CrossRef]
  23. Farooq, M.S.; Hamid, A.; Alvi, A.; Omer, U. Blended Learning Models, Curricula, and Gamification in Project Management Education. IEEE Access 2022, 10, 60341–60361. [Google Scholar] [CrossRef]
  24. Bolliger, D.U.; Martin, F. Critical design elements in online courses. Distance Educ. 2021, 42, 352–372. [Google Scholar] [CrossRef]
  25. Bonk, C.J.; Graham, C.R. The Handbook of Blended Learning: Global Perspectives, Local Designs; Wiley, John & Sons, Incorporated: San Francisco, CA, USA, 2012. [Google Scholar]
  26. Wagner, E.D. On designing interaction experience for the next generation of blended learning. In The Handbook of Blended Learning: Global Perspectives, Local Designs; Bonk, C.J., Graham, C.R., Eds.; Pfeiffer Publishing: San Francisco, CA, USA, 2012; pp. 41–55. [Google Scholar]
  27. Birgin, O.; Topuz, F. Effect of the GeoGebra software-supported collaborative learning environment on seventh grade students’ geometry achievement, retention and attitudes. J. Educ. Res. 2021, 114, 474–494. [Google Scholar] [CrossRef]
  28. Moore, M.G. Editorial: Three types of interaction. Am. J. Distance Educ. 1989, 3, 1–7. [Google Scholar] [CrossRef]
  29. Powell, S.T.; Leary, H. Measuring learner–content interaction in digitally augmented learning experiences. Distance Educ. 2021, 42, 520–546. [Google Scholar] [CrossRef]
  30. Rannikmäe, M.; Holbrook, J.; Soobard, R. Social Constructivism—Jerome Bruner. In Science Education in Theory and Practice: An Introductory Guide to Learning Theory; Akpan, B., Kennedy, T.J., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; pp. 259–275. [Google Scholar]
  31. Kwangmuang, P.; Jarutkamolpong, S.; Sangboonraung, W.; Daungtod, S. The development of learning innovation to enhance higher order thinking skills for students in Thailand junior high schools. Heliyon 2021, 7, e07309. [Google Scholar] [CrossRef] [PubMed]
  32. Kliziene, I.; Taujanskiene, G.; Augustiniene, A.; Simonaitiene, B.; Cibulskas, G. The impact of the virtual learning platform EDUKA on the academic performance of primary school children. Sustainability 2021, 13, 2268. [Google Scholar] [CrossRef]
  33. Ho, I.M.K.; Cheong, K.Y.; Weldon, A. Predicting student satisfaction of emergency remote learning in higher education during COVID-19 using machine learning techniques. PLoS ONE 2021, 16, e0249423. [Google Scholar] [CrossRef]
  34. Ferreira, M.; da Silva Filho, O.L.; da Silva Nascimento, A.B.; Strapasson, A.B. Time and cognitive development: From Vygotsky’s thinking to different notions of disability in the school environment. Humanit. Soc. Sci. Commun. 2023, 10, 768. [Google Scholar] [CrossRef]
  35. Iaconelli, R.; Anderman, E.M. Classroom goal structures and communication style: The role of teacher immediacy and relevance-making in students’ perceptions of the classroom. Soc. Psychol. Educ. 2021, 24, 37–58. [Google Scholar] [CrossRef]
  36. Liu, W. Does teacher immediacy affect students? A systematic review of the association between teacher verbal and non-verbal immediacy and student motivation. Front. Psychol. 2021, 12, 2475. [Google Scholar] [CrossRef]
  37. Derakhshandeh, Z.; Vora, V.; Swaminathan, A.; Esmaeili, B. On the importance and facilitation of learner-learner interaction in online education: A review of the literature. In Proceedings of the Society for Information Technology & Teacher Education International Conference 2023, New Orleans, LA, USA, 13 March 2023; pp. 207–215. [Google Scholar]
  38. Okochi, C.; Gold, A.U.; Christensen, A.; Batchelor, R.L. Early access to science research opportunities: Growth within a geoscience summer research program for community college students. PLoS ONE 2023, 18, e0293674. [Google Scholar] [CrossRef] [PubMed]
  39. Lane, S.; Hoang, J.G.; Leighton, J.P.; Rissanen, A. Engagement and satisfaction: Mixed-method analysis of blended learning in the sciences. Can. J. Sci. Math. Technol. Educ. 2021, 21, 100–122. [Google Scholar] [CrossRef]
  40. Kokoç, M.; Altun, A. Building a Learning Experience: What Do Learners’ Online Interaction Data Imply? In Learning Technologies for Transforming Large-Scale Teaching, Learning, and Assessment; Sampson, D., Spector, J.M., Ifenthaler, D., Isaías, P., Sergis, S., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 55–70. [Google Scholar]
  41. Lockee, B.B. Online education in the post-COVID era. Nat. Electron. 2021, 4, 5–6. [Google Scholar] [CrossRef]
  42. Wang, Y.; Wu, J.; Chen, F.; Li, J. Analyzing Teaching Effects of Blended Learning with LMS: An Empirical Investigation. IEEE Access 2024, 12, 42343–42356. [Google Scholar] [CrossRef]
  43. Lewohl, J.M. Exploring student perceptions and use of face-to-face classes, technology-enhanced active learning, and online resources. Int. J. Educ. Technol. High. Educ. 2023, 20, 48. [Google Scholar] [CrossRef]
  44. Paul, J.; Jefferson, F. A Comparative Analysis of Student Performance in an Online vs. Face-to-Face Environmental Science Course From 2009 to 2016. Front. Comput. Sci. 2019, 1, 472525. [Google Scholar] [CrossRef]
  45. Kamble, A.; Gauba, R.; Desai, S.; Golhar, D. Learners’ Perception of the Transition to Instructor-Led Online Learning Environments: Facilitators and Barriers During the COVID-19 Pandemic. Int. Rev. Res. Open Distrib. Learn. 2021, 22, 199–215. [Google Scholar] [CrossRef]
  46. Kamal, T.; Illiyan, A. School teachers’ perception and challenges towards online teaching during COVID-19 pandemic in India: An econometric analysis. Asian Assoc. Open Univ. J. 2021, 16, 311–325. [Google Scholar] [CrossRef]
  47. Xu, E.; Wang, W.; Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit. Soc. Sci. Commun. 2023, 10, 16. [Google Scholar] [CrossRef]
  48. Sjølie, E.; Espenes, T.C.; Buø, R. Social interaction and agency in self-organizing student teams during their transition from face-to-face to online learning. Comput. Educ. 2022, 189, 104580. [Google Scholar] [CrossRef]
  49. Haugland, M.J.; Rosenberg, I.; Aasekjær, K. Collaborative learning in small groups in an online course—A case study. BMC Med. Educ. 2022, 22, 165. [Google Scholar] [CrossRef] [PubMed]
  50. Monteiro, E.; Morrison, K. Challenges for collaborative blended learning in undergraduate students. Educ. Res. Eval. 2014, 20, 564–591. [Google Scholar] [CrossRef]
  51. Salarvand, S.; Mousavi, M.-S.; Rahimi, M. Communication and cooperation challenges in the online classroom in the COVID-19 era: A qualitative study. BMC Med. Educ. 2023, 23, 201. [Google Scholar] [CrossRef]
  52. Aslam, S.; Akram, H.; Saleem, A.; Zhang, B. Experiences of international medical students enrolled in Chinese medical institutions towards online teaching during the COVID-19 pandemic. PeerJ 2021, 9, e12061. [Google Scholar] [CrossRef]
  53. Moustakas, L.; Robrade, D. The Challenges and Realities of E-Learning during COVID-19: The Case of University Sport and Physical Education. Challenges 2022, 13, 9. [Google Scholar] [CrossRef]
  54. Green, R.A. The Delphi technique in educational research. Sage Open 2014, 4, 2158244014529773. [Google Scholar] [CrossRef]
  55. Penciner, R.; Langhan, T.; Lee, R.; McEwen, J.; Woods, R.A.; Bandiera, G. Using a Delphi process to establish consensus on emergency medicine clerkship competencies. Med. Teach. 2011, 33, e333–e339. [Google Scholar] [CrossRef]
  56. Pickard, A.J. Research Methods in Information; Facet Publishing: London, UK, 2013. [Google Scholar]
  57. Xie, S.; Dong, S.; Chen, Y.; Peng, Y.; Li, X. A novel risk evaluation method for fire and explosion accidents in oil depots using bow-tie analysis and risk matrix analysis method based on cloud model theory. Reliab. Eng. Syst. Saf. 2021, 215, 107791. [Google Scholar] [CrossRef]
  58. Barrios, M.; Guilera, G.; Nuño, L.; Gómez-Benito, J. Consensus in the Delphi method: What makes a decision change? Technol. Forecast. Soc. Change 2021, 163, 120484. [Google Scholar] [CrossRef]
  59. Rubin, A.; Babbie, E. Brooks/Cole Empowerment Series: Essential Research Methods for Social Work; Cengage Learning: Boston, MA, USA, 2012. [Google Scholar]
  60. Syed Kholed, S.N.; Maon, S.N.; Mohd Hassan, N. Reliability and validity of the inter-professional collaboration practice instrument. J. Interprofessional Educ. Pract. 2021, 24, 100450. [Google Scholar] [CrossRef]
  61. Burton, L.J.; Mazerolle, S.M. Survey instrument validity part I: Principles of survey instrument development and validation in athletic training education research. Athl. Train. Educ. J. 2011, 6, 27–35. [Google Scholar] [CrossRef]
  62. Altay, S.; Berriche, M.; Heuer, H.; Farkas, J.; Rathje, S. A survey of expert views on misinformation: Definitions, determinants, solutions, and future of the field. Harv. Kennedy Sch. Misinformation Rev. 2023, 4, 1–34. [Google Scholar] [CrossRef]
  63. Mirza, M.; Harrison, E.A.; Miller, K.A.; Jacobs, E. Indicators of Quality Rehabilitation Services for Individuals with Limited English Proficiency: A 3-Round Delphi Study. Arch. Phys. Med. Rehabil. 2021, 102, 2125–2133. [Google Scholar] [CrossRef] [PubMed]
  64. Vogel, C.; Zwolinsky, S.; Griffiths, C.; Hobbs, M.; Henderson, E.; Wilkins, E. A Delphi study to build consensus on the definition and use of big data in obesity research. Int. J. Obes. 2019, 43, 2573–2586. [Google Scholar] [CrossRef]
  65. De Villiers, M.R.; De Villiers, P.J.T.; Kent, A.P. The Delphi technique in health sciences education research. Med. Teach. 2005, 27, 639–643. [Google Scholar] [CrossRef] [PubMed]
  66. Veugelers, R.; Gaakeer, M.I.; Patka, P.; Huijsman, R. Improving design choices in Delphi studies in medicine: The case of an exemplary physician multi-round panel study with 100% response. BMC Med. Res. Methodol. 2020, 20, 156. [Google Scholar] [CrossRef]
  67. Armstrong, J.S. Long Range Forecasting: From Crystal Ball to Computer; Wiley: New York, NY, USA, 1985. [Google Scholar]
  68. Ziglio, E. The Delphi method and its contribution to decision-making. In Gazing into the Oracle: The Delphi Method and Its Application to Social Policy and Public Health; Jessica Kingsley: London, UK, 1996; pp. 3–33. [Google Scholar]
  69. Retzer, A.; Ciytak, B.; Khatsuria, F.; El-awaisi, J.; Harris, I.M.; Chapman, L.; Kelly, T.; Richards, J.; Lam, E.; Newsome, P.N.; et al. A toolkit for capturing a representative and equitable sample in health research. Nat. Med. 2023, 29, 3259–3267. [Google Scholar] [CrossRef]
  70. Giannarou, L.; Zervas, E. Using Delphi technique to build consensus in practice. Int. J. Bus. Sci. Appl. Manag. (IJBSAM) 2014, 9, 65–82. [Google Scholar]
  71. Franc, J.; Hung, K.; Pirisi, A.; Weinstein, E. Analysis of Delphi Study Seven-Point Linear Scale Data by Parametric Methods–Use of the Mean and Standard Deviation. Prehospital Disaster Med. 2023, 38, s30–s31. [Google Scholar] [CrossRef]
  72. Neuendorf, K.A.; Kumar, A. Content analysis. In The International Encyclopedia of Political Communication; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015; pp. 1–10. [Google Scholar]
  73. Broomfield, C.; Noetel, M.; Stedal, K.; Hay, P.; Touyz, S. Establishing consensus for labeling and defining the later stage of anorexia nervosa: A Delphi study. Int. J. Eat. Disord. 2021, 54, 1865–1874. [Google Scholar] [CrossRef]
  74. Tsai, A.C.; Kohrt, B.A.; Matthews, L.T.; Betancourt, T.S.; Lee, J.K.; Papachristos, A.V.; Weiser, S.D.; Dworkin, S.L. Promises and pitfalls of data sharing in qualitative research. Soc. Sci. Med. (1982) 2016, 169, 191–198. [Google Scholar] [CrossRef] [PubMed]
  75. Grover, R.; Emmitt, S.; Copping, A. Trends in sustainable architectural design in the United Kingdom: A Delphi study. Sustain. Dev. 2020, 28, 880–896. [Google Scholar] [CrossRef]
  76. Han, Y.; Zeng, X.; Hua, L.; Quan, X.; Chen, Y.; Zhou, M.; Chuang, Y.; Li, Y.; Wang, S.; Shen, X.; et al. The fusion of multi-omics profile and multimodal EEG data contributes to the personalized diagnostic strategy for neurocognitive disorders. Microbiome 2024, 12, 12. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Extract from the Round 2 survey.
Table 1. Academics involved in the study.

| Characteristic | Number of Experts |
| --- | --- |
| Experience in instructional design | |
| 1 to 5 years | 1 |
| 6 to 10 years | 7 |
| 11 to 20 years | 8 |
| More than 20 years | 3 |
| Academic level | |
| Postgraduate | 3 |
| Undergraduate | 6 |
| Both | 10 |
| Experience with online delivery | |
| 1 to 3 years | 2 |
| 4 to 6 years | 4 |
| 7 to 10 years | 5 |
| More than 10 years | 8 |
| Discipline ¹ | |
| Information technology and computer science | 8 |
| Business and management | 6 |
| Education | 4 |
| Economics | 2 |
| Social sciences | 2 |
| Engineering | 1 |
| Library | 1 |
| Medicine | 1 |
| Exercise science | 1 |
| Human sciences | 1 |

¹ Some academics belong to more than one academic discipline.
Table 2. Round 1 result.

| Delivery Method | Round 1 Mean | Round 1 SD |
| --- | --- | --- |
| Face-to-face instructor-led | 3.74 | 0.91 |
| Online instructor-led | 3.84 | 1.04 |
| Face-to-face collaborative work | 4.00 | 1.17 |
| Online collaborative work | 3.95 | 1.05 |
| Online self-paced | 3.79 | 1.24 |
Table 3. Round 2 result.

| Delivery Method | Round 2 Mean | Round 2 SD |
| --- | --- | --- |
| Face-to-face instructor-led | 3.33 | 1.07 |
| Online instructor-led | 3.80 | 0.65 |
| Face-to-face collaborative work | 3.93 | 1.12 |
| Online collaborative work | 4.27 | 0.57 |
| Online self-paced | 3.67 | 0.79 |
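The per-method statistics reported above are simple means and sample standard deviations of the panel's 5-point importance ratings, and a falling SD from Round 1 to Round 2 is one common indicator of growing consensus in Delphi studies. A minimal sketch of this computation, using illustrative ratings from a hypothetical 15-member panel rather than the study's raw data:

```python
import statistics

# Illustrative 5-point Likert ratings for one delivery method across two
# Delphi rounds (hypothetical data, not the study's actual responses).
round1 = [5, 4, 3, 5, 2, 4, 5, 3, 4, 5, 2, 4, 5, 3, 4]
round2 = [4, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4]

for label, ratings in (("Round 1", round1), ("Round 2", round2)):
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # sample standard deviation
    print(f"{label}: mean = {mean:.2f}, SD = {sd:.2f}")

# A shrinking SD between rounds suggests the panel is converging.
```

In this illustrative data the Round 2 SD is smaller than the Round 1 SD, mirroring the convergence pattern visible between Tables 2 and 3.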
Table 4. Result of using the Kruskal–Wallis H-test to examine differences in ratings among the different disciplines.

| Delivery Method | p-Value |
| --- | --- |
| Face-to-face instructor-led | 0.432 |
| Online instructor-led | 0.146 |
| Face-to-face collaborative work | 0.196 |
| Online collaborative work | 0.573 |
| Online self-paced | 0.661 |
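The Kruskal–Wallis H-test used in Tables 4 and 5 compares the rating distributions of three or more independent groups by pooling all ratings, ranking them (with mid-ranks for ties), and measuring how far each group's rank sum departs from what chance would predict. The following stdlib-only sketch computes the tie-corrected H statistic on illustrative ratings (not the study's data); in practice the p-value would come from comparing H against a chi-squared distribution with k−1 degrees of freedom, which `scipy.stats.kruskal` does directly.

```python
from collections import Counter

def kruskal_wallis_h(*groups):
    """Tie-corrected Kruskal-Wallis H statistic for k independent samples."""
    pooled = [x for g in groups for x in g]
    n = len(pooled)
    # Assign 1-based average ranks, using mid-ranks for tied values.
    sorted_vals = sorted(pooled)
    rank_of = {}
    i = 0
    while i < n:
        j = i
        while j < n and sorted_vals[j] == sorted_vals[i]:
            j += 1
        rank_of[sorted_vals[i]] = (i + 1 + j) / 2  # mean of ranks i+1..j
        i = j
    h = 12 / (n * (n + 1)) * sum(
        sum(rank_of[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
    # Correct for ties: divide by 1 - sum(t^3 - t) / (n^3 - n).
    ties = sum(t**3 - t for t in Counter(pooled).values())
    return h / (1 - ties / (n**3 - n))

# Illustrative 5-point ratings of one delivery method by three disciplines.
h = kruskal_wallis_h([4, 5, 4, 3], [3, 4, 4, 5, 4], [5, 4, 3, 4])
print(f"H = {h:.3f}")
```

A large H (small p-value) would indicate that at least one discipline rated the method differently; the p-values in Table 4 are all above 0.05, consistent with no significant cross-discipline differences.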
Table 5. Result of using the Kruskal–Wallis H-test to examine differences in ratings among the different academic levels.

| Delivery Method | p-Value |
| --- | --- |
| Face-to-face instructor-led | 0.198 |
| Online instructor-led | 0.062 |
| Face-to-face collaborative work | 0.102 |
| Online collaborative work | 0.034 |
| Online self-paced | 0.312 |
Table 6. Results of the pairwise Mann–Whitney U tests with Bonferroni correction for online collaborative work.

| Academic Level Pair | p-Value |
| --- | --- |
| Postgraduate vs. both | 0.117 |
| Undergraduate vs. both | 1.000 |
| Postgraduate vs. undergraduate | 0.074 |
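Because a significant Kruskal–Wallis result (Table 5, online collaborative work, p = 0.034) does not say *which* groups differ, pairwise Mann–Whitney U tests follow, with a Bonferroni correction to guard against the inflated false-positive rate of multiple comparisons. The sketch below uses a normal approximation without tie correction and entirely illustrative ratings, so its numbers are not those of Table 6; exact tie-aware p-values would come from `scipy.stats.mannwhitneyu`.

```python
import math
from itertools import combinations

def mann_whitney_p(a, b):
    """Two-sided Mann-Whitney U p-value via the normal approximation
    (no tie correction; adequate for a sketch, not for publication)."""
    u = sum((x > y) + 0.5 * (x == y) for x in a for y in b)
    n1, n2 = len(a), len(b)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Illustrative ratings of online collaborative work by academic level
# (hypothetical data, not the study's responses).
groups = {
    "Postgraduate": [5, 5, 4],
    "Undergraduate": [4, 4, 5, 4, 5, 4],
    "Both": [4, 5, 4, 4, 5, 4, 5, 4, 4, 5],
}
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-corrected threshold: 0.05 / 3
for a, b in pairs:
    p = mann_whitney_p(groups[a], groups[b])
    print(f"{a} vs. {b}: p = {p:.3f} (significant if p < {alpha:.4f})")
```

Against the corrected threshold of roughly 0.0167, none of the p-values in Table 6 is significant, which explains why the pairwise follow-up did not localize the difference flagged by the omnibus test.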