Article

Impacts on Student Learning and Skills and Implementation Challenges of Two Student-Centered Learning Methods Applied in Online Education

by Lama Soubra 1,*, Mohammad A. Al-Ghouti 1, Mohammed Abu-Dieyeh 1, Sergio Crovella 1 and Haissam Abou-Saleh 1,2
1 Environmental Science Program, Department of Biological and Environmental Sciences, College of Arts and Sciences, Qatar University, Doha 2713, Qatar
2 Biomedical Research Center, QU Health, Qatar University, Doha 2713, Qatar
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(15), 9625; https://doi.org/10.3390/su14159625
Submission received: 4 June 2022 / Revised: 30 July 2022 / Accepted: 1 August 2022 / Published: 5 August 2022

Abstract:
Online education became more prevalent during the COVID-19 pandemic in many countries around the world, including the Gulf Cooperation Council (GCC) countries. This study aims at assessing the impacts on learning and skills of two student-centered instructional strategies (problem-based learning (PBL) and just-in-time teaching (JiTT)) used online, as well as their implementation challenges. The PBL and JiTT were implemented in modules taught in various courses delivered at different bachelor's study levels and disciplines. The research used a mixed-methods research design. Quantitative data were collected from exam scores and two self-administered surveys. Qualitative data were collected using individual structured interviews. The lecture-based learning method was used for comparisons. A total of 134 students participated in the quizzes and exams, 85 students completed the self-perceived impacts on learning and skills survey, and 82 students completed the implementation challenges survey. Ten students participated in the structured interviews. Test and survey scores showed that both online PBL and JiTT had significant impacts on students' learning and skills and that these effects are consistent across various disciplines. A non-conducive online learning climate, internet connectivity problems, heavy workloads, and time management issues were reported as the implementation challenges. The PBL and JiTT can be considered effective teaching/learning strategies in online education.

1. Introduction

As information becomes more readily accessible, technology more widespread, and competition more prominent, modern professions call for graduates with complex skills such as critical thinking, problem-solving, communication, organizational, collaboration, and self-directed learning skills, in addition to the core professional skills. To cope with the changing market needs and promote the achievement of the required skills, higher education institutions have had to reform their curricula and adopt innovative instructional methods relying on student-centered learning (SCL) pedagogy and digital technology [1,2]. Qatar University, as a leading institution of higher education, has embraced the need for instructional change and has endorsed student-centered education and digital learning by considering them as education excellence themes [3].
Student-centered learning (SCL), also known as learner-centered education, broadly encompasses teaching and learning methods that focus on creating and implementing active roles of the learners by placing them at the heart of learning [4]. It encourages students to deeply engage with the material, develop a dialogue and collaboration, critically think, and reflect on their progress [5]. It promotes many cognitive and soft skills such as critical thinking, problem-solving, organization, collaboration, and life-long learning. SCL is founded on the constructivism learning theory that enables learners to actively construct their knowledge from new and prior experiences [6].
There is a myriad of SCL instructional strategies, such as problem-based learning (PBL), case studies, just-in-time teaching (JiTT), flipped classrooms, and many others. These strategies focus on providing tools and learning environments that facilitate interactions and collaborations between students while promoting deep learning [7].
The benefits of integrating SCL instructional strategies in the physical classroom setting were reported to be substantial in terms of improved learning outcomes compared to the traditional instructional methods [8,9,10]. In addition, SCL strategies increase in-class teaching efficiency and effectiveness. They also improve students' preparation for the class sessions, endorse collaborative problem-solving during the class session, enhance student motivation for learning, and promote the ongoing formative assessment of student learning (by both instructors and students) [11]. In addition, they adopt the active learning approach that was found to promote student learning [12]. Furthermore, they provide structured opportunities for students to actively construct new knowledge from prior knowledge [11].
Online education, also called e-learning, distance learning, and distance education, is a form of education whose main elements include a physical separation of teachers and students during instruction and the use of various technologies, digital tools, and learning platforms to facilitate teacher–student and student–student interactions. Correspondence courses were among the first types of distance education, but distance education did not pick up steam until communications technology evolved in the 1990s [13]. Moreover, the advent of the COVID-19 pandemic accelerated the digitalization of universities all around the world and led to the need for a rapid transition to online education [14]. The transition from traditional face-to-face learning to online education was smooth for universities that had established experience in online education, and students appreciated the online education during the pandemic, the teachers' teaching skills, and the quality of online courses [13]. Nevertheless, this transition presented many challenges for the universities that had not engaged in online education prior to the pandemic. These challenges were reported to be related to the flexibility of the available learning platforms and digital tools, the acquaintance of both educators and learners with information and communication technologies (ICT), learning environments, and readiness for online teaching/learning [15,16,17,18,19,20,21,22,23,24]. It also raised uncertainties about the learning pedagogy that applies best to the online setting [17] and opened opportunities for good practices that are necessary for professional development [25].
Although some groups have already shared their teaching practices in online education, very few of them addressed SCL in the online setting [16,17,18,19,20,21,22,23,24,26]. Thus, it remains uncertain whether SCL instructional strategies delivered in the physical classroom setting will yield similar results when delivered online, due to contextual differences such as the lack of face-to-face interactions, lack of instructor availability during the whole session, non-conducive at-home learning environments, internet connection interruptions, and social isolation. Moreover, most of the previously published studies were reported from countries outside Qatar. Since the cultural background and social context might be fairly different, the findings of these studies cannot be fully extrapolated to the Qatari context to learn lessons and pave the road for future changes in the higher education system. In addition, most of these studies reported online education experiences instead of measuring their impacts on student outcomes. Furthermore, very few previous studies were performed using a structured methodology, and to our knowledge, they rarely assessed the effectiveness of a teaching method. Thus, we designed this research to address these research gaps and respond to the research needs in the field of online education in general and in the Qatari context in particular.
Therefore, this research aims to test the impacts of two student-centered learning (SCL) strategies, namely problem-based learning (PBL) and just-in-time teaching (JiTT), on student learning of the subject matter and skills, mainly critical thinking, problem-solving, communication, organization, and collaborative and independent learning skills, that are needed for contemporary professions. It also aims to assess whether these effects on learning and skills are consistent across the various course subjects and the students’ study levels and are sustained over time.
Figure 1 shows the research questions that are addressed by the current study:
  • What are the impacts of the traditional instructional strategy on short-term learning?
  • What are the impacts of the traditional instructional strategy on long-term learning?
  • SCL instructional strategies effectively improve learning when used in the physical classroom setting compared to traditional instructional strategies. Would this also be observed in online education?
  • SCL instructional strategies effectively develop students' critical thinking and problem-solving skills when used in the physical classroom setting. Would this also be observed in online education?
  • SCL instructional strategies are effective in motivating students to learn the course material when used in the physical classroom setting. Would this also be observed in online education?
  • SCL instructional strategies effectively develop students' communication, collaboration, and independent learning skills when used in the physical classroom setting. Would this also be observed in online education?
  • What are the challenges in implementing just-in-time teaching (JiTT), which includes short web-based exercises, in online education?
  • What are the challenges in implementing problem-based learning (PBL) in online education?
Figure 1. Study methods and research questions.

2. Problem-Based Learning (PBL) and Just-in-Time Teaching (JiTT) Overview

2.1. Problem-Based Learning (PBL)

Problem-based learning is an SCL instructional strategy that originated in medical education and has been widely adopted in diverse disciplines and educational contexts [27,28,29]. It is a form of active learning where students assume responsibility for their learning [30]. In principle, PBL revolves around four learning principles: constructivism, contextual learning, collaborative learning, and self-directed education [31]. In PBL, students learn about a subject while working in groups to solve an open-ended real-world problem [27]. The problem drives both the motivation to learn and the learning itself [27]. Critical to the success of PBL is the selection of the problem. The problem should be ill-structured, authentic, complex, and unexpected [32]. It should be able to motivate and enable the students to learn new materials in the process of solving the problem [32]. In the context of PBL, instructors act as facilitators, guiding the learning process and conducting a thorough debriefing at the end of the learning experience [31]. In brief, the goals of PBL are the acquisition of an integrated body of knowledge that can be retrieved, applied, and transformed when needed and the development of critical thinking, team-building, and self-directed learning skills that allow students to masterfully deal with new and complex problems in their careers [27]. PBL is a process that starts with a problem that students analyze as a group based on the background knowledge they have. Then, the group brainstorms possible solutions and decides what further information is needed to solve the problem. These ideas and suggestions are formulated as learning objectives afterward. Independent study follows as each group member is tasked to find the desired information. The group members gather again to share collected information, discuss the problem further in light of the new information obtained, and suggest possible solutions [33]. 
The students complete the learning process by reflecting with the intention to improve their learning performance [34]. They proceed to make generalizations about the problem so they can transfer their learning to new future problems [34]. The process ends with feedback and assessment of their individual work and team members’ work [34]. This process has been described as the seven classical steps of PBL: (1) understand the situation/clarify terminology, (2) identify the problem, (3) suggest possible causes (hypothesize), (4) connect problems and causes, (5) decide what type of information is needed, (6) obtain information, and (7) apply the information [35]. The process is repeated in many rounds until the problem is solved [34]. There is evidence supporting the effectiveness of PBL across various disciplines [27]. PBL was shown to effectively enhance longer knowledge retention and the application of knowledge [28,29,33,34,36]. In addition, PBL was found to promote the development of critical thinking skills, problem-solving abilities, communication skills, and self-directed learning skills [28,29,33,34,36]. It can also provide opportunities for working in groups and finding and evaluating research materials [37]. Therefore, PBL was reported to enhance interdisciplinary knowledge creation and collaborative skills [28,29,38]. The entire process is very engaging, which has been shown to improve retention and student satisfaction [28,29,33]. However, studies on the process are still inconclusive with regards to which step most significantly impacts students’ learning, although causal studies have demonstrated that the whole process is indispensable in influencing students’ learning outcomes [28].

2.2. Just-in-Time Teaching (JiTT)

Just-in-time teaching (JiTT) is an SCL teaching and learning strategy that is based on the interaction between web-based study assignments (warm-ups) and an active learner classroom. It relies on a feedback loop between web-based learning materials and the classroom [39,40,41]. JiTT consists of providing students with learning resources and short web-based assignments that are usually completed and returned to the instructor before the class session [39,40,41]. The instructor reviews students' responses to the assignments before the class session, adapts the lesson, and tailors class activities according to students' actual learning needs [39,40,41]. JiTT allows both students and instructors to be better prepared for the class session, yielding a more efficient use of the class time [39,40,41,42,43,44]. JiTT is built on the constructivism learning theory, where students actively construct their knowledge from prior knowledge [39,41]. Initially developed for introductory physics courses, its use has spread to various disciplines [39,45]. More recently, JiTT using video-based lectures (VBLs) was incorporated and was very well perceived by students [42,46]. JiTT has proven effective in enhancing students' learning, promoting the students' responsibility to learn the content, improving classroom climate, motivating students to learn, promoting good learning habits, and fostering deeper learning of the materials [42,43,44,46,47,48,49,50]. In addition, JiTT was found to increase student satisfaction and cognitive gains [44,47,48].

3. Materials and Methods

3.1. Adopted Instructional Strategies

Two SCL instructional strategies were selected and implemented online in this study, namely just-in-time teaching (JiTT) and problem-based learning (PBL). These strategies were implemented in two independent course modules within the same course. In addition, one course module was delivered using lecture-based learning (LBL) in the physical setting and was considered the reference standard for comparisons.

3.2. Setting

The study was conducted at the Department of Biological and Environmental Sciences, College of Arts and Sciences, Qatar University, between January and May 2022 (spring 2022 semester). The files, consents, and surveys were prepared by the researchers and submitted for Institutional Review Board (IRB) approval using the IRB net website before the start of the study. The study was approved by the IRB with the number 1823096-1.
Four courses (two from biological sciences and two from environmental science curricula) were selected: two courses were delivered at the junior level (BIOL 110 (Human Biology) and BIOL 212 (Genetics)), and the other two at the senior level (BIOL 452 (Molecular Analytical Techniques) and BIOL 433 (Monitoring and Toxicology)) (Table 1). The courses were selected based on the willingness of their instructors to participate in this study. Each course was run in one section and delivered by one instructor, with the exception of the human biology course, which was run in many sections and taught by multiple instructors. However, only one instructor teaching one section of the human biology course agreed to participate in this study. Table 2 presents a summary of the experiment.
All instructors who participated in this study had extensive experience in teaching the subject matter and received training on student-centered pedagogy, focusing on PBL and JiTT. All included courses were implementing an online PBL/JiTT component for the first time. These courses were redesigned to include at least one module that is taught online using PBL/JiTT. The course instructors had the freedom to select the module for PBL and to develop the problem.

3.3. Course Material Development and Implementation

All instructors followed the core PBL principles in the scenario design, including contextual, constructive, collaborative, and self-directed learning. Each instructor first articulated the learning objectives of the module to be delivered online using PBL, and then the PBL scenario was crafted. All crafted scenarios contained minimal information and an incomplete picture, mimicking real-life situations, with an embedded problem emerging from student brainstorming. Moreover, the complexity level of the scenarios depended on the course level; i.e., the courses taught at the junior level had simpler scenarios than those taught at the higher level. All scenarios were reviewed by an expert committee formed of two members with extensive experience in PBL to guarantee their quality and ability to meet the learning objectives. Examples of PBL scenarios are presented in Table 3.
Prior to the launch of the PBL module, the instructors randomly assigned the students who were enrolled in their courses into groups, with each group composed of 4–6 students. Moreover, the instructor explained the PBL process to them and his/her expectations. The PBL module was launched online using the distance-learning window of the Blackboard learning platform. The module was run fully online over four sessions with no face-to-face interactions between the students and the course instructors. The first and last sessions were scheduled by the instructor and conducted in the instructor’s presence, while the other two were scheduled by the students according to their own preferences and were conducted in the absence of the course instructors. The first session objectives were to define the problem and formulate learning objectives that would enable solving the problem.
Therefore, in the first session, the scenario was distributed to the students, who were asked to clarify concepts. In groups, students started to read the scenario presented to them and unpack its components in an open and inclusive brainstorming process. During this session, the students were guided by the following questions: What information is being given? What information is missing (what do we not know)? What is the problem that we need to address and resolve? What are the information and tools needed to solve the problem? In addition, each group had to define the problem, develop the hypotheses (based on the possible causes), rank them according to priorities, and prepare requests for additional data. The instructor moved among the groups during brainstorming, observing students' interactions, providing guidance when needed, and prompting them for data requests. Then, the whole class reconvened, and each group started to share their hypotheses and their data requests accordingly. Based on their hypotheses and data requests, the instructor incrementally released additional data related to the scenario. Once the whole scenario had been revealed, students were again split into their respective groups and started developing the problem statement (in the form of a question), formulating their learning objectives, and dividing tasks among group members. Afterward, during the second session, students had to work independently to investigate a topic area as determined in the first phase and prepare an individual report. This session was followed by the third session, where students shared their reports with their groups, essentially teaching their group members what they had learned. The group then discussed how this new knowledge informs the problem. Once all individual reports had been discussed, the group revisited the questions presented in the first session and attempted to address or solve the problem.
Moreover, during this session, students also collaborated to prepare a final report that outlined the solution and recommendations. The students were also asked to include supporting and properly cited evidence from their research and evidence for their online meetings and discussions in this report.
In the final session, solutions were shared and discussed in the presence of the whole class, and the instructor provided feedback and a brief recap of the main learned concepts.
The JiTT was introduced in two courses: BIOL 110 (Human Biology) and BIOL 433 (Monitoring and Toxicology). The course instructor also selected the study module where JiTT was to be used. The module was run over two online sessions: one asynchronous and one synchronous. In the asynchronous session, students had to watch a prerecorded lecture to learn the module content and complete the accompanying exercises/case studies before the scheduled class session time. Once the exercises were completed, the instructor went over the answers and tailored the content and activities of the upcoming lecture to the students' learning needs. This follow-up session was delivered online in a synchronous mode. Examples of exercises are presented in Table 4.

3.4. Participants

All students enrolled in the four courses described above participated in this study, yielding 134 students (Table 5). These students were initially divided into four cohorts based on their course enrollment. They were also divided into junior and senior student cohorts based on their study level.

3.5. Data Generation and Collection

To achieve the objectives of this study, a mixed-methods research design was used. This design combines the strengths of quantitative and qualitative data [51]. Quantitative data were collected from test scores and two surveys. In addition, structured individual interviews were conducted to generate qualitative data that would help explain findings from surveys.
Quizzes and final exams were prepared according to best practices guidelines and were reviewed by a committee composed of the researchers and the course instructors. The quizzes were knowledge-based and were administered to students a week after module completion. A set of knowledge-based questions and problems (or case studies that require higher-order thinking levels) related to the modules taught using these strategies were prepared and included in the final exams. The knowledge-based questions had a similar level of complexity to those of the quizzes. The final exams were administered during the final exam period as scheduled by Qatar University approximately 3 months after the teaching encounter. Mean test scores (±SD) were calculated for the quizzes and the set of questions of the final exams for the module delivered online using PBL and JiTT and for a module delivered in person using a traditional instructional strategy. Moreover, the percentages of students who passed the tests (i.e., graded above 5/10 on quizzes and answered correctly half or more of the questions related to the modules) were determined.
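The descriptive summary described above (mean ± SD of test scores plus the share of students graded above 5/10) can be sketched in a few lines of Python. This is an illustration only, using invented scores rather than the study's data, and `quiz_summary` is a hypothetical helper name:

```python
from statistics import mean, stdev

def quiz_summary(scores, pass_mark=5.0):
    """Return (mean, sample SD, % passing) for quiz scores out of 10.

    A student passes when graded strictly above the pass mark, mirroring
    the "graded above 5/10" criterion described in the text.
    """
    pass_rate = 100 * sum(s > pass_mark for s in scores) / len(scores)
    return mean(scores), stdev(scores), pass_rate

# Illustrative scores only (not the study's data):
m, sd, p = quiz_summary([4, 6, 8])  # mean 6.0, SD 2.0, ~66.7% passing
```

The same summary would be computed per module (PBL, JiTT, LBL) before comparing groups.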
Two surveys that aimed to assess the self-perceived impacts on student learning and skills of the two instructional strategies and the implementation challenges were developed based on a thorough review of the current literature. A committee composed of three experts qualitatively evaluated the face and content validity of the surveys. For the face validity, the experts were asked to give their comments on whether the measured items can truly assess the concept of the research. As for the content validity, the experts were asked to give their comments about the coherence of the questionnaire and the relevance, difficulty, and clarity of the items. The survey items were modified based on the received feedback.
Moreover, the surveys were pilot tested on 34 students to check their clarity, flow, and the time needed to complete them. The pilot test was conducted using a sample of students who were enrolled in another section of the human biology course and were also exposed to PBL and JiTT, just after completing the learning activities. No modifications were made to the surveys based on the pilot test results. The internal consistency of the questionnaire was measured by determining the Cronbach's alpha coefficient of the different sections and of the overall surveys (Table 6 and Table 7). The Cronbach's alpha coefficient values for the sections of the surveys and for the overall surveys were above 0.7, demonstrating that the surveys are reliable instruments [52]. Surveys collected during the pilot testing were not used in the final study sample.
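For readers unfamiliar with the reliability measure used here, Cronbach's alpha for a set of survey items can be computed directly from its standard formula. The sketch below is generic and uses made-up responses, not the study's survey data:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    `items` is a list of per-item score lists; position j in every list
    belongs to the same respondent. The formula is
    alpha = k/(k-1) * (1 - sum(item variances) / variance(respondent totals)).
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Perfectly consistent (identical) items give alpha = 1.0:
alpha = cronbach_alpha([[1, 2, 3, 4, 5]] * 3)  # 1.0
```

Values above 0.7, as reported for these surveys, are conventionally taken to indicate acceptable reliability.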
The students were invited to fill in the surveys at the end of the course. In the invitation, students were informed of the objectives of the study. They were also explicitly told that participation in the surveys is voluntary and will not affect the instructor/student relationships or students’ grades and that they can withdraw from the research at any time, without any consequences. The surveys were collected by one of the researchers and coded to ensure anonymity (each student’s survey was assigned a code), and collected data were entered into an Excel sheet and treated confidentially to serve the purpose of the study only. In addition, all participants had to provide written consent prior to filling in the surveys.
The self-perceived impacts on learning and skills survey contained five sections with a total of 18 items (Table 6): impacts on learning of the subject matter (5 items); impacts on intrinsic interest to learn (3 items); impacts on preparedness level (3 items); impacts on critical thinking and problem-solving skills (4 items); and impacts on personal skills (communication, collaboration, and self-directed learning) (3 items). All items were assessed using a 5-point Likert scale (1 (strongly disagree) to 5 (strongly agree)), and participants were asked to evaluate the extent to which they agreed with each of the statements included in each section. Mean scores for each section were then calculated to obtain a final score for the section. In addition, the percentages of students strongly agreeing and agreeing with each item statement of the five sections were determined.
The implementation challenges survey contained four sections with a total of 14 items (Table 7): adequacy of the learning platform (3 items), teaching and learning methods (4 items), learning environment (1 item), and ease of interactions (5 items). Each item was also assessed using a 5-point Likert scale (1 (strongly disagree) to 5 (strongly agree)). The percentages of students strongly agreeing and agreeing with each item statement were determined.
Finally, structured interviews were conducted to help explain the survey results. The interview questions addressed the aspects of the teaching method (PBL/JiTT) that students liked/disliked the most and the reasons behind that, as well as their feelings towards the use of these strategies in the online setting. The interview questions were administered in English to individual students by one of the researchers (who was not the students' instructor), using the Microsoft Teams app. The interviews were recorded and transcribed verbatim. An invitation to participate in the structured interview was sent via e-mail to the students enrolled in the selected courses after the end of the course and the survey collection period. Students were offered the option to select the interview date and time that best suited them based on a preset schedule. Here also, it was clearly stated that participation in the interview was voluntary, that it would not affect the instructor/student relationships or students' grades, and that collected data would be treated confidentially and used to serve the purpose of the study only. Each participant was assigned a code to ensure anonymity. Moreover, all participants had to provide written consent prior to participating in the interview.

3.6. Data Statistical Analysis

Quantitative data derived from exam scores and surveys were analyzed using descriptive statistics. Means with standard deviations were determined for continuous variables (test and survey scores) and compared using Student's t-test (when comparing two groups) and ANOVA (when comparing more than two groups) with post hoc analysis. Percentages were derived for categorical data (passing the exams, agreeing with the survey items) and were compared using Pearson's chi-square test. Pearson correlation and regression analyses between final exam scores and self-perceived impacts on learning of the subject matter were performed. Qualitative data generated from the transcription of the individual interviews were subjected to content analysis to explore the narrative themes and the students' main concepts related to impacts on learning and skills and implementation challenges.
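As an illustration of the two-group comparison described above, the pooled-variance Student's t statistic can be computed by hand; in practice, a p-value would then be read from the t distribution with n1 + n2 − 2 degrees of freedom. The sketch below uses invented group scores, not the study's data:

```python
from math import sqrt
from statistics import mean, variance

def student_t(a, b):
    """Two-sample Student's t statistic with pooled variance.

    Assumes equal population variances; degrees of freedom for the
    p-value would be len(a) + len(b) - 2.
    """
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

# Illustrative exam scores for two groups (invented data):
t = student_t([8.1, 7.4, 9.0, 8.6], [6.9, 7.2, 6.5, 7.8])  # positive t
```

ANOVA generalizes this comparison to more than two groups, and the chi-square test plays the analogous role for the percentage (categorical) outcomes.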

4. Results and Discussion

4.1. Test Scores

A total of 134 students participated in this study and completed both quizzes and final exams. Table 8 presents the average test scores of the quizzes and final exams for the modules taught using the different instructional methods. The final exam scores showed that the test scores for modules taught using online PBL were the highest, followed by the test scores for modules taught using online JiTT and the test scores for the module taught using in-person LBL. The difference in final exam scores was statistically significant. Moreover, although no statistically significant difference was observed among quiz scores for both courses, the quiz scores for modules taught using online PBL were the highest, followed by the scores for modules taught using online JiTT and the scores for modules taught using in-person LBL. This trend was observed among all cohorts and across different disciplines and student levels.

4.2. Self-Perceived Impacts on Learning and Skills

A total of 85 students participated in the self-perceived impacts on learning and skills survey, yielding a response rate of 63.4%. Forty-four (52%) students were junior students. Twenty-five participants (30%) were enrolled in the human biology course, 19 participants (22%) were enrolled in the genetics course, and 41 participants (48%) were senior students enrolled in the environmental science courses (BIOL 433 = 25, BIOL 452 = 16). The percentages of students strongly agreeing/agreeing with the statements on the impacts on learning and skills and the average scores (±SD) are presented in Table 9. Results showed that the PBL and JiTT used online were perceived to positively impact the understanding of the subject matter, in terms of improving the learning of the module material, concepts, and applications and enhancing the learning process (engagement with the course material and the instructor). Moreover, the PBL and JiTT used online were also perceived to increase the intrinsic interest in learning in terms of motivation for learning the module concepts (average score ± SD = 4.44 ± 0.83) and to improve the preparedness level for class discussions, exams, and workplace placement (average score ± SD = 3.65 ± 0.83). In addition, results showed that PBL and JiTT used online were perceived to enhance the students' skills in terms of critical thinking and problem-solving (average score ± SD = 4.19 ± 1.21) and communication, collaboration, and independent learning skills (average score ± SD = 3.98 ± 0.95). Further, analysis of variance showed that there was a statistically significant difference between senior students' and junior students' survey scores related to the self-perceived impacts on intrinsic interest in learning and preparedness level, but not those related to learning of the subject matter, critical thinking/problem-solving skills, and personal skills.
Finally, no statistically significant difference was observed among survey scores of groups enrolled in different courses that are delivered at the same study level (Table 10).

4.3. Correlation between the Self-Perceived Impacts on Learning and Critical Thinking and Problem-Solving Skills and Performance on the Final Exam

Figure 2 shows the correlation between the scores on the learning of the subject matter section of the survey, when online PBL and JiTT are used, and the students' performance as reflected by their final exam scores. The linear regression yields a slope of 1.76 and an intercept of 1.1037. Importantly, the regression analysis indicates a significant relationship (r(83) = 0.852, p < 0.001) between the scores for the impacts on learning of the subject matter when online PBL/JiTT is used and the final exam scores on the PBL/JiTT module, with R2 = 0.727.

4.4. Implementation Challenges

A total of 82 students participated in the implementation challenges survey, yielding a response rate of 61.2%. Forty-four (53%) were junior students: 25 (30%) were enrolled in the human biology course and 19 (23%) in the genetics course. The remaining 38 participants (47%) were enrolled in the environmental sciences courses (22 in BIOL 433, 16 in BIOL 452). The percentages of students strongly agreeing/agreeing with the statements related to the challenges faced during the online implementation of PBL and JiTT are presented in Table 11. Results showed that the available learning platforms were adequate for the online implementation of both PBL and JiTT. Moreover, around 90% of the participants strongly agreed/agreed that both learning strategies were suitable for online education and that it was not difficult for them to sustain focus and interest during online sessions or to collaborate and communicate with one another. In addition, 90% of the participants strongly agreed/agreed that online interaction with their colleagues and instructors was easy, and 73% agreed that it was similar to the in-class physical setting. However, more than 50% strongly agreed/agreed that the online learning environment is not conducive to learning because of internet instability and noisy at-home environments, and only 44% of the participants strongly agreed/agreed that interactions and communications with other teams and the whole class were as easy as they would have been in the physical class setting.

4.5. Structured Interviews

Ten students participated in the structured interviews, of whom six were seniors. Data analysis indicated that the most liked aspects of online PBL as an instructional strategy were its ability to fully engage students in the learning process; to show students how the learned material applies to real-life situations; to enhance their learning of the subject matter through teaching others, discussions, and searching for solutions to the problem; to give students control over their learning; to develop their skills, such as research, communication, teamwork, leadership, analysis, and problem-solving skills; to engage all team members in the learning activities; and to make students accept and value the opinions of other team members. In addition, two concerns emerged related to the use of SCL instructional strategies. The first addressed the workloads imposed by both PBL and JiTT, and the second related to time management, in terms of students having to organize their learning activities and tasks so as to be ready for collaborative activities. Regarding online JiTT, data analysis indicated that students found that JiTT made learning more meaningful, provided an opportunity for timely feedback, helped them identify their learning needs, reduced stress during class sessions, and developed their problem-solving skills.
As for recommending PBL and JiTT as instructional strategies in the online setting, most participants highly recommended both methods because they were enjoyable, could be easily implemented using the available technology and learning platforms, and provided more flexibility in the time and place of learning encounters. However, three themes emerged as concerns for the use of PBL in the online mode. The first related to the learning climate (learning places), which was described as unusual and non-conducive to learning (such as cafeterias, coffee shops, and homes). The second related to the lack of social interaction, which might hinder the development of social skills and collegiality among team members. The third related to internet connectivity.

4.6. Interpretation of Findings

This study aimed to assess the impacts on learning and skills of two SCL instructional strategies, PBL and JiTT, used in an online setting, as well as their implementation challenges.
Results of this study showed that online PBL and JiTT are as effective as face-to-face LBL for short-term knowledge acquisition and retention, as demonstrated by the absence of a significant difference between the quiz scores and the percentages of students passing the quizzes for the modules taught using the three instructional strategies. Moreover, based on the test scores, online PBL and JiTT had significant positive impacts on long-term knowledge acquisition and retention, as well as on critical thinking and problem-solving skills, when compared to the LBL method. These impacts were maintained across the various course disciplines and study levels. These results are similar to findings from other studies in which PBL and JiTT were used in the physical classroom in a variety of courses delivered at various study levels [36,38,42,47,48,49,51,52,53,54,55,56]. These results can be explained by the pedagogy underpinning the instructional strategies used. Both PBL and JiTT are active learning pedagogies that engage students in deep learning through thinking, investigating, discussing, and creating. They also provide students with multiple opportunities for deep engagement and interaction with the learning content. Furthermore, applying new knowledge to solve problems helps students organize knowledge, make connections, and develop a deeper understanding of the course material. In other words, these strategies promote a deep approach to learning, which has been reported to improve long-term knowledge acquisition and retention and to enhance critical thinking and problem-solving skills [8,9,10,50,57,58,59,60]. In contrast to active learning, passive learning requires the student to absorb information that is usually presented in the form of lectures. This promotes a surface approach to learning, which has been reported to enhance students' ability to recall facts rather than to derive meaning from what they learn. It also induces convergent thinking, where a given question typically has only one right answer, and therefore enables students to perform well in knowledge-based quizzes administered shortly after the lecture, as observed in the current study [5].
Further, the results of this study suggest that these instructional strategies maintain their effectiveness in terms of impacts on learning and critical thinking and problem-solving skills in the online learning setting. This result is further supported by a limited number of studies that compared online PBL with PBL delivered in the physical classroom setting [60,61,62,63,64].
Moreover, students' self-perceived impacts on learning of the subject matter and on critical thinking and problem-solving skills correlated well with their performance on the final exams. This finding indicates that these methods transfer well to the online setting and remain able to highly engage students in learning the subject matter and in critical thinking and problem-solving activities. This result is further confirmed by the responses to the interview questions, in which students reiterated the ability of these instructional strategies to engage them in the learning process, help them learn the subject matter, and develop their critical thinking and problem-solving skills.
Moreover, the results of this study revealed that both PBL and JiTT were perceived by the students to have positive impacts on their learning and skills in the five survey domains: learning of the subject matter, intrinsic interest in learning the subject matter, preparedness level, critical thinking and problem-solving skills, and personal skills. This result has been variably reported in the literature: some studies showed positive and higher impacts of online PBL on learning and skills, whereas one study showed lower impacts of online PBL when compared to face-to-face PBL [61,62,63,64]. Moreover, even though the self-perceived impacts on learning and skills were positive for the cohort as a whole, the junior cohort had significantly lower self-perceptions than the senior cohort in two survey domains: intrinsic interest in learning the subject matter and preparedness level. This finding might be multifactorial. The first factor may be the increased cognitive effort associated with SCL instructional strategies, which can negatively affect students' motivation and engagement in learning [65]. Another factor may be the students' course enrollment motivation [66,67]. The reasons motivating students to enroll have a powerful influence on their intrinsic interest in learning the subject matter and on their perceptions of its importance for their future careers. Indeed, most of the junior students enrolled in the human biology and genetics courses because these were designated as general university-required courses for certain majors. Therefore, they may have limited interest in learning the subject matter and limited insight into its usefulness, and are thus less able to self-assess the impacts of the teaching pedagogy on their motivation to learn and on preparing them for future careers.
A final factor could be that junior students are unfamiliar with such active learning instructional strategies and, therefore, may not be able to appreciate all their short-term and long-term benefits [68].
Although the use of PBL and JiTT as instructional strategies was perceived to positively impact students’ learning and skills, their implementation was coupled with many challenges from the students’ point of view.
Firstly, the use of these instructional strategies in the online setting was perceived to impose heavy workloads on the students. Indeed, it is well reported in the literature that both SCL strategies and online education pose additional workloads for students [5,18,69]; hence, combining the two would be expected to produce a perception of increased workload. Perceived workloads have been reported to influence students' approaches to learning, making them more inclined towards the surface approach. Therefore, this challenge should be addressed carefully when considering the implementation of SCL instructional strategies in online education. Workloads might be adjusted through close coordination between courses delivered at the same study level and by varying the types of active learning activities (using a combination of low-stakes and high-stakes strategies) within the same course. Calculating students' workloads in hours is also a recommended strategy, as it helps plan course activities and tasks so that they do not impose heavy workloads on students.
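As a minimal illustration of the workload-in-hours approach, a course planner might tally the estimated hours for the activities scheduled in a given week. The activity names and time estimates below are invented assumptions for demonstration, not figures from this study or from any particular course design guideline.

```python
# Hypothetical per-activity time estimates (hours) for one PBL/JiTT module.
# All names and values here are illustrative assumptions.
ACTIVITY_HOURS = {
    "pre_class_reading": 1.5,    # e.g., JiTT warm-up reading
    "warmup_quiz": 0.5,          # e.g., JiTT pre-class questions
    "synchronous_session": 2.0,  # e.g., online PBL tutorial
    "group_work": 3.0,           # collaborative problem-solving
    "individual_study": 2.0,     # self-directed research
}

def weekly_workload(activities):
    """Sum the estimated hours for the activities planned in one week."""
    return sum(ACTIVITY_HOURS[a] for a in activities)

week_1 = ["pre_class_reading", "warmup_quiz",
          "synchronous_session", "group_work"]
print(f"Estimated week 1 workload: {weekly_workload(week_1):.1f} h")
# With the estimates above, week 1 totals 7.0 h, which could then be
# checked against an institutional per-course weekly-hours ceiling.
```

Such an estimate, repeated per week and summed across concurrent courses at the same study level, is one way the coordination suggested above could be made concrete.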
In addition, some students reported having difficulty managing their time to complete the assigned tasks on time and at an appropriate performance level. This could be attributed to differences in learning abilities, where quick and moderate learners might complete the assigned tasks more easily and at a faster pace than slow learners [70]. Although time management issues have also been reported in courses delivered using traditional learning methods in the physical classroom setting, they are more critical when online SCL strategies are employed. Indeed, students' actual learning under active learning strategies depends significantly on their level of engagement in the assigned tasks. Moreover, the lack of physical interaction in the online setting makes it difficult for instructors to keep all students engaged at the same level and to attend to each student's learning needs. Therefore, appropriate planning of activities and student monitoring would be effective strategies to help all students achieve their learning goals.
Likewise, internet instability was reported, in both the survey responses and the student interviews, as the main challenge for the online implementation of synchronous PBL and JiTT sessions. This was also described in previous studies that tackled online education in general [69,71]. Since communication, interaction, and collaboration among learners are core characteristics of these methods [5], the impact of internet instability on learning should not be overlooked.
A non-conducive learning climate and a lack of socialization were also reported among the implementation challenges. The learning climate plays an essential role in students' academic life because it significantly influences their learning processes [72]. Indeed, in contrast to the physical classroom setting, online learning can occur in places that were not conceived for that purpose, such as cafeterias, homes, and cars. In such places, students' mindsets may not be attuned to learning; moreover, these places may be noisy and full of distractions, which can disturb the learning process.
Moreover, in the online setting, students are deprived of the opportunity to build rapport with classmates [69]. All these factors might have implications for students' learning in the online setting. Hence, educators need to take these factors into consideration when designing their online learning modules to optimize students' learning experiences. The use of blended learning may also help address these challenges.
Lastly, similar to other studies, these instructional strategies and online education seem to be better appreciated by students when working in small groups rather than in large groups [73,74,75,76]. This might be explained by the fact that working in small groups provides students with a conducive and collaborative learning environment and facilitates their adaptation to new learning environments and pedagogies. The use of blended learning may help address this challenge as well.

4.7. Limitations and Opportunities

This study has many strengths, including the selection of the SCL instructional strategies tested, the use of a mixed design research method, the use of the in-person LBL method as a standard instructional strategy for comparison, the application of the tested instructional strategies in multiple courses delivered at different study levels (junior and senior), and being among the very few studies to assess the impacts of online SCL strategies on student learning and skills. The study also has limitations. First, there was no direct comparison between the same SCL instructional methods used in the physical classroom setting and in the online setting. Second, the implementation challenges survey did not address challenges related to information technology (IT) skills.
Indeed, the students enrolled in these courses belonged to Generation Z, recognized as the first social generation to have grown up with access to the Internet and portable digital technology from a young age; members of Generation Z are considered adept with information technology. Moreover, their IT skills and computer literacy were further developed by the online education implemented during the COVID-19 pandemic.
Moreover, students were still able to report IT-related challenges during the interviews. In addition, no observations were conducted by the instructors or the researchers during student-led and/or teacher-led sessions, and no formal feedback was collected from the instructors on the implementation challenges. Indeed, this study aimed to assess effectiveness and challenges from the student perspective, and many meetings were held between the researchers and the instructors to optimize the student learning experience. Finally, only female students participated in this study, which might limit the generalization of the findings, since males and females might differ in their learning method preferences and views.
Despite these limitations, this study yielded important information that can inform educational planners and academicians on effective instructional strategies that can be applied in online education in times of crisis. In addition, future research can build on the key issues and limitations identified in this research.

5. Conclusions

Despite the abundance of data showing the effectiveness of PBL and JiTT on learning and skills in the physical setting, limited data exist on their usefulness in online education. This study showed that, based on the test scores, JiTT and PBL, when used online, were as effective as face-to-face LBL in promoting short-term learning and more effective than face-to-face LBL in promoting long-term learning, problem-solving, and critical thinking skills (research questions A, B, C, and D). In addition, based on the self-perceived impacts on learning and skills survey and the interview responses, these instructional methods, when used online, were perceived to promote students' critical thinking and problem-solving skills; their motivation for learning; and their communication, collaboration, and independent learning skills (research questions D, E, and F). Moreover, the main challenges for the online implementation of these methods, as revealed by the survey responses and interviews, were internet instability, the lack of a learning climate, and the lack of socialization; the interviews highlighted two additional challenges related to workloads and time management (research questions G and H).
In conclusion, this study demonstrated that these methods, when used online, had positive impacts on students’ learning and skills, and that these impacts were consistent across various disciplines and study levels. Therefore, PBL and JiTT can be considered as effective teaching/learning strategies that might be used in various disciplines and study levels in online education. Moreover, the findings of the present study shed light on the need for future studies that focus on comparing the same SCL learning strategy when used in different education modes (physical, blended, and online), and on identifying factors that would address identified challenges to optimize the students’ learning experiences in the online setting.

Author Contributions

Conceptualization, L.S.; formal analysis, L.S., M.A.A.-G. and M.A.-D.; funding acquisition, L.S., M.A.A.-G. and M.A.-D.; investigation, L.S., S.C. and H.A.-S.; methodology, L.S., M.A.A.-G. and M.A.-D.; project administration, L.S., M.A.A.-G., M.A.-D., S.C. and H.A.-S.; writing—original draft, L.S.; writing—review and editing, M.A.A.-G., M.A.-D., S.C. and H.A.-S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was conducted under a grant from the Qatar National Research Fund (a member of Qatar Foundation) (RRC02-0824-210043).

Institutional Review Board Statement

This study received the approval of Qatar University IRB (1823096-1).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

Special acknowledgment to all students and instructors who participated in this study, and to the Qatar National Research Fund for making this research possible. The publication was made possible by RRC2 Grant # RRC02-0824-210043 from the Qatar National Research Fund (a member of Qatar Foundation).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. American Association for the Advancement of Science (AAAS). Describing and Measuring STEM Teaching Practices: A Report from a National Meeting on the Measurement of Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Teaching; American Association for the Advancement of Science: Washington, DC, USA, 2013; Available online: http://ccliconference.org/files/2013/11/Measuring-STEM-Teaching-Practices.pdf (accessed on 31 July 2022).
  2. Henderson, C.; Beach, A.; Finkelstein, N. Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. J. Res. Sci. Teach. 2011, 48, 952–984. [Google Scholar] [CrossRef]
  3. Qatar University. Qatar University Strategy 2018–2022. From Reform to Transformation. 2018. Available online: http://www.qu.edu.qa/static_file/qu/about/documents/Qatar%20University%20Strategy%202018-2022%20Booklet%20-%20EN.pdf (accessed on 31 July 2022).
  4. Jones, L. The Student-Centered Classroom; Cambridge University Press: Cambridge, UK, 2007. [Google Scholar]
  5. Biggs, J. Student Approaches to Learning and Studying; Australian Council for Educational Research: Melbourne, Australia, 1987. [Google Scholar]
  6. McLeod, S. Constructivism as a Theory for Teaching and Learning. Simply Psychology. 2019. Available online: https://www.simplypsychology.org/constructivism.html (accessed on 31 July 2022).
  7. Hoidn, S. Student-Centered Learning Environments in Higher Education Classrooms; Palgrave Macmillan: New York, NY, USA, 2017. [Google Scholar]
  8. McGreevy, K.M.; Church, F.C. Active learning: Subtypes, intra-exam comparison, and student survey in an undergraduate biology course. Educ. Sci. 2020, 10, 185. [Google Scholar] [CrossRef]
  9. Trinidad, J.E. Understanding student-centred learning in higher education: Students’ and teachers’ perceptions, challenges, and cognitive gaps. J. Furth. High. Educ. 2020, 44, 1013–1023. [Google Scholar] [CrossRef]
  10. Chau, S.; Cheung, C. Academic satisfaction with hospitality and tourism education in Macao: The influence of active learning, academic motivation, and student engagement. Asia Pacific J. Educ. 2018, 38, 473–487. [Google Scholar] [CrossRef]
  11. Saed, S.; Xiangyun, D. University faculty’s perceptions and practices of student centered learning in Qatar: Alignment or gap? J. Appl. Res. High. Educ. 2018, 10, 514–533. [Google Scholar] [CrossRef] [Green Version]
  12. Udovic, D.; Morris, D.; Dickman, A.; Postlethwait, J.; Wetherwax, P. Workshop Biology: Demonstrating the effectiveness of active learning in an introductory biology course. BioScience 2002, 52, 272–281. [Google Scholar] [CrossRef] [Green Version]
  13. Boca, G.D. Factors Influencing Students’ Behavior and Attitude towards Online Education during COVID-19. Sustainability 2021, 13, 7469. [Google Scholar] [CrossRef]
  14. Frecker, K.; Bieniarz, E. Why Online Education Is Here to Stay. Available online: https://www.lighthouselabs.ca/en/blog/why-education-is-moving-online-for-good (accessed on 31 July 2022).
  15. Pokhrel, S.; Chhetri, R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
  16. Sunasee, R. Challenges of Teaching Organic Chemistry during COVID-19 Pandemic at a Primarily Undergraduate Institution. J. Chem. Educ. 2020, 97, 3176–3181. [Google Scholar] [CrossRef]
  17. Murgatrotd, S. COVID-19 and Online Learning. 2020. Available online: https://www.researchgate.net/publication/339784057_COVID-19_and_Online_Learning?channel=doi&linkId=5e653d424585153fb3cdf241&showFulltext=true (accessed on 31 July 2022).
  18. Donitsa-Schmidt, S.; Ramot, R. Opportunities and challenges: Teacher education in Israel in the COVID-19 pandemic. J. Educ. Teach. 2020, 46, 586–595. [Google Scholar] [CrossRef]
  19. Dutta, D.A. Impact of Digital Social Media on Indian Higher Education: Alternative Approaches of Online Learning during COVID-19 Pandemic Crisis. Int. J. Sci. Res. Publ. 2020, 10, 604–611. [Google Scholar] [CrossRef]
  20. Ghazi-Saidi, L.; Criffield, A.; Kracl, C.L.; McKelvey, M.; Obasi, S.N.; Vu, P. Moving from Face-to-Face to Remote Instruction in a Higher Education Institution during a Pandemic: Multiple Case Studies. Int. J. Technol. Educ. Sci. 2020, 4, 370–383. [Google Scholar] [CrossRef]
  21. Murphy, M.P.A. COVID-19 and emergency eLearning: Consequences of the securitization of higher education for post-pandemic pedagogy. Contemp. Secur. Policy 2020, 41, 492–505. [Google Scholar] [CrossRef]
  22. Qiang, Z.; Obando, A.G.; Chen, Y.; Ye, C. Revisiting Distance Learning Resources for Undergraduate Research and Lab Activities during COVID-19 Pandemic. J. Chem. Educ. 2020, 97, 3446–3449. [Google Scholar] [CrossRef]
  23. Regier, D.S.; Smith, W.E.; Byers, H.M. Medical genetics education in the midst of the COVID-19 pandemic: Shared resources. Am. J. Med. Genet. 2020, 182, 1302–1308. [Google Scholar] [CrossRef] [PubMed]
  24. Van der Spoel, I.; Noroozi, O.; Schuurink, E.; van Ginkel, S. Teachers’ online teaching expectations and experiences during the Covid-19-pandemic in the Netherlands. Eur. J. Teach. Educ. 2020, 43, 623–638. [Google Scholar] [CrossRef]
  25. Moore, R.L.; Fodrey, B.P.; Piña, A.A.; Lowell, V.L.; Harris, B.R. Distance Education and Technology Infrastructure: Strategies and Opportunities. In Leading and Managing e-Learning; Springer: Berlin/Heidelberg, Germany, 2018; pp. 87–100. [Google Scholar]
  26. Rossia, I.V.; de Limaa, J.; Sabatkea, B.; Ferreira Nunesa, M.A.; Ramirez, G.E.; Ramirezc, M.E. Active learning tools improve the learning outcomes, scientific attitude and critical thinking in higher education: Experiences in an online course during the COVID-19 pandemic. Biochem. Mol. Biol. Educ. 2021, 49, 888–903. [Google Scholar] [CrossRef]
  27. Barrows, H.S. Problem-Based Learning Applied to Medical Education; Springer: New York, NY, USA, 2000. [Google Scholar]
  28. Yew, E.; Goh, K. Problem-Based Learning: An Overview of its Process and Impact on Learning. Health Prof. Educ. 2016, 2, 75–79. [Google Scholar] [CrossRef] [Green Version]
  29. Zakaria, M.; Maat, S.; Khalid, F. A Systematic Review of Problem Based Learning in Education. Creat. Educ. 2019, 10, 2671–2688. [Google Scholar] [CrossRef] [Green Version]
  30. Grabinger, S.; Dunlap, J.C. Problem-Based Learning as an Example of Active Learning and Student Engagement. In Advances in Information Systems; Yakhno, T., Ed.; Lecture Notes in Computer Science; Springer: Berlin, Heidelberg, 2002; Volume 2457. [Google Scholar] [CrossRef]
  31. Barrows, H.S. Essentials of problem-based learning. J. Dent. Educ. 1998, 62, 630–633. [Google Scholar] [CrossRef] [PubMed]
  32. Barrows, H.S. An overview of authentic problem-based learning. In Authentic Problem-Based Learning: Rewriting Business Education; Wee, K.N.L., Kek, C.M.A., Eds.; Prentice Hall: Singapore, 2002; pp. 1–9. [Google Scholar]
  33. Klegeris, A.; Hurren, H. Impact of problem-based learning in a large classroom setting: Student perception and problem-solving skills. Adv. Physiol. Educ. 2011, 35, 408–415. [Google Scholar] [CrossRef] [PubMed]
  34. Kek, M.Y.C.A.; Huijser, H. The power of problem-based learning in developing critical thinking skills: Preparing students for tomorrow’s digital futures in today’s classrooms. High. Educ. Res. Dev. 2011, 30, 329–341. [Google Scholar] [CrossRef] [Green Version]
  35. Wood, D.F. ABC of learning and teaching in medicine: Problem-based learning. Brit. Med. J. 2003, 326, 328–330. [Google Scholar] [CrossRef] [PubMed]
  36. Trullàs, J.C.; Blay, C.; Sarri, E.; Pujol, R. Effectiveness of problem-based learning methodology in undergraduate medical education: A scoping review. BMC Med. Educ. 2022, 22, 104. [Google Scholar] [CrossRef]
  37. Duch, B.J.; Groh, S.E.; Allen, D.E. (Eds.) The Power of Problem-Based Learning; Stylus: Sterling, VA, USA, 2001. [Google Scholar]
  38. Brassler, M.; Dettmers, J. How to Enhance Interdisciplinary Competence—Interdisciplinary Problem-Based Learning versus Interdisciplinary Project-Based Learning. Interdiscip. J. Probl.-Based Learn 2017, 11, 2. [Google Scholar] [CrossRef] [Green Version]
  39. Novak, G.; Gavrin, A.; Christian, W.; Patterson, E. Just-in-Time Teaching: Blending Active Learning with Web Technology; Prentice Hall: Upper Saddle River, NJ, USA, 1999. [Google Scholar]
  40. Novak, G.M. Just-in-time teaching. New Dir. Teach. Learn. 2011, 128, 63–73. [Google Scholar] [CrossRef]
  41. Brame, C. Just-in-Time Teaching (JiTT). Vanderbilt University Center for Teaching. Available online: https://cft.vanderbilt.edu/guides-sub-pages/just-in-time-teaching-jitt/ (accessed on 31 July 2022).
  42. Dominguez, M.; DiCapua, D.; Leydon, G.; Loomis, C.; Longbrake, E.E.; Schaefer, S.M.; Becker, K.P.; Detyniecki, K.; Gottschalk, C.; Salardini, A.; et al. A Neurology Clerkship Curriculum Using Video-Based Lectures and Just-in-Time Teaching (JiTT). MedEdPORTAL 2018, 14, 10691. [Google Scholar] [CrossRef] [PubMed]
  43. Gavrin, A.; Watt, J.X.; Marrs, K.; Blake, R.E., Jr. Just-in-time teaching (JITT): Using the web to enhance classroom learning. Comput. Educ. J. 2004, 14, 51–60. [Google Scholar]
  44. Carter, P. An experiment with online instruction and active learning in an introductory computing course for engineers: JiTT meets CS1. In Proceedings of the 14th Western Canadian Conference on Computing Education, Burnaby, BC, Canada, 1–2 May 2009; pp. 103–108. [Google Scholar]
  45. Simkins, S.; Maier, M. Just-in-Time Teaching: Across the Disciplines, Across the Academy; Stylus Publishing: Sterling, VA, USA, 2010.
  46. Madiraju, C.; Tellez-Corrales, E.; Hua, H.; Stec, J.; Nauli, A.M.; Brown, D.M. Analysis of Student Perceptions of Just-In-Time Teaching Pedagogy in PharmD Microbiology and Immunology Courses. Front. Immunol. 2020, 11, 351.
  47. Brown, D.M.; Brazeal, K.R.; Couch, B.A. Implementation and student perceptions of the just-in-time teaching (JiTT) strategy in an upper-level immunology course. J. Immunol. 2017, 198 (Suppl. 1), 3.
  48. Schuller, M.C.; DaRosa, D.A.; Crandall, M.L. Using just-in-time teaching and peer instruction in a residency program’s core curriculum: Enhancing satisfaction, engagement, and retention. Acad. Med. 2015, 90, 384–391.
  49. Marrs, K.A.; Blake, R.E.; Gavrin, A.D. Use of warm-up exercises in just-in-time teaching to determine students’ prior knowledge and misconceptions in biology, chemistry, and physics. J. Coll. Sci. Teach. 2003, 33, 42–47.
  50. Cupita, L.; Andrea, L. Just in Time Teaching: A Strategy to Encourage Students’ Engagement. How 2016, 23, 89–105.
  51. Creswell, J.W. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research; Pearson Education: Upper Saddle River, NJ, USA, 2002.
  52. Tsang, S.; Royse, C.F.; Terkawi, A.S. Guidelines for developing, translating, and validating a questionnaire in perioperative and pain medicine. Saudi J. Anaesth. 2017, 11 (Suppl. 1), S80–S89.
  53. Zhang, Y.; Zhou, L.; Liu, X.; Liu, L.; Wu, Y.; Zhao, Z.; Yi, D.; Yi, D. The effectiveness of the problem-based learning teaching model for use in introductory Chinese undergraduate medical courses: A systematic review and meta-analysis. PLoS ONE 2015, 10, e0120884.
  54. Anderson, J.C. Effect of Problem-Based Learning on Knowledge Acquisition, Knowledge Retention and Critical Thinking Ability of Agricultural Students in Urban Schools. Ph.D. Thesis, University of Missouri, Columbia, MO, USA, 2007.
  55. Tiwari, A.; Chan, S.; Wong, E.; Wong, D.; Chui, C.; Wong, A.; Patil, N. The effect of problem-based learning on students’ approaches to learning in the context of clinical nursing education. Nurse Educ. Today 2006, 26, 430–438.
  56. Fagen, A.P.; Crouch, C.H.; Mazur, E. Peer instruction: Results from a range of classrooms. Phys. Teach. 2002, 40, 206.
  58. Bevan, S.J.; Chan, C.W.; Tanner, J.A. Divers assessment and active student engagement sustain deep learning: A comparative study of outcomes in two parallel introductory biochemistry courses. Biochem. Molec. Biol. Educ. 2014, 42, 474–479. [Google Scholar] [CrossRef] [PubMed]
  59. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 23, 8410–8415. [Google Scholar] [CrossRef] [Green Version]
  60. Carrió, M.; Larramona, P.; Baños, J.E.; Pérez, J. The effectiveness of the hybrid problem-based learning approach in the teaching of biology: A comparison with lecture-based learning. J. Biol. Educ. 2011, 45, 229–235. [Google Scholar] [CrossRef]
  61. Kristianto, H.; Gandajaya, L. Offline vs online problem-based learning: A case study of student engagement and learning outcomes. Interac. Technol. Smart Educ. 2022, ahead-of-print. [CrossRef]
  62. Leisi, P.; Hongbin, W. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis . Med. Educ. Online 2019, 24, 1666538. [Google Scholar] [CrossRef] [Green Version]
  63. Gürsul, F.; Keser, H. The effects of online and face to face problem-based learning environments in mathematics education on student’s academic achievement. Procedia-Soc. Behav. Sci. 2009, 1, 2817–2824. [Google Scholar] [CrossRef] [Green Version]
  64. Foo, C.C.; Cheung, B.; Chu, K.M. A comparative study regarding distance learning and the conventional face-to-face approach conducted problem-based learning tutorial during the COVID-19 pandemic. BMC Med. Educ. 2021, 21, 141. [Google Scholar] [CrossRef] [PubMed]
  65. Deslauriers, L.; McCarty, L.S.; Miller, K.; Callaghan, K.; Kestin, G. Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. USA 2019, 116, 19251–19257. [Google Scholar] [CrossRef] [Green Version]
  66. Ferrer, J.; Ringer, A.; Saville, K.; Parris, M.A.; Kashi, K. Students’ motivation and engagement in higher education: The importance of attitude to online learning. High. Educ. 2020, 83, 317–338. [Google Scholar] [CrossRef]
  67. Kris, M.Y.L.; Shuang, G.; Tongmao, L. Student enrollment, motivation and learning performance in a blended learning environment: The mediating effects of social, teaching, and cognitive presence. Comput. Educ. 2019, 136, 1–12. [Google Scholar] [CrossRef]
  68. Porter, S.R. Self-reported learning gains: A theory and test of college student survey response. Res. High. Educ. 2013, 54, 201–226. [Google Scholar] [CrossRef]
  69. Ruiz-Gallardo, J.-R.; Castaño, S.; Gómez-Alday, J.J.; Valdés, A. Assessing student workload in Problem Based Learning: Relationships among teaching method, student workload and achievement. A case study in Natural Sciences. Teach. Teach. Educ. 2011, 27, 619–627. [Google Scholar] [CrossRef]
  70. Erickson, S.; Neilson, C.; O’Halloran, R.; Bruce, C.; McLaughlin, E. I was quite surprised it worked so well’: Student and facilitator perspectives of synchronous online Problem Based Learning. Innov. Educ. Teach. Int. 2021, 58, 316–327. [Google Scholar] [CrossRef]
  71. Zakarneh, B.; Al-Ramahi, N.; Mahmoud, M. Challenges of Teaching English Language Classes of Slow and Fast Learners in the United Arab. Int. J. High. Educ. 2019, 9, 256–269. [Google Scholar] [CrossRef]
  72. Babatunde Adedoyin, O.; Soykan, E. Covid-19 pandemic and online learning: The challenges and opportunities. Interact. Learn. Environ. 2020, 1–13. [Google Scholar] [CrossRef]
  73. Seif, E.; Tableman, B.; Carlson, J.S. Climate of Learning. In Encyclopedia of the Sciences of Learning; Seel, N.M., Ed.; Springer: Boston, MA, USA, 2012. [Google Scholar] [CrossRef]
  74. Margaret, C.L.; Finkelstein, M. Designing Groups in Problem-Based Learning to Promote Problem-Solving Skill and Self-Directedness. Instr. Sci. 2000, 28, 291–307. Available online: https://www.jstor.org/stable/23371450 (accessed on 21 July 2022).
  75. McLean, M.; Van Wyk, J.M.; Peters-Futre, E.M.; Higgins-Opitz, S. The small group in problem-based learning: More than a cognitive ‘learning’ experience for first-year medical students in a diverse population. Med. Teach. 2006, 28, e94–e103. [Google Scholar] [CrossRef] [PubMed]
  76. Saqr, M.; Nouri, J.; Jormanainen, I. A Learning Analytics Study of the Effect of Group Size on Social Dynamics and Performance in Online Collaborative Learning. In Transforming Learning with Meaningful Technologies; Scheffel, M., Broisin, J., Pammer-Schindler, V., Ioannou, A., Schneider, J., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2019; Volume 11722. [Google Scholar] [CrossRef] [Green Version]
Figure 2. Correlation between exam scores and the survey scores for the impacts on learning of the subject matter and on critical thinking and problem-solving skills.
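Figure 2 reports an ordinary Pearson correlation between survey scores and exam scores. As an illustration of the computation only (the paired values below are hypothetical, not the study's data), a minimal pure-Python sketch:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired observations: self-perceived impact score (out of 5)
# and final-exam score (out of 10) for each student.
survey = [4.5, 3.8, 4.9, 3.2, 4.1, 4.7]
exam = [8.2, 6.9, 9.1, 5.8, 7.4, 8.8]
print(round(pearson_r(survey, exam), 3))
```

In practice one would typically reach for `scipy.stats.pearsonr`, which also returns a p-value for the correlation.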
Table 1. Description of the selected courses.

| Course Number and Title | Course Description | Major and Level | Credits |
|---|---|---|---|
| BIOL 110 Human Biology | This course is an introduction to human biology. It covers principles of the structure and function of cells, tissues, and human body systems such as the digestive, cardiovascular, respiratory, nervous, muscular, urinary, and endocrine systems. | Biological Science—Junior | 3 |
| BIOL 212 Genetics | This course examines the diverse aspects of genetics in both prokaryotes and eukaryotes, focusing on the key players involved in inheritance. The following topics are treated extensively: chromosomes and genes; Mendelian inheritance; modification of Mendelian inheritance; gene interaction; inheritance and environment; sex determination; sex linkage; sex-limited and sex-influenced characteristics; linkage and crossing over; chromosome mapping; mutation; cytoplasmic inheritance; and quantitative inheritance. | Biological Science—Junior | 3 |
| BIOL 452 Molecular Analytical Techniques | The course introduces students to various analytical methods, with a focus on maintaining a detailed laboratory notebook. Topics include multitasking, hands-on experience with analytical equipment, strategies for experimental design, troubleshooting experiments, and outcomes. | Environmental Science—Senior | 3 |
| BIOL 433 Monitoring and Toxicology | This course introduces students to the principles of environmental monitoring and toxicology. Topics include principles of risk assessment of contaminants with emphasis on the Gulf Region, principles in the design of monitoring systems, monitoring systems for the management of renewable natural resources, and use of monitoring data in assessing natural resource management and pollution risks at both the individual and population levels. | Environmental Science—Senior | 3 |
Table 2. Summary of the experiment.

| Course | Number of Course Sections | Implemented Instructional Strategy | Module Where the Instructional Strategy Was Implemented | Number of Students | Instructor |
|---|---|---|---|---|---|
| BIOL 110 Human Biology | One section | PBL | Digestive system | 46 | A |
| | | JiTT | Blood | 46 | |
| | | LBL | Muscular system | 46 | |
| BIOL 212 Genetics | One section | PBL | Gene editing | 37 | B |
| | | LBL | Mendelian inheritance | 37 | |
| BIOL 452 Molecular Analytical Techniques | One section | PBL | Analysis of organic compounds | 26 | C |
| | | LBL | Analysis of inorganic compounds | 26 | |
| BIOL 433 Monitoring and Toxicology | One section | PBL | Risk assessment | 25 | D |
| | | JiTT | Factors affecting toxic responses | 25 | |
| | | LBL | Monitoring of environmental pollutants | 25 | |
Table 3. Examples of PBL scenarios.

| Course | Module Learning Objectives | Problem Scenario |
|---|---|---|
| BIOL 110 (Human Biology) | Explain the process of digestion in the GI tract in humans. Discuss the absorption of nutrients in the small intestine. | A 45-year-old mother brought her son to a medical clinic for consultation regarding his digestive problems and malnourishment. |
| BIOL 212 (Genetics) | Explain gene editing and discuss its potential applications in various fields such as medicine, agronomy, and zootechny. Create a hypothesis on how to use gene editing to solve medical and agronomic issues. | You have been enrolled as a Research Assistant in a Molecular Genetics Unit whose main task is gene editing aimed at solving problems related to hereditary human diseases, crop-production-enhancing techniques, and animal production. Before you start working in the Molecular Genetics research team, you have been requested to extensively review gene-editing techniques and find applications in real-life situations such as those mentioned above. |
| BIOL 433 (Monitoring and Toxicology) | Interpret the evidence from the literature to determine the toxic effects of substances. Determine the safe limit of exposure based on available evidence. Perform a risk assessment for a substance and determine its risk level. | After completing your BSc degree, you were offered an opportunity to work at the Ministry of Public Health. You were called to a meeting by the head of the risk assessment department, who would like to share concerns about possible toxic effects observed in the population due to exposure to benzoates. |
| BIOL 452 (Molecular Analytical Techniques) | Categorize the molecular technologies and equipment used to analyze, purify, and characterize molecules, including organic compounds, nucleic acids, proteins, and other molecules of the environment. Explain how to apply modern molecular analytical techniques. Explain statistical tools used for data analysis. | Your environmental science lab has developed a method for quantifying a particular pharmaceutical product (drug quantitation and quality control) commonly found in hospital wastewater. The method involves an extraction followed by fluorescence measurement at the emission maximum for the drug. One of the samples analyzed with this method gave a result showing an unusually large amount of the drug in the wastewater sample. |
Table 4. Examples of JiTT exercises.

| Course | Module Learning Objectives | Exercises |
|---|---|---|
| BIOL 110 (Human Biology) | Recognize the composition of blood. Explain the functions of blood elements. Identify the role of A and B antigens in blood typing. | Persons presenting with anemia usually have a high ventilation rate. Why? Would you expect a person with thrombocytopenia (low platelet count) to have an increased or decreased risk of bleeding? Why? Can a person with type O blood accept blood from someone with type A blood? Why or why not? |
| BIOL 433 (Monitoring and Toxicology) | Identify the factors that might affect the toxic responses to toxicants. | A group of people was exposed to a substance known to cause hypertension, arrhythmia, and rash at doses equal to or above 6 mg/kg bw. Would you expect all of them to develop a similar degree of toxicity from that substance? Why or why not? |
Table 5. Study participants.

| Course Number and Name | Cohort/Sub-Cohort | Total Number of Enrolled Students |
|---|---|---|
| BIOL 110 Human Biology | Biological science/Junior | 46 |
| BIOL 212 Genetics | Biological science/Junior | 37 |
| **Junior Cohort** | | 83 |
| BIOL 433 Monitoring and Toxicology | Environmental science/Senior | 26 |
| BIOL 452 Molecular Analytical Techniques | Environmental science/Senior | 25 |
| **Senior Cohort** | | 51 |
| **Total** | | 134 |
Table 6. Reliability testing for the self-perceived impacts on learning and skills survey.

| Factor | Number of Items | Cronbach’s Alpha Coefficient |
|---|---|---|
| Learning the subject matter | 5 | 0.773 |
| Intrinsic interest in learning | 3 | 0.741 |
| Preparedness level | 3 | 0.864 |
| Critical thinking/problem-solving skills | 4 | 0.756 |
| Personal skills | 3 | 0.865 |
| Overall survey | 18 | 0.785 |
Table 7. Reliability testing for the implementation challenges survey.

| Factor | Number of Items | Cronbach’s Alpha Coefficient |
|---|---|---|
| Adequacy of learning platform | 3 | 0.837 |
| Teaching and learning methods | 5 | 0.812 |
| Learning environment | 2 | 0.774 |
| Interactions | 4 | 0.796 |
| Overall survey | 14 | 0.815 |
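Tables 6 and 7 report Cronbach's alpha, computed as alpha = k/(k − 1) × (1 − sum of item variances / variance of respondents' total scores) for k items. A minimal sketch of that computation in pure Python, using hypothetical Likert responses rather than the study's survey data:

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one inner list of scores per survey item, aligned by respondent.
    """
    k = len(items)
    item_vars = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

# Hypothetical 5-point Likert responses: 3 items x 5 respondents.
responses = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
print(round(cronbach_alpha(responses), 3))
```

Values around 0.7 or above are conventionally read as acceptable internal consistency, which is the criterion the alpha coefficients in Tables 6 and 7 appear to meet.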
Table 8. Test scores.

| | Quizzes PBL Module | Quizzes JiTT Module | Quizzes LBL Module | p-Value | Final Exam PBL Module | Final Exam JiTT Module | Final Exam LBL Module | p-Value |
|---|---|---|---|---|---|---|---|---|
| **BIOL 110 (Human Biology)** | | | | | | | | |
| Means ± SD of the test scores | 9.98 ± 0.05 | 9.66 ± 0.64 | 8.32 ± 2.01 | 0.084 *a; 0.078 *b | 7.89 ± 1.12 | 7.12 ± 0.42 | 6.45 ± 0.62 | 0.01 *a; 0.0098 *b |
| Number (percentage) of students passing the test | 46 (100) | 98 (45) | 37 (80) | 0.23; 0.09 | 38 (82) | 36 (78) | 25 (54) | 0.03 **c; 0.01 **d |
| **BIOL 212 (Genetics)** | | | | | | | | |
| Means ± SD of the test score | 8.57 ± 0.61 | - | 8.375 ± 0.12 | 0.11 *b | 5.2 ± 3.12 | - | 3.81 ± 1.72 | 0.02 *b |
| Number (percentage) of students passing the test | 37 (100) | - | 37 (100) | 0.087 **d | 19 (53) | - | 15 (40) | 0.98 **d |
| **BIOL 433 (Monitoring and Toxicology)** | | | | | | | | |
| Means ± SD of the test score | 9 ± 0.5 | 8.9 ± 0.6 | 8.5 ± 0.7 | 0.078; 0.08 | 8.1 ± 0.7 | 7.7 ± 0.2 | 5 ± 2.1 | 0.001 *a; 0.0009 *b |
| Number (percentage) of students passing the test | 26 (100) | 26 (100) | 26 (100) | 0.16 **c; 0.21 **d | 26 (100) | 23 (90) | 13 (50) | 0.03 **c; 0.009 **d |
| **BIOL 452 (Molecular Analytical Techniques)** | | | | | | | | |
| Means ± SD of the test score | - | - | - | | 9.3 ± 0.4 | - | 8.3 ± 0.2 | 0.05 *b |
| Number (percentage) of students passing the test | - | - | - | | 24 (95) | - | 22 (87) | 0.05 **d |

- Not done. *a p-value obtained by using Student’s t-test when comparing the means of the test scores after JiTT and LBL. *b p-value obtained by using Student’s t-test when comparing the means of the test scores after PBL and LBL. **c p-value obtained by using the chi-square test when comparing the numbers of students passing the test after JiTT and LBL. **d p-value obtained by using the chi-square test when comparing the numbers of students passing the test after PBL and LBL.
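The footnote above names the two tests used in Table 8: Student's t-test on mean scores and the chi-square test on pass counts. Both statistics can be computed directly from summary data of this kind; the sketch below is a pure-Python illustration with input numbers that are illustrative only (the two-sided p-value for the 1-degree-of-freedom chi-square uses its exact relation to the standard normal distribution):

```python
from math import sqrt
from statistics import NormalDist

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Student's t statistic for two independent samples (equal-variance),
    computed from summary statistics: means, SDs, and group sizes."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

def chi2_2x2(pass1, n1, pass2, n2):
    """Chi-square statistic (df = 1, no continuity correction) for a 2x2
    pass/fail table, plus its two-sided p-value via the normal distribution."""
    a, b = pass1, n1 - pass1
    c, d = pass2, n2 - pass2
    n = n1 + n2
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = 2 * (1 - NormalDist().cdf(sqrt(chi2)))  # exact for chi2 with df = 1
    return chi2, p

# Hypothetical PBL vs. LBL comparison: exam means +/- SD and pass counts.
print(round(pooled_t(7.9, 1.1, 46, 6.5, 0.6, 46), 2))
chi2, p = chi2_2x2(38, 46, 25, 46)
print(round(chi2, 2), round(p, 4))
```

In practice `scipy.stats.ttest_ind_from_stats` and `scipy.stats.chi2_contingency` cover the same comparisons and also return p-values with the appropriate t and chi-square distributions.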
Table 9. Self-perceived impacts on learning and skills: number (percentage) of students agreeing and strongly agreeing with the statements and average scores.

| Item Statement | Number (Percentage) of Students Strongly Agreeing and Agreeing with the Item Statement |
|---|---|
| **Learning the subject matter** | |
| Improved my understanding of the subject matter | 72 (85) |
| Helped me relate subject ideas and concepts | 76 (89.4) |
| Made me engage in the course material in a deeper way | 72 (85) |
| Helped me draw conclusions and come up with recommendations and solutions related to the subject matter | 66 (77.6) |
| Helped me interact effectively with my instructor and colleagues to discuss the subject matter in depth | 69 (82.1) |
| Average score/5 (± SD) | 4.48 ± 1.23 |
| **Intrinsic interest in learning** | |
| Increased my interest in learning the subject matter | 68 (80) |
| Increased my understanding of the importance of the subject matter in real-life applications | 78 (91.7) |
| Increased my motivation for learning | 62 (72.9) |
| Average score/5 (± SD) | 4.44 ± 0.83 |
| **Preparedness level** | |
| Online PBL and JiTT made me prepare better for the class session | 53 (62.3) |
| Online PBL and JiTT enhanced my preparedness level for the exams | 53 (62.3) |
| Online PBL and JiTT improved my preparedness level for the work/training | 61 (71.7) |
| Average score/5 (± SD) | 3.65 ± 0.83 |
| **Critical thinking/problem-solving skills** | |
| Increased my abilities to search for information or data on the problem using appropriate searching strategies | 69 (82.1) |
| Increased my abilities to organize and sort data and findings | 70 (82.3) |
| Increased my abilities to create inferences on why the problem exists and how it can be solved | 67 (78.8) |
| Increased my abilities to analyze data and develop solutions to problems | 70 (82.3) |
| Average score/5 (± SD) | 4.19 ± 1.21 |
| **Personal skills** | |
| Made me communicate more effectively with my colleagues | 63 (74.1) |
| Made me value teamwork | 63 (74.1) |
| Enhanced my independent learning skills | 60 (70.6) |
| Average score/5 (± SD) | 3.98 ± 0.95 |
Table 10. Distribution of the self-perceived impacts on learning and skills survey scores among study cohorts.

| | Learning the Subject Matter | Intrinsic Interest in Learning the Subject Matter | Preparedness Level | Critical Thinking/Problem-Solving Skills | Personal Skills |
|---|---|---|---|---|---|
| All cohorts (average score ± SD) | 4.48 ± 0.23 | 4.44 ± 0.83 | 3.65 ± 0.83 | 4.19 ± 1.21 | 3.98 ± 0.95 |
| Junior cohort (BIOL 110 & BIOL 212) (average score ± SD) | 4.37 ± 0.67 | 3.61 ± 1.1 | 2.95 ± 0.81 | 4.01 ± 0.71 | 4.13 ± 0.21 |
| Senior cohort (BIOL 433 & BIOL 452) (average score ± SD) | 4.57 ± 0.47 | 4.81 ± 0.4 | 4.45 ± 0.31 | 4.39 ± 0.31 | 3.88 ± 0.14 |
| p-Value * | 0.14 | 0.03 | 0.023 | 0.56 | 0.72 |
| BIOL 110 (average score ± SD) | 4.34 ± 0.23 | 3.72 ± 0.98 | 3.1 ± 0.78 | 3.89 ± 0.94 | 4.21 ± 0.11 |
| BIOL 212 (average score ± SD) | 4.13 ± 0.27 | 3.56 ± 1.23 | 2.89 ± 0.68 | 4.11 ± 0.56 | 4.01 ± 0.45 |
| p-Value ** | 0.23 | 0.12 | 0.220 | 0.51 | 0.65 |
| BIOL 433 (average score ± SD) | 4.67 ± 0.23 | 4.87 ± 0.23 | 4.65 ± 0.33 | 4.41 ± 0.18 | 4.05 ± 0.27 |
| BIOL 452 (average score ± SD) | 4.51 ± 0.64 | 4.74 ± 0.33 | 4.29 ± 0.23 | 4.29 ± 0.21 | 3.77 ± 0.03 |
| p-Value ** | 0.12 | 0.84 | 0.64 | 0.072 | 0.09 |

* ANOVA test. ** Student’s t-test.
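The footnote to Table 10 cites a one-way ANOVA for the cohort comparisons. A minimal sketch of the F-statistic computation on per-student scores (pure Python; the group scores below are hypothetical, not the study's data):

```python
from statistics import fmean

def anova_f(groups):
    """One-way ANOVA F statistic for a list of raw-score groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = fmean(x for g in groups for x in g)
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = sum(len(g) * (fmean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean.
    ss_within = sum(sum((x - fmean(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-student survey scores for three cohorts.
junior = [3.0, 3.4, 2.8, 3.1, 3.3]
senior = [4.4, 4.6, 4.2, 4.5, 4.3]
mixed = [3.8, 4.0, 3.6, 3.9, 3.7]
print(round(anova_f([junior, senior, mixed]), 1))
```

With only two groups, as in the junior-vs-senior comparison, the one-way ANOVA F equals the square of the pooled two-sample t statistic, so the two footnoted tests are consistent; `scipy.stats.f_oneway` is the usual library route.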
Table 11. Implementation challenges for the online use of PBL and JiTT: number (percentage) of students agreeing and strongly agreeing with implementation challenges statements.

| Item Statement | Number (Percentage) of Students Strongly Agreeing and Agreeing with the Item Statement, N (%) |
|---|---|
| **Learning platform was adequate for PBL and JiTT** | |
| Options included in the platform were sufficient to conduct PBL and JiTT conveniently | 78 (95) |
| Options included in the platform were sufficient to post my assignments and receive feedback | 78 (95) |
| The learning platform favors the implementation of teamwork when required | 78 (95) |
| **Teaching/learning method (PBL and JiTT)** | |
| Online learning is suitable for both PBL and assignment-based learning | 74 (90) |
| Online learning is better for assignment-based learning than for PBL | 16 (20) |
| It was not difficult to sustain my interest and focus during online PBL and assignment-based sessions | 71 (87) |
| It was not hard for team members to collaborate and communicate online in PBL to organize tasks and discuss topics | 76 (93) |
| It was not difficult to engage all members of the team during discussions in online PBL | 78 (50) |
| **Learning environment** | |
| The home environment is noisier and more distracting, which would hinder my participation or concentration | 50 (60) |
| Internet instability sometimes makes learning and interaction difficult in PBL | 42 (51.2) |
| **Interactions (online learning is appropriate for interactions with the instructor, team members, and classmates)** | |
| It was not hard for me to interact with my instructor and to receive his/her feedback in a timely manner in online PBL | 74 (90) |
| It was not hard for me to interact with my colleagues in online PBL | 74 (90) |
| The interactions with my instructor and colleagues to organize tasks and share ideas were the same in online PBL as they would have been in a real class setting | 60 (73.2) |
| The interactions with other team members were not hard, and I was able to communicate with other teams and the whole class to share points/discuss ideas in online PBL in the same manner as in a real class setting | 36 (44) |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
