Article

Online Learning from the Learning Cycle Perspective: Discovering Patterns in Recent Research

School of Business and Economics, Hochschule für Wirtschaft und Recht Berlin, Badensche Str. 50-52, 10875 Berlin, Germany
Information 2024, 15(11), 665; https://doi.org/10.3390/info15110665
Submission received: 19 September 2024 / Revised: 11 October 2024 / Accepted: 16 October 2024 / Published: 22 October 2024
(This article belongs to the Section Information Applications)

Abstract

We propose a method for automatically extracting new trends and best practices from the recent literature on online learning, aligned with the learning cycle perspective. Using titles and abstracts of research articles published in high-ranked educational journals, we assign topic proportions to the articles, where the topics are aligned with the components of the learning cycle: engagement, exploration, explanation, elaboration, evaluation, and evolution. The topic analysis is conducted using keyword-based Latent Dirichlet allocation, and the topic keywords are chosen to reflect the nature of the learning cycle components. Our analysis reveals the time dynamics of research topics aligned with the learning cycle components, their weights, and the interconnections between them in the current research focus. Connections between the topics and user-defined learning elements are discovered. Concretely, we examine how effective learning elements such as virtual reality, multimedia, gamification, and problem-based learning are related to the learning cycle components in the literature. In this way, any innovative learning strategy or learning element can be placed in the landscape of the learning cycle topics. The analysis can be helpful to other researchers when designing effective learning activities that address particular components of the learning cycle.


1. Background

Online learning applications with a large proportion of interactive, self-directed content for individual learning paths are becoming increasingly essential in higher education. The COVID-19 lockdown, in particular, spurred a boom in the use of such applications and in the empirical study of their effectiveness.
In their meta-analysis of more than 50 publications featuring randomized or quasi-randomized experiments on online, blended, and face-to-face learning, Means et al. [1] reported significant improvements in learning outcomes for online and blended formats. More recent meta-studies also suggest the superiority of the flipped classroom approach over traditional learning. Numerous studies have consistently shown that active online learning units within a flipped classroom model outperform traditional lecturing (Ying and Thompson [2], Shi et al. [3], Hew et al. [4]). Research on MOOCs and small online courses (e.g., Li et al. [5] and Zheng et al. [6]) further supports that fully asynchronous online learning can be effective, provided the course activities and materials are well designed to meet students’ needs.
This raises the questions of how to design an effective online learning unit and which learning elements enhance its efficiency. At the same time, research on enhancing these applications with innovative and effective learning features is rapidly evolving. Keeping up with the vast number of research articles that test and adopt new learning technologies—while continuously generating new insights—is challenging.
The purpose of this paper is to propose an approach to automatically discover patterns for designing online learning units using machine learning. Our primary goals are to structure recent relevant literature within a learning cycle framework, establish associations between the discovered topics and key elements of online learning units that contribute to learning success, and provide recommendations for designing effective online learning applications.
To conduct the analysis efficiently, we aim to use Latent Dirichlet Allocation (LDA, Blei et al. [7]) for automatic topic identification. As a commonly used technique for structuring text data automatically, topic analysis (Churchill and Singh [8]) represents each document in a collection (or corpus) as a combination of weighted keyword sets (topics), with each document containing these topics in varying proportions. In the context of e-learning literature, LDA was employed, e.g., by Gurcan et al. [9] to identify emerging trends. While suitable for tracking general trends and the time evolution of article volumes, their approach, which revealed broad topics such as “learning system” and “learning factors”, does not provide actionable recommendations for designing modern learning units in line with recent research findings.
To structure our analysis with a focus on designing learning applications, we adopted the 5E instructional framework, where the design of learning units follows the five-phase learning cycle: engage, explore, explain, elaborate, and evaluate, as described in Bybee et al. [10] and Hew et al. [4]. The first phase engages students by addressing a real-world problem or question, the second involves exploring the problem, and the third seeks to explain the phenomena using existing knowledge. The elaboration phase deepens understanding through additional material and exercises, while the final phase evaluates learning success.
Aligning the topics to the learning cycle components, as outlined above, allows for a structured representation of content that is helpful in designing effective learning applications. However, the LDA approach of Gurcan et al. [9] relies purely on data-driven topic extraction, offering no possibility for user-supplied input regarding topic content. Recently, Eshima et al. [11] proposed an LDA-based method called keyATM, which allows for the explicit embedding of known structures or side information into topic allocation. In this method, the authors embed user-supplied keywords into the LDA estimation procedure to obtain topics aligned with the user’s specific focus.
Thus, by using keyATM, we can extract topics aligned with the 5E learning cycle, combined with the evolution component, by tying the keywords to the definitions and descriptions of the stages discussed above. This approach enables us to analyze the interconnections between learning cycle stages, the dynamics of research volume devoted to each stage, and the relatedness of specific learning elements and tools to the learning cycle components.
The underlying text corpus consists of the titles and abstracts of articles published between 1 January 2019 and 31 March 2023. This time frame was chosen to build on the comprehensive review by Martin et al. [12], which covered works from 2009 to 2018 and identified 12 research themes in online learning. Our goal is to highlight the development of these themes in subsequent years and to identify patterns in the research aligned with the learning cycle components. Our approach is a neutral, automatic filtration and representation of available published literature based on specific criteria outlined in the next section. We use a representative sample of papers available through the OpenAlex API (Priem et al. [13]), which indexes 209 million works from various sources, including Crossref, PubMed, institutional repositories, and preprint repositories like arXiv, spanning 124,000 publishing venues. The results are conceptually organized and intended for both researchers and practitioners.
The rest of the paper is organized as follows: In the next section, we present an overview of the related works and position our paper in light of the recent research. In the following section, we present our research article dataset and describe the filtering and preprocessing steps. We also briefly introduce the keyATM method, present the selected keywords, and discuss the resulting top words from the extracted topics. In the subsequent section, we conduct an in-depth analysis of the obtained topics and their interconnections. We establish the associations between the underlying learning cycle components and effective learning elements such as virtual and augmented reality, multimedia, gamification, and problem-based learning. Finally, we summarize our findings.

2. Related Work

Research on online learning has flourished in recent years, driven by the need to create adaptable and effective educational environments. A wide array of studies has focused on both theoretical frameworks and practical applications aimed at enhancing learning outcomes. Our work contributes to this growing body of research by focusing on the learning cycle framework and providing an automatic method for extracting and analyzing the trends and best practices from educational literature. To situate our work, we examine three relevant strands of research: learning cycles in education, topic modeling in educational research, and connections between innovative learning strategies and learning frameworks.

2.1. Learning Cycles in Education

The learning cycle, rooted in theories such as Kolb’s experiential learning model (Kolb [14]) and Bybee’s 5E instructional model (Bybee et al. [10]), has been an influential concept in education. As mentioned above, these cycles typically consist of stages such as engagement, exploration, explanation, elaboration, and evaluation. In the following, we provide a more detailed overview of the learning cycle phases.
Engagement. Student engagement is defined in various ways. For instance, Hu et al. [15] described it as “the amount of effort dedicated to educational activities that bring out ideal performance”, while Lewis et al. [16] defined it as “the extent to which learners’ thoughts, feelings, and activities are actively involved in learning”. Wong and Liem [17] described learning engagement as “students’ psychological state of activity that enables them to feel activated, exert effort, and be absorbed during learning activities”. They distinguish between behavioral, cognitive, and emotional components of engagement based on established theories. According to the authors, learning engagement is a multilevel construct that varies depending on the learning context and time. In this sense, engagement is foundational to other stages, as it sustains the execution of learning activities. Lee et al. [18] identified several indicators of engagement: (1) behavioral—learning effort (e.g., self-completed units and time invested), participation in class activities (attendance and asking questions), and interaction (between learner and instructor); (2) cognitive—task solving (knowledge formation, and application, reflection on achievement), and learning management (self-direction, scheduling); and (3) emotional—learning satisfaction (interest, expectations, and enjoyment), sense of belonging (connection to the learning community), and learning passion (willingness to tackle challenges). Skilling et al. [19] connected engagement, motivation, and achievement, especially in secondary school mathematics, by measuring seven adaptive factors (self-efficacy, mastery orientation, valuing, persistence, planning, task management, and enjoyment) and five maladaptive factors (anxiety, failure avoidance, uncertain control, self-handicapping, and disengagement). Maroco et al. [20] developed a questionnaire to assess university student engagement, and Assunção et al. [21] confirmed that it produced reliable and valid data on academic engagement in higher education. Similarly, Tomás et al. [22] evaluated the reliability and validity of two additional self-assessment tools for quantifying student engagement.
Exploration. The exploration phase, as described by Bybee et al. [10], involves experimental activities, preliminary investigations, or using prior knowledge to generate new outcomes. Student activities may include making predictions, testing hypotheses, exploring alternatives, and generating new ideas. Depending on the subject, these activities can be implemented online through interactive applications, such as manipulatives (e.g., varying input parameters to produce different outcomes), simulations, and scenario-based explorations. These “hands-on” experiences have been shown to enhance learning success, as highlighted by Meylani et al. [23]. This phase is also closely related to experiential learning (see Morris [24]), where concrete experiences are central to the learning process.
Explanation. As outlined by Bybee et al. [10], this stage involves introducing new concepts or skills that serve as the theoretical foundation for solving the given problem or explaining the observed phenomena. In a fully online setting, where the instructor cannot easily adapt to individual learning preferences, these preferences should still be accounted for. Meylani et al. [23] reviewed studies showing that incorporating additional media formats can contribute to learning success. Learners should be provided with a variety of materials to understand new concepts, such as diagrams, videos, and text-based resources, as noted by Katsaris and Vidakis [25]. This ensures that diverse learning styles in e-learning are addressed.
Elaboration. According to Bybee et al. [10], elaboration deepens and broadens understanding, enhancing the application of new skills. In this phase, students should also have the opportunity to practice extensively and develop routine skills for solving problems related to newly acquired concepts. In asynchronous online learning, where direct instructor feedback is not available, self-testing with immediate feedback should be offered, as suggested by Meylani et al. [23]. To further engage students in practicing new concepts and gaining routine skills, gamified learning activities can be implemented. As noted by Demmese et al. [26], such gamified activities can motivate students to practice more by making the process enjoyable and encouraging them to solve additional exercises.
Evaluation. In this phase, ability assessment and the evaluation of student progress take place. In an online format, individual ability assessment is realized through quizzes with an end score. Meylani et al. [23] pointed out the importance of self-monitoring in an online framework. An overview of completed activities and attained test scores can be implemented in the form of a learning analytics dashboard, which has been shown to support learners in making informed decisions (Susnjak et al. [27]).
Moreover, given that modern e-learning systems often incorporate adaptive or personalized features, we incorporate an additional component, evolution. This stage does not pertain to the learning cycle itself, but focuses on analyzing the usability of the learning system and suggesting improvements toward developing an adaptive, personalized e-learning system that accommodates individual learning preferences. The evolution component involves improving the e-learning environment based on direct feedback, questionnaires (such as the System Usability Scale, Lewis [28]), learning analytics, and personalized features. Meylani et al. [23] emphasized the importance of giving learners some control over the online resources available to them. Personalized recommendations, based on individual learning paths, can further enhance learning outcomes. As noted by Katsaris and Vidakis [25], personalized e-learning environments that adapt to individual learning preferences are on the rise and have the potential to optimally support individual learning processes. For example, Fatahi [29] proposed adapting e-learning environments according to learners’ emotions and personalities, while Khamparia and Pandey [30] reviewed methodologies for implementing e-learning adaptations based on automatic assessments of learning styles, highlighting their overall utility for improving learner success.
Numerous studies have highlighted the efficacy of these stages in promoting deep learning, engagement, and metacognitive development in various contexts, including inquiry-based and problem-based learning (Wilson et al. [31], Hmelo-Silver et al. [32]), and blended learning environments (Garrison and Kanuka [33]). Su et al. [34] used the 5E learning cycle to design an e-learning course and demonstrated its ability to enhance learning outcomes. While previous research has established the relevance of the learning cycle for guiding pedagogy, few studies have applied automated methods to examine its reflection in contemporary research literature, leaving a gap that our approach addresses.

2.2. Topic Modeling in Educational Research

Text mining and topic modeling techniques, such as Latent Dirichlet Allocation (LDA), have been widely used to extract thematic trends in educational research. For example, Costello et al. [35] applied LDA to explore the evolution of research topics in MOOC studies, while Pei et al. [36] used topic modeling to map key research areas in multimodal learning analytics within educational technology. Gurcan et al. [9] also applied LDA to identify emerging trends in e-learning. Topic modeling has proven effective in analyzing large corpora of research articles, helping to identify shifts in focus over time (Xiong et al. [37], Pei et al. [36], Mostafa [38]). Our work builds on these efforts by applying the keyword-based LDA of Eshima et al. [11], aligning extracted topics with the components of the learning cycle. This alignment provides a unique lens through which to analyze the thematic structure of recent online learning research, offering a new perspective that integrates educational theory with data-driven insights.

2.3. Connections Between Innovative Learning Strategies and Learning Frameworks

Several innovative learning strategies, such as virtual reality (VR), multimedia, gamification, and problem-based learning (PBL), have gained prominence in educational research and practice. These approaches have been found to enhance student engagement and outcomes by providing immersive, interactive, and learner-centered experiences (Dede [39], Hung et al. [40], Hamari et al. [41]). However, few studies have systematically analyzed how these elements relate to established theoretical learning frameworks like the learning cycle. For instance, Bellotti et al. [42] examined the role of gamification in the evaluation and exploration phases, while recent work by Yu and Xu [43] discussed the impact of VR on elaboration. Our study advances this line of research by systematically mapping these strategies onto all learning cycle components, thus offering a more holistic understanding of their integration into pedagogical design.

2.4. Contribution to the Literature

Our approach contributes to the literature in several ways. First, we offer a novel method for automatically extracting trends from relevant educational literature, applying LDA with carefully curated keywords to align topics with learning cycle components. Second, we provide insights into the temporal dynamics of these topics and shed light on how current research in online learning prioritizes different phases of the learning cycle. Finally, by mapping innovative learning strategies such as VR, multimedia, and gamification onto the learning cycle, we offer researchers and practitioners a new tool for designing and assessing learning activities in alignment with established educational theories.
This study is positioned at the intersection of automated text analysis, educational theory, and the study of online learning trends, thus filling an important gap in the current literature.

3. Data and Methods

In this section, we describe the search modalities used to retrieve the research articles considered for the topic analysis and provide information on filtering and preprocessing steps. Additionally, we briefly introduce the keyATM model, present the chosen keywords for the topics aligned with the learning cycle stages, and review the top words from the resulting topics.

3.1. Searching, Filtering, and Preprocessing Research Articles on Online Learning

Searching and filtering. We searched titles and abstracts of research articles published between 1 January 2019 and 31 March 2023, contained in the OpenAlex database (https://openalex.org/ accessed on 17 April 2023) using the keywords: “online learning”, “online teaching”, “e-learning”, “online course”, “online education”, “education technology”, “computer-assisted learning”, and “distance education”, with the R package openalexR (Massimo et al. [44]).
This search returned 106,032 articles (as of 17 April 2023). We then filtered the results to include only articles from publishers ranked in the top 20 on Google Scholar in education, educational technology, and educational psychology (https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=eng_educationaltechnology (accessed on 17 April 2023)). This filtering process reduced the number of articles to 5474. Finally, we excluded articles with unavailable abstracts, resulting in a final dataset of 5089 articles for further analysis.
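To make the retrieval step concrete, the following sketch shows how such a query could be issued with the openalexR package (Massimo et al. [44]). It is not the authors' exact script: the `oa_fetch()` arguments and the column names (`id`, `title`, `ab`) reflect the package interface as we understand it, and the publisher filtering against the Google Scholar top-20 venues is only indicated by a comment.

```r
# Sketch (assumed openalexR interface): retrieve candidate works from OpenAlex
# for the search phrases and the time window described above.
library(openalexR)
library(dplyr)

queries <- c("online learning", "online teaching", "e-learning", "online course",
             "online education", "education technology",
             "computer-assisted learning", "distance education")

works <- lapply(queries, function(q) {
  oa_fetch(
    entity                = "works",
    search                = q,            # OpenAlex search over titles/abstracts
    from_publication_date = "2019-01-01",
    to_publication_date   = "2023-03-31",
    verbose               = FALSE
  )
}) |>
  bind_rows() |>
  distinct(id, .keep_all = TRUE)           # drop duplicates across queries

# Keep only records with an abstract; the publisher filter (top 20 Google
# Scholar venues in education) would be applied here on the venue metadata.
works <- filter(works, !is.na(ab))
```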
Figure 1 plots the number of articles in the selection by publication date (2023 is omitted; the dashed line marks the mean value), showing an upward trend in publications and a peak in 2021.
Preprocessing. To prepare the text corpus for topic analysis, we first combined the titles and abstracts of the selected articles. We then tokenized the content into individual words and pairs of consecutive words, applying standard preprocessing routines such as stemming and the removal of punctuation, numbers, symbols, and stopwords. Additionally, we removed terms that occurred in less than 0.1% or more than 50% of the articles to exclude terms that were either too rare or too common to be informative. After preprocessing, we obtained a corpus of 5089 documents with a vocabulary consisting of 15,022 features (words and word pairs).
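A minimal preprocessing sketch is given below. The paper does not name the text-processing library, so the quanteda package is assumed here purely for illustration; the trimming thresholds mirror the 0.1% and 50% document-frequency limits described above, and the underscore concatenation of bigrams matches the keyword style used later (e.g., "problem_base").

```r
# Sketch of the preprocessing pipeline (quanteda assumed): combine titles and
# abstracts, tokenize into words and word pairs, stem, and trim by document
# frequency.
library(quanteda)

texts <- paste(works$title, works$ab)

toks <- tokens(texts, remove_punct = TRUE, remove_numbers = TRUE,
               remove_symbols = TRUE) |>
  tokens_remove(stopwords("en")) |>
  tokens_wordstem() |>
  tokens_ngrams(n = 1:2)                   # unigrams and bigrams, e.g., "problem_base"

dfm_all <- dfm(toks) |>
  dfm_trim(min_docfreq = 0.001,            # drop terms in fewer than 0.1% of articles
           max_docfreq = 0.5,              # drop terms in more than 50% of articles
           docfreq_type = "prop")
```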
This processed text corpus was then used to fit a keyATM model with six keyword-based topics, reflecting the engagement, exploration, explanation, elaboration, evaluation, and evolution components, as well as several keywordless topics that capture other thematic content from the articles.

3.2. Fitting the Keyword-Assisted Topic Model: keyATM

As mentioned in the previous section, our goal is to automatically extract topics structured around the learning cycle components. We assume that the titles and abstracts in our text corpus consistently reflect the topics covered in the main text of each article. Additionally, we assume that some of these topics align with the learning cycle components (keyword topics), while others may refer to content not directly related to the learning cycle (keywordless topics).
Topic Keywords. In our analysis, we align the keywords with the definitions and descriptions of the 5E stages, plus the evolution component discussed earlier, ensuring that the extracted topics reflect these learning cycle phases. The keywords, presented in Table 1, were selected based on the definitions of the 5E stages from Eisenkraft [45], Duran and Duran [46], and Bybee et al. [10]. For the final component, evolution, we adopt descriptions of adaptive e-learning environments from Özcan Özyurt and Özyurt [47], El-Sabagh [48], and Kolekar et al. [49].
KeyATM model. To extract both types of topics—keyword topics aligned with the learning cycle and keywordless topics—we propose using the keyATM methodology of Eshima et al. [11]. This method is an LDA-based topic extraction algorithm that allows a user-specified focus in the form of topic keywords.
In keyATM, a corpus of $D$ documents with vocabulary $V$ is modeled to contain a total of $K$ topics, of which $\tilde{K}$ are keyword-based and $K - \tilde{K}$ are keyword-free. Each keyword-based topic $k$ is associated with a set of $L_k$ keywords $V_k = \{ v_{k1}, v_{k2}, \ldots, v_{k L_k} \}$. In the topic model, each document $d$ is a collection of $N_d$ words, represented by the set $W_d = \{ w_{d1}, w_{d2}, \ldots, w_{d N_d} \}$, and modeled in a Bayesian framework. Each $w_{di}$ is associated with a latent topic variable $z_{di} \in \{ 1, 2, \ldots, K \}$, fulfilling:
$$ z_{di} \overset{\text{indep.}}{\sim} \mathrm{Categorical}(\theta_d), $$
where $\theta_d$ is a $K$-dimensional vector of topic proportions for document $d$.
For a topic without keywords ($k \in \{ \tilde{K}+1, \tilde{K}+2, \ldots, K \}$), the model assumes:
$$ (w_{di} \mid z_{di} = k) \overset{\text{indep.}}{\sim} \mathrm{Categorical}(\phi_k), $$
where $\phi_k$ is a $V$-dimensional vector of word probabilities for topic $k$. For keyword topics ($k \in \{ 1, 2, \ldots, \tilde{K} \}$), a Bernoulli random variable $s_{di}$ with success probability $\pi_k$ is introduced: $(s_{di} \mid z_{di} = k) \overset{\text{indep.}}{\sim} \mathrm{Bernoulli}(\pi_k)$. If this variable equals 0, then $(w_{di} \mid z_{di} = k)$ follows the same topic word distribution as for keyword-free topics. If it equals 1, then:
$$ (w_{di} \mid z_{di} = k) \overset{\text{indep.}}{\sim} \mathrm{Categorical}(\tilde{\phi}_k), $$
where $\tilde{\phi}_k$ is an $L_k$-dimensional vector of word probabilities over the keyword set $V_k$ of topic $k$.
The prior distributions are a Beta distribution for $\pi_k$ and Dirichlet distributions for $\phi_k$ and $\tilde{\phi}_k$ ($k = 1, 2, \ldots, K$). Furthermore, it is assumed that:
$$ \theta_d \overset{\text{i.i.d.}}{\sim} \mathrm{Dirichlet}(\alpha), $$
where $\alpha$ is a $K$-dimensional vector with components $\alpha_k$, each following a Gamma distribution with distinct prior parameters for keyword-based and keyword-free topics.
Using the prior distributions, Eshima et al. [11] derived the conditional posterior distributions for z d i = k , s d i , and α k and implemented a collapsed Gibbs sampling algorithm to sample from the posterior distributions. The estimation routine was provided in the R-package keyATM (see Eshima et al. [11]).
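The estimation step can be sketched as follows with the keyATM package. The keyword lists below are illustrative placeholders standing in for Table 1, and the seed and the label names are arbitrary choices rather than values reported in the paper.

```r
# Sketch of the keyATM estimation (keyATM package, Eshima et al. [11]).
library(keyATM)

keyATM_docs <- keyATM_read(texts = dfm_all)   # convert the quanteda dfm

keywords <- list(                             # placeholders; see Table 1 for the actual lists
  engagement  = c("engag", "motiv", "particip"),
  exploration = c("explor", "experi", "simul"),
  explanation = c("explain", "concept", "understand"),
  elaboration = c("practic", "exercis", "appli"),
  evaluation  = c("assess", "quiz", "score"),
  evolution   = c("adapt", "person", "usabl")
)

out <- keyATM(
  docs              = keyATM_docs,
  no_keyword_topics = 3,                      # chosen by the coherence criterion below
  keywords          = keywords,
  model             = "base",
  options           = list(seed = 1234)       # arbitrary seed
)

top_words(out, n = 20)                        # Table 3-style overview of top words
```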
Choosing the number of topics. Using the keywords specified in Table 1, we fit the keyATM model to the data utilizing the keyATM package. However, we need to provide the total number of topics and the number of keyword-free topics as input to the model estimation algorithm. The number of keyword topics is fixed at six, corresponding to the learning cycle. To choose the number of remaining keywordless topics, we applied a data-driven procedure based on the average topic coherence, a metric that quantifies the quality of a topic through the co-occurrence of its top words (Thompson and Mimno [50], Selivanov et al. [51], Gurdiel et al. [52]). Specifically, we used the mean log-ratio topic coherence, defined for topic $k$ with $m$ top words $w_{k1}, \ldots, w_{km}$ as:
$$ \mathrm{coh}_k = \sum_{i=1}^{m} \sum_{j < i} \log\!\left( \frac{\#(w_{ki}, w_{kj})}{\#(w_{ki})} + \varepsilon \right), \qquad (1) $$
where $\#(\cdot)$ counts the contexts containing the input (a word or a word pair), and $\varepsilon$ is a smoothing parameter set to $10^{-12}$. This metric quantifies how often the top $m$ words of topic $k$ co-occur in the reference text corpus. It is justified by the observation that words with similar meanings tend to co-occur in the same contexts, with coherence positively associated with interpretability.
We varied the total number of topics $K$ in the range $[6, 15]$, estimated the respective keyATM model, and calculated the topic coherence for each keyword topic as defined in (1). Subsequently, we computed the average coherence across the six keyword topics:
$$ \overline{\mathrm{coh}} = \frac{1}{\tilde{K}} \sum_{k=1}^{\tilde{K}} \mathrm{coh}_k. \qquad (2) $$
The resulting average coherence for each K is shown in Table 2. The optimal topic number maximizing the average coherence criterion in (2) is K = 9 , meaning that the optimal number of keyword-free topics should be set to three.
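The selection procedure can be sketched as below. The coherence helper is an illustrative re-implementation of Equations (1) and (2) over document-level co-occurrences, not the internals of any particular package; it assumes the objects `keyATM_docs`, `keywords`, and `dfm_all` from the earlier sketches, and the choice of the top ten words per topic is ours.

```r
# Illustrative re-implementation of the coherence criterion in (1)-(2).
coherence_k <- function(top_words, dfm_bin, eps = 1e-12) {
  top_words <- gsub("\\s*\\[.*\\]$", "", top_words)   # strip keyATM keyword markers
  s <- 0
  for (i in seq_along(top_words)) {
    for (j in seq_len(i - 1)) {
      n_ij <- sum(dfm_bin[, top_words[i]] & dfm_bin[, top_words[j]])
      n_i  <- sum(dfm_bin[, top_words[i]])
      s    <- s + log(n_ij / n_i + eps)
    }
  }
  s
}

dfm_bin <- as.matrix(dfm_all) > 0                     # word presence per document

avg_coh <- sapply(6:15, function(K) {
  fit <- keyATM(keyATM_docs, no_keyword_topics = K - 6, keywords = keywords,
                model = "base", options = list(seed = 1234))
  tw  <- top_words(fit, n = 10)                       # top words per topic
  mean(sapply(1:6, function(k) coherence_k(tw[[k]], dfm_bin)))   # Equation (2)
})
K_opt <- (6:15)[which.max(avg_coh)]                   # K = 9 in our analysis
```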
KeyATM model with the optimal number of topics. Based on the results of selecting the number of topics, we fit the keyATM model with the optimal number of topics, K = 9 ( K ˜ = 6 ).
Table 3 shows the top 20 words for each of the extracted topics, with the provided keywords marked by a checkmark. After reviewing the top words of the keywordless topics, we labelled them as “learning platforms”, “technical acceptance”, and “challenges in distance education” (hereafter referred to as “platform”, “acceptance”, and “distance”, respectively). As seen in Table 3, most of the provided keywords appeared among the top 20 words, confirming their effectiveness in assisting topic extraction. Keywords that also appeared in the top 20 words of a different topic were labeled with the number of the corresponding keyword topic. For example, “engag”, a keyword for topic 1, “engagement”, also appeared in the top 20 words for topic 2, “exploration”. This overlap was an exception, further suggesting that the keywords effectively supported the topic identification process.
Overall, the top words for the learning cycle topics, as revealed by the model, aligned well with the intended interpretation of the learning cycle components, indicating an acceptable model fit. In the next section, we provide an in-depth analysis of the extracted topics, focusing on their interrelations and associations with various effective learning elements.

4. Results and Discussion

In this section, we present and analyze the results of our automatic keyATM topic extraction and discuss their implications. We consider the overall topic proportions, the interrelations between the topics, and their connection to various effective learning elements.

4.1. Topic Proportions

In this part, we examined the topic proportions across the selected articles, as well as their evolution over time.
Figure 2 shows the expected topic proportions per paper. The largest proportions among the keyword-assisted topics were found in elaboration and exploration, followed by evolution and engagement. The explanation topic had the smallest expected proportion among the keyword-assisted topics. Among the keywordless topics, the first topic, referring to learning platforms, dominated. Overall, the topic proportions were balanced, indicating that most articles covered multiple topics with a high probability.
Figure 3 illustrates the dynamics of topic proportions over the publication years. Notably, we observed a slight increase in the literature focused on engagement in 2018 and 2022, as well as a significant rise in works addressing challenges in distance education during the pandemic restrictions of 2020–2021. Furthermore, in 2023, there was a notable increase in the literature devoted to the elaboration and evolution stages of the learning cycle, highlighting their growing importance in online learning applications.
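Assuming the fitted object `out` from the earlier sketch exposes the document-topic matrix as `out$theta` (as in the keyATM package) and that its rows are aligned with the OpenAlex records in `works`, the proportions in Figure 2 and the yearly dynamics in Figure 3 could be approximated as follows.

```r
# Sketch: expected topic proportions (Figure 2) and their yearly dynamics (Figure 3).
library(dplyr)

topic_prop <- colMeans(out$theta)                     # expected proportion per topic
sort(round(topic_prop, 3), decreasing = TRUE)

prop_by_year <- data.frame(year = format(as.Date(works$publication_date), "%Y"),
                           out$theta, check.names = FALSE) |>
  group_by(year) |>
  summarise(across(everything(), mean))               # mean proportion per year and topic
prop_by_year
```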
In summary, our analysis of topic proportions reveals that elaboration and exploration are the most prominent topics, followed by evolution and engagement, with explanation having the smallest proportion. The temporal dynamics showed a recent rise in research on the engagement, elaboration, and evolution stages, alongside an increasing body of literature focused on the challenges of distance education.

4.2. Exploring the Interrelations Between Topics

In this section, we examine the interrelations between topics by investigating whether articles with a high proportion of one topic also tend to have a high proportion of another. This approach helps determine if there is a close relationship between topics. Additionally, we explore another method of analyzing these interrelations by examining citations among the papers. If papers with a high proportion of one topic frequently cite papers with a high proportion of another topic, this supports the existence of an interrelation.
To visualize these interrelations, we compute and plot a topic network with topics as vertices and two types of links as edges. The first type of link represents the expected topic proportion in documents with a high proportion of a given topic. The thickness of the link corresponds to this expected proportion: the thicker the line, the higher the proportion. For this network, we extracted the top 100 documents for each topic. The resulting network is shown in the left panel of Figure 4.
The second type of link is based on reciprocal citations among the top 100 documents for each topic. In this network, the thickness of the edges is adjusted according to citation intensity. The resulting network is displayed in the right panel of Figure 4.
In the left panel of Figure 4, we observe that articles with a high proportion of the engagement topic often also cover a significant proportion of exploration issues, and this topic is also associated with challenges in distance education. Similarly, articles with a high proportion of exploration tend to address themes related to learning platforms and technology acceptance issues. The explanation topic is linked to exploration and evolution, while articles focusing on evolution are heavily related to challenges in distance education. The evaluation topic exhibits strong interrelations, particularly with engagement, exploration, elaboration, the learning platforms topic, and challenges in distance education. Additionally, learning technology acceptance is closely tied to the evolution of learning systems and platform design.
In the citations network shown in the right panel of Figure 4, we observed that articles focused on exploration frequently cite works related to technology acceptance and other exploration-related papers. Documents with a predominant engagement focus tend to reference articles on evolution and other engagement-related works. Furthermore, articles discussing the adaptation of learning environments often cite works on distance education and research directions. Notably, articles on research directions are closely connected to evaluation and technology acceptance topics, with substantial references to works addressing challenges in distance education.
Additionally, we plot the network of the top 20 words for each topic based on the co-occurrences of the respective words in the documents. The resulting network is presented in Figure 5. The network graph shows that top words from the same topic tend to be interconnected but also relate to top words from other topics. Specifically, the top words of the evaluation topic (dark blue) are interconnected with those from the elaboration (light blue) and explanation (turquoise) topics. A noticeable interrelation is also seen between the top words of the engagement topic (red) and those of the elaboration stage (light green).
Overall, our examination of the interrelations between topics reveals that articles focused on engagement often overlap with exploration and address challenges in distance education, while evaluation shows strong ties with topics like engagement, elaboration, and learning platforms. The citation network revealed that articles on exploration frequently cite works on technology acceptance, while papers on engagement and evolution are interconnected through citations.

4.3. Exploring Topic Relations to Other Areas of Interest

Using the topics aligned with the learning cycle components and extracted by the keyATM model, researchers can explore other areas of interest by providing relevant keywords. These keywords can summarize the area of interest and allow for exploring the strength of its relation to the model’s topics. The strength of this relation could be proportional to the number of supplied keywords present in papers with a large proportion of each topic.
To illustrate this, we used “mathematics education” as an overarching area of interest, with the keyword “math” to explore its connection to the extracted topics. Figure 6 shows the results of this association analysis. In the left panel of Figure 6, a network graph illustrates how the keyword connects with the extracted topics. The analysis indicates a strong association with the engagement and explanation stages, while connections to the evaluation and evolution components are weaker. Additionally, the right panel of Figure 6 displays the titles of highly associated papers and their primary topics. This suggests that the primary challenges in math education are related to strategies for engagement and explanation.
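The association analysis can be sketched as a simple keyword-frequency count over the top documents of each topic. The helper below is an illustration of the idea, reusing the objects from the earlier sketches; it is not the exact computation behind Figure 6.

```r
# Sketch of the keyword-based association analysis: count how often a
# user-supplied stem (here "math") occurs in the top 100 papers of each topic
# and normalise across topics.
assoc_strength <- function(stem, theta, dfm_all, n_top = 100) {
  feats <- grep(stem, featnames(dfm_all), value = TRUE)   # e.g., "math", "math_educ"
  counts <- sapply(seq_len(ncol(theta)), function(k) {
    top_docs <- order(theta[, k], decreasing = TRUE)[1:n_top]
    sum(as.matrix(dfm_all[top_docs, feats, drop = FALSE]))
  })
  setNames(counts / sum(counts), colnames(theta))
}

assoc_strength("math", out$theta, dfm_all)
```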
Overall, the proposed model, which aligns topics with the components of the learning cycle, can be used to assess associations with other areas of interest based on keyword frequencies. This approach also helps identify research papers that may be relevant and valuable for the researcher.

4.4. Association Between Innovative Learning Elements and Topics

To provide concrete recommendations for developing successful online applications, we examine specific learning elements and their connections to the extracted topics. This approach enables instructors to select effective learning elements for each phase of the learning cycle. Our learning elements were aligned with the findings of Li et al. [53], which identified effective distance education elements not covered by the 5E learning principles. These elements include:
  • VR: Virtual and Augmented reality (keywords: “realiti_augment”, “realiti_virtual”, “augment”, “realiti”)
  • MM: Multimedia, including audio, video, and animation content (keywords: “audio”, “video”, “anim”, “content”)
  • GM: Gamification (keywords: “gamif”, “gamifi”, “game”, “play”, “game_studi”, “learn_gamif”, “gameplay”, “educ_game”, “game_engag”)
  • PB: Practice-oriented, problem-based learning (keywords: “practic_learn”, “problem_base”, “problem_activ”, “problem_experi”, “practic_motiv”)
We analyzed the frequency of keywords associated with each learning element by reviewing the top 100 papers categorized according to the 5E principles and examining the top five papers from each of the 5E topics.
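The same counting logic applies to the learning-element keyword lists above. The sketch below illustrates how topic weights comparable to Table 4 could be obtained, again assuming the objects from the earlier sketches; it is a simplified stand-in for the authors' procedure.

```r
# Sketch of the learning-element analysis: for each element's keyword list,
# count occurrences in the top 100 papers of each learning cycle topic and
# normalise across the six topics (compare Table 4).
elements <- list(
  VR = c("realiti_augment", "realiti_virtual", "augment", "realiti"),
  MM = c("audio", "video", "anim", "content"),
  GM = c("gamif", "gamifi", "game", "play", "game_studi", "learn_gamif",
         "gameplay", "educ_game", "game_engag"),
  PB = c("practic_learn", "problem_base", "problem_activ", "problem_experi",
         "practic_motiv")
)

element_weights <- sapply(elements, function(keys) {
  keys <- intersect(keys, featnames(dfm_all))         # keep keywords present in the corpus
  counts <- sapply(1:6, function(k) {                 # the six learning cycle topics
    top_docs <- order(out$theta[, k], decreasing = TRUE)[1:100]
    sum(as.matrix(dfm_all[top_docs, keys, drop = FALSE]))
  })
  round(counts / sum(counts), 2)
})
rownames(element_weights) <- colnames(out$theta)[1:6]
element_weights
```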
The first type of the considered effective learning elements—VR, or virtual and augmented reality—shows a strong association with the elaboration component (with a proportion of 0.77), as detailed in Table 4. Other stages of the learning cycle do not exhibit a significant connection with VR. In this regard, Nurbekova and Baigusheva [54] explored the effects of VR on students’ self-perception, finding that VR made study materials more engaging and comprehensible, enhanced the visibility of learning content, and provided better insights into theoretical concepts. Similarly, Zhao [55] reported that augmented reality enriches information delivery and improves student performance. Wolski and Jagodziński [56] assessed the impact of a virtual chemical laboratory and confirmed improvements in problem-solving. These findings were supported by meta-analyses conducted by Garzón and Acevedo [57] and Chang et al. [58].
The second group of effective learning elements—MM, or multimedia, audio, video, and animation—primarily influences the explanation and evaluation stages (with proportions of 0.21 and 0.21), but also affects exploration and elaboration. The analysis also highlights the importance of adapting multimedia technology to learners’ needs (with a weight of 0.11). For instance, Wang et al. [59] positively assessed the impact of multimedia-based educational enhancements on student abilities. Johnson and Cooke [60] evaluated different feedback modes related to the evaluation stage, while Van Laer and Elen [61] identified factors that enhance self-regulation in blended learning environments.
Gamification strategies are crucial for the engagement stage (with a proportion of 0.32), but they also contribute to the elaboration and evaluation stages (with proportions of 0.2 and 0.19). For example, Palaniappan and Noor [62] described a “gamification experience” involving competition, point earning, leaderboards, and badges, which significantly improved the learners’ performance by affecting motivation and perception. Kyewski and Krämer [63] and Albuquerque et al. [64] evaluated gamification elements such as badges and leaderboards, focusing on their impact on motivation and emotional factors. Meishar-Tal and Kesler [65] and Chen et al. [66] examined gamification for elaboration purposes, noting positive effects on motivation as well.
The final group of effective learning elements—PB, or problem-based methods—shows effects across engagement, explanation, and elaboration, with a primary focus on engagement (with proportions of 0.4, 0.2, and 0.2). For instance, Park and Yun [67] and Lazarides et al. [68] analyzed the relationship between motivational and cognitive factors, noting significant impacts. Rajabalee and Santally [69] studied the correlation between satisfaction and engagement in a project-based course. Yeoman and Wilson [70] emphasized the importance of situative design for learning outcomes in the explanation and exploration stages.
In summary, the effective learning elements aligned well with the learning cycle stages, as defined by the topic model. This alignment assists instructors in planning and designing innovative courses, integrating the learning cycle philosophy with these effective elements.

5. Conclusions

In this paper, we propose a novel method for analyzing recent online learning literature by utilizing keyword-based Latent Dirichlet Allocation (keyATM) to conduct a topic analysis that automatically identifies emerging trends through the lens of the learning cycle. By leveraging titles and abstracts from high-ranked educational journals, we assigned topic proportions to each article, where the topics corresponded to the stages of the learning cycle: engagement, exploration, explanation, elaboration, evaluation, and evolution. Our findings revealed temporal trends in the relative importance of each learning cycle component and the interconnections between the extracted topics and other areas of interest within current research.
Furthermore, we explored the relationship between these topics and key learning elements, such as virtual reality, multimedia, gamification, and problem-based learning. The analysis of effective learning elements highlights their alignment with specific stages of the learning cycle, offering valuable insights for course design and instruction. Virtual and augmented reality proved particularly effective in the elaboration phase, enhancing student engagement with theoretical concepts and improving material visibility. Multimedia tools extend their influence to both elaboration and evaluation stages, while also contributing to exploration. Gamification demonstrates broad utility across multiple stages, fostering motivation and active learning. Lastly, problem-based methods emphasize explanation and elaboration, underscoring the importance of situative design and cognitive engagement. These findings underscore the potential of integrating these elements into course planning to create innovative, learner-centered environments.
Overall, our approach allows innovative learning strategies and elements to be mapped within the framework of the learning cycle. The insights gained from this analysis can support researchers in designing targeted and effective learning activities that align with specific stages of the learning cycle.
The findings from this study offer significant practical value for educators and instructional designers by providing a structured approach to integrating emerging trends and effective learning elements within the learning cycle framework. By identifying the stages of engagement, exploration, explanation, elaboration, evaluation, and evolution in online learning literature, practitioners can better align their course design strategies with these key phases of the learning process. The insights into how tools like virtual reality, multimedia, gamification, and problem-based learning align with specific stages offer actionable guidance for enhancing instructional techniques. For instance, virtual reality can be leveraged to deepen conceptual understanding during the elaboration phase, while multimedia can play a critical role in both elaboration and evaluation. By using these findings, educators can create more dynamic and learner-centered environments, ensuring that innovative teaching methods are effectively incorporated into each stage of the learning cycle.
While our proposed method provides valuable insights into the alignment of online learning trends with the learning cycle framework, several limitations warrant consideration. First, the analysis relies on titles and abstracts, which may not fully capture the depth and nuance of the research articles, potentially overlooking important details. Expanding the analysis to include full texts could provide a more comprehensive understanding of topic proportions and trends. Additionally, the keyATM approach, while effective in aligning topics with learning cycle components, is sensitive to the selection of keywords, and further refinement of these keywords could enhance topic identification. Another limitation is the focus on high-ranked educational journals, which may exclude valuable contributions from other sources, such as conference proceedings or practitioner-oriented publications.
For future research, expanding the dataset to include a broader range of literature, refining the keywords and applying other topic modeling techniques could enhance the robustness of the findings. Additionally, studies exploring the evolving role of emerging technologies such as AI-driven personalized learning or adaptive learning platforms in the learning cycle would offer further insights. Finally, future work could investigate the practical application of these findings by analyzing the effectiveness of course designs that align with the identified trends and learning cycle components, providing empirical validation for the theoretical framework.

Funding

This research was funded by the Institut für Angewandte Forschung IFAF Berlin, project IFAF Verbund MultiLA.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The code for retrieving the data and conducting the analysis is available at https://github.com/IFAFMultiLA/knowledge-hub-learning-analytics (accessed on 18 January 2024).

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Means, B.; Toyama, Y.; Murphy, R.; Bakia, M.; Jones, K.; Planning, E. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. 2010, Volume 115. Available online: https://www2.ed.gov/rschstat/eval/tech/evidence-basedpractices/finalreport.pdf (accessed on 18 January 2024).
  2. Ying, X.; Thompson, P. Flipped University Class: A Study of Motivation and Learning. J. Inf. Technol. Educ. Res. 2020, 19, 41–63. [Google Scholar] [CrossRef]
  3. Shi, Y.; Ma, Y.; MacLeod, J.; Yang, H.H. College students’ cognitive learning outcomes in flipped classroom instruction: A meta-analysis of the empirical literature. J. Comput. Educ. 2019, 7, 79–103. [Google Scholar] [CrossRef]
  4. Hew, K.; Jia, C.; Gonda, D.; Bai, S. Transitioning to the “new normal” of learning in unpredictable times: Pedagogical practices and learning performance in fully online flipped classrooms. Int. J. Educ. Technol. High. Educ. 2020, 17, 57. [Google Scholar] [CrossRef]
  5. Li, H.; Gu, H.; Chen, W.; Zhu, Q. Improving Massive Open Online Course Quality in Higher Education by Addressing Student Needs Using Quality Function Deployment. Sustainability 2023, 15, 15678. [Google Scholar] [CrossRef]
  6. Zheng, M.; Woo, D.; Benton, K. The Design and Impact of Interactive Online Modules for Dental Faculty Calibration. Educ. Sci. 2024, 14, 818. [Google Scholar] [CrossRef]
  7. Blei, D.M.; Ng, A.Y.; Jordan, M.I. Latent dirichlet allocation. J. Mach. Learn. Res. 2003, 3, 993–1022. [Google Scholar]
  8. Churchill, R.; Singh, L. The Evolution of Topic Modeling. ACM Comput. Surv. 2022, 54, 215. [Google Scholar] [CrossRef]
  9. Gurcan, F.; Ozyurt, O.; Cagitay, N.E. Investigation of Emerging Trends in the E-Learning Field Using Latent Dirichlet Allocation. Int. Rev. Res. Open Distrib. Learn. 2021, 22, 1–18. [Google Scholar] [CrossRef]
  10. Bybee, R.; Taylor, J.; Gardner, A.; Scotter, P.; Carlson, J.; Westbrook, A.; Landes, N. The BSCS 5E Instructional Model: Origins, Effectiveness, and Applications. BSCS Report. 2006. Available online: https://bscs.org/reports/the-bscs-5e-instructional-model-origins-and-effectiveness/ (accessed on 18 January 2024).
  11. Eshima, S.; Imai, K.; Sasaki, T. Keyword-Assisted Topic Models. Am. J. Political Sci. 2023, 68, 730–750. [Google Scholar] [CrossRef]
  12. Martin, F.; Sun, T.; Westine, C.D. A systematic review of research on online teaching and learning from 2009 to 2018. Comput. Educ. 2020, 159, 104009. [Google Scholar] [CrossRef]
  13. Priem, J.; Piwowar, H.; Orr, R. OpenAlex: A fully-open index of scholarly works, authors, venues, institutions, and concepts. arXiv 2022, arXiv:2205.01833. [Google Scholar]
  14. Kolb, D. Experiential Learning: Experience as the Source of Learning and Development; Financial Times Press: New York, NY, USA, 1984; Volume 1. [Google Scholar]
  15. Hu, S.; Kuh, G.; Li, S. The Effects of Engagement in Inquiry-Oriented Activities on Student Learning and Personal Development. Innov. High. Educ. 2008, 33, 71–81. [Google Scholar] [CrossRef]
  16. Lewis, A.; Huebner, E.; Malone, P.; Valois, R. Life Satisfaction and Student Engagement in Adolescents. J. Youth Adolesc. 2010, 40, 249–262. [Google Scholar] [CrossRef]
  17. Wong, Z.; Liem, G.A. Student Engagement: Current State of the Construct, Conceptual Refinement, and Future Research Directions. Educ. Psychol. Rev. 2022, 34, 1–32. [Google Scholar] [CrossRef]
  18. Lee, J.; Song, H.D.; Hong, A.J. Exploring Factors, and Indicators for Measuring Students’ Sustainable Engagement in e-Learning. Sustainability 2019, 11, 985. [Google Scholar] [CrossRef]
  19. Skilling, K.; Bobis, J.; Martin, A. The “ins and outs” of student engagement in mathematics: Shifts in engagement factors among high and low achievers. Math. Educ. Res. J. 2020, 33, 469–493. [Google Scholar] [CrossRef]
  20. Maroco, J.; Maroco, A.; Campos, J.; Fredricks, J. University student’s engagement: Development of the University Student Engagement Inventory (USEI). Psicol. Reflexão Crítica 2016, 29, 21. [Google Scholar] [CrossRef]
  21. Assunção, H.; Lin, S.W.; Sit, P.S.; Cheung, K.c.; Harju-Luukkainen, H.; Smith, T.; Maloa, B.; Campos, J.; Stepanovic Ilic, I.; Esposito, G.; et al. University Student Engagement Inventory (USEI): Transcultural Validity Evidence Across Four Continents. Front. Psychol. 2020, 10, 2796. [Google Scholar] [CrossRef]
  22. Tomás, J.; Gutiérrez, M.; Alberola, S.; Georgieva, S. Psychometric properties of two major approaches to measure school engagement in university students. Curr. Psychol. 2022, 41, 2654–2667. [Google Scholar] [CrossRef]
  23. Meylani, R.; Bitter, G.; Legacy, J. Desirable Characteristics of an Ideal Online Learning Environment. J. Educ. Soc. Res. 2015, 5, 203–216. [Google Scholar] [CrossRef]
  24. Morris, T.H. Experiential learning—A systematic review and revision of Kolb’s model. Interact. Learn. Environ. 2020, 28, 1064–1077. [Google Scholar] [CrossRef]
  25. Katsaris, I.; Vidakis, N. Adaptive e-learning systems through learning styles: A review of the literature. Adv. Mob. Learn. Educ. Res. 2021, 1, 124–145. [Google Scholar] [CrossRef]
  26. Demmese, F.; Yuan, X.; Dicheva, D. Evaluating the Effectiveness of Gamification on Students’ Performance in a Cybersecurity Course. J. Colloq. Inf. Syst. Secur. Educ. 2020, 8, 1821189. [Google Scholar]
  27. Susnjak, T.; Ramaswami, G.; Mathrani, A. Learning analytics dashboard: A tool for providing actionable insights to learners. Int. J. Educ. Technol. High. Educ. 2022, 19, 12. [Google Scholar] [CrossRef] [PubMed]
  28. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Human–Computer Interact. 2018, 34, 577–590. [Google Scholar] [CrossRef]
  29. Fatahi, S. An experimental study on an adaptive e-learning environment based on learner’s personality and emotion. Educ. Inf. Technol. 2019, 24, 2225–2241. [Google Scholar] [CrossRef]
  30. Khamparia, A.; Pandey, B. Association of learning styles with different e-learning problems: A systematic review and classification. Educ. Inf. Technol. 2020, 25, 1303–1331. [Google Scholar] [CrossRef]
  31. Wilson, C.D.; Taylor, J.A.; Kowalski, S.M.; Carlson, J. The Relative Effects and Equity of Inquiry-Based and Commonplace Science Teaching on Students’ Knowledge, Reasoning, and Argumentation. J. Res. Sci. Teach. 2010, 47, 276–301. [Google Scholar] [CrossRef]
  32. Hmelo-Silver, C.E.; Duncan, R.G.; Chinn, C.A. Scaffolding and Achievement in Problem-Based and Inquiry Learning: A Response to Kirschner, Sweller, and Clark (2006). Educ. Psychol. 2007, 42, 99–107. [Google Scholar] [CrossRef]
  33. Garrison, D.R.; Kanuka, H. Blended Learning: Uncovering Its Transformative Potential in Higher Education. Internet High. Educ. 2004, 7, 95–105. [Google Scholar] [CrossRef]
  34. Su, C.; Chiu, C.; Wang, T. The development of SCORM-conformant learning content based on the learning cycle using participatory design. J. Comput. Assist. Learn. 2010, 26, 392–406. [Google Scholar] [CrossRef]
  35. Costello, E.; Soverino, T.; Bolger, R. Mapping the MOOC Research Landscape: Insights from Empirical Studies. Int. J. Emerg. Technol. Learn. IJET 2022, 17, 4–19. [Google Scholar] [CrossRef]
  36. Pei, B.; Xing, W.; Wang, M. Academic Development of Multimodal Learning Analytics: A Bibliometric Analysis. Interact. Learn. Environ. 2021, 31, 1–19. [Google Scholar] [CrossRef]
Figure 1. Number of articles in the selection plotted against their publication date (with 2023 omitted). The dashed line represents the mean value.
Figure 2. Expected topic proportions of the extracted topics and their topwords.
Figure 3. Proportion of topics in the selected articles over time.
Figure 4. (Left panel) Network graph for topic proportions, showing the expected proportion of edge-connected topics in a document with a high proportion of the vertex topic. (Right panel) Network graph for topic citations, illustrating the expected citations of documents with a high proportion of edge-connected topics. Both graphs are based on the 100 documents with the highest topic proportion for each topic.
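To make the aggregation behind the left panel of Figure 4 concrete, the following R sketch shows one way to compute such edge weights from a document-topic proportion matrix: for each topic, take the 100 documents in which that topic is most prominent and average the proportions of the other topics in those documents. The object name theta and the plain matrix output are illustrative assumptions, not the exact implementation used in the paper.

```r
# Hedged sketch: edge weights of the kind shown in Figure 4 (left panel).
# `theta` is assumed to be a documents x topics matrix of topic proportions.
topic_links <- function(theta, n_top = 100) {
  K <- ncol(theta)
  W <- matrix(0, K, K, dimnames = list(colnames(theta), colnames(theta)))
  for (k in seq_len(K)) {
    top_docs <- order(theta[, k], decreasing = TRUE)[seq_len(min(n_top, nrow(theta)))]
    W[k, ] <- colMeans(theta[top_docs, , drop = FALSE])  # expected proportions
  }
  diag(W) <- 0  # keep only cross-topic connections
  W             # W[k, j]: expected proportion of topic j in documents high on topic k
}
```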
Figure 5. Network graph of top words for the learning cycle topics (k = 1, …, 5), illustrating the co-occurrences of top words in document contexts.
Figure 6. (Left plot) Connections (lines) between the keyword (math) and the topics, with line width indicating frequency. (Right plot) Titles of the top three papers referring to learning cycle topics 1–5 that contain the keyword.
Table 1. Chosen keywords for the six keyword topics used to assist topic extraction.
| Engagement | Exploration | Explanation | Elaboration | Evaluation | Evolution |
|---|---|---|---|---|---|
| engag | explor | explain | problem | assess | adapt |
| motiv | experi | instruct | practic | evalu | individu |
| emot | interact | knowledg | applic | test | improv |
| promot | activ | theori | solv | feedback | behavior |
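As a concrete illustration of how the Table 1 stems could enter a keyword-assisted LDA, the R sketch below collects them into a named list and builds a topic-word prior matrix in which each keyword receives extra prior mass in its associated topic. The baseline prior, the boost value, and the function name are assumptions for illustration only, not the exact seeding mechanism used in the analysis.

```r
# Minimal sketch (R): Table 1 keyword stems as seed words for keyword-assisted LDA.
seed_words <- list(
  engagement  = c("engag",   "motiv",    "emot",     "promot"),
  exploration = c("explor",  "experi",   "interact", "activ"),
  explanation = c("explain", "instruct", "knowledg", "theori"),
  elaboration = c("problem", "practic",  "applic",   "solv"),
  evaluation  = c("assess",  "evalu",    "test",     "feedback"),
  evolution   = c("adapt",   "individu", "improv",   "behavior")
)

# Build a (topics x vocabulary) prior matrix: a small symmetric prior everywhere,
# plus extra mass for each seed word in its keyword topic (topics 1-6); the
# remaining topics stay keywordless.
seeded_prior <- function(vocab, n_topics = 9, beta0 = 0.01, boost = 0.5) {
  prior <- matrix(beta0, nrow = n_topics, ncol = length(vocab),
                  dimnames = list(NULL, vocab))
  for (k in seq_along(seed_words)) {
    hit <- intersect(seed_words[[k]], vocab)
    prior[k, hit] <- prior[k, hit] + boost
  }
  prior
}
```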
Table 2. Average coherence for different total topic numbers, K, including the six prespecified keyword topics. The optimal topic number based on the average coherence criterion is K = 9, implying that the number of keywordless topics should be set to 3.
| K | Average coherence |
|---|---|
| 6 | −1.9798 |
| 7 | −2.0108 |
| 8 | −1.9869 |
| 9 | −1.9657 |
| 10 | −2.0116 |
| 11 | −1.9689 |
| 12 | −2.0301 |
| 13 | −2.0082 |
| 14 | −1.9821 |
| 15 | −2.0677 |
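The selection of the total topic number can be reproduced directly from the tabulated values: since the average coherence is negative, the preferred K is the one with the largest (least negative) value. A minimal R check using the Table 2 figures:

```r
# Average coherence values from Table 2; K includes the six keyword topics.
K       <- 6:15
coh_bar <- c(-1.9798, -2.0108, -1.9869, -1.9657, -2.0116,
             -1.9689, -2.0301, -2.0082, -1.9821, -2.0677)
K_opt <- K[which.max(coh_bar)]  # largest average coherence
K_opt                           # 9, i.e., 9 - 6 = 3 keywordless topics
```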
Table 3. Top 20 words for each keyword-based topic. Keywords matching the respective topic are marked with a checkmark. Keywords that appear in the top words of another topic are labeled with the corresponding keyword topic number.
| 1 Engagement | 2 Exploration | 3 Explanation | 4 Elaboration | 5 Evaluation | 6 Evolution | 7 Platform | 8 Acceptance | 9 Distance |
|---|---|---|---|---|---|---|---|---|
| self | interact [✓] | teacher | teach | assess [✓] | model | research | languag | covid |
| engag [✓] | cours | school | develop | group | cours | digit | mobil | pandem |
| motiv [✓] | social | mathemat | base | effect | system | higher | english | face |
| academ | design | instruct [✓] | design | test [✓] | mooc | review | accept | distanc |
| factor | activ [✓] | teach | practic [✓] | feedback [✓] | data | develop | intent | teach |
| effect | discuss | stem | process | evalu [✓] | learner | univers | model | univers |
| level | environ | servic | effect | perform | improv [✓] | articl | factor | institut |
| posit | experi [✓] | knowledg [✓] | applic [✓] | peer | base | analysi | perceiv | higher |
| signific | learner | develop | comput | read | open | paper | attitud | remot |
| relat | collabor | profession | method | result | qualiti | literatur | adopt | lectur |
| satisfact | particip | pre | skill | experiment | behavior [✓] | inform | influenc | challeng |
| relationship | support | children | problem [✓] | control | platform | context | toward | faculti |
| regul | blend | parent | paper | two | analysi | identifi | research | emerg |
| influenc | approach | ict | implement | base | individu [✓] | increas | foreign | experi [2] |
| efficaci | research | program | new | score | result | need | efl | time |
| perform | instructor | support | result | differ | propos | relat | comput | transit |
| result | engag [1] | integr | knowledg [3] | write | perform | futur | devic | survey |
| emot [✓] | class | classroom | integr | compar | user | profession | univers | chang |
| perceiv | explor [✓] | tpack | tool | intervent | manag | focus | usag | academ |
| differ | presenc | secondari | virtual | signific | adapt [✓] | includ | theori [3] | due |
Table 4. Topic distribution of the effective learning elements virtual reality (VR), multimedia (MM), gamification (GM), and problem-based learning (PB).
| Element | 1 Engagement | 2 Exploration | 3 Explanation | 4 Elaboration | 5 Evaluation | 6 Evolution | 7 Platform | 8 Acceptance | 9 Distance |
|---|---|---|---|---|---|---|---|---|---|
| VR | 0.00 | 0.00 | 0.05 | 0.77 | 0.03 | 0.01 | 0.10 | 0.03 | 0.00 |
| MM | 0.03 | 0.17 | 0.21 | 0.14 | 0.21 | 0.11 | 0.03 | 0.04 | 0.05 |
| GM | 0.32 | 0.06 | 0.10 | 0.20 | 0.19 | 0.03 | 0.02 | 0.06 | 0.02 |
| PB | 0.40 | 0.00 | 0.20 | 0.20 | 0.00 | 0.00 | 0.00 | 0.20 | 0.00 |
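One possible way to obtain rows of the kind shown in Table 4 is to average the estimated topic proportions over all documents whose title or abstract mentions the learning element. The sketch below assumes a document-topic matrix theta and a character vector texts of titles/abstracts; both names are placeholders, and the keyword matching shown is a simplification rather than the paper's exact procedure.

```r
# Hedged sketch: topic distribution of a learning element, as in Table 4.
element_distribution <- function(keyword, theta, texts) {
  hits <- grepl(keyword, texts, ignore.case = TRUE)  # documents mentioning the element
  if (!any(hits)) return(rep(NA_real_, ncol(theta)))
  colMeans(theta[hits, , drop = FALSE])              # average topic proportions
}

# Illustrative call (hypothetical objects):
# round(element_distribution("virtual reality", theta, texts), 2)
```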