Article

Application of Learning Analytics in Virtual Tutoring: Moving toward a Model Based on Interventions and Learning Performance Analysis

by Luis Magdiel Oliva-Córdova 1,2,3,*, Antonio Garcia-Cabot 1 and Héctor R. Amado-Salvatierra 4

1 Computer Science Department, University of Alcalá, 28805 Madrid, Spain
2 Engineering Faculty, University of San Carlos de Guatemala, Guatemala 01012, Guatemala
3 Doctoral Center, Francisco Marroquin University, Ciudad de Guatemala 01010, Guatemala
4 Galileo Educational System Department, Galileo University, Ciudad de Guatemala 01010, Guatemala
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(4), 1805; https://doi.org/10.3390/app11041805
Submission received: 7 January 2021 / Revised: 8 February 2021 / Accepted: 11 February 2021 / Published: 18 February 2021
(This article belongs to the Section Computing and Artificial Intelligence)

Abstract:
The research area related to the use of Learning Analytics and the prediction of student performance is multidimensional; therefore, it can be explored and analyzed from different perspectives. This research addresses the relationship between pedagogical interventions based on Learning Analytics and student learning performance. This study presents an analysis based on the Path Analysis (PA) technique and proposes a model built on the following variables: Mentoring, Motivation, Communication, Learning Design, and Learning Performance. The study's findings demonstrate the importance of the virtual tutor's role in carrying out pedagogical interventions, informed by the data retrieved from Learning Analytics tools and their appropriate analysis.

1. Introduction

The advance of technology has enhanced the various areas of knowledge; in education, it has brought significant changes to the usual teaching methods and techniques, making active and participatory learning possible. In the virtual training modality, teachers are overloaded with activities related to the educational process: learning planning, design of digital didactic material, design of online learning activities, design of online assessments, monitoring, tutoring, synchronous video classes, feedback, and the grading and return of assignments, among others. Given this reality, teachers must select tools that strengthen their teaching practice and provide them with inputs for decision-making while managing student learning.
Information and Communication Technologies (ICT) are in constant innovation and development. An example of their contribution to the learning process is the implementation of e-learning platforms. This contribution has had such an impact that in recent years, these platforms have allowed the generation, organization, and dissemination of knowledge in a simple way and with great availability for anyone with Internet access [1]. Several authors and educational centers maintain that technology plays an essential role in the educational system and that it has a potential for innovation and favorable development prospects [2]. The United Nations Educational, Scientific and Cultural Organization (UNESCO) highlights the potential of ICT to disseminate and improve teaching and learning in a wide variety of contexts [3], and [4] adds that ICT also has the power to improve people's quality of life, asserting that a future can be envisioned in which the Internet and other information technologies promote impressive improvements in education.
The quality of education has been a subject of interest for teachers throughout time; the European Higher Education Area (EHEA) considers that “in order to achieve quality educational systems, new teaching methods for the training of students and innovative methods of assessing learning are necessary” [5].
Therefore, along with the rethinking of the educational system and the birth of virtual learning environments, there is also a need for monitoring these interfaces in order to verify compliance with the primary objective: providing a personalized education with quality content regardless of a student’s location.

1.1. Learning Analytics (LA)

Regardless of the approach given to these virtual environments, one must always keep in mind the following questions: How practical is the course? Is it meeting the needs of the students? How can the needs of the students be better supported? Which interactions are effective? How can these interactions be optimized?
Answering these questions with conventional procedures requires evaluating the students, analyzing the grades they have achieved, and inquiring into the dropout rate and the teacher's end-of-course reflections. However, these procedures are no longer sufficient: without more detailed quantitative and qualitative information, it is difficult to judge the efficiency of the courses or to make informed decisions about the methods used.
Given these weaknesses in conventional procedures, it is necessary to use Learning Analytics (LA), which allows the measuring, collecting, analyzing, and presenting of students' data, their contexts, and the interactions generated therein in order to understand the learning process that is taking place and to optimize the environments in which it occurs. These interactions cover all levels: student–teacher, student–content, and student–student. More specifically, the answers to the questions mentioned above should be sought through the LA discipline because it allows the generation of references for decision-making at different levels, ranging from school administration and planning to the promotion of new, more personalized educational models [6].
A great promise of LA is that knowledge of virtual learning environments and their impact can be obtained at scale through the collection of large amounts of data. Learning approaches and design must be integrated with pedagogy when dealing with student-facing learning analytics systems in order for feedback tools to work effectively [7].

1.2. Online Tutoring

The role of the online teacher is of utmost importance within the teaching–learning process since he/she is responsible for determining which mentoring and motivational strategies to use for the realization of the learning design; that is, to present the content, to favor meaningful experiences, to implement pre-instructional, co-instructional, post-instructional, individual, or collaborative pedagogical methodologies and to measure the achievements that have an impact on the learning performance. In this sense, it is crucial to consider that the online tutor must develop communication and feedback skills to optimize time, resources, systems, and data obtained from students’ traces in the virtual environment.
Pedagogical interventions based on Learning Analytics can be related to the learning performance of students. The question is: what is the impact of pedagogical interventions with Learning Analytics on the learning performance of university students? This article presents a proposal for a path model in which the variables mentoring, motivation, communication, learning design, and learning performance were considered to answer this question. Path Analysis (PA) was used to evaluate the fit of the set of dependency relationships (direct and indirect influences) proposed among the study variables.
The purpose of this study was to apply Learning Analytics tools in order to strengthen teaching practice and to relate pedagogical interventions to students' learning performance. Accordingly, the research objective is to analyze the application of Learning Analytics tools, their relevance in strengthening teaching practice, and the recommendations that can be derived from the study variables.
The work continues as follows: Section 2 presents a literature review on Learning Analytics and success cases applied to pedagogical intervention and students’ learning performance. Section 3 and Section 4 present the research hypotheses and the proposed model. Then, Section 5 presents an analysis of the data, closing with conclusions.

2. Literature Review

The incorporation of technology in face-to-face or virtual teaching–learning processes requires a review of the teacher's role: it is no longer a matter of standing in front of a classroom exposing concepts and procedures so that students acquire new learning, but of hosting academic resources on platforms that allow students to learn independently and collaboratively. In other words, teachers must also acquire the skills to design content instructionally and to use the virtual learning space (the e-learning platform, virtual campus, or learning management system (LMS)) properly. They also need to use Learning Analytics (LA), since this instrument makes it possible to focus teaching by adapting it to the knowledge students already have.
With the growth and evolution of ICT-supported Distance Education, the execution of courses in this modality should become more effective and efficient [8]. The concern for having a quality educational system seems to have motivated studies in which some applications of information and communication technologies have been put into practice [9,10]. The experiences in the literature highlight the importance for teachers of selecting technological tools that allow them to strengthen their teaching practice and, at the same time, provide them with inputs for decision-making before, during, and after the development of student learning management. In this context, LA arises as an emerging discipline that seeks to improve teaching and learning through a critical evaluation of raw data and the generation of patterns that characterize student habits, predicting learning in order to provide timely interventions [11]. The mechanism for predicting and ranking student performance is crucial to promoting student learning success. Finding students at risk by predicting their performance can help teachers intervene so that students improve and achieve the desired success [12].
The educational technology field is embracing the use of LA to enhance students’ learning experiences. Along with the exponential growth in this area, there is growing concern about the interpretability of analyses of student experience and what learning data can tell us [13]. In recent years, LA has become a promising research area that extracts useful information from educational databases to understand student progress and performance. The concept of LA refers to the measurement, collection, analysis, and presentation of information about students and their contexts to understand and optimize learning [14]. Recent developments have attracted much attention from researchers and practitioners toward the exploring of LA’s potential to improve learning and teaching practices. The abundance of available educational data, supported by technology-enhanced learning platforms, provides opportunities to assess student learning behavior, address student problems, optimize the educational environment, and facilitate data-driven decision making [15,16,17].
In distance learning environments, the teachers’ role is to provide support through data, and Predictive Learning Analytics (PLA) is crucial. The orientation and assistance from teachers significantly impact the learning, the results, and the completion of the students’ learning activities [18,19]. LA can enable students, teachers, and their institutions to understand better and predict learning and performance [20].
Virtual tutors play a crucial role in this process; the application of artificial intelligence and LA allows virtual tutors, teachers, and academic advisors to understand the behavior patterns related to the student’s academic success using data collected from institutional databases [21].

2.1. Mentoring and Learning Performance

The virtual tutor acts as an information facilitator and a natural motivator who sparks students' interest in staying in the online course. The tutor must be able to give feedback both to the student's actions and to his/her own practice in order to achieve two-way improvement. In addition, Learning Design (LD) can provide a context of understanding for designing predictive models in collaboration with tutors, maximizing their potential to support learning [22]. The use of technology can provide an unlimited source of data: the intensive use of tracking data collected through various technologies, such as intelligent tutoring systems (ITS) or learning management systems (LMS), produces large volumes of student interaction data [23,24,25] that teachers can use to improve teaching practice and to make LA-based pedagogical interventions that improve the learning performance of their students.
The research area related to the use of Learning Analytics and student performance prediction is multidimensional and can be explored and analyzed through different perspectives. This research addresses the relationship between pedagogical interventions based on Learning Analytics and student learning performance. The research problem of predicting student performance can be analyzed from various angles. In the current literature, several complementary approaches provide a baseline for such analyses; for example, descriptive analyses use data obtained from course evaluations, surveys, student information systems, learning management system activities and forums, and interactions primarily for informational purposes [26].

2.2. Learning Performance and Learning Design

Different attributes are associated with predicting student performance. The study described in [7] applied principal component regression for the early prediction of learning performance in a blended calculus course. The experimental results showed that students' learning performance in such a course can be predicted with high stability and accuracy using a dataset containing information from weeks 1 to 6 of the course. Meanwhile, combining research methodologies (correlation analysis, group analysis, and multiple regression) with learning analytics, [13] examined the relationship between students' learning experience, their interactions with e-learning tools, and learning outcomes. The results showed that students who reported using a deep approach to learning and who tended to interact more frequently with the online environment had better learning performance, and that there was a positive relationship between the frequency of student interactions with online learning tools and learning performance in the course [13].
Student performance can be assessed by the extent of students' interaction with a virtual environment, and more specifically, by their interactions with the implemented learning management tools, which generate large amounts of clickstream data reflecting their participation in the learning environment [27]. Using student interactions to track learning experiences allows virtual tutors to follow the progress of the student or the class and readjust the pedagogical plan according to student performance [16]. The study in [15] analyzed the extent to which student engagement time was aligned with the instructor's learning design and how engagement varied between different performance levels. The analysis was conducted over 28 weeks using follow-up data on 387 students and replicated over two semesters in 2015 and 2016. High-performing students spent more time studying in advance, while low-performing students spent a larger proportion of their time on remedial activities. Interventions can affect students' success in learning, and interventions based on Learning Analytics have been shown to help guide tutors or educators in supporting students in at-risk situations.

2.3. Motivation and Learning Performance

Several institutions have implemented LA interventions that have positively impacted student success. The effectiveness of deep learning models in the early prediction of student performance has been demonstrated, which allows for timely intervention by the university in implementing corrective strategies for student support and counseling [27]; likewise, systems for evaluating student learning performance based on analysis and learning objectives support the teacher during evaluation, promotion, and improvement processes within the learning process [16]. The study in [24] used a predictive system based on machine learning methods to identify students at risk of not delivering (or failing) their next assignment. Teachers had access, through interactive panels, to weekly predictions of the risk of failure for their students. The quasi-experimental study showed that when Predictive Learning Analytics knowledge is accessed systematically, virtual tutors or teachers can successfully monitor, identify, and intervene with students, particularly those at risk of failing their studies. The research in [25] proposes an intervention model based on learning analytics with four iterative modules—data collection, data processing, intervention implementation, and effect assessment—and applies it to a blended learning environment. Through a pretest–post-test experimental design, the effect of the intervention model was measured in terms of learning engagement and learning achievement. The results show that the intervention model can effectively improve students' behavioral and cognitive engagement and learning achievement, especially for at-risk students.

2.4. Communication and Feedback

The evaluation process requires monitoring of student progress by the virtual tutor or teacher. The objective of evaluation and monitoring is to intervene based on the students' behavior. The search for consistent performance evaluation has motivated a renewed interest in research related to LA and Data Mining techniques to help tutors interpret raw data from the online environment. It has been found that learning objectives, learning activities, and assessments are interrelated elements that allow students to observe their learning, and that there is a need for tools that help teachers monitor students' academic progress in e-learning environments [8].
Previous studies have investigated the relationship between students' frequency of LMS use—logins, discussion-forum activity, resources accessed—and learning performance; however, these models do not consider other factors. The results show the central role of self-efficacy in predicting student performance. Online activity was not predictive of performance, suggesting the primacy of psychological factors over online engagement in determining the outcome. Measuring student behaviors through learning analytics allows researchers to examine relationships between LMS use and student performance.
Four measures of LMS use were taken at the end of the semester: (1) the number of times the student logged into the learning management system over the roughly 12-week semester; (2) the number of discussion messages read; (3) the number of discussion threads created; and (4) the number of resources reviewed. Correlational analyses were conducted using SPSS version 20 to examine the interrelationships between psychological factors, learning analytics, and grade point results. Students who have high levels of confidence in themselves and in their ability to achieve academically are more likely to experience academic success. This finding alone has several implications for educators involved in developing and implementing courses for students. Specifically, courses should promote learning environments that foster approaches that build confidence to learn.
Along with numerous conceptual studies, a gradual accumulation of empirical evidence has indicated a strong connection between instructors' learning design and student behavior. The extent to which student engagement time was aligned with the tutor's learning design, and how engagement varied between different performance levels, was investigated. The analysis was conducted using tracking data from 387 students and replicated over two semesters in 2015 and 2016. This study reinforced the importance of the pedagogical context in learning analytics, because high-performing students spent more time studying in advance, whereas low-performing students spent a significantly larger portion of their time on remedial activities [15]. In educational institutions, success is measured by students' learning performance. Through continuous assessment of learning performance, educators can introduce innovative technologies such as analytics-enabled learning management systems to improve desired student outcomes. The results can help tutors improve teaching and learning activities, thereby improving student learning performance. Decision-makers can develop better procedures and guidelines to assist in the planning, design, development, and evaluation of innovative teaching and learning strategies that broadly help students achieve desired outcomes; they can also explore learning analytics to improve the educational experience.
This study’s objective was to explain the relationship between the use of Learning Analytics and the factors that interfere with student learning performance. The Learning Analytics approach implicitly recognizes the importance of student engagement. It adds a new dimension of data to offline engagement measures, such as participation with teachers and peers, time spent on the task, and emotional affinity with assignments [28]. Many learning analytics approaches use persuasive techniques to support study success and students at risk of dropping out [18].
Based on the literature review and LA’s expectations, the present study analyzed the different factors that influence students’ learning performance and how these results depend primarily on virtual tutors’ role within the learning process. The teacher must select tools that allow him/her to strengthen the teaching practice and provide him/her with inputs to make decisions while developing learning management in students. The purpose of the proposed study was to analyze how teaching practice can be strengthened with Learning Analytics and determine the relationship between pedagogical interventions based on Learning Analytics and the learning performance of students from the perspective of the virtual tutor within the online teaching–learning process.

3. Research Model and Hypotheses

The research model used in this study was based on Path Analysis (PA), a method that allows evaluation of the fit of a set of dependency relationships (direct and indirect influences) proposed among the study variables [29]. It should be mentioned that this method does not test causality but helps to infer causal hypotheses [30].
Before the study’s analyses, an exploratory analysis of the data was performed to assess their quality: treatment of missing data, control of tabulation errors, presence of outliers, variability in distribution, and strange patterns in the data [31].
The PA was developed in six steps: (1) specification: establishment of the study variables; (2) identification: determining whether the available information suffices to estimate the model; (3) parameter estimation: obtaining the values that best fit the observed matrix; (4) fit assessment: determining whether the relationships between the estimated model variables reflect the relationships observed in the data; (5) model re-specification: improving the model fit; and (6) interpretation of results [32,33,34]. For this purpose, the model presented in Figure 1 was proposed, which integrates the study variables: Mentoring (MN), Motivation (MT), Communication and Feedback (CF), Learning Design (LD), and Learning Performance (LP).
Following the objective of this study, the following hypotheses are proposed and will be tested:
  • H1: Mentoring explains the levels of Learning Design.
  • H2: Motivation explains the levels of Learning Design.
  • H3: Communication and Feedback explains the levels of Learning Design.
  • H4: Learning Design explains Learning Performance.
  • H5: The relationship between mentoring, motivation, and communication with learning performance is mediated by the learning design (there is no direct relationship between mentoring, motivation, and communication and learning performance).
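Under the hypothesized full-mediation structure (H1–H5), the path model reduces to two structural regressions: LD on MN, MT, and CF, and LP on LD. The sketch below illustrates this estimation with synthetic data; the coefficient values and data-generation scheme are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 140  # sample size matching the study's cohort

# Synthetic standardized scores for the exogenous variables
# (Mentoring, Motivation, Communication and Feedback).
MN, MT, CF = rng.standard_normal((3, n))

# Simulate the hypothesized structure: LD depends on MN, MT, CF (H1-H3),
# and LP depends only on LD (H4; full mediation per H5).
LD = 0.4 * MN + 0.3 * MT + 0.2 * CF + 0.5 * rng.standard_normal(n)
LP = 0.6 * LD + 0.5 * rng.standard_normal(n)

def ols(y, *predictors):
    """Estimate regression coefficients (with intercept) via least squares."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]  # drop the intercept

# Path coefficients for the two structural equations of the model.
paths_ld = ols(LD, MN, MT, CF)  # MN -> LD, MT -> LD, CF -> LD
path_lp = ols(LP, LD)           # LD -> LP

# Indirect (mediated) effect of each exogenous variable on LP through LD.
indirect = paths_ld * path_lp
```

In dedicated SEM software (such as the AMOS package used in this study), the two equations are estimated jointly and fit indices are produced for the whole system; the decomposition into regressions above only conveys the logic of the direct and indirect paths.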

4. Research Method

4.1. Study Context

The research was carried out in an online graduate program at the Faculty of Engineering of the University of San Carlos de Guatemala, in which 140 students from different disciplines who specialize in educational technology and virtual learning environments participated. The study was conducted in two mandatory courses of the curriculum, related to the management of virtual environments and the development of digital educational resources; the courses are led by two professor–tutors specialized in e-learning. Table 1 shows that most of the participants ranged between 29 and 37 years of age and were engineering professionals; all participated voluntarily.
The system used by the teachers to develop the virtual tutoring was the LMS Moodle; as part of the innovation for the study, plugins were integrated to generate learning analytics that would allow reporting of student interactions in the virtual learning environment in order to direct the teaching–learning process toward a model of interventions based on learning analytics and virtual tutoring of the teacher. These plugins were downloaded from the Moodle Community and evaluated for integration and use in graduate programs’ virtual learning environment (Figure 2).
To ensure that online teachers used the Moodle plugins in a timely manner during virtual tutoring, a four-phase model was proposed: (1) design of learning experiences; (2) data collection; (3) data analysis and modeling; and (4) pedagogical interventions. This process favored mentoring: after providing a meaningful learning experience, it was possible to collect student data on participation, motivation, successes, and achievements, analyze them, and then intervene promptly through personalized communication and feedback to obtain the desired academic performance.
It is important to note that the University of San Carlos de Guatemala's numerical grading system considers 61 to 100 points the favorable (passing) range and 0 to 60 points the unfavorable (failing) range.

4.2. Instruments

Data collection was done through two components: (1) Learning Analytics plugins and (2) the teacher's log, as shown in Figure 2, column A.
Element A in Figure 2 presents both components. In the first component, the Learning Analytics plugins implemented in the LMS Moodle were used as tools; the data to be modeled and analyzed were obtained through reports of course activity, accessed contents, activity submissions, distribution of hits, grades, number of active students, registrations, forum participation, course dedication, and interaction with the virtual learning objects.
In the second component, the data were collected from the virtual tutors' logs: messages sent through the platform, follow-up emails, feedback on tasks, synchronous tutorials, adaptation of educational resources, production of videos enriched with questions, learning design, support video tutorials, telephone calls, and motivational messages. These data were then evaluated and interpreted.

4.3. Procedure and Data Collection

The process was carried out through the application of an integrated model with four phases: (1) design of learning experiences; (2) data collection; (3) data processing and analysis; and (4) conclusions and discussion.

4.3.1. Phase 1: Design of Learning Experiences

The design of learning experiences was based on the principles of the Successive Approximation Model (SAM) [35], an iterative, cyclical, dynamic instructional model consisting of three phases: iterative design, iterative development, and implementation. Based on this model, learning routes, training lessons, learning activities, audiovisual support resources, and virtual learning objects were designed, and communication and evaluation standards were established.

4.3.2. Phase 2: Data Collection

It was carried out through the instruments established in Figure 2, element A. The following variables were established for this purpose: (1) study time; (2) interaction with virtual learning objects; (3) interaction in forums/wikis; (4) interaction with tasks; (5) learning performance; (6) communication and feedback; (7) mentoring; (8) learning design; and (9) motivation as shown in Table 2.
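For concreteness, the nine collected indicators could be organized as one record per student; the sketch below is only an illustration (the field names are hypothetical, not taken from the plugins) and encodes the pass threshold described in Section 4.1:

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """One student's indicators, mirroring the nine variables in Table 2."""
    study_time_minutes: float      # (1) study time
    vlo_interactions: int          # (2) interactions with virtual learning objects
    forum_wiki_posts: int          # (3) interactions in forums/wikis
    task_interactions: int         # (4) interactions with tasks
    learning_performance: float    # (5) final grade, 0-100 scale
    communication_feedback: float  # (6) tutor communication and feedback score
    mentoring: float               # (7) mentoring score
    learning_design: float         # (8) learning design score
    motivation: float              # (9) motivation score

    def passed(self) -> bool:
        """USAC scale: 61-100 points is favorable, 0-60 unfavorable."""
        return self.learning_performance >= 61
```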

4.3.3. Phase 3: Data Processing and Analysis

It was carried out through the processes established in Figure 2, element B. After obtaining the data, a classification modeling of the captured data from the LA plugins was done. The data from the tutorial logs were evaluated by estimating weights and making interpretations. The software programs used for the analysis were R (Language and environment for statistical computing and graphics), SPSS (Statistical Package for the Social Sciences), and AMOS (Analysis of Moment Structures by IBM SPSS).
The first analysis that was performed was exploratory through descriptive statistics of the study variables and correlation analysis between learning behavior and final learning performance. Finally, Path Analysis was used to evaluate the adjustment between the set of dependency relationships.

4.3.4. Phase 4: Conclusions and Discussion

After carrying out the analyses, the hypotheses raised were accepted or rejected according to the established study objective. Then, the results were compared with the previous studies, establishing the differences or coincidences based on the results found.

5. Data Analysis

5.1. Descriptive Statistics

Table 3 illustrates the main statistics that describe the variables of the study. For the student variables, there was wide dispersion (minimum and maximum) among the participants concerning their interactions within the platform, with the greatest variability in forum/wiki participation and task interactions. As for the final grade, the average was above the minimum passing grade and showed greater homogeneity. As for the variables related to the teacher's interventions in the platform, communication showed the greatest variability; for the remaining variables, the variation was smaller.
To evaluate the relationship between the variables referred to the students’ learning behaviors when entering the platform and the final grade obtained, a correlation analysis was used (Spearman’s Rho). In Table 4, it is evident that all the variables are positively correlated with the students’ final grades. That is, as the different learning behavior variables increase, the final grade also increases. Likewise, the relationships presented were statistically significant (p < 0.05).
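Spearman's Rho is the Pearson correlation computed on ranks, with tied values assigned their average rank. A minimal self-contained sketch (any statistical package, including the SPSS procedure used in the study, computes the same quantity):

```python
import numpy as np

def rankdata(x):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)
    ranks = np.empty(len(x))
    ranks[order] = np.arange(1, len(x) + 1)
    for v in np.unique(x):          # average ranks within each tie group
        mask = x == v
        ranks[mask] = ranks[mask].mean()
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    return np.corrcoef(rankdata(x), rankdata(y))[0, 1]
```

A rho near +1 means that as a learning-behavior variable increases, the final grade increases monotonically, which is the pattern reported in Table 4.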

5.2. Path Analysis

The different teacher interventions and their influence on learning performance were considered, for which a PA was chosen. Before the PA, the correlations between the teacher intervention variables and learning performance were tested. This information was the base input to build the empirical model. It was found that all the correlations were positive and greater than 0.3 but less than 0.8 (Table 5), which indicates a direct relationship between the different variables. In addition, all correlations were statistically significant (p < 0.1).
The model tested is shown in Figure 1 with its predictive observable variables and its mediating or indirect effects. For the statistical significance of the t indices, p < 0.05 was taken as a reference. In addition, multiple fit statistics and indices were calculated to test the theoretical model [30,31,32]: the chi-square statistic, the Comparative Fit Index (CFI), the Root Mean Square Error of Approximation (RMSEA), the Standardized Root Mean Square Residual (SRMR), and the Goodness-of-Fit Index (GFI). Thresholds indicating good fit have been established for these indices: the authors of [36] suggest an adequate fit when the CFI and GFI are greater than 0.90 and the RMSEA and SRMR are less than 0.08, and an excellent fit when the former exceed 0.95 or the latter fall below 0.05. All analyses were performed with AMOS 25 (Analysis of Moment Structures, IBM SPSS) [37].
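The cutoff logic above can be made concrete with a small sketch. This is not part of the AMOS workflow; it simply encodes the thresholds cited from [36] and labels each index, with placeholder values chosen for illustration.

```python
# Hedged sketch: rating fit indices against the Hu-Bentler-style cutoffs
# cited in the text. Threshold table and example values are illustrative.

THRESHOLDS = {
    # index: (adequate cutoff, excellent cutoff, better direction)
    "CFI": (0.90, 0.95, "higher"),
    "GFI": (0.90, 0.95, "higher"),
    "RMSEA": (0.08, 0.05, "lower"),
    "SRMR": (0.08, 0.05, "lower"),
}

def rate_fit(indices):
    """Label each fit index as 'excellent', 'adequate', or 'poor'."""
    verdicts = {}
    for name, value in indices.items():
        adequate, excellent, direction = THRESHOLDS[name]
        if direction == "higher":
            verdicts[name] = ("excellent" if value >= excellent
                              else "adequate" if value >= adequate else "poor")
        else:
            verdicts[name] = ("excellent" if value <= excellent
                              else "adequate" if value <= adequate else "poor")
    return verdicts

# Illustrative values: strong CFI/GFI but a large RMSEA flags a poor fit.
print(rate_fit({"CFI": 0.942, "GFI": 0.942, "RMSEA": 0.202, "SRMR": 0.04}))
```

Note how a single offending index (here RMSEA) is enough to reject a model even when CFI and GFI clear the 0.90 bar, which is the situation the initial model runs into below.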
After the model was identified and estimated (Figure 1), the quality of the data's fit to the proposed theoretical model was evaluated using the thresholds described above; in other words, the model was interpreted globally together with the contrast of the hypotheses. The model tested did not show satisfactory fit values [37] with the collected data (χ²(3) = 11.32; p < 0.05; GFI = 0.942; AGFI = 0.711; CFI = 0.942; RMSEA = 0.202; AIC = 35.36; ECVI = 0.519).

5.3. Model Results

Based on these findings and the modification indices, adjustments were made to the initial model (Figure 1). The model was re-specified considering the statistical significance of the regression coefficients and the theoretical support for the modifications. After the changes, the relationships found were statistically significant for the various coefficients, and the fit levels were appropriate for the model presented in Figure 3.
The fit indices for the final model were acceptable (χ²(3) = 2.96; p > 0.05; GFI = 0.984; AGFI = 0.918; CFI = 1.00; RMSEA = 0.001; AIC = 26.96; ECVI = 0.396). This indicates that the errors associated with estimating the parameters fall within the range of normal values and that the residuals are small. The hypothesis results are shown in Table 6.
In this study, three hypotheses (H1, H2, and H4) were positively significant at the 0.001 level and are therefore supported.
For Hypothesis 1 (H1), virtual teacher tutoring is significantly and positively related to learning design levels (β = 0.57). The virtual tutor's (teacher's) LA-based interventions inform actions to mediate content, select pedagogical strategies, design didactic sequences, and choose assessment types and technological tools suited to the learning design.
For Hypothesis 2 (H2), motivation has a positive and significant impact on Learning Design levels (β = 0.57). In this sense, the motivational actions the virtual tutor takes in his or her teaching practice help explain the levels of Learning Design through which the educational act is realized.
The supported Hypothesis 4 (H4) shows that the virtual tutor's LA-based Learning Design exerts a direct positive influence on the students' learning performance (β = 0.35). Consequently, when the virtual tutor performs actions to choose what, when, where, and how to teach (learning design), this is reflected in the students' learning performance.
Regarding the direct and indirect effects present in the model (Figure 3), the Mentoring (MN) and Motivation (MT) variables presented only indirect effects (through Learning Design), with coefficients of 0.20 and 0.11, respectively. On the other hand, the Communication and Feedback (CF) and Learning Design (LD) variables showed direct effects, with coefficients of 0.30 and 0.35. These last two variables showed a strong relationship with learning performance; that is, the higher the communication and learning design, the higher the student's learning performance, and vice versa. Hypothesis 3 (H3) was rejected as originally formulated; instead of explaining Learning Design, CF has a direct effect on Learning Performance (LP). Regarding Hypothesis 5 (H5), the three variables (MN, MT, and CF) are not jointly reflected in Learning Design (LD), because Communication and Feedback (CF) has a direct impact on Learning Performance (LP).
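The indirect effects mentioned above follow the standard path-analysis rule: an indirect effect is the product of the coefficients along the mediated route. A minimal sketch, using the standardized estimates reported in the text for the MN → LD and LD → LP paths (0.57 and 0.35), reproduces the MN indirect effect of about 0.20:

```python
# Hedged sketch: product-of-paths rule for indirect effects in a path model.
# Coefficients are the standardized estimates reported in the text.

def indirect_effect(*path_coefficients):
    """Multiply the coefficients along a mediated path."""
    effect = 1.0
    for beta in path_coefficients:
        effect *= beta
    return effect

mn_to_ld = 0.57  # H1: Mentoring -> Learning Design
ld_to_lp = 0.35  # H4: Learning Design -> Learning Performance

# Indirect effect of Mentoring on Learning Performance via Learning Design.
print(round(indirect_effect(mn_to_ld, ld_to_lp), 2))  # about 0.20
```

The same rule, applied with the (unreported) final-model MT → LD coefficient, would yield the 0.11 indirect effect quoted for Motivation.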

6. Findings and Discussion

The findings of this study reveal that the interventions performed by the virtual tutor (teacher) based on LA (H1) and the motivational actions developed (H2) contribute to the levels of learning design. Consequently, this design positively influences students' learning performance (H4). In terms of interventions, teachers who use LA gain a better understanding of learning design, which enables identifying and validating relevant measures of the processes, outcomes, and activities involved in learning performance [38].
In the context of the study, the motivation developed by the virtual tutor is fundamental to the learning design: when using LA in virtual learning environments, it is possible to identify attitudes toward the content, attention and commitment, extracurricular experiences, and interests that have an impact on learning performance [39]. A learning design developed by the virtual tutor that allows for meaningful and non-traditional experiences will effectively contribute to academic performance [40].
In terms of H3, where it was hypothesized that Communication and Feedback (CF) explain the level of learning design, no such correlation was found. Instead, the results show a direct relationship between Communication and Feedback (CF) and Learning Performance (LP) through the virtual tutor's interventions; this finding was included in the revised model. We emphasize the direct relationship (Figure 3) between Communication and Feedback and Learning Performance, where online socialization, support and feedback strategies, technical support, information sharing, content reinforcement, follow-up messages, and queries are some of the intervention and follow-up actions of virtual tutors. This agrees with [41], whose aim was to examine the sequential and temporal characteristics of learning strategies and to investigate their association with communication and feedback; those results suggest a positive association between Communication and Feedback (CF) and effective Learning Performance (LP). Finally, it is important to recall H5, which stated that mentoring, motivation, and communication relate to Learning Performance as mediated by the Learning Design. This hypothesis was rejected: Communication and Feedback (CF) directly affect Learning Performance (LP) as an important factor once there is a strong learning design. In this sense, Motivation (MT) or Mentoring (MN) may be imperfect, but Communication and Feedback (CF) remain relevant for the student's success.
Currently, virtual teachers or tutors play different roles in the educational process. In the study of [24], an LA tool enabled intervention actions with at-risk students and timely improvement of learning experiences; as a result, students' learning performance also improved.
In addition, the study presented in [42] examines the association between students' learning approaches and the study strategies extracted from digital tracking data of students' interactions with online learning resources. Tutors can derive specific recommendations for their students in terms of strategies to follow and corrective actions for students' learning approaches [11].
The research results are consistent with previous studies that have explored the association between students' frequency of LMS use (logins, use of discussion forums, resources used, and others) and learning performance; however, other factors such as motivation are also considered. The observed relationship between motivation, learning design, and learning performance is similar to that observed in [43]. The results show the central role of elements linked to the levels of learning design and motivation. A motivational intervention provides external stimuli to learners. Within this research scope, an intervention engine was designed and developed that included task-based instructional intervention as well as supportive and motivational interventions based on learning experiences; students indicated that this core intervention system was helpful and beneficial to their learning process [44,45]. In addition, the study [8] shows a need for tools that help teachers monitor academic e-learning progress to identify actions that enable motivation [7].
Based on all these findings, virtual tutors' supportive intervention is an important factor in the satisfactory performance of students. Essential factors such as Communication and Learning Design can influence students' Learning Performance.

7. Conclusions

With the continuous development of online learning platforms, the analysis and prediction of educational data has become a promising field of research, which is useful for developing personalized learning systems [29]. The main motivations for the use of learning analytics in higher education institutions include (a) improving student learning and motivation, thus reducing the dropout rate, and (b) improving the student learning process by providing adaptive learning paths toward specific objectives set by the curriculum, the teacher, or the student [20]. Based on the intervention model, this study verified that pedagogical interventions based on Learning Analytics do impact the final learning performance of university students in a course.
This work was based on the following question: What is the impact of pedagogical interventions with Learning Analytics on the learning performance of university students? To answer it, a model with the following variables was proposed: Motivation, Mentoring, Communication and Feedback, Learning Design, and Learning Performance. For this model, it is important to mention that the Communication and Feedback component was strongly based on learning analytics tools.
When virtual tutors appropriately manage tools to develop the different types of communication and feedback in virtual learning environments (synchronous, asynchronous, unidirectional, bidirectional, multidirectional, and massive communication), students present better learning performance in their courses. Likewise, according to the results, the learning design developed by the virtual tutor has a direct impact on learning performance, and the results improve with the help of Learning Analytics tools.
In other words, the virtual tutor’s work in choosing what, when, where, and how to teach is reflected in learning performance. Learning Design involves a series of decisions about (1) the content, its structure, and dosage; (2) the timing of meaningful learning experiences; (3) the pre-instructional, co-instructional, post-instructional, individual, or collaborative pedagogical strategies; (4) the didactic sequences established; (5) the assessments; and (6) the uses of technologies to support learning. Among the findings, it is important to recognize that the study reflected benefits such as (1) time savings in monitoring and mentoring by being able to visualize reports that show the actions of course participants; (2) decision making regarding the learning design, for example, if it is effective or if it adheres to the graduate profile of the curriculum; (3) improvement in the learning performance of students from interventions based on learning analytics.
It is worth highlighting the virtual tutors' role in conducting pedagogical interventions thanks to the information retrieved from the learning analytics tools. The use of data analysis tools helped students perceive the tutors' accompaniment positively and, to some extent, contributed to their success in learning.
Finally, regarding the implications of this study for the University of San Carlos de Guatemala, the virtual tutoring model based on learning analytics used by the professors had a positive impact through timely interventions that favored learning performance. Future work will continue exploring best practices based on Learning Analytics tools and improving the model.

Author Contributions

Conceptualization, L.M.O.-C. and H.R.A.-S.; methodology, A.G.-C.; software, H.R.A.-S. and L.M.O.-C.; formal analysis, L.M.O.-C., H.R.A.-S. and A.G.-C.; investigation, L.M.O.-C.; resources, H.R.A.-S.; writing—original draft preparation, L.M.O.-C. and H.R.A.-S.; writing—review and editing, L.M.O.-C., H.R.A.-S. and A.G.-C.; supervision, A.G.-C.; project administration, H.R.A.-S., L.M.O.-C. and A.G.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available within the article.

Acknowledgments

The authors wish to thank the coordinator of e-learning programs of the Faculty of Engineering of the University of San Carlos of Guatemala for her support in the development of the research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. López, S.; Lucero, S. Aplicación de Técnicas de Learning Analytics en Entornos Blended-Learning para Enseñanza Universitaria. Ph.D. Dissertation, Enxeñaría Telemática, Universidad de Vigo, Madrid, España, 2019. [Google Scholar]
  2. Soomro, K.A.; Kale, U.; Curtis, R.; Akcaoglu, M.; Bernstein, M. Digital divide among higher education faculty. Int. J. Educ. Technol. High. Educ. 2020, 17, 1–16. [Google Scholar] [CrossRef] [Green Version]
  3. Guttman, C. Education in and for the Information Society; UNESCO: Paris, France, 2003. [Google Scholar]
  4. Mangaroska, K.; Giannakos, M. Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Trans. Learn. Technol. 2018, 12, 516–534. [Google Scholar] [CrossRef] [Green Version]
  5. Alonso Díaz, L.; Entonado, F.B. El Docente de Educación Virtual, Guía básica; NARCEA SA DE EDICIONES: Madrid, Spain, 2012. [Google Scholar]
  6. Ruiz, I.I.B. Learning Analytics como cultura digital de las universidades: Diagnóstico de su aplicación en el sistema de educación a distancia de la UNAM basado en una escala compleja. Revista Iberoamericana de Educación 2019, 80, 89–116. [Google Scholar] [CrossRef]
  7. Lu, O.H.; Huang, A.Y.; Huang, J.C.; Lin, A.J.; Ogata, H.; Yang, S.J. Applying learning analytics for the early prediction of Students’ academic performance in blended learning. J. Educ. Technol. Soc. 2018, 21, 220–232. [Google Scholar]
  8. Costa, L.; Souza, M.; Salvador, L.; Amorim, R. Monitoring students performance in e-learning based on learning analytics and learning educational objectives. In 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT); IEEE: Maceio, Brazil, 2019; Volume 2161, pp. 192–193. [Google Scholar]
  9. Garcia-Cabot, A.; de-Marcos, L.; Garcia-Lopez, E. An empirical study on m-learning adaptation: Learning performance and learning contexts. Comput. Educ. 2015, 82, 450–459. [Google Scholar] [CrossRef]
  10. Yu, T.; Jo, I.H. Educational technology approach toward learning analytics: Relationship between student online behavior and learning performance in higher education. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, Indianapolis, IN, USA, 24–28 March 2014; pp. 269–270. [Google Scholar]
  11. Peña-Ayala, A. Learning Analytics: Fundaments, Applications, and Trends; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar]
  12. Shibani, A.; Knight, S.; Shum, S.B. Contextualizable learning analytics design: A generic model and writing analytics evaluations. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 210–219. [Google Scholar]
  13. Ellis, R.A.; Han, F.; Pardo, A. Improving learning analytics–combining observational and self-report data on student learning. J. Educ. Technol. Soc. 2017, 20, 158–169. [Google Scholar]
  14. Khine, M.S. Learning Analytics for Student Success: Future of Education in Digital Era. In Proceedings of the European Conference on Education, Brighton, UK, 29 June–1 July 2018. [Google Scholar]
  15. Na, K.S.; Tasir, Z. A systematic review of learning analytics intervention contributing to student success in online learning. In Proceedings of the 2017 International Conference on Learning and Teaching in Computing and Engineering (LaTICE), Hong Kong, China, 20–23 April 2017; pp. 62–68. [Google Scholar]
  16. Waheed, H.; Hassan, S.U.; Aljohani, N.R.; Hardman, J.; Alelyani, S.; Nawaz, R. Predicting academic performance of students from VLE big data using deep learning models. Comput. Hum. Behav. 2020, 104, 106189. [Google Scholar] [CrossRef] [Green Version]
  17. Nguyen, Q.; Huptych, M.; Rienties, B. Linking students’ timing of engagement to learning design and academic performance. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, Australia, 5–9 March 2018; pp. 141–150. [Google Scholar]
  18. Hilton, J. Open educational resources, student efficacy, and user perceptions: A synthesis of research published between 2015 and 2018. Educ. Technol. Res. Dev. 2019, 68, 853–887. [Google Scholar] [CrossRef] [Green Version]
  19. Ifenthaler, D.; Yau, J.Y.K. Utilising learning analytics to support study success in higher education: A systematic review. Educ. Technol. Res. Dev. 2020, 68, 1961–1990. [Google Scholar] [CrossRef]
  20. Seufert, S.; Meier, C.; Soellner, M.; Rietsche, R. A pedagogical perspective on big data and learning analytics: A conceptual model for digital learning support. Technol. Knowl. Learn. 2019, 24, 599–619. [Google Scholar] [CrossRef]
  21. Al-Doulat, A.; Nur, N.; Karduni, A.; Benedict, A.; Al-Hossami, E.; Maher, M.L.; Dou, W.; Dorodchi, M.; Niu, X. Making Sense of Student Success and Risk Through Unsupervised Machine Learning and Interactive Storytelling. In Proceedings of the International Conference on Artificial Intelligence in Education, Ifrane, Morocco, 6–10 July 2020; pp. 3–15. [Google Scholar]
  22. Er, E.; Gómez-Sánchez, E.; Dimitriadis, Y.; Bote-Lorenzo, M.L.; Asensio-Pérez, J.I.; Álvarez-Álvarez, S. Aligning learning design and learning analytics through instructor involvement: A MOOC case study. Interact. Learn. Environ. 2019, 27, 685–698. [Google Scholar] [CrossRef] [Green Version]
  23. Dawson, S.; Joksimovic, S.; Poquet, O.; Siemens, G. Increasing the impact of learning analytics. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 446–455. [Google Scholar]
  24. Herodotou, C.; Hlosta, M.; Boroowa, A.; Rienties, B.; Zdrahal, Z.; Mangafa, C. Empowering online teachers through predictive learning analytics. Br. J. Educ. Technol. 2019, 50, 3064–3079. [Google Scholar] [CrossRef]
  25. Gong, L.; Liu, Y. Design and application of intervention model based on learning analytics under blended learning environment. In Proceedings of the 2019 7th International Conference on Information and Education Technology, Aizu-Wakamatsu, Tokyo, Japan, 29–31 March 2019; pp. 225–229. [Google Scholar]
  26. Daud, A.; Aljohani, N.R.; Abbasi, R.A.; Lytras, M.D.; Abbas, F.; Alowibdi, J.S. Predicting student performance using advanced learning analytics. In Proceedings of the 26th International Conference on World Wide Web Companion, Perth, Australia, 3–7 April 2017; pp. 415–421. [Google Scholar]
  27. Joshi, A.; Desai, P.; Tewari, P. Learning Analytics framework for measuring students’ performance and teachers’ involvement through problem-based learning in engineering education. Procedia Comput. Sci. 2020, 172, 954–959. [Google Scholar] [CrossRef]
  28. Cirigliano, M.M.; Guthrie, C.; Pusic, M.V.; Cianciolo, A.T.; Lim-Dunham, J.E.; Spickard, A., III; Terry, V. “Yes, and…” Exploring the Future of Learning Analytics in Medical Education. Teach. Learn. Med. 2017, 29, 368–372. [Google Scholar] [CrossRef]
  29. Pérez, E.; Medrano, L.A.; Rosas, J.S. El Path Analysis: Conceptos básicos y ejemplos de aplicación. Rev. Argent. Cienc. Comport. 2013, 5, 52–66. [Google Scholar]
  30. Loehlin, J.C.; Beaujean, A.A. Latent Variable Models: An Introduction to Factor, Path, and Structural Equation Analysis. New York, NY, USA, 2016. [Google Scholar]
  31. Byrne, B.M. Structural Equation Modeling with AMOS: Basic Concepts, Applications, and Programming (Multivariate Applications Series); Taylor & Francis Group: New York, NY, USA, 2010; Volume 396, p. 7384. [Google Scholar]
  32. Kline, R.B. Principles and Practice of Structural Equation Modeling; Guilford Publications: New York, NY, USA, 2015. [Google Scholar]
  33. Kaplan, D. Structural Equation Modeling: Foundations and Extensions; Sage Publications: London, UK, 2008; Volume 10. [Google Scholar]
  34. Weston, R.; Gore, P.A., Jr. A brief guide to structural equation modeling. Couns. Psychol. 2006, 34, 719–751. [Google Scholar] [CrossRef]
  35. Jung, H.; Kim, Y.; Lee, H.; Shin, Y. Advanced instructional design for successive E-learning: Based on the successive approximation model (SAM). Int. J. E-Learn. 2019, 18, 191–204. [Google Scholar]
  36. Hu, L.; Bentler, P. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. Multidiscip. J. 1999, 6, 1–55. [Google Scholar] [CrossRef]
  37. Arbuckle, L. IBM SPSS Amos 25 User’s Guide; Amos Development Corporation: Crawfordville, FL, USA, 2017. [Google Scholar]
  38. Huang, S.Y.; Kuo, Y.H.; Chen, H.C. Applying digital escape rooms infused with science teaching in elementary school: Learning performance, learning motivation, and problem-solving ability. Think. Ski. Creat. 2020, 37, 100681. [Google Scholar] [CrossRef]
  39. Matcha, W.; Gašević, D.; Uzir, N.A.A.; Jovanović, J.; Pardo, A. Analytics of learning strategies: Associations with academic performance and feedback. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge, Tempe, AZ, USA, 4–8 March 2019; pp. 461–470. [Google Scholar]
  40. Gasevic, D.; Jovanovic, J.; Pardo, A.; Dawson, S. Detecting learning strategies with analytics: Links with self-reported measures and academic performance. J. Learn. Anal. 2017, 4, 113–128. [Google Scholar] [CrossRef]
  41. Broadbent, J. Academic success is about self-efficacy rather than frequency of use of the learning management system. Australas. J. Educ. Technol. 2016, 32, 38–49. [Google Scholar] [CrossRef] [Green Version]
  42. Şahin, M.; Yurdugül, H. An intervention engine design and development based on learning analytics: The intelligent intervention system (In 2 S). Smart Learn. Environ. 2019, 6, 18. [Google Scholar] [CrossRef]
  43. Amazona, M.V.; Hernandez, A.A. User Acceptance of Predictive Analytics for Student Academic Performance Monitoring: Insights from a Higher Education Institution in the Philippines. In Proceedings of the 2019 IEEE 13th International Conference on Telecommunication Systems, Services, and Applications (TSSA), Bali, Indonesia, 3–4 October 2019; pp. 124–127. [Google Scholar]
  44. Riley, R.M. e-Learning: Putting a World-Class Education at the Fingertips of All Children (The National Educational Technology Plan); US Department of Education, Office of Educational Technology: Washington, DC, USA, 2000.
  45. Cruz-Jesus, F.; Vicente, M.R.; Bacao, F.; Oliveira, T. The education-related digital divide: An analysis for the EU-28. Comput. Hum. Behav. 2016, 56, 72–82. [Google Scholar] [CrossRef]
Figure 1. Hypothesized model for the relationship between mentoring, motivation, communication, learning design, and learning performance.
Figure 2. Diagram of instruments (A), processes (B), and actions (C).
Figure 3. The final model for the relationship between Mentoring (MN), Motivation (MT), Communication and Feedback (CF), Learning Design (LD), and Learning Performance (LP).
Table 1. Demographic and descriptive statistics of the participants.
Item: Frequency (Percentage)
Gender
  Female: 58 (41.40%)
  Male: 82 (58.60%)
Age
  20–28: 32 (22.85%)
  29–37: 56 (40.00%)
  38–46: 27 (19.29%)
  47–55: 14 (10.00%)
  56+: 11 (7.86%)
Professional area
  Education: 23 (16.43%)
  Psychology: 17 (12.14%)
  Architecture: 11 (7.86%)
  Agronomy: 14 (10.00%)
  Engineering: 34 (24.29%)
  Social Work: 12 (8.57%)
  Art: 8 (5.71%)
  Economy: 21 (15.00%)
Table 2. Data collection methods for each variable.
Code | Suggested Independent Variable | Data Collection Method
VS1 | Total studying time in LMS | Calculating the total number of hours employed between login and logout.
VS2 | Interaction with virtual learning objects | Adding up the total interactions reported in the SCORM report of the virtual learning objects.
VS3 | Interaction in forums/wiki | Adding up the total reported interactions with the forum analytics plugin.
VS4 | Interaction with learning tasks | Adding up the total number of participants' task submissions through the submission distribution block.
VS5 | Learning Performance | Adding up all scores for assignments and assessments in the course.
VT1 | Communication and feedback | Counting the number of messages and task feedback.
VT2 | Mentoring | Counting the number of follow-up emails, phone calls, synchronous meetings, and participation in question forums.
VT3 | Learning Design | Counting the number of resources designed for the course, readjusted educational resources, enriched videos, complementary readings, and designed extracurricular activities.
VT4 | Motivation | Counting the number of motivation messages sent, extracurricular meetings, and game dynamics implemented.
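As an illustration of how a variable such as VS1 (total studying time in the LMS) might be derived from raw session logs, the following sketch sums login-to-logout durations. The log format, field layout, and values are assumptions for illustration only; the study's actual extraction was done through the LMS reports described in Table 2.

```python
# Hedged sketch: deriving total studying time (hours) from hypothetical
# (login, logout) timestamp pairs for one student.
from datetime import datetime

# Invented session log for one student.
sessions = [
    ("2020-03-02 18:00", "2020-03-02 19:30"),
    ("2020-03-05 20:15", "2020-03-05 21:00"),
]

def total_study_hours(sessions, fmt="%Y-%m-%d %H:%M"):
    """Sum the login-to-logout durations, in hours."""
    total = 0.0
    for login, logout in sessions:
        delta = datetime.strptime(logout, fmt) - datetime.strptime(login, fmt)
        total += delta.total_seconds() / 3600
    return total

print(total_study_hours(sessions))  # 2.25
```

The teacher-side counts (VT1 to VT4) follow the same pattern, aggregating event counts per course instead of durations.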
Table 3. Descriptive statistics for the study variables.
Variable Code | Mean | Standard Deviation | Minimum | Maximum
Student
VS1 | 19.6629 | 8.43861 | 17.70 | 45.10
VS2 | 60.4714 | 20.64098 | 21.00 | 101.00
VS3 | 290.1286 | 150.75337 | 19.00 | 523.00
VS4 | 204.9857 | 71.73997 | 38.00 | 430.00
VS5 | 80.7214 | 15.89561 | 60.00 | 99.00
Professor
VT1 | 121.2857 | 21.59317 | 89.00 | 164.00
VT2 | 13.2571 | 2.83705 | 10.00 | 22.00
VT3 | 18.3143 | 1.73313 | 18.00 | 20.00
VT4 | 11.7143 | 1.28698 | 11.00 | 17.00
Table 4. Correlational analysis between learning behaviors and learning performance.
Learning Behavior Variable | Spearman's Rho | p-value
Total studying time in LMS | 0.351 | 0.006
Interaction with the LMS | 0.332 | 0.005
Interaction with virtual learning objects | 0.653 | 0.003
Interaction in forums/wiki | 0.405 | 0.001
Interaction with learning tasks | 0.410 | 0.015
Table 5. Correlation analysis for model variables.
Variable | MN | MT | CF | LD
Motivation (MT) | 0.472 | - | - | -
Communication and Feedback (CF) | 0.505 | 0.673 | - | -
Learning Design (LD) | 0.711 | 0.567 | 0.458 | -
Learning Performance (LP) | 0.379 | 0.525 | 0.493 | 0.528
Table 6. Research hypotheses results. (*** indicates p < 0.001).
Research Hypothesis | Standardized Estimate β | Result
H1: Mentoring explains the levels of Learning Design. | 0.57 *** | Confirmed
H2: Motivation explains the levels of Learning Design. | 0.57 *** | Confirmed
H4: Learning Design explains Learning Performance. | 0.35 *** | Confirmed
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
