Article

Learning Analytics: A View on the Design and Assessment of Asynchronous Online Discussions for Better Teaching Performance

by Lorea Fernández-Olaskoaga 1,*, Montse Guitert Catasús 2, Teresa Romeu Fontanillas 2 and Juan Pedro Cerro Martínez 2
1 Faculty of Education, Philosophy and Anthropology, University of the Basque Country, 20018 Donostia-San Sebastián, Spain
2 Psychology Studies and Educational Sciences, Universitat Oberta de Catalunya, 08018 Barcelona, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(10), 1064; https://doi.org/10.3390/educsci13101064
Submission received: 14 July 2023 / Revised: 28 September 2023 / Accepted: 10 October 2023 / Published: 23 October 2023
(This article belongs to the Special Issue New Technology Challenges in Education for New Learning Ecosystem)

Abstract

In recent years, the impact of learning analytics has been investigated and explored in higher education contexts. This article aims to show how its application in online educational contexts provides strong support for teaching performance, especially in relation to the methodology applied, the monitoring of students' interactions and participation, and the evaluation of activities, and how it can favor improvements in student performance and satisfaction. We present the results obtained from the use of the DIANA (DIAlog ANAlysis) tool designed for the research project "Use of learning analytics in digital environments: impact on the improvement of university teaching practice" (LAxDigTeach-21085GE). This study followed a mixed (qualitative and quantitative) methodology to better complement the data provided by learning analytics, given that the numerical data must be meaningful in the specific context in which they are collected. The results obtained are in line with previous research and show that the use of learning analytics has helped to improve teaching performance in relation to monitoring student interactions, participation, and evaluation, but is limited in terms of improving student performance and satisfaction. No generalized conclusions can be drawn as yet, given that the research project of which this study is a part has only recently completed the pilot stage and we have analyzed the information obtained from only one of the participating subjects.

1. Introduction

The incursion of big data has produced a new “hyperbolic” phenomenon related to the datafication of society [1], which is having a direct impact on higher education [2,3,4]. Big data have been presented with great enthusiasm as the new engine of an intensive knowledge economy based on data mining techniques and artificial intelligence mechanisms for the generation of automated processes tailored to the user whose data are traced [5]. It is what we call data science, an open term applicable to many disciplines, understood as the set of techniques and principles that regulate the collection and measurement of data on a given process.
In the field of education, data mining techniques have generated powerful movements, among which learning analytics particularly stands out [6,7,8]. Along these lines, we define learning analytics as "the measurement, collection, analysis, and reporting of data about learners and their contexts in order to understand and optimize learning and the environments in which it occurs" [9] (p. 8). This is why it is considered an emerging field of research in education, one that is expanding with the rise of blended and online teaching and learning. Learning analytics is a powerful tool that can provide information on the interaction processes between students in digital learning environments [10,11]. In addition, teachers need to track students and evaluate learning activities, and learning analytics facilitates both tasks [12].
Learning analytics offers information to teachers through the analysis and interpretation of data, allowing them to improve educational practice and optimize student performance. In this sense, it enables teachers to make a methodological change in the way they communicate and relate to students, enhancing their role as facilitator, guide, counselor, or evaluator; offering more objective and detailed feedback; facilitating a true learning experience; and, consequently, improving teaching and students' academic results [13].
Nevertheless, there are voices that highlight certain critical aspects of the use of learning analytics. Some authors such as Prinsloo [14] consider that the most important challenge today is not the access to information or its exploitation by means of powerful algorithms, but rather the use of these data and the very fact of their mere existence, which can make a system vulnerable in ways that we cannot anticipate.
On the other hand, it should be noted, in relation to the quality of higher education [15], that one of the key indicators refers to interactions in the shared spaces of digital environments, which is reinforced by the existing literature establishing a positive relationship between interaction and learning performance in online education models [16,17]. Another key element in online education is the teacher's presence (cognitive, social, teaching) in digital environments, manifested in monitoring and feedback, which becomes crucial for the students' learning progress [18,19].
In an online educational model, in line with Goodyear [20] and Guàrdia [21], it is essential to design the training project using a methodology that guides decision making for each of the elements that make up the course. Everything must be ready in the virtual environment before the teaching and learning process begins: planning, activities, resources, tools, modeling, and evaluation criteria and instruments. In an online learning environment, two key elements keep students engaged and motivated: the first is the collaborative processes between students [22,23], and the second is a continuous, complex, and diversified assessment [24,25] supported by a feedback process [26,27] focused on improving the student's learning process.
In this context, this article shows the results of a case study with a control group and an experimental group on the improvement in the design, monitoring, assessment, and evaluation of virtual discussions using software based on web technologies, called DIANA (DIAlog ANAlysis), a learning analytics tool designed ad hoc. This tool was created to calculate a set of metrics and to provide teachers with information on student performance (number of responses, popularity, temporality of messages, level of participation, dispersion, etc.) when students interact with each other by exchanging messages in asynchronous online discussions. This activity is one of the most common activities in online and hybrid environments, and one of the most complex to evaluate [28]. The DIANA tool was used with two clear objectives: on the one hand, as a learning analytics tool to support the teaching process, used only in the experimental group; on the other hand, as a data collection instrument for the analysis of the communicative interaction between students.
In line with this context, this article shows the results of applying a tool designed for the Moodle environment (DIANA), whose contributions are oriented toward improving the teaching–learning experience itself through an appropriate design of the subject that supports the personalization of learning. Furthermore, this experience also addresses the assessment and evaluation of online learning. This study is framed within the project "Use of learning analytics in digital environments: impact on the improvement of university teaching practice" (LAxDigTeach-21085GE), one of whose objectives is to analyze the academic impact of applying learning analytics in university digital environments.
To address this objective, we proposed the following research questions:
  • What changes does the DIANA tool facilitate in the design, monitoring, and assessment of asynchronous discussions for teachers?
  • What does learning analytics contribute to the improvement in student performance?

2. Materials and Methods

2.1. Research Context and Sample

The context of this study is the Master's degree in Digital Education coordinated by the University of Extremadura in Spain during the 2022/2023 academic year. The master's degree is taught virtually and is focused on research (i.e., it provides the basis for developing research skills such as research methods) as well as on critical and independent thinking in the field of educational technology.
Two of the master’s subjects were selected because they included an asynchronous online discussion as an assessment activity. We therefore contacted the teachers of these subjects and invited them to participate in the pilot study as part of the control and experimental groups. Table 1 shows the duration of the subjects and the period in which the discussion activity took place.
The sample group participating in the subjects was made up of 20 students (the same number in each subject) and two teachers (one for each subject). The pilot study aimed to analyze the impact of using learning analytics in a subject without using the DIANA learning analytics tool (control group) and another using the tool (experimental group) in two different discussions.
Since the discussions took place in two different subjects that did not run simultaneously, the two teachers coordinated the design and presentation of the discussion activity and the information provided to the students in the virtual space. With the intention of reproducing the development of the activity under the same or very similar conditions, the following decisions were taken according to the indications of the research team:
  • The duration of the discussion would be between 10 and 15 days starting in the second week of the course.
  • The discussion would consist of three discussion threads with a start and end date for each thread.
  • At the end of the discussion there would be a final contribution by the teacher as a summary with the final conclusions.

2.2. Research Methodology

In the field of educational research, understanding student behavior during collaborative activities is fundamental to improving the teaching and learning processes. This study highlights the importance of using a diversity of research methods and tools that allow for a comprehensive analysis of the results integrating quantitative and qualitative perspectives.
Qualitative research methodology is especially valuable because it allows researchers to study the complexity of the interaction occurring between students in the virtual environment, and thus understand the factors that influence their performance and decision making. By employing qualitative methods such as participant observation, researchers can capture data that help build a global picture of the experiences involved in the main study. However, the inclusion of quantitative data can bring an additional dimension to qualitative research methodology by providing empirical, objective [29], and measurable information. These data were obtained from instruments such as questionnaires and measurements made with ad-hoc analytical tools and statistical analyses that provide a broader and more generalizable view of the participants’ experiences.
By combining quantitative data with a qualitative perspective, we can obtain a more complete understanding of the phenomena studied and a greater triangulation of results [30]. This combination allowed us to validate and enrich the findings as well as generate more solid and applicable recommendations for the improvement of educational practices and the design of more effective pedagogical interventions. Moreover, using both viewpoints in this study provides a hybrid research approach that allows researchers to address complex questions from different perspectives and levels of analysis. This combination strengthens the validity and reliability of the findings by integrating different forms of knowledge.

2.3. Data Collection Tools and Procedure

Before starting with the pilot phases, in September 2022, training sessions were held with all participating teachers. This training was essential to understand the options of the DIANA tool and how to use it. Different tools were used for data collection. In light of the research questions, we considered the perspective of both the teachers and the students to create the questionnaires.
Data were collected before and after each pilot (control and experimental groups). In the first phase, the evaluation questionnaire was sent to the participating teachers beforehand in order to obtain their profile as online teachers. This enabled us to compare the teachers' knowledge and use of the student data available in the teaching and learning platform for monitoring and evaluating the asynchronous online discussion.
In a second phase, once the pilots had been completed, information was collected from the students, since they had to evaluate, by answering a few simple questions, their level of satisfaction based on the feedback provided by the teacher on their participation in the evaluation activity. In this same phase, and only for the experimental pilot, XML data were collected from the metrics reported by the DIANA tool in the online discussion monitored with the help of learning analytics.
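As an orientation only, the short Python sketch below shows how an XML metrics export of this kind could be read into a per-student dictionary; the element and attribute names (report, student, metric, value) are assumptions made for illustration and do not reproduce the actual DIANA export schema.

import xml.etree.ElementTree as ET
from collections import defaultdict

def parse_metrics(xml_text: str) -> dict:
    """Parse a hypothetical per-student metrics export into {student_id: {metric: value}}."""
    root = ET.fromstring(xml_text)
    metrics = defaultdict(dict)
    for student in root.findall("student"):
        for metric in student.findall("metric"):
            metrics[student.get("id")][metric.get("name")] = float(metric.get("value"))
    return dict(metrics)

# Illustrative input only; the real export structure may differ.
sample = """<report>
  <student id="s01"><metric name="messages" value="3"/><metric name="popularity" value="7.32"/></student>
  <student id="s02"><metric name="messages" value="2"/><metric name="popularity" value="4.88"/></student>
</report>"""
print(parse_metrics(sample))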

2.3.1. Teacher Profile Evaluation Questionnaire

Before starting the pilot phase of the research, all of the participating teachers had to complete a questionnaire to evaluate their professional profile as online teachers. The purpose of this questionnaire was to understand their professional teaching experience in the field of online education as well as their experience in the use of technological tools for online assessment and collaborative learning. The questionnaire was divided into the following blocks: (1) basic demographic data, (2) degree of application of the typologies of collaborative activities, (3) degree of application of different online assessment tools, (4) frequency of use of Moodle activities and resources, and (5) use of data mining techniques using Moodle registration.

2.3.2. Learning Analytics Tool Evaluation Questionnaire (for Teachers)

In the experimental group, it was also necessary to collect information on the use of the learning analytics tool used for the monitoring and evaluation of the asynchronous online discussion. This questionnaire evaluated the following variables of the observed phenomenon:
  • Level of importance of indicators and metrics: Teachers were asked about the importance they gave, based on their own professional experience, to each of the metrics calculated by the analytical tool; the objective was to learn which metrics had the greatest impact on the teaching process.
  • Descriptors of the transversal indicators: Transversal indicators are those that show the attitudes and behaviors of the students during the development of the evaluation activity. For these indicators, it is difficult to find data that facilitate their measurement, since they deal with moral and ethical aspects of the student’s performance. The teachers were therefore asked to analyze which of the previously used metrics could be used to measure the cross-cutting indicators by means of a data cross-tabulation table.
  • Finally, responses were collected to questions on the transfer of learning analytics to professional teaching practice, the assessment of the instrument used in the asynchronous online discussion activity, and the possible applications of learning analytics as perceived by the teachers involved in the study.

2.3.3. Student Satisfaction Questionnaire

Another important element of the project was to learn the students' degree of satisfaction with the feedback received from the teacher in both the control and the experimental group. A questionnaire was therefore designed in which students rated two questions from 1 to 5 (where 1 is the lowest and 5 the highest rating): the first on their degree of satisfaction with the evaluation, and the second on the feedback received from the teacher.

2.3.4. Learning Analytics Digital Tool

This experience required the design and programming of an IT solution based on web technologies as a tool for collecting data on student activity in the asynchronous online discussion activity. The digital tool created automatically analyzes the messages exchanged by students in the communication spaces of the Moodle learning environment. To design this tool, called DIANA (DIAlog ANAlysis), we used the list of metrics conceptualized in [28], which describes a total of 21 metrics for evaluating different aspects of collaborative learning, which we call indicators.
The metrics reported by DIANA use reference values that can be customized by each teacher. The configuration panel shown in Figure 1 requires prior authentication, thus protecting the system from unauthorized access and complying with the European General Data Protection Regulation [31].
The configuration panel defines parameters such as the list of words that should be part of the semantic field of the conversation and the number of times these keywords should appear in the message exchange. Parameters can also be specified such as the minimum and maximum number of messages allowed, the maximum dispersion rate of the conversation based on the threads that form part of the discussion, and the maximum number of days a student can go without posting messages. Once the outline of the analysis to be performed in DIANA is defined, the metrics are grouped according to whether they are group metrics or individual metrics related to the performance of each student (Figure 2).
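To make these parameters more tangible, the following is a minimal sketch of how such a configuration could be modeled in Python; the field names and defaults are illustrative assumptions and are not taken from DIANA's internal implementation.

from dataclasses import dataclass, field

@dataclass
class DiscussionAnalysisConfig:
    """Illustrative parameter set mirroring the options described for the configuration panel."""
    keywords: list = field(default_factory=list)  # semantic field of the conversation
    min_keyword_hits: int = 1                     # times the keywords should appear in the exchange
    min_messages: int = 2                         # minimum number of messages allowed per student
    max_messages: int = 3                         # maximum number of messages allowed per student
    max_dispersion_rate: float = 0.15             # maximum dispersion across discussion threads
    max_inactive_days: int = 2                    # maximum days a student can go without posting

config = DiscussionAnalysisConfig(keywords=["paradigm", "research", "methodology"])
print(config)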

2.4. Ethical Implications

In order to carry out the research, all students participating in the study were given a consent form to sign in the event that they wanted to participate in the pilot study. All of the students signed this consent form.

3. Results

In this section, we show the results obtained in relation to the communicative interaction of the experimental group, as provided by the DIANA tool, in each of the conversation threads and for the debate as a whole. We also describe the changes that the use of the tool has entailed for the design of the subject from a methodological standpoint and its implications for the evaluation process from the teacher's perspective.

3.1. Methodological Possibilities for Teachers in the Experimental Group

In order to develop virtual teaching, it is essential to start from a design of the subject specific to the context in which it is taught. The Master's degree in Digital Education has been running for nine years, and the guidelines for the design of the subjects follow a methodology based on webquest [32], where the teachers choose the tasks proposed for the students. In the 2022/23 academic year, three tasks were proposed in the subject: firstly, individual participation in a discussion forum to analyze the research perspectives in educational technology and their possibilities; secondly, work in small groups of five people to identify, by reading articles, the research perspective applied in them; and, finally, individual work in which each student must choose a topic and draw up a draft justifying the possible investigation and the research perspective to be applied. This last activity is designed to make students start thinking about their master's degree work.
Including the discussion forum activity has enabled a highly significant methodological change in the subject: a restructured design of the subject itself. The virtual discussion has thus become the central axis of the subject, while the remaining activities have taken place around it, making the subject and its activities more coherent than in previous courses. The subject has always had three activities to be performed by students; however, the activities were previously independent from one another and failed to feed into one another adequately. In this pilot experience, with the new design, the subject has gained in internal coherence and the activities unfold more naturally, complementing one another and progressively adding demand and depth over the weeks.
This restructuring has been crucial for specifying and defining the aspects to be analyzed with the DIANA tool: (1) the specific topic around which the discussion would proceed, on this occasion the analysis of the different research perspectives in educational technology; (2) the number of threads making up the discussion. For each of the threads, the following had to be specified: the duration of the discussion; the minimum and maximum number of messages per student (between 2 and 3); the semantic field of the conversation (between 20 and 25 terms); the severity rate of the semantic control (10%); the maximum dispersion rate of the conversation (15%); and the maximum inactivity time (2 days) (Figure 3).
It is important to note that defining some of the parameters listed above was not an easy task due to the lack of previous experience with the tool. Hence, to set the severity rate and the maximum dispersion rate, we followed the recommendations of the research team, taking care that they did not exceed 15%. (3) How to present the activity in the virtual space, with specific indications on how to participate in the debate provided by means of an infographic (Figure 4).
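For reference, the shared set-up used for the three threads of this pilot can be restated as plain data; the dictionary below simply collects the values reported above and is not an export from the tool.

# Parameters applied to every discussion thread in the pilot, as reported above.
pilot_thread_parameters = {
    "number_of_threads": 3,
    "min_messages_per_student": 2,
    "max_messages_per_student": 3,
    "semantic_field_size": "20-25 terms",
    "severity_rate_semantic_control": 0.10,
    "max_dispersion_rate": 0.15,
    "max_inactivity_days": 2,
}
print(pilot_thread_parameters)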
The restructuring of the course has led to a second fundamental change related to the methodology: the monitoring of the discussion through the three threads and, consequently, the assessment and evaluation of the activity itself. The results provided by the DIANA tool have facilitated both tasks. The tool offers data according to the parameters previously defined by the teacher. These parameters reflect the preferences and level of demand the teacher sets for the subject, so considering these data enriches the interpretation of the qualitative results, achieving a more objective and accurate assessment and evaluation of the virtual discussion. The data provided by DIANA have made it possible to view student interaction and participation in different ways, for example, in the shape of word clouds, tables, bar charts, and node graphs. In addition to this, the students' messages were read during the threads, and specific contributions were made to continue encouraging participation and dialogue, or otherwise, to redirect the thread.
At the end of the first and second discussion threads, individual messages created by the tool were sent to each of the students showing the type of participation they had had, with the intention of seeking an improvement in the following threads and thus enabling the teacher to assess the process throughout the activity [33]. This feedback was received very positively, as one student commented: "Hi Lorea. Thank you very much for the feedback. It is very helpful to receive it during the development of the course, as it allows us to improve. I will try to improve my participation in the debate. Thank you and best regards!". It should be noted that these messages had a direct impact on the following discussion thread in two ways: (1) they increased the participation of students whose participation had been low until then, and (2) they maintained or increased the participation of the most participative students. At the end of the third thread, and therefore of the activity, individual messages were also sent and, in addition, the students were provided with a summary document of group participation with some of the data provided by DIANA. In general, thanks to the different visualization sources offered by the tool, it was possible to carry out the evaluation in a simpler, more effective, and more confident way than on previous occasions.
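As an illustration of how participation-based feedback messages of this kind might be composed, the sketch below derives a short note from a student's message count; the thresholds and wording are hypothetical and do not reproduce the messages actually generated by DIANA.

def feedback_message(name: str, messages_sent: int, minimum: int = 2, maximum: int = 3) -> str:
    """Compose a short feedback note from the number of messages a student has posted so far."""
    if messages_sent < minimum:
        level = "below the minimum expected"
        advice = "try to reach at least the minimum number of messages in the next thread"
    elif messages_sent > maximum:
        level = "very participative"
        advice = "keep up this level of contribution in the next thread"
    else:
        level = "within the expected range"
        advice = "consider replying to your peers to keep the dialogue going"
    return f"Hi {name}, your participation so far is {level} ({messages_sent} messages); {advice}."

print(feedback_message("Student A", 1))  # hypothetical student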
For the final evaluation of the discussion activity, these previous steps were very important: the evaluation process was simpler and was conducted with greater confidence than on other occasions, thanks to the different visualization sources offered by the tool and the monitoring carried out. In the next section, we show the results of this monitoring with the DIANA tool.

3.2. Results Provided by DIANA in Relation to the Communicative Interaction of the Experimental Group

In this pilot study, use of the DIANA tool was limited to the experimental group teacher. The control group teacher did not use DIANA, meaning that the students were not assessed based on the tool’s statistics.
We now present the results of the experimental group in relation to communicative interaction. The results are presented for each of the discussion threads carried out in the virtual debate and in relation to the following indicators: (1) participation in communicative interaction, measured by the number of messages contributed; (2) encouragement of dialogue and negotiation, through the answers sent and the degree of popularity among peers; (3) the communicative style, considering the number of words written; (4) constancy and regularity in group interaction over the days; and (5) the exchange of information within the group, with contributions of external links and/or attached documents. Subsequently, the overall results are shown for the three discussion threads together.
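As a minimal sketch of how these five indicators could be derived from the message records of a thread, the code below computes them for one student; the record structure and field names are assumptions for illustration and do not reflect how DIANA stores forum data.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Message:
    author: str
    reply_to: Optional[str]  # author being answered, or None for a thread-opening message
    words: int
    day: date
    links: int = 0
    files: int = 0

def student_indicators(messages, student):
    """Derive the five indicators described above for a single student."""
    own = [m for m in messages if m.author == student]
    received = [m for m in messages if m.reply_to == student]
    return {
        "messages_sent": len(own),                                            # (1) participation
        "replies_received": len(received),                                    # (2) dialogue / popularity
        "average_words": sum(m.words for m in own) / len(own) if own else 0,  # (3) communicative style
        "active_days": len({m.day for m in own}),                             # (4) constancy and regularity
        "links_and_files": sum(m.links + m.files for m in own),               # (5) information exchange
    }

# Illustrative messages only.
thread = [
    Message("s01", None, 310, date(2022, 12, 5)),
    Message("s02", "s01", 250, date(2022, 12, 6), links=1),
]
print(student_indicators(thread, "s01"))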

3.2.1. First Discussion Thread

This first thread took place from 5 to 9 December 2022. The students had to discuss the main characteristics of the different research perspectives in Educational Technology. Of the 20 students, 19 participated, and the number of answers provided was 41 (including two from the teacher), showing a 100% level of dialogue. The messages were distributed as shown in Figure 5.
The number of words used per student in this thread ranged from 160 to 518, giving an average of 314. As can be seen in Figure 5, participation was constant and regular throughout these days. It should be noted that the second and fourth days (6 and 8 December) were holidays in Spain, which may have affected participation; note also that the last day of the discussion was one of the days with the least participation.
Figure 6 shows how four students were particularly active in their participation (the fifth one was the teacher), contributing more messages or answers than those defined by the teacher in the initial configuration (the minimum was two and the maximum three). These four students made between three and four contributions, earning them the highest levels of popularity among their classmates, with one of them standing out with 9.76% compared to the other three, who obtained a level of popularity of 7.32%. Another four students only made one contribution, achieving low popularity (2.44%) as well as a low level of participation. The remaining 11 students complied with the minimum of two contributions and their level of popularity remained steady at 4.88%.
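For orientation, the popularity percentages reported above are consistent with each student's share of the 41 messages in the thread (4/41 ≈ 9.76%, 3/41 ≈ 7.32%, 2/41 ≈ 4.88%, 1/41 ≈ 2.44%); the short check below simply reproduces that arithmetic and is an assumption about how the figure is normalized, not a description of DIANA's formula.

total_messages = 41  # messages in the first thread, including the teacher's two
for contributions in (4, 3, 2, 1):
    print(f"{contributions} contribution(s) -> {contributions / total_messages:.2%}")
# Output: 9.76%, 7.32%, 4.88%, 2.44%, matching the reported popularity levels.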
Finally, regarding the files or links attached by each student, no contributions were made in this thread.

3.2.2. Second Discussion Thread

This second thread took place between 10 and 14 December. A total of 19 students participated, and this time, the discussion focused on the pros and cons of each of the research perspectives. This discussion thread was more productive in terms of the number of contributions made by the students, with a total of 65 answers distributed as follows over the days (Figure 7).
As can be seen in Figure 7, participation on the first two days of the discussion was lower than on the last three days, with more contributions being made from the third day onward and remaining regular until the end of the debate. Given that there had been no change in the initial configuration of the debate (a minimum of two interventions and a maximum of three), the students were more active in this thread. Thus, the number of more participative students increased to seven, while the number of those who did not reach the minimum of two messages decreased to only two (Figure 8).
As for the number of words, the figures were similar to those of the previous thread. The student who participated most wrote 672 words and the one who participated least 135, giving an average of 286 words, and only three students wrote fewer than 200 words. Finally, although no files were attached by the students in this thread, a total of 20 external links were contributed. Especially noteworthy here is one student's contribution of nine external links.

3.2.3. Third Discussion Thread

The last thread ran from 15 to 18 December. This thread consisted of drawing conclusions on which research perspective the students considered could help them develop a good master's thesis. This time, all 20 students participated, with a total of 68 responses, distributed as shown in Figure 9.
As we can see in Figure 9, the distribution of messages was somewhat irregular over the four days: 60% of them were made on the last day, meaning that participation was concentrated on that day. In this discussion thread, the number of highly participative students increased with respect to the two previous threads: seven students were particularly participative, while participation by another five was low (Figure 10).
Regarding the number of words, the average in this last thread was 178, and this time, the external links shared by students increased significantly compared to the previous thread, with 29 external links and two files attached.

3.2.4. General Results Considering the Three Discussion Threads

In relation to participation in communicative interaction, all students participated, with a total of 177 messages analyzed, giving a homogeneity rate of 57.86%, which, while it showed irregular participation by the students, was not excessively skewed from the average. If we focus on the promotion of dialogue, a total of 172 responses were collected, yielding a dialogue level of 97.73%. Figure 11 shows the social network analysis of the students' interactions in the forum. The graph represents one chain per thread of the forum, and the larger nodes highlight messages that received more responses, reflecting their impact on the conversation.
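A node graph like the one in Figure 11 can be reproduced in outline from the reply structure of the forum; the sketch below builds a directed reply graph and ranks messages by the number of responses received. It assumes a simple list of (message_id, parent_id) pairs, which is not necessarily how DIANA represents the data.

import networkx as nx

# Hypothetical (message_id, parent_id) pairs; parent_id is None for a thread-opening message.
replies = [("m2", "m1"), ("m3", "m1"), ("m4", "m2"), ("m5", "m1")]

graph = nx.DiGraph()
for child, parent in replies:
    if parent is not None:
        graph.add_edge(child, parent)  # an edge points from a reply to the message it answers

# Messages with the highest in-degree received the most responses;
# these would appear as the larger nodes in a Figure 11-style visualization.
most_answered = sorted(graph.in_degree, key=lambda pair: pair[1], reverse=True)
print(most_answered)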
Overall, a total of two files were exchanged and 52 external links collected, meaning that the students became strongly involved in justifying their contributions and responding to the thread with additional resources not available in the virtual space. The search for additional information is a clear indicator that the exchange of information encourages others to make contributions. Finally, in Figure 12, we show a list of the 75 terms most frequently used in the discussion. The most significant ones are highlighted in green because they were included in one of the three semantic fields previously defined by the teacher.
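The term list in Figure 12 corresponds, in essence, to a frequency count over the exchanged messages, with terms from the teacher-defined semantic field highlighted; the sketch below illustrates that count with purely illustrative example sentences and a simplified tokenization.

from collections import Counter
import re

semantic_field = {"paradigm", "research", "methodology", "positivist", "critical"}  # teacher-defined terms
messages = [
    "The positivist paradigm relies on a quantitative methodology.",
    "A critical paradigm questions the neutrality of research.",
]

tokens = re.findall(r"[a-záéíóúñ]+", " ".join(messages).lower())
frequencies = Counter(tokens)

for term, count in frequencies.most_common(10):
    marker = "*" if term in semantic_field else " "  # '*' marks terms in the semantic field
    print(f"{marker} {term}: {count}")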

3.3. Comparison between the Control and the Experimental Groups

The information available to compare the control and experimental groups in relation to academic performance and student satisfaction was limited, since this is only a pilot within the overall research and we do not yet have information from further pilots to make a more complete comparison. Regarding academic performance and student satisfaction with the evaluation and feedback received from the teacher, it should be remembered that the control and experimental groups took place in two different subjects with two different teachers. Thus, although student performance in general was good or very good in both subjects, the distribution of grades was different (Figure 13 and Figure 14). In relation to satisfaction, the same trend was observed as in the study by Cerro Martínez et al. [28], where the students who were monitored using learning analytics showed a higher degree of satisfaction than the students in the control group. In this study, the median of the experimental group was one point higher than that of the control group.
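The satisfaction comparison above rests on the median of the 1-to-5 ratings collected in each group; a minimal sketch of that comparison is shown below, using placeholder ratings that are purely illustrative and not the responses collected in the pilot.

from statistics import median

# Placeholder ratings (1-5); illustrative only, not the pilot's actual data.
control_ratings = [3, 4, 3, 4, 4]
experimental_ratings = [4, 5, 4, 5, 4]

difference = median(experimental_ratings) - median(control_ratings)
print(f"Median difference (experimental - control): {difference}")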

4. Discussion

The context in which this pilot was developed was based on the experience of a teacher who had had no previous contact with learning analytics. This study therefore serves to contrast its findings with those of previous studies regarding what learning analytics means for monitoring and evaluating student collaboration and interaction in the tasks proposed in the subject. Conducting collaborative activities in digital environments makes it possible to collect evidence on interactions and on the process of shared construction [13].
One of the biggest challenges for teachers in virtual environments is the monitoring and continuous assessment [33,34] of students' collaborative activities, as noted in the study by Cerro Martínez et al. [35]. It is therefore essential to have tools that help to visualize the collaborative action of students, so that the data offered can help teachers better interpret how this action develops and enable them to act in the teaching–learning process. It should be noted that the teaching action is shaped by the monitoring carried out thanks to the pre-analysis performed with DIANA, and that the tool itself enables personalized feedback [36] based on the specific work undertaken by the students, which has a positive effect [37] on involvement and improvement in subsequent actions. This was seen in the discussion threads of the experimental group, where the level of student participation kept growing, with an increasing number of participants showing a "highly participative" level in each thread [36]. In this respect, the students who participated most are the ones who obtained the highest grades, as shown by Cerro Martínez et al. [35] in their previous study.
Online virtual discussions are one of the most widely used activities for enabling student interaction and collaboration; however, to develop such an activity correctly, it is essential to have a good design [38] that specifies the key elements so that, through learning analytics, good monitoring and evaluation of the activity can be achieved [39]. We must not forget that the underlying theory is data mining [40], which, transferred to the educational context, becomes educational data mining, where, through specific methodologies and tools, the aim is to study educational issues, as in the case of this article. The pedagogical component comes into play when the data generated in collaborative interactions need to be given a meaning and an interpretation; hence, the use of learning analytics acquires primary value, given that it helps both teachers and institutions to improve their work by informing decisions and creating predictive models adapted to their specific contexts [41].
The information provided by learning analytics is not self-sufficient, but it empowers teachers by providing them with relevant information for their professional practice in both hybrid and online models. This means that they can offer students more objective and detailed suggestions that can be useful as feedback, which will facilitate a true learning experience and, consequently, improve teaching and the students' academic results. The feedback offered by teachers enables students to self-regulate [42] and will ultimately improve their performance.

5. Conclusions and Future Research Lines

Summing up and returning to the initial questions ("What changes does the DIANA tool enable in the design, monitoring, and assessment of asynchronous discussions for teachers?" and "What does learning analytics contribute to the improvement of student performance?"), we can say, with regard to the subject design and methodology, that using DIANA has given greater internal coherence to the proposed tasks. DIANA has also produced a more progressive development of the tasks that make up the contents. Likewise, the discussion has been the central axis of the subject, while the remaining tasks have been based on the aspects shared in the discussion threads, thereby enabling students to proceed with a collaborative and accompanied development of the tasks. On the other hand, DIANA has enabled combined and complementary monitoring and visualization of the quantitative metrics and indicators together with the contributions made during the discussion, facilitating faster and more objective assessment, supervision, and evaluation of the task than would have been possible without it. As a first pilot experience, the parameters defined by the teacher should be reviewed and adjusted for future editions, for example, the definition of the semantic field. Although the severity rate remained below the 15% established in each of the threads, a better selection of terms should be made, discarding those that were not reflected in the conversation and were therefore not significant. The tool parameters can therefore be adjusted in the coming years, depending on the teacher's requirements.
Regarding the contribution of learning analytics to improving student performance, this can be seen in individualized commitment and the desire to improve participation in future interventions. After each discussion thread, each student was given feedback generated by DIANA on their participation, with the intention of encouraging them either to continue or to improve in subsequent threads. While we cannot conclude that this feedback was the direct reason for the improved levels of participation, the students did appreciate receiving it and found it motivating. Although this feedback helped to create closer contact with the students, we are aware that this pilot study was limited by the small number of teachers and students in the sample. It would therefore be interesting to contrast and compare these results with more students and teachers.
Finally, as future research lines, it is necessary to go further into one underlying aspect not addressed in this study: the impact of analytics on education from a critical viewpoint. A series of aspects will therefore have to be taken into account, such as (a) the purposeful design of the tool for teachers, who have criticized it [43] for not matching their needs; (b) if students are involved in the process, they should be allowed access to these analytics during the development of the training action; and (c) the diversity of existing analysis methods does not favor the generalized implementation of learning analytics from a standard point of view; each context requires a specific technical approach, and this lack of standardization can introduce measurement error that goes unaccounted for [44]. At the same time, work could begin on adopting a predictive perspective on the results of the analyses, as proposed by Herodotou et al. [45].

Author Contributions

Conceptualization: M.G.C. and T.R.F.; Data curation, L.F.-O. and J.P.C.M.; Formal analysis, L.F.-O.; Methodology, J.P.C.M. and L.F.-O.; Writing—original draft preparation, L.F.-O., M.G.C., T.R.F. and J.P.C.M.; Writing—review and editing, L.F.-O., M.G.C., T.R.F. and J.P.C.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Innovation of Spain in the competition “Proyectos I+D+I 2020 Retos Investigación y Generación de Conocimiento” under the Project titled “Uso de analíticas de aprendizaje en entornos digitales: impacto en la mejora de la práctica docente universitaria (LAxDigTeach)”, grant number PID2020-115115GB-I00.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

For more information about the project, the DIANA tool, and the experimental pilots, consult https://laxdt.uoclabs.uoc.es (accessed on 1 September 2021) and https://edulab.uoc.edu (accessed on 1 September 2021). Learning analytics tool evaluation questionnaire (for teachers): https://uocuniwide.eu.qualtrics.com/jfe/form/SV_eIMciQFnjtGcfxs (accessed on 15 September 2022). Student satisfaction questionnaire: https://uocuniwide.eu.qualtrics.com/jfe/form/SV_88O98xWQIJTPTFQ (accessed on 29 September 2022).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Van Dijck, J. Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveill. Soc. 2014, 12, 197–208. [Google Scholar] [CrossRef]
  2. Raffaghelli, J.E. Educators’ data literacy supporting critical perspectives in the context of a “datafied” education. In Teacher Education & Training on ICT between Europe and Latin America; Ranieri, M., Menichetti, L., Kashny-Borges, M., Eds.; Aracné: Roma, Italy, 2018; pp. 91–109. [Google Scholar]
  3. Raffaghelli, J.E.; Manca, S.; Stewart, B.; Prinsloo, P.; Sangrà, A. Supporting the development of critical data literacies in higher education: Building blocks for fair data cultures in society. Int. J. Educ. Technol. High. Educ. 2020, 17, 58. [Google Scholar] [CrossRef]
  4. Williamson, B. The hidden architecture of higher education: Building a big data infrastructure for the ‘smarter university’. Int. J. Educ. Technol. High. Educ. 2015, 15, 12. [Google Scholar] [CrossRef]
  5. Kitchin, R. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences; Sage Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar] [CrossRef]
  6. Buckingham Shum, S.; Deakin Crick, R. Learning analytics for 21st century competencies. J. Learn. Anal. 2016, 3, 6–21. [Google Scholar] [CrossRef]
  7. Daniel, B. Big data and analytics in higher education: Opportunities and challenges. Br. J. Educ. Technol. 2015, 46, 904–920. [Google Scholar] [CrossRef]
  8. Ferguson, R. Learning analytics: Drivers, developments and challenges. Int. J. Technol. Enhanc. Learn. 2012, 4, 304–317. [Google Scholar] [CrossRef]
  9. Siemens, G.; Gasevic, D. Learning analytics: A foundation for informed change in higher education. Educ. Technol. Soc. 2012, 15, 1–2. [Google Scholar]
  10. Gañán, D.; Caballé, S.; Clarisó, R.; Conesa, J.; Bañeres, D. ICT-FLAG: A web-based e-assessment platform featuring learning analytics and gamification. Int. J. Web Inform. Syst. 2017, 13, 25–54. [Google Scholar] [CrossRef]
  11. Caballé, S.; Clarisó, R. Formative Assessment, Learning Data Analytics and Gamification in ICT Education; Academic Press: London, UK, 2016. [Google Scholar] [CrossRef]
  12. Cooper, K.; Khosravi, H. Graph-based visual topic dependency models: Supporting assessment design and delivery at scale. In Proceedings of the LAK 2018: 8th International Conference on Learning Analytics & Knowledge, Sydney, Australia, 5–9 March 2018. [Google Scholar] [CrossRef]
  13. Guitert, M.; Romeu, T.; Romero, M. Elementos clave para un modelo de aprendizaje basado en proyectos colaborativos online (ABPCL) en la Educación Superior. Amer. J. Dist. Educ. 2020, 34, 241–253. [Google Scholar] [CrossRef]
  14. Prinsloo, P. Data frontiers and frontiers of power in higher education: A view of the Global South. Teach. High. Educ. 2020, 25, 366–383. [Google Scholar] [CrossRef]
  15. Sangrà, A.; Guitert, M.; Cabrera-Lanzo, N.; Taulats, M.; Toda, L.; Carrillo, A. Collecting data for feeding the online dimension of university rankings: A feasibility test. Ital. J. Educ. Technol. 2019, 27, 241–256. [Google Scholar] [CrossRef]
  16. Merrill, M.D. First Principles of Instruction. In Instructional Design Theories and Models: Building a Common Knowledge Base; Reigeluth, C.M., Carr, A., Eds.; Lawrence Erlbaum Associates Publishers: Hillsdale, NJ, USA, 2009; Volume 3. [Google Scholar]
  17. Peñalosa, E.; Castañeda-Figueras, S. Análisis cuantitativo de los efectos de las modalidades interactivas en el aprendizaje en línea. Rev. Mexic. Investig. Educ. 2010, 15, 1181–1222. [Google Scholar]
  18. Garrison, R.; Anderson, T.; Archer, W. A Theory of Critical Inquiry in Online Distance Education. In Handbook of Distance Education, 3rd ed.; Moore, M.G., Anderson, W.G., Eds.; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2003; pp. 113–127. [Google Scholar]
  19. Cleveland-Innes, M.; Garrison, R.; Vaughan, N. The Community of Inquiry Theoretical Framework. In Handbook of Distance Education; Moore, M.G., Diehl, W., Eds.; Routledge: New York, NY, USA, 2018; pp. 1–11. [Google Scholar]
  20. Goodyear, P. Educational design and networked learning: Patterns, pattern languages and design practice. Australas. J. Educ. Technol. 2005, 21, 82–101. [Google Scholar] [CrossRef]
  21. Guàrdia, L. Designing online courses. In Improving Online Teaching Practical Guide for Quality Online Education; Sangrà Coord, A., Ed.; Editorial UOC: Barcelona, Spain, 2022; pp. 47–64. [Google Scholar]
  22. Rubia, B.; Guitert, M. ¿La revolución de la enseñanza? El aprendizaje colaborativo en entornos virtuales (CSCL). Comunicar 2014, 42, 10–14. [Google Scholar] [CrossRef]
  23. Dillenbourg, P. Preface. In Arguing to Learn: Confronting Cognitions in Computer-Supported Collaborative Learning Environments; Andriessen, J., Baker, M., Suthers, D., Eds.; Kluwer: Dordrecht, The Netherlands, 2003; pp. VII–IX. [Google Scholar]
  24. Cabrera, N.; Fernández-Ferrer, M. Keys to an online assessment. In Improving Online Teaching Practical Guide for Quality Online Education; Sangrà Coord, A., Ed.; Editorial UOC: Barcelona, Spain, 2022; pp. 65–80. [Google Scholar]
  25. Wanner, T.; Palmer, E. Formative self-and peer assessment for improved student learning: The crucial factors of design, teacher participation and feedback. Assess. Eval. High. Educ. 2018, 43, 1032–1047. [Google Scholar] [CrossRef]
  26. Weaver, M.R. Do students value feedback? Student perceptions of tutors' written responses. Assess. Eval. High. Educ. 2006, 31, 379–394. [Google Scholar] [CrossRef]
  27. Espasa, A.; Guasch, T.; Mayordomo, R.M.; Martínez-Melo, M.; Carless, D. A Dialogic Feedback Index measuring key aspects of feedback processes in online learning environments. High. Educ. Res. Dev. 2018, 37, 499–513. [Google Scholar] [CrossRef]
  28. Cerro Martínez, J.P.; Guitert Catasús, M.; Romeu Fontanillas, T. Impact of using learning analytics in asynchronous online discussions in higher education. Int. J. Educ. Technol. High. Educ. 2020, 17, 39. [Google Scholar] [CrossRef]
  29. Jorrín Abellán, I.M.; Fontana Abad, M.; Rubia Avi, B. Investigar en Educación; Editorial Síntesis: Madrid, Spain, 2021. [Google Scholar]
  30. Greene, J.C.; Caracelli, V.J.; Graham, W.F. Toward a conceptual framework for mixed-method evaluation designs. Educ. Evalu. Policy Anal. 1989, 11, 255–274. [Google Scholar] [CrossRef]
  31. Reglamento (UE) 2016/679 del Parlamento Europeo y del Consejo de 27 de Abril, Relativo a la Protección de las Personas Físicas en lo Que Respecta a la Libre Circulación de Estos Datos. Diario Oficial de la Unión Europea, 119, de 4 de mayo de 2016. Available online: https://www.boe.es/doue/2016/119/L00001-00088.pdf (accessed on 22 February 2019).
  32. Díez Gutiérrez, E.J. El uso de la webquest en la docencia universitaria: El aprendizaje colaborativo en red—Entorno QW. Rev. Latinoam. De Tecnol. Educ. RELATEC 2006, 5, 335–356. [Google Scholar]
  33. Romeu Fontanillas, T.; Romero Carbonell, M.; Guitert Catasús, M. E-assessment process: Giving a voice to online learners. Int. J. Educ. Technol. High. Educ. 2016, 13, 20. [Google Scholar] [CrossRef]
  34. Gibbs, G.; Simpson, C. Condiciones Para una Evaluación Continuada Favorecedora del Aprendizaje; Octaedro: Barcelona, Spain, 2009. [Google Scholar]
  35. Cerro Martínez, J.P.; Guitert Catasús, M.; Romeu Fontanillas, T. Impacto del uso de las analíticas del aprendizaje sobre trabajo colaborativo. Congrés Int. De Docència Univ. I Innovació CIDUI 2018, 4, 1–13. [Google Scholar]
  36. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef]
  37. Salmon, G. E-Tivities: A Key to Active Online Learning, 2nd ed.; Routledge: London, UK, 2013. [Google Scholar]
  38. Ion, G.; Cano, E.; Cabrera, N. Competency Assessment Tool (CAT). The evaluation of an innovative competency-based assessment experience in higher education. Technol. Pedag. Educ. 2016, 25, 631–648. [Google Scholar] [CrossRef]
  39. Siemens, G.; Baker, R.S.J. Learning analytics and educational data mining: Towards communication and collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, LAK 2012, Vancouver, BC, Canada, 29 April–2 May 2012. [Google Scholar]
  40. Saleh, A.; Phillips, T.M.; Hmelo-Silver, C.E.; Glazewski, K.D.; Mott, B.W.; Lester, J.C. A learning analytics approach towards understanding collaborative inquiry in a problem-based learning environment. Br. J. Educ. Technol. 2022, 53, 1321–1342. [Google Scholar] [CrossRef]
  41. Holstein, K.; McLaren, B.M.; Aleven, V. Intelligent tutors as teachers’ aides: Exploring teacher needs for real-time analytics in blended classrooms. In Proceedings of the LAK 2017: 7th International Conference on Learning Analytics & Knowledge, Vancouver, BC, Canada, 13–17 March 2017. [Google Scholar] [CrossRef]
  42. Bergner, Y. Measurement and its Uses in Learning Analytics. In Handbook of Learning Analytics and Educational Data Mining; Lang, C., Gasevic, D., Wise, A., Siemens, G., Eds.; Society for Learning Analytics Research: Beaumont, AB, Canada, 2017; pp. 35–48. [Google Scholar] [CrossRef]
  43. Kim, M.; Lee, I.; Kim, S. A longitudinal examination of temporal and iterative relationships among learner engagement dimensions during online discussions. J. Comp. Educ. 2020, 8, 63–86. [Google Scholar] [CrossRef]
  44. Zheng, J.; Huang, L.; Li, S.; Lajoie, S.P.; Chen, X.; Hmelo-Silver, C.E. Self-regulation and emotion matter: A case study of instructor interactions with a learning analytics dashboard. Comp. Educ. 2021, 161, 104061. [Google Scholar] [CrossRef]
  45. Herodotou, C.; Maguire, C.; Hlosta, M.; Mulholland, P. Predictive Learning Analytics and University Teachers: Usage and perceptions three years post implementation. In Proceedings of the LAK 2023: 13th International Conference on Learning Analytics & Knowledge, Arlington, TX, USA, 13–17 March 2023. [Google Scholar] [CrossRef]
Figure 1. DIANA configuration panel.
Figure 2. Individual metrics shown by DIANA.
Figure 3. Parameters defined in the DIANA tool prior to learning analytics pre-analysis.
Figure 4. Part of the infographic presentation provided to the students with specific indications for participating in the debate.
Figure 5. Message distribution from 5 to 9 December in the first discussion thread, "Main characteristics of the different research perspectives in Educational Technology". First day (5 December): 10 messages; last day (9 December): 7 messages.
Figure 6. Student popularity rating in the first discussion thread. (Muy participativo = Very participative; Participativo = Participative; Poco participativo = Not very participative).
Figure 7. Distribution of messages from 10 to 14 December in the second discussion thread, "Pros and cons of each research perspective". First day (10 December): 10 messages; last day (14 December): 16 messages.
Figure 8. Student participation rate in the second discussion thread. (Muy participativo = Very participative; Participativo = Participative; Poco participativo = Not very participative).
Figure 9. Distribution of messages from 15 to 18 December in the third discussion thread, "What research perspective could help you to develop a good Master's thesis?". First day (15 December): 6 messages; last day (18 December): 40 messages.
Figure 10. Student participation rate in the third discussion thread. (Muy participativo = Very participative; Participativo = Participative; Poco participativo = Not very participative).
Figure 11. Node graph of each of the discussion threads.
Figure 12. The 75 terms most frequently used in the discussion activity considering all three threads. Highlighted terms: paradigm, positivist, critical, research, view, social, study, interpretative, education, know, reality, object, methodology, knowledge, researcher, viewpoint, and social networks.
Figure 13. Distribution of grades in the subject "Curricular integration of Educational Technology" (control group). The black line shows the tendency of the grade distribution.
Figure 14. Distribution of grades in the subject "Research perspectives in Educational Technology" (experimental group). The black line shows the tendency of the grade distribution.
Table 1. Information on the title of the participating subjects, their duration, discussion period, and type of group.

Subject | Period of Duration | Discussion Period | Type of Group
Curricular integration of Educational Technology | 26 September to 14 October | 3–14 October | Control
Research perspectives in Educational Technology | 28 November to 21 December | 5–18 December | Experimental

