Learning Analytics: A View on the Design and Assessment of Asynchronous Online Discussions for Better Teaching Performance
Round 1
Reviewer 1 Report
The current manuscript investigates how the use of a learning analytics tool in an online environment might support teaching performance. The findings suggest that the use of learning analytics could help to improve teaching performance in relation to monitoring student interactions, participation, and evaluation. Despite its strengths, the manuscript has multiple structural and methodological issues and limitations outlined below.
INTRODUCTION & LITERATURE REVIEW
- An important concern is the superficial nature of the literature review. It does not present a holistic overview of major research on the topic and lacks depth, critical insight, and logical structure. It provides insufficient context and background for the study, fails to identify gaps in the literature, and does not position the study within the broader academic landscape. The literature review section needs substantial work.
- Another concern was that no strong rationale was provided for the study. What problem(s) is the study trying to address? Why is this study important? Why should we be interested in it?
METHOD, RESULTS & DISCUSSION
- The study employed both qualitative and quantitative methods. However, the methodology section is overly long and repetitive. The method section should be rewritten to be more precise and to highlight only the most important parts of the study.
- The discussion section lacked depth. It did not sufficiently interpret and explain the study's results in the broader context. It failed to compare the findings with existing literature to demonstrate the paper's contribution to the field. The analysis of the findings was shallow and did not adequately address the research questions.
- The limitations of the study were not acknowledged, thereby not allowing readers a more accurate interpretation of the results. The discussion section also failed to highlight the study's originality, theoretical and practical implications, and potential impact on real-world applications.
- It was unclear what the final results contributed to the existing literature on learning analytics because the literature review was superficial, and the discussion section failed to provide connections between the findings and prior studies.
The quality of English was the strongest part of the paper. However, some sentences were a bit too wordy and could be trimmed.
Author Response
Please see the attachment
Author Response File: Author Response.pdf
Reviewer 2 Report
This article, which uses an experimental design, seeks to evaluate the benefits of a data analytics tool (DIANA) for teaching and learning through online discussions. It uses what the authors call a mixed methods approach to compare the behavior of students in a series of online discussions, some of which employed DIANA and some of which did not (though that part of the design was unclear to me).
I think the subject of this research is important. We do need to know what learning analytics offer in terms of teaching and learning. However, I found the article almost impenetrable in organization and focus, which disappointed me because I was interested in the topic. My feedback below:
1. First and foremost, the authors need to describe the intervention clearly before describing how they plan to assess it. As it is, the DIANA tool is introduced on page 2 but not explained until page 5. Without knowing what it is and does, the reader is left deeply confused about what the researchers are comparing. It is a complex tool, so the detailed explanation can wait, but the reader needs to know what is being compared: an online discussion with and without learning analytics.
2. The authors need to describe why evaluating DIANA will help answer larger questions about learning analytics. In other words, what makes DIANA typical of or better than the available learning tools? If this question isn’t dealt with, the reader doesn’t know if the results describe the benefits and drawbacks of learning analytics writ large or simply this tool.
3. The outcomes the researchers are interested in investigating are never clear. Are they looking for measures of student engagement during online discussions? If so, using what metrics? Are they looking to see whether or how learning analytics were used to inform teaching or learning? If so, what are those metrics? Are they looking for the pros and cons of using analytics? If so, the cognitive load and time necessary to process this additional info (for students and teachers alike) should be weighed against the benefits.
4. As a qualitative researcher, I chafe a bit at calling this a “mixed methods approach” based solely on open-ended survey questions. I also do not see any analysis or inclusion of the qualitative data (e.g., themes, quotes) in your Results section.
5. It isn’t clear to me if the control and intervention groups include the same people engaging in two different discussions, one with and one without the intervention.
6. Was any training provided to the teachers on how to use DIANA? This would seem to be important, given that they don't all have significant online teaching experience, never mind practice using data analytics tools.
7. In the Results section, it is not clear what you found in the control versus experimental group. It is also unclear to me how teachers or students used DIANA. While it’s clear there were high levels of participation and satisfaction (in both groups though, right??), it is not clear what role DIANA played. In other words, you need to describe how students and teachers encounter and make use of this tool as well as the mechanisms by which it would translate into greater participation, longer discussion posts, etc.
8. Considerably more scholarship has been done on learning analytics than is reflected here. What is the state of the art and where does this article fit into the ongoing conversation? That is not clear.
9. There are some odd word choices. For instance, in the intro, you use “irruption”, which means a sudden, violent entrance. Is that what you mean? Also, do you want to introduce the use of Big Data/analytics as “hyperbolic” straight from the outset? Isn’t that what you’re investigating: whether analytics are useful or just hype?
I think a good paper could be developed from this project, but that this version is too flawed to give a "revise and resubmit". I would recommend against publishing, but encourage the authors to keep working on it.
Author Response
Please see the attachment
Author Response File: Author Response.pdf
Reviewer 3 Report
A brief summary
The article aims to show the contribution of learning analytics to improving student performance and satisfaction. The mixed methodology used by the author/s, which combines quantitative data with qualitative narratives, allows them to validate and enrich the findings, as well as to generate more solid and applicable recommendations for improving educational practices and designing more effective pedagogical interventions.
Specific comments
Regarding the title, "Learning analytics: implications in the design and assessment of asynchronous online discussions for better teaching performance", I suggest the author/s replace the word "implications" with, for example, "a view on the design…", "challenges", etc.
The author/s should add, at the end of this study, the questionnaires used and mentioned at:
line 200 (Teacher’s questionnaire),
line 240 (Student’s satisfaction questionnaire).
Line 51 – add (Prinsloo) Some authors such as Prinsloo [14]
Line 85 – add (Abellan et al.) educational research to Abellan et al. [20]
Line 415 – add (Martinez et al.) in the study by Martinez et al. [22]
Line 500 – add (Martinez et al.) in the study by Martinez et al. [25]
Correct:
Line 184 – "to create de questionnaires" → "to create the questionnaires"
Line 403 – "terms most frecuently used" → "terms most frequently used"
In the Reference List:
Some references are missing page numbers:
Ref.5 Line 569
Ref.11 Line 579
Ref.16 Line 590
Ref.20 Line 597
Ref.31 Line 624
Author Response
Please see the attachment
Author Response File: Author Response.pdf
Round 2
Reviewer 1 Report
This is an interesting topic. Good work incorporating review suggestions.
Good.
Author Response
Dear reviewer 1
Thank you very much for your comment and for considering that the changes we have included respond to the suggestions you proposed in the first round. We now see the article as better articulated, argued, and supported.
Yours sincerely,
Reviewer 2 Report
The manuscript is considerably improved with a far clearer explanation of what DIANA is and how the study was structured. I appreciate the visuals, too. Kudos to the authors. My comments:
A thorough edit is required to correct issues with the English. This paper is good enough in its current version to be taken seriously but the problems with English fluency will jeopardize its reception. I say this with full recognition of and respect for the skill it requires to write a scholarly article in a second language.
The section justifying mixed methods is a bit belabored/repetitive and could be scaled back.
A native English speaker could clean up the problems (strange passive voice constructions in particular) in relatively little time.
Author Response
Dear Reviewer 2:
Thank you very much for your comments and for considering that the changes made are those needed to improve the article and earn a "YES" in the report form. Regarding the language, we have sent the text for revision and hope it is now better. In relation to the methodology section, we have rewritten Section 2.2.
Yours sincerely,