Peer-Review Record

E-Assessment in E-Learning Degrees: Comparison vs. Face-to-Face Assessment through Perceived Stress and Academic Performance in a Longitudinal Study

Appl. Sci. 2021, 11(16), 7664; https://doi.org/10.3390/app11167664
by Roberto Sánchez-Cabrero *, Javier Casado-Pérez, Amaya Arigita-García, Elena Zubiaurre-Ibáñez, David Gil-Pareja and Ana Sánchez-Rico
Reviewer 1: Anonymous
Submission received: 21 July 2021 / Revised: 18 August 2021 / Accepted: 19 August 2021 / Published: 20 August 2021
(This article belongs to the Special Issue Application of Technologies in E-learning Assessment)

Round 1

Reviewer 1 Report

E-assessment in e-learning degrees: Comparison vs. face-to-face assessment through perceived stress and academic performance in a longitudinal study

The article is topical and addresses an extremely important aspect (assessment & evaluation) of any educational process; for online education in particular, it is a sore point with no single solution. Overall, there is a sharp tension between learning objectives and evaluation methods, two poles with distinct priorities and philosophies regarding how results should affect education reform. The authors suggest that the combination of two tools, Respondus Monitor and LockDown Browser, enables students to take exams remotely while guaranteeing the integrity of the process. I find that quite questionable; please explain your reasoning and why this should hold.

The article is well done and the methodology is appropriate, so I have no additional comments.

However, I have two remaining questions:

  1. The students' mean age was 34.91 years (34.44 for women, 35.53 for men), with an SD of approximately 7.5 years. Are these Master's degree students? Is this a Bologna Master's program? What kind of students are they? Master's students are usually around 20-26 years old. Additional explanation is needed!
  2. This is my personal opinion: no technological solution, however advanced, can eliminate the possibility that students will find a "way" around it. The solution is to ask challenging questions that require not only knowledge but, above all, logical thinking, analysis, synthesis, and evaluation (per Bloom's taxonomy).

 

Author Response

Dear reviewers,

First, I would like to thank you for the effort to read and review our work and for your kind words. Your generous feedback has helped make it more competitive and better suited to the academic field. We have paid close attention to the reviewers' suggestions and have improved the manuscript substantially. The improvements, as you can see in the attached manuscript, involve more than 1,200 new words (with significant improvements to the introduction, discussion, and conclusion sections) and 19 new references. The language of the text has also been thoroughly reviewed with a native expert.

I hope these improvements raise your assessment of our work; we would be happy to revise it again if needed.

Thank you, and best regards.

Author Response File: Author Response.pdf

Reviewer 2 Report

Dear authors,

Thank you for the opportunity to read your work and provide feedback. I really enjoyed it, and I think it provides some interesting insights into your local context. Some questions and comments arose while I read your work.

  1. Page 1, line 30 - You mention that there has been great debate on student assessment. You should provide some example citations from the past decade to illustrate this point. What is this debate about? How is it relevant to your current paper?
  2. Page 2, lines 87-90, you mention that the pandemic prompted a lot of research. This is true, but a lot of it also tends to ignore research in distance education that predates the pandemic.  What does that research tell us, and how does it inform your views, results, and future studies?
  3. Page 2, lines 95-96 - you mention using the Respondus product. There has been a lot of backlash against these types of remote proctoring software because of issues relating to equity, access, and other factors. There has been a lot of discussion on Twitter under the #againstsurveillance hashtag, and a recent article on this topic is available here: https://quod.lib.umich.edu/t/tia/17063888.0039.308?view=text;rgn=main --- I think it would be worthwhile to indicate in your paper whether these considerations were in mind when Respondus was used (probably not :-) my campus also uses Respondus and I know my colleagues have not thought about this), and this might be something to consider for your findings/limitations/future research.
  4. Page 3, lines 106-109 - you mention the Respondus feature that allows faculty to go back and review videos. In your experiment, did faculty actually go back and review? Either way, this is worth mentioning in the findings/limitations section.
  5. Page 3, line 113 - you claim that Respondus is an effective deterrent. You need to cite this (if it is a claim from other research) or point to it in your findings if you researched it yourself.
  6. Page 3, lines 119-121 - for this section, I'd say there is a lot of good pre-pandemic research whose findings you could compare with yours. Additionally, there is something called the "no significant difference" phenomenon (see here: https://detaresearch.org/research-support/no-significant-difference/ ). I think this needs to be considered in analyzing your results and in your discussion.
  7. Page 11 / discussion.  Some questions that came up here:
    1. Did faculty members create new exams for the online setting, or were they the same type and style students would have received face-to-face? If the exams were the same, then the variable condition is the remote examination. If the exams were not the same, you have more unknowns, which can introduce a possible confounding error.
    2. What sorts of ICT competencies did learners have pre-pandemic? Did this impact their performance?
    3. What type of access did learners have to technology and broadband pre-pandemic? Did this impact their performance?
  8. Page 12, lines 320-321 - you mention that even online degrees tended to have in-classroom evaluations. This should be cited, because there is definitely a degree of nuance here. There are distance education programs that had some sort of in-person exam component, but there are many that do not and might never have. It is not a given that exams were in-person.
  9. Page 12, lines 323-326 - This section also needs citations. I would disagree that universities opted for "a myriad of haphazard measures" on two counts: (1) haphazard implies that little thought was put into what they did, which might be true in some instances, but not all; and (2) just because there were many different approaches to solving the issue at hand, it doesn't mean they were bad or haphazard. For example, even in face-to-face instances there are many ways of conducting a class (lecture style, lab style, maker-space style, Socratic dialogues, field trips, and so on). The same is true at a distance.
  10. Page 12, lines 341-343 - I think it's a giant leap to infer that the exams created were easier. Nothing in the paper suggests that to me.
  11. Your literature is mostly from the last 5 years. While it's good to have current literature, I fear that you might be missing out on some great distance education research from the past 20 years :-)

 

Thank you again. I hope my comments help make your paper even stronger :-)

Author Response

Dear reviewers,

First, I would like to thank you for the effort to read and review our work and for your kind words. Your generous feedback has helped make it more competitive and better suited to the academic field. We have paid close attention to the reviewers' suggestions and have improved the manuscript substantially. The improvements, as you can see in the attached manuscript, involve more than 1,200 new words (with significant improvements to the introduction, discussion, and conclusion sections) and 19 new references. The language of the text has also been thoroughly reviewed with a native expert.

I hope these improvements raise your assessment of our work; we would be happy to revise it again if needed.

Thank you, and best regards.

Author Response File: Author Response.pdf

Reviewer 3 Report

A timely paper that deals with a topic that will get considerably more attention as a consequence of COVID-19.

The main concern I had with the paper was the terminology regarding "assessment" & "evaluation", which seem to be used interchangeably -- yet in educational contexts these terms typically have distinct meanings.

In addition, it was not entirely clear what you mean by "online assessment" in the early part of the paper. The scope can span from electronically managed processes to those that involve instructors using learning management systems to review & assess student work. However, this is clarified when the discussion turns to the implementation of Respondus; it then becomes very clear that the focus is entirely on exam-based assessments.

Tense inconsistency -- given that the study has already taken place, there are several places where it makes more sense to use the past tense. Moreover, much of the rest of the text is already in the correct tense.
e.g., line 9 "This study aims" --> This study aimed
li.15-16 "This study will compare" --> "This study compared"
li.37 "International reports like PISA" --> "In international reports prepared by PISA". PISA is a program not a report.
li.37 OCDE -- do you mean OECD? It is also a good idea to spell out acronyms.
p.2li.56 "without its controversy" --> "without controversy"
p.2li.65-66 "a sharp tension between learning objectives and evaluation methods" -- yes, and this is why theorists such as Biggs are well regarded for their seminal work on "constructive alignment". See: Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347-364.
p.2li.68 "The ever-increasing presence of digital technology in education has muddled these debates even further" -- that's quite a generalisation & contestable. Perhaps better to say it has "further complicated" the debates
p.3li.114 "This study has two research objectives" --> "This study had two research objectives"
p.3li.115 "it will compare" --> "to compare"
p.3li.119 "will be to understand" --> "is to understand"
p.3li121 "This analysis will reveal" --> "Analysis aims to reveal"
p.12li.308 typo "Another interesting results" --> "Another interesting result"

Author Response

Dear reviewers,

First, I would like to thank you for the effort to read and review our work and for your kind words. Your generous feedback has helped make it more competitive and better suited to the academic field. We have paid close attention to the reviewers' suggestions and have improved the manuscript substantially. The improvements, as you can see in the attached manuscript, involve more than 1,200 new words (with significant improvements to the introduction, discussion, and conclusion sections) and 19 new references. The language of the text has also been thoroughly reviewed with a native expert.

I hope these improvements raise your assessment of our work; we would be happy to revise it again if needed.

Thank you, and best regards.

Author Response File: Author Response.pdf
