Article
Peer-Review Record

Perceptions about the Assessment in Emergency Virtual Education Due to COVID-19: A Study with University Students from Lima

Educ. Sci. 2023, 13(4), 378; https://doi.org/10.3390/educsci13040378
by Iván Montes-Iturrizaga 1,*, Gloria María Zambrano Aranda 2, Yajaira Licet Pamplona-Ciro 3 and Klinge Orlando Villalba-Condori 4
Submission received: 3 March 2023 / Revised: 2 April 2023 / Accepted: 4 April 2023 / Published: 7 April 2023

Round 1

Reviewer 1 Report

This paper aims to investigate university students’ perceptions of the tests/assessments used by their professors in the pandemic context at a university in Lima (Peru). It is an educationally relevant subject, but the paper has major limitations, in particular with regard to the literature review, methodology, and discussion.

There is not always a clear distinction between the authors’ views/arguments and those of other researchers. For example, in lines 58-60, who says the words within the quotation marks? (Relevant references should be added.) Similarly, in lines 97-99, “But this is also a problem for students, a significant percentage of whom graduate from secondary education without having achieved the basic competencies needed to enter the university world”; who says this, is it a claim or a research finding, and from which countries?

All information regarding the Peruvian context (laws, PISA results, etc.) should be discussed in one paragraph (or a separate sub-section). Also, recent information on the participating university could be included (% of students living in poverty, etc.).

In the introduction, there is a lengthy part regarding general issues/definitions about (university) student assessment. This part should be reduced, and the emphasis should be placed on actual research evidence published during the pandemic (i.e., issues and challenges during the last two years). The literature review is limited and needs to be enhanced with recent evidence on university students’ perceptions of assessment during the pandemic. In discussing earlier research, the students’ field of study should be mentioned and commented on, because the variable “Degree Program” appears in all four tables of this study (as the authors state, “degree-specific emphases”).

The statement of the purpose of the study should not be followed by citations of other studies; such studies, e.g., [5, 16, 19, 20], could instead constitute the background for this study. The specific research objectives are missing!

The section “Materials and Methods” needs to be re-organized to include separate sub-sections on the Sample (and Procedure) and the Research Instrument. The methodology section needs enhancement. First, the sample could be better described by adding a table indicating frequencies and percentages for the student characteristics/variables. For each academic subject, what was the ratio of population size to sample size? The research instrument should be placed in an Appendix to make the paper easier to read. What were the actual questions, how were they developed, and from which studies were they adapted (add references)? When exactly was the questionnaire administered (month and year)?
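(A table of this kind is straightforward to produce. The sketch below is an illustrative aside only, assuming a hypothetical data file students.csv with illustrative column names such as degree_program and gender; none of these names are taken from the authors' actual dataset.)

```python
# Minimal sketch (illustration only, not the authors' code): build a
# frequency-and-percentage table for categorical sample characteristics.
# The file name and column names are hypothetical.
import pandas as pd

df = pd.read_csv("students.csv")  # hypothetical survey data file

for var in ["degree_program", "gender"]:  # illustrative variables
    counts = df[var].value_counts()
    pct = (counts / len(df) * 100).round(1)
    table = pd.DataFrame({"n": counts, "%": pct})
    table.loc["Total"] = [len(df), 100.0]  # add a total row
    print(f"\n{var}\n{table}")
```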

The information on the types of tests (multiple-choice test, essay test, oral test, etc.) applied by professors in different degree programs (Table 1) could also be obtained from the professors and then compared with the students' perceptions. Why was this not done?

The discussion needs a deeper interpretation of the findings. What do these mean for students studying Law, Marketing, etc.? (All tables use the variable “Degree Program”.) What are the pedagogical implications of preferring/applying xx type of exams? How are students' preferences regarding the type of exams (for their future post-pandemic exams?) linked to their online experiences during the pandemic? As stated earlier, the actual phrasing of the questions is missing, and this hinders reading of the paper.

What is the significance of this study to the field?

With regard to the English language, there are some minor errors. In the title, “with” (university students) is preferable to “in”. In the second sentence of the abstract, it is better to write “learn (how) to use teaching platforms”. Line 15: students were “assessed” (rather than “evaluated”). Line 35: the phrase “obey performance exams” is not clear; is “obey” synonymous with “apply”? Line 258: I think the authors mean “finding” rather than “found”.

Author Response

First of all, I would like to thank you for the observations and suggestions you have made on our paper. Within this framework, we have specifically addressed all of them. Thus:
1. We have clearly specified what corresponds to the ideas of other authors and what corresponds to our own. We have also better supported a number of ideas that were not adequately justified.
2. We have made clarifications with respect to Peru and we have made explicit what is related to the results of the country in PISA and the problems of the educational system. 
3. We provide information on students living in poverty. 
4. We include a greater number of references to provide better theoretical support and background. Also, we have been careful to mention similar studies (although they are scarce) and to give more importance to what students think or believe with respect to the tests that were applied during the pandemic.
5. The references have been significantly expanded and better target what you point out in your comments. 
6. We have included the specific objectives. 
7. We have indeed considered the students' degree programs; however, the existing literature does not address these aspects in the way we do.
8. We have reorganized the Materials and Methods section into sub-sections; we have specified the sampling fraction, the procedures, and the way in which we statistically analyzed the data, and we have prepared a table with frequencies and percentages of the students by degree program. In addition, we have better explained aspects related to the instrument.
9. We have considered in greater detail aspects related to the contributions of this type of study to the thematic field of assessment. 
10. All your suggestions regarding formatting, spelling and wording have been addressed. 

Reviewer 2 Report

The present paper studies university students' perceptions of assessment in emergency virtual education due to COVID-19, in a study with university students from Lima. Below are some comments to improve the quality of the paper.

Introduction:

1- There are no research questions, research goals, or research rationale!

2- The authors should enrich the literature review. There are only 20 references there. In addition, there are only 23 references in the whole paper, which is very few.

Methodology: 

3- Sampling: The sampling is not clear. What is the sampling method? How is the sample related to the population in number and percentage?

4- Clarity: some sentences lack clarity, such as the following one: "It is worth mentioning that 72% of all these students work an average of 5.5 hours per day (between Monday and Friday)." Are the students workers?

5- The authors should explain why they give detailed information about the participants when this information does not serve the goal of the research. One example is: "Twenty-three percent of the respondents have one child and 15% have two children (all of them minors)". In this sentence, why is one percentage written in words while the other is written in numerals?

6- The methodology should be divided into sections: research design, research population and sample, data collection tools, and data analysis tools.

7- The authors do not refer to the statistical tests they used to answer the research questions.

Conclusions:

8- The authors should add a conclusions section, a recommendations section, and a limitations section.

 

Author Response

First of all, I would like to thank you for the observations and suggestions you have made on our paper. Within this framework, we have specifically addressed all of them. Thus:
1. We have clearly specified what corresponds to the ideas of other authors and what corresponds to our own. We have also better supported a number of ideas that were not adequately justified.
2. We have made clarifications with respect to Peru and we have made explicit what is related to the results of the country in PISA and the problems of the educational system. 
3. We provide information on students living in poverty. 
4. We include a greater number of references to provide better theoretical support and background. Also, we have been careful to mention similar studies (although they are scarce) and to give more importance to what students think or believe with respect to the tests that were applied during the pandemic.
5. The references have been significantly expanded and better target what you point out in your comments. 
6. We have included the specific objectives. 
7. We have indeed considered the students' degree programs; however, the existing literature does not address these aspects in the way we do.
8. We have reorganized the Materials and Methods section into sub-sections; we have specified the sampling fraction, the procedures, and the way in which we statistically analyzed the data, and we have prepared a table with frequencies and percentages of the students by degree program. In addition, we have better explained aspects related to the instrument.
9. We have considered in greater detail aspects related to the contributions of this type of study to the thematic field of assessment. 
10. All your suggestions regarding formatting, spelling and wording have been addressed. 
11. We have included as an appendix the questionnaire prepared and applied. 
12. The statistical analyses have been described. Likewise, we have disaggregated the entire methodological section as indicated. 
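The statistical analyses referred to above are not reproduced in this record. As a minimal sketch only: for cross-tabulated categorical survey data of the kind reported here (e.g., preferred test type by degree program), a chi-square test of independence is one conventional choice. The file name and column names below are hypothetical and are not taken from the authors' questionnaire.

```python
# Minimal sketch (illustration only, not the authors' analysis):
# chi-square test of independence between two categorical survey variables.
# File name and column names are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("students.csv")  # hypothetical survey data file
crosstab = pd.crosstab(df["degree_program"], df["preferred_test_type"])
chi2, p, dof, expected = chi2_contingency(crosstab)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

With the usual 0.05 threshold, a p-value below it would indicate that the preferences are not independent of degree program.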

Round 2

Reviewer 1 Report

I think the manuscript has been improved, and can be considered for publication.

A minor point: phrase “Let's see” (line 169) to be deleted.

Author Response

Dear Reviewer,

Thank you for your comments and suggestions for improvement.

In this context I will detail each of the improvements made:

  1. I removed the expression "Let's see" from line 169, just as you indicated.
  2. To improve the style, format, and writing of my article, I have been guided by the article "Bachelor of Education Science Students' Beliefs, Perceptions, and Experiences of Online Learning during the COVID-19 Pandemic: A Case of Disadvantaged Students". This article is part of the Special Issue "Education and Technology in Sciences – Selected Papers of CISETC 2021 Conference": https://www.mdpi.com/journal/education/special_issues/education_and_technology_in_sciences
  3. The section "1.1. Purpose of Study" has been created and we have improved the wording of what this article aims to achieve.
  4. Section 2.2 has been added and titled "Selection of Participants" instead of "Sample". In this section, the year in which the questionnaire was administered is now clearly stated: 2021.
  5. In Table 1, a row was added at the end with the total number of subjects in the sample and the respective percentage.
  6. Section 2.3 has been retitled "Questionnaire" instead of "Instruments". Also, in Section 2.3, "(Appendix A)" was inserted, as there was previously no indication to consult this appendix in order to read the questionnaire.
  7. The whole second paragraph of Section 2.3 was improved.
  8. The wording of Section 2.4 (Procedures) has been improved.
  9. Section 2.5, "Data Analysis", has been created.
  10. The titles and subtitles have been numbered as is done in articles of the Education Sciences journal. In the same Section 2.3, we have changed the words "test" and "psychometric test" to "questionnaire". In addition, in some parts we have used "research questionnaire" instead of "instrument".
  11. Instead of "Materials and Methods" we have put "Methodology".
  12. We changed the heading of Section 2.1 to "Research Design", capitalising the "D".
  13. In the title of Table 3, the word "exam" has been replaced by "test".
  14. In the title of Table 5, the word "test" has been inserted instead of "exams".
  15. We have considered in "limitations" this expression “We also did not take into account the academic achievement tests that the professors of our sample students provided in the virtual classrooms. Likewise, we did not have access to their grades or their scores on the tests administered by the professors; as this information is confidential information of a professional nature in the Peruvian legal framework".
  16. In the "conclusions" we have drafted a paragraph that corresponds more to this section. “Finally, it is important to note that we do not have previous studies similar to this one that would have played a relevant role in establishing reliable comparisons that would indicate whether assessment in emergency e-learning is the same or different from that practiced before the pandemic. Based on the above, we can hypothesize that the assessment would have suffered a setback in this health crisis at the university that we considered in this research [38]. In this context, our study revealed that: 1. multiple-choice tests were favored in some degree programs; 2. that there are differences in the assessment cultures of each degree program; 3. that gender and age are not linked to preferences or to the consideration of the most important tests for university education; and, 4. that students preferred multiple-choice tests even though they did not consider them to be important".
  17. The subtitle "Appendix A. Questionnaire" was clearly placed at the end.
  18. I have made the following spelling and style corrections (indicated using the new line numbering of the current version of my manuscript):

- Line 7: changed "highlighted" to "remarkable".

- Line 8: changed "would" to "may".

- Line 9: changed "weak" to "lack of".

- Line 10: added the word "previously".

- Line 11: changed "have been" to "were".

- Line 16: changed "recognize" to "recognized".

- Line 17: changed "training" to "education".

- Line 17: added "Finally, it was found that".

- Line 26: changed "these" to "them".

- Line 27: changed "we generate" to "is generated".

- Line 31: changed "intelligent" to "intellectual".

- Line 44: added "the".

- Line 71: deleted "Thus, and".

- Line 86: deleted "situations have also been reported where" and changed to "it was reported that..".

- Line 87: added the word "who".

- Line 100: added "because".

- Line 100: changed "whom" to "them".

- Line 254: changed "how the students considered the answer" to "the student's appreciation of the".

  19. Improved the following references: 1, 7, 12, 18, 28, 32.

Reviewer 2 Report

Improved. 

Author Response

Dear Reviewer,

Thank you for your comments and suggestions for improvement.

In this context I will detail each of the improvements made:

  1. I removed the expression "Let's see" from line 169. 
  2. To improve the style, format, and writing of my article, I have been guided by the article "Bachelor of Education Science Students' Beliefs, Perceptions, and Experiences of Online Learning during the COVID-19 Pandemic: A Case of Disadvantaged Students". This article is part of the Special Issue "Education and Technology in Sciences – Selected Papers of CISETC 2021 Conference": https://www.mdpi.com/journal/education/special_issues/education_and_technology_in_sciences
  3. The section "1.1. Purpose of Study" has been created and we have improved the wording of what this article aims to achieve.
  4. Section 2.2 has been added and titled "Selection of Participants" instead of "Sample". In this section, the year in which the questionnaire was administered is now clearly stated: 2021.
  5. In Table 1, a row was added at the end with the total number of subjects in the sample and the respective percentage.
  6. Section 2.3 has been retitled "Questionnaire" instead of "Instruments". Also, in Section 2.3, "(Appendix A)" was inserted, as there was previously no indication to consult this appendix in order to read the questionnaire.
  7. The whole second paragraph of Section 2.3 was improved.
  8. The wording of Section 2.4 (Procedures) has been improved.
  9. Section 2.5, "Data Analysis", has been created.
  10. The titles and subtitles have been numbered as is done in articles of the Education Sciences journal. In the same Section 2.3, we have changed the words "test" and "psychometric test" to "questionnaire". In addition, in some parts we have used "research questionnaire" instead of "instrument".
  11. Instead of "Materials and Methods" we have put "Methodology".
  12. We changed the heading of Section 2.1 to "Research Design", capitalising the "D".
  13. In the title of Table 3, the word "exam" has been replaced by "test".
  14. In the title of Table 5, the word "test" has been inserted instead of "exams".
  15. We have considered in "limitations" this expression “We also did not take into account the academic achievement tests that the professors of our sample students provided in the virtual classrooms. Likewise, we did not have access to their grades or their scores on the tests administered by the professors; as this information is confidential information of a professional nature in the Peruvian legal framework".
  16. In the "conclusions" we have drafted a paragraph that corresponds more to this section. “Finally, it is important to note that we do not have previous studies similar to this one that would have played a relevant role in establishing reliable comparisons that would indicate whether assessment in emergency e-learning is the same or different from that practiced before the pandemic. Based on the above, we can hypothesize that the assessment would have suffered a setback in this health crisis at the university that we considered in this research [38]. In this context, our study revealed that: 1. multiple-choice tests were favored in some degree programs; 2. that there are differences in the assessment cultures of each degree program; 3. that gender and age are not linked to preferences or to the consideration of the most important tests for university education; and, 4. that students preferred multiple-choice tests even though they did not consider them to be important".
  17. The subtitle "Appendix A. Questionnaire" was clearly placed at the end.
  18. I have made the following spelling and style corrections (indicated using the new line numbering of the current version of my manuscript):

- Line 7: changed "highlighted" to "remarkable".

- Line 8: changed "would" to "may".

- Line 9: changed "weak" to "lack of".

- Line 10: added the word "previously".

- Line 11: changed "have been" to "were".

- Line 16: changed "recognize" to "recognized".

- Line 17: changed "training" to "education".

- Line 17: added "Finally, it was found that".

- Line 26: changed "these" to "them".

- Line 27: changed "we generate" to "is generated".

- Line 31: changed "intelligent" to "intellectual".

- Line 44: added "the".

- Line 71: deleted "Thus, and".

- Line 86: deleted "situations have also been reported where" and changed to "it was reported that..".

- Line 87: added the word "who".

- Line 100: added "because".

- Line 100: changed "whom" to "them".

- Line 254: changed "how the students considered the answer" to "the student's appreciation of the".

  19. Improved the following references: 1, 7, 12, 18, 28, 32.