Article
Peer-Review Record

Comparing Face-to-Face, Emergency Remote Teaching and Smart Classroom: A Qualitative Exploratory Research Based on Students’ Experience during the COVID-19 Pandemic

Sustainability 2021, 13(12), 6625; https://doi.org/10.3390/su13126625
by Josep Petchamé 1,*, Ignasi Iriondo 1, Eva Villegas 1, David Riu 2 and David Fonseca 3
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 9 March 2021 / Revised: 4 June 2021 / Accepted: 7 June 2021 / Published: 10 June 2021
(This article belongs to the Special Issue Information Systems, E-learning and Knowledge Management)

Round 1

Reviewer 1 Report

Thanks for submitting a modified version of the manuscript. I have just one more question regarding the paper. Students' data were collected via a questionnaire; however, I did not see a detailed introduction to it. For example, is it a self-developed questionnaire? What are its psychometric properties, if any? Moreover, is it a Likert-scale questionnaire? It would be better to have a bit more descriptive information on the questionnaire.

Author Response

Please see the attachment

Author Response File: Author Response.docx

Reviewer 2 Report

This is the second time this reviewer has evaluated this manuscript, and in the previous review it was already established that it did not meet the requirements to be published in the journal.

In this second version of the manuscript, the authors have made changes based on some of the questions posed in the first review, such as restructuring the manuscript into Introduction, Material and Method, Results and Discussion.

But the authors have not solved an essential question: they do not report the characteristics of the instrument used.

The reliability scores that must be reported are those obtained in this study, not those obtained by other authors in other studies. Therefore, the authors are required to report them.

In addition, they claim that the reliability of the instrument has been reported in other studies [citations 78-80], as they note in Section 2.3, Reliability and Validity.

It is essential that the authors report the reliability of the instrument they used, not simply refer to other authors.

Regarding the data analysis and the results presented, they are very simple: they are limited to an elementary descriptive study, reporting response percentages and mean scores, with no other analysis. It is also not known whether the data follow a normal or a non-normal distribution, and no parametric or non-parametric test is reported. The evidence presented is therefore minimal and has no power of generalization to other contexts, and it does not provide evidence that is significant for the educational community or for education professionals in the university context.
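
(For illustration of the reviewer's point, the normality check that decides between a parametric and a non-parametric test can be sketched in Python with SciPy; the scores below are fabricated placeholders, not data from the study.)

```python
# Illustrative sketch: check normality before choosing a test.
# The two groups are hypothetical per-student scores, NOT the study's data.
from scipy import stats

f2f = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 3.7, 4.4]
ert = [3.2, 3.5, 2.9, 3.8, 3.1, 3.4, 3.0, 3.6]

# Shapiro-Wilk tests the null hypothesis that a sample is normally distributed.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in (f2f, ert))

if normal:
    # Parametric option: independent-samples t-test on the means.
    stat, p = stats.ttest_ind(f2f, ert)
else:
    # Non-parametric fallback: Mann-Whitney U test.
    stat, p = stats.mannwhitneyu(f2f, ert)

print(f"normal={normal}, statistic={stat:.3f}, p={p:.4f}")
```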

It would have been necessary, at the very least, to analyze differences of means between the results obtained with the different methodologies (or to perform an analysis of variance, if parametric tests could be applied), or to carry out some other, more detailed analysis.

For these reasons I consider that publication is not appropriate; this was already indicated in the previous review and has not been addressed.

Author Response

Please see the attachment

Author Response File: Author Response.docx

Reviewer 3 Report

Dear Author(s),

Thanks for the paper “Comparing Face-To-Face, Emergency Remote Teaching, and Smart Classroom: A Qualitative Exploratory Research based on Students’ Experience during the COVID-19 Pandemic”, which approaches the topic through two constructs used in the field of user experience.

However, I believe some aspects can be enhanced to make this study clearer, namely the introduction, methodology, findings (results), and conclusion sections.

I would suggest using the simple past consistently (harmonization) and advise against using “we” (see lines 149, 156, 193, and 306).

  • Keywords: I would suggest adding the keyword “face-to-face”, as this is one of the three class modalities analyzed by the authors in this study concerning students’ experience, and removing the keyword “user experience”.
  • In the Introduction section, I would recommend:

It is mentioned that “1) Online education allows competition among universities within a worldwide context [30] (…)” (see line 55). Nevertheless, there are also universities that are exclusively online-based or offer flexible learning (available at http://www.openuniversity.edu, consulted on 3rd April 2021); these compete among themselves, and not only with other institutions that offered online learning before the COVID-19 pandemic or that, as discussed in this study, may continue such an offer in the near future, whether in a mixed modality or not, according to students’ perceptions.

For this, please explore the Open University of the Netherlands (available at https://www.ou.nl, consulted on 3rd April 2021), UAb in Portugal (available at https://portal.uab.pt, consulted on 3rd April 2021) and UNED in Spain, which is “la mayor universidad de España, con sus más de 250.000 estudiantes que cursan sus titulaciones oficiales” [the largest university in Spain, with more than 250,000 students enrolled in its official degree programs] (available at http://portal.uned.es, consulted on 3rd April 2021), to gain (more) insight into this issue.

- It is mentioned in line 61 that “1) Online teaching implies certain challenges for instructors [44,45] (..)”, and so I would suggest changing the wording from “instructors” to “professors”. Does this imply only professors, or also lecturers and researchers? If the latter, I would recommend “academic staff” instead of “professors”.

- It is mentioned that “A module included in the corporative Learning Management System named (Online Teaching and Learning Platform) to allow the online interconnections of users” (lines 93-95). I believe that it should be “A module was included in the corporative Learning Management System named (…)”. Please verify.

- It is mentioned that “In September 2020, the 2020-2021 academic year began. At that moment, most of the classrooms at La Salle URL were equipped to allow F2F class sessions with students inside the classroom (on-campus) or connected remotely to the classroom in a synchronous way (off-campus). The aim of this SC format was trying to keep most of the advantages that F2F classes offer [58], despite the COVID-19 pandemic” (lines 98-102). I think it would be justified to answer some of the questions that arose for me; I am aware of the SC’s aim and of how it was done (lines 98-100), but:

- Who was the “designer of” SC format? Board? Professors and students? Board and Professors? Scientific Council? Pedagogic Council?

- Why: I believe that the SC format would benefit from this explanation, after “who” has been answered.

- I would suggest, in 1.2, “aim and value-added of the study” instead of “novelty” in line 127. The same for line 138.

- I would suggest adding “learning solutions” in line 129, so that it reads “teaching and learning solutions”.

- I would suggest rephrasing lines 147 to 148 and 154 to 161: The general aim of this research was to assess three different class modalities according to undergraduates’ perceptions, since the 2019/2020 academic year, in ICT engineering programs at the La Salle website (Portal?).

The specific research objectives were: 1) Gain insight into the potential for the SC to create new expectations for students once restrictions on COVID-19 are cleared, which could mean rethinking the old F2F format; 2) Obtain greater knowledge about students’ perceptions when experiencing the ERT and the SC formats; 3) Detect possible negative effects of the ERT and SC formats on students from an emotional viewpoint when compared with the F2F format.

  • I would recommend a literature review section (the introduction is a mix of “an opening” and “state of the art”)
  • In the methods section (I suggest naming it “methods” only), which is quite detailed, I have some proposals:

- Where it is stated in 1.2 that “This study is an exploratory research on undergraduates’ perceptions once they have received classes in three different technological environments implemented in the context of undergraduate engineering studies from a user experience approach [63,64]” (lines 144 to 146), I would move this to the methods section.

- I would rephrase lines 154 to 156 by moving the phrase to section 1.2. (please see 1.2).

- So, in lines 161 to 170 I would rephrase to:

- This study is exploratory research on undergraduates’ perceptions once they have received classes in three different technological environments implemented in the context of undergraduate engineering studies, from a user experience approach. User experience is a suitable construct for carrying out this research because it allows collecting data on the students themselves. A specific technique, Bipolar Laddering (BLA), can be used to collect qualitative data about the user experience of respondents without biasing their perceptions, as it is based on open-ended questions. Besides, to complete the information about students’ experience, an Emotional Appraisal technique can provide additional data once their experience of a specific class format is completed.

- Regarding methods, I would suggest a diagram to better describe the techniques/surveys used concerning undergraduates’ perceptions once they have received classes in three different technological environments.

- It is stated that “Data were collected by means of a voluntary and anonymous questionnaire” (line 206). Can this be annexed to the MS?

- It is mentioned that the “Students were asked to introduce their answers at end of the last session of VC&FE in (it is in and not on) January 2021, and it took fifty minutes (…)” (line 210). Isn´t that 15 minutes instead of 50 minutes as it seems a long time to answer to a questionnaire moreover at the end of the last session?

  • In the results section, I would suggest renaming it to “findings”, as this is a qualitative study, or at least to “results and findings”.

- I would suggest presenting the results/findings in a different, more appealing way. For each semester, try to group the positive and negative elements together, indicating whether they were suggested by two or more students or by only one.

  • In the conclusion section, I would suggest adding as a limitation “the insufficient knowledge base to evaluate the effect of the deployment of different technological education formats to face the situation derived from the COVID-19 pandemic”.

- I believe that a quantitative survey to assess all the different items would be very useful in this study for statistical purposes (statistical significance), to compare with the Emotional Appraisal.

- The “value-added” as well as the “next steps” should be noted in this section (moving them from the discussion section, lines 547 to 552).

Author Response

Please see the attachment, and thank you very much for all your comments and suggestions.

Author Response File: Author Response.docx

Reviewer 4 Report

Basically, I think that the aim and methodology of this paper are relevant to Sustainability. However, there remain issues that need to be addressed.

1. The description of the abstract should be improved. For example, the results should be revised, and important information about the benefits of ERT and SC for students should be added in the new version. The author(s) should try to incorporate constructive and helpful changes that improve the abstract.

2. About Section 6, Conclusions: based on the obtained results, the reviewer suggests the authors add some suggestions or propose some practical strategies. The added content needs to correspond to the abstract.

3. About the Conclusions (page 18, line 615 to page 19, line 631): the reviewer suggests the authors add a Section 7 (Limitations and Directions for Future Studies). That would let the reader better understand the limitations of this research.

4. On page 20, line 654: should Figure 6 be renamed Figure A1? (The author(s) should confirm the relevant naming format. If the figure name is changed, please re-check the figure names and numbers throughout the study, e.g., lines 214, 303, and 654.)

Comments for author File: Comments.docx

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

The authors have made quite a few changes in the methodological section (and in others, which have improved the quality of the work) and in the analysis of the results, but some issues remain that are incorrect, and not acceptable in a JCR Q2 journal.

When reporting the mean age of the participants, they give the variance (when the standard deviation is what corresponds).

For the emotional assessment items, assessed through an instrument, the reliability of the scores can be calculated by some means (alpha, omega, etc.), but the authors again fail to report it. They calculate mean scores and measures of dispersion (standard deviation), as well as box-and-whisker plots (which is adequate). If the authors perform these calculations, they have the data available to report the reliability of the scores. This is the third time this reviewer has requested this information; it is not admissible to omit it from the article again.
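
(For context, the alpha coefficient the reviewer requests can be computed directly from the item-response matrix. A minimal NumPy sketch follows, on fabricated responses rather than the study's data.)

```python
# Minimal sketch of Cronbach's alpha for a set of Likert-type items.
# The response matrix below is fabricated for illustration only.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 6 respondents x 4 items on a 1-5 scale (hypothetical data).
responses = np.array([
    [4, 4, 5, 4],
    [3, 3, 3, 4],
    [5, 4, 5, 5],
    [2, 3, 2, 2],
    [4, 5, 4, 4],
    [3, 3, 4, 3],
])
print(f"alpha = {cronbach_alpha(responses):.3f}")
```

McDonald's omega, the reviewer's other suggestion, requires a factor model and is usually taken from a dedicated package rather than computed by hand.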

They perform an ANOVA, presented in Table 5, but they report its results incorrectly: neither the degrees of freedom nor the value of the F test is reported. Nor do they perform a post-hoc test to detail between which groups the differences are statistically significant. It is also common practice to report the effect size associated with the ANOVA (likewise not reported). These details are necessary when reporting the results of an investigation.
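
(For illustration, the full reporting the reviewer asks for — F with degrees of freedom, p-value, effect size, and a post-hoc test — can be sketched in Python with SciPy. The three groups below are fabricated placeholders for the F2F/ERT/SC scores, not the study's data.)

```python
# One-way ANOVA with complete reporting: F, degrees of freedom, p-value,
# eta squared (effect size), and a Tukey HSD post-hoc test (SciPy >= 1.7).
from scipy import stats

f2f = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0]  # hypothetical scores per modality
ert = [3.2, 3.5, 2.9, 3.8, 3.1, 3.4]
sc  = [3.9, 3.6, 4.1, 3.7, 4.0, 3.8]
groups = [f2f, ert, sc]

anova = stats.f_oneway(*groups)
df_between = len(groups) - 1
df_within = sum(len(g) for g in groups) - len(groups)

# Eta squared: between-group sum of squares over total sum of squares.
n_total = sum(len(g) for g in groups)
grand_mean = sum(sum(g) for g in groups) / n_total
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_total = sum((x - grand_mean) ** 2 for g in groups for x in g)
eta_sq = ss_between / ss_total

print(f"F({df_between},{df_within}) = {anova.statistic:.2f}, "
      f"p = {anova.pvalue:.3f}, eta^2 = {eta_sq:.2f}")

# Post-hoc: Tukey HSD identifies which specific pairs of groups differ.
print(stats.tukey_hsd(*groups))
```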

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

Dear Author(s),

Thanks for the changes you have made to the paper “Comparing Face-To-Face, Emergency Remote Teaching, and Smart Classroom: A Qualitative Exploratory Research based on Students’ Experience during the COVID-19 Pandemic”, which approaches the topic through two constructs used in the field of user experience.

  1. Despite the changes, which have enhanced the quality of the paper, I believe there are key methodological issues too important not to be addressed, namely in the analysis of results and findings section (#4) regarding the statistical analysis, as detailed below.

On the one hand, the results received statistical treatment using MATLAB® software.

On the other hand, it is said that “pairs showing a statistically significant difference of means were written in bold (p-values obtained (alpha 0.05) from all the pairwise tests). Despite the statistical treatment of the results of the Emotional Appraisal, it should be noted that these results were obtained in the context of an exploratory qualitative analysis. Therefore, this analysis was appropriate in the context of this research, which was oriented to collecting the opinions and feelings of ICT engineering undergraduates once they had experienced three class modalities”.

So, it should be noted here that it is more accurate to report an ANOVA test by stating the degrees of freedom and the value of the F test in standard notation. This is of the utmost importance for presenting the analysis and confirming its reliability.

Example: There was a statistically significant difference between groups as demonstrated by one-way ANOVA (F (2,47) = 3.5, p = 0.038).

Please describe more, and be more precise, in the presentation of the results and findings sections and in their link to the discussion section (this is what comes from the heart of the MS, the methods section).

  2. Regarding (see 2) the “value-added of the study”, I suggest rephrasing, as it does not answer what was asked.

Example: The paper contributes to the literature by reflecting the lack of integration of policies and strategies in HEIs in a Southern European country (Portugal), within the framework and goals of the UN DESD 2005-2014, and explaining similar patterns probably existing in other countries.

3. The appendix is not quite visible. I would suggest not using an Excel screenshot.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 3

Reviewer 3 Report

Dear Author(s),

I would recommend publishing, but beforehand I would recommend a small amendment (removing the lines in axis X in Appendix A) after the corrections presented in the 3rd version, namely methodological issues.

Author Response

Thank you for your comment. We have removed the grid lines in Figure A1, as you suggested.

This manuscript is a resubmission of an earlier submission. The following is a list of the peer review reports and author responses from that submission.


Round 1

Reviewer 1 Report

Dear authors,

I read your article, which discusses a topic of great interest.

However, major improvements are needed, such as:
- the novelty of your study must be added in the introduction
- a part of the introduction needs to be moved to a "literature review" section
- the "literature review" section also needs to be improved
- it is necessary to add some research hypotheses
- the research hypotheses must be tested with statistical tests
- the sample included in the analysis is very small. How many students are in the engineering program? Why didn't you include them all in the study?
- in the methodology part, the research tools used must be explained more clearly
- you can add some information from the study in a section of annexes
- it is necessary to include statistical correlations to test the results obtained
- the discussions should be presented in comparison with results from other studies
- I also put some comments directly on the text. See the attached file.

These are just some of the elements that need major improvement.

Good luck! 

Comments for author File: Comments.pdf

Reviewer 2 Report

The article does not meet the requirements for publication in this journal. The manuscript does not follow the standards or the usual sections of educational research: Introduction, Method (Material and Method), Results and Discussion.

The keywords used do not correspond to internationally accepted descriptors in the context of higher education. The abstract does not offer evidence of the results at a statistical level, probably because the data offered are based only on elementary, very basic analyses, which are not admissible in a journal like this one.

The title is excessively long, in addition to not reflecting what is presented in the article: at no time is a qualitative investigation or analysis developed. The authors limit themselves to using two instruments, which are not described with precision; they report neither their metric qualities (validity and reliability) nor other essential aspects of them.

The introduction does not include a detailed review of the different studies on the international scene related to the subject under study, even though different methodologies were used during the analyzed period, throughout the global pandemic.
At the end of the introduction, an objective is presented, formulated in an inconsistent or incomplete way, which is also not specifically answered in the study.

In the Method section (Material and Method), the participants are not described: neither the size of the sample is indicated nor are their sociodemographic details specified. The instruments used are insufficiently reported and in an inappropriate place (details of the instruments are given in the Introduction section).

As indicated, the characteristics of the information collection instruments are not specified in detail.

In relation to the data analysis and the results presented, they are very simple: they are limited to an elementary descriptive study, reporting response percentages and mean scores, with no other analysis. Neither the number of participants in the study is known, nor whether the data follow a normal or a non-normal distribution, and no parametric or non-parametric test is reported. The evidence presented is therefore minimal and has no power of generalization to other contexts, and it does not provide evidence that is significant for the educational community or for education professionals in the university context.

It would have been necessary, at the very least, to analyze differences of means between the results obtained with the different methodologies (or to perform an analysis of variance, if parametric tests could be applied), or to carry out some other, more detailed analysis.

In this sense, both the discussion and the conclusions have no weight or value, as they are not supported by rigorously or validly reported results.

The authors use 31 references from the last 5 years out of the 58 cited, so the recency of the references is adequate, but a further review of international works would be necessary.

In conclusion, in the opinion of this reviewer, the article does not meet the requirements to be accepted.

Reviewer 3 Report

Thanks for submitting this research study. This is a well-written paper investigating college students' responses to face-to-face, online, and hybrid class formats under COVID-19. However, I do have some concerns regarding the paper:

1. Regarding the instrument applied to collect students' responses, the Emotional Appraisal Questionnaire, the authors could provide more detailed background information.

2. It would be better if the study provided more information on how the data were collected. It was only indicated in the study that the survey was uploaded to a management system. However, is it a platform where all students submit the survey after class? Or do they get a link to complete the survey and then upload it to the system? Or is it a paper-form instrument for students to complete and then upload?