Peer-Review Record

Design, Evolution, and Evaluation of a General Chemistry-Bridging Course

Educ. Sci. 2023, 13(9), 891; https://doi.org/10.3390/educsci13090891
by Scott A. Reid
Reviewer 1:
Reviewer 2:
Reviewer 3:
Submission received: 21 July 2023 / Revised: 29 August 2023 / Accepted: 1 September 2023 / Published: 3 September 2023
(This article belongs to the Section Higher Education)

Round 1

Reviewer 1 Report

Here are minor edits that I suggest:

line 61: add s to effort

line 89: and other places - write out credit instead of using cr.

line 177: Likert-scale (insert hyphen)

In paragraph that starts on line 212 - remove underlined words

Please see the comments above for editorial comments.

Author Response

Here are minor edits that I suggest:

line 61: add s to effort

Response: changed.

line 89: and other places - write out credit instead of using cr.

Response: changed.

line 177: Likert-scale (insert hyphen)

Response: changed.

In paragraph that starts on line 212 - remove underlined words

Response: changed.

 

Reviewer 2 Report

On page 1, line 43 I think the authors identify the principal problem in this area, lack of engagement on the part of students who really need it. However, it is not clear how this contribution really seeks to address this.

I enjoyed the introduction and the points made rang very true. It is a perceptive introduction, which shows a strong grasp of the literature and is highly readable. However, it lacked a clear statement on the theoretical framework guiding the contribution. It might be a reviewing cliche to point out the absence of research questions, but they would have provided a focus for the authors and readers to judge the success of the contribution.

The authors acknowledge the problem with uptake, but it is very difficult to gauge anything based upon 2 students over 3 years. As an associate dean, I am surprised the course continues to be run. Why is it not offered before General Chemistry is begun to build self-confidence?

On reading the course description I was disappointed there was little detail and no examples on how this critical thinking and active learning was implemented and achieved. I know there are tables in the supplementary, but I am more interested in the approach than the content and the paper is content focussed.

Page 5, line 148: This is ambiguous: are these bridge course numbers those retaking the bridging course or retaking General Chemistry I? Only later in the manuscript does it become clear that this is about retaking General Chemistry I.

Re Figure 1: I am all for figures but this figure simply presents two percentages.

The acronyms QPA and DFW need defining to support non-US readers.

Page 7, line 177: Were students asked to grade from 6 to 1? Or from excellent to very poor? This opens the debate about taking the means of ordinal data. Better to simply report the median.

Page 7, line 181: Presumably this is all the qualitative results. Please state.

In conclusion, this is a nice idea, and the paper makes some excellent points. However, the evidence for those points is largely from the literature or is left to the intuition of the reader. These are highly preliminary results and to be fair the authors acknowledge that. I'd like to see the authors address the disconnect between the statements and the philosophy in the introduction and the limited impact of this study and the reasons for that limited impact.

Author Response

On page 1, line 43 I think the authors identify the principal problem in this area, lack of engagement on the part of students who really need it. However, it is not clear how this contribution really seeks to address this.

I enjoyed the introduction and the points made rang very true. It is a perceptive introduction, which shows a strong grasp of the literature and is highly readable. However, it lacked a clear statement on the theoretical framework guiding the contribution. It might be a reviewing cliche to point out the absence of research questions, but they would have provided a focus for the authors and readers to judge the success of the contribution.

Response: We have added a statement on the theoretical framework and guiding research questions to the revised document, which immediately follows the statement of IRB approval. The guiding framework is growth mindset theory, and from that viewpoint the bridging course itself can be considered a growth mindset intervention.

The authors acknowledge the problem with uptake, but it is very difficult to gauge anything based upon 2 students over 3 years. As an associate dean, I am surprised the course continues to be run. Why is it not offered before General Chemistry is begun to build self-confidence?

Response: We agree with the reviewer that the online version of the course has been a disappointment. We hypothesize that there are two reasons for this: 1) the cost of the course, and 2) the fact that students may not wish to disrupt their winter break. Thus, as we note in the article, we are addressing these concerns moving forward by offering a 0-credit J-term section and also bringing a section back into the fall term as originally conceived. Regarding the potential of offering this course before the semester, as we summarize in the introduction, our experience with prep modules or courses is that at-risk students do not complete them at the same rate as other students. This mirrors what is found in the literature.

On reading the course description I was disappointed there was little detail and no examples on how this critical thinking and active learning was implemented and achieved. I know there are tables in the supplementary, but I am more interested in the approach than the content and the paper is content focused.

Response: Thank you for this critique.  In the revised version, we have added specific detail in the supporting information on the types of assignments used and examples of each.

Page 5, line 148: This is ambiguous: are these bridge course numbers those retaking the bridging course or retaking General Chemistry I? Only later in the manuscript does it become clear that this is about retaking General Chemistry I.

Response: In the revised version, we have clarified this point by emphasizing that the retake refers to General Chemistry I. We have also emphasized this point in our guiding research question, which is focused on the immediate impact of the bridging course.

Re Figure 1: I am all for figures but this figure simply presents two percentages.

Response: Good point, we have removed this figure in the revised version.

The acronyms QPA and DFW need defining to support non-US readers.

Response: We have defined these terms in the revised version.

Page 7, line 177: Were students asked to grade from 6 to 1? Or from excellent to very poor? This opens the debate about taking the means of ordinal data. Better to simply report the median.

Response: The instrument includes both descriptors (i.e., 6 = Excellent, and so on).  In the revised version we have reported the median as suggested.

Page 7, line 181: Presumably this is all the qualitative results. Please state.

Response: The instrument includes both descriptors (i.e., 6 = Excellent, and so on).  In the revised version we have reported the median as suggested.

In conclusion, this is a nice idea, and the paper makes some excellent points. However, the evidence for those points is largely from the literature or is left to the intuition of the reader. These are highly preliminary results and to be fair the authors acknowledge that. I'd like to see the authors address the disconnect between the statements and the philosophy in the introduction and the limited impact of this study and the reasons for that limited impact.

Response: The reviewer makes some excellent points. While we have added additional text clarifying the impact of this study in the revised article, it is worth noting that the initial three-year offering did demonstrate impact, capturing 21% of students who withdrew initially from the course over that period. The online course fared much worse, which is actually an important aspect to capture moving forward in the debate between in-person and online offerings.

Reviewer 3 Report

Thank you for the opportunity to review this paper. 

I like the implementation of the intervention; however, I think the methods section needs to be more detailed. Can the authors talk about the instructors/classroom environment? There are so many other factors that could come into play in the trends reported in the results, so it is important to provide information about the setting and other factors that could have affected the results. Were all students recruited from the same section? Did they all have the same instructor?

Also, could you please provide the exact years for the initial course design and the online course design? Because of the limited number of participants, I would recommend only looking at the initial design, or collecting more data based on the online design. I do not believe the data for both should be combined because of the difference in the course structures.

I think the quality of the writing in the paper is fine. 

Author Response

I like the implementation of the intervention; however, I think the methods section needs to be more detailed. Can the authors talk about the instructors/classroom environment? There are so many other factors that could come into play in the trends reported in the results, so it is important to provide information about the setting and other factors that could have affected the results. Were all students recruited from the same section? Did they all have the same instructor?

Response: We thank the reviewer for these comments. In the revised article, we have clarified that students were recruited from all sections, and thus from various instructors. As also suggested by reviewer 2, we have added significant detail on the methodology (assignments, etc.) used in the course in the supporting information.

Also, could you please provide the exact years for the initial course design and the online course design? Because of the limited number of participants, I would recommend only looking at the initial design, or collecting more data based on the online design. I do not believe the data for both should be combined because of the difference in the course structures.

Response: We thank the reviewer again. In the revised article, we have provided information on the years for the course offerings, shown in Table 3. While both sets of data were combined because the focus and overall methodology of the courses were similar, in reality the number of students in the online offering was so small as to make only a minor contribution to the data. As the difference in grade distributions is at the limit of significance, we tried not to over-emphasize the quantitative aspect.

 

Round 2

Reviewer 3 Report

I still think the methods section can be expanded upon. You want to provide enough details for someone else to carry out this study in a different location.

I think the quality is fine. 

Author Response

We appreciate the reviewer's response and have added to the supporting information detailed worksheets for each module of the online course, which provide a module overview and specifics on the assignments. We hope that this will provide sufficient detail for any interested reader to implement the course.
