5.1. Comparative Analysis
The following points can be raised for a comparative analysis and discussion of the three analyzed courses at the University of Göttingen:
It is notable, and perhaps below expectations, that fewer than half of the registered students actively used the video files for learning. This contrasts with the results reported in [52], where most students in a Mathematics course used video lectures as their primary learning material. It is all the more surprising because, before the COVID-19 pandemic, students routinely asked for video files and recordings of lectures. Two hypotheses offer possible explanations: First, video files might actually be less attractive for learning than, for example, simple slides in PDF files, at least for a major part of the student body, e.g., due to the fixed pace of watching a video. When learning with slides, students can vary their pace, slowing down for parts that are harder to grasp and speeding up for topics that are easier to comprehend. Based on their individual learning approaches [53], different students prefer different learning materials (including PDF files, video files, or referenced textbooks). Peimani and Kamalipour [54] argue that using multiple communication channels can also result in deeper learning through the representation of multiple viewpoints. Similar interactions between learning pace and (digital or non-digital) teaching channels were also found in [42]. Second, the low rate of video consumption might be connected to students' unusual and untrained time management. The lecturing concept required students to prepare for the synchronous Q&A sessions by watching the relevant video; students who did not manage to do so beforehand might have been inclined to skip the video altogether, assuming that simply attending the live Q&A session would partly replace their own video study session. Another explanation for student engagement is provided in [55], where students' levels of self-regulation and digital capabilities were identified as predictors of their engagement in online teaching. Additionally, due to the very short lead times, our videos did not have subtitles, hindering accessibility for deaf students [56].
In addition, the synchronous study elements had very low student participation rates. Again, two explanations are possible: First, the expanded multi-media learning material might have been sufficient, leaving most students with no open questions. This may also be because more written documents, i.e., solutions for tutorial exercises, were provided, so that detailed videos or the Q&A sessions may not have been perceived as necessary. In fact, the low participation rate corresponds with the similarly low rate of live questions in traditional face-to-face lectures and courses (with larger groups) at our faculty in Göttingen. We do believe that the Q&A sessions are an effective option for students who would not dare to ask their questions in a face-to-face format, as the potential exposure is higher in a full lecture hall than in an online Q&A session, where students could choose to use a pseudonym. Second, there might have been other hurdles to participation. For example, scheduling conflicts might have arisen as all lectures and courses went online during COVID-19 lockdown periods, as also found in [54]. For the (synchronous) online sessions, there was no administrative scheduling management to avoid collisions, such as is normally implemented for face-to-face sessions. Options to increase student participation in online teaching are also discussed in [57], which points out that multiple instruments and channels need to be combined to foster student engagement.
The slightly reduced exam performance in the largest course (production and logistics) can have a multitude of causes. It is not necessarily due to the changed lecture format; it can, for example, also be traced back to a generally higher emotional and cognitive stress level among the general population and the student population during COVID-19 lockdowns. Similarly, significantly lower student performance was found during courses in the COVID-19 pandemic in [58].
The fact that exam participation rates for face-to-face courses pre-COVID-19 and online courses during 2020 are at a similar level suggests that the "hard to reach" student groups remained similar, facing the same limits and problems, and were not affected by the change of medium in the teaching and learning setting. We believe that the didactic concept with weekly Q&A sessions is an effective way to counteract procrastination during the lecture period of the semester, because it encourages students to engage in learning regularly and actively. However, we could not analyze the exact times of students' accesses to course materials, as was done in [58], for example, to evaluate the impact on student procrastination. Additionally, as attendance of lectures at our University is not compulsory for students (whether digital or face-to-face), options to engage the hard-to-reach students are limited, and, as Scherrer [29] notes, it is unclear whether this is the lecturer's responsibility at all.
The student feedback and evaluations were on average more positive during COVID-19 than before. This is interesting, and a possible bias due to self-selection in the online evaluation has to be checked and reconsidered (mainly students who already participated strongly in the digital teaching offers may also have used the online feedback system). Traditionally, evaluations at the University of Göttingen are conducted in presence rather than in digital format, which raises this bias question further. Miltenburg [42] found no significant changes in the course evaluation. Our data allow for a more detailed investigation of improvements: statistically significant improvements were found in the overall rating of the mandatory P&L course as well as in the usefulness of the provided media in both courses (P&L and MM). Additionally, the required preparation and follow-up of materials increased, as was expected due to the change of the didactic concepts. Interestingly, the behavior of the lecturer was perceived as fairer in the face-to-face format of P&L. This may be because lecturers are more tangible for students in face-to-face formats, especially regarding their answers and reactions to students' questions or contributions.
Most students mentioned in the teaching evaluation that the digital formats offered more options for interaction and feedback. This hints at the possibility of implementing specific digital elements in post-COVID-19 university teaching as well.
A shift was observed regarding the acceptance of online and digital communication systems for university teaching: in the first months, up to half a year, students accepted many different tools and software applications, mainly because they were happy to receive any teaching at all. However, after about half a year, students increasingly criticized the multitude and "chaos" of different digital teaching tools. This led to a standardization and reduction of digital teaching tools over the 12-month COVID-19 period.
It was further observed that, for the different tutorials and courses, the digital setting allowed for quality checks and standardization, as, for example, identical and jointly produced videos were used for all of these sessions with different student groups. In the case of mistakes or feedback from students, it was easier to correct the materials in a standardized fashion for all tutorial groups than it would have been in face-to-face courses.
To a great extent, students preferred specialization courses during the digital teaching phase caused by COVID-19. This can be linked to the harder scheduling task for students mentioned above: avoiding parallel courses was harder overall, but less of a problem for specialization courses than for basic ones. The difficulty stemmed from the partly uncoordinated timelines and schedules of digital courses; this could be improved for further digital teaching periods, as the lack of coordination was mainly a consequence of the short-notice switch to digital teaching in 2020.
Regarding the graduate seminar, using the students' own computers to watch a lecture and use a simulation software simultaneously was easier in a video conference session than in a classroom. However, our experience was that most students did not prepare: they neither tried the software beforehand nor programmed the short exercise explained in the recorded video. Regarding student supervision, the advantages of the digital format were higher flexibility, fast assignment of appointments, and shorter meetings. From the lecturer's perspective, students needed more support, or at least asked more often for a consultation meeting, which could also be due to the lower barrier of arranging a digital meeting.
5.2. Limitations
General conditions of our students regarding, e.g., mental health, technical equipment, or the impact of the extended deadline for opting out of an exam should also play a role in determining the students' performance, but could not be analyzed with the available data. Moreover, regarding H3, we could not match all students who participated in the Q&A sessions with the exam candidates, because we allowed students to use arbitrary aliases in Discord, and they could delete their Discord accounts after the exam, making it impossible to identify them. However, most students actually used their full names and retained their accounts, so this bias should be minimal. Furthermore, the registration numbers of exam participants can differ, because in 2020, there was an exception to how exam registrations were handled: students were allowed to sign out of exams even 24 h after taking them (usually, they must sign out no later than 24 h before taking them). Some of the provided video files were re-uploaded during the semester because of small errors. This reset the viewership count for the respective files. However, because the number of faulty videos was low, errors were usually found quickly, and students were informed promptly, this should not have a great effect on the overall video viewership statistics. Finally, the course evaluation data could be skewed because, in Göttingen, most students are asked to fill out this evaluation during a synchronous course session. Because the Q&A sessions were the only synchronous elements and were also optional, the distribution of students participating in these course evaluations may differ from previous iterations of the course.
5.3. Implications
There is a multitude of implications that can be connected to the findings presented in this paper. The most important one is the question of individualization: in many forms and fashions, digitalization implies the differentiation and individualization of learning. This can be a positive tendency, for example through the chance to adjust to individual learning stages and capabilities better than in purely physical teaching settings. On the other hand, it is also accompanied by risks, such as students falling behind or being left behind if their personal learning characteristics are less suited to digital formats, which require specific competences and, for example, more self-organization skills.
Altogether, university teaching in a digitalized context requires intense and complex preparation as well as strategic planning. In [59], course design, pedagogical strategies incorporating active learning and providing a sense of online community, infrastructure for delivery and training, and incorporating activities that support student wellbeing were identified as success factors for digital education. In [60], student–student and instructor–student dialogues are identified as success factors. In particular, depending on the digital platform and format used, supporting student–student dialogues can be challenging [54]. In [61], challenges regarding the diversity of student backgrounds and equitable participation are highlighted. A comprehensive view of all aspects relevant to learning is essential and requires motivation, skill, and endurance from teachers in order to achieve learning-curve effects with digital instruments on both sides, for lecturers and students alike. This in turn means that most decisions regarding the specific formats, technologies, and didactics used should be located at the decentral level and not be centralized during digitalization efforts. This implies, for example, that no decision for specific software or platform solutions should be made centrally; instead, university services should provide a multitude of digital services from which lecturers can select individually.