Article

Undergraduate Student Performance in a Structural Analysis Course: Continuous Assessment before and after the COVID-19 Outbreak

by César De Santos-Berbel 1,*, José Ignacio Hernando García 1 and Laura De Santos Berbel 2
1 Departamento de Estructuras y Física de Edificación, Universidad Politécnica de Madrid, 28040 Madrid, Spain
2 C.E.I.P. Simón de Colonia, 09400 Aranda de Duero, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(8), 561; https://doi.org/10.3390/educsci12080561
Submission received: 13 July 2022 / Revised: 12 August 2022 / Accepted: 16 August 2022 / Published: 18 August 2022
(This article belongs to the Section Higher Education)

Abstract

The COVID-19 pandemic situation in 2020 forced educational institutions worldwide to fully adopt online learning for both teaching and assessment. However, this change may lead to less satisfactory learning outcomes if the online technologies used are not adequately applied. This study compares student engagement and performance through online continuous assessment in a one-semester structural analysis course for undergraduate architecture students before and after the pandemic outbreak. Online continuous assessment assignments had already been deployed and validated in the course evaluation system before the outbreak, and they were further leveraged during the online course. These assignments consisted of three weekly Moodle questionnaires throughout each of the fifteen course weeks, which determined the continuous assessment score. More than 200 students participated in each period. The results showed that shifting to online education affected continuous assessment outcomes very little in terms of participation rates and student performance. The possible underlying causes for the slight differences found between the two academic years are also discussed. The results highlight the robustness of the continuous assessment method used and emphasize the importance of having developed and validated online learning procedures to deliver learning activities when contingency situations arise.


1. Introduction

The COVID-19 pandemic situation in 2020 forced educational institutions worldwide to leave face-to-face education and adopt an online learning modality for both teaching and assessment. Educational institutions faced the challenge of providing teaching of sustained quality while hard lockdowns were in effect, with a very short transition period. Despite such circumstances, the interest and involvement of students had to be maintained remotely, taking advantage of available technologies, developments, and the associated experience. Over the previous years, learning and communication platforms had already become widespread, transforming the learning environment. After the outbreak, these technologies had to be deployed to replace physical classrooms. A higher degree of deployment of online learning technologies makes it more feasible to preserve the involvement of students [1]. If these objectives are not met, concerns arise about the learning outcomes, not only during the pandemic but also in the years to come, due to cumulative learning loss [2].
Among its measures, the European Higher Education Area promotes continuous student work and its corresponding assessment by lecturers. This method is based on course evaluation through daily class work, course-related projects, and practical work, instead of relying exclusively on a final examination system [3]. The main advantage of this approach is that students receive feedback on their progress throughout the course, allowing university lecturers to adapt their response to the needs of the students.
Teaching the concepts of structural analysis involves several peculiarities. Students must know how to calculate structures correctly, developing the ability to understand the physical behavior of a structure as a whole, as well as of its individual elements [4]. Moreover, the learning development should provide techniques for future structural design practitioners to evaluate the reasonableness of the design outcome [5]. With these objectives in mind, various successful strategies have been devised for teaching structural analysis, e.g., problem-based learning, project-based learning, or design, building, and testing [4,6,7,8]. According to the literature, other learning methods could be incorporated to a greater or lesser degree in the teaching of structures, in both blended and electronic learning. Among the most outstanding methods are interactive lectures, experiential learning, and inquiry-based learning [9]. Yet, maintaining these learning approaches while the lockdowns were in force was an additional challenge for instructors.
Given these factors, the purpose of this study is to analyze and compare student performance using online continuous assessment in a structural analysis course for undergraduate architecture students before and after the COVID-19 outbreak. The comparison is made on the quantitative basis of the assignment scores and participation rates, taking into account that the assessment periods examined followed the same scheme and criteria.

2. Previous Work

2.1. Blended Learning and Electronic Learning

Technologies and different educational applications have not only boosted a more digitalized and globalized education model, but have also helped to modify teaching under such exceptional circumstances as a pandemic. Due to the COVID-19 pandemic, educational institutions across the world found themselves in the position of having to switch rapidly from face-to-face to online teaching.
Before the COVID-19 pandemic outbreak, blended learning and electronic learning-based course schemes had already experienced a large increase, with a considerable number of reported advantages, while retaining many of the strengths of face-to-face classes. Several authors have evaluated the suitability of blended learning and electronic learning schemes. A previous study conducted by Ortiz et al. [10] confirmed that there was a high correlation between the final marks of students and their continuous assessment results in blended learning.
University online courses (UOC) and massive open online courses (MOOCs) led the way in implementing completely online courses with continuous self-assessment. MOOCs are becoming increasingly popular around the world, as they enable students to select the composition of their own training, combining flexibility with the advantage of being taught by world-renowned instructors [11].
Previous experience with distance education has been revealed to be a key factor in the success of implementing online courses, facilitating the adaptation of both students and instructors [1,12]. In teaching units with experience in distance education, lecturers can more easily replicate physical classrooms using the existing online education infrastructure. Moreover, experience indicates that the entire university community might benefit further if distance education takes place throughout the curriculum in a planned and uniform manner [13]. After more than a year of using the education system necessitated by the pandemic, various teaching modalities have been tested. Some studies maintain that the most favorable option is teaching with a certain degree of face-to-face contact, i.e., hybrid teaching, because interaction with teachers and classmates is fundamental for a positive learning experience [14]. These modes of teaching have been essential, along with other advanced technologies, in courses that necessarily require face-to-face interaction [15]. However, many experimental learning-based courses have been forced to be delivered entirely online due to severe local restrictions [16].
A fundamental issue in online learning courses is maintaining the active participation and engagement of students and avoiding dropout [17,18]. This is fundamental for all education levels in the context of a forced shift to online learning due to the pandemic. Virtual learning environments, with introductory low-stakes assessments, have been acknowledged to encourage high levels of student engagement, which in turn helps maintain or enhance the student experience [19]. However, Turnbull et al. [13] maintain that it is unrealistic to expect lecturers and students to adapt to online learning by repeating patterns of behavior and processes that worked for face-to-face learning.
The most severe restrictions as a reaction to the pandemic took place in the period from March to June 2020, a period for which some academic performance and learning quality results are already known. According to higher education students, course features and instructor characteristics, as well as technical and social support, remain crucial for ensuring the quality of education [20]. Students also have a positive view of both the adaptation to blended learning due to the pandemic in technical university courses and the degree to which digital technologies facilitated it [1,21]. The learning quality perceived by engineering students was generally maintained when comparing face-to-face to online learning, except in the basic course units taught in the first year [22]. In the context of very low human interaction, advanced tools successfully assist students in following the course [23,24].
Technologies to deliver online education can be classified into two main groups: asynchronous and synchronous. Asynchronous learning systems are built on communication platforms that do not require time-sensitive interactions between stakeholders in the education process [25]. Learning management systems are distance learning platforms that facilitate stakeholder interactions based on a ‘request-response’ framework, generally unconstrained by time limitations. Modular Object-Oriented Dynamic Learning Environment (MOODLE) is one of the most popular online learning management systems, standing out for its high interoperability, large toolset, and advanced features. These features help instructors support student involvement and achieve better learning outcomes. MOODLE has proved useful for gauging student learning performance in relation to the final evaluation [26]. A recent study indicates that the number of activities carried out in MOODLE at a university with a face-to-face scheme tripled compared to the year prior to the pandemic outbreak [27].

2.2. Continuous Assessment

Encouraged by the Bologna Process, continuous assessment aims to support student learning in higher education through the monitoring of student academic performance, enabling adequate instructor feedback while increasing student motivation for learning. Continuous assessment has positive impacts on learning, as it encourages student follow-up, continued work, and interest in the course, thus reducing dropout rates [28,29]. Calculating the final grade from a series of scores, rather than on the basis of a single final exam, helps to ensure that the student’s performance is not unduly affected by possible setbacks on the day of the final exam [30]. Students perceive an important incentive of continuous assessment when assignments are summative, for which real effort is required, rather than voluntary assignments or prerequisites for examination [31,32]. In continuous assessment, moreover, frequent practice of applied exercises similar to those proposed on exams improves motivation, increasing the learning and success rates of students [3]. As a result, a positive correlation is generally found between continuous assessment and student achievement [33].

3. Course Syllabus

The structural analysis course from which the data for this study are taken is a compulsory subject of the Degree in Fundamentals of Architecture of the Technical University of Madrid. It is taught in the third year and covers one semester. Six hours of teacher-centered classes are given each course week, distributed into two three-hour sessions. The first half of each session consists of a master class lecture, and the second half consists of an interactive problem-based learning workshop where students complete part or all, as the case may be, of the continuous assessment assignments. The course comprises six credits, according to the European Credit Transfer and Accumulation System (ECTS).
The learning outcome is to obtain graphs of internal forces, maximum structure stresses, maximum deflections, and member dimensioning given the structure load state and the corresponding limit state method (ultimate limit state and serviceability limit state).
The content of the course comprises the fundamental knowledge for the linear and non-linear analysis of building structures. It is offered every year, with the syllabus in the same order. The course covers the following topics (course week is indicated in parentheses):
  • Fundamental concepts of the course: safety of building structures (W 1–2) and dimensioning of structural members (W 3–10).
  • Analysis methods: linear elastic analysis (W 1–9), theory of second order analysis (W 10), and plastic calculation (W 11, 12, 14, 15).
  • Structural typology: plane and space trusses (W 1–3), isostatic beams (W 4, 11), multi-span hyperstatic beams (W 5, 11), isostatic frames (W 4, 11), hyperstatic frames (W 5–12), arches (W 5), grillage structures (W 13–14), and slabs (W 15).
  • Equations and methods: equilibrium equations (W 1–14), compatibility equations (W 4–10, 12, 13), and constitutive equations (W 4–10, 13); force method (W 4, 13) and matrix method (W 5–9, 12, 13); principle of virtual displacements (W 1–15).
  • Internal forces: simple bending (W 3–14), mixed bending (W 5–14), tension (W 1–12), and compression (W 3–12).
  • Deflection of structures (W 1–10, 13).
In the syllabus of the Degree in Fundamentals of Architecture, the subject of this study is preceded by a one-semester introductory course on the mechanics of continuous media.

4. Continuous Assessment Procedure

The final grade of the structural analysis course on which this study is focused is composed of two scores: one resulting from the completion of two exams and another derived from the continuous assessment. The study presented hereby compares the student performance throughout the continuous assessment of academic years 2019–2020 and 2020–2021. The characteristics described below are common to both academic years.
The continuous assessment score was calculated as the average of fifteen weekly scores. Each weekly score is, in turn, the weighted average of the scores of two assignments. Figure 1 displays the timeline of a generic course week, along with the corresponding questionnaires. The first one is a short, exam-like, applied synchronous questionnaire related to the content taught in the master class, which must be completed during the subsequent workshop time. The second one is a long, exam-like, applied asynchronous questionnaire that students begin to complete during the workshop time, with an estimated required time of 6 to 8 h. Students have one week to complete this long questionnaire and are required to take a complementary synchronous test in the first class of the following week. In this complementary questionnaire, a selection of results from the long exercise is requested in order to verify the student’s own work, and its score is averaged with that of the long questionnaire. Although students work on a single practical problem in the long questionnaire, its extent is large because students are asked for multiple intermediate results that guide them in solving the problem. In this way, a guided instruction method is applied [9].
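To make the aggregation concrete, the sketch below (in Python) mirrors the scheme just described: the long questionnaire score is first averaged with its complementary verification test, each week then combines the two assignments, and the continuous assessment score is the mean of the fifteen weekly scores. The function names and the equal weighting of the two weekly assignments are illustrative assumptions, since the exact weights are not stated here.

```python
# Illustrative sketch of the weekly aggregation described above.
# The helper names and the 50/50 weighting are assumptions; the article
# does not specify the weights of the two weekly assignments.

def weekly_score(short_score: float, long_score: float,
                 verification_score: float,
                 w_short: float = 0.5, w_long: float = 0.5) -> float:
    """Weighted average of the two weekly assignments (0-10 scale).

    The long questionnaire score is first averaged with the complementary
    synchronous verification test, as described in the text.
    """
    long_combined = (long_score + verification_score) / 2
    return w_short * short_score + w_long * long_combined


def continuous_assessment_score(weekly_scores: list[float]) -> float:
    """Average of the fifteen weekly scores."""
    return sum(weekly_scores) / len(weekly_scores)
```

For instance, under these assumed weights, `continuous_assessment_score([weekly_score(6.0, 7.5, 7.0)] * 15)` returns 6.625.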
These questionnaires are prepared to be uploaded to MOODLE and are administered and corrected with a high level of automation, following previous developments by professors of the Technical University of Madrid [34]. The questionnaires combine calculated-type questions, interfaces for drawing graphs of internal forces, and drag-and-drop questions [35,36,37,38]. Figure 2 shows an example of the rubric and a question that asks the students to draw the graph of internal forces in a MOODLE questionnaire. Although the rubric is essentially the same in each questionnaire for all students, different variants are automatically generated that incorporate a random parametrization of the initial data, such as forces or dimensions. This measure aims at preventing students from copying each other, while not discouraging collaboration in solving the questionnaire. Short questionnaires allowed only one attempt and were set up to provide the correct answer to students at the end of the class. Conversely, long questionnaires allowed an unlimited number of attempts, and feedback as to whether the answer was correct or not was provided after each attempt, without revealing the correct value. This formative feedback, together with a summative score, is known to increase student motivation, encouraging students to try to complete the questionnaire with the maximum score [32,39]. To aid students, the lecturers made themselves available to resolve doubts during additional tutoring sessions. The questionnaires were scored on a scale that varied linearly from 0 to 10 points, with 5 being the minimum pass score. The suitability of these online learning tools for blended learning has been successfully validated [10].
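The per-student parametrization can be pictured with a short sketch. This is not the MOODLE calculated-question mechanism itself, only an illustration of the idea: every student receives the same rubric, while a seeded random generator produces different initial data. The parameter names and value ranges below are hypothetical.

```python
# Sketch of per-student random parametrization of initial data (forces,
# dimensions). Illustrative only; parameter names and ranges are assumptions.
import random


def generate_variant(student_id: str, seed: int = 2020) -> dict:
    """Return a reproducible set of initial data for one student."""
    rng = random.Random(f"{seed}-{student_id}")
    return {
        "point_load_kN": rng.choice(range(10, 55, 5)),      # applied force
        "span_m": round(rng.uniform(4.0, 8.0), 1),           # beam span
        "section_depth_mm": rng.choice([200, 250, 300, 350]),
    }


# Every student answers the same rubric but with different numbers, so the
# numerical solutions differ while the solution procedure can be shared.
variant = generate_variant("student_042")
```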
The most important difference between the two academic years analyzed is that classes were conducted entirely face-to-face during the academic year 2019–2020, whereas they were fully online and synchronous in the academic year 2020–2021. In the latter year, instructors used the Zoom application to set up live virtual classrooms [40]. In the context of the COVID-19 pandemic, courses shifted relatively quickly to an online learning scheme in March 2020, forced by the strict lockdown declared in Spain. Therefore, students had almost one semester to adapt to the online learning scheme, which resumed for the new academic year in September 2020. For example, in the workshop portion, students were able to raise their doubts and questions effectively using Zoom annotation tools on the questionnaire rubric shown on a shared screen.
Although the syllabus scheme is maintained each academic year, new questionnaires are proposed each year from a large database developed by the professors. The designs of the structures to be analyzed in the respective questionnaires differ significantly from those proposed in previous years. This measure prevents academic dishonesty, because solutions are not readily available to students. In addition, it removes possible dependence between the data samples analyzed in this study. Another important fact allowing for the comparative study presented here is that the calculated and numerical questions allow for an objective quantitative assessment.

5. Results and Discussion

The results presented in this section are based on the scores obtained by 210 students during the academic year 2019–2020 and 213 students in the academic year 2020–2021. The students were distributed into three classroom groups taught by three different lecturers in a fully coordinated manner. Table 1 displays the main statistics of the academic years analyzed, including the total number of students that participated, the total number of scores corresponding to questionnaires completed by students, the associated participation rate, and the corresponding overall average. It is observed that both the participation rate and the average score of the questionnaires completed were higher in year 2020–2021.
Based on these data, the participation rates, statistical analysis of student performance, and its evolution have been studied.

5.1. Participation Rates

Two indicators were considered to analyze the degree of involvement of students in continuous assessment. First, the net participation rate of a questionnaire is defined as the number of students who completed the questionnaire divided by the total number of students; the net participation rate for each week is then the average net participation rate of the questionnaires proposed for that week. In a complementary manner, the gross participation rate for each week is defined as the number of students who completed at least one of the questionnaires set for the given week divided by the total number of students.
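A minimal sketch of these two indicators, assuming questionnaire completions are stored as sets of student identifiers, could look as follows.

```python
# Net and gross participation rates as defined above, assuming hypothetical
# sets of student IDs per questionnaire.

def net_participation_rate(completed: set[str], all_students: set[str]) -> float:
    """Fraction of students who completed one questionnaire."""
    return len(completed) / len(all_students)


def weekly_net_rate(week_questionnaires: list[set[str]],
                    all_students: set[str]) -> float:
    """Average net participation rate of the week's questionnaires."""
    rates = [net_participation_rate(q, all_students) for q in week_questionnaires]
    return sum(rates) / len(rates)


def weekly_gross_rate(week_questionnaires: list[set[str]],
                      all_students: set[str]) -> float:
    """Fraction of students who completed at least one questionnaire that week."""
    participants = set().union(*week_questionnaires)
    return len(participants) / len(all_students)
```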
Figure 3 compares the student participation rates over the fifteen weeks of the two academic years analyzed. It can first be noticed that participation rates were more uniform in 2020–2021. In fact, participation rates in 2019–2020 began at low levels. The most likely reason for the uniformity in participation rates in 2020–2021 is that students were more aware of the need for using online assessment tools. In general, gross participation rates remained over 70% and net participation rates over 50%.
Relatively low participation rates were observed in certain course weeks. Students were able to dedicate less time to continuous assessment during week 10 because exams in other subjects are held during that week and many assignments are due. While this circumstance had virtually no impact in 2020–2021, it produced a plunge in the participation rate in 2019–2020. However, under similar circumstances, the opposite effect was observed in the last week of the course. In 2019–2020, the gap between the net and the gross participation rates was greater than in 2020–2021. This means that the students who were engaged with the course participated in a more regular manner in 2020–2021.

5.2. Assignment Scores Throughout Continuous Assessment

Figure 4 presents a boxplot that illustrates the ranges of the weekly scores for both academic years. The weekly score values plotted were calculated incorporating non-completed questionnaires with a score of 0. As the course progresses, an increase in the value of the scores is observed, but also an increase in their dispersion. These effects are more pronounced in the second year analyzed. The increasing dispersion reflects the higher frequency of null scores associated with questionnaires not taken in the last weeks of the course.
The weekly evolution of the cumulative average score of each student is used in the following figure to compare the students’ follow-up of the course. Figure 5 illustrates the evolution of the average scores of the students throughout the course for academic years 2019–2020 and 2020–2021, respectively. Each line represents the proportion of students whose cumulative average score is equal to or below the indicated value. For example, after 8 weeks, 69% of students had an average score of 5 points or less in academic year 2019–2020, while 55% of students scored 5 points or less in 2020–2021. Ideally, the lines corresponding to high scores would be as close together and as low as possible. In this sense, Figure 5b indicates that, overall, student performance in continuous assessment was more satisfactory in the academic year 2020–2021, which is consistent with the data displayed in Table 1.
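The quantity behind each line of Figure 5 can be sketched as follows, assuming a hypothetical students × weeks array of weekly scores in which non-completed questionnaires are already recorded as 0.

```python
# Sketch of the quantity plotted in Figure 5: for each week, the share of
# students whose cumulative average score up to that week is at or below a
# threshold. `scores` is a hypothetical (students x weeks) array.
import numpy as np


def proportion_at_or_below(scores: np.ndarray, threshold: float) -> np.ndarray:
    """Per week, the fraction of students whose running average of weekly
    scores up to that week is <= threshold."""
    weeks = np.arange(1, scores.shape[1] + 1)
    cumulative_avg = np.cumsum(scores, axis=1) / weeks   # running mean per student
    return (cumulative_avg <= threshold).mean(axis=0)


# With the figures quoted in the text, proportion_at_or_below(scores, 5.0)[7]
# would be roughly 0.69 for 2019-2020 and 0.55 for 2020-2021.
```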
When students face an assessment system, especially continuous assessment, there is a progression in learning how the system works. This process is observed in the first weeks of both academic years, from weeks 1 to 4 in 2019–2020 and from weeks 1 to 3 in 2020–2021. A steep increase in the average scores is observed in the former period, whereas the dispersion of the average scores increased smoothly in the latter. Noting that the average values plotted in Figure 5 also incorporate non-completed questionnaires with scores equal to 0, this observation is consistent with the low participation rates over the first weeks of year 2019–2020, which helps explain the high frequency of very low average scores. Up to that point, it can be inferred that a significant number of students encountered difficulties in getting used to the dynamics of the course, or difficulties of other types (administrative, enrollment, technology access, etc.), issues that have been observed by other authors [41]. Looking at the process of adaptation to the assessment method and the dynamics of the subject from the respective turning points mentioned above, slightly different trends can be observed for the two academic years studied. Lines with descending slopes indicate that students improved their average score as the course developed, as is broadly the case for the 2019–2020 series. In contrast, the lines representing low average scores in year 2020–2021 have ascending slopes which, in combination with the participation data in Figure 3, reflect the progressive increase in the number of students dropping out of the course.

5.3. Statistical Analysis

The statistical analysis carried out on the weekly scores is described below. Table 2 displays the weekly score statistics and indices over the fifteen weeks of the academic year 2019–2020, and Table 3 shows the counterpart statistics and indices for 2020–2021. They include the average and the standard deviation of each week’s scores, calculated by incorporating the scores of the assignments for the counterpart week, where non-completed questionnaires were scored as 0.
The discrimination index of the weekly assignments was calculated. Let x_i be a vector containing the scores of week i and T a vector containing the students’ final scores. The discrimination index of the week i assignments is the product-moment correlation coefficient between x_i and T (Equation (1)) [42].
D_i = \frac{C(x_i, T)}{\sqrt{\mathrm{Var}(x_i) \cdot \mathrm{Var}(T)}}   (1)
where C(x_i, T) is the covariance of x_i and T. The discrimination index indicates the extent to which the assignments of a particular week discriminated between the high scorers and low scorers on the course, and its values range from 0 (no discrimination) to 1 (maximum discrimination). Students who earned high scores on the other weekly assignments should also earn high scores in week i, so the score for the week and the continuous assessment score should be well correlated.
The discriminative efficiency is another index that measures the contribution of an assignment to the final score. It is expressed as the ratio of the correlation coefficient of vectors x_i and T to the maximum correlation coefficient of x_i and T obtained with their respective components sorted; since sorting does not change the variances, this equals the ratio of covariances shown in Equation (2) [42]. This index is considered more robust than the discrimination index. Its values range from 0 (no discrimination efficiency) to 1 (maximum discrimination efficiency).
DE_i = \frac{C(x_i, T)}{C_{\max}(x_i, T)}   (2)
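Both indices reduce to a few lines of array arithmetic. The sketch below follows Equations (1) and (2) directly; the variable names are illustrative.

```python
# Sketch of the two indices defined in Equations (1) and (2), assuming
# `week_scores` and `final_scores` are equal-length 1-D arrays.
import numpy as np


def discrimination_index(week_scores: np.ndarray, final_scores: np.ndarray) -> float:
    """Pearson product-moment correlation between the week's scores and the
    continuous assessment scores (Equation (1))."""
    cov = np.cov(week_scores, final_scores)[0, 1]
    return cov / np.sqrt(week_scores.var(ddof=1) * final_scores.var(ddof=1))


def discriminative_efficiency(week_scores: np.ndarray, final_scores: np.ndarray) -> float:
    """Ratio of the covariance to the maximum covariance obtained when both
    vectors are sorted (Equation (2))."""
    cov = np.cov(week_scores, final_scores)[0, 1]
    cov_max = np.cov(np.sort(week_scores), np.sort(final_scores))[0, 1]
    return cov / cov_max
```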
To examine possible differences in the abovementioned statistics and indices associated with the questionnaires of the two academic years analyzed, Student’s t-tests for paired samples were performed. The averages of the weekly scores of both academic years were compared using a one-tailed t-test; the p-value was 0.006, indicating that the weekly scores were significantly higher in year 2020–2021 at the 95% confidence level. Next, the two series of discrimination indices were compared using a two-tailed t-test, and the resulting p-value was 0.21. Therefore, it cannot be concluded that the discrimination indices of the two academic years are significantly different at the 95% confidence level.
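A sketch of this comparison, using the weekly averages listed in Tables 2 and 3, is shown below. It assumes SciPy 1.6 or later is available and is not necessarily the exact procedure used by the authors.

```python
# Paired t-tests on the fifteen weekly averages of the two academic years
# (values taken from Tables 2 and 3).
import numpy as np
from scipy import stats

avg_2019 = np.array([3.26, 4.51, 5.35, 6.20, 5.98, 5.51, 5.74, 7.40,
                     6.32, 6.36, 6.90, 6.27, 8.79, 7.59, 7.34])
avg_2020 = np.array([6.95, 6.19, 6.39, 4.99, 5.78, 5.82, 6.78, 8.04,
                     7.28, 7.86, 7.78, 7.76, 8.06, 7.98, 8.57])

# One-tailed paired t-test (H1: the 2020-2021 weekly averages are higher).
t_avg, p_one_tailed = stats.ttest_rel(avg_2020, avg_2019, alternative="greater")

# A two-tailed paired t-test would be applied analogously to the two series
# of discrimination indices listed in Tables 2 and 3.
```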
It can be observed that the average weekly grades tend to increase as the course progresses, especially in academic year 2019–2020, as they were initially lower than in 2020–2021. The dispersion of weekly scores is narrower in year 2020–2021 (2.53 to 3.38), while in 2019–2020 it reaches 4.21 by week 12. The higher variance of the week 12 scores does not necessarily mean that students had particular comprehension difficulties, as the average score is in line with those of the previous and following weeks. According to the course syllabus, week 12 is devoted to the plastic calculation of hyperstatic frames using compatibility equations, the matrix method, and the principle of virtual work. Plastic calculation, the structural typologies, and the methods used had already been taught in previous weeks of the course, and the discriminative efficiency of this week (43.3%) is at an average value. The relatively high standard deviation is more likely due to other external factors, e.g., excessive workload in other concurrent courses. This dispersion in grades is not observed in week 12 of the following year; nevertheless, it is advisable to verify in subsequent course editions that no such high variance appears, since it could indicate that adequate student performance is compromised by other possible deficiencies.
Finally, the differences between the average scores of the asynchronous and synchronous tests for each week are included in Table 2 and Table 3. To compute the average values, non-completed questionnaires were scored as 0. Positive values indicate that, overall, students performed better in the asynchronous test (for which they had an unlimited number of attempts). This suggests that students consolidated the knowledge they acquired during the week, provided that the difficulty of both tests is similar. Conversely, negative values indicate that students were more successful in the synchronous test, and therefore would have had difficulty strengthening their knowledge. Given that no significant differences were found between the test scores of the two academic years in this respect and that, in general, the differences in year 2020–2021 are larger, it could be said that in year 2019–2020 students’ learning occurred to a greater degree through the synchronous tests, whereas in 2020–2021 students were able to better consolidate their knowledge thanks to the asynchronous tests.

5.4. Discussion

Although slightly higher overall performance and higher participation were observed during the latter academic year, higher dropout rates are a cause for concern. In the year of the pandemic, the distribution of the continuous assessment final scores shows a dispersion more typical of a bimodal distribution, clearly differentiating between students who took the course to the end and students who dropped out.
Several possible factors causing these differences have been identified in relation to the existing literature, which should be addressed to enhance blended and online learning and to reduce dropout. The introductory course preceding structural analysis, which the students would theoretically have taken according to the curriculum, was held in the spring semester of academic year 2019–2020, during which the pandemic broke out. The uncertainty associated with the adequate adaptation of teaching procedures and evaluation systems after the severe disruption caused in that semester by the pandemic outbreak might have significantly affected learning performance, accumulating learning loss, as reported by several authors [43].
The lack of equal access to telematics facilities, e.g., access to a permanent and stable internet connection, as well as a non-proficient use of online learning technologies, can be determining factors for dropping out of a course that is highly dependent on such technologies [44,45,46]. The difficulty in concentrating at home, given the restrictions on public places of study, might have been another possible cause of the decline in academic performance.
Traditional classroom interaction encourages a social environment with interpersonal communication. The lack of face-to-face interaction with the university community, even with extensive use of social media, could have diminished what would otherwise have been collaborative work on the resolution of assignments [47]. During the second half of the class, namely the interactive problem-based learning workshop, collaborative group work is not only allowed but is considered particularly beneficial for learning, as students are assigned rubrics with different values of the initial data thanks to the high degree of automation enabled by the MOODLE questionnaires; therefore, the numerical solutions vary. However, promoting group work is more difficult without a shared physical classroom, and this synergy is lost. Lack of engagement in the online class and fatigue have been described as important setbacks of synchronous online learning [48]. Other social, domestic, and personal circumstances related to the pandemic may, to some extent, also have negatively affected student performance [49].

6. Conclusions

This study compared student performance using online continuous assessment in a structural analysis course for undergraduate architecture students before and after the COVID-19 outbreak. The continuous assessment procedure was based on synchronous and asynchronous MOODLE questionnaires, implemented and tested before the pandemic, and followed the same scheme and criteria in both academic years.
The combined consideration of the graphs presented allowed for a detailed analysis of the evolution of learning. Participation rates slightly increased in the academic year 2020–2021, especially at the beginning of the course. Compared to year 2019–2020, more uniform participation values were maintained throughout year 2020–2021, the year with fully online learning. This is likely due to the students’ initial awareness of fully online continuous assessment. Moreover, slightly but significantly higher student performance was observed on average in the final scores of the online continuous assessment of year 2020–2021. However, slightly more students dropped out of the course delivered during the pandemic, which is a cause for concern. The possible causes of these differences have been discussed and do not seem to be attributable to the method of learning, but to social and personal issues related to the pandemic.
The difference between the average scores of the synchronous and asynchronous tests for each week showed that students’ learning was mainly produced through the synchronous tests in academic year 2019–2020, whereas students consolidated their knowledge during the week through the asynchronous tests in year 2020–2021.
Finally, the combination of online learning and continuous assessment, adopted owing to the face-to-face restrictions caused by the pandemic, has proved resilient and robust, indicating that blended learning is a worthwhile learning modality. Nonetheless, despite the numeric indicators that quantify overall student success, concerns may arise regarding the real impact of the COVID-19 outbreak, since it may not be possible to know in depth the dynamics of student learning and how the pandemic and its implications affected student comprehension. Therefore, a return to face-to-face learning is essential to ensure learning quality and avoid cumulative learning loss, since it is not likely that most courses were as successful during online teaching. In any case, it is desirable to monitor the evolution of student performance in the coming years in order to understand the medium-term effects of the pandemic on higher education.

Author Contributions

Conceptualization, C.D.S.-B. and J.I.H.G.; methodology, C.D.S.-B. and J.I.H.G.; software, C.D.S.-B. and J.I.H.G.; validation, C.D.S.-B. and J.I.H.G.; formal analysis, C.D.S.-B. and J.I.H.G.; investigation, C.D.S.-B., J.I.H.G. and L.D.S.B.; resources, J.I.H.G.; data curation, C.D.S.-B.; writing—original draft preparation, C.D.S.-B. and L.D.S.B.; writing—review and editing, C.D.S.-B., J.I.H.G. and L.D.S.B.; supervision, J.I.H.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Institutional Review Board approval was not required, since the data did not include any personally identifying information. However, the study was conducted in accordance with the Declaration of Helsinki.

Informed Consent Statement

Institutional informed consent was obtained from the Director of the Department of Structures and Building Physics of the Technical University of Madrid on behalf of all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions regarding the participating university departments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sánchez Ruiz, L.M.; Moll-López, S.; Moraño-Fernández, J.A.; Llobregat-Gómez, N. B-Learning and Technology: Enablers for University Education Resilience. An Experience Case under COVID-19 in Spain. Sustainability 2021, 13, 3532. [Google Scholar] [CrossRef]
  2. Kaffenberger, M. Modelling the Long-Run Learning Impact of the COVID-19 Learning Shock: Actions to (More than) Mitigate Loss. Int. J. Educ. Dev. 2021, 81, 102326. [Google Scholar] [CrossRef] [PubMed]
  3. Sanz-Pérez, E.S. Students’ Performance and Perceptions on Continuous Assessment. Redefining a Chemical Engineering Subject in the European Higher Education Area. Educ. Chem. Eng. 2019, 28, 13–24. [Google Scholar] [CrossRef]
  4. Fernández-Sánchez, G.; Millán, M.Á. Structural Analysis Education: Learning by Hands-on Projects and Calculating Structures. J. Prof. Issues Eng. Educ. Pract. 2013, 139, 244–247. [Google Scholar] [CrossRef]
  5. Hanson, J. Teaching Students How to Evaluate the Reasonableness of Structural Analysis Results. J. Civ. Eng. Educ. 2022, 148, 04021013. [Google Scholar] [CrossRef]
  6. de Justo, E.; Delgado, A. Change to Competence-Based Education in Structural Engineering. J. Prof. Issues Eng. Educ. Pract. 2015, 141, 05014005. [Google Scholar] [CrossRef]
  7. Quinn, K.A.; Albano, L.D. Problem-Based Learning in Structural Engineering Education. J. Prof. Issues Eng. Educ. Pract. 2008, 134, 329–334. [Google Scholar] [CrossRef]
  8. Solís, M.; Romero, A.; Galvín, P. Teaching Structural Analysis through Design, Building, and Testing. J. Prof. Issues Eng. Educ. Pract. 2012, 138, 246–253. [Google Scholar] [CrossRef]
  9. Nilson, L.B.; Goodson, L.A. Online Teaching at Its Best: Merging Instructional Design with Teaching and Learning Research, 2nd ed.; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2021; ISBN 978-1-119-76501-1. [Google Scholar]
  10. Ortiz, J.; Aznar, A.; Hernando, J.I.; Ortiz, A.; Cervera, J. Statistical Validation of E-Learning Assessment. Teach. Educ. Curric. Stud. 2016, 1, 20–27. [Google Scholar] [CrossRef]
  11. Engle, D.; Mankoff, C.; Carbrey, J. Coursera’s Introductory Human Physiology Course: Factors That Characterize Successful Completion of a MOOC. Int. Rev. Res. Open Distrib. Learn. 2020, 16, 46–68. [Google Scholar] [CrossRef]
  12. Holcomb, L.B.; King, F.B.; Brown, S.W. Student Traits and Attributes Contributing to Success in Online Courses: Evaluation of University Online Courses. J. Interact. Online Learn. 2004, 2, 1–17. [Google Scholar]
  13. Turnbull, D.; Chugh, R.; Luck, J. Transitioning to E-Learning during the COVID-19 Pandemic: How Have Higher Education Institutions Responded to the Challenge? Educ. Inf. Technol. 2021, 26, 6401–6419. [Google Scholar] [CrossRef] [PubMed]
  14. Zhou, J.; Zhang, Q. A Survey Study on U.S. College Students’ Learning Experience in COVID-19. Educ. Sci. 2021, 11, 248. [Google Scholar] [CrossRef]
  15. Plummer, L.; Kaygısız, B.B.; Kuehner, C.P.; Gore, S.; Mercuro, R.; Chatiwala, N.; Naidoo, K. Teaching Online during the COVID-19 Pandemic: A Phenomenological Study of Physical Therapist Faculty in Brazil, Cyprus and the United States. Educ. Sci. 2021, 11, 130. [Google Scholar] [CrossRef]
  16. Nogales-Delgado, S.; Román Suero, S.; Encinar Martín, J.M. COVID-19 Outbreak: Insights about Teaching Tasks in a Chemical Engineering Laboratory. Educ. Sci. 2020, 10, 226. [Google Scholar] [CrossRef]
  17. Luburić, N.; Slivka, J.; Sladić, G.; Milosavljević, G. The Challenges of Migrating an Active Learning Classroom Online in a Crisis. Comput. Appl. Eng. Educ. 2021, 29, 1617–1641. [Google Scholar] [CrossRef]
  18. Voghoei, S.; Hashemi Tonekaboni, N.; Yazdansepas, D.; Arabnia, H.R. University Online Courses: Correlation between Students’ Participation Rate and Academic Performance. In Proceedings of the 6th Annual Conference on Computational Science and Computational Intelligence (CSCI 2019), Las Vegas, NV, USA, 5–7 December 2019; pp. 772–777. [Google Scholar]
  19. Holmes, N. Engaging with Assessment: Increasing Student Engagement through Continuous Assessment. Act. Learn. High. Educ. 2018, 19, 23–34. [Google Scholar] [CrossRef]
  20. Elumalai, K.V.; Sankar, J.P.; Kalaichelvi, R.; John, J.A.; Menon, N.; Alqahtani, M.S.M.; Abumelha, M.A. Factors Affecting the Quality of E-Learning during the COVID-19 Pandemic from the Perspective of Higher Education Students. In COVID-19 and Education: Learning and Teaching in a Pandemic-Constrained Environment; Cheong, C., Coldwell-Neilson, J., MacCallum, K., Luo, T., Scime, A., Eds.; Informing Science Press: Santa Rosa, CA, USA, 2021; pp. 167–190. [Google Scholar]
  21. Alqahtani, A.Y.; Rajkhan, A.A. E-Learning Critical Success Factors during the COVID-19 Pandemic: A Comprehensive Analysis of e-Learning Managerial Perspectives. Educ. Sci. 2020, 10, 216. [Google Scholar] [CrossRef]
  22. Revilla-Cuesta, V.; Skaf, M.; Varona, J.M.; Ortega-López, V. The Outbreak of the COVID-19 Pandemic and Its Social Impact on Education: Were Engineering Teachers Ready to Teach Online? Int. J. Environ. Res. Public Health 2021, 18, 2127. [Google Scholar] [CrossRef]
  23. Chaves, P.R.; Assumpção, R.M.; Ferreira, L.C.; Cardieri, P.; Branquinho, O.C.; Fruett, F. A Remote Emulation Environment for the Teaching of Low-Power Wireless Communications. Comput. Appl. Eng. Educ. 2021, 29, 1453–1464. [Google Scholar] [CrossRef]
  24. Sweidan, S.Z.; Abu Laban, S.S.; Alnaimat, N.A.; Darabkh, K.A. SIAAA-C: A Student Interactive Assistant Android Application with Chatbot during COVID-19 Pandemic. Comput. Appl. Eng. Educ. 2021, 29, 1718–1742. [Google Scholar] [CrossRef]
  25. Larasati, P.F.; Santoso, H.B. Interaction Design Evaluation and Improvements of Cozora—A Synchronous and Asynchronous Online Learning Application. In Proceedings of the 2017 7th World Engineering Education Forum (WEEF 2017), Kuala Lumpur, Malaysia, 13–16 November 2017; pp. 536–541. [Google Scholar]
  26. Romero, C.; Espejo, P.G.; Zafra, A.; Romero, J.R.; Ventura, S. Web Usage Mining for Predicting Final Marks of Students That Use Moodle Courses. Comput. Appl. Eng. Educ. 2013, 21, 135–146. [Google Scholar] [CrossRef]
  27. Lapevska, D.; Velinov, A.; Zdravev, Z. Analysis of Moodle Activities before and after the COVID-19 Pandemic—Case Study at Goce Delchev University. Balk. J. Appl. Math. Inform. 2021, 3, 51–58. [Google Scholar] [CrossRef]
  28. Coll, C.; Rochera, M.J.; Mayordomo, R.M.; Naranjo, M. Continuous Assessment and Support for Learning: An Experience in Educational Innovation with ICT Support in Higher Education. Electron. J. Res. Educ. Psychol. 2007, 5, 783–804. [Google Scholar]
  29. Hernández, R. Does Continuous Assessment in Higher Education Support Student Learning? High. Educ. 2012, 64, 489–502. [Google Scholar] [CrossRef]
  30. Bjælde, O.E.; Jørgensen, T.H.; Lindberg, A.B. Continuous Assessment in Higher Education in Denmark: Early Experiences from Two Science Courses. J. Teach. Learn. High. Educ. 2017, 12, 1–19. [Google Scholar]
  31. Carless, D. Learning-Oriented Assessment: Conceptual Bases and Practical Implications. Innov. Educ. Teach. Int. 2007, 44, 57–66. [Google Scholar] [CrossRef]
  32. Trotter, E. Student Perceptions of Continuous Summative Assessment. Assess. Eval. High. Educ. 2006, 31, 505–521. [Google Scholar] [CrossRef]
  33. Martín-Carrasco, F.J.; Granados, A.; Santillán, D.; Mediero, L. Continuous Assessment in Civil Engineering Education: Yes, but with Some Conditions. In Proceedings of the 6th International Conference on Computer Supported Education (CSEDU), Barcelona, Spain, 1–3 April 2014; pp. 103–109. [Google Scholar]
  34. Aznar, A.; Hernando, J.I.; Cervera, J.; Ortiz, J. Educational Self-Correcting Application towards Continuous Assessment for e-Learning of Analysis of Building Structures. Educ. Futur. Rev. Investig. Exp. Educ. 2012, 2, 2–15. [Google Scholar]
  35. Aznar, A.; Hernando, J.I.; Antuña, J.; Ortiz, J. How to Learn Having Fun: Drag and Drop Questions for Building Structures. In Proceedings of the 11th annual International Conference on Education and New Learning Technologies (EDULEARN19), IATED, Palma, Spain, 1–3 July 2019; Volume 1, pp. 5651–5656. [Google Scholar] [CrossRef]
  36. Aznar, A.; Hernando, J.I. Novel Educational Assessment for Building Structures: Automatic Evaluation of on-Line Graphics. Procedia—Soc. Behav. Sci. 2015, 176, 602–609. [Google Scholar] [CrossRef]
  37. Aznar, A.; Hernando, J.I.; Ortiz, J. Refreshing ‘Graph of Internal Forces’. In Proceedings of the 9th annual International Conference on Education and New Learning Technologies (EDULEARN17), Barcelona, Spain, 3–5 July 2017; Volume 1, pp. 1833–1840. [Google Scholar] [CrossRef]
  38. Modular Object-Oriented Dynamic Learning Environment (MOODLE) Calculated Question Type—MoodleDocs. Available online: https://docs.moodle.org/311/en/Calculated_question_type (accessed on 24 June 2021).
  39. Gibbs, G.; Lucas, L. Coursework Assessment, Class Size and Student Performance: 1984–1994. J. Furth. High. Educ. 1997, 21, 183–192. [Google Scholar] [CrossRef]
  40. Barbosa, T.J.G.; Barbosa, M.J. Zoom: An Innovative Solution for the Live-Online Virtual Classroom. Hisp. Educ. Technol. Serv. Online J. 2019, 9. [Google Scholar]
  41. Mouchantaf, M. The COVID-19 Pandemic: Challenges Faced and Lessons Learned Regarding Distance Learning in Lebanese Higher Education Institutions. Theory Pract. Lang. Stud. 2020, 10, 1259–1266. [Google Scholar] [CrossRef]
  42. Modular Object-Oriented Dynamic Learning Environment (MOODLE) Quiz Statistics Calculations—MoodleDocs. Available online: https://docs.moodle.org/dev/Quiz_statistics_calculations (accessed on 13 July 2021).
  43. Donnelly, R.; Patrinos, H.A. Learning Loss during COVID-19: An Early Systematic Review. Prospects 2021, 1–9. [Google Scholar] [CrossRef] [PubMed]
  44. Asgari, S.; Trajkovic, J.; Rahmani, M.; Zhang, W.; Lo, R.C.; Sciortino, A. An Observational Study of Engineering Online Education during the COVID-19 Pandemic. PLoS ONE 2021, 16, e0250041. [Google Scholar] [CrossRef] [PubMed]
  45. Chan, C.C.B.; Wilson, O. Using Chakowa’s Digitally Enhanced Learning Model to Adapt Face-to-Face EAP Materials for Online Teaching and Learning. Int. J. TESOL Stud. 2020, 2, 83–97. [Google Scholar] [CrossRef]
  46. Sales, D.; Cuevas-Cerveró, A.; Gómez Hernández, J.A. Perspectivas Sobre La Competencia Informacional y Digital de Estudiantes y Docentes de Ciencias Sociales Antes y Durante El Confinamiento Por La COVID-19. Prof. Inf. 2020, 29, 1–22. [Google Scholar]
  47. Alamri, J.M. The Perception of Interpersonal Relations between Instructors and Students as Experienced within Classroom and Online Communication: A Mixed Method Case Study of Undergraduate Women in a Saudi Institution. Ph.D. Thesis, University of Nottingham, Nottingham, UK, 2016. [Google Scholar]
  48. Peper, E.; Wilson, V.; Martin, M.; Rosegard, E.; Harvey, R. Avoid Zoom Fatigue, Be Present and Learn. NeuroRegulation 2021, 8, 47–56. [Google Scholar] [CrossRef]
  49. Gonzalez, T.; De la Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 Confinement on Students’ Performance in Higher Education. PLoS ONE 2020, 15, e0239490. [Google Scholar] [CrossRef]
Figure 1. Timeline of the development of one course week.
Figure 2. Example of rubric and graph of internal forces question in the MOODLE questionnaire with interface for drawing graphs of internal forces [38].
Figure 3. Student participation rates over the two academic years analyzed.
Figure 4. Boxplot of weekly scores.
Figure 5. Cumulative percentage of students and weekly average scores equal to or less than the number on the line: (a) Academic year 2019–2020; (b) Academic year 2020–2021.
Table 1. Main statistics of the results.

Statistic                                  2019–2020   2020–2021
Number of students                         210         213
Number of questionnaire scores             5823        6081
Average participation rate                 61.7%       66.9%
Average of all scores (over 10 points)     6.26        7.04
Table 2. Weekly assignment statistics and indices of academic year 2019–2020.

Week   Average score   Std. deviation   Discrimination index   Discriminative efficiency   Difference asynchronous − synchronous
 1     3.26            2.95             0.57                   0.41                        −0.17
 2     4.51            2.88             0.54                   0.40                        2.48
 3     5.35            3.48             0.64                   0.54                        1.07
 4     6.20            3.07             0.55                   0.44                        −2.23
 5     5.98            3.54             0.50                   0.43                        −0.48
 6     5.51            3.80             0.53                   0.51                        −2.43
 7     5.74            3.46             0.77                   0.66                        0.00
 8     7.40            2.93             0.55                   0.43                        0.64
 9     6.32            2.99             0.55                   0.45                        0.27
10     6.36            3.82             0.49                   0.47                        0.00
11     6.90            3.80             0.19                   0.18                        1.46
12     6.27            4.21             0.39                   0.43                        0.29
13     8.79            2.39             0.57                   0.43                        −0.60
14     7.59            2.72             0.65                   0.36                        −1.43
15     7.34            2.96             0.35                   0.24                        −0.53
Table 3. Weekly assignment statistics and indices of academic year 2020–2021.

Week   Average score   Std. deviation   Discrimination index   Discriminative efficiency   Difference asynchronous − synchronous
 1     6.95            2.61             0.54                   0.37                        −0.53
 2     6.19            3.22             0.35                   0.30                        3.20
 3     6.39            3.37             0.53                   0.49                        2.38
 4     4.99            3.37             0.57                   0.59                        1.80
 5     5.78            3.33             0.51                   0.49                        4.50
 6     5.82            2.89             0.67                   0.54                        3.78
 7     6.78            2.65             0.48                   0.34                        2.52
 8     8.04            2.93             0.76                   0.66                        1.81
 9     7.28            2.79             0.74                   0.55                        0.96
10     7.86            3.38             0.71                   0.70                        0.28
11     7.78            3.72             0.51                   0.58                        0.93
12     7.76            2.76             0.51                   0.43                        2.35
13     8.06            2.53             0.71                   0.55                        1.62
14     7.98            3.01             0.58                   0.51                        −0.32
15     8.57            2.83             0.54                   0.48                        0.00

