
Was “Returning to Normal” More Effective? Comparing Online and Offline Learning in English as a Foreign Language

Keith Topping, Natalia Erokhova and Nataliia Sokolova
1 Education and Society, University of Dundee, Dundee DD1 4HN, UK
2 Institute of Foreign Languages, RUDN University, 117198 Moscow, Russia
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(7), 731; https://doi.org/10.3390/educsci14070731
Submission received: 11 April 2024 / Revised: 24 June 2024 / Accepted: 29 June 2024 / Published: 4 July 2024

Abstract

The aim of this research was to investigate whether a post-pandemic return to more face-to-face teaching was any more effective than during-pandemic online teaching, using examination results as an indicator. It compares the two middle years of a four-year undergraduate degree in English as a Foreign Language over two consecutive years. Year 1 saw 73% of the time spent on online teaching and learning, while Year 2 saw 25%. The relative effects of more versus less online learning on the examination results were compared. The participants were 105 Methodologists (future teachers) and 272 Translators (N = 377), predominantly female (83%). Entry scores were checked to ensure the similarity of the cohorts. Examinations were taken twice a year. On one course, more online yielded better performance on both occasions. On three others, more online yielded better performance in the Winter but equivalent performance in the Summer. Of 24 Effect Sizes (ESs), only 3 were in favour of more offline. The average ES was 0.10 in favour of more online, 0.21 in Winter and 0.05 in Summer. Thus, more online learning was modestly more effective than less online learning. This has implications for course designers and university managers in terms of the degree of return to face-to-face learning.

1. Introduction

The aim of this research was to investigate whether a post-pandemic return to predominantly face-to-face teaching was any more effective than during-pandemic predominantly online teaching. Face-to-face teaching and learning imply that the teacher and students are physically present in the same room. It usually occurs at a set time and place and cannot be accessed on demand [1]. Online learning is a style of education in which students learn part or all of a programme of work via digital media only, so that they can completely control the time, pace and place of their learning. In other words, all learning happens outside of the educational institution [2]. However, it should be noted that we are not talking here about different intensities of blended learning [3], which involves students learning via digital media as well as through face-to-face teaching, so that they can control the time, pace and place of their learning for part of each day. Often, different kinds of activities are undertaken in each setting. In this research, when teaching and learning was online, it was all online, and when it was face to face (offline), it was all offline.
In this paper, the effectiveness of online and face-to-face learning is compared by scrutinising examination results, arguably a broad and simplistic outcome measure (but one of great importance to students), yet one surprisingly rarely used in research (see Section 2.1 below). The comparison is set in the context of English as a Foreign Language, a subject widely studied in universities around the world. The research compared the two middle years of a four-year degree programme in foreign languages over two calendar years. In one year, there was much online teaching and learning; in the ensuing year, much less. This enabled the comparison of the relative effects of more versus less online teaching and learning.
For some years, the research literature has been positive about the effects of online learning. Usually, however, these studies were of projects that set out to compare all online learning with all face-to-face learning. They were reasonably well planned and organised, featuring training for the online implementers. The present study is different. It compares substantially online with partially online learning: a year in which teachers were suddenly thrust into instructing largely online with little preparation, and a subsequent year featuring a substantial return to face-to-face teaching with a more modest component of online learning. This is what many universities are now trying to do: a partial return to face-to-face teaching while attempting to retain some of the benefits of online teaching. The research gap here is that the optimum amount of online learning has rarely been studied, with most studies only comparing online with offline learning, and few studies have made this kind of comparison using examination results, despite the widespread use of examinations as an outcome measure.

2. Literature Review

Only exemplar key studies on online learning and foreign language learning will be mentioned. The research literature on online learning dates back over two decades. As early as 2009, a meta-analysis by the U.S. Department of Education [4] screened over 1000 studies from 1996 to 2008 and selected 51, mostly of university students. Students in the online learning condition performed better than those receiving face-to-face instruction. However, many studies were actually comparing face-to-face with blended learning rather than with purely online learning. More recently, a systematic comparative analysis of online and blended learning [5] concluded that blended learning was more effective than purely online learning. The main advantage of digital methods might be the possibility of enhanced task flexibility and learner autonomy, encouraging greater self-regulation.
Regarding foreign language learning, a “bibliometric synthesis” of 60 blended language-learning studies from 2000 to 2019 was conducted [6]; blended learning was used mainly for practice or exercises. A meta-analysis of 34 studies of blended learning in second language vocabulary acquisition [7] found a moderate Effect Size of 0.64, with larger effects for receptive than for productive vocabulary. The authors found that tertiary students benefitted more than school students, and mobile-assisted learning was more effective than computer-assisted learning. However, both of these studies concerned blended learning, rather than more versus less online learning.

2.1. Previous Studies on Online/Offline Outcomes

Long before the onset of the pandemic, there was research comparing the relative effectiveness of online and offline learning in higher education. Initially, this was conducted in terms of student perceptions. Students from 29 Austrian universities (n = 2196) completed a questionnaire on preferences for online or face-to-face learning [8]. Students appreciated online learning for supporting self-regulated learning and distributing information. They preferred face-to-face learning for communications in which interpersonal relations had to be established. Similarly, student perceptions of face-to-face learning were higher than those of online learning in terms of social presence, social interaction and satisfaction [9].
Regarding studies using more objective measures, students in a digital learning environment achieved better performance and higher levels of satisfaction than both those in a traditional classroom and those in a less interactive digital learning environment [10]. Three online courses were compared with the same three courses face to face, all with the same instructor (N = 968) [11]. The students’ final grades were higher in the online courses, with no difference in the completion rate. The students in online courses reported a better understanding of the course structure, better communication with course staff and higher engagement and satisfaction. A comparison of face-to-face, online, and blended teaching modes in an undergraduate child development course was made [12] to examine whether there were differences in student learning outcomes among the three modes, but there were no significant differences. A study to compare three teaching modes (online, face to face, blended) in delivering a mathematics course was conducted [13], reporting that there were no significant differences among the three modes.
Over 5000 courses taught by over 100 faculty members were examined [14]. Compared to a face-to-face format, in online courses, able students performed even better, while struggling students performed worse. In the same vein, a meta-analysis of 30 papers [15] compared online learning with traditional face-to-face instruction and found online learning at least as effective. However, all these studies occurred before the COVID-19 pandemic and the associated surge into remote learning. Overall, then, some studies found online better, some found online and offline had different advantages and disadvantages, and some found online equal to offline, but none of these studies found offline superior, whether the measure was perceptions or something more objective.

2.2. The International Effects of the COVID-19 Pandemic

Regarding the impact of the pandemic, the response was characterised as “emergency remote teaching” [16]. Its scale was considerable: it affected nearly 1.6 billion learners in more than 200 countries, more than 94% of the world’s student population [17]. The re-opening of schools was a further challenge. Research highlighted deficiencies such as pedagogical weaknesses in current online teaching, the limited exposure of teachers to online teaching, and, for students, non-conducive home learning environments and issues of equity.
Broadly identified challenges with digital learning were accessibility, affordability, flexibility and learning pedagogy [17]. Many countries had substantial issues with reliable Internet connections and access to digital devices for students from economically deprived families. In these countries, the Western reports of high online learning effectiveness might prove to be nonsense. Where both parents were working, the supervision of students learning online at home could be non-existent. While intrinsically motivated learners might be relatively unaffected, students with learning difficulties faced problems. Likewise, other researchers noted that the available resources, staff readiness, confidence, student accessibility and motivation played important parts in online learning [18].
A number of studies investigated student and teacher perceptions during the pandemic.

2.3. International Student and Teacher Perspectives

The experiences of Pakistani university students showed that the vast majority were unable to access the Internet owing to technical as well as financial issues [19]. The lack of face-to-face interaction with the instructor and the absence of traditional classroom socialisation were also highlighted. Somewhat similarly, data were collected in Romania from 762 students at two large universities using an online questionnaire (which seems somewhat paradoxical given the topic under investigation) [20]. The students felt that institutions were not prepared for online learning, and so the advantages of online learning were diminished. Student perceptions of online/offline learning at one university in Greece [21] showed the university had moved swiftly to respond to the lockdown and shift to distance learning. Of 103 students, the majority (78%) initially felt negative emotions, including stress, anxiety and sadness. However, this settled, and later, only 48% felt concern. The majority (80%) were then happy about online education. The students commented that online learning was less distracting than the classroom. Other benefits were not having to commute (56%) and, consequently, not being late. On the other hand, the majority (71%) also mentioned the lack of personal contact between the teacher and students and also among students. Data from 280 students at various universities in Malaysia were analysed [22] to check for any differences between males and females regarding accessibility to a digital learning portal; little difference was found. Learner perceptions at one university in India were investigated [23], comparing the pre- and post-COVID-19 periods. The students had a higher perception of pre-pandemic learning than of online learning.
Regarding the perceptions of teachers, 87 university teachers felt the main challenges were computer literacy levels, the university’s digital environment and support, academic staff readiness, and students’ readiness for online learning [24]. In China, responses were summarised from online instructors who were not well prepared for, or accepting of, online teaching and learning [25]. It was not easy to change students’ learning behaviours quickly, and some instructors possessed low skills in computer and Internet tasks and preferred traditional methods. In India, the perceptions of both students and teachers were reported [26], involving 78 teachers and 521 postgraduate students at one university. During the pandemic, students returned to their hometowns in remote rural situations without Internet connectivity, broadband services or an uninterrupted power supply. Conducting online practical classes proved difficult, because it required systematic demonstration in the presence of the students. Teachers’ and students’ preferences among three modes (online, blended and face-to-face education) were explored by studying medical students and teachers in Bahrain [27]. The conclusion was that both groups perceived blended learning and face-to-face learning to be acceptable in medical education, while online education was only acceptable in theoretical teaching and in some clinically oriented teaching.
Overall, however, much of the literature on dealing with the pandemic emanates from the West, where computer provision and home access to devices and the Internet is relatively high. As soon as less developed countries are considered, it is clear that online learning can face massive problems.

2.4. Student Achievement Outcomes in Examinations

While many studies have explored student and teacher perceptions, few previous studies have investigated student achievement outcomes during and after the pandemic in terms of cognitive outcomes or examination results. Two studies from Spain [28,29] are of interest, but neither examined the transition from wholly online to partially online learning as universities emerged from the pandemic, as the present study does.
The first [28] analysed the effects of COVID-19 confinement on learning performance. Investigating 458 students from three different subjects at Universidad Autónoma de Madrid, it compared the academic years 2017/2018 and 2018/2019 to 2019/2020, the year the pandemic struck. There was a significant positive effect of COVID-19 confinement on student performance. The differences were statistically significant both in subjects that increased the number of assessment activities during the pandemic and in subjects that did not change the student workload.
The other study came from the Universidad Politécnica de Madrid [29]. In this study, the impacts of unplanned change, class size, synchronous/asynchronous delivery and the use of digital methods on students’ academic performance were investigated. The research used quantitative data from academic records across all 43 courses of a bachelor’s degree programme in Telecommunication Engineering and qualitative data from a questionnaire delivered to all (N = 43) course coordinators. This research also compared the academic results of students during the COVID-19 pandemic with those of the two previous years. Both synchronous (60%) and asynchronous (40%) learning were used. The percentage of students who passed courses showed a significant increase under pandemic teaching when compared to the previous two years. The differences were sustained across non-elective courses, with no differences across courses with different class sizes or delivery modes.
Turning to the few studies examining post-pandemic learning using objective measures, a comparative study of online and face-to-face teaching for cultivating Chinese students’ ability for creative idea generation [30] found that the two modes had their own advantages and disadvantages, leading the authors to suggest that blended learning might be better. Other researchers [31] studied participants undertaking professional development as foreign language teachers and deployed online, offline and blended learning in three classes known to be of the same ability. However, they used purpose-built tests rather than regular examinations to measure the outcomes. Additionally, their online learning occurred via a MOOC rather than being dedicated to these students, and their blended learning condition was merely the first half of the class sessions in the MOOC and the second half in face-to-face mode. They concluded that the face-to-face condition yielded the best results. This was somewhat in contradiction to the trend in the rest of the literature but might reflect the specific conditions of the study or a negative novelty effect of the interventions.
The present study focused on the transition out of the pandemic rather than performance in or out of the pandemic and was located in a different country, but it goes beyond this. It thus addresses gaps in the rather scant research literature on online/offline learning and examination outcomes. We now turn to describing the methods for the research.

3. Methods

This research compared the two middle years of a four-year undergraduate degree programme in English as a Foreign Language. Year 1 saw 73% of the time spent on online teaching and learning, while Year 2 saw 25%. The relative effects of more versus less online learning on the examination results were compared. The middle two years were selected, as first-year students would still be settling in and fourth-year students would be too preoccupied with their final examinations. English as a Foreign Language was mandatory for all students in all years. The age of the students was homogeneous: 17 to 19 years old. All subjects gave their informed consent for inclusion before they participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee of RUDN University (approval no. RUDN 245916).

3.1. Research Questions

  • As students moved from more online to less online learning, did their examination performance increase or decrease?
  • Was there a difference between Winter (mid-academic-year) examination scores and Summer (end-academic-year) examination scores?

3.2. Sample

This particular Institute of Foreign Languages was a convenience sample, but all students within the two middle years participated. There were two types of courses: for Methodologists and for Translators. Translators are self-explanatory, but Methodologists are future English as a Foreign Language teachers. In both courses, students were placed in classes on the basis of their ability.
Thus, for the first year of the study, 45 Methodologists were in 4 classes, and 129 Translators were in 11 classes (total: 174). For the second year of the study, there were 60 Methodologists in 5 classes and 143 Translators in 13 classes (total: 203) (see Table 1 and Supplementary Materials).
The students were predominantly female. In the first year, males comprised 29 of the 174 students (17%); in the second year, 47 of 203 (23%) (see Table 1). Thus, the second year saw a modest increase in the total number of students and also in male Translators.

3.3. Differences in Teaching and Learning Methods

In the first year of the study, owing to the pandemic, students engaged in face-to-face learning for one-quarter of the time and online learning for almost three-quarters of the time. The abrupt transition to the online format in the first two months was accompanied by an overload of the university’s Learning Management System. The teachers were faced with searching for free online platforms such as Skype, Zoom and even gaming platforms. Learning in each class was built differently depending on the digital competence of the teacher and the motivation of the students. However, the situation was constantly monitored (weekly reporting was requested from teachers and students). After two months, the university switched to Microsoft Teams. The classes then showed far greater consistency in their methods of teaching and learning, and the situation stabilised. Teams became a familiar and accessible platform for meetings (including totally online classes for foreign students who could not enter the country due to COVID-19 restrictions).
In the second year of the study, all pandemic restrictions ended from February, so the students engaged in online learning for one-quarter of the time and face-to-face learning for three-quarters of the time. Thus, the online/offline ratio for the first year was 8:3 (73%), while the online/offline ratio for the second year was 3:9 (25%). This yielded an interesting opportunity to compare the effectiveness of more online versus less online learning.
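As a check on these figures, the percentages follow directly from the stated ratios:

\[
\frac{8}{8+3} \approx 0.73 = 73\%, \qquad \frac{3}{3+9} = 0.25 = 25\%
\]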

3.4. Assessment

This sub-section is somewhat complicated. The State Unified Examination (SUE) score prior to university entry enabled a check on the equivalence of the two admission cohorts before they entered the university. The SUE assesses each student on listening, reading, writing, use of English and speaking. The Total Score was used, with a maximum of 100 (and see Supplementary Materials).
For first-year admissions, out of a total cohort of 174, 27 students (15.5%) had no SUE data (because they were foreign students), so the SUE comparisons were made on 147 students (84.5%). For second-year admissions, out of a total cohort of 203, 42 (20.7%) lacked SUE data, so the comparisons were made on 161 students (79.3%) (see Table 2).
In both study years, internal examinations were taken at the end of each of two semesters, in the Summer and in the Winter. The Summer and Winter assessments for all students in both years of admission were compared between the consecutive first and second years, Summer with Summer and Winter with Winter (as the methods of calculating scores differed from Winter to Summer). In both years, all examinations were undertaken on a face-to-face basis. No student repeated a course.
While the Methodologists and Translators shared one course (First Foreign Language Applied), the second course differed between them: Translators took Precis Writing and Annotation, while Methodologists took First Foreign Language Translation Workshop. Consequently, these were analysed separately. While the first cohort had a third course assessed (Foreign Language in the Framework of European Competencies) in both their second and third years, the second cohort only had this assessment in their second year, and thus, online and offline years could not be compared for this cohort.
For the Winter assessment, the maximum sub-test score (the Current Score) and the maximum Final Score were weighted in the ratio 80:20. In the Summer, however, the Current Score and Final Score were weighted in the ratio 50:50. In both cases, the Current Score and Final Score were added together to give a Total Score out of a maximum of 100.
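To make the two weighting schemes concrete, the following is a minimal sketch (ours, not the university's scoring system; the function name and the assumption that marks arrive as proportions of their maxima are illustrative) showing how the same performance combines differently in Winter and Summer:

```python
def total_score(current_pct: float, final_pct: float, season: str) -> float:
    """Combine Current and Final marks into a Total out of 100.

    Winter weights the maximum Current and Final scores 80:20;
    Summer weights them 50:50, as described in Section 3.4.
    Inputs are assumed to be proportions (0-1) of each maximum.
    """
    max_current, max_final = {"winter": (80, 20), "summer": (50, 50)}[season]
    return max_current * current_pct + max_final * final_pct

# The same performance yields different Totals under the two weightings:
print(total_score(0.83, 0.89, "winter"))  # 66.4 + 17.8 = 84.2
print(total_score(0.83, 0.89, "summer"))  # 41.5 + 44.5 = 86.0
```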
There were no reliability or validity data available on either Winter or Summer assessments (as is common in many universities). In general, entry examination scores are good predictors of eventual examination grades. However, another study [32] noted that essay grades given were unreliable and over-dependent on language and organisational components.

3.5. Analysis

When comparing SUE scores, both a one-way ANOVA and an independent t-test were run; they produced identical results. Subsequently, independent and paired (dependent) t-tests were deemed appropriate for comparisons between the relatively online and offline years. In every case, these were coupled with an Effect Size (ES = Hedges g). At the end of the Results section, the ESs are compared across conditions to support our conclusions.
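For readers wishing to replicate the approach, the sketch below pairs a dependent t-test with an Effect Size, using synthetic data; the pooled-standard-deviation form of Hedges g with the small-sample correction is one common formulation, as the paper does not specify the exact computation used:

```python
import numpy as np
from scipy import stats

def hedges_g(x: np.ndarray, y: np.ndarray) -> float:
    """Hedges' g: Cohen's d on the pooled standard deviation,
    multiplied by the small-sample correction J = 1 - 3/(4*df - 1)."""
    nx, ny = len(x), len(y)
    df = nx + ny - 2
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1)
                         + (ny - 1) * np.var(y, ddof=1)) / df)
    d = (np.mean(x) - np.mean(y)) / pooled_sd
    return (1 - 3 / (4 * df - 1)) * d

# Synthetic stand-ins for one cohort's Winter Total Scores
rng = np.random.default_rng(42)
online = rng.normal(81, 11, 376)   # scores in the more-online year
offline = rng.normal(79, 12, 376)  # the same students, less-online year

t, p = stats.ttest_rel(online, offline)  # paired (dependent) t-test
print(f"t = {t:.2f}, p = {p:.3f}, ES = {hedges_g(online, offline):.2f}")
```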

4. Results

4.1. State Unified Examination (SUE)

First, the SUE was compared across the year of admission to see if the cohorts were equivalent. Table 2 shows that there were more missing cases in the second year.
In the first year of the study, the mean was 74.97 and the standard deviation (sd) 15.92, while the second-year mean was 73.60 and the sd 13.71. In other words, there was a slight decline in Year 2 but also less variance. Comparing the cohorts with the independent samples t-test, there was no significant difference (p = 0.416). Likewise, the ANOVA showed no significant difference between the cohorts (F = 0.66, df = 306, p = 0.416), with exactly the same probability. Thus, the cohorts were not significantly different on entry.
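The identical probabilities are no coincidence: with only two groups, a one-way ANOVA is mathematically equivalent to the independent t-test (F = t²). A small illustration with synthetic scores matching the reported means, standard deviations and valid Ns:

```python
import numpy as np
from scipy import stats

# Synthetic SUE scores using the reported means/sds and valid Ns
rng = np.random.default_rng(7)
cohort_1 = rng.normal(74.97, 15.92, 147)  # first year of study
cohort_2 = rng.normal(73.60, 13.71, 161)  # second year of study

t, p_t = stats.ttest_ind(cohort_1, cohort_2)  # independent samples t-test
f, p_f = stats.f_oneway(cohort_1, cohort_2)   # one-way ANOVA
print(f"F = {f:.2f}, t^2 = {t**2:.2f}")       # F equals t squared
print(f"p (t-test) = {p_t:.4f}, p (ANOVA) = {p_f:.4f}")  # identical
```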

4.2. First Foreign Language Applied

For the “First Foreign Language Applied” assessment, full data were available for both Translators and Methodologists. Summer in one year was compared to Summer in the next year, and Winter in one year was compared to Winter in the next year, for all three scores: Current Score, Final Score and Total Score (Table 3). For these analyses, a paired samples (dependent) t-test was used.
Comparing Winter–Winter Current Scores, the second (offline) Winter had lower scores, and this difference was highly statistically significant (t = 5.54, df = 358, p < 0.01, ES = 0.29). Comparing Winter–Winter Final Scores, the second (offline) was again lower, but not significantly so (t = 0.539, df = 373, p = 0.59, ES = 0.03). Comparing Winter–Winter Total Scores, the offline score was again lower, and the difference was again statistically significant (t = 3.10, df = 375, p < 0.01, ES = 0.17).
Thus, for two out of three comparisons, online learning appeared more effective than offline learning. However, would these results also hold true for the Summer–Summer comparison (Table 4)?
For the Summer–Summer First Foreign Language Current Score, the picture was reversed, with the offline (second-year) scores being higher than the online (first-year) scores: this difference was statistically significant (t = 2.95, df = 357, p < 0.01, ES = 0.16). For the Final Score comparison, however, the online students again had the advantage, and this difference was also statistically significant (t = 3.61, df = 366, p < 0.01, ES = 0.19). For the Total Score comparison, the two occasions were almost exactly equal (t = 0.56, df = 369, p = 0.58, ES = 0.03). Thus, for the Summer–Summer comparisons, one had online higher, one had offline higher, and overall, there was no difference.

4.3. Precis Writing and Annotation in the First Foreign Language for Translators

Regarding Precis Writing and Annotation (PWA) for Translators (Table 5), the Winter–Winter Current Scores were much higher for online than for offline, and this difference was highly statistically significant (t = 7.83, df = 260, p < 0.01, ES = 0.48).
For the Winter Final Score for Translators on PWA, offline did slightly better than online, although this difference did not reach statistical significance (t = 1.42, df = 269, p = 0.16, ES = 0.21). For the Winter Total Score on PWA for Translators, online was again higher, and this difference did reach statistical significance (t = 3.90, df = 270, p < 0.01, ES = 0.24).
Regarding the Summer scores (Table 6), for Current Scores, the online year was again higher than the offline year, and this difference just reached statistical significance (t = 2.00, df = 252, p = 0.046, ES = 0.15). For Summer Final Scores, there was no statistically significant difference between the online and offline years (t = 0.32, df = 265, p = 0.75, ES = 0.03). For Summer Total Scores, the online year was again higher than the offline year, and this difference reached statistical significance (t = 2.29, df = 268, p = 0.02, ES = 0.16).

4.4. First Foreign Language Translation Workshop for Methodologists

Regarding the Methodologists (Table 7), who were assessed on the First Foreign Language Translation Workshop, with the Current Score for Winter–Winter, there was no statistically significant difference between years (t = 0.43, df = 96, p = 0.67, ES = 0.043). For Final Scores, offline was much lower than online, and this was statistically significant (t = 2.40, df = 104, p = 0.02, ES = 0.23). However, for Total Scores, the years were almost exactly equal with no statistically significant difference (t = 0.38, df = 104, p = 0.70, ES = 0.04).
Regarding Summer–Summer comparisons (Table 8), for Current Scores, the online year was substantially and significantly higher than the offline year (t = 4.25, df = 97, p < 0.01, ES = 0.51). However, for Final Scores, offline did significantly better than online (t = 2.60, df = 101, p = 0.01, ES = 0.47). For Total Scores, there was no significant difference between online and offline (t = 0.89, df = 101, p = 0.37, ES = 0.08).

4.5. Foreign Language in the Framework of European Competencies

For Foreign Language in the Framework of European Competencies, complete data for both study years were only available for the first cohort. For this cohort, the Current, Final and Total Scores were explored, first for the Winter and then for the Summer (Table 9).
For the Winter Current Score, the online year did better than the offline year (t = 5.18, df = 127, p < 0.01, ES = 0.59). For the Final Score, the online year did better than the offline year but this did not reach statistical significance (t = 0.64, df = 172, p = 0.53, ES = 0.09). For the Total Score, the online year did better than the offline year, and this reached statistical significance (t = 2.09, df = 173, p = 0.04, ES = 0.15).
Regarding Summer scores (Table 10), for Current Scores, the online and offline scores were essentially the same (t = 0.14, df = 169, p = 0.89, ES = 0.01). For Final Scores, the difference was also not statistically significant (t = 0.87, df = 170, p = 0.39, ES = 0.17). For Total Scores, although the offline year was slightly higher than the online year, the difference did not reach statistical significance (t = 1.55, df = 172, p = 0.12, ES = 0.11).

4.6. Average Effect Sizes

Of 24 ESs, only 3 were in favour of mainly offline teaching. The overall average ES was 0.10 in favour of online teaching and learning, which would be described as small. However, the expectation was that face-to-face learning, being “normal” and more desirable, would prove more effective, so this finding is of interest. Certainly, there will have been issues with at least some teachers struggling to rapidly adapt their teaching to an online situation. One might expect the ES for online teaching to rise as teachers become more skilled at it.
Comparing the times of year, the overall ES was 0.21 in the Winter and 0.05 in the Summer. The institution regarded the Summer end-of-academic-year assessment as the more important of the two, and it is interesting that this had the lower average ES, although this might also result from the different methods of calculating these scores. Comparing courses, First Foreign Language Applied (taken by all students) had an overall ES of 0.09; Precis Writing and Annotation (taken only by Translators) had an overall ES of 0.14; First Foreign Language Translation Workshop (taken by Methodologists) had an overall ES of 0.07; and Foreign Language in the Framework of European Competencies had an overall ES of 0.17. This suggests that online teaching and learning had bigger effects in some courses than in others.

5. Discussion

5.1. Summary

On the State Unified Exam (SUE) pre-entry, there was no significant difference between the online year and the offline year, i.e., the cohorts were equivalent. Translators did better on the SUE than Methodologists, significantly so in the second year.
Winter scores (mid-year) in one year were compared to Winter scores in the next year and likewise for Summer (end of year) scores.
The First Foreign Language Applied course assessment was the only one taken by both Translators and Methodologists. Comparing Winter scores on the Current, Final and Total scores, the online year was better in all cases. Comparing Summer scores on the Current, Final and Total scores, for one, the offline year was better, for one, the online year was better, and for one, the scores were equivalent.
Regarding Precis Writing and Annotation (PWA) for Translators, for Winter Current, Final and Total Scores, in two cases, online was higher than offline (both significant), while one was higher for offline (not significant). For Summer Current, Final and Total Scores, online was higher than offline in two cases (both significant), while one showed no difference (not significant).
Regarding Methodologists assessed on the course First Foreign Language Translation Workshop in the Winter, in two cases, there was no difference, and in one case, online was significantly higher. For the Summer comparisons, in one case, online was higher (significant), and in another, offline was higher (significant), while in the third case, both were equal (not significant).
Considering Foreign Language in the Framework of European Competencies, data were only available for the first year. For Winter scores, in all cases, online was higher than offline (significant in two cases). For Summer, in all cases, offline and online were the same.
Of 24 ESs, only 3 were in favour of mainly offline teaching. On average, the ESs were 0.21 for Winter and 0.05 for Summer. First Foreign Language Applied (taken by all students) had an overall ES of 0.09, Precis Writing and Annotation (taken only by Translators) had an overall ES of 0.14, First Foreign Language Translation Workshop (taken by Methodologists) had an overall ES of 0.07, and Foreign Language in the Framework of European Competencies had an overall ES of 0.17.

5.2. Limitations and Strengths

The major strength of this study was that the whole cohort of students in the Institute was included, which enabled the use of parametric statistical analysis. Only one course was taken by all students; nonetheless, the results differentiated by course are informative. Generally, there was a high rate of assessment completion in all courses.
However, there are some issues to note. There were more males in the second year of admission, which might have made some small difference to the results. Also, there were more missing SUE details in one year than the other, but again, the difference this might have made to the results was small. The decrease in students’ examination performance from more online to less online classes might have resulted from exhaustion or burnout as the pandemic lessened. However, this would not accord with the anecdotal observations of staff members, who felt that students were glad to return to what they considered “normal”.
The differences in the way the Winter and Summer scores were calculated might account for the difference in the average ESs between the two. Winter scores emphasised sub-test scores more than Summer scores. Thus, the Summer scores, which were assumed by the institution to be more important than the Winter scores, might be showing less difference between online and offline merely because they were calculated differently.

5.3. Relationship to Previous Studies

Here, we initially discuss the two previous papers that investigated student examination achievement outcomes during the pandemic [28,29]. Both compared purely online with purely offline learning and focused on the period during the pandemic rather than on emerging from it, as the present paper does. The first paper [28] was problematic, as two interventions took place, and we cannot tell which was responsible for which outcomes.
The second paper [29] was considerably better but only investigated the number of students who passed examinations rather than the absolute scores of each student in each exam as in the present paper. We would argue that this lack of detail may have significantly biased the findings. Additionally, this paper was about Telecommunication Engineering, which might be a subject that lends itself to online learning better than Foreign Language learning, so the findings may not be broadly generalisable to other subjects.
Another study [30] focused on creative idea generation rather than examination results. At first glance, yet another paper [31] suggests a good design, but it used a strange version of blended learning and purpose-built tests rather than examination results. Its conclusion that the face-to-face condition yielded the best results is somewhat suspect. More recently, the move from online to offline learning and teaching in universities has been addressed in a qualitative study of 24 international students at a British university [33]. Generally, the students were resistant to the transfer, citing psychological anxiety, financial losses and negative learning experiences. Students had developed a reliance on digital resources while learning remotely. This qualitative study is an interesting complement to the present quantitative study. In Saudi Arabia, the perceptions of 480 students from both science and arts departments were surveyed [34]. The majority felt tired (77%) and unhappy (63%) after starting offline classes again. The majority believed that offline classes created difficulty in time management and concentration, while online they felt more comfortable, alert and satisfied and gained higher scores in exams. The majority (72%) preferred the online mode of learning and wished to continue it. These results were also in line with the present study, although again using a quite different methodology. Of course, one does not know whether these feelings would be sustained in the longer term.

5.4. Interpretation

It was interesting that Translators performed better than Methodologists (future teachers of English), since one would hope that future teachers of English would be of a high standard. For Precis Writing and Annotation for Translators, the advantage of online learning was more marked for both Winter and Summer scores. Regarding Methodologists on First Foreign Language Translation Workshop, there was less evidence favouring online learning (two cases significant for online; one case significant for offline). It appears that Methodologists were less capable of thriving in an online environment. Again, this is worrying if these students are to become teachers of English.
The difference between Winter and Summer is so consistent that it merits explanation. Of course, the calculation methods differed (see Limitations above), but there may be other explanations. Instructors place more faith in the Summer assessments, as students are aware. Might the Summer assessments therefore be more stressful, and might this stress affect learning that had been online differently from learning that had been offline, remembering that all assessments were taken on a face-to-face basis?

5.5. Implications for Practice, Policy and Future Research

Online teaching and learning are not without challenges, and many of these have been summarised [35]. Learners’ issues include learners’ expectations, readiness, identity and participation. Instructors’ issues include changing faculty roles, transitioning from face-to-face to online, time management and teaching styles. Higher education institutions needed to provide professional development for instructors, training for students and technical support for content development.
The implications for practice from this study are that extra attention needs to be given to the teachers of Methodologists, to ensure that their students become as competent with online learning as the Translators. The teachers might wish to reconsider their views on the relative value of Winter and Summer assessments. The rationale for the different methods of calculation needs explicating, since the Winter assessments clearly favour online learning much more than the Summer assessments.
From a broader perspective, providing more professional development for teachers whose digital literacy is relatively poor seems necessary, although, obviously, motivation as well as competence enters into this. Some kind of assessment of digital literacy skills in teachers as well as students may help to refine the kinds of training needed. The SUE seems useful for predicting initial performance and enabling setting by ability but is not a good predictor in the longer term.
Regarding future research, the fact that so few papers have been published on online/offline differences in student examination outcomes is remarkable, and this should certainly feature more in future work. Of course, the reliability and validity of examination results are important issues, and they cannot be considered perfect [32]. The diversification of forms of assessment might at least enable the triangulation of outcomes. A follow-up of students after they have left university might enable the relating of final examination performance to subsequent performance on the job, which is presumably a more important validity indicator.
Using whole cohorts of students within one department of an institution clearly has benefits in terms of applying inferential statistical analysis. It would be beneficial to conduct similar work on students in other departments in the same institution and then on several departments in other institutions. Of course, the complexity of course options might make this difficult. Studies of examination results should be coupled with surveys of student and teacher perceptions of online and offline learning so that both types of data can be triangulated.

6. Conclusions

Returning to the research questions, we provide the following answers:
1. As students moved from more online to less online learning, did their examination performance increase or decrease?
Answer: So far as Winter assessments were concerned, their examination performance decreased. So far as Summer assessments were concerned, this tendency was much weaker. Nonetheless, online learning seems at least as effective as offline learning and possibly more effective.
2. Was there a difference between Winter (mid-academic-year) examination scores and Summer (end-academic-year) examination scores?
Answer: Yes, indeed there was. The advantage of online learning was much larger in the Winter assessments than in the Summer assessments.
These findings suggest that, in emerging from the pandemic, instructors should not simply revert to face-to-face teaching but should instead analyse students’ needs and develop a model of synthesised online/offline learning that capitalises on the advantages of both and avoids the disadvantages of each. This may involve blended learning, with information given largely online and interactive discussion and conceptual development largely face to face. In this paper, we have purely addressed cognitive matters; social and emotional factors will certainly be equally important in devising a way forward.
EFL teaching and learning in another country may have different pedagogical principles and different levels of development in student and teacher skills in online learning, so it is uncertain whether the same results would be found. However, our findings should at least be considered by other countries. Also, it may be that our results could be generalised to other subjects beyond EFL, but other subjects may also have different pedagogical principles and different levels of development in student and teacher skills in online learning. Again, it is an empirical question whether the same results would be found. However, in Section 5.3 above, we note that similar results using different methodologies have already been found in other countries and in other subjects.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14070731/s1, Supplementary Information S1: Variability Between Classes; Supplementary Information S2: SUE and First Foreign Language Applied; Supplementary Information S3: Analysis by Class; Supplementary Information S4: Summary and Discussion.

Author Contributions

Conceptualization, K.T., N.E. and N.S.; methodology, K.T. and N.E.; validation, K.T. and N.E.; formal analysis, K.T.; investigation, N.E. and N.S.; resources, N.E. and N.S.; data curation, K.T. and N.E.; writing—original draft preparation, K.T.; writing—review and editing, K.T. and N.E.; supervision, N.E. and N.S.; project administration, N.E. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was not financially supported by any external organization.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee of RUDN University (protocol code RUDN 245916, 3 September 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data involved in this study can be made available to interested researchers on request.

Acknowledgments

The authors are grateful to the many students who participated in this research.

Conflicts of Interest

The authors declare that they have no competing or conflicting interests.

References

  1. TopHat. Face-to-Face Learning Definition and Meaning. 2024. Available online: https://tophat.com/glossary/f/face-to-face-learning (accessed on 24 March 2024).
  2. Moore, J.L.; Dickson-Deane, C.; Galyen, K. E-Learning, online learning, and distance learning environments: Are they the same? Internet High. Educ. 2011, 14, 129–135. [Google Scholar] [CrossRef]
  3. Hrastinski, S. What do we mean by Blended Learning? TechTrends 2019, 63, 564–569. [Google Scholar] [CrossRef]
  4. U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies; 2009. Available online: https://repository.alt.ac.uk/629/1/US_DepEdu_Final_report_2009.pdf (accessed on 24 March 2024).
  5. Topping, K.J.; Douglas, W.; Robertson, D.; Ferguson, N. The effectiveness of online and blended learning from schools: A systematic review. Rev. Educ. 2022, 10, e3353. [Google Scholar] [CrossRef]
  6. Li, R. Research trends of blended language learning: A bibliometric synthesis of SSCI-indexed journal articles during 2000–2019. ReCALL 2022, 34, 309–326. [Google Scholar] [CrossRef]
  7. Yu, A.Q.; Trainin, G. A meta-analysis examining technology-assisted L2 vocabulary learning. ReCALL 2022, 34, 235–252. [Google Scholar] [CrossRef]
  8. Paechter, M.; Maier, B. Online or face-to-face? Students’ experiences and preferences in e-learning. Internet High. Educ. 2010, 13, 292–297. [Google Scholar] [CrossRef]
  9. Bali, S.; Liu, M.C. Students’ perceptions toward online learning and face-to-face learning courses. J. Phys. Conf. Ser. 2018, 1108, 012094. [Google Scholar] [CrossRef]
  10. Zhang, D.S. Interactive multimedia-based e-learning: A study of effectiveness. Am. J. Distance Educ. 2010, 19, 149–162. [Google Scholar] [CrossRef]
  11. Soffer, T.; Nachmias, R. Effectiveness of learning in online academic courses compared with face-to-face courses in higher education. J. Comput. Assist. Learn. 2018, 34, 534–543. [Google Scholar] [CrossRef]
  12. Yen, S.C.; Lo, Y.; Lee, A.; Enriquez, J.M. Learning online, offline, and in-between: Comparing student academic outcomes and course satisfaction in face-to-face, online, and blended teaching modalities. Educ. Inf. Technol. 2018, 23, 2141–2153. [Google Scholar] [CrossRef]
  13. Larson, D.K.; Sung, C.H. Comparing student performance: Online versus blended versus face-to-face. J. Asynchronous Learn. Netw. 2019, 13, 31–42. [Google Scholar] [CrossRef]
  14. Cavanaugh, J.K.; Jacquemin, S.J. A large sample comparison of grade-based student learning outcomes in online vs. face-to-face courses. Online Learn. 2015, 19, EJ1062940. [Google Scholar] [CrossRef]
  15. Castro, M.D.B.; Tumibay, G.M. A literature review: Efficacy of online learning courses for Higher Education institution using meta-analysis. Educ. Inf. Technol. 2021, 26, 1367–1385. [Google Scholar] [CrossRef]
  16. Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The Difference between Emergency Remote Teaching and Online Learning. EDUCAUSE Review. 2020. Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (accessed on 14 January 2024).
  17. Pokhrel, S.; Chhetri, R. A literature review on impact of Covid-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
  18. Ali, W. Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. High. Educ. Stud. 2020, 10, 16–25. [Google Scholar] [CrossRef]
  19. Adnan, M.; Anwar, K. Online learning amid the COVID-19 pandemic: Students’ perspectives. J. Pedagog. Sociol. Psychol. 2020, 2, 45–51. [Google Scholar] [CrossRef]
  20. Coman, C.; Tîru, L.G.; Mesesan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online teaching and learning in higher education during the Coronavirus pandemic: Students’ perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  21. Karalis, T.; Raikou, N. Teaching at the times of COVID-19: Inferences and implications for higher education pedagogy. Int. J. Acad. Res. Bus. Soc. Sci. 2020, 10, 479–493. [Google Scholar] [CrossRef] [PubMed]
  22. Shahzad, A.; Hassan, R.; Aremu, A.Y.; Hussain, A.; Lodhi, R.N. Effects of COVID 19 in e learning on higher education institution students: The group comparison between male and female. Qual. Quant. 2021, 55, 805–826. [Google Scholar] [CrossRef] [PubMed]
  23. Sharma, A.; Alvi, I. Evaluating pre and post COVID 19 learning: An empirical study of learners’ perception in higher education. Educ. Inf. Technol. 2021, 26, 7015–7032. [Google Scholar] [CrossRef]
  24. Almazova, N.; Krylova, E.; Rubtsova, A.; Odinokaya, M. Challenges and opportunities for Russian higher education amid COVID-19: Teachers’ perspective. Educ. Sci. 2020, 10, 368. [Google Scholar] [CrossRef]
  25. Chang, C.L.; Fang, M. E-learning and online instructions of higher education during the 2019 novel coronavirus diseases (COVID-19) epidemic. J. Phys. Conf. Ser. 2020, 1574, 012166. [Google Scholar] [CrossRef]
  26. Mishra, L.; Gupta, T.; Shree, A. Online teaching-learning in higher education during lockdown period of COVID-19 pandemic. Int. J. Educ. Res. Open 2020, 1, 100012. [Google Scholar] [CrossRef] [PubMed]
  27. Atwa, H.; Shehata, M.H.; Al-Ansari, A.; Kumar, A.; Jaradat, A.; Ahmed, J.; Deifalla, A. Online, face-to-face, or blended learning? Faculty and medical students’ perceptions during the COVID-19 pandemic: A mixed-method study. Front. Med. 2022, 9, 791352. [Google Scholar] [CrossRef] [PubMed]
  28. Gonzalez, T.; de la Rubia, M.A.; Hincz, K.P.; Comas-Lopez, M.; Subirats, L.; Fort, S.; Sacha, G.M. Influence of COVID-19 confinement in students’ performance in higher education. PLoS ONE 2020, 15, e0239490. [Google Scholar] [CrossRef]
  29. Iglesias-Pradas, S.; Hernandez-García, A.; Chaparro-Pelaez, J.; Prieto, J.L. Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: A case study. Comput. Hum. Behav. 2021, 119, 106713. [Google Scholar] [CrossRef] [PubMed]
  30. Zhang, J.; Dai, Y.; Zhao, F. Comparative study on online and offline teaching for creative idea generation. Front. Psychol. 2022, 13, 872099. [Google Scholar] [CrossRef]
  31. Sun, Q.; Zhang, L.J. Examining the relative effectiveness of online, blended and face-to-face teaching models in promoting the professional development of foreign language teachers. Porta Linguarum Int. J. Foreign Lang. Teach. Learn. 2023, 2023c, 13–27. [Google Scholar] [CrossRef]
  32. Brown, G.T.L. The validity of examination essays in higher education: Issues and responses. High. Educ. Q. 2010, 64, 276–291. [Google Scholar] [CrossRef]
  33. Zhao, X.; Xue, W. From online to offline education in the post-pandemic era: Challenges encountered by international students at British universities. Front. Psychol. 2023, 13, 1093475. [Google Scholar] [CrossRef] [PubMed]
  34. Riaz, F.; Mahmood, S.E.; Begum, T.; Ahmad, M.T.; Al-Shaikh, A.A.; Ahmad, A.; Shati, A.A.; Khan, M.S. Students’ preferences and perceptions regarding online versus offline teaching and learning post-COVID-19 lockdown. Sustainability 2023, 15, 2362. [Google Scholar] [CrossRef]
  35. Kebritchi, M.; Lipschuetz, A.; Santiague, L. Issues and challenges for teaching successful online courses in higher education: A literature review. J. Educ. Technol. Syst. 2017, 46, 4–29. [Google Scholar] [CrossRef]
Table 1. Parameters of the participant sample.

| Year of Admission | Type of Course | N Students | Gender: N Males | N Classes | Typical Class Size | Change Investigated |
|---|---|---|---|---|---|---|
| First year (total n = 174) | Methodologists | 45 | 8/45 = 18% | 4 | 10–14 | 2nd–3rd year |
| First year | Translators | 129 | 21/129 = 16% | 11 | 10–14 | 2nd–3rd year |
| Second year (total n = 203) | Methodologists | 60 | 9/60 = 15% | 5 | 11–13 | 1st–2nd year |
| Second year | Translators | 143 | 38/143 = 27% | 13 | 9–12 | 1st–2nd year |
Table 2. Numbers with SUE data for two years of admission.

| Year | Valid N | Valid % | Missing N | Missing % | Total N | Total % |
|---|---|---|---|---|---|---|
| First year of study | 147 | 84.5% | 27 | 15.5% | 174 | 100.0% |
| Second year of study | 161 | 79.3% | 42 | 20.7% | 203 | 100.0% |
Table 3. Comparing Winter Current, Final and Total Scores for First Foreign Language Applied.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Winter first-year Current Score | 66.14 | 359 | 11.561 | 0.610 |
| Winter second-year Current Score | 61.88 | 359 | 11.406 | 0.602 |
| Winter first-year Final Score | 17.81 | 375 | 12.800 | 0.661 |
| Winter second-year Final Score | 17.44 | 375 | 6.355 | 0.328 |
| Winter first-year Total Score | 80.92 | 376 | 11.486 | 0.592 |
| Winter second-year Total Score | 78.94 | 376 | 12.285 | 0.634 |
Table 4. Comparing Summer Current, Final and Total Scores for First Foreign Language Applied.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Summer first-year Current Score | 41.74 | 358 | 10.521 | 0.556 |
| Summer second-year Current Score | 43.87 | 358 | 12.191 | 0.644 |
| Summer first-year Final Score | 39.74 | 367 | 11.153 | 0.582 |
| Summer second-year Final Score | 37.12 | 367 | 10.802 | 0.564 |
| Summer first-year Total Score | 80.04 | 370 | 11.928 | 0.620 |
| Summer second-year Total Score | 80.40 | 370 | 14.374 | 0.747 |
Table 5. Winter Current, Final and Total Scores for Precis Writing and Annotation for Translators.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Current Winter exam first-year Score (2nd or 1st year) | 65.88 | 261 | 10.911 | 0.675 |
| Current Winter exam second-year Score (3rd or 2nd year) | 59.44 | 261 | 12.439 | 0.770 |
| Final Winter exam first-year Score (2nd or 1st year) | 17.46 | 270 | 10.867 | 0.661 |
| Final Winter exam second-year Score (3rd or 2nd year) | 18.54 | 270 | 6.274 | 0.382 |
| Total Winter exam first-year Score (2nd or 1st year) | 80.84 | 271 | 11.904 | 0.723 |
| Total Winter exam second-year Score (3rd or 2nd year) | 77.82 | 271 | 13.549 | 0.823 |
Table 6. Summer Current, Final and Total Scores for Precis Writing and Annotation for Translators.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Current Summer exam first-year Score (2nd or 1st year) | 62.65 | 253 | 12.12 | 0.76 |
| Current Summer exam second-year Score (3rd or 2nd year) | 60.86 | 253 | 14.81 | 0.93 |
| Final Summer exam first-year Score (2nd or 1st year) | 18.95 | 266 | 11.85 | 0.73 |
| Final Summer exam second-year Score (3rd or 2nd year) | 19.26 | 266 | 11.62 | 0.71 |
| Total Summer exam first-year Score (2nd or 1st year) | 79.47 | 269 | 12.72 | 0.78 |
| Total Summer exam second-year Score (3rd or 2nd year) | 77.46 | 269 | 17.13 | 1.04 |
Table 7. Winter Current, Final and Total Scores for First Foreign Language Translation for Methodologists.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Current Winter exam first-year Score (2nd or 1st year) | 63.05 | 97 | 12.392 | 1.258 |
| Current Winter exam second-year Score (3rd or 2nd year) | 63.77 | 97 | 11.107 | 1.128 |
| Final Winter exam first-year Score (2nd or 1st year) | 21.80 | 105 | 19.117 | 1.866 |
| Final Winter exam second-year Score (3rd or 2nd year) | 17.68 | 105 | 7.201 | 0.703 |
| Total Winter exam first-year Score (2nd or 1st year) | 80.05 | 105 | 10.365 | 1.012 |
| Total Winter exam second-year Score (3rd or 2nd year) | 80.50 | 105 | 11.863 | 1.158 |
Table 8. Summer Current, Final and Total Scores for First Foreign Language Translation for Methodologists.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Current Summer exam first-year Score (2nd or 1st year) | 64.86 | 98 | 10.16 | 1.03 |
| Current Summer exam second-year Score (3rd or 2nd year) | 59.50 | 98 | 14.09 | 1.42 |
| Final Summer exam first-year Score (2nd or 1st year) | 16.46 | 102 | 13.14 | 1.30 |
| Final Summer exam second-year Score (3rd or 2nd year) | 20.52 | 102 | 8.68 | 0.86 |
| Total Summer exam first-year Score (2nd or 1st year) | 78.77 | 102 | 11.48 | 1.14 |
| Total Summer exam second-year Score (3rd or 2nd year) | 79.73 | 102 | 12.98 | 1.29 |
Table 9. Foreign Languages in the Framework of European Competencies (FLiFEC) Winter Current, Final and Total Scores.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Current Winter first-year FLiFEC Score | 64.54 | 128 | 12.87 | 1.138 |
| Current Winter second-year FLiFEC Score | 58.20 | 128 | 10.75 | 0.950 |
| Final Winter first-year FLiFEC Score | 19.10 | 173 | 11.40 | 0.87 |
| Final Winter second-year FLiFEC Score | 18.41 | 173 | 7.78 | 0.59 |
| Total Winter first-year FLiFEC Score | 79.66 | 174 | 12.10 | 0.92 |
| Total Winter second-year FLiFEC Score | 77.90 | 174 | 12.08 | 0.92 |
Table 10. Foreign Languages in the Framework of European Competencies (FLiFEC) Summer Current, Final and Total Scores.

| Score | Mean | N | Std. Deviation | Std. Error Mean |
|---|---|---|---|---|
| Summer first-year FLiFEC Current Score | 63.62 | 170 | 11.87 | 0.91 |
| Summer second-year FLiFEC Current Score | 63.75 | 170 | 13.98 | 1.07 |
| Summer first-year FLiFEC Final Score | 16.12 | 171 | 8.74 | 0.67 |
| Summer second-year FLiFEC Final Score | 16.75 | 171 | 3.68 | 0.28 |
| Summer first-year FLiFEC Total Score | 78.57 | 173 | 12.55 | 0.95 |
| Summer second-year FLiFEC Total Score | 80.00 | 173 | 15.33 | 1.17 |
