Article

Digital Learning Ecosystem to Enhance Formative Assessment in Second Language Acquisition in Higher Education

by Ana María Pinto-Llorente * and Vanessa Izquierdo-Álvarez
Research GRoup in InterAction and eLearning (GRIAL), Research Institute for Educational Sciences (IUCE), University of Salamanca, 37008 Salamanca, Spain
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(11), 4687; https://doi.org/10.3390/su16114687
Submission received: 19 April 2024 / Revised: 24 May 2024 / Accepted: 28 May 2024 / Published: 31 May 2024
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

Formative assessment (FA) provides lecturers and learners with information about the quality and development of different tasks. Formative feedback becomes an essential element that increases the quality of these tasks, guides learners through the process, and motivates them to be actively involved in achieving the learning objectives and skills. A digital learning ecosystem (DLE) offers the possibility of quality, effective, and sustainable education, enabling people to continue their training from anywhere, at any time, and through personalised pathways. In this framework, we present a descriptive study with a qualitative approach to explore the views of 91 participants regarding the process of FA in second language acquisition (SLA) in a DLE. According to the results, the DLE provides the tools necessary to favour FA and to obtain valuable data on the learners’ academic achievements in SLA. The learners take an active role in learning English, developing language skills such as reading, writing, and listening, as well as other skills such as digital competence, communication, problem solving, empathy, and creativity. The data obtained help lecturers to identify learners’ difficulties in SLA and to address them with different didactic strategies. It can be concluded that FA plays a relevant role in SLA in a DLE.

1. Introduction

In today’s society, it is undeniable that digital technologies have a significant impact and have revolutionised different areas of our lives. In education, they have provided opportunities for learners and educators to use, interact with, and create materials and tools that improve the teaching–learning process, as well as its quality, flexibility, and sustainability [1,2].
Educational technology experts, such as Area Moreira [3], do not hesitate to point out that digital technologies are having such an impact that contemporary society is becoming a symbiotic organism dependent on the digital ecosystem. In higher education, this situation has led to the digitisation of universities and to a strong digital transformation of these traditional institutions, intended to halt the obsolescence of conventional university pedagogy and to move towards a more flexible, ubiquitous university that can respond to the needs of citizens [3].
In ecology, an ecosystem, or natural ecosystem, can be defined as a unit composed of plants, animals, and microorganisms: a unit made up of biotic factors that interact with all the non-living physical factors, the abiotic factors, of an environment in a specific area [4].
If we extrapolate this definition to the educational environment and draw an analogy with the natural ecosystem, a digital learning ecosystem (DLE) would include biotic components, such as users, actors, or communities (teachers, learners, tutors, etc.) and digital content (documents, images, videos, tools, etc.), as well as abiotic components that make up the environment, including pedagogical theories, teaching–learning methods, and technologies [5]. We then have a digital ecosystem in which, as in a natural ecosystem, a set of relationships and interactions between the biotic and abiotic components is established in a specific environment [6,7,8].
Accepting this definition and taking into account previous research, the complex interactions between learners, between learners and content, between learners and the interface, or between learners and teachers [8], together with the design of the DLE, play a significant role in improving the teaching–learning process and the learning experience [2,8,9], in providing formative feedback [10,11], in helping learners develop 21st-century core skills [12,13,14], in transforming information into knowledge [15,16], and in promoting student engagement [2,8].
It is also important to highlight how the implementation of a well-designed DLE offers the possibility of quality [17], effective [13], and sustainable education [18] in the age of digital technology and artificial intelligence.
It must also provide citizens with the flexibility to study or train from anywhere, at any time, and through personalised learning pathways [19]. A DLE that enables formative assessment (FA) can improve learners’ competences [20]. In this sense, Black and Wiliam [21] refer to the concept of FA as a process in which different tasks are carried out by teachers and students, providing feedback that allows the modification of teaching and learning activities, as well as evidence that can be used to adapt the teaching to meet the students’ needs. It is a process that involves planning, discussion, consensus building, and reflection, in addition to measuring, analysing, and improving performance [22] (p. 2), based on data gathered from authentic assessment tasks that enable the assessment of learners’ competences and the achievement of learning objectives.
The Organisation for Economic Co-operation and Development (OECD) [23] highlights how FA increases the retention of knowledge and the effectiveness of the time students invest in their learning process. In this sense, FA tells us exactly where students are in their progress towards the learning goals [24].
Liao et al. [25] argue that FA is a necessary tool to improve the teaching–learning process. It is a pedagogical strategy that promotes learning [26] and students’ self-regulation of learning. It makes students aware of their progress in the learning process, as well as of their strengths and limitations [27].
Parmigiani et al. [28] suggest that there are three FA strategies, namely self-assessment, peer assessment, and group assessment. FA helps to make the learning process transparent, as it provides insight into the key elements for the correct acquisition of knowledge [29]. As Divjak et al. [30] point out, the key to creating robust learning is coherence between intended learning outcomes and activities, and their alignment with formative and summative assessment. They also emphasise that FA is a predictor of students’ final academic achievement. In line with this issue, Wiliam and Leahy [31] consider that the impact of FA on learning is more relevant than that of other educational actions or interventions. Therefore, the key aspects of FA are based on knowing where learners are going, where learners are, and what learners need to achieve the learning goals [21,32,33]. Thus, the greatest potential of FA lies in its contribution to the design of more effective and meaningful learning environments [28].

2. Literature Review

Regarding FA and second language acquisition (SLA), recent research [34] shows that FA is gaining ground, and the reported experiences show that learners value its use. In this sense, both FA strategies and classroom feedback are highly valued for their contribution to meaningful learning [35]. The study developed by Galora Moya and Salazar Tobar [36] in the context of higher education shows that FA supports students’ learning and autonomy. Its main findings suggest that teacher feedback helps when learning difficulties occur, especially with low achievers or students with learning difficulties. It is concluded that FA for SLA in higher education favours learners who are more motivated, willing to correct mistakes, and engaged in learning.
Research on FA and SLA also emphasises that it is a helpful strategy for developing learner autonomy and reflection [37]. The results of the research conducted by Kabilan and Khan [38] show that the use of FA encourages students to reflect critically on their learning and to self-assess their strengths and weaknesses. The FA process in SLA is also a helpful strategy for promoting collaborative and cooperative learning through peer assessment [39], motivating learners [19], and providing formative feedback [39,40,41]. The study by Baker and Gossman [42] shows that the FA process is crucial for generating motivation, communication, collaborative learning, and flexibility through personalised learning opportunities. Similar ideas are expressed in the research carried out by Pinto-Llorente [43], which shows that the use of FA in a DLE for SLA promotes the active role of learners, as it helps them to develop skills and knowledge and, thus, to be more motivated. The findings of the study by Herrera [44] also show that incorporating the process of FA into a DLE for teaching and learning English makes it a much more motivating and exciting experience for learners, as it brings in the media of the digital age. In short, the process of FA in a DLE contributes to SLA as a dynamic and meaningful medium for learning.
Other previous studies also highlight how experiences with FA in SLA have shown a significant reduction in learners’ anxiety [45]. The results of the research conducted by Ismail et al. [46] also show that the FA process was more effective than other strategies in increasing the motivation to learn, reducing test anxiety, and activating self-regulation strategies in second language learners of English. Research regarding FA and SLA also points out a relevant improvement in learners’ oral skills, writing skills, vocabulary, and grammar [47]. Other studies emphasise how digital technologies enable the design and implementation of learning ecosystems [43,48] in which all the protagonists of the teaching–learning process have access to the necessary tools for FA purposes [35,49].
As Vishnyakova et al. [50] point out, there is a need for further research on FA, calling for the development of new identification and innovative formats (pp. 68–69) so that research stays current and the academic community can keep abreast of this issue. Xu et al. [51] note that some studies have tried to define how to design courses with technology to implement FA, emphasising that the main problem is its application in real teaching. Børte et al. [52], on the other hand, state that teachers usually adapt technology to what they already do in traditional face-to-face teaching, often to the detriment of promoting students’ active learning or of adapting learning to the circumstances and needs of citizens in today’s society. Accordingly, rather than simply integrating some technological tools into traditional classrooms, it is necessary to explore new experiences in which digital ecosystems play a relevant role in the implementation of FA in SLA, and to develop designs that favour quality, effective, and sustainable education in digital environments, providing evidence regarding FA and its relevance to SLA.
On this basis, the present research implements a DLE inspired by social constructivist pedagogy. Specifically, the DLE is designed using Moodle (https://moodle.org/getinvolved (accessed on 18 April 2024)), a learning management system in which a modular hypermedia model is implemented. The constructivist environment favours the creation of knowledge and the active role of learners in the learning process, mainly through different didactic materials for knowledge transfer and collaborative activities. In this DLE, different learning strategies are used, including transmissive (labels, files, and web pages), interactive (questionnaire, glossary, and task), collaborative (forum and wikis), and communicative (messages) resources. The information is presented in themes or blocks of content, from the simplest to the most complex, and each block contains assessment, peer-assessment, and self-assessment tasks to encourage FA and the participation of all protagonists of the teaching–learning process. The research was carried out to gather information about FA for learning English as a second language.
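As a purely illustrative aid (not taken from the study’s actual Moodle configuration), the following Python sketch shows how one theme or block of such a modular DLE could be represented as data; the block title and item names are hypothetical, while the four resource strategies mirror those described above.

```python
# Illustrative only: a simplified data model of one theme/block in a Moodle-based DLE.
# The resource strategies mirror the transmissive, interactive, collaborative, and
# communicative types named in the text; titles and items are invented for the example.
block = {
    "title": "Block 3: Listening comprehension",  # hypothetical theme
    "transmissive": ["label: objectives", "file: notes", "web page: study guide"],
    "interactive": ["questionnaire: listening quiz", "glossary: key vocabulary", "task: summary"],
    "collaborative": ["forum: podcast discussion", "wiki: collaborative transcript"],
    "communicative": ["messages: tutor-learner dialogue"],
    "assessment": {"lecturer": True, "peer": True, "self": True},  # FA tasks in every block
}

for strategy in ("transmissive", "interactive", "collaborative", "communicative"):
    print(f"{strategy}: {', '.join(block[strategy])}")
```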

3. Materials and Methods

3.1. Objective and Research Questions

The main objective of this research was to explore and understand the participants’ views on the process of FA for SLA in a DLE.
The study addressed the following research questions:
  • RQ1. What are the perceptions of the learners regarding the process of FA for SLA in a DLE?
  • RQ2. Which competences are strengthened by implementing FA in SLA in a DLE?
  • RQ3. Does the process of FA offer opportunities to reflect on SLA?
  • RQ4. What is the overall impact of the feedback on the SLA in a DLE?

3.2. Method

Based on the objective of the research, a qualitative methodology with a phenomenological approach was used, which provided the opportunity to explore the participants’ perceptions and views of the experience [53]. The research design was a descriptive study with a qualitative approach [54], and the grounded theory process was followed to categorise the dimensions. As Glaser and Strauss [55] point out, this process is dynamic and creative, and two basic strategies can be distinguished: on the one hand, theoretical sampling, the process of data collection, analysis, and categorisation carried out by researchers until theoretical saturation occurs; and, on the other hand, the constant comparative method, which aims to generate theory from a constant comparative analysis of the data collected. In this way, researchers code and reflect on the nature of the collected data from the very beginning [56].

3.3. Participants

The researchers adopted a theoretical sampling approach, conducting the interviews necessary to achieve theoretical and data saturation and to meet the research objective. As Hernández et al. [53] point out, the size of the sample in a qualitative study is not relevant from a probabilistic perspective, since the researchers’ aim is to achieve the depth necessary to understand the phenomenon under study, to achieve the objective, and to answer the research questions. According to Glaser and Strauss [55], theoretical saturation should set the limit of the sample size: saturation means that no additional data can be found that would develop the properties of a category. They define it as the process of data collection for generating theory, whereby the analyst jointly collects, codes, and analyses his data and decides what data to collect next and where to find them, developing his theory as it emerges; this process of data collection is controlled by the emerging theory, whether substantive or formal (p. 45).
The population available for this study consisted of 451 students from the Faculty of Education, who were studying for a degree in Primary Education with a specialisation in English through blended learning. The final sample of the study consisted of 91 participants, which was sufficient to collect in-depth and complete information and to achieve theoretical saturation.

3.4. Instrument

The instrument chosen to collect the data was the individual unstructured interview. The individual format allowed each interviewee’s own discourse, and the meanings and aspects of their inner world, to be considered, while the unstructured format allowed them to express themselves freely.
Before starting the fieldwork, the researchers prepared a general guide covering the content and topics related to the objectives of the study, and used it to lead each of the individual interviews conducted. The guide included the following questions: (1) In your opinion, what was your experience with the assessment process implemented in the DLE?; (2) Do you think that the DLE had the necessary technological tools to carry out FA of the process of second language acquisition and to obtain valuable data on the learners’ academic achievements?; (3) What is the relevance of digital technologies in the implementation of FA?; (4) What are the roles of the teachers and the learners in the formative assessment process within the DLE?; (5) Are the assessment tasks and didactic materials available in the digital learning ecosystem appropriate to practise and improve English?; (6) Which skills are favoured?; (7) Do you think that the process of assessment has encouraged reflection on SLA?; (8) How do you perceive the role of feedback?; (9) Does it play a decisive role in SLA?; (10) Is it an essential element in learning a second language?; and (11) Does feedback encourage reflection on SLA? However, the researchers were free to make any appropriate or necessary changes, according to each situation and participant, both in the order of the questions and in their content.
Qualitative internal validity (credibility) was achieved by following the guidelines and recommendations proposed by Coleman and Unrau [57] and Hernández et al. [53]. Sixteen experts in research methods, education, English, and educational technology from the University of Salamanca agreed on the coding, which guaranteed the external validity (transferability) of the analysis process. Each of the interviews was coded by a researcher and an expert, which made it possible to calculate the degree of agreement between the two coders (inter-coder agreement). The researchers calculated Cohen’s kappa coefficient (k) to measure the agreement between coders in the category system and obtained k = 0.81, a level indicating that the coding was reliable and valid. This high level of agreement between coders meant that each category was retained in the final category tree.
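For illustration, the following is a minimal Python sketch of how Cohen’s kappa can be computed for two coders who each assign every text unit to one category; the labels below are hypothetical toy data, not the study’s coded interviews.

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders who each assign one category per text unit."""
    n = len(coder_a)
    # Observed agreement: proportion of units placed in the same category by both coders.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal category frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical assignments of five interview lines to the study's four categories.
coder_1 = ["Competences", "Reflection L2", "Formative Feedback", "DLE", "Competences"]
coder_2 = ["Competences", "Reflection L2", "Formative Feedback", "DLE", "DLE"]
print(round(cohen_kappa(coder_1, coder_2), 2))  # ≈ 0.74 on this toy sample
```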

3.5. Data Collection and Analyses

The interviews were conducted over a period of three months. At the beginning of each interview, each interviewee was reminded of the research objectives and asked for consent to participate in the study and to be recorded. The interviews were recorded to facilitate later transcription and to increase their accuracy.
For the data analysis, we followed Miles and Huberman’s [58] set of suggestions for qualitative data analysis, as it was the most appropriate for this research. This framework distinguishes three basic tasks: (a) data reduction; (b) organising and processing the data; and (c) generating results and testing conclusions (Figure 1). It is a convergent, recursive design for data analysis that ends when discourse saturation is reached.
Data reduction involves simplifying the large amount of data collected. Categorisation and coding, which identify and differentiate units of meaning, are the methods most commonly used for this purpose. In this research, the units were initially separated according to thematic criteria, and the line was used as the textual unit so as not to distort the meaning of the text. Secondly, the data were reduced: this involved identifying and classifying items, and categorising and coding the data units to identify the thematic components that led to their classification in a particular content category. Each unit was assigned to a specific category, using a hierarchical scheme in which the broadest categories were broken down into subcategories. A priori categories were established based on previous research, to which emerging categories and subcategories were added through the reading of the transcriptions of the interviews. Thirdly, the synthesis and grouping phase was carried out, which resulted in a physical grouping of the units belonging to the same category and in the synthesis, in a metacategory, of information that had common points but was contained in different categories.
Once the data reduction had been carried out, the second basic analytical task was conducted, involving the ordering and processing of the data. When the ordering of the data involves a change in the language used to express them, it is called data transformation; one of these processes involves the use of graphs or diagrams to represent the data and to observe the deep structures and relationships within them.
This was followed by the phase of drawing and verifying conclusions, which involves presenting and interpreting the results and drawing conclusions from the study undertaken.

4. Results

4.1. Semantic Analysis Results

First of all, we will present the results of the semantic analysis. To conduct this analysis, we used the word frequency search function offered by NVivo 12. We searched for the 20 words that occurred most frequently in the 91 coded interviews. The resulting word cloud is shown in Figure 2. The words contained in this cloud have a size that is directly related to their frequency of occurrence. These words therefore reflect the issues most frequently raised by the participants in the study. The word that stands out the most is assessment (500 references), a term that is directly related to the objective of the study since the aim was to explore and understand the participants’ views on the process of FA in SLA in a DLE, followed by ecosystem (473 references); second language (424 references); and digital (419 references). The semantic analysis also highlighted the terms formative (393 references), feedback (387 references), competences (383 references), and reflection (379 references), among others.
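For readers without access to NVivo 12, the following Python sketch shows, under stated assumptions, how a comparable word-frequency query could be run on plain-text interview transcripts; the example texts and stop-word list are placeholders rather than the study’s data.

```python
import re
from collections import Counter

# Placeholder stand-ins for the 91 transcribed interviews (plain-text files in practice).
transcripts = [
    "The formative assessment in the digital ecosystem gave us continuous feedback.",
    "Assessment and feedback in the second language improved my competences and reflection.",
]

STOPWORDS = {"the", "in", "us", "and", "my", "a", "of", "to", "gave"}  # illustrative only

tokens = []
for text in transcripts:
    # Keep alphabetic tokens (including Spanish accented characters) that are not stop words.
    tokens += [w for w in re.findall(r"[a-záéíóúñ]+", text.lower()) if w not in STOPWORDS]

# The 20 most frequent terms, analogous to the word-cloud query described above.
for word, count in Counter(tokens).most_common(20):
    print(f"{word}: {count}")
```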

4.2. Content Analysis Results

As we conducted the content analysis, several specific coding categories emerged (1. Competences, 2. Reflection L2, 3. Formative Feedback, and 4. Digital Learning Ecosystem), which allowed for the further description and interpretation of the data collected. This resulted in a concept map of categories that was created based on the research objectives and the participants’ perceptions of FA in a DLE (Figure 3).
In addition, Table 1 shows the results obtained for each of the categories mentioned above in terms of the following indicators:
  • A: the total number of text units (lines) coded in each category;
  • B: the number of interviews in which the participants refer to each category;
  • C: the percentage of B in relation to the total number of interviews;
  • D: the total number of text units (lines) of the interviews in which the participants refer to each category;
  • E: the percentage of A in relation to D;
  • F: the percentage of A in relation to the total number of text units (lines) of all the individual interviews.
Regarding these data, the categories in which more lines were coded were Competences (n = 5752 lines) and Digital Learning Ecosystem (n = 5256 lines), which represent 30% and 27%, respectively, of the total number of lines coded in the 91 interviews. The results also show that the participants referred to the category of Reflection L2 in all the interviews (n = 91) and to Formative Feedback in almost all of them (n = 90).
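As a sketch of how the indicators A–F could be derived from a coded corpus, the following Python example computes them for a small set of hypothetical interviews; the figures are invented, and only the four category names come from the study.

```python
# Hypothetical coded corpus: each interview records its total number of lines and the
# number of lines coded in each category (category names follow the study).
interviews = [
    {"lines": 120, "coded": {"Competences": 40, "Digital Learning Ecosystem": 35,
                             "Reflection L2": 12, "Formative Feedback": 10}},
    {"lines": 95, "coded": {"Competences": 22, "Reflection L2": 18, "Formative Feedback": 9}},
]

def table_row(category):
    a = sum(iv["coded"].get(category, 0) for iv in interviews)           # A: lines coded
    referring = [iv for iv in interviews if iv["coded"].get(category)]   # interviews citing it
    b = len(referring)                                                   # B
    c = 100 * b / len(interviews)                                        # C: % of all interviews
    d = sum(iv["lines"] for iv in referring)                             # D: lines of those interviews
    e = 100 * a / d if d else 0.0                                        # E: A as % of D
    f = 100 * a / sum(iv["lines"] for iv in interviews)                  # F: A as % of all lines
    return a, b, round(c, 1), d, round(e, 1), round(f, 1)

for cat in ("Competences", "Digital Learning Ecosystem", "Reflection L2", "Formative Feedback"):
    print(cat, table_row(cat))
```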
The results obtained for each of the categories are then presented in more detail. Extracts and quotations from the coded interviews are also included to illustrate the findings. Each of these fragments was given a code according to the following structure: participant number, gender, and age.

4.2.1. Category: Digital Learning Ecosystem

Regarding the results obtained in relation to RQ1. What are the perceptions of the learners regarding the process of FA for SLA in a DLE?, the participants felt that the experience developed in this digital ecosystem provided the lecturers with the appropriate and necessary tools to carry out the FA of the process of SLA and to obtain valuable data on the learners’ academic achievements in order to adapt the didactic materials and the tasks to their specific levels and needs. They emphasised that the data obtained through FA helped lecturers to identify where learners had difficulties in learning a second language and to address them with different didactic strategies.
In addition, they highlighted the positive aspect of having an environment in which the FA was a core element because they felt that all the work developed online and their progress in SLA was valued.
“The assessment tasks that we have developed in the different modules of the digital learning ecosystem are appropriate for the lecturers to measure our progress and adapt the materials and activities to our needs. We did a lot of work remotely during the development of the subject, and this is the fairest way to keep that in mind.”
(Participant_69, woman, 27)
“We have done a variety of online tasks that have allowed us to practise English grammar, as well as skills such as reading, writing, or listening. These activities, which we did remotely during the course, have been considered in determining our final grade, which is undoubtedly more meaningful than only basing the grade on a final exam.”
(Participant_80, man, 32)
“One thing that I would like to highlight about the digital learning ecosystem designed and implemented in this subject is the use made of a variety of tasks and technological tools because through them we were continuously assessed because the different activities completed were considered. The lecturers have assessed us continuously and not just through a final exam.”
(Participant_57, woman, 33)
“We are given several activities to complete throughout the subject, and I think that this has been good because the continuous work helped me to acquire and practise the course content and improve my English skills. In addition, it has also meant that we could be constantly monitored as most of the activities set and completed during the course were assessed.”
(Participant_71, man, 43)
The results of the content analysis also refer to the active role of the protagonists of the teaching–learning process and to the different roles that the learners and the lecturers acquired. In the participants’ discourse, it is possible to see their perception of how the tasks that they carried out using different technological tools allowed them to play a more active role in their learning of this foreign language. They were able to make decisions according to their personal circumstances or needs, set their own learning pace, monitor their progress, and take individual responsibility. The results also show the importance of the role of the lecturers as guides or advisors in the teaching–learning process, as well as the great effort that they had previously made in designing the digital ecosystem implemented. The participants stressed that the work completed by the lecturers before and during the course had been essential for the success of the teaching–learning process. Similarly, they stated that the involvement of the lecturers was a necessary part of the process, as they had to master the technological tools used in order to design, structure, and plan the tasks properly, and to anticipate the difficulties that the learners might have in completing them, difficulties that could have led the learners to make mistakes that would have hindered their learning.
“We have had a very active role throughout the course, and we were fully involved as the activities unfolded, making decisions and taking responsibilities.”
(Participant 39, woman, 24)
“I think that our role was very active. We had the opportunity to make decisions according to our circumstances or the needs of our level in the second language. In a way, we were responsible for our learning and our progress, we set our own pace for learning.”
(Participant_79, woman, 49)
“I would like to highlight the opportunities that were offered to us in the digital ecosystem to complete the tasks offered to us according to our needs with the aim of being autonomous in our learning of English. All of this was possible, of course, thanks to the essential work of the lecturers, who have continuously guided us through the process.”
(Participant_24, woman, 31)
“The lecturers were essential, from the design of the tasks set through the different technological tools to the way we did them. I think that their work was essential, and I think that they put a lot of effort into guiding us and helping us through the process.”
(Participant_13, woman, 26)

4.2.2. Category: Competences

As far as the results of RQ2. Which competences are strengthened by implementing FA in SLA in a DLE? are concerned, there is clear evidence of the participants’ positive opinions about the possibilities offered by the tasks and didactic materials available in the different modules of the digital technological ecosystem of the subject to practise and to improve their level of English. More specifically, the participants referred to the opportunities to practise the core skills of the language, skills such as reading, writing, and listening, through a variety of tasks available to them.
“We were able to practise listening in different tasks and, thus, to improve this skill. A key skill in learning a second language that allows us to extract and to understand the most important information from the different discourses we listened to, as well as to be prepared to do different assessment tasks focused on this skill.”
(Participant_70, man, 36)
“In the digital ecosystem, we have had different tasks developed through several technological tools that have helped us to measure our ability to understand native people with different accents. They have also allowed us to discriminate the information that we received and to retain the most important parts of it, or to understand the small details that we were asked about in the assessment activities.”
(Participant_46, woman, 28)
“I think that the tasks available in the digital ecosystem were very suitable to meet our language needs and to achieve the objectives set, especially to practise and improve listening, reading, and writing.”
(Participant_ 61, woman, 40)
“We have been given many activities. They have helped us to improve our oral and written comprehension, as well as other specific contents related to the objectives of the subject.”
(Participant_33, man, 33)
“The activities that we have done have helped us to gradually improve our oral comprehension. This is one of the most complicated skills and one of the most difficult to improve in, but we have had enough practise, and this has been the best way to do it.”
(Participant_83, man, 24)
The participants’ discourse also highlighted that the digital ecosystem and the technological tools used in the design and development of the tasks provided the learners with several opportunities to improve their digital competence and their confidence in using technological tools to improve their level of English and to provide them with a more personalised learning experience. The participants indicated that they could use a variety of technological tools such as forums, workshops, or quizzes, and digital materials, such as digital presentations or documents, videocasts, or podcasts, with a learning purpose.
“I think that my digital competence has improved as a result of this experience, and this is directly due to the possibilities we have had to use different technological tools such as quizzes or forums to develop the tasks proposed and to communicate with the rest of the students and the lecturers.”
(Participant_48, woman, 24)
“We have been able to use a good variety of technological tools to work on the different contents of the subject, to improve our level of English, and, at the same time, to improve our digital competence.”
(Participant_1, woman, 42)
“I think that the use of digital technologies has made the experience more personal and has allowed us to choose practices or materials according to our needs.”
(Participant_19, man, 28)
“We have had the opportunity to participate in workshops to assess our peers, and to access the content or to practise the skills for example through videocasts or podcasts.”
(Participant_43, man, 33)
The participants emphasised the aspects of digital competence that had been improved. They thought that this experience had favoured communication and collaboration among themselves, and between them and the lecturers through digital technologies, allowing them to share knowledge and create a learning community. They also stressed that the experience promoted the development of a relevant competence related to communication and collaboration, adopting an empathic communication between all the protagonists of the teaching–learning process, respecting the different points of view and ideas of their peers, as well as negotiating the development of the group tasks; a competence that also favoured the feeling of belonging and, consequently, reduced the feeling of loneliness. The participants highlighted that the positive interactions between learners and between learners and lecturers played a relevant role in learning a second language and in improving the learners’ confidence in this language. On the other hand, it was emphasised how the experience facilitated the creation of digital content. They had the opportunity to create knowledge by developing different digital tasks in the second language. Finally, the experience developed helped the participants to improve the area of digital competence related to problem solving, being able to identify and solve problems when using the technological tools available in the DLE. Moreover, the use of digital tools to create knowledge helped them to do so in an innovative way, improving their creativity.
“The technological tools available in the digital ecosystem have facilitated communication and collaboration between us, and with the lecturers. We have been able to share our practices and experiences.”
(Participant_17, woman, 32)
“The fact that we have been able to communicate synchronously and asynchronously with our classmates and lecturers has fostered a sense of belonging.”
(Participant_38, woman, 45)
“Our interactions in some forums have been in the second language and this has allowed us to practise and to improve our level of English.”
(Participant_7, man, 43)
“We have had the opportunity to develop different digital materials, to create practices in the second language, and this is a relevant aspect for our future practice as it is an essential part of our work as teachers.”
(Participant_15, woman, 35)
“We have worked together to solve problems and to create innovative experiences in the second language.”
(Participant_45, man, 34)
Finally, the participants emphasised that they had had the opportunity to improve and reflect on their level of digital competence, to understand where they needed to improve or update, and to appreciate the importance of having a good level of digital competence in order to continue to improve in the learning of a second language.
“It is clear that it is necessary to have a good digital competence as it is important in all areas of our lives and, of course, in learning a second language.”
(Participant_62, man, 23)
“It is important that we can improve our digital competence so that we are up to date with the use of different technological tools for teaching and learning English. If our digital competence is good, we will be more confident to use different technological tools in the classes we will teach in the future.”
(Participant_51, man, 34)

4.2.3. Category: Reflection on the L2

Regarding the results obtained concerning RQ3. Does the process of FA offer opportunities to reflect on SLA?, the participants highlighted how, throughout the development of the subject, they had the opportunity to reflect on their progress, on what they were learning, and how this reflection had helped them to increase their self-confidence in SLA, which had had a relevant impact on the achievement of the learning outcomes. The participants felt that reflection was greatly enhanced by the possibilities they were given to participate, developing different individual and group activities related to the different contents of the subject.
It was also emphasised how the digital ecosystem implemented had fostered an environment for the participants to think critically about their strengths and weaknesses in learning English. They pointed out how they could reflect on the whole process of SLA, on the one hand, thanks to the lecturers’ supervision, and, on the other hand, the progress that they saw in their level of English after receiving the results of the tasks they had completed.
The learners emphasised the freedom that they had to organise the whole learning process in the DLE. This freedom was seen as a key element in encouraging reflection and self-directed action towards the language outcomes. They pointed out how they had been able to carry out the activities themselves, choosing, whenever possible, when to do them and which ones were best suited to their interests or needs. They also added that reflection, thinking critically about their progress, had helped them to acquire competences that allowed them to continue learning English autonomously. Autonomous learning was seen as essential for SLA, as they stressed that it was a process in which they often had to be independent and that learning had to be a lifelong process.
“We have had the opportunity to do many activities that helped us to practise the second language and to reflect on our progress. This practice has helped us not only to improve our level of English but also our confidence in the language and its use in different contexts. It is complicated to get it, but I think that it is possible with good supervision and enough practice. That is the best way to get it.”
(Participant_9, woman, 25)
“We took part in an experience in which the activities designed have allowed us to work on our own, with us taking full responsibility for these tasks. It was a very effective way of learning that encouraged autonomous learning of a second language.”
(Participant_35, woman, 43)
“The fact that there were some activities that we could do for a long time and as often as we wanted or needed to, encouraged autonomous learning, and gave us the freedom to organise the whole learning process according to our needs and reflect about what we were learning.”
(Participant_72, man, 37)
“In the digital ecosystem implemented, each of us has been responsible for the work to be done, and we have chosen the ones that have been best suited to our needs and have done them when we can, based on our personal and professional circumstances, based on our level, our progress.”
(Participant_77, man, 29)

4.2.4. Category: Formative Feedback

Regarding the results of RQ4. What is the overall impact of the feedback on the SLA in a DLE?, the analysis of the participants’ discourse showed its undeniable value in learning a second language. Feedback was an essential element that provided quality to the implemented DLE, since it allowed the learners to obtain clear and immediate clarification on the tasks, on the learning products, or on subject content about which they were uncertain or on which they could improve their performance. They felt that the feedback allowed them to know where they were in relation to the learning objectives of the subject and to improve the quality of the activities they had to carry out in the different modules of the course.
The participants pointed out how feedback provided them with a realistic view of their level and of their progress at any given time, and it guided them through the process of learning a second language, helping them to reflect on what they had acquired, their needs, or any gaps in their knowledge. In this sense, feedback was described as helpful information that motivated and engaged the learners to take an active role in their learning and the development of their second language skills.
Finally, the research findings highlighted that the technological tools available in the DLE had supported or had facilitated continuous formative feedback, providing opportunities for lecturers and learners to engage in a continuous dialogue, and to offer immediate feedback regarding the learning products, tasks, or progress in the second language.
“When we completed some of the tasks, such as the tests, we received immediate feedback, as the questions were accompanied by an explanation as to why that was the correct answer and not the alternatives we were given.”
(Participant_32, man, 24)
“We received feedback from the lecturers because when we completed the tasks, we always received an explanation for each of the answers, which meant that we could understand our mistakes and what we had got right, and it also meant that we knew what aspects to focus on.”
(Participant_65, woman, 26)
“An important part of the assignments was the feedback that we received immediately and in context. This was necessary to progress through the course content with greater certainty and with a much greater sense of confidence when it came to completing the assessment tasks and the final exam.”
(Participant_71, man, 43)
“The formative feedback we received at the end of each assignment gave us a much more realistic view of our needs and our progress through the course. It gave each of us information about our performance and progress.”
(Participant_16, woman, 45)
“The immediate feedback that we received was very useful because we quickly knew what we had done wrong, and we were given appropriate explanations that helped us clarify things that we were unsure about.”
(Participant_37, woman, 31)
“Technology has played an important role because it has enabled us to get continuous feedback on our assignments.”
(Participant_32, man, 24)
“The technological tools such as quizzes or forums have been key for feedback. These tools have allowed us to establish a fluid communication between all of us, between the learners, and between the learners and the lecturers.”
(Participant_79, woman, 49)

5. Discussion

In relation to the first research question, concerning students’ perceptions of the FA process for SLA in a DLE, the results showed that technology was a core element for the implementation of the FA process in the DLE [35,47,48,49,59]. It allowed the lecturers to define different tasks and didactic materials according to the learners’ needs, as well as to access the data on their participation results and progress. These data allowed the lecturers to be aware of the students’ needs, to respond to them, and to guide them in the SLA [39,40,41]. These findings are consistent with those reported by Galora Moya and Salazar Tobar [36], for whom the importance lies in how feedback is relayed to learners, seeing it as an engine of change for meaningful learning. The collection of data concerning the learners’ progress also allowed the lecturers to adjust the lessons to achieve the learning goals and to adapt the teaching process to the learners’ needs in order to provide them with an adequate response that would allow them to continue to make progress in learning a second language [35,36]. In this sense, the results are in line with the study conducted by Saglam [48], which shows the power of digital tools to provide diagnostic information that makes it possible to adapt learning to learners’ individual needs.
Regarding the second research question, which relates to the competences that are strengthened by the implementation of FA in SLA in a DLE, a clear and relevant factor for the success of FA in the DLE was the active role of the learners and the lecturers. In this sense, it was crucial for the students to assume their role and to be actively involved, taking responsibility for the SLA, making decisions according to their progress or needs, and managing their learning [15,19,49,60,61]. The lecturers, on the other hand, had to continuously monitor the learners’ SLA and make important decisions in the design of the lessons to adapt them to their needs. The study carried out by Galora Moya and Salazar Tobar [36] agrees with this aspect, pointing out that success lies in the personalisation of learning. In this sense, the lecturers had to support and help the students to develop the core competences of a second language, such as reading, writing, and listening [47,48], as well as other competences such as digital competence, communication and collaboration skills, empathy, problem solving, creativity, and critical thinking, which are essential for their future profession as teachers [12,13,14]. These findings are consistent with previous studies, such as the research conducted by Xu et al. [51], whose results show how the adoption of formative assessment improves knowledge acquisition while enhancing learners’ ability to develop complex skills. In addition, the lecturers had to try to make the students more autonomous in their learning so that they became more self-directed learners [27,37,38,59], aware of their weaknesses and strengths and able to plan their learning pathways now and in the future. Along the same lines, the results of the study developed by Kabilan and Khan [38] show that the effective use of FA in the teaching–learning process contributes to self-reflection, critical thinking, and knowledge of learners’ strengths and weaknesses.
Concerning the third research question, which is related to the opportunities offered by the process of FA to reflect on SLA, it should be noted that an immediate and continuous flow of formative feedback was the key to the successful implementation of FA in a DLE, in which technology supported or facilitated this formative feedback [10,11,21,39]. Feedback showed the learners’ weaknesses and strengths in SLA, as well as suggestions to help them to improve their level, performance, and competences, and to achieve the learning goals. It also allowed the lecturers to know which activities and didactic materials were adequate and which needed to be changed [37,38]. Formative feedback provided an opportunity for the lecturers and the learners to engage in a continuous dialogue [40]. Similar findings were reported in the study developed by Burner [41], who concluded that the understanding between teachers and learners is the key to realising the potential of FA. It also motivated and engaged the learners to take an active role in SLA and competence development [47,48]. In this respect, the study by Bachelor and Bachelor [40] concluded that active learning helps learners make meaningful use of a second language.
The FA process not only allowed the lecturers to collect data on the learners’ performance and progress to plan the lessons, but also allowed the learners to reflect on the SLA, on what they were learning, on their needs, and on the gaps in their knowledge [37,38]. The critical reflection on their strengths and weaknesses also increased, on the one hand, their self-confidence in their progress in learning the second language, which had a relevant impact on the achievement of the learning outcomes, and, on the other hand, their autonomy in learning a second language as a lifelong process.
Finally, the fourth research question was concerned with the overall impact of feedback on SLA in a DLE. Based on the results, it can be concluded that feedback plays a relevant role in the learning of a second language in this digital environment. In this sense, the research conducted by Caruso et al. [62] agrees that feedback is one of the central elements for the correct learning of a second language. They believe that feedback based on interactive comments is necessary, as it can more effectively engage learners, leading them to make a real commitment to learning and to develop a real awareness of their second language learning. Other studies on feedback and SLA have found that feedback generates positive emotions in learners [63,64], who are grateful when they receive feedback on their tasks [63]. Therefore, the main findings about feedback on SLA in a DLE show that the inclusion of FA allows lecturers to gain a critical knowledge of their learners and to know where they are in terms of their learning, with the possibility of adjusting their efforts in one direction or another. The inclusion of FA in a DLE for SLA makes it a balanced and complete ecosystem on many levels. This idea is supported by Cosi et al. [65], whose research showed that FA through digital technology tools significantly improves not only academic performance, but also learner and teacher satisfaction and student engagement [66].

6. Conclusions

In this research, we explored the role of the FA for SLA in a DLE. Our findings have shown the relevance of incorporating the FA in the DLE implemented for learning English as a second language, as it has helped both lecturers and learners to become aware of the teaching–learning process. The main benefits identified in this study are related, on the one hand, to the knowledge that the FA provides to lecturers about their learners in terms of learning needs and learning outcomes, and, on the other hand, to the channel of communication that has been established between them, which has made it possible to personalise learning. It has also helped the learners to become more engaged and motivated in their second language learning, and to develop the core language skills and other transversal key competences such as digital competence, which are necessary for active participation in the 21st-century society.
Overall, the evidence provides a favourable perspective for the incorporation of the FA for SLA in a DLE. However, despite the satisfactory results of the FA process for SLA in a DLE, limitations were also identified. Regarding the research design, the sample of students belongs to only one university and one branch of knowledge, which limits the generalisability of the results. In this sense, the results and limitations found allow us to think about future lines of research. On the one hand, the sample could be extended to other universities, degrees, and branches of knowledge. On the other hand, regarding the research design, a quantitative study is suggested to complement the qualitative results already found.

Author Contributions

Conceptualisation, A.M.P.-L. and V.I.-Á.; Data curation, A.M.P.-L. and V.I.-Á.; Formal analysis, A.M.P.-L. and V.I.-Á.; Funding acquisition, A.M.P.-L. and V.I.-Á.; Investigation, A.M.P.-L. and V.I.-Á.; Methodology, A.M.P.-L. and V.I.-Á.; Project administration, A.M.P.-L. and V.I.-Á.; Resources, A.M.P.-L. and V.I.-Á.; Software, A.M.P.-L. and V.I.-Á.; Supervision, A.M.P.-L. and V.I.-Á.; Validation, A.M.P.-L. and V.I.-Á.; Visualisation, A.M.P.-L. and V.I.-Á.; Writing—original draft, A.M.P.-L. and V.I.-Á.; Writing—review and editing, A.M.P.-L. and V.I.-Á. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, the ethical guidelines of the European Educational Research Association (EERA), which promotes ethical educational research, and the Ethical Guidelines for Educational Research of the British Educational Research Association (BERA).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Abdulrahim, H.; Mabrouk, F. COVID-19 and the digital transformation of Saudi Higher Education. Asian J. Distance Educ. 2020, 15, 291–306.
  2. Nguyen, L.T.; Tuamsuk, K. Digital learning ecosystem at educational institutions: A content analysis of scholarly discourse. Cogent Educ. 2022, 9, 2111033.
  3. Area Moreira, M. La Enseñanza Universitaria Digital. Fundamentos Pedagógicos y Tendencias Actuales; Universidad de la Laguna: Santa Cruz de Tenerife, Spain, 2019.
  4. Ficheman, I.K.; de Deus Lopes, R. Digital learning ecosystems: Authoring, collaboration, immersion and mobility. In Proceedings of the 7th International Conference on Interaction Design and Children, Association for Computing Machinery, Chicago, IL, USA, 11–13 June 2008.
  5. Kummanee, J.; Nilsook, P.; Wannapiroon, P. Digital learning ecosystem involving STEAM gamification for a vocational innovator. Int. J. Inf. Educ. Technol. 2020, 10, 533–539.
  6. Nguyen, L.T.; Kanjug, I.; Lowatcharin, G.; Manakul, T.; Poonpon, K.; Sarakorn, W.; Somabut, A.; Srisawasdi, N.; Traiyarach, S.; Tuamsuk, K. Digital Learning Ecosystem for Classroom Teaching in Thailand High Schools. Sage Open 2023, 13, 21582440231158303.
  7. Laanpere, M.; Pata, K.; Normak, P.; Poldoja, H. Pedagogy-driven design of digital learning ecosystems. Comput. Sci. Inf. Syst. 2014, 11, 419–442.
  8. Reyna, J. Digital teaching and learning ecosystem (DTLE): A theoretical approach for online learning environments. In Proceedings of the Ascilite 2011: Changing Demands, Changing Directions, Hobart, TAS, Australia, 4–7 December 2011.
  9. Lawrence, C.R.; Lorraine, J.R.V. Emergent guiding principles for STEM education. In Innovative Learning Environments in STEM Higher Education. Opportunities, Challenges, and Looking Forward; Ryoo, J., Winkelmann, K., Eds.; Springer: New York, NY, USA, 2021; pp. 107–120.
  10. Espasa, A.; Mayordomo, R.M.; Guasch, T.; Martinez-Melo, M. Does the type of feedback channel used in online learning environments matter? Students’ perceptions and impact on learning. Act. Learn. High. Educ. 2022, 23, 49–63.
  11. Pentucci, M.; Sarra, A.; Laici, C. Feedback to align teacher and student in a Digital Learning Ecosystem. Educ. Sci. Soc. 2023, 14, 242–260.
  12. Besnoy, K.D.; Dantzler, J.A.; Siders, J.A. Creating a digital ecosystem for the gifted education classroom. J. Adv. Acad. 2012, 23, 305–325.
  13. Tuamsuk, K.; Nguyen, L.T.; Kanjug, I.; Lowatcharin, G.; Manakul, T.; Poonpon, K.; Sarakorn, W.; Somabut, A.; Srisawasdi, N.; Traiyarach, S. Key success factors for transforming classrooms into learning communities in digital learning ecosystem at secondary schools in Thailand. Contemp. Educ. Technol. 2023, 15, ep408.
  14. Yashalova, N.N.; Vasiltsov, V.S. Digital Education: New Challenges and Opportunities. Sci. Tech. Inf. Process. 2020, 47, 260–265.
  15. Pinto-Llorente, A.M. Assessing the Impact of a Digital Ecosystem to Learn English Pronunciation. In ICT-Based Assessment, Methods, and Programs in Tertiary Education; Yilan, S.M., Koruyan, K., Eds.; IGI Global: Hershey, PA, USA, 2020; pp. 23–44.
  16. Beggan, A. Digital learning ecosystems: Discussing the outcomes of a principle led investigation into an alternative VLE. J. High. Educ. Res. 2020, 3, 1–15.
  17. Izquierdo-Álvarez, V.; Pinto-Llorente, A.M. Opportunities and Challenges of E-Learning in Spanish Institutions of Higher Education. In Challenges and Opportunities for the Global Implementation of E-Learning Frameworks; Khan, B.H., Affouneh, S., Salha, S.H., Khlaif, Z.N., Eds.; IGI Global: Hershey, PA, USA, 2021; pp. 112–127.
  18. Jeladze, E.; Pata, K. Smart, Digitally Enhanced Learning Ecosystems: Bottlenecks to Sustainability in Georgia. Sustainability 2018, 10, 2672.
  19. Pinto-Llorente, A.M. A Digital Ecosystem for Teaching-Learning English in Higher Education. A Qualitative Case Study. In ICT-Based Assessment, Methods, and Programs in Tertiary Education; Yilan, S.M., Koruyan, K., Eds.; IGI Global: Hershey, PA, USA, 2020; pp. 257–276.
  20. Wolff, C.; Reimann, C.; Mikhaylova, E.; Aldaghamin, A.; Pampus, S.; Hermann, E. Digital Education Ecosystem (DEE) for a Virtual Master School. In Proceedings of the 2021 IEEE International Conference on Smart Information Systems and Technologies (SIST), Nur-Sultan, Kazakhstan, 28–30 April 2021.
  21. Black, P.; Wiliam, D. Developing the theory of formative assessment. Educ. Assess. Eval. Account. 2009, 21, 5–31.
  22. Martell, K.; Calderon, T. (Eds.) Assessment of student learning in business schools: What it is, where we are, and where we need to go next. In Assessment of Student Learning in Business Schools: Best Practices Each Step of the Way; Association for Institutional Research: Tallahassee, FL, USA, 2005; pp. 1–22.
  23. Organisation for Economic Co-Operation and Development (OECD). Formative Assessment: Improving Learning in Secondary Classrooms; OECD: Paris, France, 2005.
  24. Garrison, C.; Ehringhaus, M. Formative and Summative Assessments in the Classroom; Measured Progress: Dover, NH, USA, 2020.
  25. Liao, X.; Zhang, X.; Wang, Z.F.; Luo, H. Design and implementation of an AI-enabled visual report tool as formative assessment to promote learning achievement and self-regulated learning: An experimental study. Br. J. Educ. Technol. 2024, 55, 1253–1276.
  26. Sadler, D.R. Formative assessment and the design of instructional systems. Instr. Sci. 1989, 18, 119–144. [Google Scholar] [CrossRef]
  27. Tirado-Olivares, S.; López-Fernández, C.; González-Calero, J.A.; Cózar-Gutiérrez, R. Enhancing historical thinking through learning analytics in Primary Education: A bridge to formative assessment. Educ. Inf. Technol. 2024, 1–25. [Google Scholar] [CrossRef]
  28. Parmigiani, D.; Nicchia, E.; Murgia, E.; Ingersoll, M. Formative assessment in higher education: An exploratory study within programs for professionals in education. Front. Educ. 2024, 9, 1366215. [Google Scholar] [CrossRef]
  29. Slingerland, M.; Weeldenburg, G.; Borghouts, L. Formative assessment in physical education: Teachers’ experiences when designing and implementing formative assessment activities. Eur. Phys. Educ. Rev. 2024, 1356336X241237398. [Google Scholar] [CrossRef]
  30. Divjak, B.; Svetec, B.; Horvat, D. How can valid and reliable automatic formative assessment predict the acquisition of learning outcomes? J. Comput. Assist. Learn. 2024, 1–17. [Google Scholar] [CrossRef]
  31. Wiliam, D.; Leahy, S. Formatieve Assessment Integreren in de Praktijk; Bazalt Educatieve Uitgaven: The Hague, The Netherlands, 2018. [Google Scholar]
  32. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar] [CrossRef]
  33. Leahy, S.; Lyon, C.; Thompson, M.M.; Wiliam, D. Classroom assessment: Minute by minute, day by day. Educ. Leadersh. 2005, 63, 19–24. [Google Scholar]
  34. Vassiliou, S.; Papadima-Sophocleous, S.; Giannikas, C.N. Formative Assessment in Second Language Learning: A Systematic Review and an Annotated Bibliography; Research-Publishing.net: Doubs, France, 2022. [Google Scholar] [CrossRef]
  35. Perera-Diltz, D.M.; Moe, J.L. Formative and summative assessment in online education. J. Res. Innov. Teach. 2014, 7, 130–142. [Google Scholar]
  36. Galora Moya, N.P.; Salazar Tobar, M.C. Formative Evaluation and Formative Feedback: An Effective Practice to Promote Student Learning in Higher Education. Rev. Publicando 2017, 4, 321–333. [Google Scholar]
  37. Babaee, M.; Tikoduadua, M. E-portfolios: A new trend in formative writing assessment. Int. J. Mod. Educ. Forum 2013, 2, 49–56. [Google Scholar]
  38. Kabilan, M.K.; Khan, M.A. Assessing pre-service English language teachers’ learning using e-portfolios: Benefits, challenges and competencies gained. Comput. Educ. 2012, 58, 1007–1020. [Google Scholar] [CrossRef]
  39. Almalki, M.S.; Gruba, P. Conceptualizing Formative Blended Assessment (FBA) in Saudi EFL. In ICT-Based Assessment, Methods, and Programs in Tertiary Education; Yilan, S.M., Koruyan, K., Eds.; IGI Global: Hershey, PA, USA, 2020; pp. 65–82. [Google Scholar] [CrossRef]
  40. Bachelor, J.W.; Bachelor, R.B. Classroom currency as a means of formative feedback, reflection, and assessment in the world language classroom. NECTFL Rev. 2016, 78, 31–42. [Google Scholar]
  41. Burner, T. Formative assessment of writing in English as a foreign language. Scand. J. Educ. Res. 2016, 60, 626–648. [Google Scholar] [CrossRef]
  42. Baker, J.; Gossman, P. The learning impact of a virtual learning environment: Students’ views. Teach. Educ. Adv. Netw. J. 2013, 5, 19–38. [Google Scholar]
  43. Pinto-Llorente, A.M. Pre-Service Teachers’ Perceptions of the Effectiveness of a Virtual Learning Environment to Support a Learner-Centred Approach: A Qualitative Study. In Handbook of Research on Modern Educational Technologies, Applications, and Management; Khosrow-Pour, M., Ed.; IGI Global: Hershey, PA, USA, 2020; pp. 870–887. [Google Scholar] [CrossRef]
  44. Herrera, L. Impact of Implementing a Virtual Learning Environment (VLE) in the EFL Classroom. Íkala Rev. Leng. Cult. 2017, 22, 479–498. [Google Scholar] [CrossRef]
  45. Bayat, A.; Jamshidipour, A.; Hashemi, M. The beneficial impacts of applying formative assessment on Iranian university students’ anxiety reduction and listening efficacy. Int. J. Lang. Educ. Teach. 2017, 5, 1–11. [Google Scholar] [CrossRef]
  46. Ismail, S.M.; Rahul, D.R.; Patra, I.; Rezvani, E. Formative vs. summative assessment: Impacts on academic motivation, attitude toward learning, test anxiety, and self-regulation skill. Lang. Test. Asia 2022, 12, 40. [Google Scholar] [CrossRef]
  47. Pinto-Llorente, A.M.; Sánchez-Gómez, M.C.; García-Peñalvo, F.J.; Martín, S.C. The use of online quizzes for continuous assessment and self-assessment of second-language learners. In Proceedings of the Fourth International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, 2–4 November 2016. [Google Scholar] [CrossRef]
  48. Saglam, A.L.G. The integration of educational technology for classroom-based formative assessment to empower teaching and learning. In Handbook of Research on Mobile Devices and Smart Gadgets in K-12 Education; Khan, A., Umair, S., Eds.; IGI Global: Hershey, PA, USA, 2018; pp. 321–341. [Google Scholar] [CrossRef]
  49. Yeh, E.; Swinehart, N.A. Learner-Centered Approach to Technology Integration: Online Geographical Tools in the ESL Classroom. In Handbook of Research on Learner-Centered Pedagogy in Teacher Education and Professional Development; Keengwe, J., Onchwari, G., Eds.; IGI Global: Hershey, PA, USA, 2017; pp. 1–22. [Google Scholar] [CrossRef]
  50. Vishnyakova, O.D.; Markova, E.S.; Leonov, T.V. The Role of Prior Knowledge in Formative Assessment for Linguistic Competence Development. Prof. Discourse Commun. 2023, 5, 68–78. [Google Scholar] [CrossRef]
  51. Xu, X.L.; Shen, W.Q.; Islam, A.Y.M.; Zhou, Y. A whole learning process-oriented formative assessment framework to cultivate complex skills. Humanit. Soc. Sci. Commun. 2023, 10, 653. [Google Scholar] [CrossRef]
  52. Børte, K.; Lillejord, S.; Chan, J.; Wasson, B.; Greiff, S. Prerequisites for teachers’ technology use in formative assessment practices: A systematic review. Educ. Res. Rev. 2023, 41, 100568. [Google Scholar] [CrossRef]
  53. Hernández, R.; Fernández, C.; Baptista, P. Metodología de la Investigación, 6th ed.; McGraw Hill Education: Mexico City, Mexico, 2014. [Google Scholar]
  54. Creswell, J.; Guetterman, T.C. Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research, 6th ed.; Pearson Education: Boston, MA, USA, 2020. [Google Scholar]
  55. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Aldine Publishing Company: New York, NY, USA, 2006. [Google Scholar]
  56. Trinidad, A.; Carrero, V.; Soriano, R.M. Teoría Fundamentada "Grounded Theory": La Construcción de la Teoría a Través del Análisis Interpretacional; Centro de Investigaciones Sociológicas (CIS): Madrid, Spain, 2006. [Google Scholar]
  57. Coleman, H.; Unrau, Y.A. Qualitative Data Analysis. In Social Work: Research and Evaluation. Quantitative and Qualitative Approaches; Grinnell, R.M., Unrau, Y.A., Eds.; Oxford University Press: New York, NY, USA, 2005; pp. 403–420. [Google Scholar]
  58. Miles, M.B.; Huberman, A.M. Qualitative Data Analysis: An Expanded Sourcebook; SAGE: Newbury Park, CA, USA, 1994. [Google Scholar]
  59. Çelik, S.; Aytın, K. Teachers’ views on digital educational tools in English language learning: Benefits and challenges in the Turkish context. TESL-EJ Teach. Engl. Second Foreign Lang. 2014, 18, n2. [Google Scholar]
  60. Ryerse, M. The Student Role in Formative Assessment: A Practitioner’s Guide. 2019. Available online: https://www.gettingsmart.com/2018/01/the-student-role-in-formative-assessment-how-i-know-practitioner-guide/ (accessed on 18 April 2024).
  61. Wagner, S. Importance of Formative Assessment. 2015. Available online: https://www.naiku.net/blog/importance-of-formative-assessment/ (accessed on 18 April 2024).
  62. Caruso, M.; Fraschini, N.; Kuuse, S. Online Tools for Feedback Engagement in Second Language Learning. Int. J. Comput.-Assist. Lang. Learn. Teach. 2019, 1, 58–78. [Google Scholar] [CrossRef]
  63. Mäkipää, T. Upper secondary students’ perceptions of feedback literacy in second language learning in Finland—A qualitative case study. Teach. Teach. Educ. 2024, 143, 104554. [Google Scholar] [CrossRef]
  64. Mäkipää, T.; Hildén, R. What Kind of Feedback is Perceived as Encouraging by Finnish General Upper Secondary School Students? Educ. Sci. 2021, 11, 12. [Google Scholar] [CrossRef]
  65. Cosi, A.; Voltas, N.; Lázaro-Cantabrana, J.L.; Morales, P.; Calvo, M.; Molina, S.; Quiroga, M.Á. Formative assessment at university through digital technology tools. Profr. Rev. Currículum Form. Profr. 2020, 24, 164–183. [Google Scholar] [CrossRef]
  66. Baleni, Z.G. Online formative assessment in higher education: Its pros and cons. Electron. J. e-Learn. 2015, 13, 228–236. [Google Scholar]
Figure 1. Data analysis processes. Adapted from Miles and Huberman [58].
Figure 2. Word cloud.
Figure 3. Concept map.
Table 1. Results of categories that emerged from the content analysis.
Formative Assessment | A | B | C | D | E | F
Competences | 5.752 | 82 | 90% | 16.802 | 34% | 30%
Reflection L2 | 4.569 | 91 | 100% | 19.443 | 23% | 23%
Formative Feedback | 3.248 | 90 | 99% | 18.834 | 17% | 16%
Digital Learning Ecosystem | 5.256 | 79 | 87% | 16.322 | 32% | 27%
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
