Article

Microlearning for the Development of Teachers’ Digital Competence Related to Feedback and Decision Making

by Viviana Betancur-Chicué * and Ana García-Valcárcel Muñoz-Repiso *
Instituto Universitario de Ciencias de la Educación, Universidad de Salamanca, 37008 Salamanca, Spain
* Authors to whom correspondence should be addressed.
Educ. Sci. 2023, 13(7), 722; https://doi.org/10.3390/educsci13070722
Submission received: 8 June 2023 / Revised: 6 July 2023 / Accepted: 13 July 2023 / Published: 15 July 2023
(This article belongs to the Special Issue Application of New Technologies for Assessment in Higher Education)

Abstract

The assessment and feedback area of the European Framework for the Digital Competence of Educators (DigCompEdu) establishes a specific competence related to the ability to use digital technologies to provide feedback and make decisions for learning. According to the literature, this particular competence is one of the least developed in the teaching profession. As there are few specialised training strategies in the field of information and communication technology (ICT)-mediated feedback, this study aims to validate a microlearning proposal for university teachers, organised in levels of progression following the DigCompEdu guidelines. To validate the proposal, a literature analysis was carried out and a training proposal was developed and submitted to a peer review process to assess its relevance. This study identifies the elements that should be included in a training strategy in the area of feedback and decision making for university contexts. Finally, it is concluded that this type of training requires a combination of agile and self-managed strategies (characteristics of microlearning), which can be complemented by the presentation of evidence and collaborative work with colleagues.

1. Introduction

The European Framework for the Digital Competence of Educators (DigCompEdu) is considered a globally relevant reference for initial and continuing teacher training [1,2,3]. From this perspective, various studies on the diagnosis and assessment of teachers’ digital competence (TDC) highlight the need for training strategies that, in line with DigCompEdu, foster its progressive development. However, given the breadth of TDC, the area of assessment and feedback has repeatedly been found to be among the least developed. A specific case can be observed at La Salle University in Colombia (verified through an internal institutional diagnosis), particularly regarding the competence related to feedback, planning and decision making, which refers to the ability of teachers “To use digital technologies to provide targeted and timely feedback to learners. To adapt teaching strategies and to provide targeted support, based on the evidence generated by the digital technologies used. To enable learners and parents to understand the evidence provided by digital technologies and use it for decision-making” [4] (p. 66).
According to Espasa and Guasch [5], “feedback is a key element of the teaching-learning process, as it provides students with relevant information about what they have done well and what they can improve, and how they can do it” (p. 153); this is the paradigm underpinning the theoretical proposal of this article. In this sense, feedback involves three elements: providing feedback that helps progress, encouraging the student to read, understand, and do something with the information, and identifying changes based on the feedback to enrich the learning activity. However, there are reasons why students do not use or engage with feedback: “because they do not understand it, because it does not arrive at the right time, because it does not help them improve, among others” [5] (p. 129). The same issue has been identified in systematic literature reviews such as the one conducted by Haughney et al. [6], who reported that students’ involvement in feedback should be reinforced, for example, through their participation in the construction of assessment rubrics, as indicated by Cockett and Jackson [7]. Research on feedback remains limited, as concluded in the review by Morris et al. [8]. According to the review by Gros and Cano [9], the role of technology remains focused on the storage of information or the support of grading processes rather than on the enrichment of feedback. From the same perspective, the review by Paterson et al. [10] identified that students value multimodal and personalised feedback for its contribution to the consolidation of personalised learning paths. Finally, the review conducted by Banihashem et al. [11] provides guidance on using learning analytics for the development of feedback processes.
On the other hand, according to the review by Betancur and García [12], teacher training processes require strategies that combine flexibility, personalisation, and customisation [13]. Therefore, the proposed approach is based on microlearning, which Hug [14] describes as an instructional unit focused on a specific objective. Microlearning is characterised by its flexibility in the channels or media through which it is delivered and by its rapid production, which allows for easy updating [15]. These are key elements in the field of TDC given its ever-changing nature.
Thus, as feedback is an essential component of the learning process, the comparatively low level of development reported for the associated TDC [16,17,18,19,20] may be one of the reasons why the delivery of effective feedback to students falls short. However, the literature does not provide a clear view of the aspects on which a teacher training plan should be based to promote TDC in the field of feedback. Therefore, this article presents the validation process of a microlearning proposal for university teachers from various fields of knowledge affiliated with La Salle University, Colombia, organised in levels of progression according to the DigCompEdu guidelines and focused on information and communication technology (ICT)-mediated feedback.

2. Materials and Methods

This is a descriptive, qualitative study with an empirical design adapted to the process of formulating a training proposal.
First, a state-of-the-art review was conducted based on two questions that delimit the study to the specific field of feedback in the university context and the teacher training strategies reported in the literature on this topic. Research question 1 (RQ1) was “What is currently being researched in the field of feedback in higher education?”, and research question 2 (RQ2), “What elements are being considered in the development of teacher training strategies in the field of feedback in university contexts?”.
For the literature review, 88 articles were identified in the initial Web of Science (WOS) search on 22 September 2022. After an initial screening of the abstracts, 21 studies were excluded because they did not analyse feedback strategies for learning or did not provide explicit contributions. The remaining 67 articles were reviewed, yielding the following results for RQ1 (“What is currently being researched in the field of feedback in higher education?”). The data from the review are available at: https://doi.org/10.5281/zenodo.7972308 (accessed on 26 May 2023).
Subsequently, with the data obtained, a content analysis was conducted on the criteria and indicators established by DigCompEdu for the TDC related to feedback, planning and decision making. From this analysis, a preliminary proposal for a teacher training plan was developed, combining the information derived from the literature review with the content analysis of the European framework. This analysis consisted of identifying the requirements for each proficiency level from A2 to C2 and, drawing on the theoretical elements provided by the literature, formulating an initial proposal of microcourses with their respective subtopics to shape the specific training plan for this TDC.
Finally, the proposed training plan was subjected to a validation process by experts who fulfilled three profiles: three experts in the field of feedback, three experts in the field of digital competencies, and three teachers who could potentially benefit from the training plan. The validation instrument was designed by establishing a 3-point rating scale for each specific topic of each microcourse: highly relevant (3 points), moderately relevant (2 points), and less relevant (1 point), followed by a comments section to receive specific feedback from the experts on the topics when deemed necessary. Similarly, the instrument included a section to assess the overall relevance of the microcourse and an optional space for justification.
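As an illustration of how such ratings can be aggregated, the following minimal sketch (hypothetical scores and illustrative topic names, not the study’s actual dataset) computes the per-topic average on the 3-point scale given by the nine reviewers and flags topics that may need adjustment:

```python
# Sketch of the scoring logic used in the validation: each of the nine
# reviewers rates each microcourse topic on a 3-point relevance scale
# (3 = highly, 2 = moderately, 1 = less relevant), and topics are
# summarised by their average score. Data and threshold are illustrative.
from statistics import mean

ratings = {
    "Function of the audio format when providing feedback": [3, 2, 2, 3, 2, 2, 2, 3, 2],
    "Providing feedback through rubrics": [3, 3, 3, 3, 2, 3, 3, 3, 3],
}

THRESHOLD = 2.5  # illustrative cut-off for topics needing adjustment

for topic, scores in ratings.items():
    avg = mean(scores)
    status = "review" if avg < THRESHOLD else "keep"
    print(f"{topic}: {avg:.1f} ({status})")
```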
Figure 1 summarises the flow of the implemented methodology.
The collected data for the literature review and validation of the plan are available at: https://doi.org/10.5281/zenodo.7972308 (accessed on 26 May 2023).

3. Results

3.1. Literature Review: RQ1

The general findings of the literature review are summarised in Table 1, which presents: (a) the categories of analysis identified in the reviewed studies; (b) the main characteristics described by the authors for each category; and (c) the frequency of appearance of these descriptors in the analysed studies.
Regarding the field of learning feedback, Deneen and Munshi [21] identified three functions of feedback with digital technologies: storing student information, converting that information into feedback, and delivering it to the student in a clear manner while sustaining motivation. These three functions should contribute to the development of the feedback cycle described by Gros and Cano [9]: planning (establishing the goal and activity), performing (where the activity is carried out and support mechanisms are provided), and self-reflection (evaluating one’s performance). In line with these ideas, Ryan et al. [22] showed that feedback is more useful when it is personalised, detailed, and exemplified.
According to Gros and Cano [9], ICT contributes to feedback delivery by providing immediacy and greater precision in the information provided; it facilitates the creation of scaffolding and the reuse of information, and it favours personalisation and adaptive evaluation. Through their review of different studies, the authors also identified the importance of combining content quality with the closeness with which feedback is constructed and delivered, so as to generate a dialogue for successful feedback. Together, these elements are linked to feedback that is understandable to the student and that actually helps them improve, which, according to Fraile et al. [23], translates into a reduction of the gap between the learning objective and the student’s current level.
In this same perspective, the potential of ICT in feedback is associated with the fact that, as mentioned by Ryan et al. [22], unlike face-to-face dialogues, recordings, notes, or audio comments provide a permanent resource for the free use of the student, where the teacher can also incorporate key motivational aspects such as tone, pace, and even humour. Overall, the study published by Ryan et al. [22] identified that feedback is more effective when a combination of formats is used.

3.2. Literature Review: RQ2

Regarding RQ2, “What elements are being considered in the development of teacher training strategies in the field of feedback in university contexts?”, the literature review did not report studies focusing specifically on the feedback component. However, it does confirm the need to promote teacher training strategies for the development of TDC based on key elements, some of which are mentioned below.
First, the literature review highlighted the need to consider, in the construction of a professional development proposal, the models of training and technology integration in education identified by Revuelta et al. [24], such as the Technological Pedagogical Content Knowledge (TPACK) framework, the Substitution Augmentation Modification Redefinition (SAMR) model, the Unified Theory of Acceptance and Use of Technology (UTAUT) model, the Krumsvik model, the Kolb model, and the Norwegian model of Professional Digital Competence.
Considering these models in the construction of a teacher training proposal is essential in order to avoid a merely instrumental integration of technology in educational practice and to adopt, instead, an innovative and critical approach toward the opportunities they generate.
Systematic review studies, such as the one conducted by García-Ruiz et al. [1], highlight the need for continuous and specialised training in the area of digital competence for feedback, with an emphasis on initial teacher training focused on resource management and the didactic use of ICT.
Other training experiences in developing TDC, such as the case of Reisoğlu and Çebi [25], highlight the need to offer training that includes knowledge and practices related to digital resources, teaching, learning, and assessment. They also emphasise the importance of designing courses that serve as models and offer opportunities for real practice and peer collaboration.
The review conducted by Betancur and García [2] highlights the importance of providing agile, flexible, and customised teacher training in TDC, tailored to the reality of each educational context and including a practical component that facilitates the construction of products for real teaching. In this context, microlearning has the quality of delivering relevant information with the help of visual and interactive elements in a very concise format, which helps to maintain student motivation. Studies such as that of Diaz et al. [26] support the idea that although microlearning has been proposed in the literature as a solution independent of formal educational contexts, it should also be considered a good mechanism for reinforcing traditional learning management systems. Additionally, microlearning has the characteristic of facilitating micro-credentials that involve a more significant effort than completing an activity and obtaining a “stamp”, as pointed out by Zhang and West [27].
In this field, studies reporting a positive impact of agile training strategies on the development of TDC have been identified. For example, De la Roca et al. [28] developed a MicroMasters program based on four Massive Open Online Courses (MOOCs), Cabero and Romero [13] designed a t-MOOC for the development of all areas of TDC, and Basantes et al. [29] created a NanoMOOC. In these cases, the tension between the need to promote teacher training and the scarcity of time experienced by teachers is recognised, which Torgerson [30] considers the main driver of microlearning. It is not only about being brief, but also about being efficient, which is further enhanced by the current use of mobile devices and the possibility offered by microlearning to connect with knowledge networks. An essential element identified in the literature on microlearning is its ability to reduce cognitive load [31,32], thanks to the informative or instructional fragments it is composed of [33]. According to Vilchis [31], this dynamic and versatile approach facilitates knowledge retention, as the material is available at any time and in multiple formats, and it also enhances motivation and eases time management.

3.3. Content Analysis and Creation of a Training Proposal

Based on this literature review, the criteria and descriptors established by DigCompEdu for the competence related to the use of technology to provide feedback were analysed. This analysis allowed the development of an initial proposal of microcourses and subtopics to promote a specific training plan for this TDC.
A teacher training strategy based on microlearning was adopted; Hug [14] understands microlearning as a didactic unit providing a brief activity designed to achieve a specific objective associated with a change in performance. In summary, microlearning refers to all types of short-duration learning activities with microcontent [34]. There are also other views on microlearning, such as the one proposed by Torgerson [30], who defines it as participation in learning activities that usually last between a few seconds and 20 min, combining multiple contents and strategies with highly innovative potential.
In this sense, the literature review by Betancur and García [12] is taken into consideration, as it highlighted the need to consolidate training offers that combine flexibility, personalisation, and customisation [13], as well as synchronisation with the educational reality and its practical characteristics [35], while fostering the construction of products applied to real teaching [25,36]. These characteristics are met by microlearning, which has been a strategy oriented towards adult education and the development of professional competencies [37,38,39,40]. Microlearning is also characterised by its flexibility in the channels or media through which it is delivered and by the multiple areas of knowledge in which it has been used [38,39,41,42]. An essential element is that it allows for quick production and easy updates [15], which are crucial in the field of TDC due to its ever-changing nature.
Appendix A presents a proposal for microcourses based on the criteria, descriptors, and progression levels established by DigCompEdu. The results of the content analysis are presented for each level, combining the contributions from the literature reviewed and the level-specific needs identified. From this analysis, the microcourses that a teacher should develop together with the subtopics addressed to achieve the TDC are established.

3.4. Validation of the Training Proposal

The quantitative results of the validation showed a high rating for the topics of the microcourses. The Pareto diagram in Figure 2 presents the score ranges obtained; given that the maximum rating for each microcourse topic was 3, most microcourses received ratings in the highest range, between 2.8 and 3.0. Overall, the validation was positive.
The topics that received the lowest ratings, in the range of 2.1 to 2.4, were identified (Table 2) in order to determine the adjustments needed to make them clear and relevant.
Based on the validation and feedback from the experts, the following adjustments were made to each of the topics listed, emphasising that the concept of feedback as a component of TDC should be addressed in a balanced manner between technological aids that facilitate its delivery and the quality and pedagogical relevance it must have for an effective teaching-learning process:
“Function of the audio format when providing feedback”: The proposal is adjusted by integrating into a single item both the function and the relevant cases for using this format.
“Feedback from content curation”: The observations showed that creating curated resources exceeds the time teachers actually have for assessment, so the topic was adjusted towards the construction of comment banks for feedback.
“Providing scaffolding in feedback”: The types of scaffolding to be addressed in the microcourse are specified to define their scope. The topics are also strengthened to emphasise the importance of considering the analysis of feedback from a dialogical rather than a purely instrumental perspective.
“Configuring conditions within a Learning Management System (LMS)”: A more detailed description of the scope of automatic feedback through a learning platform is included.
“Virtual tutors” and “Tools for configuring virtual tutors”: A conceptual clarification is made to explain that the topic of chatbots as feedback tools will be addressed.
“Tools for reviewing exercises with artificial intelligence (AI)”: It is adjusted to “AI tools that contribute to feedback” and “ChatGPT and its use in feedback practices”.
“Design and delivery of feedback”: Observations related to level C2 led to delimiting and improving the scope of the microcourse towards “How to identify the effect of our feedback? Media and instruments” and “Analysis of feedback effects and decision-making”.
Finally, the validation process led to a training proposal that could be explored in a university teacher training scenario. It is recommended, as indicated in the literature, to make the respective adaptations before its design, development, and implementation. Similarly, having a clear thematic proposal facilitates the design of diagnostic instruments that can be implemented before the development of the training. Therefore, by embracing the flexibility of microlearning strategies, customised training paths can be offered based on the results or profiles of teachers generated by these diagnostics.

4. Discussion

The proposed training plan is based on the results of a literature review on the relationship between feedback and the use of technology. Therefore, each of the proposed microcourse blocks will be analysed along with the studies that contribute to their conceptual foundation. The microcourse proposal aims to contribute in a general way to the development of TDC in the field of feedback so that each teacher can adapt the relevant elements to their practice. To this end, each microcourse will have a learning sequence based on pre-assessment questions, explanatory and exemplification videos, practice questions, a final assessment, and a transfer survey (which investigates what will be applied in the short, medium, and long term).
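To illustrate this structure, the following minimal sketch represents the microcourse learning sequence described above; the class, field names, and example content are our own illustrative assumptions, not the authors’ implementation:

```python
# Minimal sketch of the microcourse learning sequence (pre-assessment,
# explanatory/exemplification videos, practice, final assessment, transfer
# survey). All names and example content are illustrative.
from dataclasses import dataclass, field

@dataclass
class Microcourse:
    title: str
    level: str                                                  # DigCompEdu progression level (A2-C2)
    pre_assessment: list[str] = field(default_factory=list)     # diagnostic questions
    videos: list[str] = field(default_factory=list)             # explanatory and exemplification videos
    practice: list[str] = field(default_factory=list)           # practice questions
    final_assessment: list[str] = field(default_factory=list)
    transfer_survey: list[str] = field(default_factory=list)    # short/medium/long-term application

m1 = Microcourse(
    title="What is feedback?",
    level="A2",
    pre_assessment=["How do you currently return comments to students?"],
    videos=["Foundations of feedback", "The feedback cycle"],
    practice=["Match each comment to a level of personalisation"],
    final_assessment=["Design a short piece of feedback for a sample task"],
    transfer_survey=["Which element will you apply in your next course?"],
)
print(m1.title, "-", len(m1.videos), "videos")
```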
The first microcourse block, aimed at achieving level A2 of TDC, is called “How to select technological tools for providing learning feedback?”. Its objective is to “understand how digital technologies can help me provide feedback to students or adapt my teaching strategies” [4] (p. 66). The microcourses included in this block are: (1) What is feedback?; (2) What tools can help me provide feedback?; and (3) Providing feedback through rubrics. These topics are supported by Espasa and Guasch [5,43,44], as well as by Deneen and Munshi [21], Cockett and Jackson [7], and Gros and Cano [9], who focused their analyses on the characteristics of feedback and its dialogical nature.
The second microcourse block, aimed at achieving level B1 of TDC, is called “How to provide feedback at an initial dialogical level?”. Its objective is for teachers to “use digital technology to grade and provide comments on electronically submitted tasks” [4] (p. 66). The microcourses included in this block are: (4) Providing feedback through objective tests; (5) Providing feedback through audio; and (6) Providing feedback through video and screencast. These topics are supported by studies such as those of Sangrà et al. [45], Bulut et al. [46], and Espasa et al. [44], who identified the significant contribution of ICT in providing feedback on questionnaire responses. Ryan et al. [22], Gros and Cano [9], and Espasa et al. [43] also support the value of providing feedback in multiple formats according to the needs. This block initiates a reflection on the term “dialogical feedback”, identified by Espasa et al. [44]. Dialogical feedback refers to a cycle that involves the delivery, understanding or processing, and implementation of feedback, that is, the possibility of taking action based on it. This cycle involves questioning the student and asking for critical explanations and clarifications [47].
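As a sketch of what designing feedback for objective tests can look like in practice (a generic data structure, not tied to any particular LMS; question texts and comments are illustrative), each answer option can carry its own targeted comment:

```python
# Generic sketch of an objective-test item with option-specific feedback,
# independent of any particular LMS. Names and texts are illustrative.
from dataclasses import dataclass

@dataclass
class Option:
    text: str
    correct: bool
    feedback: str  # targeted comment shown when this option is chosen

@dataclass
class MultipleChoiceItem:
    stem: str
    options: list[Option]

    def feedback_for(self, choice_index: int) -> str:
        return self.options[choice_index].feedback

item = MultipleChoiceItem(
    stem="Which element closes the dialogical feedback cycle?",
    options=[
        Option("Delivering the comment", False,
               "Delivery is only the first step; the cycle closes when the student acts on it."),
        Option("Acting on the feedback in a revised task", True,
               "Correct: implementation closes the delivery-understanding-action cycle."),
    ],
)
print(item.feedback_for(0))
```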
The third microcourse block, aimed at achieving level B2 of TDC, is called “How to provide feedback at an advanced dialogical level?”. Its objective is for teachers to “adapt their teaching and assessment practices based on the data generated by the digital technologies they use, and use this data to provide personalised feedback and offer differentiated support to students” [4] (p. 66). The microcourses included in this block are: (7) Providing feedback through enriched digital content; (8) Learning data analysis; and (9) Student literacy in feedback and scaffolding. This block is supported by studies such as the one by Espasa et al. [44], which identified that dialogical feedback should personalise comments. The authors also suggested creating information banks with digital tools so that feedback can be reused and adapted [22]. Similarly, the theme of student feedback literacy is fundamental and is supported by the studies of Quezada et al. [48], Carless and Boud [49], and Espasa and Guasch [50], which highlight the importance of student involvement. Wong and Lam [51] also affirmed the need to create synchronous spaces for feedback. In the field of educational scaffolding, Espasa and Meneses [52] considered the potential of semantic feedback, while Alemdag and Yildirim [53] analysed the impact of scaffolding on peer feedback. In this same block, the studies of Sedrakyan et al. [54], Tempelaar et al. [55], and Banihashem et al. [11] provide a theoretical basis supporting the importance of creating learning metric dashboards to guide feedback.
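A minimal sketch of the kind of learning-data aggregation behind such dashboards follows; the gradebook records, threshold, and messages are hypothetical and no real analytics API is assumed:

```python
# Sketch of a simple learning-metrics summary that could guide personalised
# feedback, in the spirit of a learning analytics dashboard.
# Records and the mastery threshold are hypothetical.
from collections import defaultdict

records = [  # (student, activity, score out of 100)
    ("ana", "quiz1", 45), ("ana", "quiz2", 60),
    ("luis", "quiz1", 90), ("luis", "quiz2", 85),
]

MASTERY = 70
by_student = defaultdict(list)
for student, _activity, score in records:
    by_student[student].append(score)

for student, scores in by_student.items():
    avg = sum(scores) / len(scores)
    if avg < MASTERY:
        print(f"{student}: average {avg:.0f} -> schedule targeted feedback and scaffolding")
    else:
        print(f"{student}: average {avg:.0f} -> brief confirming feedback")
```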
The fourth microcourse block, aimed at achieving level C1 of TDC, is called “How to provide feedback from conditional systems?”. Its objective is for teachers to “use digital technologies to personalise feedback and support, enabling them to identify areas for improvement and collaboratively develop learning plans to address these areas based on available evidence. It also aims to use the data generated by digital technologies to reflect on which teaching strategies work well for each type of student and adapt teaching strategies accordingly” [4] (p. 66). The microcourses included in this block are: (10) Configuring conditionals and learning paths in an LMS; and (11) Gamifying an LMS as a feedback strategy. A guiding study in this field is the research by Floratos et al. [56], which established a set of requirements that feedback should consider. The proposal associates gamification with assessment tasks that can engage students in productive learning. In a similar vein, the studies by Montenegro et al. [57] and Tang et al. [58] supported strategies for adopting gamification as a key component in delivering formative feedback. This element aligns with the creation of adaptive learning paths through learning platforms that contribute to enriching feedback, as demonstrated in the study by Lopez et al. [59].
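A generic sketch of the score-based conditions behind such adaptive paths is shown below; thresholds, messages, and resource names are illustrative, and the actual configuration depends on the specific LMS:

```python
# Generic sketch of score-based conditions for automated feedback and
# adaptive learning paths, independent of any specific LMS configuration.
def next_step(score: float) -> tuple[str, str]:
    """Return (automated feedback, next resource) for a quiz score in [0, 100]."""
    if score < 50:
        return ("Review the worked examples before retrying.", "remedial_module")
    if score < 80:
        return ("Good progress; practise the items you missed.", "practice_set")
    return ("Excellent; move on to the challenge activity.", "challenge_activity")

for s in (35, 65, 92):
    feedback, resource = next_step(s)
    print(f"score={s}: {feedback} -> unlock {resource}")
```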
Finally, the fifth microcourse block, aimed at achieving level C2 of TDC, is called “Providing Feedback from an Artificial Intelligence Perspective”. Its objective is for teachers to “design new systems for offering feedback and reflect, debate, redesign, and innovate in their strategies. Overall, the use of digital data for evaluation and improvement is sought” [4] (p. 66). The microcourses included in this block are: (12) AI tools for providing feedback; and (13) Investigating our feedback. These topics are supported by elements from the literature review, such as the studies by del Puerto and Gutiérrez [60] and Hooda et al. [61], which analysed the potential of AI in educational practice, identifying its contribution to assessment and feedback processes. This research is associated with the investigation of the use of chatbots and their potential for generating feedback, as discussed by Vijayakumar et al. [62]. Similarly, studies like the one that identified that students are not satisfied with the feedback they receive because they do not find it useful or perhaps do not understand it [50] highlight the importance of investigating the effectiveness of feedback.
In summary, the training program proposed in the five blocks of microcourses is guided by a perspective that, while focusing on the knowledge of digital tools that can contribute to ongoing teacher feedback, also emphasises the importance of dialogue in the feedback process in teaching and learning. It recognises that it is not enough to provide effective feedback, but that adjustments need to be made in planning so that students have time to make the most of it. The program encourages reflection on the dialogical role of feedback and emphasises the need for teachers to create opportunities for students to benefit from it.

5. Conclusions

It is concluded that the proposed teacher training plan has disciplinary endorsement based on feedback from academic peers and a literature review. However, since the field of TDC, and learning feedback specifically, is an area that is constantly changing and being updated, the plan will require a new application of the methodology described in this article in the very near future to keep it aligned with the needs of teacher training. Similarly, it is emphasised that the impact of providing training primarily through microlearning strategies will need to be evaluated.
The limitations of the study lie in the fact that this type of teacher training proposal requires continuous updating and adaptation by each institution to the characteristics and needs of its teaching staff. Similarly, the field of technologies that can contribute to the development of feedback is experiencing unstoppable advances, especially with the rise of artificial intelligence, which is another factor to be considered in the short term.
Based on this training plan, it is planned to carry out the design, production, implementation, and evaluation phases of the microcourses, with the aim of validating the relevance and impact that a training strategy focused on a specific TDC can have. The results and experience obtained will serve as a foundation for further work in designing agile training strategies for other TDCs within the DigCompEdu framework.

Author Contributions

Conceptualization, V.B.-C.; methodology, V.B.-C. and A.G.-V.M.-R.; software, V.B.-C.; validation, V.B.-C. and A.G.-V.M.-R.; formal analysis, V.B.-C. and A.G.-V.M.-R.; investigation, V.B.-C. and A.G.-V.M.-R.; resources, V.B.-C. and A.G.-V.M.-R.; data curation, V.B.-C.; writing—original draft preparation, V.B.-C. and A.G.-V.M.-R.; writing—review and editing, V.B.-C. and A.G.-V.M.-R.; visualization, V.B.-C. and A.G.-V.M.-R.; supervision, V.B.-C. and A.G.-V.M.-R.; project administration, V.B.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data supporting this study are openly available at https://doi.org/10.5281/zenodo.7972308 (accessed on 26 May 2023).

Acknowledgments

PhD Programme in Education in the Knowledge Society, Instituto Universitario de Ciencias de la Educación (IUCE), Universidad de Salamanca (USAL).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Proposed microcourses for teacher digital competence training related to feedback.

A2 (Explorer): Use of digital technologies to configure feedback.
Indicator: I use digital technologies to obtain an overview of students’ progress, which I use as a basis for offering suggestions and advice.
Block 1: How to select technological tools for providing learning feedback?
M1. What is feedback? Themes: foundations of feedback; feedback cycle; levels of feedback customisation; elements in feedback; feedback formats.
M2. What tools can help me provide feedback? Themes: tools associated with LMS; tools associated with Microsoft; tools associated with Google.
M3. Providing feedback using rubrics and checklists. Themes: design and configuration of holistic rubrics for providing feedback; design and configuration of analytical rubrics for providing feedback.

B1 (Integrator): Use of digital technologies to provide feedback.
Indicator: I use digital technology to grade and provide feedback on electronically submitted assignments. I assist students and/or parents in accessing information about student performance using digital technologies.
Block 2: How to provide feedback at an introductory dialogic level?
M4. Providing feedback from objective assessments. Themes: designing feedback for multiple-choice questions; designing feedback for matching questions; designing feedback for true/false questions; designing feedback for fill-in-the-blank questions.
M5. Providing feedback from audio. Themes: role of the audio format in feedback; when to use audio for feedback; considerations in designing audio feedback; agile tools for providing audio feedback.
M6. Providing feedback from video and screencasts. Themes: role of the video format in feedback; when to use video or screencast for feedback; considerations in designing video or screencast feedback; agile tools for providing video or screencast feedback.

B2 (Expert): Use of digital data to enhance the effectiveness of feedback and support.
Indicator: I adapt my teaching and assessment practices based on the data generated by the digital technologies I use. I use these data to provide personalised feedback and offer differentiated support to students.
Block 3: How to provide feedback at an advanced dialogic level?
M7. Providing feedback from enriched digital content. Themes: feedback from content curation; tools for content curation; creating feedback banks.
M8. Learning data analysis. Themes: available metrics in Moodle that contribute to feedback; learning analytics reports; digital tools that provide learning metrics.
M9. Literacy in feedback for students and providing scaffolding. Themes: validating the clarity of feedback; scaffolding in feedback; how to engage students to make the most of feedback.

C1 (Leader): Use of digital technologies to personalise feedback and support.
Indicator: I help students identify areas for improvement and collaboratively develop learning plans to address these areas based on the available evidence.
Block 4: How to provide feedback using conditional systems?
M10. Configuring conditionals and learning paths in an LMS. Themes: planning learning paths based on results (automated feedback); configuring conditions within an LMS.
M11. Gamifying an LMS as a feedback strategy. Themes: planning the gamified path; Level Up, Stash, and Game extensions; use of badges.

C2 (Pioneer): Use of digital data to evaluate and improve teaching; designing new systems for providing feedback.
Indicator: I reflect, debate, redesign, and innovate teaching strategies based on the digital evidence I find regarding students’ preferences and needs.
Block 5: Providing feedback from an artificial intelligence (AI) perspective.
M12. AI tools for feedback. Themes: chatbots as feedback tools; AI tools that contribute to feedback; ChatGPT and its use in feedback practices.
M13. Investigating our feedback. Themes: why study the effect of our feedback?; how to identify the effect of our feedback? (media and instruments); analysing the effects of feedback and decision making.

References

1. García-Ruiz, R.; Buenestado-Fernández, M.; Ramírez-Montoya, M.S. Evaluación de la Competencia Digital Docente: Instrumentos, resultados y propuestas. Revisión sistemática de la literatura. Educ. XX1 2023, 26, 273–301.
2. Betancur, V.; García, A. Necesidades de formación y referentes de evaluación en torno a la competencia digital docente: Revisión sistemática. Fonseca J. Commun. 2022, 25, 133–147.
3. Cabero-Almenara, J.; Romero-Tena, R.; Palacios-Rodríguez, A. Evaluation of Teacher Digital Competence Frameworks Through Expert Judgement: The Use of the Expert Competence Coefficient. J. New Approaches Educ. Res. 2020, 9, 275–293.
4. Punie, Y.; Redecker, C. (Eds.) European Framework for the Digital Competence of Educators: DigCompEdu, EUR 28775 EN; Publications Office of the European Union: Luxembourg, 2017; ISBN 978-92-79-73718-3.
5. Espasa, A.; Guasch, T. Menos es más: Menos correcciones y más feedback para aprender. In Decálogo para la Mejora de la Docencia Online; Sangrà Morer, A., Badia Garganté, T., Cabrera Lanzo, N., Espasa Roca, A., Fernández Ferrer, M., Guàrdia Ortiz, L., Guasch Pascual, T., Guitert Catasús, M., Maina, M.F., Raffaghelli, J.E., et al., Eds.; Editorial UOC: Barcelona, Spain, 2020; Available online: http://openaccess.uoc.edu/webapps/o2/handle/10609/122307 (accessed on 7 June 2023).
6. Haughney, K.; Wakeman, S.; Hart, L. Quality of Feedback in Higher Education: A Review of Literature. Educ. Sci. 2020, 10, 60.
7. Cockett, A.; Jackson, C. The use of assessment rubrics to enhance feedback in higher education: An integrative literature review. Nurse Educ. Today 2018, 69, 8–13.
8. Morris, R.; Perry, T.; Wardle, L. Formative assessment and feedback for learning in higher education: A systematic review. Rev. Educ. 2021, 9, e3292.
9. Gros Salvat, B.; Cano García, E. Self-Regulated Feedback Processes Enhanced by Technology in Higher Education: A Systematic Review. RIED-Rev. Iberoam. Educ. Distancia 2021, 24, 107–125.
10. Paterson, C.; Paterson, N.; Jackson, W.; Work, F. What are students’ needs and preferences for academic feedback in higher education? A systematic review. Nurse Educ. Today 2020, 85, 104236.
11. Banihashem, S.K.; Noroozi, O.; van Ginkel, S.; Macfadyen, L.P.; Biemans, H.J.A. A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educ. Res. Rev. 2022, 37, 100489.
12. Betancur, V.; García, A. Características del diseño de estrategias de microaprendizaje en escenarios educativos: Revisión sistemática. RIED-Rev. Iberoam. Educ. Distancia 2023, 26, 201–222.
13. Cabero Almenara, J.; Romero-Tena, R. Diseño de un t-MOOC para la formación en competencias digitales docentes: Estudio en desarrollo (Proyecto DIPROMOOC). Innoeduca. Int. J. Technol. Educ. Innov. 2020, 6, 4–13.
14. Hug, T. Sound pedagogy practices for designing and implementing microlearning objects. In Microlearning in the Digital Age: The Design and Delivery of Learning in Snippets; Corbeil, J.R., Khan, B.H., Corbeil, M.E., Eds.; Routledge: New York, NY, USA, 2021.
15. Allela, M.A.; Ogange, B.O.; Junaid, M.I.; Charles, P.B. Effectiveness of Multimodal Microlearning for In-Service Teacher Training. J. Learn. Dev. 2020, 7, 384–398.
16. Figueira, L.F. Digital Competence: DigCompEdu Check-in as a Digital Literacy Diagnostic Tool to Support Teacher Training. Educ. Formação 2022, 7.
17. Dias-Trindade, S.; Moreira, J.A.; Ferreira, A.G. Assessment of University Teachers on Their Digital Competences. Qwerty-Open Interdiscip. J. Technol. Cult. Educ. 2020, 15, 50–69.
18. Basilotta-Gómez-Pablos, V.; Matarranz, M.; Casado-Aranda, L.-A.; Otto, A. Teachers’ Digital Competencies in Higher Education: A Systematic Literature Review. Int. J. Educ. Technol. High. Educ. 2022, 19, 8.
19. Lucas, M.; Dorotea, N.; Piedade, J. Developing Teachers’ Digital Competence: Results From a Pilot in Portugal. IEEE Rev. Iberoam. Tecnol. Aprendiz. 2021, 16, 84–92.
20. Santo, E.d.E.; Dias-Trindade, S.; dos Reis, R.S. Self-Assessment of Digital Competence for Educators: A Brazilian Study with University Professors. Res. Soc. Dev. 2022, 11, e26311930725.
21. Deneen, C.; Munshi, C. Technology-Enabled Feedback: It’s Time for a Critical Review of Research and Practice. In Proceedings of the 35th International Conference of Innovation: Open Oceans: Learning without Borders (ASCILITE), Geelong, Australia, 25–28 November 2018; pp. 113–120.
22. Ryan, T.; Henderson, M.; Phillips, M. Feedback Modes Matter: Comparing Student Perceptions of Digital and Non-Digital Feedback Modes in Higher Education. Br. J. Educ. Technol. 2019, 50, 1507–1523.
23. Fraile, J.; Ruiz-Bravo, P.; Zamorano-Sande, D.; Orgaz-Rincón, D. Evaluación formativa, autorregulación, feedback y herramientas digitales: Uso de Socrative en educación superior (Formative assessment, self-regulation, feedback and digital tools: Use of Socrative in higher education). Retos 2021, 42, 724–734.
24. Revuelta-Domínguez, F.-I.; Guerra-Antequera, J.; González-Pérez, A.; Pedrera-Rodríguez, M.-I.; González-Fernández, A. Digital Teaching Competence: A Systematic Review. Sustainability 2022, 14, 6428.
25. Reisoğlu, İ.; Çebi, A. How Can the Digital Competences of Pre-Service Teachers Be Developed? Examining a Case Study through the Lens of DigComp and DigCompEdu. Comput. Educ. 2020, 156, 103940.
26. Diaz Redondo, R.P.; Caeiro Rodriguez, M.; Lopez Escobar, J.J.; Fernandez Vilas, A. Integrating Micro-Learning Content in Traditional e-Learning Platforms. Multimed. Tools Appl. 2020, 80, 3121–3151.
27. Zhang, J.; West, R.E. Designing Microlearning Instruction for Professional Development Through a Competency Based Approach. TechTrends 2019, 64, 310–318.
28. De la Roca, M.; Morales, M.; Teixeira, A.; Hernández, R.; Amado-Salvatierra, H. The Experience of Designing and Developing an EdX’s MicroMasters Program to Develop or Reinforce the Digital Competence on Teachers. In Proceedings of the 2018 Learning with MOOCS (LWMOOCS), Madrid, Spain, 26–28 September 2018; pp. 34–38.
29. Basantes-Andrade, A.; Cabezas-González, M.; Casillas-Martín, S. Los nano-MOOC como herramienta de formación en competencia digital docente. Rev. Ibérica Sist. E Tecnol. Informação 2020, E32, 202–214.
30. Torgerson, C. What is microlearning? Origin, definitions, and applications. In Microlearning in the Digital Age: The Design and Delivery of Learning in Snippets; Corbeil, J.R., Khan, B.H., Corbeil, M.E., Eds.; Routledge: New York, NY, USA, 2021.
31. Vilchis, N. Microaprendizaje: Lecciones Breves que Enriquecen el Aula; Observatorio/Instituto para el Futuro de la Educación: Monterrey, Mexico, 2023; Available online: https://observatorio.tec.mx/edu-news/microaprendizaje-en-el-aula/ (accessed on 22 May 2023).
32. Tufan, D. Multimedia design principles for microlearning. In Microlearning in the Digital Age: The Design and Delivery of Learning in Snippets; Corbeil, J.R., Khan, B.H., Corbeil, M.E., Eds.; Routledge: New York, NY, USA, 2021.
33. Corbeil, J.R.; Khan, B.H.; Corbeil, M.E. Microlearning in the Digital Age: The Design and Delivery of Learning in Snippets; Routledge: New York, NY, USA, 2021.
34. Hug, T. Microlearning. In Encyclopedia of the Sciences of Learning; Seel, N.M., Ed.; Springer: Boston, MA, USA, 2012; pp. 2268–2271.
35. Colás-Bravo, P.; Conde-Jiménez, J.; Reyes-de-Cózar, S. The development of the digital teaching competence from a sociocultural approach. Comunicar 2019, 27, 21–32.
36. Lucas, M.; Bem-Haja, P.; Siddiq, F.; Moreira, A.; Redecker, C. The Relation between In-Service Teachers’ Digital Competence and Personal and Contextual Factors: What Matters Most? Comput. Educ. 2021, 160, 104052.
37. Tennyson, C.D.; Smallheer, B.A.; De Gagne, J.C. Microlearning Strategies in Nurse Practitioner Education. Nurse Educ. 2022, 47, 2–3.
38. Heydari, S.; Adibi, P.; Omid, A.; Yamani, N. Preferences of the Medical Faculty Members for Electronic Faculty Development Programs (e-FDP): A Qualitative Study. Adv. Med. Educ. Pract. 2019, 10, 515–526.
39. Prior Filipe, H.; Paton, M.; Tipping, J.; Schneeweiss, S.; Mack, H.G. Microlearning to Improve CPD Learning Objectives. Clin. Teach. 2020, 17, 695–699.
40. Govender, K.K.; Madden, M. The Effectiveness of Micro-Learning in Retail Banking. S. Afr. J. High. Educ. 2020, 34, 74–94.
41. Emerson, L.C.; Berge, Z.L. Microlearning: Knowledge Management Applications and Competency-Based Training in the Workplace. Knowl. Manag. E-Learn. Int. J. 2018, 10, 125–132.
42. Lee, Y.-M.; Jahnke, I.; Austin, L. Mobile Microlearning Design and Effects on Learning Efficacy and Learner Experience. Educ. Technol. Res. Dev. 2021, 69, 885–915.
43. Espasa Roca, A.; Mayordomo Saiz, R.M.; Guasch Pascual, T.; Martinez Melo, M. Does the Type of Feedback Channel Used in Online Learning Environments Matter? Students’ Perceptions and Impact on Learning. Act. Learn. High. Educ. 2019, 23, 49–63.
44. Espasa, A.; Guasch, T.; Mayordomo, R.M.; Martínez-Melo, M.; Carless, D. A Dialogic Feedback Index Measuring Key Aspects of Feedback Processes in Online Learning Environments. High. Educ. Res. Dev. 2018, 37, 499–513.
45. Sangrà, A.; Guitert, M.; Behar, P.A. Competencias y metodologías innovadoras para la educación digital. RIED-Rev. Iberoam. Educ. Distancia 2023, 26, 9–16.
46. Bulut, O.; Cutumisu, M.; Aquilina, A.M.; Singh, D. Effects of Digital Score Reporting and Feedback on Students’ Learning in Higher Education. Front. Educ. 2019, 4, 65.
47. Guasch, T.; Espasa, A.; Martinez-Melo, M. The Art of Questioning in Online Learning Environments: The Potentialities of Feedback in Writing. Assess. Eval. High. Educ. 2019, 44, 111–123.
48. Quezada Cáceres, S.; Salinas Tapia, C. Modelo de retroalimentación para el aprendizaje: Una propuesta basada en la revisión de literatura. Rev. Mex. Investig. Educ. 2021, 26, 225–251.
49. Carless, D.; Boud, D. The Development of Student Feedback Literacy: Enabling Uptake of Feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325.
50. Espasa-Roca, A.; Guasch-Pascual, T. ¿Cómo implicar a los estudiantes para que utilicen el feedback online? RIED-Rev. Iberoam. Educ. Distancia 2021, 24, 2.
51. Wong, L.; Lam, C. Herramientas para la retroalimentación y la evaluación para el aprendizaje a distancia en el contexto de la pandemia por la COVID-19. En Blanco Y Negro 2020, 11, 83–95.
52. Espasa, A.; Meneses, J. Analysing Feedback Processes in an Online Teaching and Learning Environment: An Exploratory Study. High. Educ. 2010, 59, 277–292.
53. Alemdag, E.; Yildirim, Z. Effectiveness of Online Regulation Scaffolds on Peer Feedback Provision and Uptake: A Mixed Methods Study. Comput. Educ. 2022, 188, 104574.
54. Sedrakyan, G.; Dennerlein, S.; Pammer-Schindler, V.; Lindstaedt, S. Measuring Learning Progress for Serving Immediate Feedback Needs: Learning Process Quantification Framework (LPQF). In Addressing Global Challenges and Quality Education; Alario-Hoyos, C., Rodríguez-Triana, M.J., Scheffel, M., Arnedillo-Sánchez, I., Dennerlein, S.M., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2020; pp. 443–448.
55. Tempelaar, D.; Nguyen, Q.; Rienties, B. Learning Feedback Based on Dispositional Learning Analytics. In Machine Learning Paradigms: Advances in Learning Analytics; Virvou, M., Alepis, E., Tsihrintzis, G.A., Jain, L.C., Eds.; Springer International Publishing: Berlin/Heidelberg, Germany, 2020; pp. 69–89.
56. Floratos, N.; Guasch, T.; Espasa, A. Recommendations on Formative Assessment and Feedback Practices for Stronger Engagement in MOOCs. Open Prax. 2015, 7, 141–152.
57. Montenegro, N.Y.; Fernández, B.H.; Mesía, M.M.S.; Uriarte, M.N.L. App de gamificación para la retroalimentación formativa en estudiantes de secundaria. Horiz. Rev. Investig. Cienc. Educ. 2022, 6, 2019–2030.
58. Tang, J.; Zhao, Y.; Wang, T.; Zeng, Z. Examining the Effects of Feedback-Giving as a Gamification Mechanic in Crowd Rating Systems. Int. J. Hum.-Comput. Interact. 2021, 37, 1916–1930.
59. López, D.L.; Muniesa, F.V.; Gimeno, Á.V. Aprendizaje adaptativo en Moodle: Tres casos prácticos. Educ. Knowl. Soc. 2015, 16, 138–157.
60. del Puerto, D.A.; Gutiérrez, P.E. La Inteligencia Artificial como recurso educativo durante la formación inicial del profesorado. RIED-Rev. Iberoam. Educ. Distancia 2022, 25, 347–362.
61. Hooda, M.; Rana, C.; Dahiya, O.; Rizwan, A.; Hossain, M.S. Artificial Intelligence for Assessment and Feedback to Enhance Student Success in Higher Education. Math. Probl. Eng. 2022, 2022, e5215722.
62. Vijayakumar, B.; Höhn, S.; Schommer, C. Quizbot: Exploring Formative Feedback with Conversational Interfaces. In Technology Enhanced Assessment; Draaijer, S., Joosten-ten Brinke, D., Ras, E., Eds.; Communications in Computer and Information Science; Springer International Publishing: Cham, Switzerland, 2019; pp. 102–120.
Figure 1. Flow of the research methodology.
Figure 2. Validation results by topic.
Table 1. Summary of the elements identified in the literature review regarding feedback in higher education.
Category | Characteristics | No. of studies
Understanding of feedback | The idea that feedback is useful when students understand it and see it as an aid to their learning is supported. | 37
Problematic final feedback | The centrality of feedback on the final task or assignment and the tendency to provide comments without ensuring learning are questioned. | 31
Problematic use of feedback | Why feedback is not effective and the need to educate students about its use are studied. A tendency among students to regard feedback as having little value in their process is identified. | 21
Online feedback | The role of technology in the development of feedback is analysed. The findings relate to how technology helps to streamline and effectively manage feedback. | 20
Definition of a model | A feedback model with specific characteristics is proposed, which the studies aim to validate through a methodological design. | 18
Cooperation | The importance of peer feedback, the results it produces, and how to design such activities are explored. | 17
Presentation of a tool | The digital tools used to explore feedback are indicated. Two specific tools mentioned are Socrative and ExamVis. | 13
Self-regulation | Contributions of feedback to student self-regulation are indicated. | 11
Integration of artificial intelligence (AI) | AI and studies on adaptive systems are integrated, as well as other trends such as 3D modelling for interaction with real objects that provide feedback. | 8
Format of feedback | The format of feedback used (audio, video, or a combination of these with text) is indicated. | 7
Systematic review of literature | Some systematic reviews on this particular topic were identified. | 5
Instrument | An instrument for evaluating feedback is provided. | 3
Teacher training | Teacher training strategies on feedback are proposed. | 3
Table 2. Microcourse topics with the lowest ratings.
Level | Topic | Average
B1 | Function of the audio format when providing feedback. | 2.3
B2 | Providing scaffolding in feedback. | 2.3
B2 | Provide feedback scaffolding. | 2.1
C1 | Configuration of conditions within an LMS (Learning Management System). | 2.4
C2 | Virtual tutors. | 2.1
C2 | Tools for configuring virtual tutors. | 2.1
C2 | Tools for review of exercises with AI (Gradescope, Cognni, InVideo). | 2.4
C2 | Global relevance. | 2.4
C2 | Feedback design. | 2.1
C2 | Feedback delivery. | 2.1