Article

Validation of the DigCompEdu Check-in Questionnaire through Structural Equations: A Study at a University in Peru

by Lorena Martín Párraga *, Carmen Llorente Cejudo and Julio Barroso Osuna
Department of Didactics and Educational Organization, University of Seville, 41013 Sevilla, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2022, 12(8), 574; https://doi.org/10.3390/educsci12080574
Submission received: 27 July 2022 / Revised: 10 August 2022 / Accepted: 14 August 2022 / Published: 22 August 2022
(This article belongs to the Special Issue Active Methodologies and Educative Resources Mediated by Technology)

Abstract

The technologization of society presents a great challenge for education in the twenty-first century, and this challenge must be met in order to promote quality digital literacy. Teachers’ digital competence, understood as the safe and critical use of technologies, is one of the key competencies that can guarantee educational success. The present study analyzes the reliability and validity of the DigCompEdu questionnaire for future teachers within the framework of digital competence improvement. The tool is designed to orient teachers with respect to their level of competence through a self-evaluation of their strengths and needs for improvement in digital learning, according to its different dimensions: digital literacy; communication and collaboration with the organization; search and treatment of data; digital socialization; and technological creativity and innovation. Exploratory and confirmatory factor analyses (EFA and CFA) were carried out using structural equation modeling (SEM). A total of 1659 professors employed at a university in Peru completed the questionnaire. The analyses performed corroborated the reliability and validity of the instrument, as well as the different possibilities offered by validation via structural equations. We underline the need to offer professors training in the area of digital competence and endorse competence-based methodologies that guarantee the use of valid and reliable tools.

1. Introduction

Today’s society is immersed in constant change due to the progressive use of information and communication technology (ICT), which has forged a new technological era. This technological transformation has brought important advances at the social, economic, and cultural levels, not forgetting the area that concerns us here, education. The easy, immediate access afforded to each of the sectors involved provides the ingredients needed to add value to the improvement of society in terms of knowledge. Forecasts indicate that the fourth industrial revolution demands new digital skills to secure future employment [1].
This technologization of society is driving changes in the organization of information and knowledge, in ways of communicating, and even in the modeling of human cognition.
These accelerated changes affect many areas, among them the teaching profession, given the difficulties educators experience when trying to update their knowledge to keep pace with the vertiginous rhythm of technology. The versatility afforded by the incorporation of ICT has led educational institutions to update their methodological plans so as to give these technologies a place in their educational practices [2,3].
Universities, to a greater extent, must face the challenge of investigating new ways to promote the teaching–learning (T-L) process, considering the changes taking place in present-day society [4,5,6].
The use of ICT in teaching remains problematic in the face of the traditional type of teaching that predominates today. There is thus a need to update and train educators in line with this new reality; the acquisition of key competencies is essential, as these competencies equip educators with the knowledge and guidelines needed for the effective use of ICT in teaching.
Therefore, it is understood that there is a need for the digital training of educators to guarantee the acquisition of key competencies, defined as “a combination of knowledge, skills, and attitudes associated with the context” [7] (p. 7). It is, therefore, essential to acquire the associated competencies in a way that is able to respond to current demands. Thus, focusing on the present state of affairs, and according to what was outlined by the European Union’s Council, we can state that one of the most important competencies in this technological era is that of digital competency, involving the reliable and essential use of the new wave of technologies for work, enjoyment, and communication [7].
Based on the above, in 2012, the European Commission planned to “redirect education” as a means of attaining quality education in the current environment of transformation, making ICT useful and integrating it into learning processes in an efficient manner. This entails developing international training plans able to effectively incorporate digital competencies among educators, implying a common standard of education in this competence [8].
This technologization has transformed literacy practices, which play a fundamental role in the appropriate development of present educational contexts. The need therefore arises to review the concept of literacy and to advance new ways of identifying it, facilitating greater access to the development of socially expected competencies and offering a digitized culture that encompasses digital literacy, e-learning, e-inclusion, e-health, and commitment to digital solutions [9] (p. 4). The importance of providing educational systems with diverse digital platforms and with technological and didactic resources is also evident, as these guarantee the correct use of ICT during the T-L process.
Given the demands of the digital era and the need to acquire a broad sample of competencies and strategies, a list of necessary skills has been created by official institutions and organizations, with digital competency (DC) being found in all of them [10]. For many authors, the term “digital competency” alludes to the creative, critical, and safe use of technologies as tools for achieving the correct performance of work, academic, or leisure objectives, and even for active integration and participation in society [11,12,13].
The importance of developing correct digital literacy as a tool for knowing how to use, manage, evaluate, and identify ICT has been reported previously [14]; it is fundamental to the search for and treatment of data [15] and to the development of critical thinking that allows problems to be solved and correct decisions to be made [16,17]. The acquisition of essential skills for developing and implementing digital strategies oriented toward collaboration and the communication of information [18] leads, in turn, to the establishment of ethical notions through good practice [19], with a subsequent effect on the deployment of more innovative and creative educational practices [20]. Even so, from a teaching perspective, it has been suggested that immersion in this technological current does not ensure equal opportunities for its access and use; it may cause social inequalities and, as indicated by the authors of [21], generate visible inequalities across the different levels of competence. There is a need to turn teachers into content generators, creators of ideas and opinions, relieving them of the passive mindset generated by the lack of teacher training in this field [21]. This approach will foster the relationship between competencies and the adoption of teaching methodologies, since, as indicated in the studies carried out by the authors of [22], who analyzed the teachers’ perspective, the higher the level of digital competence, the greater the teachers’ willingness to incorporate e-learning modalities. In turn, as proficiency improves, more favorable changes will be generated in the didactic models used to integrate ICT into teaching and learning processes [23].
As the training of university professors in DC is viewed as an urgent necessity [24], many national and international institutions have begun to work on frameworks and models that facilitate digital competence [25]. For this reason, the National Institute of Educational Technologies and Teacher Training developed a reference framework for diagnosing and measuring the digital competencies of professors, in order to address this technological barrier. Thus, in 2017, with the intention of providing European policies with a common reference framework, the Joint Research Centre of the European Union presented the European Framework of Digital Competence for Educators (DigCompEdu), the result of numerous studies conducted at the local, national, European, and international levels [26,27].
DigCompEdu constitutes a competency model of six areas (see Table 1) incorporating the different competencies that education professionals must develop to promote productive, inclusive, and integrative learning strategies through the use of digital tools [28], as described by the authors of [29].
Within the framework, different levels of competency are also established (see Table 2), comprising a total of six progressive levels of mastery. This scheme was created to better identify educators’ competencies, making gradual personal development and autonomy possible. It ranges from an initial level, A1, to the most advanced level, C2.
Based on the “DigCompEdu” model, a self-reflection tool for educators was developed, known as the “DigCompEdu Check-in”, which is grounded in the European Framework of Digital Competence for Educators. The main objective of the questionnaire is to improve educators’ comprehension of that framework by providing them with a self-evaluation of their strengths and weaknesses, which they need in order to become highly competent in their professional practice.
Once the questionnaire is completed, the tool itself creates a detailed, personalized report on the level of competence, broken down by area of mastery. The instrument is oriented toward different education stages; here we focus on the stage of interest to us, university educator training.
The instrument is composed of 22 items that include the content from the 6 areas of competence established in the common framework of digital educator competence: professional commitment (1), digital resources (2), digital pedagogy (3), evaluation and feedback (4), empowering students (5), and facilitating the digital competence of students (6) [29].
The DigCompEdu Check-in has been implemented in the EuSurvey self-assessment tool; thanks to its global classification system, sorted by area, it allows the level of digital competence acquired by an educator to be identified. For this study, a graduated classification system was used to determine educators’ global digital competence.

2. Materials and Methods

The purpose of the present study is to analyze the DigCompEdu questionnaire. To measure the reliability and validity of the instrument, exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were performed using structural equation modeling (SEM).
SEM techniques allow us to analyze how the existing covariance is distributed across the data and to evaluate whether the relationships between variables, as expressed through the model, fit the observed values [30]. In summary, the procedure consists of defining a conceptual model that represents the relationships between a set of latent factors and their observed variables; the covariance matrix implied by this model is then compared with the sample covariance matrix to assess the validity of the model [31].
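The analyses in this paper were run in JASP (see Section 2.3). Purely as an illustrative sketch, and not the authors’ procedure, the snippet below shows how a conceptual model of latent factors and observed items of this kind could be specified and fitted in Python, assuming the third-party semopy package and a pandas DataFrame whose columns are the 22 items labeled as in Table 5; the one-factor-per-area specification is an assumption, since the exact CFA model estimated in JASP is not spelled out in the text.

```python
# Illustrative sketch only: the authors used JASP, not this code.
# Assumes the semopy package and a pandas DataFrame `df` whose columns
# are the 22 DigCompEdu Check-in items labeled as in Table 5.
import pandas as pd
import semopy

# Hypothetical measurement model: one latent factor per DigCompEdu area.
MODEL_DESC = """
ProfessionalCommitment =~ A1 + A2 + A3 + A4
DigitalResources       =~ B1 + B2 + B3
DigitalPedagogy        =~ C1 + C2 + C3 + C4
Assessment             =~ D1 + D2 + D3
EmpoweringStudents     =~ E1 + E2 + E3
FacilitatingStudentDC  =~ F1 + F2 + F3 + F4 + F5
"""

def fit_cfa(df: pd.DataFrame) -> pd.DataFrame:
    """Fit the measurement model and return global fit statistics (chi-square, GFI, NFI, ...)."""
    model = semopy.Model(MODEL_DESC)
    model.fit(df)                    # estimates loadings and factor covariances
    return semopy.calc_stats(model)  # one-row DataFrame of fit indices
```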

2.1. Sample

The total study population comprised 1659 university educators employed at the Continental University of Peru. Of these, 568 (34.3%) were women and 1090 (65.7%) were men. Most respondents to the “DigCompEdu Check-in” questionnaire, developed under the European Framework of Digital Competence for Educators to measure educators’ level of competence, were aged between 30 and 39 years (34%) or between 40 and 49 years (34.9%).

2.2. Data Collection Instrument

For data collection and subsequent analysis, the “DigCompEdu Check-in” questionnaire [29] was used, an instrument generated under the European Framework of Digital Competence for Educators (DigCompEdu) and validated by Ghomi and Redecker (2018). This framework was selected because it is fundamental to the assessment of DC in university educators, having been validated through the expert judgment technique [32].
This instrument is composed of twenty-two items, which are distributed into the six areas of competence analyzed in DigCompEdu. These are related to the different areas of competence: (A) professional commitment (4 items), (B) digital resources (3 items), (C) teaching and learning (4 items), (D) assessment (3 items), (E) empowering students (3 items), and (F) facilitating the DC of students (5 items).
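For reference in the analyses reported below, the item-to-dimension mapping can be written out explicitly. The short Python sketch below is purely illustrative and uses the item labels that appear in Table 5.

```python
# Illustrative mapping of the 22 DigCompEdu Check-in items (labels as in Table 5)
# to the six areas of competence described in Section 2.2.
DIMENSIONS: dict[str, list[str]] = {
    "A_professional_commitment":  ["A1", "A2", "A3", "A4"],
    "B_digital_resources":        ["B1", "B2", "B3"],
    "C_teaching_and_learning":    ["C1", "C2", "C3", "C4"],
    "D_assessment":               ["D1", "D2", "D3"],
    "E_empowering_students":      ["E1", "E2", "E3"],
    "F_facilitating_students_DC": ["F1", "F2", "F3", "F4", "F5"],
}

# The six areas together cover the 22 items of the instrument.
assert sum(len(items) for items in DIMENSIONS.values()) == 22
```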
By means of this questionnaire, teachers were first asked to self-assess their level of competence, thereby classifying themselves in one of the following categories: novice, explorer, leader, or pioneer. The same question was repeated once the questionnaire had been completed, to verify its level of significance.
In addition, a series of demographic questions were included in the questionnaire, covering sex, age, years of service, and time spent using technologies, among others.

2.3. Collection and Analysis of Data Procedures

The administration of this questionnaire was performed digitally through EuSurvey at the end of 2021; the questionnaire was distributed to university personnel from a Peruvian university, the method guaranteeing the anonymity of the data.
To ensure a high level of scientific rigor, reliability and validity studies were performed on the instrument and on the information obtained.
To calculate reliability and discriminant and convergent validity, the following coefficients were considered: Cronbach’s alpha, McDonald’s omega, composite reliability (CR), average variance extracted (AVE), and maximum shared variance (MSV), chosen according to the studies conducted by the authors of [32]. So that the obtained results could be contrasted, an inferential statistical analysis was conducted between the items and dimensions, to ensure systematic and efficient evaluation. For this purpose, a bivariate correlation analysis was performed using Spearman’s rho correlation coefficient.
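As a hedged illustration of the coefficients named above (the study itself computed them in JASP), the following minimal Python sketch shows the usual formulas, assuming a pandas DataFrame holding the responses to the items of one dimension and an array of standardized factor loadings for that dimension; the function names and inputs are illustrative, not the authors’ code.

```python
# Minimal sketch of the reliability/validity coefficients named above.
import numpy as np
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

def composite_reliability(lam: np.ndarray) -> float:
    """CR (and McDonald's omega under a one-factor model with standardized loadings)."""
    return float(lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum()))

def average_variance_extracted(lam: np.ndarray) -> float:
    """AVE: mean of the squared standardized loadings."""
    return float((lam ** 2).mean())

def maximum_shared_variance(construct_corr: np.ndarray) -> float:
    """MSV: largest squared correlation among distinct constructs."""
    off_diag = construct_corr[~np.eye(construct_corr.shape[0], dtype=bool)]
    return float((off_diag ** 2).max())

def item_dimension_spearman(item: pd.Series, dimension_total: pd.Series) -> float:
    """Bivariate Spearman rho between one item and its dimension score."""
    rho, _p = stats.spearmanr(item, dimension_total)
    return float(rho)
```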
To test construct validity, an exploratory factor analysis (EFA) was performed using the principal components method with Varimax rotation and Kaiser normalization. Once the factors were obtained, confirmatory factor analysis (CFA) was performed through the use of structural equations, to verify whether the theoretical model showed good internal consistency [30]. A descriptive study taking skewness and kurtosis into account indicated that the data did not follow a normal distribution. To confirm this, a Kolmogorov–Smirnov goodness-of-fit test was performed, yielding a significance of 0.000 for all of the items.
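The normality checks described here can be illustrated with a short Python sketch, assuming SciPy and a DataFrame with the 22 item scores; this is a sketch of equivalent tests, not the analysis actually run in JASP.

```python
# Illustrative per-item normality report: skewness, kurtosis, and a
# Kolmogorov-Smirnov goodness-of-fit test against a normal distribution.
import pandas as pd
from scipy import stats

def normality_report(df: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for col in df.columns:
        x = df[col].dropna()
        ks_stat, ks_p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
        rows.append({"item": col,
                     "skewness": stats.skew(x),
                     "kurtosis": stats.kurtosis(x),
                     "ks_p": ks_p})
    # A very small p-value (e.g., the 0.000 reported above) indicates a
    # departure from normality for that item.
    return pd.DataFrame(rows)
```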
The data were analyzed with JASP 0.16.2 (University of Amsterdam, Amsterdam, the Netherlands).

3. Analysis and Results

The data obtained were subjected to a reliability analysis by calculating Cronbach’s alpha, with values close to 1 indicating reliable scales [33], along with McDonald’s omega, applied globally and for all the dimensions constituting the questionnaire.
The data collected yielded a global Cronbach’s alpha of 0.937 (Table 3). This index is very high (>0.9), which signifies a high degree of reliability. Table 4 shows the reliability indices for the dimensions of the questionnaire: professional commitment (0.813), digital resources (0.755), digital pedagogy (0.978), assessment and feedback (0.863), empowering students (1.08), and facilitating the digital competence of the students (0.914).
According to the criteria set out by the authors of [34], reliability levels higher than 0.75 were obtained for the instrument as a whole, as well as for the different dimensions it comprises; therefore, each dimension was considered to possess a high level of reliability.
Next, the simple correlations of each item, with its theoretical dimension, were calculated. The results are shown in Table 5.
Values higher than 0.5 were taken into account, which allows an item to be accepted as a component of that dimension [35].
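As a small illustrative sketch, not the authors’ code, applying this 0.5 acceptance threshold to item–dimension coefficients such as those in Table 5 could look as follows (the values shown are a handful transcribed from Table 5).

```python
# Illustrative only: keep items whose coefficient with their dimension exceeds 0.5.
ACCEPTANCE_THRESHOLD = 0.5

# A few example values transcribed from Table 5.
item_coefficients = {"A1": 0.634, "A2": 0.580, "B3": 0.504, "C4": 0.783}

accepted = {item: c for item, c in item_coefficients.items() if c > ACCEPTANCE_THRESHOLD}
print(accepted)  # all four example items exceed the threshold
```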
To assess construct validity, an EFA was performed (Table 6). Its applicability was first confirmed through the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy and Bartlett’s test of sphericity. The KMO coefficient was 0.977, very close to 1, indicating a high degree of association between the items, and Bartlett’s test yielded a chi-square of 31,038.347 with p < 0.001 (see Table 6). In summary, factor analysis could be applied to the data.
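For illustration only, the same adequacy checks can be reproduced in Python, assuming the third-party factor_analyzer package and a DataFrame of the item scores; the figures reported above come from JASP, not from this code.

```python
# Illustrative sketch of the EFA applicability checks (KMO and Bartlett's test).
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def check_efa_applicability(df):
    chi_square, p_value = calculate_bartlett_sphericity(df)  # Bartlett's test of sphericity
    kmo_per_item, kmo_total = calculate_kmo(df)              # KMO sampling adequacy
    return {"bartlett_chi2": chi_square, "bartlett_p": p_value, "kmo_total": kmo_total}
```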
This first analysis extracted a single factor, factor 1, which explains 49.797% of the total variance. Extraction was carried out using the principal components method with Varimax rotation, from which the matrix of rotated components shown in Table 7 was obtained.
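A corresponding sketch of the extraction step, under the same assumptions (factor_analyzer and a DataFrame of the 22 items), might look as follows; note that with a single extracted factor the Varimax rotation reported in the text has no effect, so it is omitted from the sketch.

```python
# Illustrative sketch of the extraction step (principal components, one factor).
import pandas as pd
from factor_analyzer import FactorAnalyzer

def extract_single_factor(df: pd.DataFrame):
    """Principal-components extraction of one factor; returns (loadings, % of variance)."""
    fa = FactorAnalyzer(n_factors=1, method="principal", rotation=None)
    fa.fit(df)
    variance, prop_var, cum_var = fa.get_factor_variance()
    return fa.loadings_, prop_var[0] * 100  # loadings comparable in spirit to Table 5
```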
The model found by the EFA was contrasted with the CFA. For this, the global fit was assessed through different statistical tests: chi-square (CMIN), the goodness-of-fit index (GFI), the parsimonious goodness-of-fit index (PGFI), the normed fit index (NFI), and the parsimonious normed fit index (PNFI).
Table 8 shows the values obtained and the reference values for the adjustment of the model, according to Lévy Mangin et al. (2006).
Considering the indices obtained, the model is adequate and fits the empirical data well. Likewise, the results confirm the construct validity of the instrument, allowing us to corroborate that the model is pertinent for achieving the objectives defined in the study.
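As a final illustrative sketch, the fit criteria of Table 8 can be checked programmatically; the obtained values are typed in by hand from Table 8, and the thresholds are those reported there.

```python
# Illustrative check of the goodness-of-fit criteria reported in Table 8.
FIT_CRITERIA = {  # index: (obtained value, criterion)
    "CMIN": (338.347, lambda v: v <= 500),
    "GFI":  (0.994,   lambda v: v > 0.7),
    "PGFI": (0.993,   lambda v: v > 0.7),
    "NFI":  (0.993,   lambda v: v > 0.7),
    "PNFI": (0.898,   lambda v: v > 0.7),
}

for index, (value, criterion) in FIT_CRITERIA.items():
    print(f"{index}: {value} -> good fit: {criterion(value)}")
```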

4. Discussion and Conclusions

The results of the present study concern the validity of the DigCompEdu questionnaire. The reliability and validity of the instrument make it possible to generate rigorous, precise, and valid knowledge about quality education in the present context of transformation. This is evidenced by the high reliability indices, the questionnaire’s theoretical validity, and its confirmatory structure.
According to the fit indices, the model is valid and fitted to the empirical data [36]. Thus, the validity and pertinence of the model can be confirmed. This validation described the existing reality, explaining the educators’ perception of the importance of the subject analyzed, as well as the importance of its applicability for their future professional development.
The justification for validating measurement instruments is supported by authors such as Cole and Maxwell [37], who attest to the relevance of being able to validate, partially but firmly, the measurement values in order to affirm both the precision of the data with which we worked and the metric properties of the instruments used during the research process. In this way, a rigorous scheme is maintained. Along with this, the need to sustain a theoretical plan and its methodological design was also considered, as precision of the data was required to obtain a robust instrument.
On the other hand, regarding the nature of the subject addressed, and in line with other studies centered on this field, the importance of the reference frameworks adopted is underlined, as is the recommendation to replicate their study methodology with these models [38], as was done throughout this study. Authors such as [39] point out that competency models are considered “education priorities from each country, with a convergent view that easily formulates quality and inclusive education, and in which public policies facilitate the democratization of knowledge”, as cited in [40,41]. According to the different studies discussed throughout this article [14,18,20], it can be verified that this information is not merely theoretical; rather, an increasing number of different paths exist with respect to the digital materials created.
Previous works [42] stress that almost all the studies conducted share the same weakness: they assess educators only in terms of their work in the classroom, ignoring their professional commitment to the community, maintaining a certain pejorative view of the taxonomy of the teaching profession, and not considering the more holistic aspects of their work. At the same time, other studies show that teachers’ competencies are not as wide-ranging as expected, revealing infrequent use of ICT in their educational practices [43], as well as teachers’ self-perception of low mastery of ICT, which generates insecurity when it comes to adapting their resources, owing to their low levels of digital content creation [44]. Other research [45] offers relevant results regarding the low levels of creation and adaptation of these contents among teachers, which is a worrying situation. This highlights the need to consider the context in which the profession will be practiced, as well as the capacities necessary for acquiring a comprehensive DC of teaching quality. With respect to teacher training, it has been verified that all education systems seek the constant and continuous development of the DC model for its standardized integration into educational institutions. This is why the establishment of levels of competence will lead to more personalized training itineraries [46].
Alongside these initiatives, support, certification, and recognition from public administrations are still only in their early stages and will require systematic, prior, and continuous analysis of education plans [47].
On the other hand, it is important to emphasize the limitations of this study, which are largely determined by the characteristics of the sample. In future studies, it would be advisable to expand the sample or replicate the study in other national and international universities, in order to take into account the possible digital divide existing in different geographical areas. However, the questionnaire has been created to serve as a template for better progress in the development of measurement instruments and can always be adapted to the needs of each educational center, creating a model of digital development capable of guiding educational policies at all levels, regardless of their technological wealth. It would also be advisable, depending on the characteristics of the educational institutions under study, to identify areas for improvement in ways that would offer more consistent results.
In conclusion, the development of digital competence is essential in this new virtual education, a change that is here to stay. For this development, educators must become aware of their responsibility and assume the role of carriers of new pedagogic models, recognizing the importance of staying up-to-date by receiving good training that will allow them to examine and innovate educational transformations in the future.
As detailed above, it is no longer enough to possess basic training; rather, this training must be put into practice by including pedagogic actions that lead to better performance in the use of ICT in education, together with knowing how to assess the process so as to guarantee its correct progression.
Therefore, the application of this type of questionnaire, aimed at reinforcing and establishing a better mastery of digital competencies, will be crucial in the future, not only to improve teacher knowledge but also as a reliable means for teachers to self-evaluate their level of mastery, as established by the Joint Research Centre of the European Union in 2017 [26,27]. Even a certain level of competence in handling digital technologies will not be enough to provide quality teaching with technological media; as detailed throughout this research, correct inclusion requires that teachers engage in reflective professional practice in which the ability to produce content, share learning experiences, and transform knowledge is enhanced, thereby simultaneously contributing to the construction of their own training aimed at the creation of their professional identity.
Society as a whole moves forward, taking on the work of continuous renewal, given the importance of finding an equilibrium that guarantees future advancement and progress.

Author Contributions

L.M.P., C.L.C. and J.B.O. presented and designed the experiments; L.M.P. performed the experiments, analyzed the data, and wrote the original article. C.L.C. and J.B.O. contributed to the review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

Design, production and evaluation of t-MOOC for the acquisition of digital competences of university teachers. Reference: US-1260616. Junta de Andalucía (Consejería de Economía y Conocimiento).

Institutional Review Board Statement

Ethical review and approval were waived for this study because the 1659 participating subjects signed a consent form before answering the questionnaire.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Due to confidentiality and privacy agreements, it is not possible to make these data publicly available.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Williamson, B.; Potter, J.; Eynon, R. New research problems and agendas in learning, media and technology: The editors’ wishlist. Learn. Media Technol. 2019, 44, 87–91. [Google Scholar] [CrossRef]
  2. Hatlevik, L.K.; Hatlevik, O.E. Examining the relationship between teachers’ ICT selfefficacy for educational purposes, collegial collaboration, lack of facilitation and the use of ICT in teaching practice. Front. Psychol. 2018, 9, 935. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Gómez, O.Y. El uso educativo de las TIC. Rev. Interam. Investig. Educ. Pedagog. 2019, 12, 211–227. [Google Scholar] [CrossRef]
  4. OpenMind BBVA. La Era de la Perplejidad. Repensar el Mundo que Conocíamos; Taurus: Madrid, Spain, 2017. [Google Scholar]
  5. Gómez-Parra, M.E.; Huertas-Abril, C. La importancia de la competencia digital para la superación de la brecha lingüística en el siglo XXI: Aproximación, factores y estrategias. EDMETIC 2019, 8, 88–106. [Google Scholar] [CrossRef]
  6. Ruíz Mezcua, A. Competencia digital y TICs en interpretación: «renovarse o morir». EDMETIC 2019, 8, 55–71. [Google Scholar] [CrossRef]
  7. Consejo de la Unión Europea. Recomendación del CONSEJO, de 22 de Mayo de 2018, Relativa a las Competencias Clave Para el Aprendizaje Permanente; Diario Oficial de la Unión Europea: Luxembourg, 2018; Volume C189/1. [Google Scholar]
  8. Consejo de la Unión Europea. Recomendación del Consejo de 22 de Abril de 2013 Sobre el Establecimiento de la Garantía Juvenil; Diario Oficial de la Unión Europea: Luxembourg, 2013. [Google Scholar]
  9. Disposición 8301, de 4 de Junio de 2019, de Relaciones con las Cortes e Igualdad (2019). Boletín Oficial del Estado, 133, sec.III, de 4 de junio de 2019, 50509. Available online: https://www.boe.es/boe/dias/2019/06/04/pdfs/BOE-A-2019-8301.pdf (accessed on 20 July 2022).
  10. INTEF. Marco Común de Competencia Digital Docente; Instituto Nacional de Tecnologías Educativas y Formación del Profesorado: Madrid, Spain, 2017. [Google Scholar]
  11. Romero-Rodríguez, L.; Contreras-Pulido, P.; Pérez, A. Media competencies of university professors and students. Cult. Educ. 2019, 31, 326–368. [Google Scholar] [CrossRef]
  12. Casal, L.; Barreira, E.M.; Mariño, R.; García, B. Competencia digital docente del profesorado de FP de Galicia [Digital Teaching Competence of Galician Vocational Training Teachers]. Pixel-Bit. Rev. Medios Educ. 2021, 61, 165–196. [Google Scholar] [CrossRef]
  13. Rodríguez-hoyos, C.; Fueyo, A.; Hevia, L. Competencias digitales del profesorado para innovar en la docencia universitaria. Analizando el uso de los dispositivos móviles. Pixel-Bit. Rev. Medios Educ. 2021, 61, 71–97. [Google Scholar] [CrossRef]
  14. Hasse, C. Technological literacy for teachers. Oxf. Rev. Educ. 2017, 43, 365–378. [Google Scholar] [CrossRef]
  15. Çoklar, A.N.; Yaman, N.D.; Yurdakul, I.K. Information literacy and digital nativity as determinants of online information search strategies. Comput. Hum. Behav. 2017, 70, 1–9. [Google Scholar] [CrossRef]
  16. Avsec, S.; Szewczyk-Zakrzewska, A. Predicting academic success and technological literacy in secondary education: A learning styles perspective. Int. J. Technol. Des. Educ. 2017, 27, 233–250. [Google Scholar] [CrossRef]
  17. Infante-Moro, A.; Infante-Moro, J.C.; Gallardo-Pérez, J.; Martínez-López, F. Key Criteria in the Choice of IoT Platforms in Spanish Companies. Appl. Sci. 2021, 11, 10456. [Google Scholar] [CrossRef]
  18. Gutiérrez-Porlán, L.; Román-García, M. Strategies for the communication and collaborative online work by university students. Comunicar. Media Educ. Res. J. 2018, 26, 91–100. [Google Scholar] [CrossRef]
  19. Dominighini, C.; Cataldi, Z. Ética en la investigación en TICS: Formación en buenas prácticas en ciencia y tecnología. Rev. Inf. Educ. Medios Audiov. 2017, 14, 20–25. [Google Scholar]
  20. Stahl, B.C.; Timmermans, J.; Flick, C. Ethics of Emerging Information and Communication Technologies: On the implementation of responsible research and innovation. Sci. Public Policy 2017, 44, 369–381. [Google Scholar] [CrossRef] [Green Version]
  21. Macías, E.M.; García, M.A.; Arreguín, G.M. El Alumno Como Prosumidor de Medios. Debates en Evaluación y Currículum, Congreso Internacional de Educación: Evaluación 2018. Available online: https://posgradoeducacionuatx.org/pdf2018/A229.pdf (accessed on 20 July 2022).
  22. Saeidi, L.; Saeidipourb, B.; Safari, Y.; Reza, H. Assessment of readiness to accept the use of e-learning by faculty members in Kermanshah University of Medical Sciences, Iran. Int. J. Curr. Sci. 2016, 19, 116–121. [Google Scholar]
  23. San Nicolás, M.B.; Vargas, E.; Moreira, M. Competencias digitales del profesorado y alumnado en el desarrollo de la docencia virtual. El caso de la Universidad de La Laguna. Rev. Hist. Educ. Latinoam. 2013, 14, 227–245. [Google Scholar] [CrossRef]
  24. Fernández-Batanero, J.; Montenegro-Rueda, M.; Fernández-Cerero, J.; García-Martínez, I. Digital competences for teacher professional development. Systematic review. Eur. J. Teach. Educ. 2020, 18, 1–19. [Google Scholar] [CrossRef]
  25. Sánchez-Caballé, A.; Gisbert-Cervera, M.; Esteve-Mon, F.M. The digital competence of university students: A systematic review of the literature. Aloma. Rev. Psicol. Ciències L’educació L’esport 2020, 38, 63–74. [Google Scholar] [CrossRef]
  26. Ghomi, M.; Redecker, C. Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. CSEDU 2019, 1, 541–548. [Google Scholar]
  27. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu; Punie, Y., Ed.; Publications Office of the European Union: Luxembourg, 2017. [Google Scholar] [CrossRef]
  28. Cabero-Almenara, J.; Tena, R.R. Diseño de un t-MOOC para la formación en competencias digitales docentes: Estudio en desarrollo (Proyecto DIPROMOOC). Innoeduca. Int. J. Technol. Educ. Innov. 2020, 6, 4–13. [Google Scholar] [CrossRef]
  29. Cabero-Almenara, J.; Palacios-Rodríguez, A. Marco europeo de competencia digital docente «digcompedu». Traducción y adaptación del cuestionario «Digcompedu check-in». Edmetic 2020, 9, 213–234. [Google Scholar] [CrossRef] [Green Version]
  30. Schumacker, R.E.; Lomax, R.G. A Beginner’s Guide to Structural Equation Modeling, 2nd ed.; Psychology Press: London, UK, 2004. [Google Scholar] [CrossRef]
  31. Bas-Peña, E.; Ferre-Jaén, E.; Maurandi-López, A. Habilidades Profesionales y Habilidades Sociales en los Grados de Educación: Validación de un Cuestionario Utilizando un Modelo de Ecuación Estructural. Rev. Electrónica Educ. 2020, 24, 1–20. [Google Scholar] [CrossRef]
  32. Cabero-Almenara, J.; Gutiérrez-Castillo, J.J.; Palacios-Rodríguez, A.; Barroso-Osuna, J. Development of the teacher digital competence validation of DigCompEdu check-in questionnaire in the university context of Andalusia (Spain). Sustainability 2020, 12, 6094. [Google Scholar] [CrossRef]
  33. Ruiz, M.A.; Pardo, A.; San Martín, R. Modelos de ecuaciones estructurales. Pap. Psicólogo 2010, 31, 34–45. [Google Scholar]
  34. Marín-Díaz, V.; Sampedro Requena, B.E.; Vega Gea, E. Estudio psicométrico de la aplicación del internet addiction test con estudiantes universitarios españoles. Contextos Educ. Extraordin. 2017, 2, 147–161. [Google Scholar] [CrossRef] [Green Version]
  35. O’Dwyer, L.; Bernauer, J. Quantitative Research for the Qualitative Researcher; SAGE Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar] [CrossRef]
  36. Carmines, E.; Zeller, R. Reliability and Validity Assessment; SAGE Publications: Thousand Oaks, CA, USA, 1979. [Google Scholar] [CrossRef] [Green Version]
  37. Cole, D.A.; Maxwell, S.E. Multitrait-multimethod comparisons across populations: A confirmatory factor analytic approach. Multivar. Behav. Res. 1985, 20, 389–417. [Google Scholar] [CrossRef]
  38. González-Montesinos, M.J.; Backhoff Escudero, E. Validación de un cuestionario de contexto para evaluar sistemas educativos con Modelos de Ecuaciones Estructurales. RELIEVE 2010, 16, 1–17. [Google Scholar] [CrossRef] [Green Version]
  39. Cabero-Almenara, J.; Barroso Osuna, J.M.; Gutiérrez Castillo, J.J.; Palacios-Rodríguez, A.D.P. Validación del cuestionario de competencia digital para futuros maestros mediante ecuaciones estructurales. Bordón. Rev. Pedagog. 2020, 72, 45–63. [Google Scholar] [CrossRef]
  40. Lugo, M.T.; Ruiz, V. Revisión comparativa de iniciativas nacionales de aprendizaje móvil en América Latina. Los casos de Colombia, Costa Rica, Perú y Uruguay. In UNESCO 2016; United Nations Educational, Scientific and Cultural Organization (UNESCO): Paris, France, 2017. [Google Scholar]
  41. Jiménez-Hernández, D.; Muñoz, P.; Sánchez, F. La Competencia Digital Docente, una revisión sistemática de los modelos más utilizados. Rev. Interuniv. Investig. Tecnol. Educ. 2021, 10, 105–120. [Google Scholar] [CrossRef]
  42. Esteve-Mon, F.M.; Castañeda, L.; Adell-Segura, J. Un modelo holístico de competencia docente para el mundo digital. Rev. Interuniv. Form. Profr. 2018, 91, 105–116. [Google Scholar]
  43. Cózar, R.; Zagalaz, J.; Sáez, J.M. Creando contenidos curriculares digitales de Ciencias Sociales para Educación Primaria. Una experiencia TPACK para futuros docentes. Educ. Siglo XXI 2015, 33, 147–168. [Google Scholar] [CrossRef] [Green Version]
  44. Hervás, C.; López, E.; Real, S.; Fernández, E. Tecnofobia: Competencias, actitudes y formación del alumnado del Grado en Educación Infantil. Educ. Int. J. Educ. Res. Innov. 2016, 6, 83–94. [Google Scholar]
  45. Prendes, M.P.; Castañeda, L.; Gutiérrez, I. Competencias para el uso de TIC de los futuros maestros. Comunicar 2010, 17, 175–181. [Google Scholar] [CrossRef] [Green Version]
  46. Martín-Rodríguez, D.; de Jubera, M.M.S.; Campión, R.S.; de Luis, E.C. Diseño de un instrumento para evaluación diagnóstica de la competencia digital docente: Formación flipped classroom. DIM: Didáctica Innovación Multimed. 2016, 33, 1–15. [Google Scholar]
  47. Navío, E.P.; Domínguez, M.M.; Zagalaz, J.C. Perception of the professional competences of last year’s students of pre-primary education and primary education degrees and students of training teachers master. J. New Approaches Educ. Res. (NAER J.) 2019, 8, 58–65. [Google Scholar] [CrossRef]
Table 1. DigCompEdu areas of competence.
Area 1. Professional commitment: centered on the importance of the educator’s work environment.
Area 2. Digital resources: concerned with the creation and distribution of digital resources.
Area 3. Digital pedagogy: one of the essential competencies within the framework, focused on the creation, organization, and implementation of ICT in the T-L process.
Area 4. Evaluation and feedback: associated with the use of digital resources and strategies for evaluation.
Area 5. Empowering students: instills the importance of correctly using appropriate digital tools to empower students in their learning.
Area 6. Student competencies: related to the educator’s capacity to facilitate DC among the students.
Table 2. Different DigCompEdu competence levels.
Levels of mastery:
A1: The person possesses a basic level of competence, which requires support for its future development.
A2: The subject has acquired a basic level of competence, which, with adequate support, will lead to an improvement in DC. A certain independence has also been achieved during its practice.
B1: The person possesses a medium level of competence, being able to solve simple problems and to gradually make progress toward the development of DC.
B2: The person has an intermediate level of competence but is now able to provide answers to his or her needs and to solve correctly defined problems, with definite progress observed in the development of his or her competence.
C1: The subject has a more advanced level of competence, which means that he or she is able to guide other individuals toward an increase in digital competence.
C2: The person has reached an advanced level of competence, being able to meet his or her needs, just as they can meet the needs of others. The subject has developed a level of competence that is able to provide answers to complex situations.
Table 3. Global Cronbach’s alpha.
Reliability statistics: Cronbach’s alpha = 0.950; number of elements = 22.
Table 4. Reliability dimensions.
Reliability statistics: Cronbach’s alpha = 0.933; number of elements = 6.
Dimension   Mean      Standard deviation   N
A           2.1578    0.81384              808
B           2.3333    0.75536              808
C           2.0619    0.97881              808
D           1.7814    0.86309              808
E           1.8639    1.08937              808
F           1.8594    0.91461              808
Table 5. Correlation of the items with the associated dimensions.
Component matrix (a); Component 1:
A1 = 0.634; A2 = 0.580; A3 = 0.627; A4 = 0.649
B1 = 0.625; B2 = 0.587; B3 = 0.504
C1 = 0.757; C2 = 0.755; C3 = 0.714; C4 = 0.783
D1 = 0.796; D2 = 0.747; D3 = 0.762
E1 = 0.735; E2 = 0.753; E3 = 0.736
F1 = 0.667; F2 = 0.768; F3 = 0.700; F4 = 0.758; F5 = 0.791
Extraction method: analysis of principal components. (a) 1 component extracted.
Table 6. KMO and Bartlett’s test.
Kaiser–Meyer–Olkin measure of sampling adequacy: 0.977
Bartlett’s test of sphericity: chi-square = 31,038.347; df = 231; Sig. = 0.000
Table 7. Total explained variance (extraction method: analysis of principal components).
Component 1. Initial eigenvalues: total = 10.955; % of variance = 49.797; cumulative % = 49.797. Extraction sums of squared loadings: total = 10.955; % of variance = 49.797; cumulative % = 49.797.
Table 8. Goodness-of-fit indices of the model.
Index   Result    Fit criterion   Good fit
CMIN    338.347   CMIN ≤ 500      Yes
GFI     0.994     GFI > 0.7       Yes
PGFI    0.993     PGFI > 0.7      Yes
NFI     0.993     NFI > 0.7       Yes
PNFI    0.898     PNFI > 0.7      Yes
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
