Article

Model for Designing Gamified Experiences Mediated by a Virtual Teaching and Learning Environment

by Glenda Vera-Mora 1,*, Cecilia V. Sanz 2, Teresa Coma-Roselló 3 and Sandra Baldassarri 4

1 Faculty of Legal, Social and Educational Sciences, Technical University of Babahoyo, Babahoyo 120102, Ecuador
2 Institute of Research in Computer Science LIDI (III LIDI–CIC), Faculty of Computer Science, National University of La Plata, Buenos Aires B1900, Argentina
3 Department of Education, Education Faculty, University of Zaragoza, 50009 Zaragoza, Spain
4 Department of Computer Science and Systems Engineering, Research Institute of Aragon (I3A), University of Zaragoza, 50018 Zaragoza, Spain
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(8), 907; https://doi.org/10.3390/educsci14080907
Submission received: 16 July 2024 / Revised: 13 August 2024 / Accepted: 15 August 2024 / Published: 20 August 2024
(This article belongs to the Section Higher Education)

Abstract

Higher Education Institutions (HEIs) face new challenges arising from technological development and from the pedagogical and didactic innovations it demands of educational practice. This article proposes a Technological–Pedagogical Gamification Model (MGTP) that guides the design of gamified educational practices in Virtual Teaching and Learning Environments (EVEAs). The MGTP is grounded in the theoretical cores of Pedagogy and Computer Science, as well as in previous work on gamified experiences in EVEAs in which the social, cognitive, and teaching presences were analyzed. This work also presents an initial validation of the MGTP through expert judgment, whose results are analyzed from both a qualitative (content analysis and comments) and a quantitative (Content Validity Coefficient method) perspective. The results reveal a high level of acceptance of the model by the experts, corroborated by reliability tests (Cronbach’s alpha and the split-half reliability test), and enabled the development of a final version of the model for subsequent application and evaluation in university practice.

1. Introduction

The processes of historical innovation that the educational system undergoes as a result of social needs lead to changes and to the reformulation of established systems [1]. Thus, the daily education provided in Higher Education Institutions (HEIs) is perceived by many students as monotonous and sometimes ineffective [2,3]; this has prompted interest in emerging methodologies that promote significant improvements in learning and overcome the limitations of traditional instruction, which remains linked to memorization and other aspects of content acquisition across disciplines. Considering that university teaching has been the object of severe criticism questioning the role that teachers must assume in the face of these challenges [4], it is necessary to adopt strategies that respond to these new needs and improve teaching and learning processes efficiently and with quality [1,5,6].
Virtual Teaching and Learning Environments (EVEAs) have become ubiquitous in HEIs. They are mainly used by educational institutions to complement face-to-face classes with a blended learning approach or to enable distance learning modalities [7,8]. This implies the creation of innovative EVEAs that contribute to the improvement of student experiences and outcomes [9] through learning strategies that promote participation and enjoyment, such as gamification [10].
Gamification is defined as the use of game design elements in non-game environments [11]. It seeks to promote student participation and involvement [12] with the aim of influencing people’s behavior by acting on their motivation [13]. The scientific literature offers several approaches to the game elements applied in gamified experiences [14,15]. In this paper, we adopt the categorization presented by Teixes [13] of the elements that matter when creating a game: mechanics and components, dynamics, and aesthetics. The mechanics and components are the elements that make up the game and make progress in the gamified activity visible [16]. The dynamics allow the mechanics and components of gamification to become meaningful to the players, which helps avoid monotonous tasks [13]. Aesthetics mobilize emotions and are linked to what the gamified system generates in the learner through interactions with the mechanics and dynamics [17].
The Community of Inquiry (CoI) model of Garrison et al. [18], which remains relevant today, assumes that learning occurs in a community in which teachers (facilitators) and students interact. It holds that this community is nurtured by three types of presence: social presence (SP), cognitive presence (CP), and teaching presence (TP). A teacher present in the virtual environment must have mastery of the contents, technological skills, and didactic knowledge to plan activities, guide strategies, and achieve the expected results [19]. The CoI model has made it possible to analyze and design educational practices mediated by digital technologies, particularly in EVEAs [20,21,22], focusing on the development of critical thinking, on the interaction and dialogue among participants, and on the guidance, structure, and proposals offered by the teacher as a facilitator.
While some work links the CoI model to gamification experiences in EVEAs through presence analysis [20,21,22,23,24,25], only Utomo et al. [26] present a model that integrates gamification strategies with the presences. The model presented in [26] considers four components: the user (facilitators and students), the learning process (including triggers, discussions, challenges and tests with gamification elements to analyze understanding of the topics, feedback from facilitators through gamification elements, and problem solving that applies what has been understood), the objective (goals to be achieved), and the environment (the EVEA; in this case the authors use Moodle, linked to the materials and resources that generate discussions and interactions). After presenting the model, the authors describe a gamification experience that draws on it and report its results. The model of Utomo et al. [26] has thus not been designed as a guide for teachers or institutions wanting to design gamified proposals in which the CoI presences are considered, but rather as a theoretical model of the components to be taken into account. The proposal of Utomo et al. [26] is relevant background for this work. However, a more comprehensive model (the MGTP) is proposed here to guide teachers, researchers, and institutions in designing gamified experiences in EVEAs and to support their evaluation through the presences described by Garrison et al. [18,27]. The latter can help analyze how the design and implementation of a gamified experience (designed with the MGTP) impacts SP (in the links with teachers and peers) and the student’s CP (through higher-order cognitive activities), as well as how TP is perceived and intervenes as a fundamental link that drives and accompanies the process. Consequently, the MGTP aims to guide the design, development, and evaluation of gamified experiences in EVEAs while considering the presences of the CoI model, in order to promote students’ motivation, involvement, and commitment towards the construction of higher-order cognitive levels.
This paper is organized as follows: Section 2 presents key concepts and background from the literature that contribute to this study; Section 3 describes the methodology used; Section 4 presents the MGTP model and its rationale; Section 5 details the validation process and the results of the MGTP evaluation by expert judgment, together with their discussion. Finally, Section 6 presents the conclusions and lines of future work.

2. Background and Related Work in EVEA

The research reviewed offers a variety of approaches and methodologies that highlight how gamification can increase student interaction and engagement. Seven studies considered relevant to this work are analyzed throughout this section, providing knowledge about the presences of the CoI model as part of gamification proposals in EVEAs. First, Utomo et al. [26] present a learning model that combines gamification strategies to integrate the three presences of the CoI model; the authors report an improvement in collaboration and communication between students and teachers as a result. This approach not only boosts student engagement but also motivates students to be more active and to share ideas.
Similarly, the reviewed studies use a diverse range of EVEAs to mediate their courses, from Moodle [20,21,22] and Open edX [23] to customized environments developed specifically for gamification projects. Petroulis et al. [24] implement a reward system in blended courses delivered through Moodle, while Antonaci et al. [23] study gamification in massive open online courses (MOOCs), finding an increase in social presence and sense of community.
Learning in gamified EVEAs is guided by strategies that integrate the three presences of the CoI model [21]. The experiences analyzed show that greater social presence is achieved through collaborative and competitive activities, CP is strengthened through challenges and problems that require critical thinking skills, and TP is manifested in the guidance and continuous feedback provided by the facilitators. Utomo and Santoso [25] found that these gamified approaches increase participation in the system and improve learning.
The most common strategies include point systems, levels, rewards, and challenges, as in the study by Tzelepi et al. [28], who distinguish two types of gamification elements, individual badges and community progress, and find that individual elements are more effective for critical thinking. Utomo et al. [26] implemented assignments and discussions in Moodle enriched with gamification elements, such as badges and scores, to encourage students’ participation and critical thinking.
For their part, Mese and Dursun [20] found that blended learning environments enriched with gamification have a significant impact on these presences, which improves student engagement and motivation. Something similar occurs in the study of Papanikolaou et al. [21], who integrate gamification with social network metrics to increase the effectiveness of online learning communities.
Consequently, the reviewed studies show a significant positive impact in promoting social, cognitive, and teaching presence. Utomo et al. [26] provide evidence that integrating gamification increases student participation and engagement; Tzelepi et al. [22] conclude that individualized gamification elements are more effective for deep learning; and Antonaci et al. [23] found that gamification in MOOCs enhances social presence and the sense of community.

3. Methodology

This work began with a Systematic Literature Review (SLR), following the protocol presented by Kitchenham [29], which made it possible to investigate studies that combine the main study variables, EVEA, gamification, and CoI, as the central axes of the proposal under investigation. The results of the SLR were published in [27]. Based on these results, a model proposal was created that integrates the pedagogical and technological dimensions and considers the presences of the CoI model in the evaluation phase. Thus, the MGTP model proposal emerged.
The construction of a model that integrates pedagogical and technological dimensions and considers the presences of the CoI model in the evaluation phase requires a rigorous, well-founded approach. The MGTP was created based on the background found and on the study of the theoretical cores of Pedagogy and Computer Science. Pedagogy, as a science that studies the education of people [29], is integrated with Computer Science, which focuses on computational processes oriented to the creation of software to solve specific problems in the environment [30]. This interdisciplinary approach allows the development of a model that seeks to meet the growing needs of higher education [31].
To create the model, in addition to considering the studied antecedents with their potentialities and gaps, an iterative process was carried out among the authors. To evaluate the model, the first step was to define the object and objective of the evaluation, and it was decided that an expert judgment would be carried out. Expert judgment constitutes a valid and reliable methodology that allows a qualified opinion on the object under evaluation to be obtained through a series of procedures for collecting information from the experts [32,33]. Expert selection was approached using biograms and a knowledge coefficient, followed by the development and application of the assessment instrument. Additionally, the validity and reliability of the instrument were calculated.

4. MGTP Model

The Technological–Pedagogical Gamification Model (MGTP) aims to integrate gamification strategies and presence indicators to enable gamified educational practices in EVEAs within HEIs, with the goal of impacting the cognitive, social, and teaching presences in these environments. It targets those interested in conducting gamified experiences in EVEAs. HEIs therefore require specific actors who can use the model as a guide for designing, developing, and evaluating gamified experiences in EVEAs, taking into account the presences of the CoI model.
The main actors of the MGTP include researchers interested in studying this topic, coordinators with expertise in gamification and Pedagogy who will institutionalize the model, teachers who will participate as actors throughout the methodological process of the model, and students involved in developing the gamified experience in EVEAs. HEI teachers can use the MGTP to design and implement gamified pedagogical activities, while researchers can employ it to analyze and enhance educational practices. Students actively participate in the implementation and evaluation phases to construct meanings, knowledge, and skills, as well as to express their opinions about the gamified experiences based on the analysis of presence indicators. This approach allows HEIs to adopt the model to establish educational practices that consider the dimensions necessary to enhance quality.
The MGTP is structured around subsystems, their components, and their relationships. The relationships represent the links between the subsystems and, finally, the novel essence that arises from the interactions of components, structure, and relationships, whose result is superior to the isolated elements that form it [34]. Taking an integral look at the model, Figure 1 presents its structure in subsystems, whose interrelated components run from the logic of the pedagogical organization of a gamified EVEA to the direct mediation of digital technologies.
Subsystems comprise interrelated components that form a cohesive system as referred to by Bertalanffy [34]. The organization of the MGTP model into subsystems comprises the following:
Subsystem I: Determination of gamification needs. This subsystem covers the analysis of the gamification content requirement, the technological and professional diagnosis of the teaching and learning process, and the decision making required to gamify the content. It includes an analysis of the justification of the needs to implement gamification in EVEAs, and it may involve the institution and/or researchers who want to address gamified educational practices, as well as teachers. It is based on Maslow’s pyramid of human needs [35] and Leontiev’s activity theory [36], which affirm that satisfying basic needs gives rise to higher desires that can make self-actualization possible. In this subsystem, the MGTP proposes interviews and surveys that evaluate the motivation and digital competences of teachers before starting the gamification process. These instruments are found in the Supplementary Materials at the end of the document and are identified as follows: Instrument S1: semi-structured interview: motivation to gamify learning contents; Instrument S2: Survey of Digital Competence of Teachers.
This subsystem yields a diagnosis of gamification needs and of the training needs of the teachers in charge of planning and carrying out the gamified experience.
Subsystem II: Technological–pedagogical gamification projection. This subsystem focuses on the planning of the gamified teaching and learning process: clear objectives are established that determine specific resources and activities, gamification rules and guidelines are offered, and the gamified experience is designed. Planning includes the selection of appropriate tools and applications for gamification, ensuring that the activities and resources are effective and motivating for students. This planning, according to Álvarez [29], is directly linked to the methodically organized procedure of the educational process to attain a given objective. Foncubierta and Chema [37] stress the importance of adequate design to avoid boredom and ensure the success of the gamified activity. The teachers are strongly involved in this subsystem and depend on the resources and policies of the institution to make decisions. Researchers can guide the use of the model at this stage by analyzing the teachers’ decision making. The model provides teachers with a compilation of good gamification practices so that they have examples to draw on, and a document is provided with a list of possible mechanics to consider in the design of the gamified experience. As an instrument, the teacher keeps a logbook to record this process (Instrument S3: logbooks of the teacher’s work).
This subsystem yields the complete planning of the experience, including its mechanics, dynamics, aesthetics, and other components.
Subsystem III: Gamified development of the EVEA experience. In this subsystem, the implementation and evaluation of the gamified experience are carried out. Following aspects of the CoI model, four phases are considered for the development of the experience: triggering event, exploration and integration, resolution, and feedback. The initial phase is oriented to capturing students’ interest through problems or dilemmas that require resolution, which encourages critical thinking. During exploration and integration, students actively participate in activities, exchange ideas, and develop solutions. The resolution phase involves the practical application of what has been learned, while feedback focuses on constructive evaluation and continuous improvement of the educational process [31].
The evaluation phase is fundamental and stands out in the proposed model due to its significant contribution. The evaluation is performed by considering the CoI model, analyzing the social, cognitive, and teaching presences. The MGTP suggests using a validated instrument in the evaluation of the gamified experience: the CoI survey designed by Richardson et al. [38], which consists of 34 items organized by dimensions and evaluative categories. In this case, the instrument allows the interactions and perceptions of the students regarding the gamified experience in the EVEA to be analyzed according to the three presences of the CoI model. The cognitive presence dimension focuses on the students’ understanding and critical reflection on the curricular contents; the gamified activities promote inquiry and knowledge integration. Social presence, on the other hand, evaluates group cohesion, open communication, and affective expressions, considering whether gamified strategies promote interaction and collaboration among students. The teaching presence dimension analyzes the role of the teacher in facilitating discourse, organizing content, and providing feedback; here the teacher is responsible for guiding and motivating students through proactive and constructive interventions. This dimension of the evaluation makes it possible to know whether the students consider the proposals and interventions made by the teachers in the gamified proposal to be effective. This instrument is available in the Supplementary Materials at the end of the document and is identified as Instrument S4: Inquiry Community Survey. In addition, as part of the evaluation process, the MGTP considers interviewing teachers to gather their opinions as well; this instrument is available in the Supplementary Materials and is identified as Instrument S5: Assessment interview on the impact of the gamified experience. According to Bertalanffy [34], a feedback mechanism can “reactively” reach a higher state of organization (p. 156).
From this subsystem, the development and evaluation of the experience are achieved for subsequent analysis and decision making for improvement. The MGTP thus establishes a bidirectional relationship (arrows) between the components of the model’s subsystems, suggesting interdependence and/or mutual feedback between them: activities performed in one component may have an impact on the next component and vice versa, allowing continuous adjustments and improvements based on interaction and ensuring a dynamic and effective integrated approach. The circular arrow located between the subsystems represents the cyclical, continuous, and adaptive nature of the MGTP through action (active participation of the actors), reflection (evaluation and analysis of the processes), and adaptation (adjustments and improvements based on feedback), which ensures that the gamified experience in the EVEA, developed with the guidance of the proposed model, is flexible and adaptable to the various needs and contexts where it is applied, maintaining the interest and motivation of students [39].
It is important to highlight that, from the relationships between the components of each subsystem, a new quality emerges as a fundamental contribution to science. Consequently, these relationships reveal a general quality linked to the processual nature of gamification in an EVEA with regard to the determination of needs, action planning, and pedagogical and technological interaction through the use of gamification strategies (see Figure 2).

5. Expert Judgment as a First Validation of the Model

In the following, the application of the expert judgment is described first, followed by the presentation of the results and their analysis and discussion.

5.1. Application of Expert Judgment

In this work, expert judgment is used, and the Content Validity Coefficient (CVC) was chosen as the content validity method. This method determines content validity for each item and for the instrument as a whole, based on the level of agreement among the experts. Expert judgment and the CVC method are widely used in research and have many advantages over other validation procedures [40].
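Since the text does not spell out the CVC computation, the following sketch gives a commonly cited formulation of Hernández-Nieto’s coefficient; the notation (J judges rating item i as x_ij on a Likert scale with maximum V_max, n items in total) is ours and should be checked against [40].

```latex
% Sketch of a commonly cited formulation of Hernandez-Nieto's CVC.
% J judges rate item i as x_{ij} on a Likert scale with maximum V_max.
\[
  \overline{x}_i = \frac{1}{J}\sum_{j=1}^{J} x_{ij}, \qquad
  \mathrm{CVC}_i = \frac{\overline{x}_i}{V_{\max}}
\]
\[
  \mathrm{CVC}_i^{\,c} = \mathrm{CVC}_i - P_{e_i}, \qquad
  P_{e_i} = \left(\tfrac{1}{J}\right)^{J}, \qquad
  \mathrm{CVC}_{\mathrm{total}} = \frac{1}{n}\sum_{i=1}^{n} \mathrm{CVC}_i^{\,c}
\]
```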
The methodology followed for this study is shown in Table 1.
The process began with defining the object of evaluation, in this case the MGTP, and the evaluation objective, which was to validate the content and reliability of the model using a measurement instrument. Expert judges in EVEA Pedagogy and Gamification were selected based on criteria established in the biogram and the Knowledge coefficient (Kc) to conduct the assessment.
An evaluation instrument divided into four sections was developed, covering the analysis of the three subsystems of the model and opinions on the MGTP in general. This instrument was designed according to the proposal of Hernández-Nieto [40]. The assessment methodology was communicated to the experts along with detailed instructions on the assessment process, which included the following decision criteria: CVC < 0.600 = unacceptable; CVC ≥ 0.600 and ≤ 0.700 = poor; CVC > 0.71 and ≤ 0.800 = acceptable; CVC > 0.800 and ≤ 0.900 = good; and CVC > 0.900 = excellent. The evaluation methodology was explained to the experts through e-mails, which included the validation format and the link to the instrument in digital format.
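As an illustration only, the decision criteria above can be expressed as a small lookup function; the function name is hypothetical, and the handling of the narrow gap between 0.700 and 0.71 left by the text is our assumption.

```python
def interpret_cvc(cvc: float) -> str:
    """Map a CVC value to the verbal category stated in the text."""
    # The text leaves values between 0.700 and 0.71 unassigned; this sketch
    # treats anything above 0.700 and up to 0.800 as "acceptable".
    if cvc < 0.600:
        return "unacceptable"
    if cvc <= 0.700:
        return "poor"
    if cvc <= 0.800:
        return "acceptable"
    if cvc <= 0.900:
        return "good"
    return "excellent"


print(interpret_cvc(0.902))  # -> "excellent"
```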
Subsequently, the instrument was applied using a Google form, which allowed the experts to evaluate the items according to criteria such as relevance, wording, clarity, coherence, and representativeness on a Likert-type scale of 1 to 5. The instruments are available in the Supplementary Materials at the end of the document and are identified as follows: Instrument S6: MGTP validation format; Input S1: Cards of the validation of the scores obtained by the experts. The process was continuously monitored to ensure the active participation of the experts and to resolve any doubts they might have. Once the evaluation was completed, the CVC was calculated for each item and for the instrument in general, and the reliability of the instrument was calculated by means of Cronbach’s Alpha Coefficient. Finally, the results were communicated to the experts, who were informed of the entire process and thanked for their participation.
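The following minimal Python sketch shows how figures of this kind can be computed: the corrected per-item CVC, the overall CVC, and Cronbach’s alpha, using a small, entirely hypothetical experts-by-items rating matrix (the study’s actual ratings and number of experts are not reproduced here).

```python
import numpy as np

# Hypothetical ratings: rows = expert judges, columns = instrument items,
# scored on the 1-5 Likert scale described above. Illustrative values only.
ratings = np.array([
    [5, 5, 5, 4, 5, 5],
    [4, 4, 5, 4, 4, 4],
    [5, 5, 5, 5, 5, 5],
    [4, 4, 4, 3, 4, 4],
    [5, 4, 5, 4, 5, 5],
])
J, n_items = ratings.shape   # number of judges, number of items
V_MAX = 5                    # maximum value of the rating scale

# Hernandez-Nieto CVC: mean item rating divided by the scale maximum,
# corrected by subtracting the chance error Pe = (1/J)**J for each item.
cvc_items = ratings.mean(axis=0) / V_MAX
pe = (1.0 / J) ** J
cvc_corrected = cvc_items - pe
cvc_overall = cvc_corrected.mean()

# Cronbach's alpha for the same ratings matrix (items as columns).
item_variances = ratings.var(axis=0, ddof=1)
total_variance = ratings.sum(axis=1).var(ddof=1)
alpha = (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

print("Corrected CVC per item:", np.round(cvc_corrected, 3))
print(f"Overall CVC: {cvc_overall:.3f}")
print(f"Cronbach's alpha: {alpha:.3f}")
```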

5.2. Results and Discussion

The results of applying the evaluation methods described in the previous section to the model are presented below. The selected evaluation methods (expert judgment, the Content Validity Coefficient (CVC), Cronbach’s Alpha Coefficient, and the split-half test) are justified, and the methodological process followed is described. The expert judgment resulted in an overall CVC of 0.902, indicating high validity and agreement among the experts according to the Hernández-Nieto [40] scale of values. All items assessed obtained CVC values above 0.8, indicating that the instrument has an adequate relationship between the items, components, and subsystems of the MGTP. In addition, an overall CVC of 0.949 was obtained for the instrument, indicating excellent validity and agreement. Therefore, the results of the validation process for each of the components and subsystem relationships of the MGTP yielded significant values of relevance, wording, clarity, coherence, and representativeness for the proposed items.
Regarding the qualitative assessment, the experts contributed the following improvements: the learning objectives should be clearly defined, aligned with the social task, and related to the contents of the subject. This echoes Vygotsky [45], who emphasizes that learning occurs through social interaction in a given environment and with the contents of the culture offered in the curriculum.
In addition, the experts proposed that the learning process should begin with a triggering event, such as a problem or dilemma, that captures the interest of the students and encourages critical thinking through communication between the actors in the educational process; the theory of presences of Garrison et al. [32] likewise indicates that the student interacts with others within learning communities to build higher-order knowledge [33,46]. The rationale for the quality among the components of Subsystem III was also modified to include the student’s reality, and an interview was added to assess the impact of the gamified experience from the teacher’s point of view. It should be clarified that the model presented in Section 4 already includes the improvements indicated by the experts.
Although the experts did not consider it necessary to include additional subsystems or components for the MGTP, they provided valuable qualitative input that enriched its content and led to the design of an additional instrument to assess the impact of the gamified experience from the teacher’s perspective.
Cronbach’s Alpha Coefficient (0.983) and the split-half test (Cronbach’s alpha above 0.9 for each half) confirmed the reliability of the instrument, reflecting very satisfactory internal consistency. This indicates that the instrument was well understood by the experts and can be considered for application in HEIs.
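A minimal sketch of how reliability checks of this kind can be computed is shown below, assuming a hypothetical experts-by-items rating matrix and an odd/even split of the items; neither the real data nor the exact split used by the authors is reproduced.

```python
import numpy as np


def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Hypothetical expert ratings (rows = experts, columns = instrument items).
ratings = np.array([
    [5, 5, 5, 4, 5, 5, 4, 5],
    [4, 4, 5, 4, 4, 4, 4, 4],
    [5, 5, 5, 5, 5, 5, 5, 5],
    [4, 4, 4, 3, 4, 4, 3, 4],
    [5, 4, 5, 4, 5, 5, 4, 5],
])

# Split the items into two halves (odd/even positions) and compute
# Cronbach's alpha for each half; the study reports alpha above 0.9 per half,
# though its exact item split is not specified here.
first_half = ratings[:, 0::2]
second_half = ratings[:, 1::2]
alpha_full = cronbach_alpha(ratings)
alpha_h1 = cronbach_alpha(first_half)
alpha_h2 = cronbach_alpha(second_half)

# Spearman-Brown corrected split-half coefficient from the correlation
# between the two half scores (a common companion statistic).
r = np.corrcoef(first_half.sum(axis=1), second_half.sum(axis=1))[0, 1]
spearman_brown = 2 * r / (1 + r)

print(f"alpha (full): {alpha_full:.3f}, halves: {alpha_h1:.3f} / {alpha_h2:.3f}")
print(f"Spearman-Brown split-half: {spearman_brown:.3f}")
```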
In summary, the MGTP was favorably evaluated by the experts, and opportunities for improvement were also found. After analyzing the experts’ opinions, modifications were made to the wording (of components and relationships) and to the qualitative characteristics (the assessment interview on the impact of the gamified experience), but not to the structural design (the number of components and subsystems). For more detail, Figure 3 shows the original aspects of the MGTP in white and the modifications made after the expert judgment in gray.
The MGTP is presented as a promising tool for the integration of gamification in EVEA. Its validation through expert judgment and reliability testing supports its robustness and applicability. The incorporation of pedagogical and technological elements, together with assessment based on the CoI model, ensures a comprehensive approach to gamification in higher education. The results of this study open new possibilities for research and teaching practice and lay the groundwork for future research exploring the impact of MGTP in different educational contexts and disciplines. It is important to note that the MGTP has been used in two case studies, the results of which are being analyzed and will be the subject of a forthcoming publication.

6. Conclusions

The MGTP is presented as an innovative and promising tool to enrich the educational experience in virtual environments. Its design, validated by experts and supported by pedagogical and technological theories, offers a comprehensive guide for teachers and institutions seeking to implement gamification effectively [47]. The MGTP not only ensures a structured process but also promotes continuous improvement through constant feedback and analysis of social, cognitive, and teaching presences.
The model proposal is supported by the results obtained from expert judgment and reliability tests, demonstrating its validity and applicability in real contexts. The integration of technological tools and resources, coupled with evaluation based on the CoI model, ensures an experience that considers both technological and pedagogical aspects, incorporating the best practices identified in the literature. The MGTP not only contributes to educational innovation but also lays the groundwork for future research and developments in the field of gamification in higher education.
During the research development, a methodological gap was identified regarding the lack of studies utilizing models that integrate quantitative and qualitative approaches to comprehensively understand the effects of gamification on EVEA teaching, learning processes, and motivation. The next steps of this research will focus on evaluating the implementation results of the MGTP across diverse educational contexts, encompassing various academic levels, disciplines, and teaching modalities. The study will assess its impact from both student and teacher perspectives.

Supplementary Materials

The following support information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14080907/s1, Instrument S1: Semi-structured interview: Motivation to gamify learning content; Instrument S2: Survey of Teaching Digital Competence; Instrument S3: Teaching work logs—subsystem II and III; Instrument S4: Inquiry Community Survey; Instrument S5: Assessment interview on the impact of the gamified experience; Instrument S6: MGTP validation format; Input S1: Cards of the validation of the scores obtained by the experts. Reference [48] is cited in the Supplementary Materials.

Author Contributions

Conceptualization, T.C.-R.; Methodology, G.V.-M. and T.C.-R.; Software, S.B.; Validation, G.V.-M.; Formal analysis, G.V.-M. and T.C.-R.; Investigation, G.V.-M., C.V.S. and S.B.; Resources, C.V.S.; Data curation, C.V.S.; Writing—original draft, S.B.; Writing—review & editing, C.V.S.; Visualization, S.B.; Supervision, T.C.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the projects PID2022-136779OB-C31 of the Spanish Government, S49-23R and T60-20R of the Aragon regional Government, PIIDUZ2023_5024 of the University of Zaragoza and 11/F023 of III LIDI of the National University of La Plata.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this paper are publicly available in the project repository: https://bit.ly/3Ajpmnn (accessed on 15 July 2024).

Acknowledgments

We would like to thank the teachers and students of the Technical University of Babahoyo who participated in this study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

  1. Merellano-Navarro, E.; Almonacid-Fierro, A.; Moreno-Doña, A.; Castro-Jaque, C. Buenos docentes universitarios: ¿Qué dicen los estudiantes? Educ. Pesqui. 2016, 42, 937–952. [Google Scholar] [CrossRef]
  2. Contreras, R.; Eguia, J.L. (Eds.) Gamificación en las Aulas Universitarias; Instituto de la Comunicació Universitat Autónoma de Barcelona: Barcelona, Spain, 2016. [Google Scholar]
  3. Khaldi, A.; Bouzidi, R.; Nader, F. Gamification of e-learning in higher education: A systematic literature review. Smart Learn. Environ. 2023, 10, 10. [Google Scholar] [CrossRef]
  4. Cabalín, D.; Navarro, N.; Zamora, J.; Martín, S.S. Concepción de Estudiantes y Docentes del Buen Profesor Universitario: Facultad de Medicina de la Universidad de La Frontera. Int. J. Morphol. 2010, 28, 283–290. [Google Scholar] [CrossRef]
  5. Saleem, A.N.; Noori, N.M.; Ozdamli, F. Gamification Applications in E-learning: A Literature Review. Technol. Knowl. Learn. 2022, 27, 139–159. [Google Scholar] [CrossRef]
  6. Kalogiannakis, M.; Papadakis, S.; Zourmpakis, A.I. Gamification in science education. A systematic review of the literature. Educ. Sci. 2021, 11, 22. [Google Scholar] [CrossRef]
  7. Tuparov, G.; Keremedchiev, D.; Tuparova, D.; Stoyanova, M. Gamification and educational computer games in open-source learning management systems as a part of assessment. In Proceedings of the 2018 17th International Conference on Information Technology Based Higher Education and Training (ITHET), Olhao, Portugal, 26–28 April 2018; pp. 1–5. [Google Scholar] [CrossRef]
  8. Aguilos, V.; Fuchs, K. The Perceived Usefulness of Gamified E-Learning: A Study of Undergraduate Students with Implications for Higher Education. Front. Educ. 2022, 7, 945536. [Google Scholar] [CrossRef]
  9. Ramírez-Montoya, M.S.; Valenzuela, J.R. Innovación Educativa: Tendencias Globales de Investigación e Implicaciones Prácticas; Octaedro Publisher: Barcelona, Spain, 2019. [Google Scholar]
  10. Zahedi, L.; Batten, J.; Ross, M.; Potvin, G.; Damas, S.; Clarke, P.; Davis, D. Gamification in education: A mixed-methods study of gender on computer science students’ academic performance and identity development. J. Comput. High. Educ. 2021, 33, 441–474. [Google Scholar] [CrossRef]
  11. Deterding, S.; Dixon, D.; Khaled, R.; Nacke, L. From Game Design Elements to Gamefulness: Defining ‘Gamification’. In Proceedings of the MindTrek ’11 Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, Tampere, Finland, 28–30 September 2011; pp. 28–30. [Google Scholar] [CrossRef]
  12. Chans, G.M.; Castro, M.P. Gamification as a strategy to increase motivation and engagement in higher education chemistry students. Computers 2021, 10, 132. [Google Scholar] [CrossRef]
  13. Teixes, F. Gamificación: Fundamentos y Aplicaciones, 1st ed.; UOC: Barcelona, Spain, 2014. [Google Scholar]
  14. Hunicke, R.; LeBlanc, M.; Zubek, R. MDA: A Formal Approach to Game Design and Game Research. Proc. AAAI Workshop Chall. Game AI 2004, 4, 1722. [Google Scholar]
  15. Werbach, K.; Hunter, D. How Game Thinking Can Revolutionize Your Business; Wharton Digital Press: Philadelphia, PA, USA, 2012. [Google Scholar]
  16. Pomata, J.; Díaz, J. TIC y Gamificación en la Enseñanza de Español como Lengua Extranjera: Situación y Líneas de Actuación para las Universidades Japonesas. Cuadernos CANELA: Revista anual de Literatura, Pensamiento e Historia, Metodología de la Enseñanza del Español como Lengua Extranjera y Lingüística de la Confederación Académica Nipona, Española y Latinoamericana. 2017, pp. 79–101. Available online: https://www.researchgate.net/publication/326698094_TIC_y_gamificacion_en_la_ensenanza_de_espanol_como_lengua_extranjera_situacion_y_lineas_de_actuacion_para_las_universidades_japonesas (accessed on 10 March 2024).
  17. Teixes, F. Gamificación: Motivar Jugando; Primera ed.; UOC: Barcelona, Spain, 2015. [Google Scholar]
  18. Garrison, R.; Anderson, T.; Archer, W. Critical inquiry in a text-based environment. Comput. Conf. High. Educ. 2000, 16, 6–12. [Google Scholar] [CrossRef]
  19. Pardo, A.S. Formalización de un modelo de formación online basado en el factor humano y la presencia docente mediante un lenguaje de patrón. In Doctorado Formación en la Sociedad del Conocimiento; Universidad de Salamanca: Salamanca, Spain, 2014. [Google Scholar]
  20. Mese, C.; Dursun, O.O. Effectiveness of gamification elements in blended learning environments. Turk. Online J. Distance Educ. 2019, 20, 119–142. [Google Scholar] [CrossRef]
  21. Papanikolaou, K.; Tzelepi, M.; Moundridou, M.; Petroulis, I. Employing Social Network Analysis to enhance community learning. Lect. Notes Comput. Sci. 2020, 12149, 342–352. [Google Scholar] [CrossRef]
  22. Tzelepi, M.; Makri, K.; Petroulis, I.; Moundridou, M.; Papanikolaou, K. Gamification in online discussions: How do game elements affect critical thinking? In Proceedings of the 2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT), Tartu, Estonia, 6–9 July 2020; pp. 92–94. [Google Scholar] [CrossRef]
  23. Antonaci, A.; Klemke, R.; Lataster, J.; Kreijns, K.; Specht, M. Gamification of MOOCs Adopting Social Presence and Sense of Community to Increase User’s Engagement: An Experimental Study. In Proceedings of the European Conference on Technology Enhanced Learning, Delft, The Netherlands, 16–19 September 2019; Springer International Publishing: Cham, Switzerland, 2019; Volume 11722. [Google Scholar] [CrossRef]
  24. Petroulis, I.; Tzelepi, M.; Papanikolaou, K. On the design of Gamification Elements in Moodle Courses. In Proceedings of the 8th International Conference, GALA 2019, Athens, Greece, 27–29 November 2019; Springer International Publishing: Cham, Switzerland, 2019; Volume 11899. [Google Scholar] [CrossRef]
  25. Utomo, A.Y.; Santoso, H.B. Development of gamification-enriched pedagogical agent for e-learning system based on community of inquiry. In Proceedings of the CHIuXiD ’15: Proceedings of the International HCI and UX Conference in Indonesia, Bandung, Indonesia, 8–10 April 2015; pp. 1–9. [Google Scholar] [CrossRef]
  26. Utomo, A.Y.; Amriani, A.; Aji, A.F.; Wahidah, F.R.N.; Junus, K.M. Gamified E-learning Model Based on Community of Inquiry. In Proceedings of the 2014 International Conference on Advanced Computer Science and Information System, Jakarta, Indonesia, 18–19 October 2014; pp. 474–480. [Google Scholar] [CrossRef]
  27. Vera-Mora, G.; Sanz, C.; Baldassarri, S.; Coma, T. Entornos virtuales de enseñanza y aprendizaje gamificados a la luz del concepto de presencia: Revisión sistemática de literatura. Rev. Iberoam. Tecnol. Educ. Educ. Tecnol. 2023, 33, 25–35. [Google Scholar] [CrossRef]
  28. Tzelepi, M.; Petroulis, I.; Papanikolaou, K. Investigating gamification and learning analytics tools for promoting and measuring communities of inquiry in moodle courses. Adv. Intell. Syst. Comput. 2020, 1007, 89–96. [Google Scholar] [CrossRef]
  29. Pla, R. Una Concepción de la Pedagogía como Ciencia Desde el Enfoque Histórico Cultural. Material en Soporte Digital, Centro de Estudios e Investigación “José Martí”. UCP “Manuel Ascunce Domenech”. Ciego de Ávila. Estudio Sobre el discurso II. 2010, p. 79. Available online: https://profesorailianartiles.files.wordpress.com/2013/03/libro-de-pedagogc3ada.pdf (accessed on 10 March 2024).
  30. Medrano, J. Universidad de Ingeniería y Tecnología—UTEC. Febrero. Available online: https://utec.edu.pe/ (accessed on 10 March 2024).
  31. Garrison, D.; Anderson, T. El e-Learning en el Siglo XXI. Investigación Práctica, 1st ed.; Octaedro Publisher: Barcelona, Spain, 2010. [Google Scholar]
  32. Garrison, D.; Anderson, T.; Archer, W. Critical thinking, cognitive presence, and computer conferencing in distance education. Am. J. Distance 2001, 15, 7–23. [Google Scholar] [CrossRef]
  33. Alenezi, M. Digital Learning and Digital Institution in Higher Education. Educ. Sci. 2023, 13, 88. [Google Scholar] [CrossRef]
  34. von Bertalanffy, L. General System Theory; George Braziller, Inc.: New York, NY, USA, 1980. [Google Scholar]
  35. Maslow, A.H. Motivation and Personality; Harper & Row: New York, NY, USA, 1954. [Google Scholar]
  36. Leontev, A.N. Activity, Consciousness, and Personality; Prentice-Hall Publisher: London, UK, 1978. [Google Scholar]
  37. Foncubierta, J.; Chema, R. Didáctica de la gamificación en la clase de español. Edinumen 2014, 1–8. Available online: http://eleinternacional.com/wp-content/uploads/2017/10/Didactica_Gamificacion_ELE.pdf (accessed on 10 March 2024).
  38. Richardson, J.C.; Cleveland-Innes, M.; Ice, P.; Swan, K.P.; Garrison, D.R. Using the Community of Inquiry Framework to Inform Effective Instructional Design. In The Next Generation of Distance Education: Unconstrained Learning; Springer: Boston, MA, USA, 2012; pp. 1–266. ISBN 978-1-4614-1785-9. [Google Scholar] [CrossRef]
  39. Tzavara, A.; Lavidas, K.; Komis, V.; Misirli, A.; Karalis, T.; Papadakis, S. Using Personal Learning Environments before, during and after the Pandemic: The Case of ‘e-Me’. Educ. Sci. 2023, 13, 87. [Google Scholar] [CrossRef]
  40. Hernández-Nieto, R. Instrumentos de Recolección de Datos en Ciencias Sociales y Ciencias Biomédicas: Validez y Confiabilidad. In Diseño y Construcción; Normas y Formatos: Mérida, Venezuela, 2011. [Google Scholar]
  41. Cabero, J.; Llorente, M.d.C. La Aplicación del Juicio de Experto como Técnica de Evaluación de las Tecnologías de la Información y Comunicación (TIC) The expert’s judgment application as a technic evaluate Information and Communication Technology (ICT). Eduweb. Rev. Tecnol. Inf. Comun. Educ. 2013, 7, 11–22. [Google Scholar]
  42. Cabero, J.; Barroso, J. La utilización del juicio de experto para la evaluación de TIC: El coeficiente de competencia experta. Bordon. Rev. Pedagog. 2013, 65, 25–38. [Google Scholar] [CrossRef]
  43. Escobar-Pérez, J.; Cuervo-Martínez, Á. Validez de contenido y juicio de expertos: Una aproximación a su utilización. Av. Medición 2008, 6, 27–36. [Google Scholar]
  44. García, L.; Fernández, S.J. Procedimiento de aplicación del trabajo creativo en grupo de expertos. Ing. Energ. 2008, 29, 46–50. (In Spanish) [Google Scholar]
  45. Vygotskiĭ, L.S.; Kozulin, A. Thought and Language; MIT Press: Cambridge, MA, USA, 1986. [Google Scholar]
  46. Valencia-Arias, A.; Cartagena Rendón, C.; Palacios-Moya, L.; Benjumea-Arias, M.; Pelaez Cavero, J.B.; Moreno-López, G.; Gallegos-Ruiz, A.L. Model Proposal for Service Quality Assessment of Higher Education: Evidence from a Developing Country. Educ. Sci. 2023, 13, 83. [Google Scholar] [CrossRef]
  47. Gutiérrez-Castillo, J.J.; Palacios-Rodríguez, A.; Martín-Párraga, L.; Serrano-Hidalgo, M. Development of Digital Teaching Competence: Pilot Experience and Validation through Expert Judgment. Educ. Sci. 2023, 13, 52. [Google Scholar] [CrossRef]
  48. Available online: https://www.revistadepedagogia.org/cgi/viewcontent.cgi?article=3452&context=rep (accessed on 10 March 2024).
Figure 1. Technological–Pedagogical Gamification Model (MGTP).
Figure 2. Relationships and qualities of the MGTP.
Figure 3. Initial version of aspects in the MGTP and additions after expert judgment. Source to Component I—Subsystem III [31].
Table 1. Description of methodological steps applied in the expert judgment.

1. Define the object to be evaluated: Technological–Pedagogical Gamification Model for a Virtual Teaching and Learning Environment (MGTP).
2. Define the objective of the evaluation by expert judgment: Validate the content and reliability of the MGTP through the measurement instrument.
3. Define the method to be used in the expert judgment: Content Validity Coefficient (CVC) by Hernández-Nieto [40].
4. Select the experts involved in the process (Steps 4 and 5 can be performed in the reverse order, if applicable): Selection of experts in EVEA Pedagogy and Gamification through the biogram and knowledge coefficient.
5. Elaborate the evaluation instrument: The instrument consisted of four sections (Section 1: Analysis of Subsystem I; Section 2: Analysis of Subsystem II; Section 3: Analysis of Subsystem III; Section 4: Opinion on the adequacy of the MGTP).
6. Communicate to the experts the evaluation methodology to be followed: Communication process and methodological follow-up of the expert judgment.
7. Apply the instrument: E-mail sent to the experts for the development of the validation process by expert judgment.
8. Carry out the general follow-up of the process: Communication process and methodological follow-up of the expert judgment.
9. Calculate content validity and interpret the results: Procedure to calculate the CVC for each item and for the general instrument, and interpretation of the CVC based on the scale of values established by Hernández-Nieto [40].
10. Conduct a qualitative review of the items: Verification of the concordance of the items for elimination, modification, or approval; review of comments or improvements suggested by the experts.
11. Calculate the reliability of the instrument: Procedure to calculate the reliability of the results obtained from the pilot test by means of Cronbach’s Alpha Coefficient and a test of two halves.
12. Communicate the results of the evaluation to the participating experts: Once the application and analysis of the results have been completed, the experts are informed of the entire process.

Adapted from [41,42,43,44].
