Article

Student Teachers’ Perceptions of a Game-Based Exam in the Genial.ly App

by
Elina Gravelsina
and
Linda Daniela
*
Faculty of Education Sciences and Psychology, University of Latvia, LV-1586 Riga, Latvia
*
Author to whom correspondence should be addressed.
Computers 2024, 13(8), 207; https://doi.org/10.3390/computers13080207
Submission received: 9 June 2024 / Revised: 3 August 2024 / Accepted: 12 August 2024 / Published: 19 August 2024
(This article belongs to the Special Issue Smart Learning Environments)

Abstract

This research examines student teachers’ perceptions of a game-based exam conducted in the Genial.ly app in the study course “Legal Aspects of the Pedagogical Process”. This study aims to find out the pros and cons of game-based exams and understand which digital solutions can enable the development and analysis of digital game data. At the beginning of the course, students were introduced to the research and asked to provide feedback throughout the course on what they saw as the most important aspects of each class and insights on how they envisioned the game-based exam could proceed. The game-based exam was built using the digital platform Genial.ly after its update, which provided the possibility to include open-ended questions and collect data for analyses. It was designed with a narrative in which a new teacher comes to a school and is asked for help in different situations. After reading a description of each situation, the students answered questions about how they would resolve them based on Latvia’s regulations. After the exam, students wrote feedback indicating that the game-based exam helped them visualize the situations presented, resulting in lower stress levels compared to a traditional exam. This research was structured based on design-based principles, and the data were analyzed from the perspective of how educators can use freely available solutions to develop game-based exams to test students’ knowledge gained during a course. The results show that Genial.ly can be used as an examination tool, as indicated by student teachers’ positive responses. However, this research has limitations as it was conducted with only one test group due to administrative reasons. Future research could address this by including multiple groups within the same course as well as testing game-based exams in other subject courses for comparison.

1. Introduction

In the ever-evolving landscape of education, game-based learning has emerged as a transformative approach, offering a dynamic pedagogical alternative that not only engages students but also fosters deeper understanding and critical thinking skills. In educational research and practice, various terms describe the use of game elements to enhance learning. These terms include “educational games”, “game-based learning”, “serious games”, “gamification”, and “play-based learning”. While they might sound similar, each has its specific focus and application [1,2,3]. In this article, the term “game-based learning” will be used. It refers to a method where educators use tools like board games, escape games, or digital games designed with a clear educational strategy and specific learning goals in mind [4,5]. It is important to note that this is not just about making learning fun: the design process is goal-oriented, aiming for a balance between educational challenge and enjoyment [6], all while the educator guides and monitors the process [6,7]. Incorporating game-based learning also aligns with modern pedagogical trends that emphasize active learning and student-centered approaches [8,9].
Beyond the delivery of knowledge, game-based learning’s potential also lies in its capacity for assessment. When designed with precision, games can evaluate students’ understanding, track their progress, and pinpoint areas needing attention [10,11]. Ifenthaler et al. [12] suggest that adding assessment features to game-based learning is still a developing concept, mainly because such tools take much longer to design. Platforms like Kahoot, Quizlet, and Nearpod have harnessed this dual role, transforming traditional assessments into interactive experiences [13,14,15]. Although these types of assessment are common—as seen from a scoping review of 200 articles (detailed further in this paper)—only one article [16] offered insight into using game-based learning for final course summative assessments. This lack of research highlights the need for further studies in this field. To identify research where final course assessments in higher education are specifically designed as games, the authors suggest using the term “game-based exam”.
Within this framework, the “Legal Aspects of the Pedagogical Process” course embarked on an innovative journey, leveraging Genial.ly as a tool for a summative game-based exam to assess knowledge gained during the course. The objective of this research is to find out how educators can develop game-based exams incorporating principles of serious games analytics using available digital solutions and to ensure that knowledge gained during the course is tested. Genial.ly, a platform renowned for its interactive content creation capabilities [17], was chosen not just for its visual appeal but also for its recent advancements in facilitating assessment opportunities in the Genial.ly “Master” plan. This case study tests available solutions for educators who lack specific technological competencies and where no specific game-based programs are available to test students’ knowledge. This research is important at all levels of education, where innovative assessment methods can significantly impact student learning and engagement. In higher education, regular exams are a critical part of each course, and exploring alternative assessment methods like game-based exams can provide valuable insights into improving the examination process. Moreover, this study focuses on student teachers who will eventually transfer their knowledge and practices to their own students. By engaging with game-based exams, these future educators can adopt and possibly continue such innovative practices in their teaching careers. This multiplier effect could lead to broader implementation of game-based assessments, enhancing learning experiences across various educational settings. Design-based research conducts a comprehensive analysis of this initiative, and comments on the advantages and potential challenges of using Genial.ly for game-based exams.

2. State of the Art

This section discusses how game-based learning enhances the learning process and the role of games in assessment. Across the many articles defining game-based learning, one recurring explanation stood out: game-based learning is positioned as the opposite of traditional techniques, methods, and conventional instruction [5,8,18]. In this sense, “traditional” refers to the teacher as the primary source of knowledge and lessons most often being delivered through lectures. Game-based learning, on the other hand, represents a shift toward a more student-centered approach. Instead of passively absorbing information, students are actively engaged in the learning process. Coleman and Money [8] state that student-centered learning is a meaningful keyword for game-based learning. It can be observed that student-centered learning is not only an outcome of game-based learning but also the common keyword that links it with active learning strategies.
Games, by their very nature, are interactive and require players to make decisions, solve problems, and think critically. The purpose of using game-based learning comes down to the educator and how they support reaching set goals. Mostly, it is used to deliver knowledge and skills to learners [5,8,17]; however, when the game’s design is carefully planned, it can do more than just teach. For instance, students’ understanding can also be assessed when proper mechanics are integrated within the game [10,11,12,19]. While students “play”, the game can track their progress, and the game outcome highlights what they have mastered and points out areas where the educator might need to help them.
This dual role of teaching and assessing makes game-based learning a valuable tool in modern education. Some games and online platforms that students and teachers prefer to use, like Kahoot, Quizlet, Quizizz, and Nearpod [13,14,15,20], have specially designed quiz games that transform traditional quizzes and lecture formats into dynamic sessions where students are actively involved, motivated, and challenged. Although these platforms are rated highly by the previously mentioned sources, there are still underexamined aspects, such as inclusion, stress levels, and the actual knowledge revealed. In “The Views of the Teacher Candidates on the Use of Kahoot as a Gaming Tool”, Uzunboylu and colleagues [21] examined Kahoot and identified some negative aspects, such as the application’s dependency on the internet and the practice of letting students choose their own account names, which could lead to problems such as unethical usernames. However, they provide generally positive views of the platform.
While platforms like Kahoot offer assessment capabilities, they may not match the interactive potential found in Genial.ly. Several articles have highlighted the positive learning experiences facilitated by Genial.ly’s digital tools [17,22,23,24]. However, it is worth noting that very few of these, if any, delve into the negative aspects, consider potential risks associated with their use, or analyze their potential for use in course exams.
There are many aspects to consider when creating a game-based learning experience, including educational content, game design, technical issues, and resource management. One of the most common problems is that the game mechanics and design are not well aligned with the educational objectives [25,26,27]. Another often overlooked issue is the pressure on teachers who create these materials. This process combines all the aforementioned aspects and can lead to teacher burnout [28]. When designing game-based exams with the intention of testing them as a potential tool for teachers, these aspects need to be taken into consideration.
Given the intention to use this digital platform as an examination tool, it was important to understand the possibilities of designing it for summative assessment. Digital game-based learning goes beyond traditional assessment methods to create immersive learning experiences that actively engage students. Specifically, game-based learning is designed to assist learners in addressing and overcoming potential obstacles. These challenges may include feelings of anxiety, diminished motivation, comprehension difficulties, and reduced engagement [10,29]. Furthermore, game-based learning aims to enhance the level of interaction, thereby offering substantial educational advantages and fostering a more conducive learning environment.
While many articles provide insight into using game-based learning for formative assessment, little research has examined its use for the summative assessments that serve as university course exams. We conducted a scoping review in the Web of Science database using the keywords “game-based examination” or “exam”. This search included articles, book chapters, and proceedings papers, applying filters for the English language and research fields such as computer sciences, education, and social sciences. The articles were gathered from 2020 until the summer of 2024. The 5189 search results were sorted by relevance, and the first 100 articles were examined to gather insights and context for this study. Ninety-seven articles used the word “examination” to evaluate methods or, in other contexts, to integrate game-based learning elements into the instructional phase of the course, rather than as a synonym for the summative assessment that happens at the end of a course or program. This is mainly performed to prepare students for an exam or to use an exam to test whether game-based learning works. In only three articles was the final “exam” or “examination” turned into a game or built with gamified solutions. One of these was conducted in a real-life escape room [30], and another did not use game-based learning but rather separate game elements such as a point system [31], so they are not relevant to this study. The only relevant article on game-based learning exams among the first hundred found on Web of Science was “Digital game-based examination for sensor placement in context of an Industry 4.0 lecture using the Unity 3D engine—a case study” by Julian Koch, Martin Gomse, and Thorsten Schüppstuhl [16]. With the same filters, similar findings were identified using the keywords “game-based assessment” and “summative assessment”. One article on a real-life escape room was included [32]; otherwise, of the 293 results, the hundred most relevant articles were examined, and no other inquiries relevant to this study were found.
The research by J. Koch, M. Gomse, and T. Schüppstuhl [16] highlighted the practical implementation of a game-based examination built with the complex Unity 3D engine. Although this is contrary to the idea behind the present research—that course teachers with no prior knowledge of coding can make their own game—the article provides an example relevant to the research topic. Three main conclusions can be drawn from it. First, a digital game-based examination can be an effective tool for assessing complex problem-solving skills and practical knowledge in a more realistic, engaging manner. Second, with its interactive and immersive design, the game allowed students to demonstrate their understanding in a more applied context compared to traditional exams. Third, the gamified environment was found to be more stimulating and enjoyable, with students showing high levels of engagement and motivation while participating in the game-based examination.
Overall, the findings suggest that while game-based learning is frequently used to enhance the instructional process, and the words “examination” and “assessment” are commonly used to refer to the evaluation of a method or its elements and to formative assessment, they are not synonymous with a summative assessment conducted at the end of a course or program. Formative assessments measure short learning periods, whereas summative assessments, and course exams in particular, must provide an overview of the whole course. To differentiate between these two types of game-based assessments, in this research, the authors propose the term “game-based exam” to refer to the process whereby the final course examination or summative assessment is made into an educational game.
The efficacy of technological tools in assessing educational outcomes remains a topic of debate. In game-based learning experiences, there are three predominant methods to evaluate learning data: analyzing in-game scores, utilizing external assessments, and incorporating internal game assessments [12]. A recent advancement in this field is the introduction of “learning analytics”, also known as “serious games analytics” [33,34]. This approach adapts the game environment in response to the player’s performance. It not only considers player-specific data such as demographics and prior knowledge but also examines in-game behaviors, including duration of play, encountered challenges, and achieved objectives that can be obtained from the game and used to improve personalized learning [34,35].
Considering these factors, this study seeks not only to explore the technical aspects of employing Genial.ly as a tool for game-based exams but also to grasp the viewpoints of both students and educators on this matter.

3. Research Design

Building on the foundational understanding of game-based learning and its transformative potential in education, an opportunity arose to change the traditional learning process. Given the multitude of benefits and firsthand experience with game-based learning, the chance to adapt it to a university course presented itself. This section lays out the process and results obtained while designing and implementing a game-based learning course exam in the Genial.ly app.
The approach of using a game-based exam to test students’ knowledge gained through lectures is new for higher education in Latvia. Another novelty is that the game was developed by educators without specific competencies in coding, and the research data provide insight into how freely available tools can be incorporated into the study process to test the knowledge gained, as most previous research has been conducted on ready-made games or simple quizzes that cannot be used for final exams. A design-based research strategy was employed to collect and evaluate the data on the effectiveness of game-based exams in testing the results of students’ learning during the course. Design-based research is an empirical methodology that focuses on the iterative design, testing, and refinement of interventions in real-world settings. It bridges the gap between theoretical research and practice by emphasizing collaboration between researchers and practitioners [36]. For this article, employing design-based research is particularly advantageous as it allows for the systematic study and improvement of a game-based exam for the course “Legal Aspects of the Pedagogical Process”. By integrating a design-based research strategy, this research not only captures the effectiveness of the intervention but also contributes to the broader theoretical understanding, ensures that the game-based exam is both pedagogically sound and practically relevant for knowledge testing, and ensures it is accessible for educators without specific competencies in coding who wish to develop game-based solutions for their teaching process. In a similar piece of research, Sidonie Pors [37] also used design-based research and suggested three design stages: pre-implementation, implementation, and post-implementation. For a better overview of this design-based research, please see Figure 1.
For this research, the digital game-based knowledge-testing experience was designed in Genial.ly, a digital platform and tool that allows users to create interactive and animated content, such as presentations, infographics, reports, quizzes, and more. Customization options enable users to design visually engaging content without the need for coding. Positively reviewed by other authors [17], and successfully implemented in other projects, such as those of the University of Latvia’s Faculty of Education Sciences and Psychology, Genial.ly was chosen for its capability to deliver a rich visual experience and facilitate knowledge testing for students. While Genial.ly is an innovative tool for creating interactive content, it lacks comprehensive capabilities for serious games analytics. However, it does provide some basic data on user interactions, such as click-through rates, time spent in the game, and overall engagement metrics.
Consequently, an innovative approach was initiated to develop an exam in the form of a game for the “Legal Aspects of the Pedagogical Process” course. This initiative aimed to assess students’ comprehension of the subject matter and offer an engaging, interactive, and immersive examination experience. This approach provides a firsthand experience of situations that potential teachers may encounter. “Legal Aspects of the Pedagogical Process” is a course that student teachers take in the second semester of their four-year degree. It provides an overview of the Latvian and European regulations concerning compulsory school students, their parents, and teachers. The course was conducted online in real time, allowing immediate interaction and engagement. Additionally, sessions were recorded, providing students with the opportunity to revisit and review content at their convenience. Each lecture was supplemented with discussions and mid-lecture interactive questions in the platform Mentimeter, and students were prompted to share their reflections after every session.
On a couple of occasions during the course, students were asked to offer insights into how they envisioned the presented information being transformed into a game. In the middle of the semester, students were presented with a first glimpse of the game-based learning experience, and adaptations were made based on their feedback. There were 89 students in the course, but only 65 took the final test, and 61 of them filled out the evaluation questionnaire. The participants were full-time student teachers enrolled in a four-year program. The course is offered in the second semester of the first year, and all group members are student teachers specializing in various subjects. By the end of their studies, they receive two qualifications, which they choose at the start of the program. In any academic year, the number of active participants fluctuates because existing students may take academic leave, drop out, switch to part-time programs, or transfer to other programs. “Legal Aspects of the Pedagogical Process” is a mandatory course for all student teachers, regardless of their specialization.
The exam was conducted in person in computer labs, supervised by the course professor. Organized as a Genial.ly game, students logged in with their credentials to participate in an online quiz and role-play scenarios designed to assess their knowledge of various real-life situations related to the course material. The results of this exam contributed 30% toward the students’ final grade for course completion.
The pre-implementation phase included analyzing Genial.ly’s technical aspects, conducting surveys to gain insight into how students envisioned the game-based learning exam, and determining their thoughts on the course material. In the implementation phase, students were asked for their opinions through a single open-ended question regarding the initial draft of the Genial.ly game-based learning experience so that necessary corrections could be made. At the end of the semester, the final version of the game-based exam was designed and tested, including by the course professor (the second author of this paper). This game-based exam constituted the final part of the implementation phase. In the post-implementation phase, after the exam was administered, the students’ results and feedback were analyzed.
The final exam included the following:
- quiz-based questions, where students had to choose one or several correct answers;
- situations that they had to analyze and provide a solution for based on legislative frameworks;
- open-ended questions, where students had to provide their opinion.
To better understand the game content, which is directly related to the course “Legal Aspects of the Pedagogical Process”, five main topics that were included in the exam have been outlined:
  • Disciplinary procedures and regulations: this includes handling physical altercations that a teacher may observe, school actions on student violence, and restrictions on teaching;
  • Assessment and evaluation: questions related to summative, diagnostic, and formative assessments as well as support during state examinations and what teachers’ obligations are;
  • Legal and ethical frameworks: covers normative acts, ethical values in education, and legal reporting obligations for teachers experiencing violence or abuse;
  • Teacher qualifications and responsibilities: focusing on the qualifications required for teaching specific subjects, conditions for exclusion from pedagogical work, and responsibilities towards parents and guardians;
  • Student welfare and rights: addresses the entitlement to free school meals, data privacy regulations, social support, etc.
Although there are many topics that students need to cover, the primary objective of the course is for student teachers to know where to find relevant information and which strategies they can use to find potential solutions to problems. It was important to verify that students could solve problems according to the legislative requirements in force in Latvia. They were allowed to search for documents online during the exam, but the professor suggested keeping track of the time allocated for the exam. For example, situations like two boys fighting in the hallway—which the student teachers had to address—were based on actual experiences gathered from in-service teachers working in schools.
The data for the research was collected in several phases, as follows:
1. Evaluating the Genial.ly platform based on the possibility of developing the game-based exam and collecting data.
The platform’s flexibility was evaluated from a technical standpoint by considering several key factors: first, interactivity, such as the game mechanic capabilities, including the ability to incorporate multimedia content and branching scenarios (essential for testing purposes); second, visual design and customization, to provide a user-friendly interface; and third, data collection features, focusing on its capacity to gather user interactions, responses, and performance metrics within the game environment for assessment possibilities.
2. Collecting students’ opinions about possible features of the game for the final examination after each class.
Students who attended lectures answered two or three open-ended questions after each class regarding a possible game-based exam and the lecture’s content (what they found important and what a new teacher would need to know in a school environment).
3. Collecting students’ opinions about developed scenarios (similar to those used in the final exam).
In the middle of the semester, students who attended lectures were given a game similar to the one developed for the final exam and instructed to test the game’s introduction and provide feedback regarding its potential. This resulted in 41 positive comments and 14 suggestions from the students on how they envisioned the game.
4. Evaluating the game for final corrections.
The technology-enhanced learning (TEL) and course content specialist, responsible for course development and examination topic selection, was asked to assess the game based on content relevance, potential responses, and TEL principles, emphasizing the significance of information flow within digital environments for effective learning outcomes.
5. Collecting students’ opinions on the game developed for knowledge testing at the end of the final exam.
The last question in the exam was an open-ended question in which students were asked to share some brief thoughts on this style of examination.
One of the primary limitations of this study is that it was conducted with only one test group due to administrative reasons. Every spring semester there is only one group in this course and no data are saved from previous students’ assessments. This restriction limits the generalizability of the findings. Future research could address this limitation by including multiple groups within the same course and by testing game-based exams in other subject courses for comparison. Such studies would provide a broader understanding of the effectiveness and applicability of game-based exams administered through the platform (for this study it was Genial.ly) as an examination tool across different educational contexts.
All procedures performed in studies involving human participants were conducted in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Since this research analyzed the process of designing a game-based exam rather than specific students’ data, the University of Latvia guidelines indicated that approval from the ethics committee was not required. Everything was coordinated at the beginning of the semester; the students helped to organize the process by voluntarily providing feedback and were given the option of completing a regular exam if necessary.

4. Discussion and Results

In this section, the outcomes and results of the design-based research are analyzed following the three design stages implemented during the research. Each stage provides insights into the development and assessment of the game-based learning exam.

4.1. Pre-Implementation

In the pre-implementation phase, Genial.ly underwent a comprehensive assessment across three key categories to ensure its compatibility with previously set objectives of facilitating an interactive course examination. The first key category, interactivity, was of the highest value for the design. Within this context, it was established that content creators could integrate a wide array of multimedia elements, including images, videos, audio, animations, and transitions. Furthermore, it is possible to incorporate external resources using embedding links. Users can actively engage with both internal and external links to reveal hidden or extended content. Moreover, the platform allows users to move objects, providing them with the flexibility to organize and structure information in a customized manner.
The second key category is visual design and customization. Although the platform includes customizable templates, it also empowers users to design visually compelling materials with a user-friendly interface and a wide range of design tools, shapes, and text formatting options. Users can adapt the platform to their specific pedagogical needs. This flexibility extends to personalized branding, design choices, and content creation, making it a versatile tool for a variety of educational contexts.
Originally the game was intended to serve as a visual supplement for students. In this scenario, the learning data would have been evaluated by utilizing external assessments (student worksheets); however, with the platform’s recent feature enhancements—including the addition of interactive questions and the capability to insert open-ended queries—an opportunity to fully integrate the course examination within the game itself emerged. Consequently, it was possible to utilize the game’s features to internally assess students’ work and obtain an overview of their performance. Taking this into account, assessment is the third key category that was essential to build this experience. Educators can seamlessly integrate quizzes, surveys, and interactive evaluations within the platform’s content, allowing real-time feedback and analysis (Figure 2).
During the first lecture, students were introduced to the concept of a game-based summative assessment being implemented at the end of the semester. Following this lecture, students were invited to provide suggestions regarding the type of game that could enhance their understanding of the subject matter, and their opinions were collected using Google Forms. The question asked after the lecture was as follows: following today’s presentation and lecture, in your opinion, what type of game would help to reflect this topic? The student survey revealed that of the 49 students who participated in the lecture, 35% envisioned this game as a simulation, scenario role-play, or branched narrative. Some reflected their desire to engage in scenarios that would immerse them in various roles, including that of a teacher or student. They tried to identify and address challenges within these scenarios, offering solutions to improve the situation. Some students even envisioned resolving conflicts with students and their family members, class teachers, or social pedagogues, fostering a sense of empathy and encouraging everyone to advocate for their beliefs. These insights highlight their interest in dynamic and participatory learning experiences, focusing on understanding and problem-solving within diverse roles and situations. Ten students (20% of respondents) perceived the final exam as resembling a quiz-type game. They drew parallels with popular platforms like Kahoot, where questions are closely connected to their respective answers. One student elaborated on a real-life quiz game format that included a game board and random questions. Other students mentioned card games, matching games, and ball games as exam analogies.
While a range of game-based assessment methods was suggested, the first two examples, quiz-like games and scenario-based games, appear the most coherent and relatable from the student teachers’ perspective. Thirteen responses indicated that some students may not have fully grasped what implementing game-based learning entails: a few mentioned active learning methods that lack gamified elements, such as straightforward discussions or activities like drawing a brainstorming cloud. These comments indicate varying levels of understanding of and familiarity with the concept of game-based learning. Since this activity aimed to collect ideas, not evaluate understanding, this aspect will not be discussed further.
After the second lecture, the 45 students who attended the class answered the question “What games have you played in the last few months?”, naming 59 different games. “Uno” emerged as the most popular choice, with twelve respondents indicating they had played it recently, closely followed by “Monopoly”, named by ten students; “Alias”, a word-guessing game, garnered the interest of seven participants, while five individuals reported playing various tabletop games. Although there were ten mentions of mobile or video games, most students indicated a recent experience with board games. These results suggest that students mostly engage in tangible face-to-face interactions facilitated by board games, which presents a challenge for educators who wish to use digital game-based learning. Consequently, a decision was made to develop a game draft that combined quiz-like, role-play, and scenario-based elements. This draft was provided in the middle of the semester to introduce students to the concepts planned for the final exam.
Two additional questions about the lesson’s content were asked after each lecture: “What do you think was the most important thing for new teachers to know today? Give one specific example” and “Please formulate one question about today’s topic that you find interesting (it must be different from the answer to the previous question)”. The first question encourages reflection on the significance of specific knowledge for aspiring teachers, potentially highlighting key concepts or challenges they perceive as crucial in that class. The second question prompts students to actively engage with the topic by formulating an inquiry, showcasing their curiosity and understanding of the material. The questions aimed to gather perspectives, engage students with the course content, and serve as candidate questions for the exam.

4.2. Implementation

During the implementation phase, the first game draft was introduced to students and their feedback was gathered to assess their opinions on the game’s effectiveness and to identify areas for improvement. Questions from this draft about the course content were not included in the final test. This phase aimed to gauge student responses and suggestions for refining the game to better align with their learning needs and preferences, and to introduce students to the principles of game-based knowledge testing planned for the final exam. The students were asked to test the game’s introduction and provide feedback regarding its potential. A total of 35 students attended class that day, and they provided 41 positive comments and 14 suggestions; Google Forms was used for data collection. Student teachers generally responded positively, praising the game itself or its visual and audio elements. The game’s intuitiveness and its usability on mobile devices were also well received. Students highlighted their enthusiasm for addressing crisis situations and their eagerness to find solutions when role-playing as the new teacher; for some, this was a novelty. The recommendations included making the visuals more engaging and colorful in case they were meant to capture children’s attention; as the game was intended for adults, the earlier comments suggested that the photography-collage visualizations were engaging and intuitive enough. Some feedback suggested improving the audio quality for a more pleasant experience; other comments suggested addressing language errors, providing additional explanations in some cases, and considering a name change. The feedback provided valuable insights into the initial reception of the game-based learning tool and potential enhancements, and based on these comments, the game was improved into the game-based exam.
Before administering the game-based exam to students, it was evaluated by two course lecturers, one of whom was an expert in educational game development and the other, an expert in TEL and course content. The expert in game development evaluated the game from the perspective of how students managed to navigate through the game and how the data were collected in the system to evaluate students’ performance. The expert in TEL and course content, who developed the course and chose the topics for examination, evaluated the game from the perspective of content and possible answers and from the perspective of principles of TEL, where the flow of information in the digital space is an important aspect of learning success. After this evaluation, all the comments and suggestions were implemented to ensure that the game could be used for the students’ final exam.
The final part of the implementation phase was to conduct the game-based exam. The main points of focus of the research were Genial.ly as a potential platform for exams, the development of game-based exams by educators with no specific competencies in coding, and the pros and cons of using games for final exams to test students’ knowledge and opinions, rather than the analysis of exam results. Because no information was collected on students’ learning habits, previous knowledge, cognitive capabilities, etc., the results cannot be compared with those of other groups, and it is therefore not possible to compare different groups of students at this stage. Instead, this paper discusses a case study in which researchers (educators with no specific competencies in coding) developed an interactive game-based final exam and tested the possibility of replacing a traditional exam with a game-based one to assess knowledge gained during the course. After the exam, the examination process was evaluated from the perspective of how students managed the process and whether they could give reasonable answers. This evaluation was made by one of the experts participating in this research, who had previously administered final exams for the same course and had prior experience with knowledge testing. The time allotted for the final exam was 1.5 h, and students were asked questions about all the topics covered during the course. The number of questions in the game-based exam was about the same as in previous years with other groups of students. It was concluded that there were no differences in the way students answered the exam questions compared with previous years, meaning that a game-based exam in Genial.ly did not restrict them from retrieving their knowledge; the game-based exam thus fits the purpose of knowledge evaluation but changes the way the test is administered.
The reason for changing the form of administering the final exam was based on the idea that a game-based exam with storytelling elements would help them retrieve their knowledge and reduce their stress levels. At this stage, it can be concluded that such a testing strategy has the potential to reduce stress and anxiety, as also mentioned in previous research [10,29], but this should be tested in further research steps.

4.3. Post-Implementation

The primary objective of this case study was to explore the pros and cons of implementing a Genial.ly game-based course exam in the “Legal Aspects of the Pedagogical Process” course from a student teacher’s perspective. By analyzing these diverse perspectives, this study aims to provide a comprehensive understanding of the efficacy and areas of improvement for game-based examinations in an academic setting. While these insights can offer a glimpse into user behavior, they do not delve deep into the complex learning outcomes and detailed player performance metrics that serious game analytics typically encompass.
At the end of the course, 65 students took the final exam and 61 left feedback about the game-based exam. Acknowledging that game-based learning might be a novel experience for some, it was reassuring that most students (60 of the 61 who gave feedback) found the experience positive. One student’s feedback did not reflect on the game experience, only on the questions and the time limit. This reaffirmed the necessity of the pre-implementation and implementation stages to prepare the students for the upcoming experience. It may be said that in this game-based exam, students had a unique and engaging experience that differed from traditional written exams.
In the exam, in response to the last question (“Please share some brief thoughts on this style of examination”), student teachers expressed a range of opinions about the format, providing an extensive overview. As mentioned before, 60 students found the exam fun and interactive. One student’s feedback read, “For the first time in my life, I enjoyed taking an exam”. The most appreciated aspect (15 of 60 responses) was the possibility of immersing themselves in various scenarios. Only one student out of sixty expressed dissatisfaction with the dialogues between the questions, saying they were not necessary. Most students appreciated the opportunity to role-play, and it was mentioned that the interactive nature of the exam allowed them to explore real-life situations and apply their knowledge. They liked the fact that the exam felt like a conversation rather than a test. One student said, “Interactivity and real-life situations allowed me to better understand the role of a teacher”. This may also be due to the visualization element of the game (Figure 3). The visualizations shown in this material are from the game developed for this study; they were presented in Latvian, but a translation of the text is given below for context. Eleven students mentioned that the visualizations helped them better understand and engage with the content. As one student stated, “It was visually pleasing to answer the questions”.
Dziob [10] found that game-based learning reduces stress, and this was supported by ten students, who mentioned that the game-based format reduced their stress compared to traditional exams. They felt that it was a more relaxed and enjoyable way to demonstrate their knowledge. One student remarked, “It felt less formal, so the stress around the exam was much less”. Although almost all students (60 of 61) found this exam format more interesting than traditional written exams, there is still room for improvement. Only one student mentioned that a traditional exam would be preferable, and this was in the context of a lack of time, which most students found challenging. In all, twelve students mentioned that there was not enough time to write everything they wanted, and four students mentioned that 1000 characters were insufficient to elaborate their answers. The game did not have specific elements that reacted to time, e.g., “You have 2 minutes to answer the following question”. The time for the exam was set by the lead professor, who tested the game before the examination; it was deliberately limited to ensure the students focused on the main information and did not provide unnecessary details. While the exam was constrained to a brief 1.5-h timeframe, its primary objective was to assess the students’ ability to articulate and demonstrate their knowledge effectively.
Another aspect that is always a risk factor when using digital solutions is technical problems. Six students mentioned that the classroom’s internet connection could have been better, and they were concerned about not being able to save their answers because of it. However, if the connection drops, students can refresh the page, log in again with their username, and return to the point where they left off. From a design perspective, one student mentioned that the progress percentage in the corner of the screen was a useful asset during the exam and that it would be good to have it visible at all times. Currently, the progress indicator is shown after each topic is played out in the game, but it can be added to every slide (Figure 4).
In four cases, students mentioned the lack of space to write their answers to open-ended questions. While it does not take much effort to change the basic visual elements, the designer is restricted by some Genial.ly features, such as the 1000-character limit on open answers and the inability to translate the text on buttons and write-in windows into other languages. For example, while questions can be written in Latvian, to submit their answer, students need to press the “Send” button, which cannot be translated from English into any other language (Figure 5).
Although all the students had a good level of English, it would be better to be able to create all the content in the same language. In the context of adult education, this does not pose a significant concern; however, in youth education it could become problematic.
A comparison with the case study “Digital game-based examination for sensor placement in context of an Industry 4.0 lecture using the Unity 3D engine—a case study” [16] revealed similarities to our study. First, game-based exams can assess problem-solving skills and practical knowledge in a realistic and engaging manner; although our study course does not provide student teachers with hands-on experience, their answers made clear that it was easier for them to envision the situation and try to find a solution as they would in real life. This is mainly due to the second point: the interactive and immersive design of the game allows students to demonstrate their understanding in a more applied context than traditional exams. As some students also mentioned that they enjoyed the game, it can be agreed that a gamified environment is more stimulating and enjoyable, leading to higher student engagement and motivation during game-based examinations. The two studies differ in complexity but reach similar conclusions.
To summarize, Table 1 provides a clear overview of the pros and cons of using Genial.ly as a resource for making exams from the student teachers’ and designers’ perspectives. It highlights the benefits and drawbacks of both the game-based format and specific features of the Genially platform, allowing educators to make informed decisions about its implementation.
Observing these points can help identify areas for improvement. The concept of a game-based exam generally has more pros than cons; however, based on this research, the Genial.ly app as a platform for this purpose shows an equal number of pros and cons. Since many of these cons are technical issues that may be resolved in future updates, Genial.ly can still be considered a viable tool for designing game-based exams: its interface and structure are intuitive and easily understandable, and they allow educators with no coding competencies to develop interactive learning materials and exams to test the knowledge gained during learning.
The broader implications of our findings for educational technology and assessment practices are significant. Platforms like Genial.ly not only facilitate the integration of gamification elements into assessments but also allow for comprehensive data collection and analysis. This dual capability enhances educators’ ability to track and measure student performance more effectively. The authors hope that this article will inspire more educators to offer game-based exams in their courses, as this research field is still little explored.
An unexpected discovery from our study was that the game-based assessment tended to induce less stress among a portion of students compared to traditional assessments. By reducing anxiety, game-based exams can foster a more conducive learning environment, promoting better engagement and deeper learning; however, this needs to be more accurately tested and researched.
However, this research has limitations, as it was conducted with only one test group due to administrative reasons. This single-group approach restricts the generalizability of the findings and may not fully capture the diverse experiences and outcomes that could emerge from a broader participant base.
Future research could address this limitation by including multiple groups within the same course. This would allow for a more robust comparison between different sets of students, providing a clearer picture of how game-based exams impact various learning styles and academic performance. Additionally, testing game-based exams in other subject courses could offer valuable insights into the versatility and effectiveness of this assessment method across different disciplines. By expanding the scope of the research to include multiple test groups and a variety of subjects, future studies can more comprehensively evaluate the potential benefits and challenges of implementing game-based exams in higher education. This broader approach would help in identifying best practices and refining the design of game-based assessments to enhance their applicability and impact in diverse educational settings.
Overall, these findings suggest that incorporating game-based assessments in educational practices can revolutionize the way we approach student evaluation, making it more interactive, engaging, and less stressful. This shift has the potential to improve learning outcomes and provide more meaningful insights into student performance.

5. Conclusions

In the pre-implementation phase, the Genial.ly platform was evaluated for its interactivity, visual design, and customization to meet the objectives of an interactive course examination. The platform offered flexibility in content creation and design, making it suitable for various educational contexts. The addition of ‘interactive questions’ allowed for internal assessment within the game itself. In the implementation phase, the first game draft was introduced to students, and their feedback indicated a positive attitude toward such an assessment strategy. Students appreciated the game’s visual elements, intuitiveness, and mobile compatibility. Recommendations included enhancing the visual engagement and audio quality. In the post-implementation phase, this study found that students generally had a positive experience with the game-based exam:
  • The research showed that the final exam can be organized in the form of an interactive game to help students retrieve the knowledge learned during the course;
  • A total of 60 out of 61 students who gave feedback expressed the opinion that the game-based exam was a positive experience;
  • Overall, 25% of students (n = 15) enjoyed the game’s interactivity and their immersion in various scenarios;
  • Eleven students noted that the visualizations in the game helped them engage with the content;
  • Ten students mentioned that they believe that such an exam reduces their stress levels;
  • While most students preferred this format over traditional written exams, there is room for improvement, particularly in addressing technical issues and customizing certain elements.
Based on student feedback, we recommend the following for educators interested in implementing similar assessments:
  • Combine question types: use a variety of question types to assess different skills and knowledge areas, including not only questions with right and wrong answers but also open-ended questions that capture students’ thoughts;
  • Choose platforms that have data collection possibilities, to gain deeper insights into student performance;
  • Ensure the design and visual elements are engaging and clear. Information design and aesthetics play a crucial role in maintaining student interest;
  • Design questions and tasks that reflect real-life situations to make the assessments more relevant and practical;
  • Craft questions and instructions that are unambiguous to avoid confusion and ensure that students understand what is required of them. If possible, test them on a similar audience.
From the designer’s perspective, seeking resources and help from other specialists is advisable to avoid burnout, which is one of the main risks in designing game-based experiences.
The significance of this study lies in using a game-based exam to enhance the teaching of the legal aspects of pedagogical processes. This research project showed that educators with no coding competencies can develop interactive games for knowledge testing that can replace traditional testing formats. It has highlighted the benefits of interactivity, reduced stress, and visual engagement in game-based learning, but research should be continued to analyze the role of game-based exams in reducing test anxiety. This study also acknowledges the limitations related to technical issues and customization. The Genial.ly app can be used as a platform for creating interactive game-based exams with diverse forms of knowledge testing (quizzes with right and wrong answers, scenarios where students should provide a solution to a problem, and open-ended questions where students should provide their opinions).

Author Contributions

Conceptualization, E.G. and L.D.; methodology, E.G. and L.D.; software, E.G.; validation, L.D. and E.G.; formal analysis, L.D.; investigation, E.G.; resources, E.G.; writing—original draft preparation, E.G.; writing—review and editing, E.G. and L.D.; visualization, E.G.; supervision, L.D.; project administration, L.D.; funding acquisition, L.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data supporting the reported results can be found at the following: https://drive.google.com/drive/folders/10C1SkLR1HXXPUXyCQHM8D98TsI4vg9Go?usp=sharing (accessed on 11 August 2024).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Deterding, S.; Dixon, D.; Khaled, R.; Lennart, N. From game design elements to gamefulness: Defining gamification. In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, MindTrek, Tampere, Finland, 28–30 September 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 9–15. [Google Scholar]
  2. Gao, F.; Li, L.; Sun, Y. A systematic review of mobile game-based learning in STEM education. Educ. Technol. Res. Dev. 2020, 68, 1791–1827. [Google Scholar] [CrossRef]
  3. Yildirim, I. The effects of gamification-based teaching practices on student achievement and students’ attitudes toward lessons. Internet High. Educ. 2017, 33, 86–92. [Google Scholar] [CrossRef]
  4. Grāvelsiņa, E.; Daniela, L. Designing an online escape room as an educational tool. In Smart Pedagogy of Game-Based Learning; Daniela, L., Ed.; Springer: Cham, Switzerland, 2020; pp. 119–133. [Google Scholar]
  5. Karakoç, B.; Eryılmaz, K.; Turan Özpolat, E.; Yıldırım, İ. The effect of game-based learning on student achievement: A meta-analysis study. Technol. Knowl. Learn. 2022, 27, 207–222. [Google Scholar] [CrossRef]
  6. Bylieva, D.; Sastre, M. Classification of educational games according to their complexity and the player’s skills. Eur. Proc. Soc. Behav. Sci. 2018, 51, 438–446. [Google Scholar] [CrossRef]
  7. Pho, A.; Dinscore, A. Game-Based Learning. Tips and Trends. 2015. Available online: https://acrl.ala.org/IS/wp-content/uploads/2014/05/spring2015.pdf (accessed on 11 August 2024).
  8. Coleman, T.E.; Money, A.G. Student-centred digital game-based learning: A conceptual framework and survey of the state of the art. High. Educ. 2020, 79, 415–457. [Google Scholar] [CrossRef]
  9. Hartt, M.; Hosseini, H.; Mostafapour, M. Game on: Exploring the effectiveness of game-based learning. Plan. Pract. Res. 2020, 35, 589–604. [Google Scholar] [CrossRef]
  10. Dziob, D. Board game in physics classes—A proposal for a new method of student assessment. Res. Sci. Educ. 2020, 50, 845–862. [Google Scholar] [CrossRef]
  11. Hung, H.-T.; Yang, J.C.; Hwang, G.-J.; Chu, H.-C.; Wang, C.-C. A scoping review of research on digital game-based language learning. Comput. Educ. 2018, 126, 89–104. [Google Scholar] [CrossRef]
  12. Ifenthaler, D.; Eseryel, D.; Ge, X. (Eds.) Assessment for game-based learning. In Assessment in Game-Based Learning: Foundations, Innovations, and Perspectives; Springer: New York, NY, USA, 2012; pp. 3–10. [Google Scholar]
  13. Plump, C.M.; LaRosa, J. Using Kahoot! in the classroom to create engagement and active learning: A game-based technology solution for eLearning novices. Manag. Teach. Rev. 2017, 2, 151–158. [Google Scholar] [CrossRef]
  14. Portela, F. A new approach to perform individual assessments at higher education using gamification systems. In Proceedings of the 4th International Computer Programming Education Conference (ICPEC 2023) Proceedings, Vila do Conde, Portugal, 26–28 June 2023; de Queirós, R.A.P., Teixeira Pinto, M.P., Eds.; Dagstuhl Publishing: Wadern, Germany, 2023; pp. 8:1–8:12. [Google Scholar]
  15. Shakhmalova, I.; Zotova, N. Techniques for increasing educational motivation and the need to assess students’ knowledge: The effectiveness of educational digital games in learning English grammatical material. J. Psycholinguist. Res. 2023, 52, 1875–1895. [Google Scholar] [CrossRef] [PubMed]
  16. Koch, J.; Gomse, M.; Schüppstuhl, T. Digital game-based examination for sensor placement in context of an Industry 4.0 lecture using the Unity 3D engine—A case study. Procedia Manuf. 2021, 55, 563–570. [Google Scholar] [CrossRef]
  17. Sanz-Prieto, M.; de Pablo González, G. Gamify gamifying: Learning with breakouts. Smart Pedagogy of Game-Based Learning: Advances in Game-Based Learning; Daniela, L., Ed.; Springer: Cham, Switzerland, 2021; pp. 103–118. [Google Scholar]
  18. Wang, M.; Zheng, X. Using game-based learning to support learning science: A study with middle school students. Asia-Pac. Educ. Res. 2021, 30, 167–176. [Google Scholar] [CrossRef]
  19. Ke, F. Designing and integrating purposeful learning in game play: A systematic review. Educ. Technol. Res. Dev. 2016, 64, 219–244. [Google Scholar] [CrossRef]
  20. Kurniawan, R.D.K.; Avrianto, R.P.; Indrajit, R.E.; Dazki, E. Selection the best quiz applications as learning performance evaluation media using the analytical hierarchical process method. J. Tek. Inform. (JUTIF) 2022, 3, 1389–1396. [Google Scholar] [CrossRef]
  21. Uzunboylu, H.; Galimova, E.G.; Kurbanov, R.A.; Belyalova, A.M.; Deberdeeva, N.A.; Timofeeva, M. The views of the teacher candidates on the use of Kahoot as a gaming tool. Int. J. Emerg. Technol. Learn. 2020, 15, 158–168. [Google Scholar] [CrossRef]
  22. Cabrera-Solano, P. Game-based learning in higher education: The pedagogical effect of genially games in English as a foreign language instruction. Int. J. Educ. Methodol. 2022, 8, 719–729. [Google Scholar] [CrossRef]
  23. Carceller, M.d.C.; Serna-García, M.; López-Fernández, E.; Pellín Carcelén, A.; Flacco, N. Genially as an interactive tool in the integrated curriculum in basic biomedical sciences. In Proceedings of the INTED2023: 17th International Technology, Education and Development Conference Proceedings, Valencia, Spain, 6–8 March 2023; pp. 3992–3997. [Google Scholar]
  24. De Souza, R.T.M.P.; Kasseboehmer, A.C. The thalidomide mystery: A digital escape room using Genially and WhatsApp for high school students. J. Chem. Educ. 2022, 99, 1132–1139. [Google Scholar] [CrossRef]
  25. Sarva, E.; Gravelsina, E.; Daniela, L. Digital solutions for gamification and game-based learning from the perspective of educators. In Proceedings of the 18th International Technology, Education and Development Conference, Valencia, Spain, 4–6 March 2024; pp. 109–118. [Google Scholar] [CrossRef]
  26. Yasin, A.; Fatima, R.; Wen, L.; Zheng, J.; Niazi, M. What goes wrong during phishing education? A probe into a game-based assessment with unfavorable results. Entertain. Comput. 2024, 2024, 100815. [Google Scholar] [CrossRef]
  27. Tay, J.; Goh, Y.M.; Safiena, S.; Bound, H. Designing digital game-based learning for professional upskilling: A systematic literature review. Comput. Educ. 2022, 184, 104518. [Google Scholar] [CrossRef]
  28. Sobocinski, M. I gamified my courses and I hate that…. World J. Sci. Technol. Sustain. Dev. 2017, 14, 135–142. [Google Scholar] [CrossRef]
  29. Xu, Z.; Chen, Z.; Eutsler, L.; Geng, Z.; Kogut, A. A scoping review of digital game-based technology on English language learning. Educ. Technol. Res. Dev. 2020, 68, 877–904. [Google Scholar] [CrossRef]
  30. Akatsu, H.; Shiima, Y.; Gomi, H.; Hegab, A.E.; Kobayashi, G.; Naka, T.; Ogino, M. Teaching “medical interview and physical examination” from the very beginning of medical school and using “escape rooms” during the final assessment: Achievements and educational impact in Japan. BMC Med. Educ. 2022, 22, 67. [Google Scholar] [CrossRef] [PubMed]
  31. Frank-Bolton, P.; Simha, R. The Reverse Exam: A Gamified Exam Structure to Motivate Studying and Reduce Anxiety. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education, Portland, OR, USA, 11–14 March 2020. [Google Scholar]
  32. Martínez-Cal, J.; Sandoval-Hernández, I.; Ropero-Padilla, C.; Rodriguez-Arrastia, M.; González-Sánchez, M.; Molina-Torres, G. An escape room game-based innovation for the assessment of physiotherapy students: A qualitative study. Stud. Educ. Eval. 2024, 81, 101331. [Google Scholar] [CrossRef]
  33. Loh, C.S.; Sheng, Y.; Ifenthaler, D. (Eds.) Serious games analytics: Theoretical framework. In Serious Games Analytics: Methodologies for Performance Measurement, Assessment, and Improvement; Springer: New York, NY, USA, 2015; pp. 3–29. [Google Scholar]
  34. Niemelä, M.; Kärkkäinen, T.; Äyrämö, S.; Ronimus, M.; Richardson, U.; Lyytinen, H. Game learning analytics for understanding reading skills in transparent writing system. Br. J. Educ. Technol. 2020, 51, 2376–2390. [Google Scholar] [CrossRef]
  35. Kim, Y.J.; Ifenthaler, D. (Eds.) Game-based assessment: The past ten years and moving forward. In Game-Based Assessment Revisited: Advances in Game-Based Learning; Springer: Cham, Switzerland, 2019; pp. 3–11. [Google Scholar]
  36. Baumgartner, E.; Bell, P.; Brophy, S.; Hoadley, C.; His, S.; Joseph, D.; Orrill, C.; Puntambekar, S.; Sandoval, W.; Tabak, I. Design-based research: An emerging paradigm for educational inquiry. Educ. Res. 2003, 32, 5–8. [Google Scholar] [CrossRef]
  37. Pors, S. Conducting Design-Based Research on Web-Enhanced Learning in Cambodia; Sage Research Methods Cases Part 1; SAGE Publications, Ltd.: Thousand Oaks, CA, USA, 2014. [Google Scholar] [CrossRef]
Figure 1. Overview of the design-based research method with 3 main phases and connected parts.
Figure 2. Feedback and analysis from Genial.ly’s “individual activity” view that includes answers to both test-based questions and open-ended questions where students provided their opinions.
Figure 3. The game visualizes a conversation in the hallway.
Figure 4. The game progress indicator, displayed in the upper left corner.
Figure 5. Inability to customize the wording of the interactive question interface.
Table 1. Pros and cons of using Genial.ly for game-based exams.

Category | Pros | Cons
Game design | Fun, interactive | Lack of time
 | Immersive scenarios to explore real-life situations and apply knowledge |
 | Game-based format was more relaxed and enjoyable | Different game types not suitable for everyone
 | Visualizations helped students better understand and engage with the content |
 | Intuitive design and progress indicator |
Genial.ly features | Flexibility in content creation and design (embedding links, different media) | Concerns about poor internet connection and potential inability to save answers
 | Addition of “interactive questions” allowed for internal assessment within the game | Limited ability to change the language of interface elements (e.g., the “Send” button)
 | Provides features to create intuitive and mobile-compatible designs | 1000-character limit for open-ended answers
 | Suitable for various educational contexts | Assigning students their profiles
