Article
Peer-Review Record

An Educational Escape Room Game to Develop Cybersecurity Skills

Computers 2024, 13(8), 205; https://doi.org/10.3390/computers13080205
by Alessia Spatafora 1, Markus Wagemann 2, Charlotte Sandoval 3, Manfred Leisenberg 3 and Carlos Vaz de Carvalho 4,*
Reviewer 1: Anonymous
Reviewer 2:
Submission received: 23 June 2024 / Revised: 12 August 2024 / Accepted: 14 August 2024 / Published: 19 August 2024
(This article belongs to the Special Issue Game-Based Learning, Gamification in Education and Serious Games 2023)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The world of education has evolved a lot over the last few decades, and students and trainees are no longer interested in the long-established methods of teaching, which usually concluded with a summative assessment. Educational games are part of the efforts to renew the world of education.

On the other hand, as well emphasized in the article, a great many professional activities (and not only those) are carried out via the Internet, so cybersecurity has become a top issue.

Thus, the educational game described in the article responds to two important problems of today's world.

The text of the article is fluent, even too fluent, describing in too much detail the escape-room game format, which everyone knows. Shortening the text on this topic would be advisable. Also, the designed game is described in too much detail. A summary table would be useful: competences – corresponding play tasks.

A comparison with existing cybersecurity games is missing. Such a comparison would be advisable to emphasize the advantages of the new game.

Is the game available on the web? If so, which is the internet address?

The sources cited in the Reference section are relatively new and relevant to the topic. Only one self-citation was detected, and it is justified. As there are some errors regarding the in-text citations of references, it is recommended that authors check all citations.

The article does not advance any theory or propose scientific hypotheses to be tested. It only presents the results of the performance testing of an educational game created within an Erasmus+ project.

The figures and tables are appropriate, and their content is clear for the reader.

The discussion and conclusions are properly related to what was presented in the article.

Specific observations:

Row 108 – “ deepen their knowledge [17].” It appears that Prensky (2001) is no. 18 in the reference section. Please check.

Row 111 – “real-time feedback and the provision of safe”. Please resolve the correction marked in red.

Row 112 – “consequences and authentic assessment [18].” It appears that Abt (1998) is no. 19 in the reference section. Please check.

Row 167 – “shown by Tercanli et al. in 167 their study [16].” It appears that Tercanli (2021) is no. 21 in the reference section. Please check.

Row 288 – “and behavioural aspects of human nature [19].” The reference for social engineering appears to be [24]. Please check.

Row 371 – “On average, players spent 46 minutes playing the game (Figure 2) which shows the involvement of the players.” It would be recommended to indicate here which was the time limit – 60 minutes?

Row 437 – “46 participants (59,7%) completed the game”. I suggest specifying that 59,7% refers to those who provided feedback. Even if it is obvious, the information becomes clearer.

Author Response

Reviewer 1

Firstly, we would like to express our gratitude to the reviewers for the work and time they invested in the analysis and revision of the article and for the corresponding suggestions towards its improvement. We provide additional information here; the corrections made are highlighted in red in the resubmitted manuscript.

The world of education has evolved a lot over the last few decades, and students and trainees are no longer interested in the long-established methods of teaching, which usually concluded with a summative assessment. Educational games are part of the efforts to renew the world of education. On the other hand, as well emphasized in the article, a great many professional activities (and not only those) are carried out via the Internet, so cybersecurity has become a top issue. Thus, the educational game described in the article responds to two important problems of today's world.

We appreciate the similar view the reviewer has concerning the training process and the need to improve it, particularly in relation to cybersecurity.

 

The text of the article is fluent, even too fluent, describing in too much detail the escape-room game format, which everyone knows. Shortening the text on this topic would be advisable.

We don’t fully agree with the idea that the escape room concept is already familiar to everyone. We did, however, reduce the amount of text dedicated to explaining it.

 

Also, the designed game is described in too much detail. A summary table would be useful: competences – corresponding play tasks.

We have summarized and shortened the game description. Although we understand the idea of creating the suggested table, it would result in a very extensive document, as the game has a large number of challenges and each of these is related to several learning objectives and/or competences.

 

A comparison with existing cybersecurity games is missing. Such a comparison would be advisable to emphasize the advantages of the new game.

In the first version of the article we focused on the scientific references; we have now added references to some cybersecurity (CS) games that we had identified before starting the study. Although we don’t make an individual comparison with each of the games, we have justified the need for a new educational game for CS.

Is the game available on the web? If so, which is the internet address?

We added the link to the game in the document.

 

The sources cited in the Reference section are relatively new and relevant to the topic. Only one self-citation was detected, and it is justified. As there are some errors regarding the in-text citations of references, it is recommended that authors check all citations.

Corrected.

The article does not advance any theory or propose scientific hypotheses to be tested. It only presents the results of the performance testing of an educational game created within an Erasmus+ project.

Yes, the article is oriented more towards the validation of the practical application than towards theoretical considerations.

The figures and tables are appropriate, and their content is clear for the reader. The discussion and conclusions are properly related to what was presented in the article.

Thank you for the positive comments.

Specific observations:

Row 108 – “ deepen their knowledge [17].” It appears that Prensky (2001) is no. 18 in the reference section. Please check.

Corrected.

Row 111 – “real-time feedback and the provision of safe”. Please resolve the correction marked in red.

Corrected.

Row 112 – “consequences and authentic assessment [18].” It appears that Abt (1998) is no. 19 in the reference section. Please check.

Corrected.

Row 167 – “shown by Tercanli et al. in 167 their study [16].” It appears that Tercanli (2021) is no. 21 in the reference section. Please check.

Corrected.

Row 288 – “and behavioural aspects of human nature [19].” The reference for social engineering appears to be [24]. Please check.

Corrected.

Row 371 – “On average, players spent 46 minutes playing the game (Figure 2) which shows the involvement of the players.” It would be recommended to indicate here which was the time limit – 60 minutes?

We added that information to the document.

Row 437 – “46 participants (59,7%) completed the game”. I suggest specifying that 59,7% refers to those who provided feedback. Even if it is obvious, the information becomes clearer.

We added that information to the document.

Reviewer 2 Report

Comments and Suggestions for Authors

1. The abstract suggests that the methodology used in the experiment is not sufficiently clear. It mentions "experiential learning" and "combining scenario-based practical learning with real-life examples," yet it fails to explain how these methods are practically implemented. Moreover, the abstract states that the research aims to enhance knowledge retention and operational performance in network security practice. However, it only refers to improving network security awareness without specifying the means through which improvements are achieved. The effectiveness of these methods in engaging participants is also not demonstrated through clear results.

2. The revision in line 2.111 has not been deleted.

3. It is recommended that "2. Materials and Methods" be presented in separate chapters to clarify the methods employed in this study and the specific information security topics addressed.

4. The explanation of the learning scenario would be more comprehensible if system diagrams were included.

5. The article states, "The story, and therefore the storyline, concludes with a general quiz on cybersecurity topics already covered." Does this imply that learners take the test after completing all story scenes?

6. If a learner repeatedly fails to achieve his objectives within the story scene, how should his learning outcomes be assessed?

7. Is there further analysis of the participants' backgrounds? It is possible that their backgrounds or prior knowledge could influence the experimental results.

8. Has the questionnaire been analyzed for reliability and validity? The phrasing of some questions is unclear. For instance, what does "I forgot everything around me" imply? If it queries whether students were immersed in the game, a score of only 2.66 suggests that they were not fully engaged.

9. What constitutes completing the game? Merely escaping from the secret room does not convincingly demonstrate that students have fully grasped all the information and training on network security. It is also possible that students were merely attempting to pass levels by guessing, without serious engagement.

10. In the conclusion, it is claimed that EER can foster innovative thinking and pose critical questions. Where can these outcomes be substantiated?

11. The conclusion should resonate with and differentiate from prior research, emphasizing the unique contributions and highlights of this study.

Author Response

Firstly, we would like to express our gratitude to the reviewers for the work and time they invested in the analysis and revision of the article and for the corresponding suggestions towards its improvement. We provide additional information here; the corrections made are highlighted in red in the resubmitted manuscript.

1. The abstract suggests that the methodology used in the experiment is not sufficiently clear. It mentions "experiential learning" and "combining scenario-based practical learning with real-life examples," yet it fails to explain how these methods are practically implemented. Moreover, the abstract states that the research aims to enhance knowledge retention and operational performance in network security practice. However, it only refers to improving network security awareness without specifying the means through which improvements are achieved. The effectiveness of these methods in engaging participants is also not demonstrated through clear results.

Experiential learning is the learning methodology that provides the theoretical support for game-based learning, as explained in the document. The reference to it in the abstract was meant to explain this relation. However, we agree that it was not clear, so we have revised the abstract and made it clearer. We also made the explanation clearer in section 2.

 

2. The revision in line 2.111 has not been deleted.

Corrected.

 

3. It is recommended that "2. Materials and Methods" be presented in separate chapters to clarify the methods employed in this study and the specific information security topics addressed.

We understand the suggestion, but we consider that the section sizes would be quite unbalanced and that it would also harm the reading. We did create a subsection describing the game, which we expect will improve readability.

 

4. The explanation of the learning scenario would be more comprehensible if system diagrams were included.

Again, we understand the suggestion, but we consider that it would result in overly large diagrams due to the number of plot points in each scenario.

 

5. The article states, "The story, and therefore the storyline, concludes with a general quiz on cybersecurity topics already covered." Does this imply that learners take the test after completing all story scenes?

During the revision of the document, this sentence was removed. Actually, players can take the quiz at any time and assess their knowledge before they start any scenario (diagnostic approach) or after they complete any scenario.

 

6. If a learner repeatedly fails to achieve his objectives within the story scene, how should his learning outcomes be assessed?

The game includes a hint system designed to ensure that a player does not get stuck in a challenge. Even so, the player can exceed the allotted time and “lose” the scenario. When players finish, they receive feedback on the learning objectives they should have achieved, which gives them some reference points for the next time they play.

 

7. Is there further analysis of the participants' backgrounds? It is possible that their backgrounds or prior knowledge could influence the experimental results.

The participants’ experience with gaming and their prior knowledge and awareness of cyber risks were also collected. In the first case, the results were too homogeneous to provide an interesting cross-analysis (almost all youngsters had significant gaming experience, while almost all trainers did not). In the second case, some participants had doubts about the formulation of the question, so it was decided not to use it.

 

8. Has the questionnaire been analyzed for reliability and validity? The phrasing of some questions is unclear. For instance, what does "I forgot everything around me" imply? If it queries whether students were immersed in the game, a score of only 2.66 suggests that they were not fully engaged.

Yes, the playability of the game was analyzed using the standard Game Experience Questionnaire (GEQ), a validated tool (more information here: https://pure.tue.nl/ws/files/21666907/Game_Experience_Questionnaire_English.pdf). That question is intended to measure precisely the “Flow”, or immersion, of the player. And, yes, as stated in the article, “However, they were not totally absorbed by the game (2,85) to the point of forgetting everything around them (2,66).”

 

9. What constitutes completing the game? Merely escaping from the secret room does not convincingly demonstrate that students have fully grasped all the information and training on network security. It is also possible that students were merely attempting to pass levels by guessing, without serious engagement.

Completing the game meant that players went through the three scenarios and answered the quiz (once or multiple times, as the questions are randomly selected, so the quiz can be different each time). In this case, the expression “completing the game” has no pedagogical connotation.


10. In the conclusion, it is claimed that EER can foster innovative thinking and pose critical questions. Where can these outcomes be substantiated?

These claims are based on the collected bibliographical references. It was possible to verify this by conducting observations and interviews after some participants finished the game, but we have not included this in the article.

 

11. The conclusion should resonate with and differentiate from prior research, emphasizing the unique contributions and highlights of this study.

We have improved this.

Round 2

Reviewer 2 Report

Comments and Suggestions for Authors

1. Abstract: The abstract has been revised for clearer articulation of "experiential learning" methods and the integration of "situational practical learning with real-life examples."

2. Revision of line 2.111: This issue has been corrected.

3. Recommendation for the 'Materials and Methods' chapter: While I understand the need to maintain chapter balance, separating 'Materials' from 'Methods' could more distinctly showcase the research design and implementation steps. It is advisable to clearly delineate and label the specific contents of 'Materials' and 'Methods' within the existing chapters to aid readers in locating information swiftly.

4. System diagram suggestions for learning scenarios: Despite size considerations, a simplified version of the system diagram could still provide vital visual support, aiding readers in comprehending the design of the situation and the learning process. Including brief illustrations of key situations or learning modules in the document is recommended.

5. Question about the timing of testing: Thank you for clarifying that testing can occur at any time. This detail should be explicitly reflected in the revised document to prevent any misunderstanding regarding the timing of study tests.

6. Assessment of scenarios where learning goals are not achieved: The prompt system and feedback mechanism designed by the author are commendable. However, detailing how the learning progress of participants who do not meet their goals will be measured, and how they will receive further support or resources, is advised.

7. Participant background analysis: It is recommended to further explore how background information influences the research outcomes, even in a preliminary analysis, to understand the potential effects of background variables on learning efficacy.

8. Reliability and validity analysis of the questionnaire: Thank you for providing the source information for the questionnaire.

9. Definition of game completion: While "completing the game" is not inherently educational, the linkage between game completion and achievement of learning objectives should be more explicitly articulated. For instance, are there alternative methods to assess the achievement of learning outcomes?

10. Innovative thinking and critical questions in the conclusion: It is recommended to include specific observational or interview data in the conclusion to substantiate the assertion that EER can stimulate innovative thinking and critical questioning. This will lend more robust evidence to these conclusions.

11. The conclusion should echo previous research: The conclusion section should further emphasize the unique contributions of this study, clearly delineating differences from prior research and suggesting avenues for future inquiry.

Author Response

Again, we would like to express our gratitude to the reviewers for the work and time they invested in the analysis and revision of the article and for the corresponding suggestions towards its improvement. We provide here additional comments on the suggestions; the corresponding corrections are highlighted in red in the resubmitted manuscript.

  1. Abstract: The abstract has been revised for clearer articulation of "experiential learning" methods and the integration of "situational practical learning with real-life examples."

No further changes.

 

2. Revision of line 2.111: This issue has been corrected.

No further changes.

 

3. Recommendation for the 'Materials and Methods' chapter: While I understand the need to maintain chapter balance, separating 'Materials' from 'Methods' could more distinctly showcase the research design and implementation steps. It is advisable to clearly delineate and label the specific contents of 'Materials' and 'Methods' within the existing chapters to aid readers in locating information swiftly.

We have reorganized section 2 by creating section 2.2 with the study methodology. We expect this to make the distinction between Materials and Methods clearer.

 

4. System diagram suggestions for learning scenarios: Despite size considerations, a simplified version of the system diagram could still provide vital visual support, aiding readers in comprehending the design of the situation and the learning process. Including brief illustrations of key situations or learning modules in the document is recommended.

We generated system and flow diagrams for the different scenarios but, again, we considered that the added value was small in relation to the textual information already provided and that they would actually harm the reading of the article, so we ultimately removed them.

 

5. Question about the timing of testing: Thank you for clarifying that testing can occur at any time. This detail should be explicitly reflected in the revised document to prevent any misunderstanding regarding the timing of study tests.

We made this clearer in the document.

 

6. Assessment of scenarios where learning goals are not achieved: The prompt system and feedback mechanism designed by the author are commendable. However, detailing how the learning progress of participants who do not meet their goals will be measured, and how they will receive further support or resources, is advised.

The game is mostly oriented towards self-diagnosis and self-assessment in relation to cybersecurity, as it is mostly intended to be used in an informal, autonomous learning process by SME staff. When used in the scope of vocational training, the game can be used by VET trainers as an educational tool, but in that case a formal assessment should be conducted through an external procedure.

 

7. Participant background analysis: It is recommended to further explore how background information influences the research outcomes, even in a preliminary analysis, to understand the potential effects of background variables on learning efficacy.

We added some cross-variable information about the participants.

 

8. Reliability and validity analysis of the questionnaire: Thank you for providing the source information for the questionnaire.

No further changes. We did, however, add references to the tools in the document.

 

9. Definition of game completion: While "completing the game" is not inherently educational, the linkage between game completion and achievement of learning objectives should be more explicitly articulated. For instance, are there alternative methods to assess the achievement of learning outcomes?

When players finish a scenario, they are presented with a list of the learning objectives and which of them they have accomplished. Therefore, they receive feedback on what they would be expected to achieve at that stage. They can then go to scenario 4, the quiz, and self-assess: if they do it immediately, the questions will be related to the scenario they have just played.

 

10. Innovative thinking and critical questions in the conclusion: It is recommended to include specific observational or interview data in the conclusion to substantiate the assertion that EER can stimulate innovative thinking and critical questioning. This will lend more robust evidence to these conclusions.

We have made it clearer in the conclusions which references we used to support this statement.

 

11. The conclusion should echo previous research: The conclusion section should further emphasize the unique contributions of this study, clearly delineating differences from prior research and suggesting avenues for future inquiry.

We have improved the conclusions section.

Round 3

Reviewer 2 Report

Comments and Suggestions for Authors

The authors have put considerable effort into addressing the comments and suggestions provided during the review process. The manuscript has improved substantially.
