Article

A Revised Pedagogy Model for Simulator-Based Training with Biomedical Laboratory Science Students

1 School of Applied Educational Science and Teacher Education, University of Eastern Finland, 80101 Joensuu, Finland
2 Faculty of Business, ICT and Chemical Engineering, Turku University of Applied Sciences, 20520 Turku, Finland
3 Department of Public Health, University of Turku, 20500 Turku, Finland
4 Institute of Biomedicine, Integrative Physiology and Pharmacology, University of Turku, 20520 Turku, Finland
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(7), 328; https://doi.org/10.3390/educsci11070328
Submission received: 25 April 2021 / Revised: 23 June 2021 / Accepted: 28 June 2021 / Published: 1 July 2021

Abstract

Methods based on simulation pedagogy are widely used to practice hands-on skills in a safe environment. We evaluated the usability of an EEG simulator in a clinical neurophysiology course. Second-year biomedical laboratory science students (N = 35) on this course were included in the study and divided into three groups: two groups used the EEG simulator with different feedback modes, and one group did not use the simulator. Results were expected to reveal a correlation between user experience and learning outcomes. The study made use of a mixed-methods design. Students were asked to keep a learning diary on their experiences throughout the course; the diaries were analyzed qualitatively using content analysis. The quantitative analyses were based on a UX questionnaire that measures classical usability aspects (efficiency, perspicuity, dependability), user experience aspects (novelty, stimulation), and the students’ feelings about using the simulator. The quantitative data were analyzed using SPSS™ software. The quantitative and qualitative analyses showed that the EEG simulator, used to support the teaching–learning process, adds benefit to clinical neurophysiology education, and students felt that the simulator was useful for learning. The simulation debriefing session should be followed by a full theoretical and practical session. Students compared their learning from the simulator with that of the actual placement, which fosters reflective learning practice and again deepens the understanding of EEG electrode placement and the different wave patterns.

1. Introduction

Using technology in education has become one of the most important means of equipping learners with so-called 21st century skills. However, multiple factors influence the integration of devices into the teaching and learning process. These include poor infrastructure, inadequate technology, a lack of sufficient technological tools and of effective professional development (external factors), and low teacher self-efficacy and teacher perceptions (internal factors) [1].
Educational simulators are defined as carefully planned instructional programs that simulate real-world situations, e.g., to practice decision-making in military and medical education [2,3]. Simulators have been seen to improve medical training, for example by allowing users to interact in a virtual environment with increased contact. Simulations especially support the learning of hands-on skills, but also of theoretical knowledge, as shown in previous studies [4,5]. Simulations provide a safe environment for students to practice the role of a registered health professional [6]. Simulation also supports collaborative learning among health professionals from different disciplines and fosters multi-disciplinary work [7].
Effective simulation learning, however, requires a positive user experience (UX). UX consists of subjective cognitions of situational learning with complex and dynamic concepts. It is a consequence of the user’s internal state (motivation, mood, expectations), the characteristics of the system being used (usability, functionality), and the context in which the interaction takes place (organizational or social environment, voluntary use) [8]. To minimize the risk of missed learning outcomes through a misconception of the virtual simulator space, it is paramount to gauge UX before implementing simulators in a classroom environment [9].

2. Background

Simulation is an important teaching strategy that prepares healthcare students for professional practice and internships. The focus of this simulation is that students can practice EEG electrode placement according to the 10–20 system. Additionally, there is the possibility to recognize EEG artefacts and EEG waves (alpha, beta, theta, delta) through several provocative case studies. With the simulator, students are able to practice all of these functions. The extended simulation does not fully match the level of stress associated with working on a “real” ward, but it provides a safe environment for students to practice the role of a registered health professional [6]. It also provides the opportunity for health professionals from different disciplines to learn with the simulator and foster multi-disciplinary working skills [7].
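As a concrete illustration of what the simulator asks students to practice, the sketch below computes the midline electrode positions of the 10–20 system from a measured nasion–inion distance. It is a minimal, simplified sketch (a Python illustration of the standard 10–20 percentage rules, not code from the simulator itself).

```python
# Illustrative sketch (not the simulator's own code): midline electrode
# positions in the 10-20 system, expressed as fractions of the measured
# nasion-inion distance along the scalp midline.

MIDLINE_FRACTIONS = {
    "Fpz": 0.10,  # 10% of the distance above the nasion
    "Fz":  0.30,
    "Cz":  0.50,  # vertex: halfway between nasion and inion
    "Pz":  0.70,
    "Oz":  0.90,  # 10% of the distance in front of the inion
}

def midline_positions(nasion_inion_cm: float) -> dict:
    """Distance (cm) of each midline electrode from the nasion."""
    return {name: round(frac * nasion_inion_cm, 1)
            for name, frac in MIDLINE_FRACTIONS.items()}

# Example: a 36 cm nasion-inion measurement places Cz at 18 cm.
print(midline_positions(36.0))
```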
Previous research supports the notion that students will achieve better learning outcomes when studying is combined with a simulator rather than studying only from textual material. Simulations especially support the learning of hands-on skills, but also theoretical knowledge as shown in our previous study [4]. This study provides valuable user experience (UX) information when simulator pedagogy is used in biomedical laboratory science education and shows how the use of a simulator in hands-on skill training develops both motor and sensory skills.
Simulation-based teaching can emphasize participant, patient, and/or system outcomes. The current literature favors participant outcomes, including reaction (satisfaction, self-confidence), learning (changes in knowledge, skills, attitudes), and behavior (how learning transitions into the clinical setting) [10]. Simulation training is adept at strengthening self-efficacy [11] and further enables the maximization of cognitive learning (facts and figures) as well as psychomotor ability (acquisition of technical skills) [12]. Acquiring motor skills involves constructing a mental model. The mental model provides a conceptual representation of the skills for response production and serves as the standard for correcting responses after receiving feedback [13].
Neurophysiology is a small yet vital area of expertise in biomedical laboratory sciences, with limited training resources [14]. There is a need for professionals in EEG registration, and we should provide them with a suitable teaching approach toward high-quality skills in neurophysiology and EEG analysis [4]. Simulator pedagogy opens an opportunity for students to practice EEG placement and registration in a safe, controlled, and repeatable environment [15]. The question of how best to apply simulator-based teaching in a neurophysiology context, however, remains unanswered. This research is a case study attempting to contribute to the existing knowledge on simulator pedagogy.

3. Materials and Methods

3.1. Study Design

Thirty-five (35) second-year Turku University of Applied Sciences BLS students (aged 24 ± 3 years) were included in the study. All students participated in the clinical neurophysiology course introduction and lectures. A partially mixed sequential dominant status design, across several phases, was used, with each phase taking place in succession and more emphasis on the qualitative phases for the objective of this study [16].
Quantitatively, this study compares the learning of EEG registration between two BLS student groups who used the simulator and a control group without simulator exposure. Over the course of the experiment, all students received an opportunity to qualitatively express their opinions on the best way to use the simulator. The two groups who used the simulator kept learning diaries of their experiences. We compared the qualitative learning diary entries with the quantitative User Experience Questionnaire (UEQ) results, both to rule out negative comments on the teaching practice caused merely by undesirable UX and to enrich our insight into the experience diaries. The UEQ and the diaries collected information on the novelty, stimulation, dependability, efficiency, perspicuity, and attractiveness of the simulator. The EEG simulator development started in 2018 and has undergone several development iterations at Turku Game Lab by researchers from the Futuristic Interactive Technologies research group of Turku University of Applied Sciences.

3.2. Study Participants

After the initial lectures and course demonstration, the students were divided randomly into three groups: group 1 (n = 11), group 2 (n = 12), and group 3 (n = 12). All three groups had an edX-platform lecture about EEG placement, theory, and interpretation. After the lecture, groups 1 and 2 continued with three simulator sessions (one per week), while group 3 continued with the existing practice of completing tasks in self-preparation for the practical session to follow. After the three weeks of preparation, all three groups completed a practical session.
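For readers who want to reproduce the group split, a minimal sketch follows. The shuffle-based assignment and the student IDs are assumptions for illustration; the paper states only that the division was random, with the group sizes shown above.

```python
import random

# Hypothetical student IDs; the paper reports N = 35 second-year students.
students = [f"S{i}" for i in range(1, 36)]
random.shuffle(students)  # stand-in for the authors' randomization procedure

groups = {
    "group 1": students[:11],    # n = 11, simulator (fuzzy feedback)
    "group 2": students[11:23],  # n = 12, simulator (precise feedback)
    "group 3": students[23:],    # n = 12, control (no simulator)
}
```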

3.3. Measuring Instruments and Process

The quantitative study used the User Experience Questionnaire (UEQ) [17] to survey the user experience. The students were required to answer the UEQ after the three simulator sessions. The simulator students were divided into two groups according to the feedback type from the simulator: group 1 had fuzzy feedback and group 2 had precise feedback regarding their virtual sensor placement. Although we investigated the effects of the two feedback systems in earlier work [4], we kept the natural division in this experiment for the purpose of reliability by cross-analyzing all results of groups 1 and 2.
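The UEQ is a standardized 26-item instrument whose items aggregate into the six scales reported in this study. A minimal scoring sketch follows; the item-to-scale mapping reflects our reading of the published UEQ handbook and should be verified against the official UEQ analysis tool before reuse.

```python
from statistics import mean

# Item-to-scale mapping of the 26 UEQ items (per the UEQ handbook;
# verify against the official analysis sheet before relying on it).
UEQ_SCALES = {
    "attractiveness": [1, 12, 14, 16, 24, 25],
    "perspicuity":    [2, 4, 13, 21],
    "efficiency":     [9, 20, 22, 23],
    "dependability":  [8, 11, 17, 19],
    "stimulation":    [5, 6, 7, 18],
    "novelty":        [3, 10, 15, 26],
}

def scale_scores(item_scores: dict) -> dict:
    """Mean of the item scores that make up each UEQ scale.

    item_scores maps item number (1-26) to one respondent's recoded answer.
    """
    return {scale: mean(item_scores[i] for i in items)
            for scale, items in UEQ_SCALES.items()}
```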
We did not give the simulator groups elaborate guidance on how to use the simulator; students received some technical information and were left to learn with it once per week for three weeks following the edX-platform lecture. After each session, the simulator students were asked to reflect on: content (familiar, unfamiliar); difficulty level; usability (ease of use, guidelines, look and feel, value); experience/emotion (enjoyed, hated, frustrated, fun, useful); and process (learning vs. playing, feeling secure before going into practice, value, and feedback on learning progress) (Figure 1). The students completed the diaries in their mother tongue (Finnish), which we later had translated into English for analysis. The students were allowed to complete the diaries in any of the most recognizable formats (slide sets, spreadsheets, or word processor files). We collected the diaries before the students participated in the practical (clinical practice) session.
After the clinical practice training ended, all three groups could use the simulator to ensure the same exposure and equal opportunity for all students before examination. In conclusion, reflective conversations were held between the course lecturer and all three experiment groups. These conversations served to obtain thoughts, in a less formal (non-intimidating) setting, about how best to incorporate the simulator into the current teaching environment.

4. Results and Statistical Data Analyses

4.1. Quantitative Results

There was no significant difference between the UEQ results of the two simulator student groups. Hence, we pooled the UEQ measurements into a single dataset (n = 21). The Statistical Package for the Social Sciences (SPSS) Student Version 25 for Windows was used to calculate, analyze, and present the descriptive and inferential UEQ statistics. UEQ scores for the six dimensions used to measure the level of the students’ EEG simulation experience were reported using the mean, median, and standard deviation (SD). The internal consistency of responses for each UEQ factor was determined with Cronbach’s alpha, and the UEQ analysis was used to examine the underlying variance in student responses across the 26 measured items. Although it was not the intention of this study, we noticed some patterns along gender lines and set out to investigate these further. Admittedly, we cannot read too much into these results because of the low number of males in the study (indicative of the course gender profile), but we discuss some key points later in the article. The data were checked for normality, and means were compared using the Mann–Whitney U test. Statistical significance was set at 0.05. Table 1 summarizes the pooled UEQ results.
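To make the analysis pipeline concrete, here is a hedged Python re-creation of the two core computations named above: Cronbach’s alpha and the Mann–Whitney U comparison. The random data, group split, and 1–7 item coding are placeholder assumptions; the authors ran these analyses in SPSS, not in this code.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a respondents x items score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
# Placeholder data: 21 pooled respondents answering a 4-item UEQ scale.
scale_items = rng.integers(1, 8, size=(21, 4)).astype(float)
print(f"Cronbach's alpha = {cronbach_alpha(scale_items):.3f}")

# Illustrative male/female split (the diaries had 5 male and 13 female
# respondents); compare per-student scale means at alpha = 0.05.
scores = scale_items.mean(axis=1)
u_stat, p_value = mannwhitneyu(scores[:5], scores[5:], alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```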

4.2. Qualitative Results

The results from the diaries were analyzed using ATLAS.ti 8. A content analysis approach was used, applying inductive coding to construct a theory and deductive coding to test it [18]. Themes and codes were created on the following topics, linking them back to the UEQ: attractiveness, perspicuity, efficiency, dependability, stimulation, and novelty. The diaries also mentioned aspects related to learning with the simulator. This last topic includes codes about how to use the simulator in the course context and what subject matter would be good to know before using the simulator. Responses were independently reviewed, and keywords color-coded, by one member of the research team (M.H.B.). Coding discrepancies were discussed with two more of the study researchers, and agreement was reached by consensus. The codes were placed into their respective categories and examined for the emergence of common themes. Table 2 lists the categories, themes, and codes with their relative groundedness from the qualitative analysis.
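Groundedness in ATLAS.ti is simply the number of coded segments attached to a code. The sketch below shows, with invented placeholder records, how the groundedness and unique-mention counts of Table 2 could be tallied from an export of coded diary segments; it illustrates the bookkeeping, not the authors’ actual workflow.

```python
from collections import Counter

# Invented placeholder export: (student_id, code) for each coded segment.
coded_segments = [
    ("S7",  "Enjoyed using simulator"),
    ("S7",  "Clear guidelines"),
    ("S18", "Frustrated at first"),
    ("S18", "Frustrated at first"),  # repeat mentions raise groundedness
    ("S4",  "Frustrated at first"),
]

# Groundedness: every coded segment counts, including repeats by one student.
groundedness = Counter(code for _, code in coded_segments)

# Unique mentions: each student counts at most once per code (cf. Table 2).
unique_mentions = Counter(code for _, code in set(coded_segments))

print(groundedness["Frustrated at first"])     # 3
print(unique_mentions["Frustrated at first"])  # 2
```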
The students were most outspoken about learning with the simulator (groundedness = 86) and perspicuity (groundedness = 79), or intuitiveness, of the simulator. The learning with simulator category contained an array of codes that all pointed to a positive experience and gave us cause for a deeper exploration into the student perspective of the best practices for implementing such a simulator in a course. The perspicuity received much attention because there were some initial struggles to learn how to use the simulator. The discussion for this falls outside the scope of this study and we will address the simulator onboarding in a next development iteration of the simulator.
We transcribed and quantified the reflective conversations with all the students after the clinical practice. These conversations revealed that 17 of the 21 simulator students felt that the simulation supported the learned theory of EEG sensor placement, while 4 of the 12 control-group students expressed that self-exploration had helped them. Moreover, 15 of the 21 simulator-group students, compared with 4 of the 12 control-group students, expressed self-efficacy in setting up the electrodes during the practical session.

4.3. Combined Findings Analysis

Although we analyzed some of the themes and codes from the diaries and the respective UEQ categories along gender lines, we once again note the limitation posed by the low number of males in the study. The percentages given throughout our discussion were derived from Table 2 and indicate the proportions of male (out of 5) and female (out of 13) responses. Table 3 shows a summary of the response proportions per code.
After the first simulation, students found the use of the simulator challenging. By the end of the second and third sessions, they found the simulator familiar, and their degree of frustration had diminished enough for them to achieve a sense of flow [19]. Students thought that the simulator helped them learn the skill of placing electrodes on a skull and recognizing EEG waves. Students also felt that the simulator helped them remember the names of the electrodes, identify their placement, and measure the nasion–inion distance together with the pre-auricular points, guiding them in working with real patients in the practical session. The control group mentioned that they had difficulty remembering electrode names and skull electrode placement.
While some students rated the simulator as comparable to a real clinical situation, others felt that it did not offer a high enough level of realism.
“However, the human/patient’s head cannot be turned in the same way, making a more limited turn more convenient.” (S13)
“Still, turning the head was difficult and nerve-wracking.” (S18)
Nevertheless, the simulator supported their learning of the EEG set-up theory and hands-on work. Initially, students felt that the simulator was complicated, but it became easier after the second session. Both males (60%) and females (69%) became familiar with the simulator, and both females (54%) and males (50%) appreciated the simulation in a positive way. Females became more frustrated (92%) than males (60%).
“The feeling was that I could put electrodes on the head of a real person with a few games.” (S7)
Male students rated the perspicuity (intuitiveness) of the simulator lower (2.50) than female students did (3.97). When we asked the students about this, we learned that the males were less inclined to read the on-screen usage prompts during the initial simulator attempts. After several attempts, though, both women (69%) and men (60%) became familiar with the simulator.
“Today I played the electrode layout twice and it already went well.” (S18)
Both male (80%) and female (92%) students thought that the simulator included clear guidelines. In the UEQ, dependability was significantly different between males (2.38) and females (3.25). However, the Cronbach’s alpha (0.462) indicates that this measure is less reliable, possibly because of the small number of participants. Once the guidelines were acknowledged, students felt that they were in control of the simulated electrode placement, giving the simulator a high dependability.
Regardless of gender, students thought the simulator was an excellent way to learn (value). Observation during the practical session and reflection conversations clearly indicated that the simulator groups more easily understood the practical and laboratory protocol and its relationship in clinical situations. Students who did not use the simulator needed more teacher guidance in clinical situations.
“I need the teacher’s guidance a lot.” (S25)
There was a significant difference (p = 0.031) in how the simulator stimulated male and female students: 92% of women, compared with 60% of men, were initially not motivated to use the simulator. This was confirmed by the results of the UEQ questionnaire (Table 1). Once again, we attribute these negative sentiments to a first impression, because after the second and third simulator experiences, both male (40%) and female (77%) students felt motivated to reach the learning outcomes with the simulator.
After the first simulation session, half of the students thought that learning with the simulator was tremendous and that it added much value to the course.
“I felt that the simulation helped with setting up electrodes on the skull and gave an idea of it.” (S4)
“Positively surprised by the simulation; my first thought was that this could become helpful in learning.” (S5)
“The simulator taught a surprising amount about interpreting the curves, and in the end, it felt like the game was going well.” (S7)
However, the work leading up to using the simulator came under criticism. Female students (54%) felt they needed theoretical knowledge (e.g., EEG wave recognition) before learning by simulation, compared to 20% of the males.
“Without an adequate theoretical basis, it was difficult to identify artifacts and waves from EEG curves.” (S11)
During the reflective discussions, students also expressed that it would be useful to see EEG registration in a clinical setting before learning with a simulator. The simulator was perceived as important preparation for EEG registration in a clinical situation by 40% of male and 23% of female students, but it cannot substitute for hands-on training and actual clinical situations.
“Of course, the simulator is no substitute for hands-on training, but it could be a good addition to studying.” (S12)

5. Discussion

This study intended to answer the question of how best to use an EEG simulator in BLS teaching practice. However, before recommending a suitable simulator teaching strategy, we wanted to be sure that the simulator in question was well received by the student population of a BLS course on EEG knowledge and practical skills. We attempted to gain this understanding through quantitative and qualitative analyses of the students’ experiences with the simulator and of how these shaped their perceived self-efficacy during a subsequent practical session. Our investigation showed various levels of appreciation for the simulator design and raised the notion that male and female students differed in their learning experiences with the simulator. Although gender-related differences in accepting a simulator teaching strategy were not the primary focus of this study, the evidence thereof compelled us to discuss some of these findings in more detail.

5.1. Simulator Efficacy

For digital learning tools to be effective, they need to elicit what is known as a state of flow in the learner. Flow theory states that learners are optimally immersed in their activity when the activity in question is challenging enough not to cause a constant sense of anxiety (frustration at not being able to complete a task or set of tasks) or boredom (irritation at continuously being able to complete the activity without some sense of challenge) [20].
After the first simulator lesson, students mentioned that using the simulator was challenging. In some cases, this resulted in students not enjoying the simulator. However, after the second and third simulations, the students were committed to the point where they immersed themselves in the simulator and made the most of the learning experience. In the study, students were encouraged to return to the simulator after the first session, but we understand that some students may have rejected the simulator experience because of their initially frustrating encounter. Identifying a consistent level of challenge is part of good simulator design. Johnson and Wiles [21] explain that simulators should offer varying difficulty levels so that the challenge matches a learner’s skill level. The concept of challenge is again related to flow theory [22]. Care must be taken to avoid a steep learning curve when starting to use simulators in a teaching and learning environment. The reason for this caution is that students are not only learning a new subject or topic but are now also expected to learn to use a simulator, creating a double cognitive burden that can lead to overload and consequent frustration. Based on our experience, we conclude that this initial friction of simulator-supplemented instruction can be reduced in two ways: (a) simplify the introduction of the simulator by rolling out its features gradually; or (b) prepare students sufficiently for the experience with theory lessons that support the operation of the simulator’s virtual environment. Vorderer, Hartmann, and Klimmt [23] similarly state that a lack of previous simulator or learning experience is likely to lead to feelings of frustration.
The qualitative results showed that clear guidelines had the third highest count in the diaries, supporting the idea that establishing prior knowledge or scaffolding the initial simulator encounter would lead to a more pleasant simulator introduction. The request for guidelines supports EEG simulator dependability, which was evaluated significantly more positively by male students (Table 3). Men and women have significantly different brain patterns behind the production of visually controlled movements, although both groups are equally adept at the task [24]. Our qualitative findings likewise indicate that the trustworthiness of a simulator among female students largely depends on understanding how to use the simulator and what a simulator is used for. Without trustworthiness (or dependability), students are unlikely to show a continued interest in using a simulator [25].
Extending from dependability, the hedonic quality (or stimulation) of the simulator was also evaluated significantly more positively by male students in the quantitative measurement (Table 1) than by female students. This may be because men process visual stimuli differently and have a more acute ability to recognize their intended implication in training simulators, thereby more quickly acknowledging the value of the visuals and being stimulated by them [26]. Given that more than half of the BLS students are female, the requirement for a softer introduction to the simulator is reiterated. Notwithstanding the gender difference in hedonic quality, the UEQ questionnaire showed that the average student did like the simulator’s layout and feel (Table 1), and this is further reflected in the qualitative results, where only one student indicated that the application/simulator was ugly. The fact that the simulator proved to be attractive goes a long way toward securing its value as a teaching tool. Students are less likely to be immersed by unappealing teaching aids [27].
Another factor that influences the appeal of a simulator is captured by the novelty quality. The novelty measure of our EEG simulator showed that almost one-third of all students were not pleased while using the simulator. From the diary entries, we found that the primary reason students felt discontent with the simulator was the technical problems during the first simulation session, relating to unrealistic virtual head movement and confusing feedback regarding the virtual electrode placement. Students expect simulators to have high functional fidelity, and if such expectations are not met, the overall experience (or novelty) of the simulator is lost [28]. With a lecturer present to explain and assist with the head movement and feedback, we were able to alleviate the initial resistance to the simulator enough that significant learning still took place [4].

5.2. Learning by Simulation

All students said that they would prefer to learn some theory before using the simulator, as it would make them feel more prepared to use it. This factor could also contribute to the element of frustration reported by the students during the first simulation session. Previous learning and knowledge are essential factors for simulation to support and teach skills [29], but enjoyability is also a factor. For individuals to enjoy the simulation process, they need to feel capable [30]. Students enjoyed the realism of the EEG simulator scenarios because they were able to apply the knowledge acquired in the course to decisions in clinical situations. They also benefited from the reflection they experienced. In a previous study, we found that the EEG simulator gave the greatest improvement in learning, as assessed by pre- and post-test knowledge [4]. This met our expectation that the simulation addresses Bloom’s higher learning goals [31].

5.3. Influence of Gender

Current research points to distinct gender differences relevant to this study. There is a difference in how males and females recognize objects and horizontal and vertical axes, and, in this research, in how they experienced the simulator. Combined, the findings suggest that there are gender differences in observation within simulators as well [32]. The EEG simulator uses x-, y-, and z-coordinates for placing the electrodes; the male students therefore perceived the simulator’s measurement and electrode placement in a different, more positive way than the female students did. Both males and females thought that the simulator was challenging to use.

5.4. Revising the Simulator Pedagogy Model

Although the learning situation is complicated, effective debriefing is essential for guiding learners and is part of the learning process. Debriefing is never straightforward because it is a sensitive part of a simulation session: during it, learners are taught to understand their weaknesses and what skills they need to develop, but also which aspects of professional practice they have learned appropriately. Not all instructors have this skill, regardless of their clinical expertise or knowledge of simulation techniques and processes [33].
An interesting point from the diaries after the study is that many of the students not only commented on the simulator, but also offered opinions on the sequence of our study process (Figure 2).
The student comments on the study design gave us valuable insight toward proposing a pedagogy model for using simulators in clinical neurophysiology or BLS courses in general (Figure 3).
To improve learning with an EEG simulator, we propose to start with a high-level theory lesson as the preamble, where the use of the simulator as well as the theory related to different electrodes, their placement and EEG waveforms are explained. The simulator should then be available to students so they may learn the practical application of electrodes and reinforce their knowledge of EEG. After this first simulator session, students and teacher should engage in a debriefing session with a focused discussion on the simulator experience. The debriefing should be followed by an in-depth theory and practical session, which includes actual electrode placement. At this point, students should be tasked with comparing learning-with-simulator to actual placement. This promotes reflective learning practice to deepen understanding [34] of EEG electrode placement and different wave patterns. Students can now actively pursue simulator self-study to hone their skills before evaluation of their hands-on practice and identification of EEG patterns.
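To summarize the proposal, the sketch below encodes our reading of Figure 3 as an ordered sequence of course phases; the phase wording is paraphrased from the paragraph above rather than taken verbatim from the figure.

```python
# The revised simulator pedagogy model (paraphrasing Figure 3) as an
# ordered sequence of course phases.
REVISED_PEDAGOGY_MODEL = [
    "High-level theory lesson: simulator use, electrodes, placement, EEG waveforms",
    "First simulator session: practice electrode application, reinforce EEG knowledge",
    "Debriefing: focused discussion of the simulator experience",
    "In-depth theory and practical session, including actual electrode placement",
    "Reflective comparison of simulator learning with actual placement",
    "Simulator self-study to hone skills",
    "Evaluation of hands-on practice and EEG pattern identification",
]

for step, phase in enumerate(REVISED_PEDAGOGY_MODEL, start=1):
    print(f"{step}. {phase}")
```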

6. Conclusions

This study of students’ experiences of the simulation learning process supports our hypothesis that students achieve better learning experiences with the simulator.
Gender played a role only in the experience of the look and feel of the simulation, and these results are inconclusive due to the unequal group sizes; nevertheless, this could be considered when developing a simulation. The primary practical contribution of this study lies in the revised simulator pedagogy model (Figure 3). The revised curricular process improves the successful implementation of simulation by providing a clear path, which additionally allows various learning styles and teaching approaches to be implemented. This has led us to formulate major changes in our pedagogical plan for future simulator implementation in the classroom.
The new pedagogical model must be tested and further developed based on feedback, and automated, immediate theory-based feedback should be included in the simulator. A well-designed feedback system allows students to engage more comprehensively with the content.

7. Patents

No patents have arisen from this or previous research concerning the EEG simulator. The intellectual property rights of the EEG simulator belong to Turku University of Applied Sciences.

Author Contributions

Writing—original draft, M.H.B.; writing—review and editing, W.R., C.B.-R., J.M.L. and T.K.; supervision, C.B.-R., J.M.L. and T.K.; investigation, M.H.B.; methodology, W.R. and T.K.; data curation, M.H.B.; resources, T.K.; formal analysis, M.H.B.; software, W.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of the University of Eastern Finland.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

We would like to thank Marianne Nielsen from Denmark, and Mari Lahti from Turku University of Applied Sciences.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Inan, F.A.; Lowther, D.L. Factors affecting technology integration in K-12 classrooms: A path model. Educ. Technol. Res. Dev. 2010, 58, 137–154. [Google Scholar] [CrossRef]
  2. Bergeron, B.P. Learning & retention in adaptive serious games. Stud. Health Technol. Inform. 2008, 132, 26–30. [Google Scholar] [PubMed]
  3. Breuer, J.; Bente, G. Why so serious? On the relation of serious games and learning. J. Comput. Game Cult. 2010, 4, 7–24. [Google Scholar]
  4. Björn, M.H.; Laurila, J.M.; Ravyse, W.; Kukkonen, J.; Leivo, S.; Mäkitalo, K.; Keinonen, T. Learning impact of a virtual brain electrical activity simulator among neurophysiology students: Mixed-methods intervention study. JMIR Serious Games 2020, 8, e18768. [Google Scholar] [CrossRef] [PubMed]
  5. Rush, S.; Acton, L.; Tolley, K.; Burke, L. Using simulation in a vocational programme: Does the method support the theory? J. Vocat. Educ. Train. 2010, 62, 467–479. [Google Scholar] [CrossRef]
  6. Davies, H.; Schultz, R.; Sundin, D.; Jacob, E. ‘Ward for the day’: A case study of extended immersive ward-based simulation. Nurse Educ. Today 2010, 90, 104430. [Google Scholar] [CrossRef] [PubMed]
  7. Williams, D.; Stephen, L.; Causton, P. Teaching interprofessional competencies using virtual simulation: A descriptive exploratory research study. Nurse Educ. Today 2020, 93, 104535. [Google Scholar] [CrossRef]
  8. Hassenzahl, M.; Tractinsky, N. User experience—A research agenda. Behav. Inf. Technol. 2006, 25, 91–97. [Google Scholar] [CrossRef]
  9. Altalbe, A. Virtual Laboratories for Electrical Engineering Students: Student Perspectives and Design Guidelines. 2018. Available online: https://core.ac.uk/download/pdf/189928179.pdf. (accessed on 30 June 2021).
  10. Jeffries, P.R.; Rodgers, B.; Adamson, K. NLN jeffries simulation theory: Brief narrative description. Nurs. Educ. Perspect. 2015, 36, 292–293. [Google Scholar] [CrossRef]
  11. Palmer, D.H. Sources of self-efficacy in a science methods course for primary teacher education students. Res. Sci. Educ. 2006, 36, 337–353. [Google Scholar] [CrossRef]
  12. Kardong-Edgren, S.; Adamson, K.A.; Fitzgerald, C. A review of currently published evaluation instruments for human patient simulation. Clin. Simul. Nurs. 2010, 6, e25–e35. [Google Scholar] [CrossRef]
  13. De Koning, B.B.; Bos, L.T.; Wassenburg, S.I.; van der Schoot, M. Effects of a reading strategy training aimed at improving mental simulation in primary school children. Educ. Psychol. Rev. 2017, 29, 869–889. [Google Scholar] [CrossRef] [Green Version]
  14. Lupsakko, M. Kansallinen Selvitys Kliinisen Neurofysiologian Opetuksesta Bioanalytiikan Koulutusohjelmassa Ammattikorkeakouluissa [National Survey of Clinical Neurophysiology Teaching in Biomedical Laboratory Science Degree Programmes at Universities of Applied Sciences]. Master’s Thesis, Metropolia University of Applied Sciences, Vantaa, Finland, 2012. Available online: https://www.theseus.fi/bitstream/handle/10024/52350/Lupsakko_Milla_Opinnaytetyo%2029.10.2012.pdf?sequence=1 (accessed on 30 June 2021).
  15. Schrier, K. Learning, Education and Games. Volume One: Curricular and Design Considerations; Carnegie Mellon University: Pittsburgh, PA, USA, 2014. [Google Scholar] [CrossRef]
  16. Leech, N.L.; Onwuegbuzie, A.J. A typology of mixed methods research designs. Qual. Quant. 2007, 43, 265–275. [Google Scholar] [CrossRef]
  17. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Construction of a benchmark for the user experience questionnaire (UEQ). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 40. [Google Scholar] [CrossRef] [Green Version]
  18. Sarker, S.; Lau, F.; Sahay, S. Using an adapted grounded theory approach for inductive theory building about virtual team development. ACM SIGMIS Database DATABASE Adv. Inf. Syst. 2000, 32, 38–56. [Google Scholar] [CrossRef]
  19. Mäkinen, H.; Haavisto, E.; Havola, S.; Koivisto, J. User experiences of virtual reality technologies for healthcare in learning: An integrative review. Behav. Inf. Technol. 2020, 1–17. [Google Scholar] [CrossRef]
  20. Csikszentmihalyi, M. Flow: The psychology of optimal experience. J. Leis. Res. 1990, 24, 93–94. [Google Scholar]
  21. Johnson, D.; Wiles, J. Effective affective user interface design in games. Ergonomics 2003, 46, 1332–1345. [Google Scholar] [CrossRef] [Green Version]
  22. Kiili, K.; Lainema, T.; de Freitas, S.; Arnab, S. Flow framework for analyzing the quality of educational games. Entertain. Comput. 2014, 5, 367–377. [Google Scholar] [CrossRef]
  23. Vorderer, P.; Hartmann, T.; Klimmt, C. Explaining the enjoyment of playing video games: The role of competition. 2003. [CrossRef]
  24. Gorbet, D.J.; Sergio, L.E. Preliminary sex differences in human cortical BOLD fMRI activity during the preparation of increasingly complex visually guided movements. Eur. J. Neurosci. 2003, 25, 1228–1239. [Google Scholar] [CrossRef]
  25. Maksoud, N.F.A. When virtual becomes better than real: Investigating the impact of a networking simulation on learning and motivation. Int. J. Educ. Pract. 2003, 6, 253–270. [Google Scholar] [CrossRef] [Green Version]
  26. Halpern, D.F.; Benbow, C.P.; Geary, D.C.; Gur, R.C.; Hyde, J.S.; Gernsbacher, M.A. The science of sex differences in science and mathematics. Psychol. Sci. Public Interest 2007, 8, 1–51. [Google Scholar] [CrossRef] [Green Version]
  27. Christopoulos, A.; Conrad, M.; Shukla, M. Increasing student engagement through virtual interactions: How? Virtual Real. 2018, 22, 353–369. [Google Scholar] [CrossRef] [Green Version]
  28. Ravyse, W.; Blignaut, A.; Botha-Ravyse, C. Codebook co-development to understand fidelity and initiate artificial intelligence in serious games. Int. J. Game-Based Learn. 2020, 10, 37–53. [Google Scholar] [CrossRef]
  29. Sweetser, P.; Wyeth, P. GameFlow: A model for evaluating player enjoyment in games. ACM Comput. Entertain. 2005, 3, 3. [Google Scholar] [CrossRef]
  30. Hanekom, S.M.; Botha-Ravyse, C. Does a simulation game for management in health science elicit learning? A mixed method approach. In Proceedings of the EdMedia + Innovate Learning 2019, Amsterdam, The Netherlands, 24–28 June 2019; pp. 1117–1126. [Google Scholar]
  31. Marzano, R.J.; Kendall, J.S. The New Taxonomy of Educational Objectives, 2nd ed.; Corwin Press: Thousand Oaks, CA, USA, 2007; Available online: https://www.ifeet.org/files/The-New-taxonomy-of-Educational-Objectives.pdf (accessed on 12 January 2021).
  32. Zhang, X.; Li, Q.; Eskine, K.J.; Zuo, B. Perceptual simulation in gender categorization: Associations between gender, vertical height, and spatial size. PLoS ONE 2014, 9, e89768. [Google Scholar] [CrossRef]
  33. Der Sahakian, G.; Alinier, G.; Savoldelli, G.; Oriot, D.; Jaffrelot, M.; Lecomte, F. Setting conditions for productive debriefing. Simul. Gaming 2015, 46, 197–208. [Google Scholar] [CrossRef] [Green Version]
  34. Delany, C.; Watkin, D. A study of critical reflection in health professional education: ‘Learning where others are coming from’. Adv. Health Sci. Educ. 2009, 14, 411–429. [Google Scholar] [CrossRef]
Figure 1. Experiment design.
Figure 2. The study process followed with simulation groups 1 and 2.
Figure 3. Proposed pedagogy model for most effective learning with a simulator.
Table 1. Quantitative results from the user experience questionnaire.

| UX Dimension   | Mean (SD)    | Male (n = 5) Md | Female (n = 13) Md | Cronbach Alpha (Question Items) |
|----------------|--------------|-----------------|--------------------|---------------------------------|
| Attractiveness | 3.04 ± 0.734 | 2.24            | 3.23               | 0.779 (6)                       |
| Perspicuity    | 3.52 ± 1.432 | 2.50            | 3.97               | 0.779 (6)                       |
| Dependability  | 3.08 ± 0.722 | 2.38            | 3.25               | 0.462 (4)                       |
| Stimulation    | 2.78 ± 0.619 | 2.19            | 2.92               | 0.596 (4)                       |
| Novelty        | 3.27 ± 0.439 | 3.00            | 3.34               | 0.596 (4)                       |
| Efficiency     | 3.29 ± 0.980 | –               | –                  |                                 |
Table 2. Qualitative results from the user diaries.

| Code | Code Groundedness | Total Unique Mentions (n = 18) | Male Mentions (n = 5) | Female Mentions (n = 13) |
|---|---|---|---|---|
| Comment category: Attractiveness (category groundedness = 3) | | | | |
| Positive simulator aesthetic | 2 | 1 | 0 | 1 |
| Negative simulator aesthetic | 1 | 1 | 0 | 1 |
| Comment category: Perspicuity (category groundedness = 79) | | | | |
| Simulator complexity appropriate | 19 | 15 | 4 | 11 |
| Became familiar with simulator | 24 | 12 | 3 | 9 |
| Frustrated at first | 29 | 15 | 3 | 12 |
| Remained unfamiliar with simulator | 7 | 5 | 1 | 4 |
| Comment category: Efficiency (category groundedness = 40) | | | | |
| Difficult to use simulator | 24 | 15 | 4 | 11 |
| Easy to use simulator | 16 | 12 | 3 | 9 |
| Comment category: Dependability (category groundedness = 27) | | | | |
| Clear guidelines | 27 | 16 | 4 | 12 |
| Comment category: Stimulation (category groundedness = 50) | | | | |
| Enjoyed using simulator | 25 | 12 | 2 | 10 |
| Unpleasant to use simulator | 7 | 5 | 1 | 4 |
| Simulator is a nice way to achieve learning outcomes | 18 | 10 | 3 | 7 |
| Comment category: Novelty (category groundedness = 7) | | | | |
| Simulator is a fun, new way to learn | 7 | 5 | 0 | 5 |
| Comment category: Learning with simulator (category groundedness = 86) | | | | |
| Appropriate learning content difficulty | 19 | 15 | 4 | 11 |
| Under-prepared for simulator work | 11 | 7 | 0 | 7 |
| Theory before using simulator | 11 | 8 | 1 | 7 |
| Learning with simulator is reliable | 15 | 10 | 3 | 7 |
| Simulator as prior learning | 5 | 5 | 0 | 5 |
| High self-efficacy for practical after simulator | 6 | 5 | 2 | 3 |
| Adds learning value to course | 13 | 7 | 2 | 5 |
| Live demo before simulator | 6 | 4 | 2 | 2 |
Table 3. Male and female response proportions.

| Code | Proportion of Male Mentions (n = 5) | Proportion of Female Mentions (n = 13) |
|---|---|---|
| Attractiveness | | |
| Positive simulator aesthetic | 0 | 8% |
| Negative simulator aesthetic | 0 | 8% |
| Perspicuity | | |
| Simulator complexity appropriate | 80% | 85% |
| Became familiar with simulator | 60% | 69% |
| Frustrated at first | 60% | 92% |
| Remained unfamiliar with simulator | 20% | 31% |
| Efficiency | | |
| Difficult to use simulator | 80% | 85% |
| Easy to use simulator | 60% | 69% |
| Dependability | | |
| Clear guidelines | 80% | 92% |
| Stimulation | | |
| Enjoyed using simulator | 40% | 77% |
| Unpleasant to use simulator | 20% | 31% |
| Simulator is a nice way to achieve learning outcomes | 60% | 54% |
| Novelty | | |
| Simulator is a fun, new way to learn | 0 | 38% |
| Learning with simulator | | |
| Appropriate learning content difficulty | 80% | 85% |
| Under-prepared for simulator work | 0 | 54% |
| Theory before using simulator | 20% | 54% |
| Learning with simulator is reliable | 60% | 54% |
| Simulator as prior learning | 0 | 38% |
| High self-efficacy for practical after simulator | 40% | 23% |
| Adds learning value to course | 40% | 38% |
| Live demo before simulator | 40% | 15% |