Project Report

Using Technology-Supported Approaches for the Development of Technical Skills Outside of the Classroom

School of Dentistry, University of Liverpool, Liverpool L3 5PS, UK
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(3), 329; https://doi.org/10.3390/educsci14030329
Submission received: 2 January 2024 / Revised: 10 March 2024 / Accepted: 13 March 2024 / Published: 20 March 2024

Abstract

The COVID-19 pandemic, and the subsequent lockdown, had a significant and disproportionate impact on subjects that required the development of clinical technical skills due to the lack of access to simulation classrooms and patients. To directly address this impact, we developed a conceptual framework for the design and implementation of a progressive simulation approach from the perspective of a teacher. This conceptual framework integrates and draws from key theories of simulation design, curriculum integration, learner motivation, and considerations of the facets of good assessment, including modern approaches to validity. We used the conceptual framework to inform and develop a progressive simulation design to support the development of essential intra-oral suturing skills outside of the classroom, at home, while still being able to provide external feedback as if learners were in the classroom or clinic. Moreover, the approach described significantly extended the available opportunities for deliberate practice, assisting with the automation of essential skills and aiming to better support learner development during face-to-face patient opportunities. Although further work is needed, we believe that our conceptual framework could be applied to any situation where progressive simulation is seen as beneficial, especially if there are elements of key skills that could initially be developed using a simple take-home simulator.

1. Introduction

The key role of simulation in medical education is that it allows a new skill to be developed and repeatedly practiced in a safe, standardised, and predictable environment [1,2]. Furthermore, there is increasing evidence demonstrating that the active use of simulation has significant and demonstrable benefits to real-world patient care and learner development [3,4,5], especially when combined with deliberate practice strategies [3]. These benefits of simulation have been recognised by the International Association for Health Professions Education (AMEE) who have produced a practical guide covering the best practice for the design and integration of simulation within clinical curricula [6]. Therefore, it is not surprising that regulators of the medical professions [7], including the UK General Dental Council (GDC) [8], regard simulation as an ideal methodology to provide evidence that new skills have been safely acquired before those skills can be further developed through direct patient contact [9]. In addition to protecting the public, the use of simulation can also potentially address issues such as limited opportunities for learners to practice, which are frequently related to increases in student (undergraduate and postgraduate) numbers combined with the same level of staffing, leading to competition pressures for clinical time and resources [1,2].
Simulation has been defined as ‘a person, device, or set of conditions which attempts to present [education and] evaluation problems authentically. The student or trainee is required to respond to the problems as he or she would under natural circumstances’ [10].
This definition highlights the importance of authenticity and, in doing so, suggests the need for high-fidelity simulation, a need that has also been reinforced by several influential publications [3,10], including the AMEE guide [6]. The perceived need for authenticity has driven the development of ever more sophisticated training devices that, in addition to traditional body part simulators (e.g., urethral catheterisation [11] or dental phantom heads [12]), now include patient-specific virtual reality simulation [13], artificial intelligence and machine learning [14], and haptics [15]. In many cases, these developments have increased the cost and physical size of the training device, meaning that the simulators are limited in number and are frequently located in dedicated multi-stationed facilities, a situation that unfortunately can limit learning opportunities [7].
The COVID-19 pandemic presented significant challenges for higher education because its effects were so wide-ranging, with negative impacts on student and staff well-being [16], substantive changes to teaching and assessment delivery [17], and deleterious impacts on institutional finances [18]. Moreover, for medical and dental education, these challenges were especially acute because learners no longer had access to simulators or patients, which meant that without rapid change and innovation, their ability to develop, graduate, and enter their respective professional register had effectively been suspended. Therefore, the COVID-19 challenge for dental education was 'how can we both maintain and continue to develop clinical technical skills in learners who do not have access to simulators or patients?'
As for most other educators, meeting these challenges within our dental school drove an unprecedented shift towards digital-technology-supported solutions [19], which included the development of virtual patients, the implementation of on-line classrooms, and an increased use of virtual learning environments. However, no matter how innovative these approaches were, they could not support the development of surgical techniques that relied on highly specific psychomotor skills.
To directly address this problem concerning the development of specific psychomotor skills, we devised and implemented a conceptual framework for progressive simulation that could start with a ‘take-home’ low-fidelity simulation approach. Our framework enables a teacher-focused approach to a progressive learning design by integrating all the facets that have been identified as important for the design, implementation, and learner development of a simulation [3,4,6,10,20,21].
In this manuscript, we justify and describe the conceptual framework and explain how it was implemented within our curriculum during COVID-19 and beyond, using suturing skills as the prototype. Our very preliminary data indicate that the framework has been well received, and we suggest that it has the potential to act as a novel approach to enhance the development of a multiplicity of skills by significantly increasing the opportunities for deliberate practice and feedback.

2. Methods

The conceptual framework was derived using Weick’s [22] theoretical framework development strategy to explore the evidence and theories discussed below, with the aim of achieving conceptual integration to inform the development of educational practice [23]. To initially develop, implement, and test the conceptual framework, suturing was used as the prototype skill because it is a skill that has been shown to be amenable to progressive development and can start with easily deployable low-fidelity simulation approaches that could be operationalised in a take-home format [24,25,26,27].

2.1. Simulation Design

Medical simulators are often not portable, and as a result, they have tended to be used in isolation from each other [28], which has likely further influenced the reductionist approaches to their use, increasing their distance from real-world authenticity. However, Ellaway and colleagues (2009) [28] have suggested connecting and combining simulation modalities in a framework they referred to as 'practica continua', which was grounded in Vygotsky's widely cited theory of the 'zone of proximal development' [29] and Bruner's concept of 'scaffolding' [30]. The practica continua framework supports the concept that simulations with different dimensions, such as technical, communication, simulated patient, and teamwork, can be progressively integrated to move from individual simulated elements to an integrated, synthesised situational environment, easing the transition into a real-world setting [28]. At the level of the technical skills dimension, the framework also fosters the possibility of linking a series of related simulated technical tasks of increasing sophistication, allowing for learner 'stretch', which is known to be essential for skill development [31,32]. Crucially, within medical education, the use of a 'partial task trainer' [27], or of progressive simulation approaches with increasing sophistication, has been shown to have positive benefits for the development of technical skills [33,34,35]. These findings on the benefits of progressive simulation are also synergistic with theories that relate to supporting learning by limiting the cognitive load to which the learner is subjected at any one time [36]. Overall, the results from these studies suggest that a progressive approach to simulation may also allow automation of aspects of the skill [31,32], freeing up additional cognitive capacity for managing new, unseen issues during patient-facing care.
Taken together, these findings suggest that a justifiable approach would be 'reverse-engineering' from a real-world authentic task to (a) identify the totality of the skills being employed in the real-world task and their relationships; (b) use available assessment data to identify the early 'threshold concepts' (or skills) [37] that learners struggle with the most to support deliberate practice [31,32], so that these threshold skills can form the initial focus of the overall skill to be developed; and (c) implement the practica continua design to further develop and integrate the foundation skills with other key skills in order to progressively move from simulation to patient-facing care. An important additional consideration is that care must be taken with respect to how the progressive simulations are designed and integrated so that both learners and faculty understand what each level of simulation is actually demonstrating, in order to prevent the development of misplaced learner confidence [28,38] or, in an extreme case, patient harm due to a progression panel giving the outcome of a simulation inappropriate meaning in relation to real-world skills [39]. Therefore, in common with all forms of assessment, the validity of an approach must be established because it is essential to understand its limitations in order to ensure that the appropriate 'interpretation use argument' [40] is derived and understood in relation to any learner feedback provided or decisions being made with respect to the implied real-world skills attributed to the simulation outcome data. To help establish the interpretation use argument, Kane (2013) [40] suggests the consideration of four inferences: Scoring, Generalisation, Extrapolation, and Implication, for which it is necessary to gather evidence to support the credibility and trustworthiness of the decisions being made. An illustrative framework for how to use these inferences has been provided by Cook (2015) [41], which we have used previously to investigate aspects of our own assessment validity in relation to making progress decisions [42].
In addition to the need to integrate and direct key aspects of a progressive simulation design, a conceptual framework would also need to support essential aspects of the learning design and environment because, due to the COVID-19 lockdown, learning and skill development would have to take place off campus, on-line, at home.

2.2. Learning Environment

The available data strongly suggest that the core attributes of successful learning with simulation are the opportunity for repetitive/deliberate practice; good pre-event staging information for learners; the provision of individualised feedback to close the performance gap post-event; and a range of difficulties within the skill to be developed, combined with the provision of a controlled and safe environment for learning [4,6,10]. These core attributes are not surprising; they apply to many aspects of medical education and synergise with established theories that try to explain the motivation to learn [43]. Cook and Artino (2016) [43] reviewed the intersection between key motivational theories, and their findings provide important insights for simulation learning design, especially if it is going to be deployed remotely in an isolated, take-home, self-directed setting. Two of the key motivators identified were as follows: (1) self-efficacy [44,45], which speaks to the expectancy of success or failure, meaning that the initial simulated task must seem readily achievable. Therefore, the staging of the simulation difficulty is likely to be important not only from the perspective of cognitive load [36] but also for motivation, a relationship that recent data support [46]; and (2) the perceived 'value' or anticipated result of the task to be undertaken, which can also be positively or negatively affected by social influences, impacting both intrinsic and extrinsic motivation [47]. The value influence suggests that (a) it is essential for learners to realise how the skill supports their professional activities, and (b) consideration needs to be given to the environment in which they are learning, because a competitive environment can reinforce the prevalence of a fixed mindset [43,48], which is the opposite of the growth mindset required to accept and respond to feedback [49]. This latter point, regarding mindset, is especially important for the current generation of learners because there is a high prevalence of a fixed mindset, leading to avoidance behaviours to prevent 'failure', which should be considered when designing any form of learning [49,50,51]. That said, data from a study that compared learning gain in small on-site and larger off-site dedicated simulation facilities did not identify any significant differences in learning [7]. However, it is important to note that this study, published in 2017, would have included relatively few Generation Z learners. Furthermore, the studies included in the analysis all involved communal simulation facilities, albeit of different sizes; therefore, a competition effect could have been missed because no learners performed simulation in isolation [7]. Consequently, the conceptual framework includes key considerations with regard to the learning environment.
In addition, data from the same study highlighted the potential advantages of smaller on-site simulation facilities [7], including increased ease of access, the ability to train when less fatigued, a lower risk of cancellation, reduced travel time, and a less anxiety-provoking setting, all of which would also apply to a take-home simulation.
The importance of the provision of useful and timely feedback that informs learner development is a well-established essential attribute that is identified in the medical simulation literature [4,6,10], the motivational literature [43], and the more general educational literature [49,52]. However, the provision of ideal feedback has been noted to be a barrier in the few studies that have investigated simulation in take-home situations [5,53]. In both of these studies, the feedback approach was self-appraisal, using self-rating against a set of criteria in the absence of external input. This self-rating approach does not fulfil all the ideal principles of feedback, which require external input [52] and can also contribute to motivation [43]. Therefore, there is a need to address this shortfall in take-home simulation designs.
For the feedback to have the right educational impact [54], thought must also be given to whether the simulation is being deployed as 'training' or 'assessment' [7]. Careful consideration must therefore also be given to how the take-home simulation will be integrated into the current curriculum. The importance of curriculum integration for simulation has been identified by several authors [4,6,10]. The approach that has been widely adopted is Kern's six-step approach to curriculum development for medical education (2009) [20], which comprises the following: (1) Problem identification, (2) Targeted needs assessment, (3) Goals and objectives, (4) Educational strategies, (5) Implementation, and (6) Evaluation and feedback. More recently, Kern's approach as applied to simulation was re-evaluated by Khamis (2016) [21], who agreed fully with Kern's steps but suggested that step 6 should be further divided into Individual Assessment/Feedback and Program Evaluation, giving a total of seven steps.
For our dental curriculum, integration of the take-home simulation was also critical because the curriculum follows a programmatic approach to assessment [55,56,57,58,59], which requires evidence of developmental progression through the meticulous integration and triangulation of data from all forms of assessment. In the paradigm of programmatic assessment, there is no distinction between formative and summative assessment [56,57,59], which conceptually means that, in our deployment, the take-home simulation could be considered part of longitudinal training.

2.3. Acceptability and Feasibility

It has been theorised that for any assessment to have appropriate ‘utility’, it must be valid and reliable, have a positive educational impact, be acceptable to stakeholders, and be feasible [54]. For the conceptual framework, we already incorporated the modern concepts of validity and reliability through the need to use interpretation use arguments [40], as well as aspects of educational impact that relate to the learning environment and the use of the simulation as training or assessment. However, the conceptual framework also needed to include considerations over ‘acceptability’ and ‘feasibility’, which ultimately focus on stakeholder engagement (staff, students, faculty) in terms of time commitment and costs.

3. Results

3.1. Conceptual Framework for the Design and Integration of a Take-Home Simulation

The conceptual framework presented below (Figure 1) was used to provide a teacher perspective for the design and implementation of a progressive simulation, containing a take-home simulation stage. This framework integrated important design and implementation considerations that include utility frameworks for assessment [54], modern approaches to validity and reliability [40], and curriculum integration [20,21], as well as essential facets required for good simulation [6,10,21].

3.2. Using the Framework to Prototype a Take-Home Suturing Simulation

3.2.1. Simulation Design

Suturing is a method of wound closure that promotes healing by bringing the wound edges into close apposition [60]. A good suturing technique requires gentle tissue handling and careful suture placement, combined with accurate knot tying with appropriate tension to ensure the correct alignment and apposition of the tissue planes in order to maximise their associated blood supply for healing. Therefore, as with most other technical skills, a successful suturing outcome relies on the ability of the operator to master, integrate, and deploy a distinct set of technical skills, underpinned by the requisite knowledge, so that they can adapt to the contextual needs of the specific situation, i.e., be competent [61].
The available simulators for suturing range from rudimentary foam pads that enable the learner to place a suture in an easily accessible space [26] through significantly more costly endoscopy box simulators (that, although static, simulate space limitation), all the way to virtual reality simulators that provide tactile, auditory, and visual feedback of both patient and procedural factors [62].
In undergraduate dentistry programmes, a common method of simulation employed to teach suturing is class-based group teaching using a rudimentary foam pad [24]. This method of simulation follows a reductionist approach to skill development but arguably allows the learner to actively practice, develop, and safely automate the key skills of suture instrument handling and knot tying. However, teaching often takes place in blocks that limit recurrent opportunities for practice, which, when combined with limited opportunities to place sutures in patients, can put learners at risk of de-skilling [24].
The advent of COVID-19 lockdown meant that this already less-than-ideal theoretical learning situation became worse because even the face-to-face class-based block teaching could no longer be delivered. However, available data and experience suggest that the foam-pad simulation approach has developmental benefit [24,25,26,27], and it is very easy to video record a learner placing a suture on a foam pad, making it amenable to technology-supported approaches for assessment and feedback.
In the real world of general dental practice, suturing is normally undertaken under local anaesthetic in a small team dental setting. Therefore, from clinical experience, the domain skills required for the successful placement of a suture include the following [25]:
  • Clinical domain: appropriate and safe administration of the local anaesthetic; safe and appropriate use of suturing instruments; appropriate tissue handling, suture placement to relocate tissues and promote healing; suture handling and knot tying; and appropriate levels of infection control and sharps protection.
  • Knowledge domain: understanding of the local anaesthetic agents, along with their indications and contraindications; understanding of the local anatomy; a critical understanding of the aims of the suturing technique and how to adapt it to the current circumstances; and an understanding of the suture material and its selection.
  • Communication domain: ability to effectively communicate with a dental nurse and the patient to ensure effective procedure delivery as well as manage any anxiety; confirm consent; and write up the case notes.
  • Management and Leadership: ability to work effectively and lead the team and mitigate for human factors.
  • Professionalism: ability to take responsibility for the procedure and its outcome; identify any on-going personal developmental needs.
Using the basic foam pad simulator, combined with an appropriate checklist approach, it was possible to develop an interpretation use argument [40] to gain appropriate insight into the following in the context of a progressive simulation design:
  • Clinical domain: the mechanics of tying a knot and moving the hands—demonstrating the safe and appropriate use of suturing instruments; appropriate tissue handling and suture placement to relocate tissues and promote healing; suture handling and knot tying; and appropriate levels of infection control and sharps protection [63].
  • Knowledge domain: a critical understanding of the aims of the suturing technique and some insight into the ability to adapt it to the current circumstances; and an understanding of the suture material and its selection.
  • Professionalism: ability to identify any on-going developmental needs and address them through deliberate practice.
By referencing the longitudinal developmental data gathered from years of work-based assessment held within our database, we were able to identify ‘threshold skills’ through establishing the areas that learners typically struggled to master when placing sutures [42]. The data clearly demonstrated that the areas of difficulty that frequently limited a learner’s ability to independently place a suture in a patient were in relation to suture handling and knot tying, skills that could be developed on a basic foam simulator (see above).
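For illustration only, the short Python sketch below shows one way such threshold skills could be surfaced from a longitudinal export of developmental indicator (DI) ratings; the file name, column names, and DI cut-off are hypothetical and do not describe the analysis actually performed on the LiftUpp database.

```python
# Illustrative sketch only: surfaces candidate "threshold skills" as the
# suturing micro-skills with the lowest mean developmental indicator (DI)
# scores in a longitudinal work-based assessment export. The file name,
# column names ("micro_skill", "di"), and the DI cut-off are hypothetical
# and do not describe the actual LiftUpp database or analysis.
import pandas as pd

def candidate_threshold_skills(csv_path: str, di_cutoff: float = 4.0) -> pd.Series:
    """Return micro-skills whose mean DI falls below the cut-off, lowest first."""
    records = pd.read_csv(csv_path)  # one row per observed micro-skill rating
    mean_di = records.groupby("micro_skill")["di"].mean().sort_values()
    return mean_di[mean_di < di_cutoff]

if __name__ == "__main__":
    struggling = candidate_threshold_skills("suturing_assessments.csv")
    print(struggling)  # e.g., suture handling and knot tying would rank lowest
```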
It is well established both theoretically [31,32] and in practice [26] that the development of the threshold skills identified will also require learner ‘stretch’, or increasing difficulty. Therefore, the practica continua outlined for suturing skills was as follows:
  • Simulation 1—Provide learning support over key knowledge aspects, highlighting to learners the links to the real-world task, e.g., relationships between suture handling, knot tying, and placement, as well as tissue blood supply and healing. The simulation was carefully designed to require a focus on the identified threshold skills and minimise the cognitive load by requiring learners to practice the simplest form of suturing. The interpretation use argument [40] indicated that the scoring approach (see later) would be appropriate to demonstrate the areas identified in the clinical, knowledge, and professional domains, meaning that at the point of demonstrating performance consistency, learners would show evidence of having gained the required skills and being ready to move to the next simulation.
  • Simulation 2—Increase the difficulty of the suture placement by limiting access to the suture site using a simple physical barrier (a plastic cup with the bottom cut off), focusing again on the identified threshold skills. The interpretation use argument [40] indicated that this limited level of generalisability, combined with the scoring approach and growing longitudinal data, would show evidence of having gained the required skills and being ready to move to the next simulation.
  • Simulation 3—Introduce new suturing techniques that again increase the stretch in the identified threshold skills. The interpretation use argument [40] indicated that this increased level of generalisability, combined with the scoring approach and growing longitudinal data, would show evidence of having gained the required skills and being ready to move to the next simulation.
  • Simulation 4—In a simulation suite, move to placing sutures in a phantom head that limits access and has more realistic soft tissues; increase stretch through altering suture location and type. The interpretation use argument [40] indicated that this increased level of generalisability, combined with more sophisticated evidence of extrapolation, the scoring approach, and growing longitudinal data, would show evidence of having gained the required skills and being ready to move to the next simulation.
  • Simulation 5—Introduce suture placement on a real patient, commencing with simple and moving to more complex.
Stages 1–4 could all be achieved using a simple foam pad.
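To make the staged design easier to scan, the sketch below encodes the practica continua as data in Python; the class and field names are our own illustrative choices rather than part of the published framework, and the stage descriptions are paraphrased from the list above.

```python
# Illustrative encoding of the suturing practica continua described above.
# The data structure and field names are a hypothetical convenience, not part
# of the conceptual framework; stage descriptions are paraphrased from the text.
from dataclasses import dataclass

@dataclass(frozen=True)
class SimulationStage:
    number: int
    setting: str   # where the stage takes place
    stretch: str   # the additional difficulty introduced at this stage

PRACTICA_CONTINUA = [
    SimulationStage(1, "take-home foam pad",
                    "simplest suture; threshold skills of suture handling and knot tying"),
    SimulationStage(2, "take-home foam pad",
                    "access limited by a simple physical barrier (cut-down plastic cup)"),
    SimulationStage(3, "take-home foam pad",
                    "new suturing techniques that further stretch the threshold skills"),
    SimulationStage(4, "simulation suite, phantom head",
                    "limited access, more realistic soft tissues, varied suture location and type"),
    SimulationStage(5, "clinic, real patient",
                    "simple cases progressing to more complex"),
]

if __name__ == "__main__":
    for stage in PRACTICA_CONTINUA:
        print(f"Simulation {stage.number} ({stage.setting}): {stage.stretch}")
```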

3.2.2. Learning Environment

Video resources have previously been shown to be useful for the development of suturing skills [24]. Therefore, a suite of videos was developed and hosted in Panopto, which the school had purchased early in lockdown. Each individual video, within the suite, focused on a single suturing technique to limit cognitive load. However, the suite of videos was designed to work synergistically to support skill automation by requiring learners to re-use the same basic threshold skills while undertaking different techniques that were increasingly complex, according to the simulation practica continua.
In addition, the videos were organised to encourage consideration of the value of the skill, divided into a forethought stage (i.e., what they were trying to achieve in the real world) and a performance stage (i.e., hints and tips) to support the principles of good simulation design [4,6,10]. It was also ensured that the format of each video was constructively aligned to the assessment criteria [64] to facilitate both the staff provision and learner understanding of any subsequent feedback provided to support deliberate practice. Once finalised, the videos were hosted in a dedicated space within the virtual learning environment (CANVAS) along with additional material that focused on supporting the understanding of suture materials, techniques, and their relationship to anatomical requirements.
To complement the learning resources, each student was supplied with their own personal simulation kit, which comprised scissors, needle holders, tissue forceps, silk sutures, a skin pad, and a sharps box for a safe means of suture disposal. Prior to the distribution of the simulation kits, risk assessment was undertaken, and a ‘Safety usage and Sharps Disposal Protocol and Agreement’ was developed that required learners to be professionally committed to engage safely because they would be using sharps in an unsupervised environment.
At the start of every month, students were assigned a new simulation stage in the practica continua supported by its requisite video and associated material in the virtual learning environment. The learners were able to repeatedly practice the technique at home using the learning resources as a reference for self-directed learning. Then, before a set deadline, they recorded a video of themselves placing the required suture on their skin pad using their personal mobile device, which they then uploaded to Panopto.

3.2.3. Acceptability and Feasibility

To support acceptability and feasibility, we developed a web-based portal so that individual staff could be assigned to several learners to provide detailed feedback. The portal used the application programming interface (API) available for Panopto so that staff could see the uploaded videos assigned to them (Figure 2a).
To provide the external feedback for each learner, it was necessary to design an interface for staff that could simultaneously display the uploaded Panopto-hosted skill video along with the ability to provide appropriate feedback (Figure 2b). Therefore, we took advantage of an existing technology-supported system that we had developed to manage work-based assessment and feedback known as LiftUpp [59,65,66], which also ensured staff and student familiarity. Furthermore, LiftUpp was used to triangulate all our data collected from work-based assessments to support our programmatic methodology for student development. Therefore, this approach also ensured the seamless integration of this take-home simulation into the existing curriculum.
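As a hedged illustration of the kind of integration described above, the sketch below shows how a portal backend might list the uploaded videos assigned to one member of staff through Panopto's REST API; the tenant URL, folder layout, bearer-token authentication, and exact endpoint path are assumptions made for the example, not a description of the portal we built.

```python
# Hedged illustration of the kind of call a portal backend might make to list
# the uploaded suture videos assigned to one assessor via Panopto's REST API.
# The tenant URL, folder layout, bearer-token authentication, and exact
# endpoint path are assumptions for this example only.
import requests

PANOPTO_HOST = "https://example.hosted.panopto.com"               # hypothetical tenant
SESSIONS_PATH = "/Panopto/api/v1/folders/{folder_id}/sessions"    # assumed endpoint

def list_assigned_videos(folder_id: str, token: str) -> list[dict]:
    """Return basic metadata for each recording in a student's upload folder."""
    response = requests.get(
        PANOPTO_HOST + SESSIONS_PATH.format(folder_id=folder_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    response.raise_for_status()
    return [
        {"id": s["Id"], "name": s["Name"], "duration_s": s.get("Duration")}
        for s in response.json().get("Results", [])
    ]
```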
To manage work-based assessment, LiftUpp uses a forms structure for each skill, which is divided into a series of micro-skills, each essential for the successful completion of the task, and aims to focus the learner on the specific deliberate practice needed. In LiftUpp, the micro-skills for suturing closely resemble those reported previously [25], which also include the threshold skills identified for suturing (highlighted in Table 1):
Prior to COVID-19, in the simulated classroom, following observation, a trained and calibrated member of staff would provide feedback using a 6-point developmental indicator (DI) scale aligned to the developing independence of the learner, along with any additional written coaching feedback to support self-reflection [47] (Table 2).
To support the same provision of external feedback for suturing at home, we created a facsimile of the LiftUpp suture form in the web portal and used the existing LiftUpp API for staff to provide both DIs and written feedback to learners for each of the suturing stages. This approach had the additional advantage that the data from the simulation videos were fully integrated into the full LiftUpp dataset so that they could be used to inform holistic programmatic assessment progress decisions.
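To show what the facsimile of the LiftUpp suture form might capture, the sketch below defines one plausible shape for a feedback record that pairs a developmental indicator (1–6) per micro-skill with written coaching feedback; LiftUpp's actual schema and API are not public, so every field name here is an illustrative assumption.

```python
# Hypothetical shape for one take-home suture feedback record, pairing a
# developmental indicator (DI, 1-6) for each observed micro-skill with written
# coaching feedback. LiftUpp's real schema and API are not public, so every
# field name here is an illustrative assumption; the micro-skill labels are a
# subset of those listed in Table 1.
from dataclasses import dataclass, field

MICRO_SKILLS = [
    "Safe suture instrument handling",
    "Appropriate soft tissue handling/protection",
    "Appropriate bites of tissue",
    "Appropriate knot tension",
    "Appropriate knot position",
    "Appropriate wound edge apposition",
    "Procedural knowledge",
]

@dataclass
class SutureFeedback:
    student_id: str
    simulation_stage: int                   # stage 1-4 of the take-home continuum
    video_session_id: str                   # the Panopto recording that was reviewed
    developmental_indicators: dict[str, int] = field(default_factory=dict)
    written_feedback: str = ""

    def validate(self) -> None:
        """Reject ratings for unknown micro-skills or DIs outside the 1-6 scale."""
        for skill, di in self.developmental_indicators.items():
            if skill not in MICRO_SKILLS or not 1 <= di <= 6:
                raise ValueError(f"invalid rating: {skill}={di}")
```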
The costs involved in providing each student with a simple simulation kit were nominal and considerably cheaper than having to provide box simulators or 3D-printed frameworks, which are approaches to take-home simulations that have been reported previously [5,53]. Furthermore, the preliminary anecdotal findings suggest that, through using the conceptual framework to design the progressive simulation, the acceptability and feasibility of the approach have been good, evidenced by the following:
  • Learner engagement was, and continues to be, very high.
  • The approach has significantly increased the deliberate practice/feedback opportunities compared to the existing pre-COVID-19 system.
  • The approach has significantly increased the complexity of opportunities compared to the existing pre-COVID-19 system.
  • Learner feedback has been highly supportive, and anecdotal feedback indicated that a significant number believed their confidence with respect to suturing had increased; further detailed work is needed.
  • Staff compliance was universally high, and there were no issues with providing feedback, which took only a few minutes per student, including watching the uploaded video (normally 1–2 min long), due to the integrated design of the web portal.

4. Discussion

We have described the creation of a conceptual framework for the design and implementation of progressive simulation from the perspective of a teacher. The conceptual framework integrates and draws from key theories of simulation design [4,6,10], curriculum integration [20,21], learner motivation [43], and considerations of the facets of good assessment [54], including modern approaches to validity [40]. The conceptual framework was successfully used to inform and implement a novel technology-supported learning design that used a take-home progressive simulation approach to support the development of the threshold skills identified as essential for the development of intra-oral suturing during the COVID-19 lockdown.
The preliminary data suggest that the design achieved using the framework was cost-effective and well received by staff and students, and it facilitated additional opportunities for deliberate practice supported by external feedback, which have been identified as essential factors in skill development using simulation [3,4,6,10]. Unfortunately, we cannot present these data because they were gathered during the COVID-19 lockdown, when gaining ethical approval was difficult. Nevertheless, the initial success of the approach meant that the progressive simulation designed following the conceptual framework has been actively retained within the curriculum post COVID-19.
Although the acceptability and feasibility of the progressive suture simulation created using the conceptual framework are highly encouraging, there is now a need to fully evaluate and improve the approaches to progressive simulation, as well as establish the impact of the take-home elements. The progressive suture simulation created from the framework can act as a focus for these studies that will hopefully lead to a refinement of the conceptual framework and, in doing so, will add to a wider understanding of the validity and educational impact of this form of learning approach.
The most fundamental test will be to establish the relationship between the practica continua and the extrapolation inference [40] to real-world work to further our understanding of the role of simulation in dentistry [67]. However, the conceptual framework and the simulation design will also allow for investigation of how the less-structured home environment contextually affects individual learning strategies, which has been identified as an important issue in relation to motivation [43] and learning [7]. Another area of possible study is the value of providing additional feedback directly on the submitted video, further improving the feedback process and tightening the temporal relationship between action and the need for change [52], which could also be combined with the motivational power of reinforcing success [43]. We are in the process of gaining ethical approval to support these investigations.

5. Conclusions

Through the integration of existing evidence and theories that relate to longitudinal simulation design and best design practice [3,4,6,10,28]; the essential need for good feedback [4,10,49,52]; opportunities for deliberate practice [3,31,32]; and the tacit need for curriculum integration [20,21], we have been able to create a novel framework that supports a teacher focus for the implementation of longitudinal simulation that can start with take-home elements. The fact that we have been able to successfully operationalise the framework supports the original work upon which the framework is based. However, through the framework design, we have been able to further the conceptual requirements needed for good longitudinal simulation design by including considerations of motivation and cognitive load [36,43,46], the learner environment [7], assessment design [54,59] and validity [40]. Moreover, through the considered implementation of technology, we have been able to successfully implement an approach to provide meaningful external feedback for a simulation undertaken in an off-campus situation, which is a significant advancement over what has been described previously [5,53].
We fully acknowledge that further work is needed regarding the educational impact of the framework. However, the conceptual framework described could be applied to any situation where progressive simulation is seen as beneficial. The framework also offers the additional novel advantage of supporting the identification of threshold skills that could be developed outside of expensive and limited opportunity simulation environments to extend the prospects for skill development, which likely has positive benefits to learning [7]. This conceptual framework also has the potential to extend the available curricula time for the development of technical skills by significantly increasing the opportunities for deliberate practice and feedback.

Author Contributions

Conceptualization, S.L.M. and L.J.D.; methodology, S.L.M. and L.J.D.; software, E.A.A.; validation, S.L.M., E.A.A. and L.J.D.; formal analysis, S.L.M. and L.J.D.; investigation, S.L.M.; resources, S.L.M. and L.J.D.; data curation, S.L.M.; writing—original draft preparation, S.L.M. and L.J.D.; writing—review and editing, L.J.D.; visualization, L.J.D.; supervision, L.J.D.; project administration, S.L.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not Applicable.

Informed Consent Statement

Not Applicable.

Data Availability Statement

Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mason, W.T.M.; Strike, P.W. Short Communication See One, Do One, Teach One‚ Alas This Still How It Works? A Comparison of the Medical and Nursing Professions in the Teaching of Practical Procedures. Med. Teach. 2003, 25, 664–666. [Google Scholar] [CrossRef] [PubMed]
  2. Cox, M.; Irby, D.M.; Reznick, R.K.; MacRae, H. Teaching Surgical Skills—Changes in the Wind. N. Engl. J. Med. 2006, 355, 2664–2669. [Google Scholar] [CrossRef]
  3. McGaghie, W.C.; Issenberg, S.B.; Cohen, E.R.; Barsuk, J.H.; Wayne, D.B. Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence. Acad. Med. 2011, 86, 706–711. [Google Scholar] [CrossRef] [PubMed]
  4. Cook, D.A.; Hamstra, S.J.; Brydges, R.; Zendejas, B.; Szostek, J.H.; Wang, A.T.; Erwin, P.J.; Hatala, R. Comparative Effectiveness of Instructional Design Features in Simulation-Based Education: Systematic Review and Meta-Analysis. Med. Teach. 2013, 35, e867–e898. [Google Scholar] [CrossRef] [PubMed]
  5. Wilson, E.; Janssens, S.; McLindon, L.A.; Hewett, D.G.; Jolly, B.; Beckmann, M. Improved Laparoscopic Skills in Gynaecology Trainees Following a Simulation-training Program Using Take-home Box Trainers. Aust. N. Z. J. Obstet. Gynaecol. 2019, 59, 110–116. [Google Scholar] [CrossRef]
  6. Motola, I.; Devine, L.A.; Chung, H.S.; Sullivan, J.E.; Issenberg, S.B. Simulation in Healthcare Education: A Best Evidence Practical Guide. AMEE Guide No. 82. Med. Teach. 2013, 35, e1511–e1530. [Google Scholar] [CrossRef]
  7. Sørensen, J.L.; Østergaard, D.; LeBlanc, V.; Ottesen, B.; Konge, L.; Dieckmann, P.; Vleuten, C.V. der Design of Simulation-Based Medical Education and Advantages and Disadvantages of in Situ Simulation versus off-Site Simulation. BMC Med. Educ. 2017, 17, 20. [Google Scholar] [CrossRef]
  8. GDC General Dental Council. Preparing for Practice; GDC: London, UK, 2015.
  9. GDC General Dental Council. Standards for Education; GDC: London, UK, 2015.
  10. Issenberg, S.B.; Mcgaghie, W.C.; Petrusa, E.R.; Gordon, D.L.; Scalese, R.J. Features and Uses of High-Fidelity Medical Simulations That Lead to Effective Learning: A BEME Systematic Review. Med. Teach. 2005, 27, 10–28. [Google Scholar] [CrossRef]
  11. Todsen, T.; Henriksen, M.V.; Kromann, C.B.; Konge, L.; Eldrup, J.; Ringsted, C. Short- and Long-Term Transfer of Urethral Catheterization Skills from Simulation Training to Performance on Patients. BMC Med. Educ. 2013, 13, 29. [Google Scholar] [CrossRef] [PubMed]
  12. Perry, S.; Bridges, S.M.; Burrow, M.F. A Review of the Use of Simulation in Dental Education. Simul. Healthc. J. Soc. Simul. Healthc. 2015, 10, 31–37. [Google Scholar] [CrossRef] [PubMed]
  13. Willaert, W.I.M.; Aggarwal, R.; Herzeele, I.V.; Cheshire, N.J.; Vermassen, F.E. Recent Advancements in Medical Simulation: Patient-Specific Virtual Reality Simulation. World J. Surg. 2012, 36, 1703–1712. [Google Scholar] [CrossRef]
  14. Winkler-Schwartz, A.; Bissonnette, V.; Mirchi, N.; Ponnudurai, N.; Yilmaz, R.; Ledwos, N.; Siyar, S.; Azarnoush, H.; Karlik, B.; Maestro, R.F.D. Artificial Intelligence in Medical Education: Best Practices Using Machine Learning to Assess Surgical Expertise in Virtual Reality Simulation. J. Surg. Educ. 2019, 76, 1681–1690. [Google Scholar] [CrossRef]
  15. Al-Saud, L.M. The Utility of Haptic Simulation in Early Restorative Dental Training: A Scoping Review. J. Dent. Educ. 2021, 85, 704–721. [Google Scholar] [CrossRef]
  16. Chen, T.; Lucock, M. The Mental Health of University Students during the COVID-19 Pandemic: An Online Survey in the UK. PLoS ONE 2022, 17, e0262562. [Google Scholar] [CrossRef] [PubMed]
  17. Adedoyin, O.B.; Soykan, E. Covid-19 Pandemic and Online Learning: The Challenges and Opportunities. Interact. Learn. Environ. 2023, 31, 863–875. [Google Scholar] [CrossRef]
  18. Ahlburg, D.A. COVID-19 and UK Universities. Political Q. 2020, 91, 649–654. [Google Scholar] [CrossRef]
  19. Ironsi, C.S. Navigating Learners towards Technology-Enhanced Learning during Post COVID-19 Semesters. Trends Neurosci. Educ. 2022, 29, 100189. [Google Scholar] [CrossRef] [PubMed]
  20. Kern, D. A Six-Step Approach to Curriculum Development. In Curriculum Development for Medical Education a Six-Step Approach; Thomas, P., Kern, D., Hughes, M., Chen, B., Eds.; The Johns Hopkins University Press: Baltimore, MD, USA, 2009; pp. 5–9. ISBN 9780801893667. [Google Scholar]
  21. Khamis, N.N.; Satava, R.M.; Alnassar, S.A.; Kern, D.E. A Stepwise Model for Simulation-Based Curriculum Development for Clinical Skills, a Modification of the Six-Step Approach. Surg. Endosc. 2016, 30, 279–287. [Google Scholar] [CrossRef] [PubMed]
  22. Weick, K.E. Theory Construction as Disciplined Imagination. Acad. Manag. Rev. 1989, 14, 516–531. [Google Scholar] [CrossRef]
  23. Jaakkola, E. Designing Conceptual Articles: Four Approaches. AMS Rev. 2020, 10, 18–26. [Google Scholar] [CrossRef]
  24. Macluskey, M.; Hanson, C. The Retention of Suturing Skills in Dental Undergraduates: Retention of Suturing Skills. Eur. J. Dent. Educ. 2011, 15, 42–46. [Google Scholar] [CrossRef]
  25. Macluskey, M.; Durham, J.; Balmer, C.; Bell, A.; Cowpe, J.; Dawson, L.; Freeman, C.; Hanson, C.; McDonagh, A.; Jones, J.; et al. Dental Student Suturing Skills: A Multicentre Trial of a Checklist-Based Assessment: Dental Student Suturing Skills. Eur. J. Dent. Educ. 2011, 15, 244–249. [Google Scholar] [CrossRef] [PubMed]
  26. Safir, O.; Williams, C.K.; Dubrowski, A.; Backstein, D.; Carnahan, H. Self-Directed Practice Schedule Enhances Learning of Suturing Skills. Can. J. Surg. 2013, 56, E142–E147. [Google Scholar] [CrossRef] [PubMed]
  27. Fugill, M. Defining the Purpose of Phantom Head. Eur. J. Dent. Educ. 2013, 17, e1–e4. [Google Scholar] [CrossRef]
  28. Ellaway, R.H.; Kneebone, R.; Lachapelle, K.; Topps, D. Practica Continua: Connecting and Combining Simulation Modalities for Integrated Teaching, Learning and Assessment. Med. Teach. 2009, 31, 725–731. [Google Scholar] [CrossRef]
  29. Vygotsky, L.S. Mind in Society: Development of Higher Psychological Processes; Harvard University Press: Cambridge, MA, USA, 1978; ISBN 0674576292. [Google Scholar]
  30. Wood, D.; Bruner, J.S.; Ross, G. The Role of Tutoring in Problem Solving. J. Child Psychol. Psychiatry 1976, 17, 89–100. [Google Scholar] [CrossRef]
  31. Ericsson, K.A. An Expert-Performance Perspective of Research on Medical Expertise: The Study of Clinical Performance. Med. Educ. 2007, 41, 1124–1130. [Google Scholar] [CrossRef]
  32. Ericsson, K.A. Deliberate Practice and the Acquisition and Maintenance of Expert Performance in Medicine and Related Domains. Acad. Med. 2004, 79, S70–S81. [Google Scholar] [CrossRef]
  33. Dubrowski, A.; Park, J.; Moulton, C.; Larmer, J.; MacRae, H. A Comparison of Single- and Multiple-Stage Approaches to Teaching Laparoscopic Suturing. Am. J. Surg. 2007, 193, 269–273. [Google Scholar] [CrossRef]
  34. Brydges, R.; Carnahan, H.; Rose, D.; Rose, L.; Dubrowski, A. Coordinating Progressive Levels of Simulation Fidelity to Maximize Educational Benefit. Acad. Med. 2010, 85, 806–812. [Google Scholar] [CrossRef]
  35. Aul, K.; Ferguson, L.; Bagnall, L. Students’ Perceptions of Intentional Multi-Station Simulation-Based Experiences. Teach. Learn. Nurs. 2021, 16, 121–124. [Google Scholar] [CrossRef]
  36. Paas, F.; Renkl, A.; Sweller, J. Cognitive Load Theory and Instructional Design: Recent Developments. Educ Psychol 2003, 38, 1–4. [Google Scholar] [CrossRef]
  37. Meyer, J.H.F.; Land, R. Threshold Concepts and Troublesome Knoweldge: Linkages to Ways of Thinking and Practising within the Disciplines. In Improving Student Learning—Ten Years On; Rust, C., Ed.; Oxford Centre for Staff and Learning Development: Oxford, UK, 2003. [Google Scholar]
  38. Massoth, C.; Röder, H.; Ohlenburg, H.; Hessler, M.; Zarbock, A.; Pöpping, D.M.; Wenk, M. High-Fidelity Is Not Superior to Low-Fidelity Simulation but Leads to Overconfidence in Medical Students. BMC Med. Educ. 2019, 19, 29. [Google Scholar] [CrossRef]
  39. Eva, K.W.; Bordage, G.; Campbell, C.; Galbraith, R.; Ginsburg, S.; Holmboe, E.; Regehr, G. Towards a Program of Assessment for Health Professionals: From Training into Practice. Adv. Health Sci. Educ. 2015, 21, 897–913. [Google Scholar] [CrossRef]
  40. Kane, M.T. Validating the Interpretations and Uses of Test Scores: Validating the Interpretations and Uses of Test Scores. J. Educ. Meas. 2013, 50, 1–73. [Google Scholar] [CrossRef]
  41. Cook, D.A.; Brydges, R.; Ginsburg, S.; Hatala, R. A Contemporary Approach to Validity Arguments: A Practical Guide to Kane’s Framework. Med. Educ. 2015, 49, 560–575. [Google Scholar] [CrossRef]
  42. Dawson, L.J.; Fox, K.; Jellicoe, M.; Adderton, E.; Bissell, V.; Youngson, C.C. Is the Number of Procedures Completed a Valid Indicator of Final Year Student Competency in Operative Dentistry? Brit. Dent. J. 2021, 230, 663–670. [Google Scholar] [CrossRef] [PubMed]
  43. Cook, D.A.; Artino, A.R. Motivation to Learn: An Overview of Contemporary Theories. Med. Educ. 2016, 50, 997–1014. [Google Scholar] [CrossRef] [PubMed]
  44. Bandura, A. Perceived Self-Efficacy in Cognitive Development and Functioning. Educ. Psychol. 1993, 28, 117–148. [Google Scholar] [CrossRef]
  45. Bandura, A. Exercise of Personal and Collective Efficacy in Changing Societies. In Self-Efficacy in Changing Societies; Cambridge University: New York, NY, USA, 1995; p. 3. ISBN 0521474671. [Google Scholar]
  46. Evans, P.; Vansteenkiste, M.; Parker, P.; Kingsford-Smith, A.; Zhou, S. Cognitive Load Theory and Its Relationships with Motivation: A Self-Determination Theory Perspective. Educ. Psychol. Rev. 2024, 36, 7. [Google Scholar] [CrossRef]
  47. Zimmerman, B.J. Attaining Self-Regulation: A Social Cognitive Perspective. In Handbook of Self-Regulation; Boekaerts, M., Zeidner, M., Pintrich, P.R., Eds.; Part I: General Theories and Models of Self-Regulation; Academic Press: New York, NY, USA, 2000; pp. 13–39. ISBN 978-0-12-109890-2. [Google Scholar]
  48. Dweck, C.S. The Development of Ability Conceptions. In Development of Achievement Motivation; Academic Press: Cambridge, MA, USA, 2002; pp. 57–88. ISBN 9780127500539. [Google Scholar]
  49. Forsythe, A.; Johnson, S. Thanks, but No-Thanks for the Feedback. Assess. Eval. High. Educ. 2016, 42, 1–10. [Google Scholar] [CrossRef]
  50. Dawson, L.; Fox, K. Can Assessment Be a Barrier to Successful Professional Development? Phys. Ther. Rev. 2017, 21, 11–16. [Google Scholar] [CrossRef]
  51. Fox, K. “Climate of Fear” in New Graduates: The Perfect Storm? Br. Dent. J. 2019, 227, 343–346. [Google Scholar] [CrossRef]
  52. Nicol, D.J.; Dick, D.M. Formative Assessment and Self-regulated Learning: A Model and Seven Principles of Good Feedback Practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar] [CrossRef]
  53. Barth, B.; Arutiunian, A.; Micallef, J.; Sivanathan, M.; Wang, Z.; Chorney, D.; Salmers, E.; McCabe, J.; Dubrowski, A. From Centralized to Decentralized Model of Simulation-Based Education: Curricular Integration of Take-Home Simulators in Nursing Education. Cureus 2022, 14, e26373. [Google Scholar] [CrossRef] [PubMed]
  54. Vleuten, C. The Assessment of Professional Competence: Developments, Research and Practical Implications. Adv. Health Sci. Educ. Theory Pract. 1996, 1, 41–67. [Google Scholar] [CrossRef] [PubMed]
  55. Vleuten, C.; Schuwirth, L. Assessing Professional Competence: From Methods to Programmes. Med. Educ. 2005, 39, 309–317. [Google Scholar] [CrossRef] [PubMed]
  56. van der Vleuten, C.P.; Schuwirth, L.W.; Driessen, E.W.; Dijkstra, J.; Tigelaar, D.; Baartman, L.K.; van Tartwijk, J. A Model for Programmatic Assessment Fit for Purpose. Med. Teach. 2012, 34, 205–214. [Google Scholar] [CrossRef] [PubMed]
  57. Heeneman, S.; de Jong, L.H.; Dawson, L.J.; Wilkinson, T.J.; Ryan, A.; Tait, G.R.; Rice, N.; Torre, D.; Freeman, A.; Vleuten, C.P.M. van der Ottawa 2020 Consensus Statement for Programmatic Assessment-1. Agreement on the Principles. Med. Teach. 2021, 43, 1139–1148. [Google Scholar] [CrossRef] [PubMed]
  58. Torre, D.; Rice, N.E.; Ryan, A.; Bok, H.; Dawson, L.J.; Bierer, B.; Wilkinson, T.J.; Tait, G.R.; Laughlin, T.; Veerapen, K.; et al. Ottawa 2020 Consensus Statements for Programmatic Assessment-2. Implementation and Practice. Med. Teach. 2021, 43, 1149–1160. [Google Scholar] [CrossRef] [PubMed]
  59. Dawson, L.; Mason, B.; Bissell, V.; Youngson, C. Calling for a Re-Evaluation of the Data Required to Credibly Demonstrate a Dental Student Is Safe and Ready to Practice. Eur. J. Dent. Educ. 2016, 21, 130. [Google Scholar] [CrossRef]
  60. Khan, M.S.; Darzi, A.; Bann, S.D.; Butler, P.E. Suturing: A Lost Art. Ann. R. Coll. Surg. Engl. 2002, 84, 278–279. [Google Scholar] [PubMed]
  61. Govaerts, M.; Vleuten, C.P. Validity in Work-based Assessment: Expanding Our Horizons. Med. Educ. 2013, 47, 1164–1174. [Google Scholar] [CrossRef]
  62. Grover, S.C.; Scaffidi, M.A.; Khan, R.; Garg, A.; Al-Mazroui, A.; Alomani, T.; Yu, J.J.; Plener, I.S.; Al-Awamy, M.; Yong, E.L.; et al. Progressive Learning in Endoscopy Simulation Training Improves Clinical Performance: A Blinded Randomized Trial. Gastrointest. Endosc. 2017, 86, 881–889. [Google Scholar] [CrossRef] [PubMed]
  63. Cox, X. Stories as Case Knowledge: Case Knowledge as Stories. Med. Educ. 2001, 35, 862–866. [Google Scholar] [CrossRef] [PubMed]
  64. Biggs, J. Enhancing Teaching through Constructive Alignment. High Educ. 1996, 32, 347–364. [Google Scholar] [CrossRef]
  65. Dawson, L.; Mason, B.; Balmer, C.; Jimmieson, P. Developing Professional Competence Using Integrated Technology-Supported Approaches: A Case Study in Dentistry; Fry, H., Ketteridge, S., Marshall, S., Eds.; Routledge: London, UK, 2015. [Google Scholar]
  66. Dawson, L.; Mason, B. Developing and Assessing Professional Competence: Using Technology in Learning Design. In For the Love of Learning: Innovations from Outstanding University Teachers; Bilham, T., Ed.; Palgrave Macmillan: London, UK, 2013; p. 135. [Google Scholar]
  67. Mushtaq, F.; Williams, M.; Ahmed, B. Simulation-Based Dental Education: An International Consensus Report. Eur. J. Dent. Educ. 2021. [Google Scholar] [CrossRef]
Figure 1. Conceptual framework for the design and implementation of a progressive simulation design that includes a take-home element. This framework is designed from the perspective of a staff member wishing to develop simulation but aims to ensure best practice by integrating key theories of learning, assessment, and simulation curriculum integration.
Figure 2. (a) Screen grab from the staff–student allocation interface. (b) Screen grab from the staff–student feedback interface.
Table 1. Suturing section within LiftUpp, with the identified threshold skills highlighted.
Safe suture instrument handling: The ability to use all instruments, including suture needles, safely.
Appropriate soft tissue handling/protection: The ability to protect the soft tissues from harm and handle them with reference to the blood supply to facilitate healing.
Appropriate bites of tissue: The ability to take equal and perpendicular bites of tissue to help ensure equal tension across the wound.
Appropriate knot tension: The ability to tighten the knot to the appropriate level to ensure effective healing or haemorrhage control.
Appropriate knot position: The ability to place the knot buccally.
Appropriate wound edge apposition: The ability to ensure adequate wound edge apposition.
Ability to remove a suture: The ability to remove a suture, ensuring that no residual suture is left in the wound and that the wound is not infected by the unnecessary transit through the wound of suture that has been exposed to the oral environment.
Management of suture complication: The ability to demonstrate a systematic understanding of the risks and complications of the procedure, with recognition.
Procedural knowledge: The ability to demonstrate, through action, that they possess the appropriate knowledge of the steps required to safely undertake the procedure in question.
Table 2. LiftUpp developmental indicators (DIs) and descriptions.
DI 1: UNABLE to do this. Has caused harm or does not seek essential guidance.
DI 2: UNABLE to do this independently at present. Largely demonstrated by tutor.
DI 3: UNABLE to do this independently at present but able to complete, to the required quality, with significant help, either procedural or by instruction.
DI 4: ABLE to do this partially independently at the required quality but requires minor help with aspects of the skill, either procedural or through discussion.
DI 5: ABLE to do this independently at the required quality. This may include confirmatory advice from the tutor where the student seeks appropriate assurance.
DI 6: ABLE to meet the outcome independently, exceeding the required quality.
