Article

Videogame-Based Training: The Impact and Interaction of Videogame Characteristics on Learning Outcomes

College of Science and Engineering, San Francisco State University, San Francisco, CA 94132, USA
Multimodal Technol. Interact. 2022, 6(3), 19; https://doi.org/10.3390/mti6030019
Submission received: 28 January 2022 / Revised: 21 February 2022 / Accepted: 25 February 2022 / Published: 1 March 2022
(This article belongs to the Special Issue Innovations in Game-Based Learning)

Abstract
Virtualized training provides high fidelity environments to practice skills and gain knowledge, potentially mitigating harmful consequences from real life mistakes. Current research has focused on videogames, believed to have characteristics that improve learning. There is conflicting evidence on the benefits of using videogame-based training to improve learning. This study explored the impact of two videogame characteristics (i.e., rules/goals clarity and human interaction), on mid-training scores and post-training scores (i.e., familiar task and novel task). Results from a sample of 513 undergraduates showed that both videogame characteristics significantly impacted mid-training performance but not post-training performance; clear rules/goals and completing the training alone improved mid-training performance. There was also a significant moderation between the two videogame characteristics for post-training scores on the novel task, but not the familiar task, or mid-training performance. Findings suggest videogame characteristics have an immediate but not sustained impact on learning; implications are discussed.

1. Introduction

Organizations have used virtualized trainings for decades [1,2,3] to allow employees to practice complex, work-related skills in a high-fidelity environment before performing those skills on the job [4]. Although virtualized training can come at a high monetary cost [5,6], it remains popular, particularly in high-stakes learning environments such as the medical industry. One reason for the continued use of virtualized training is the potential mitigation of harmful or costly consequences of real-life mistakes [7,8,9,10]. Recent research in this area has focused on the different elements of videogames [11,12,13,14,15,16,17,18]. Videogame-based trainings, sometimes called serious games [19], attract organizations because they are perceived to increase motivation and attention [20], which is believed to improve learning outcomes [21]. Despite this belief, there is mixed research evidence for the usefulness and application of videogame-based training as a tool for workplace training [22,23]. Moreover, despite the growing trend of using videogames as a form of training within organizations [24], the conditions under which videogames impact learning outcomes are still largely unknown [25]. Expanding this research area will provide guidance to organizations currently investing in videogame-based training. To that end, this study explores two videogame characteristics (i.e., rules/goals clarity and human interaction) and their impact on learning outcomes [26], in addition to probing their potential interaction. These two characteristics were selected in particular because they are aspects of the game that can be easily controlled and changed by an organization without in-depth changes to a game's design or programming. Many of the other game characteristics, such as the story narrative of a game or the degree of fantasy the game introduces, would require programming changes with the assistance of a game developer.
Since the purpose of this particular study is to better understand how organizations can optimize the implementation of a videogame-based training intervention, it makes the most sense to use game characteristics that an organization has the agency to change.
Some researchers have proposed exploring videogame characteristics to better understand the use and effectiveness of videogames as training tools [27]. Some have predicted that certain videogame characteristics will improve performance during the training and learning outcomes after a training has been completed. However, it is still largely unknown how videogame characteristics influence learning [25]. Fortunately, there has been more recent theoretical interest in videogame characteristics [19], and a growing number of empirical studies are being published [28,29]. While several researchers have shifted towards micro examinations of the connection between videogame characteristics and improved training outcomes [30,31,32], few have isolated and manipulated specific videogame characteristics to observe changes in learning outcomes [33,34].
Many definitions for videogame characteristics have been proposed [35,36,37,38,39,40,41,42]. This study uses the Game Attributes taxonomy [30] which is widely cited by videogame researchers [43,44,45,46]. This taxonomy defines nine game characteristics (i.e., action language, assessment, conflict/challenge, control, environment, fiction, human interaction, immersion, and rules/goals). The two videogame characteristics of interest for this study are rules/goals clarity and human interaction because these two characteristics can often be changed without needing to alter a videogame’s programming [30].

1.1. Videogame Characteristic: Rules/Goals Clarity

The rules/goals videogame characteristic is defined as the parameters a player must abide by when playing the game. This includes rules the player must follow (e.g., a player cannot return to a level once they have exited that level) and goals the player must strive to meet in order to win the game (e.g., a player must fix a device using the correct tool before a two-minute timer expires [30]). Videogames inherently have rules/goals, but the clarity of that information often varies. Researchers have previously claimed that providing clear rules/goals for a videogame is required when games are intended to teach players, since it is believed that the clarity of the rules/goals can positively impact performance in the game and learning outcomes [47,48]. It is valuable to establish empirical evidence for the relationship between clear rules/goals and how much a player learns in the game, beyond these theorized benefits [30,48].
Garris [48] argued that the concepts of goal-setting theory (i.e., clear, specific, and difficult goals improve job performance [49]) can be applied to the rules/goals of games that are intended for learning. The theorized benefits of clarifying the rules/goals of a game include adding structure, increasing motivation for the players, and stimulating the gaming process [39,48,50,51,52,53,54,55], all of which may aid performance in the game and learning. Empirical research on the broad topic of instruction typically neglects the specific effect of clear rules/goals, seldom providing the details of how rules/goals were included and rarely manipulating this characteristic to understand its impact, e.g., [31,56,57,58]. Mid-training performance in a videogame-based training is a measure of knowledge, skill acquisition, or change in attitude that can be observed in the player while they are playing the training game. These observed behaviors are attributed to the training experience [59]. Based on these theoretical arguments that clear rules/goals will benefit training performance, it is hypothesized that clear rules/goals prior to the training will facilitate mid-training performance in the videogame.
Hypothesis 1:
Mid-training performance in the videogame-based training will be significantly higher for participants given clear rules/goals.

1.2. Videogame Characteristic: Human Interaction

The videogame characteristic human interaction is defined as the amount of contact an individual has with others while playing a videogame [30]. This broadly includes any form of communication between players in the game. Greater human interaction in a videogame-based training, when conducted in a cooperative setting, may generate the context of a team-based training experience. Team training is defined as a scenario that includes collaborative involvement with others for the purpose of developing knowledge or skills [60,61]. Few studies have directly compared individual training to team training using the same training scenario [62]. To my knowledge, no peer-reviewed, published studies have directly compared team training to individual training in the same videogame environment. I want to further recognize that not all attempts to form a team will successfully lead to effective team interactions. As can be seen in educational and workplace settings, assembling a team can often result in ineffective group dynamics, such as social loafing (i.e., individuals do not contribute to the needs and goals of a team) or groupthink (i.e., individuals do not voice concerns if they believe there is otherwise a group consensus on an idea). However, comparing individuals to teams, even when a team was ineffective in some way, is still a beneficial comparison because it will encompass realistic examples of teams (i.e., both effective and ineffective teams).
The few studies that have compared team-based training to individual training have generally found support that in team-based trainings, participants have had better recall on trained information, made fewer errors, and demonstrated better quality task performance than those trained individually [62,63,64]. Virtual team training research does not typically make direct comparisons between virtual teams and individuals [65,66], but has demonstrated several positive mechanisms through which virtual teams interact [67]. Based on these findings it is predicted that human interaction will promote mid-training performance in the videogame-based training.
Hypothesis 2:
Mid-training performance in the videogame-based training will be significantly higher for participants trained with a team (i.e., higher human interaction).

1.3. Post-Training Performance

As previously mentioned, mid-training performance is the scoring of observable behaviors that occurred during the videogame-based training experience. In contrast to this, post-training performance can be observed and scored after the training has been completed, in other tasks given to trainees. This means the knowledge or skills from the training are being applied to another context, such as a post-training task or back on the job; this is referred to as transfer of training [68]. Evaluating post-training performance can be beneficial for organizations to ensure that the knowledge and skills developed in training are applied outside of training to benefit future work. According to a model of decision-based evaluations, post-training performance can impact valuable outcomes such as long-term changes in workplace performance or organizational payoffs [69].
The two post-training performance measures gathered by this study were (1) performance on a familiar task, where the knowledge and skills from the training were demonstrated in a repeat of a task that was given during the training, and (2) performance on a novel task, where the knowledge and skills learned in training needed to be applied to a new task the player had never seen before (i.e., the knowledge and skills being used were the same but the location and context were different) [30,70]. Previous research has shown that more opportunity to practice a skill leads to better learning [71]. Based on these results, it is predicted that clear rules/goals will provide participants with more opportunity to practice, and in turn higher post-training scores, because players will spend less time determining the parameters and objectives of the game. Similarly, I predict that individuals who trained with a team will engage in helpful processes that assist team members with understanding the game and will have more opportunity to practice.
Hypothesis 3:
Post-training performance on a familiar task will be significantly higher for participants given clear rules/goals.
Hypothesis 4:
Post-training performance on a familiar task will be significantly higher for participants trained with a team (i.e., higher human interaction).
Hypothesis 5:
Post-training performance on a novel task will be significantly higher for participants given clear rules/goals.
Hypothesis 6:
Post-training performance on a novel task will be significantly higher for participants trained with a team (i.e., higher human interaction).

1.4. The Impact of Videogame Characteristics

Some researchers have looked at how videogame characteristics uniquely contribute to different outcomes (e.g., game motivation leads to immersive game playing [47]). These studies have not typically examined the statistical interactions between these characteristics [55]. Bedwell and colleagues argued [30] that videogame characteristics are not orthogonal and cannot be easily parsed apart from one another. They further stated that studying videogame characteristics independently is nearly impossible because they are too interdependent (i.e., influencing one videogame characteristic will influence several other characteristics through unintentional manipulation). Thus, it is important to consider videogame characteristics in conjunction with one another and to further explore the extent to which they interact. The analyses of this study will further explore how clear rules/goals and human interaction impact one another.
Researchers who study teams have identified beneficial processes between team members, such as knowledge sharing and information elaboration, which aids performance [72]. Having clear rules/goals is expected to assist learning to a greater extent for team members because they will have the added benefits of engaging in teamwork processes. Team members who are uncertain about the training will have an additional resource of working together to figure out what is needed to complete the training. Participants training individually will not have team member support for clarification or assistance.
Human interaction is expected to have a positive effect on mid-training performance and is anticipated to enhance the positive effects of clear rules/goals on mid-training performance and post-training performance in both the familiar and novel tasks. Participants working with a team may have assisted one another on certain tasks and understanding components of the game. The theory of transactive memory states that teams depend on each other to fill knowledge gaps during times of team interaction [73]. It is likely that participants will employ transactive memory during the videogame-based training in which they interact with their team and will therefore demonstrate the benefits of this in the post-training activities. Thus, participants in the team training condition will demonstrate greater benefit from clear rules/goals than participants who trained individually.
Hypothesis 7:
Human interaction will moderate the relationship between rules/goals clarity and mid-training performance, such that human interaction will enhance the positive impact of clear rules/goals on mid-training performance.
Hypothesis 8:
Human interaction will moderate the relationship between rules/goals clarity and post-training performance in a familiar task, such that human interaction will enhance the positive impact of clear rules/goals on post-training performance in a familiar task.
Hypothesis 9:
Human interaction will moderate the relationship between rules/goals clarity and post-training performance in a novel task, such that human interaction will enhance the positive impact of clear rules/goals on post-training performance in a novel task.

2. Materials and Methods

2.1. Participants

Participants included 553 undergraduate students from a university in the western United States. Participants received class credit in exchange for participation. Data was removed for participants with incomplete responses or who experienced technical problems, leaving a final sample of 513 participants; 64% male, 79% Caucasian, with an average age of 19 years (SD = 3 years). There were 147 participants who completed the videogame-based training individually and 366 participants who completed the training as part of a team; 122 teams of three.

2.2. Procedures

Upon arrival, participants were placed in a room by themselves with a computer, desk, and chair. They first completed a consent form and were assigned to read one of two instruction manuals for the videogame-based training. One version of the instruction manual clearly stated the rules/goals for the game while the other version did not (i.e., having either high or low clarity on the rules/goals of the game). After reading the instruction manual, participants were randomly assigned to play the videogame-based training for 25 min either individually or as part of a team (i.e., having either high or low human interaction during the game). At this time the research assistant began the game and left the room to allow the participant to play. During this videogame-based training, a screen recording was made of their game to be later viewed and scored as the participant's mid-training performance score. When the training game was complete, participants responded to a survey with the measures listed below and a series of videogame debriefing questions intended to reinforce the learning [74]. This debriefing included reflective questions about the interpretations and emotional reactions participants had to the training [75], based on a process presented by previous researchers [76]. Participants then completed two post-training activities, which were also recorded for later viewing and scoring as their post-training performance scores. The first (familiar) task lasted 10 min; participants were given the same scenario they had completed in the videogame training. The second (novel) task lasted 15 min and asked participants to use the same knowledge and skills they had learned but in a new context. Finally, participants were informed of the purpose of the study and were dismissed.

2.3. Videogame-Based Training

The videogame-based training used in this study was Moonbase Alpha [77]. This videogame was developed and published by NASA to be used for science and technology education. The videogame was designed to teach players about the fundamentals of space science using an immersive and interactive environment on a fictitious base on the moon. In the game, players responded to a meteor impact that disabled the oxygen supply system for their moonbase outpost. The game begins with an animation of the meteor shower damaging several pieces of equipment. The players' avatars spawn at their base, and they hear instructions over their intercom to navigate the map to locate and repair damaged equipment, which requires players to identify, locate, and use various tools and resources to make repairs. As they move throughout the environment, they can see various stations, including a toolshed where they must go to retrieve the tools they need to make the repairs. The immersive background and gameplay mimic the experience of being on the moon, as the avatars move, jump, and fall slowly in the lower gravity conditions. The primary tasks are for players to retrieve a hand welding tool to repair damaged solar panels and power couplers. Next, players must retrieve a screwdriver to reconnect hoses between the repaired power couplers. In the mid-training performance, completing these tasks correctly and more quickly would result in a higher score. In the post-training performance on a familiar task, players had to repeat this same series of tasks, but in the post-training performance on a novel task, players had to use a robot with a welding arm and a robot with a screwdriver arm to complete the tasks in a radioactive area that is unsafe for humans to enter. Although the tasks (i.e., repairing power couplers and solar panels) are the same, the tools (i.e., using robots) and the context (i.e., a radioactive environment) were new.
Images of both the single player and multi-player versions are shown in Figure 1. In the multi-player mode, the objectives were the same, but players could talk over a headset and hear each other as well as see each other’s avatar. This enabled them to coordinate their actions and strategize their repairs. For example, a well-coordinated team could have different players retrieve different tools so that one could repair power couplers with a welder while another followed behind to reconnect the hose with the screwdriver.

2.4. Scoring Methodology

Because the videogame used for this study was intended for educational purposes, it did not provide a direct score to players. To evaluate performance, a behavioral checklist was developed and used to measure performance in the videogame activities. To create this checklist, a comprehensive list of player actions was reviewed and rated by subject matter experts (SMEs) on importance (i.e., how essential the task is to accomplishing the videogame objectives), difficulty (i.e., the level of skill required from the player to complete the task), and optimal occurrences (i.e., the optimal number of times a task needed to be performed to complete the videogame objectives). These ratings of the different tasks were used to generate a final behavioral checklist. The inter-rater agreement for the initial SME ratings used to create the checklist was ICC = 0.98, which is considered excellent reliability [78]. Scores for the three videogame activities (i.e., mid-training performance, post-training performance on a familiar task, and post-training performance on a novel task) were generated using the behavioral checklists. Participants' behaviors were observed and rated by two independent raters using the behavioral checklist. The inter-rater agreement for the ratings used to generate participant scores was ICC = 0.99, which is considered excellent reliability [78]. Final scores are presented as percentages, with 0 indicating the lowest possible score and 100 indicating the highest possible score.
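For readers who wish to reproduce this kind of reliability check, an intraclass correlation can be computed directly from the subjects × raters score matrix. The sketch below is a minimal illustration of ICC(2,1) (two-way random effects, absolute agreement, single rater) in Python with NumPy; the paper does not specify which ICC form was used, and the scores below are invented for illustration, not the study's data.

```python
import numpy as np

def icc2_1(ratings: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `ratings` is an (n_subjects, k_raters) matrix of scores.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    rater_means = ratings.mean(axis=0)

    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((subj_means - grand) ** 2).sum() / (n - 1)   # rows (subjects)
    msc = n * ((rater_means - grand) ** 2).sum() / (k - 1)  # columns (raters)
    resid = ratings - subj_means[:, None] - rater_means[None, :] + grand
    mse = (resid ** 2).sum() / ((n - 1) * (k - 1))

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: two raters scoring five recordings with high agreement
scores = np.array([[81, 80], [64, 66], [92, 91], [55, 54], [73, 74]])
print(round(icc2_1(scores), 2))
```

Dedicated packages (e.g., pingouin's `intraclass_corr`) report all six ICC forms; the hand computation above is shown only to make the two-way decomposition explicit.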

2.5. Measures

The demographic measures gathered for this study were age, sex, and ethnicity. Game experience was also measured using six items taken from a scale development project [79] that was still being validated at the time of this study. Items used a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree). The data produced a reliability of α = 0.90.

3. Results

Descriptive statistics and correlations for all study variables are provided in Table 1. The three learning outcomes were all positively and significantly correlated (r = 0.22 to 0.30, p < 0.01), indicating participants tended to perform similarly in the three videogame activities.
The hypotheses were tested in SPSS using a MANOVA. The overall model controlled for age, sex, and game experience, and included the videogame characteristics (i.e., rules/goals clarity and human interaction), the three learning outcomes (i.e., mid-training performance, post-training performance on a familiar task, and post-training performance on a novel task), and the interaction term (i.e., rules/goals × human interaction). All results can be found in Table 2.
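As a sketch of the analytic logic, the 2 × 2 design can be expressed as a dummy-coded regression in which the interaction term is the product of the two condition variables. This is not the study's SPSS MANOVA; it is a minimal NumPy illustration on simulated data, with all effect sizes invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 design: rules/goals clarity (0 = unclear, 1 = clear)
# crossed with human interaction (0 = individual, 1 = team), 50 per cell.
n_per_cell = 50
clarity = np.repeat([0, 0, 1, 1], n_per_cell)
team = np.tile(np.repeat([0, 1], n_per_cell), 2)

# Simulated scores with a built-in +8-point interaction for the
# clear-rules team cell (illustrative numbers, not the study's data).
score = (30 + 5 * clarity - 12 * team + 8 * clarity * team
         + rng.normal(0, 3, clarity.size))

# Design matrix: intercept, two main effects, and the interaction term
X = np.column_stack([np.ones_like(clarity), clarity, team, clarity * team])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print([round(b, 1) for b in beta])  # [intercept, clarity, team, interaction]
```

The fitted coefficients recover the simulated main effects and interaction; in the study's model, the analogous interaction coefficient is the b reported for Hypotheses 7–9.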
Hypothesis 1 stated that mid-training performance scores would be higher for individuals with clearer rules/goals. As predicted, participants given clear rules/goals (n = 244, M = 0.34, SD = 0.19) scored significantly higher in training performance (F(1, 512) = 9.41, p < 0.001, b = −5.66, SE = 1.83, R2 = 0.02) than participants given unclear rules/goals (n = 268, M = 0.29, SD = 0.19). This provides evidence that receiving clear rules/goals prior to completing a videogame-based training predicts better performance during the training.
Hypothesis 2 stated that mid-training performance scores would be higher for participants who played the game with a team (i.e., higher human interaction). Results supported the opposite of what was predicted. Participants who trained individually (n = 147, M = 0.50, SD = 0.20) scored significantly higher in training performance (F(1, 512) = 55.83, p < 0.001, b = 12.29, SE = 2.45, R2 = 0.10) than participants who completed the training with a team (n = 365, M = 0.27, SD = 0.18). Counter to what was predicted, evidence showed that in this videogame-based training, participants who played the game without the assistance of a team performed better in the training.
Hypothesis 3 stated that post-training performance scores on a familiar task would be higher for individuals with clearer rules/goals. Hypothesis 4 stated that post-training performance scores on a familiar task would be higher for participants who played the game with a team (i.e., higher human interaction). Hypotheses 3 and 4 were not supported by the results. In the first post-training activity, performance on the familiar task was statistically the same across conditions for both videogame characteristics. Participants given clear rules/goals (n = 244, M = 0.24, SD = 0.12) and unclear rules/goals (n = 268, M = 0.23, SD = 0.14) demonstrated statistically the same level of post-training performance on the familiar task (F(1, 512) = 0.44, p = 0.51, b = −1.76, SE = 1.35, R2 < 0.01). Likewise, participants who trained individually (n = 147, M = 0.23, SD = 0.13) and who trained with a team (n = 365, M = 0.24, SD = 0.13) had statistically the same level of post-training performance on the familiar task (F(1, 512) = 0.50, p = 0.48, b = −1.81, SE = 1.81, R2 < 0.01).
Hypothesis 5 stated that post-training performance scores on a novel task would be higher for individuals with clearer rules/goals. Hypothesis 6 stated that post-training performance scores on a novel task would be higher for participants who played the game with a team (i.e., higher human interaction). Hypotheses 5 and 6 were not supported by the results. In the second post-training activity, performance on the novel task was statistically the same across conditions for both videogame characteristics. Participants given clear rules/goals (n = 244, M = 0.32, SD = 0.22) and unclear rules/goals (n = 268, M = 0.31, SD = 0.23) had statistically the same post-training scores on the novel task (F(1, 512) = 0.01, p = 0.97, b = −4.14, SE = 2.23, R2 < 0.01). Similarly, participants who trained individually (n = 147, M = 0.29, SD = 0.22) and who trained with a team (n = 365, M = 0.32, SD = 0.23) demonstrated statistically the same post-training scores on the novel task (F(1, 512) = 1.67, p = 0.20, b = −6.76, SE = 2.99, R2 < 0.01).
Hypotheses 7, 8, and 9 stated that human interaction would moderate the relationship between rules/goals clarity and, respectively, mid-training performance, post-training performance on a familiar task, and post-training performance on a novel task. These hypotheses proposed a statistical interaction between the two game characteristics in predicting the three learning outcomes; only one was supported. Hypotheses 7 and 8 were not supported: there was no interaction between rules/goals and human interaction for mid-training performance (F(1, 512) = 0.07, p = 0.80, b = 0.88, SE = 3.40, R2 < 0.01) or post-training performance on the familiar task (F(1, 512) = 0.54, p = 0.46, b = 1.85, SE = 2.52, R2 < 0.01). Hypothesis 9 was supported: there was an interaction between rules/goals and human interaction for post-training performance on the novel task (F(1, 512) = 3.82, p = 0.05, b = 8.13, SE = 4.16, R2 = 0.01); see Figure 2. In Figure 2, the groups were individuals with unclear rules/goals (n = 76), individuals with clear rules/goals (n = 71), teams with unclear rules/goals (n = 192), and teams with clear rules/goals (n = 173). The positive impact of having clear rules/goals on the ability to apply skills to a post-training activity with a novel task was further enhanced by having trained in the videogame with a team.

4. Discussion

Previous researchers have called for empirical research on educational videogames, with experiments designed to systematically explore the players' experience [12]. The purpose of this study was to explore the impact of two videogame characteristics, rules/goals and human interaction, on training performance. Hypotheses 1 and 2 stated that clear rules/goals and high human interaction, respectively, would be positively related to higher mid-training performance. Both videogame characteristics were significantly related to mid-training performance, with clear rules/goals and low human interaction (i.e., training as an individual) improving performance. Neither videogame characteristic had a significant impact on post-training performance on either the familiar task (Hypotheses 3 and 4) or the novel task (Hypotheses 5 and 6). The game characteristics rules/goals and human interaction had a significant statistical interaction only for the post-training novel task activity (i.e., significant for Hypothesis 9 but not for Hypotheses 7 and 8), such that training individually (i.e., low human interaction) dampened the benefits of receiving clear rules/goals when applying skills to a novel context. Implications for these results are discussed further.
It is important to consider that I selected clear rules/goals and human interaction as the characteristics in this study because they are accessible videogame characteristics that can be easily modified by organizations wishing to use a videogame for training purposes. Organizations may have limited control over the design aspects of a videogame due to programming access and capabilities. However, clarifying the rules/goals of a videogame and allowing people to play individually or in groups are two characteristics that organizations have a large degree of control over. Given the evidence that clear rules/goals and playing with a team can impact training performance, this control becomes a particularly useful tool for organizations.

4.1. The Benefits of Clear Rules/Goals

Clear rules/goals produced significantly higher training performance, which is consistent with previous research in which clear instructions prior to training improved performance [21,80,81,82,83]. Logically, it makes sense that clear rules/goals may improve training performance through focused time and attention. Training in a virtual environment requires specialized rules/goals that trainees must invest time and attention to understand. As a result, there is less time and attention to spend on the knowledge and skills in the training. Clear rules/goals allow time and attention to be spent on training-relevant information, which is imperative for learning [84,85,86,87].
The limited-capacity model of attention, e.g., [88], demonstrates that the brain has limited cognitive capacity to attend to information [89,90,91]. One study [92] gathered qualitative data in which participants reported challenges simultaneously playing a training videogame while attending to learning content (i.e., language learning). As the brain filters information to meet the capacity for attention, it prioritizes attention on novel elements. Some researchers have demonstrated that videogame elements irrelevant to the training content (i.e., seductive details) can reduce learning outcomes [56,90,93,94]. Although videogame characteristics are believed to add engaging and immersive elements to training [20], these elements may present distractions and be cognitively draining (i.e., the distraction hypothesis [56]). Clear rules/goals may facilitate learning by dampening the distracting effects of novel information and focusing trainees on the important aspects of the game.

4.2. The Benefits of Human Interaction

Participants who trained individually performed significantly better on the mid-training performance measure than participants placed in teams. This is inconsistent with previous research in which teams generally outperformed individuals in training [63,95,96,97]. Individuals in this study may have performed better than teams because of the time teams needed to interact and plan [98]. According to group bonus theory, groups engage in processes such as transactive memory, e.g., [62], and information elaboration, e.g., [99], that generally benefit learning and performance over time [63,100,101]. These beneficial processes (i.e., process gains) come with a tradeoff described by group deficit theory (i.e., process losses [63,102,103]). That is, groups face drawbacks such as lost time, reduced motivation, additional resource requirements, and ineffective execution (e.g., poor communication [103]). Strategies to reduce process loss include using technology to cut the time lost to coordination [104] and strengthening group identity prior to performance to reduce loss of motivation [105]. Teams in this study may therefore have needed time prior to the training to meet, coordinate, and adjust to working as a group. Teams typically produce greater performance on unitary tasks (e.g., problem solving, decision-making, generating ideas) when they are provided with practice time, multiple consecutive activities, time to engage in cognitive elaboration, and time for cooperative interactions [63,101]. In the current study, the time allowed may not have been sufficient for teams to experience these group processes. Researchers comparing teams and individuals may want to consider allowing time for teams to experience these group processes [63].
In the study, there were examples both of team members working individually and of teams working together, and in the latter it was clear how team interactions affected scores. Anecdotally, several team members demonstrated supportive behaviors (e.g., retrieving tools for one another, flagging broken equipment for teammates to find and repair) that earned no points for themselves but earned points for a teammate. Although participants demonstrating these supportive behaviors earned fewer points on their mid-training performance, they were actively involved in the learning process, asking their teammates questions, sharing information, and watching each other perform tasks. Being part of a team thus required more time for actions such as communicating, planning, and assisting (i.e., process loss), and this coordination likely detracted from mid-training performance scores. However, this learning appears to have benefitted team members in their later post-training performance scores on the novel task, described below. Future studies may benefit from ensuring teams have time to interact prior to and during the training before assessing performance. This could provide useful guidance on training design and development, which will benefit industries with continued interest and investment in team-based training (e.g., military training [106]).

4.3. Evaluating Post-Training Performance

Other researchers have called for empirical evidence on games that assesses post-training performance as a measure of learning [90,107]. In this study, neither game characteristic had a significant impact on post-training performance. Results may have differed between mid- and post-training performance due to the novelty effect. Research on the novelty effect in videogame-based learning [108] suggests that the initial impact of videogame characteristics on learning may not be sustained over time. The novelty effect posits that game characteristics have an immediate impact on trainees' interest and motivation that fades once the novelty of the format wears off [109]. Some researchers have expressed concern about how to address the novelty effect in order to sustain motivation for learning in these formats [110,111]. Researchers have proposed methods to control for the novelty effect, such as measuring motivation at multiple time points to prevent misinterpreting short-term motivation effects [112] or lengthening the time trainees spend in game, since a fundamental component of the novelty effect is that it disappears over time [111,113]. The present findings, in which game characteristics impacted mid-training but not post-training performance, are consistent with the novelty effect.

4.4. Distinguishing Videogame Characteristics

Evidence supported one of the predicted moderations: training as an individual (i.e., low human interaction) dampened the positive relationship between clear rules/goals and post-training performance on the novel task. These findings are consistent with previous research demonstrating an interaction between information on shared mental models for teams and role understanding [114]. Other researchers have similarly shown interactions based on the amount and type of guidance given to training participants [57]. The post-training novel task asked participants to demonstrate the learned skills in a new environment with tools and equipment they had not yet used. Drawing on the novelty effect explanation above, this new context may have revived participant interest and attention by introducing novel content [108], as demonstrated in the improved performance of team members with clear rules/goals. In this interaction, however, individuals performed better with unclear rather than clear rules/goals. Although initially surprising, it is plausible that providing clear rules and goals to an individual hinders later performance if the context in which they learned the information changes.
Individuals training alone may have benefited from unclear rules/goals because they developed resilience to the ambiguity of the game and an understanding that they needed to take initiative to figure it out. In contrast, individuals who received clear rules/goals may have grown accustomed to knowing what to expect in the training and later found it challenging not to receive the same guidance in the novel task. Participants who received unclear rules/goals in the initial training, however, may have developed a tolerance for ambiguous scenarios, leaving them inadvertently prepared for the ambiguity of the novel task. This is consistent with previous research showing that trainees can be taught self-management skills that influence performance outcomes [115]. Future research would benefit from examining the relationship between ambiguous training instructions and preparing trainees for ambiguity, building resilience, and building self-efficacy.
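To make the moderation concept concrete, the crossover pattern described above can be sketched as a simple regression with an interaction term. The following Python sketch uses simulated data with arbitrary cell means (not the study's data or analysis code); the means are chosen only to mimic the reported pattern, in which teams score higher with clear rules/goals while individuals score higher with unclear rules/goals.

```python
import numpy as np

# Hypothetical illustration of a 2x2 moderation: human interaction
# (individual vs. team) moderating the effect of rules/goals clarity
# on novel-task scores. Cell means and sample sizes are invented.
rng = np.random.default_rng(0)

n = 200  # participants per cell (arbitrary)
means = {
    ("individual", "unclear"): 55.0,
    ("individual", "clear"): 50.0,
    ("team", "unclear"): 48.0,
    ("team", "clear"): 58.0,
}

rows = []
for (condition, clarity), mu in means.items():
    for score in rng.normal(mu, 5.0, size=n):
        team = 1.0 if condition == "team" else 0.0
        clear = 1.0 if clarity == "clear" else 0.0
        rows.append((team, clear, score))

data = np.array(rows)
team, clear, y = data[:, 0], data[:, 1], data[:, 2]

# OLS with an interaction term:
#   y = b0 + b1*team + b2*clear + b3*(team*clear)
X = np.column_stack([np.ones_like(y), team, clear, team * clear])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta

# b2 is the simple effect of clear rules/goals for individuals (team = 0);
# b2 + b3 is the simple effect for teams. A crossover moderation means
# these two simple effects have opposite signs.
print(f"clarity effect (individuals): {b2:+.1f}")
print(f"clarity effect (teams):       {b2 + b3:+.1f}")
```

A significant, nonzero `b3` is what "moderation" means statistically: the effect of one predictor depends on the level of the other, so simple effects must be interpreted per condition rather than averaged.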

4.5. The Importance of Game Selection

Careful consideration was given to the criteria for selecting the game used in the current study. As research suggests, videogame-based interventions are most effective when game content and characteristics are aligned to the desired outcome [19,26,116]. Five selection criteria were used in considering games for this study. First, the game needed meaningful learning components applicable to workplace or educational settings. The current game focuses on broadly applicable teamwork skills while also presenting learning content about basic aeronautical knowledge (e.g., use of oxygen, astronaut job tasks, use of tools and materials). Additionally, the current game encouraged cooperative rather than competitive behaviors, further aligning it to the purpose of this study [6]. Second, participants needed to begin gameplay with relatively similar levels of pre-existing knowledge. The current game met this criterion, as advanced pre-existing aeronautical knowledge is rare in the general population. Third, the game needed to be playable either individually or with a team. The current game allowed for both individual and team gameplay; additionally, all tasks could be completed by an individual, even in team gameplay (i.e., no tasks required two players). Fourth, the game needed interesting, immersive components that could increase participants' overall engagement [20,46]. The current game had several immersive components (i.e., player avatars, depleting oxygen levels, a high-fidelity space environment, visibility of other team members' tasks) which made for a more motivating and engaging experience. These components were relevant to the content, limiting the potential effects of seductive details [94]. Fifth, the game needed to be reasonably accessible to the author and participants.
The current game is available free of charge, making it a reasonable option given limited research funding [1,5,19]. Based on these selection criteria, Moonbase Alpha was a satisfactory option for the present study. It is important to note that different or additional study objectives might have called for other selection criteria and altered the choice of videogame-based training.

4.6. Limitations

There are three primary limitations of the current study that I wish to highlight for the interpretation of these findings and the design of future research.
There were large variations in the number of verbal interactions between team members during the videogame-based training. Teams were able to perform the tasks in the videogame without always coordinating with their teammates. As a result, participants assigned to work with a team did not always work as a team; some teams performed as separate individuals in the same virtual world. This may have impacted the results, because interdependent tasks push teams to interact and coordinate with one another towards their objectives [117]. Previous research has demonstrated that teams forced to collaborate showed significant differences in performance compared to teams allowed to work independently (e.g., collaborative teams were better at strategic problem solving [118]); a task with forced interdependence between team members may therefore have changed the study results. However, these results remain relevant because they generalize to the typical drawbacks of using teams in educational and workplace settings: it is common for individuals to be assigned to a team yet still work independently. Thus, these findings represent a true-to-life extension of this body of research.
The post-training activities were not a strong representation of post-training activities that would be observed on the job. The 20 min delay between the mid-training performance measure and the post-training performance measures was relatively short. Additionally, performance was not captured in an actual workplace setting, which would better represent the typical definition of transfer of training [119]. The outcomes may have differed with a greater delay between the mid- and post-training activities. In addition, the post-training activities were not established or validated measures. Although the methods used to develop the measures were reasonable and consistent with scientific theory, they are new and untested measures without construct validity evidence, and results may have changed with different measures.
Previous research has shown the importance of teams having ample time to interact with their members and with the training content in order for learning to occur [105]. This study may not have provided participants with enough time to interact with the game and with one another to demonstrate the positive benefits of working with a team. Having individuals play for extended periods and across several iterations may have allowed for more team interaction and for team processes to emerge [120].

4.7. Future Research

The differences between individuals and teams in training performance were inconsistent with previous research [62,121]. It remains unclear when team versus individual training benefits individual learning. Previous research has noted the benefits of team-based training on unitary tasks such as problem solving and decision-making [63]. However, as training formats shift with the availability of high-fidelity, virtualized environments, there is a greater need to understand the contexts in which learning is enhanced or hampered by human interaction. Future research would benefit from direct comparisons between teams and individuals in other training activities. Other recent studies have similarly called for team and individual comparisons in a training context [122].
Videogame characteristics impacted the learning outcomes in different ways. For example, clear rules/goals improved mid-training performance but had no impact on post-training performance on a familiar task. Other research has highlighted the importance of understanding and clearly defining the type of learning outcome being measured [123]. Future research is needed to identify differences between types of learning outcomes. This is critical because inadequate distinctions between learning outcomes could muddy research findings and generate controversy over the benefits of videogame-based training for different types of learning outcomes. It is also important for researchers to understand how training design (e.g., videogame characteristics) may influence learning outcomes differently, to inform best practices in game design.

5. Conclusions

Results demonstrated that videogame characteristics impacted training performance; clear rules/goals and training as an individual resulted in higher training scores. However, neither videogame characteristic had an impact on post-training performance on either the familiar task or the novel task. For post-training performance on the novel task, where participants applied the trained skills to a new context, human interaction significantly moderated the relationship between clear rules/goals and post-training performance scores: training as an individual dampened the positive impact of clear rules/goals on post-training scores. The promising results of the current study raise several questions that would benefit from additional research. As game-based training and team-based training gain interest in research and application, it is imperative that researchers continue to explore and understand the mechanisms of these relationships.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board of Colorado State University (protocol code 036-16H, date of approval 4 October 2015).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data for this study are available upon request from the author.

Acknowledgments

I would like to thank Alyssa Gibbons, Alex Beck, Amanda Rueda, and Ashley Bamberg for their assistance with data collection and feedback which provided invaluable help with this project.

Conflicts of Interest

The author declares no conflict of interest.

References

1. Orlansky, J.; String, J. Cost-Effectiveness of Flight Simulators for Military Training; Document No. P-1275-VOL-1; Institute for Defense Analyses: Alexandria, VA, USA, 1977.
2. Rowland, K.M.; Gardner, D.M. The Uses of Business Gaming in Education and Laboratory Research. Decis. Sci. 1973, 4, 268–283.
3. Thompson, F.A. Gaming via Computer Simulation Techniques for Junior College Economics Education; ERIC Document Reproduction Service No. ED 021 549; Riverside City College: Riverside, CA, USA, 1968.
4. Agapiou, A. The use and evaluation of a simulation game to teach professional practice skills to undergraduate architecture students. J. Educ. Built Environ. 2006, 1, 3–14.
5. Bowers, C.A.; Jentsch, F. Use of commercial, off-the-shelf, simulations for team research. In Advances in Human Performance and Cognitive Engineering Research; Bowers, C.A., Salas, E., Eds.; Lawrence Erlbaum: Mahwah, NJ, USA, 2001; pp. 293–317.
6. Marlow, S.L.; Salas, E.; Landon, L.B.; Presnell, B. Eliciting teamwork with game attributes: A systematic review and research agenda. Comput. Hum. Behav. 2016, 55, 413–423.
7. Dalla Rosa, A.; Vianello, M. On the effectiveness of a simulated learning environment. Procedia-Soc. Behav. Sci. 2015, 171, 1065–1074.
8. Denby, B.; Schofield, D. Role of virtual reality in safety training of mine personnel. Min. Eng. 1999, 51, 59–64. Available online: https://www.osti.gov/etdeweb/biblio/20007172 (accessed on 27 January 2022).
9. Wood, A.; McPhee, C. Establishing a virtual learning environment: A nursing experience. J. Contin. Educ. Nurs. 2011, 42, 510–515.
10. Zajtchuk, R.; Satava, R.M. Medical applications of virtual reality. Commun. ACM 1997, 40, 63–64.
11. Anderson, G.S.; Hilton, S. Increase team cohesion by playing cooperative video games. CrossTalk 2015, 1, 33–37.
12. Caroux, L.; Isbister, K.; Le Bigot, L.; Vibert, N. Player-video game interaction: A systematic review of current concepts. Comput. Hum. Behav. 2015, 48, 366–381.
13. Dempsey, J.; Lucassen, B.; Gilley, W.; Rasmussen, K. Since Malone’s theory of intrinsically motivating instruction: What’s the score in the gaming literature? J. Educ. Technol. Syst. 1993, 22, 1973–1983.
14. Griffiths, M. The educational benefits of videogames. Educ. Health 2002, 20, 47–51.
15. Jacobs, J.W.; Dempsey, J.V. Simulation and gaming: Fidelity, feedback, and motivation. In Interactive Instruction and Feedback; Dempsey, J.V., Sales, G.C., Eds.; Educational Technology Publications: Englewood Hills, NJ, USA, 1993; pp. 197–228.
16. Pierfy, D.A. Comparative simulation game research: Stumbling blocks and stepping stones. Simul. Games 1977, 8, 255–268.
17. Ricci, K.E.; Salas, E.; Cannon-Bowers, J. Do computer-based games facilitate knowledge acquisition and retention? Mil. Psychol. 1996, 8, 295–307.
18. Seok, S.; DaCosta, B. Predicting video game behavior: An investigation of the relationship between personality and mobile game play. Games Cult. 2015, 10, 481–501.
19. Landers, R.N.; Sanchez, D.R. Game-based, gamified, and gamefully designed assessments for employee selection: Definitions, distinctions, design, and validation. Int. J. Sel. Assess. 2022, 11, 12376.
20. Kirschner, D.; Williams, J.P. Measuring video game engagement through gameplay reviews. Simul. Gaming 2014, 45, 593–610.
21. Blair, L. The Use of Video Game Achievements to Enhance Player Performance, Self-Efficacy, and Motivation. Ph.D. Thesis, University of Central Florida, Orlando, FL, USA, 2011.
22. Bell, B.S.; Kanar, A.M.; Kozlowski, S.W. Current issues and future directions in simulation-based training in North America. Int. J. Hum. Resour. Manag. 2008, 19, 1416–1434.
23. Girard, C.; Ecalle, J.; Magnan, A. Serious games as new educational tools: How effective are they? A meta-analysis of recent studies. J. Comput. Assist. Learn. 2013, 29, 207–219.
24. Fuchsberger, A. Improving decision making skills through business simulation gaming and expert systems. In Proceedings of the Hawaii International Conference on System Sciences, Koloa, HI, USA, 5–8 January 2016; pp. 827–836.
25. Landers, R.N.; Bauer, K.N.; Callan, R.C.; Armstrong, M.B. Psychological theory and the gamification of learning. In Gamification in Education and Business; Reiners, T., Wood, L., Eds.; Springer: New York, NY, USA, 2015; pp. 165–186.
26. Adcock, A. Making digital game-based learning working: An instructional designer’s perspective. Libr. Media Connect. 2008, 26, 56–57.
27. Backlund, P.; Engström, H.; Johannesson, M.; Lebram, M.; Sjödén, B. Designing for self-efficacy in a game-based simulator: An experimental study and its implications for serious games design. In Proceedings of the Visualisation International Conference, London, UK, 9–11 July 2008; pp. 106–113.
28. Lepper, M.R.; Chabay, R.W. Intrinsic motivation and instruction: Conflicting views on the role of motivational processes in computer-based education. Educ. Psychol. 1985, 20, 217–230.
29. Parry, S.B. The name of the game is simulation. Train. Dev. J. 1971, 25, 28–32.
30. Bedwell, W.L.; Pavlas, D.; Heyne, K.; Lazzara, E.H.; Salas, E. Toward a taxonomy linking game attributes to learning an empirical study. Simul. Gaming 2012, 43, 729–760.
31. Erhel, S.; Jamet, E. Digital game-based learning: Impact of instructions and feedback on motivation and learning effectiveness. Comput. Educ. 2013, 67, 156–167.
32. Kampf, R. Are two better than one? Playing singly, playing in dyads in a computerized simulation of the Israeli–Palestinian conflict. Comput. Hum. Behav. 2014, 32, 9–14.
33. Cameron, B.; Dwyer, F. The effect of online gaming, cognition and feedback type in facilitating delayed achievement of different learning objectives. J. Interact. Learn. Res. 2005, 16, 243–258. Available online: https://www.learntechlib.org/primary/p/5896/ (accessed on 28 January 2022).
34. Moreno, R.; Mayer, R.E. Role of guidance, reflection, and interactivity in an agent-based multimedia game. J. Educ. Psychol. 2005, 97, 117–128.
35. Egenfeldt-Nielsen, S. Overview of research on the educational use of video games. Digit. Kompet. 2006, 1, 184–213.
36. Kickmeier-Rust, M.D.; Peirce, N.; Conlan, O.; Schwarz, D.; Verpoorten, D.; Albert, D. Immersive digital games: The interfaces for next-generation e-learning? In Universal Access in Human-Computer Interaction. Applications and Services; Stephanidis, C., Ed.; Springer: Berlin, Germany, 2007; pp. 647–656.
37. Kickmeier-Rust, M.D. Talking digital educational games. In Proceedings of the International Open Workshop on Intelligent Personalization and Adaptation in Digital Educational Games, Graz, Austria, 14 October 2009; Kickmeier-Rust, M.D., Ed.; pp. 55–66.
38. Kirriemuir, J.; McFarlane, A. Literature Review in Games and Learning; NESTA Futurelab Series: Report 8; NESTA Futurelab: Bristol, UK, 2004; pp. 1–35.
39. Michael, D.R.; Chen, S.L. Serious Games: Games That Educate, Train, and Inform; Thomson Course Technology, Muska & Lipman/Premier-Trade: Boston, MA, USA, 2005.
40. Riedel, J.C.K.H.; Hauge, J.B. State of the art of serious games for business and industry. In Proceedings of the Concurrent Enterprising, Aachen, Germany, 20–22 June 2011; pp. 1–8.
41. Sawyer, B.; Smith, P. Serious games taxonomy. In Proceedings of the Game Developers Conference, San Francisco, CA, USA, 18–22 February 2008.
42. Zyda, M. From visual simulation to virtual reality to games. Computer 2005, 38, 25–32.
43. Lamb, R.L.; Annetta, L.; Vallett, D.B.; Sadler, T.D. Cognitive diagnostic like approaches using neural-network analysis of serious educational videogames. Comput. Educ. 2014, 70, 92–104.
44. Landers, R.N.; Landers, A.K. An empirical test of the theory of gamified learning: The effect of leaderboards on time-on-task and academic performance. Simul. Gaming 2015, 45, 769–785.
45. Oksanen, K. Subjective experience and sociability in a collaborative serious game. Simul. Gaming 2013, 44, 767–793.
46. Sedano, C.I.; Leendertz, V.; Vinni, M.; Sutinen, E.; Ellis, S. Hypercontextualized learning games fantasy, motivation, and engagement in reality. Simul. Gaming 2013, 44, 821–845.
47. Blunt, R. Does game-based learning work? Results from three recent studies. In Proceedings of the Interservice/Industry Training, Simulation, & Education Conference, Orlando, FL, USA, 26–29 November 2007; pp. 945–955.
48. Garris, R.; Ahlers, R.; Driskell, J.E. Games, motivation, and learning: A research and practice model. Simul. Gaming 2002, 33, 441–467.
49. Locke, E.A.; Latham, G.P. A Theory of Goal Setting and Task Performance; Prentice Hall: Englewood Cliffs, NJ, USA, 1990.
50. Akilli, G.K. Games and simulations: A new approach in education? In Games and Simulations in Online Learning: Research and Development Frameworks; Gibson, D.G., Aldrich, C.A., Prensky, M., Eds.; Information Science Publishing: Hershey, PA, USA, 2007; pp. 1–20.
51. Alessi, S.M.; Trollip, S.R. Multimedia for Learning. Methods and Development, 3rd ed.; Allyn and Bacon: Needham Heights, MA, USA, 2001.
52. Bergeron, B. Developing Serious Games; Charles River Media: Hingham, MA, USA, 2006.
53. Hays, R.T. The Effectiveness of Instructional Games: A Literature Review and Discussion; Document No. NAWCTSD-TR-2005-004; Naval Air Warfare Center Training Systems Division: Orlando, FL, USA, 2005.
54. Prensky, M. Digital Game-Based Learning; McGraw-Hill: New York, NY, USA, 2001.
55. Vandercruysse, S.; Vandewaetere, M.; Clarebout, G. Game based learning: A review on the effectiveness of educational games. In Handbook of Research on Serious Games as Educational, Business, and Research Tools; Cruz-Cunha, M.M., Ed.; IGI Global: Hershey, PA, USA, 2012; pp. 628–647.
56. Adams, D.M.; Mayer, R.E.; MacNamara, A.; Koenig, A.; Wainess, R. Narrative games for learning: Testing the discovery and narrative hypotheses. J. Educ. Psychol. 2012, 104, 235–249.
57. Leutner, D. Guided discovery learning with computer-based simulation games: Effects of adaptive and non-adaptive instructional support. Learn. Instr. 1993, 3, 113–132.
58. Yerby, J.; Hollifield, S.; Kwak, M.; Floyd, K. Development of serious games for teaching digital forensics. Issues Inf. Syst. 2014, 15, 335–343.
59. Morrison, J.E. Training for Performance: Principles of Applied Human Learning; Document No. 1507; John Wiley & Sons: Hoboken, NJ, USA, 1991.
60. Carayon, P. Handbook of Human Factors and Ergonomics in Health Care and Patient Safety; CRC Press: Boca Raton, FL, USA, 2016.
61. Weaver, S.J.; Salas, E.; King, H.B. Twelve best practices for team training evaluation in health care. Jt. Comm. J. Qual. Patient Saf. 2011, 37, 341–349.
62. Liang, D.W.; Moreland, R.; Argote, L. Group versus individual training and group performance: The mediating role of transactive memory. Personal. Soc. Psychol. Bull. 1995, 21, 384–393.
63. Brodbeck, F.C.; Greitemeyer, T. Effects of individual versus mixed individual and group experience in rule induction on group member learning and group performance. J. Exp. Soc. Psychol. 2000, 36, 621–648.
64. Laughlin, P.R.; Zander, M.L.; Knievel, E.M.; Tan, T.K. Groups perform better than the best individuals on letters-to-numbers problems: Informative equations and effective strategies. J. Personal. Soc. Psychol. 2003, 85, 684–694.
65. Dennis, K.A.; Harris, D. Computer-based simulation as an adjunct to a flight training. Int. J. Aviat. Psychol. 1998, 8, 261–276.
66. Proctor, M.D.; Panko, M.; Donovan, S.J. Considerations for training team situation awareness and task performance through PC-gamer simulated multi-ship helicopter operations. Int. J. Aviat. Psychol. 2004, 14, 191–205.
67. Papargyris, A.; Poulymenakou, A. The constitution of collective memory in virtual game worlds. J. Virtual Worlds Res. 2009, 1, 3–23.
68. Baldwin, T.T.; Ford, J.K. Transfer of training: A review and directions for future research. Pers. Psychol. 1988, 41, 63–105.
69. Kraiger, K. (Ed.) Decision-based evaluation. In Creating, Implementing, and Maintaining Effective Training and Development: State-of-the-Art Lessons for Practice; Jossey-Bass: San Francisco, CA, USA, 2002; pp. 331–376.
70. Bloom, B.S. Taxonomy of Educational Objectives, Handbook; David McKay: New York, NY, USA, 1956.
71. Velada, R.; Caetano, A.; Michel, A.; Lyons, B.D.; Kavanagh, M. The effects of training design, individual characteristics and work environment on transfer of training. Int. J. Train. Dev. 2007, 11, 282–294.
72. van Knippenberg, D.; De Dreu, C.K.W.; Homan, A.C. Work group diversity and group performance: An integrative model and research agenda. J. Appl. Psychol. 2004, 89, 1008–1022.
73. Lewis, K. Measuring transactive memory systems in the field: Scale development and validation. J. Appl. Psychol. 2003, 88, 587–604.
74. Tannenbaum, S.I.; Cerasoli, C.P. Do team and individual debriefs enhance performance? A meta-analysis. Hum. Factors 2013, 55, 231–245.
75. Peters, V.A.; Vissers, G.A. A simple classification model for debriefing simulation games. Simul. Gaming 2004, 35, 70–84.
76. Petranek, C.F.; Corey, S.; Black, R. Three levels of learning in simulations: Participating, debriefing, and journal writing. Simul. Gaming 1992, 23, 174–185.
77. NASA Moonbase Alpha Manual. Available online: http://www.nasa.gov/pdf/527465main_Moonbase_Alpha_Manual_Final.pdf (accessed on 27 January 2022).
78. Koo, T.K.; Li, M.Y. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J. Chiropr. Med. 2016, 15, 155–163.
79. Sanchez, D.R.; Langer, M. Video Game Pursuit (VGPu) scale development: Designing and validating a scale with implications for game-based learning and assessment. Simul. Gaming 2020, 51, 55–86.
80. Earley, P.C.; Northcraft, G.B.; Lee, C.; Lituchy, T.R. Impact of process and outcome feedback on the relation of goal setting to task performance. Acad. Manag. J. 1990, 33, 87–105.
81. Epstein, M.L.; Lazarus, A.D.; Calvano, T.B.; Matthews, K.A.; Hendel, R.A.; Epstein, B.B.; Brosvic, G.M. Immediate feedback assessment technique promotes learning and corrects inaccurate first responses. Psychol. Rec. 2002, 52, 187–201.
82. Pian-Smith, M.C.; Simon, R.; Minehart, R.D.; Podraza, M.; Rudolph, J.; Walzer, T.; Raemer, D. Teaching residents the two-challenge rule: A simulation-based approach to improve education and patient safety. Simul. Healthc. J. Soc. Simul. Healthc. 2009, 4, 84–91.
83. Stajkovic, A.D.; Luthans, F. Behavioral management and task performance in organizations: Conceptual background, meta-analysis, and test of alternative models. Pers. Psychol. 2003, 56, 155–194.
84. Bell, B.S.; Kozlowski, S.W. Adaptive guidance: Enhancing self-regulation, knowledge, and performance in technology-based training. Pers. Psychol. 2002, 55, 267–306.
85. Gist, M.E.; Stevens, C.K. Effects of practice conditions and supplemental training method on cognitive learning and interpersonal skill generalization. Organ. Behav. Hum. Decis. Processes 1998, 75, 142–169.
86. Kozlowski, S.W.; Gully, S.M.; Brown, K.G.; Salas, E.; Smith, E.M.; Nason, E.R. Effects of training goals and goal orientation traits on multidimensional training outcomes and performance adaptability. Organ. Behav. Hum. Decis. Processes 2001, 85, 1–31.
87. Stevens, C.K.; Gist, M.E. Effects of self-efficacy and goal-orientation training on negotiation skill maintenance: What are the mechanisms? Pers. Psychol. 1997, 50, 955–978.
88. Lee, M.; Faber, R.J. Effects of product placement in on-line games on brand memory: A perspective of the limited-capacity model of attention. J. Advert. 2007, 36, 75–90.
89. Kirschner, P.A.; Sweller, J.; Clark, R.E. Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educ. Psychol. 2006, 41, 75–86.
90. Mayer, R.E.; Griffith, E.; Naftaly, I.; Rothman, D. Increased interestingness of extraneous details leads to decreased learning. J. Exp. Psychol. Appl. 2008, 14, 329–339.
  91. Sweller, J. Instructional Design in Technical Areas; ACER Press: Camberwell, Australia, 1999. [Google Scholar]
  92. deHaan, J.; Reed, W.M.; Kuwada, K. The effect of interactivity with a music video game on second language vocabulary recall. Lang. Learn. Technol. 2010, 14, 74–94. [Google Scholar]
  93. Harp, S.F.; Mayer, R.E. The role of interest in learning from scientific text and illustrations: On the distinction between emotional interest and cognitive interest. J. Educ. Psychol. 1997, 89, 92–102. [Google Scholar] [CrossRef]
  94. Harp, S.F.; Mayer, R.E. How seductive details do their damage: A theory of cognitive interest in science learning. J. Educ. Psychol. 1998, 90, 414–434. [Google Scholar] [CrossRef]
  95. Brodbeck, F.; Greitemeyer, T. A dynamic model of group performance: Considering the group members’ capacity to learn. Group Processes Intergroup Relat. 2000, 3, 159–182. [Google Scholar] [CrossRef]
  96. Kirschner, F.; Paas, F.; Kirschner, P.A. Superiority of collaborative learning with complex tasks: A research note on an alternative affective explanation. Comput. Hum. Behav. 2011, 27, 53–57. [Google Scholar] [CrossRef] [Green Version]
  97. Kirschner, F.; Paas, F.; Kirschner, P.A.; Janssen, J. Differential effects of problem-solving demands on individual and collaborative learning outcomes. Learn. Instr. 2011, 21, 587–599. [Google Scholar] [CrossRef]
  98. Onrubia, J.; Rochera, M.J.; Engel, A. Promoting individual and group regulated learning in collaborative settings: An experience in Higher Education. Electron. J. Res. Educ. Psychol. 2015, 13, 189–210. [Google Scholar] [CrossRef] [Green Version]
  99. Maynard, M.T.; Mathieu, J.E.; Gilson, L.L.; Sanchez, D.R.; Dean, M.D. Do I really know you and does it matter? Unpacking the relationship between familiarity and information elaboration in global virtual teams. Group Organ. Manag. 2018, 44, 3–37. [Google Scholar] [CrossRef]
  100. Henry, R.A. Improving group judgment accuracy: Information sharing and determining the best member. Organ. Behav. Hum. Decis. Processes 1995, 62, 190–197. [Google Scholar] [CrossRef]
  101. Schultze, T.; Mojzisch, A.; Schulz-Hardt, S. Why groups perform better than individuals at quantitative judgment tasks: Group-to-individual transfer as an alternative to differential weighting. Organ. Behav. Hum. Decis. Processes 2012, 118, 24–36. [Google Scholar] [CrossRef]
  102. Druckman, D.; Bjork, R.A. In the Mind’s Eye: Enhancing Human Performance; National Academies Press: Washington, DC, USA, 1991. [Google Scholar]
  103. Steiner, I.D. Group Process and Productivity; Academic Press: New York, NY, USA, 1972. [Google Scholar]
  104. Valacich, J.S.; Dennis, A.R.; Connolly, T. Idea generation in computer-based groups: A new ending to an old story. Organ. Behav. Hum. Decis. Processes 1994, 57, 448–467. [Google Scholar] [CrossRef]
  105. Brown, R. Group Processes: Dynamics within and between Groups; Blackwell: Oxford, UK, 1988. [Google Scholar]
  106. Salas, E.; Bowers, C.A.; Cannon-Bowers, J.A. Military team research: 10 years of progress. Mil. Psychol. 1995, 7, 55–75. [Google Scholar] [CrossRef]
  107. Hung, W.; Van Eck, R. Aligning problem solving and gameplay: A model for future research and design. In Interdisciplinary Models and Tools for Serious Games: Emerging Concepts and Future Directions; Van Eck, R., Ed.; IGI Global: Hershey, PA, USA, 2010; pp. 227–263. [Google Scholar] [CrossRef] [Green Version]
  108. Blumenfeld, P.C.; Kempler, T.M.; Krajcik, J.S. Motivation and cognitive engagement in learning environments. In The Cambridge Handbook of the Learning Sciences; Sawyer, R.K., Ed.; Cambridge University Press: Cambridge, NY, USA, 2006; pp. 475–488. [Google Scholar]
  109. Clark, R.E. Reconsidering research on learning from media. Rev. Educ. Res. 1983, 53, 445–459. [Google Scholar] [CrossRef]
  110. Annetta, L.A.; Minogue, J.; Holmes, S.Y.; Cheng, M.T. Investigating the impact of video games on high school students’ engagement and learning about genetics. Comput. Educ. 2009, 53, 74–85. [Google Scholar] [CrossRef]
  111. Tüzün, H. Multiple motivations framework. In Affective and Emotional Aspects of Human-Computer Interaction: Game-Based and Innovative Learning Approaches; Pivec, M., Ed.; IOS Press: Washington, DC, USA, 2006; pp. 59–92. [Google Scholar]
  112. Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory; Prentice Hall: Englewood Cliffs, NJ, USA, 1986. [Google Scholar]
  113. Krendl, K.A.; Broihier, M. Student responses to computers: A longitudinal study. In Proceedings of the Annual Meeting of the International Communication Association, Chicago, IL, USA, 23–27 May 1991. [Google Scholar]
  114. Smith-Jentsch, K.A.; Mathieu, J.E.; Kraiger, K. Investigating linear and interactive effects of shared mental models on safety and efficiency in a field setting. J. Appl. Psychol. 2005, 90, 523–535. [Google Scholar] [CrossRef]
  115. Gist, M.E.; Stevens, C.K.; Bavetta, A.G. Effects of self-efficacy and post-training intervention on the acquisition and maintenance of complex interpersonal skills. Pers. Psychol. 1991, 44, 837–861. [Google Scholar] [CrossRef]
  116. Weiner, E.J.; Sanchez, D.R. Cognitive ability in Virtual Reality: Validity evidence for VR game-based assessments. Int. J. Sel. Assess. 2020, 28, 215–236. [Google Scholar] [CrossRef]
  117. Levine, J.M.; Moreland, R.L. Culture and socialization in work groups. In Perspectives in Perspectives on Socially Shared Cognition; Resnick, L.B., Levine, J.M., Teasley, S.D., Eds.; American Psychological Association: Washington, DC, USA, 1991; pp. 257–279. [Google Scholar]
  118. Behdani, B.; Sharif, M.R.; Hemmati, F. A comparison of reading strategies used by English major students in group learning vs. individual learning: Implications & applications. In Proceeding of the International Conference on Literature and Linguistics, Scientific Information Database, Tehran, Iran, 19–20 July 2016. [Google Scholar]
  119. Fong, G.T.; Nisbett, R.E. Immediate and delayed transfer of training effects in statistical reasoning. J. Exp. Psychol. Gen. 1991, 120, 34–45. [Google Scholar] [CrossRef]
  120. Kanawattanachai, P.; Yoo, Y. The impact of knowledge coordination on virtual team performance over time. MIS Q. 2007, 31, 783–808. [Google Scholar] [CrossRef] [Green Version]
  121. Dyer, J.L. Annotated Bibliography and State-of-the-Art Review of the Field of Team Training as It Relates to Military Teams; Document No. ARI-RN-86-18; Army Research Institute for the Behavioral and Social Sciences: Alexandria, VA, USA, 1986. [Google Scholar]
  122. Volet, S.; Vauras, M.; Salo, A.E.; Khosa, D. Individual contributions in student-led collaborative learning: Insights from two analytical approaches to explain the quality of group outcome. Learn. Individ. Differ. 2017, 53, 79–92. [Google Scholar] [CrossRef]
  123. Clark, R.E. Learning from serious games? Arguments, evidence, and research suggestions. Educ. Technol. 2007, 47, 56–59. Available online: https://www.jstor.org/stable/44429512 (accessed on 28 January 2022).
Figure 1. Screenshots of the game Moonbase Alpha with both the single player (upper panel) and multi-player (lower panel) modes shown.
Figure 2. Human interaction moderating the relationship between rules/goals clarity and post-training performance in a novel task.
Table 1. Means, standard deviations, and correlations for study variables.
| Variable | M | SD | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|---|---|
| 1. Age | 19.65 | 3.43 |  |  |  |  |  |  |  |
| 2. Sex | 0.64 | 0.48 | 0.08 |  |  |  |  |  |  |
| 3. Game Experience | 2.76 | 1.03 | 0.01 | 0.58 ** |  |  |  |  |  |
| 4. Rules/Goals Clarity | 0.48 | 0.50 | 0.02 | −0.05 | −0.03 |  |  |  |  |
| 5. Human Interaction | 0.71 | 0.45 | −0.07 | 0.03 | 0.02 | −0.01 |  |  |  |
| 6. Mid-Training | 0.31 | 0.19 | −0.07 | 0.23 ** | 0.24 ** | 0.13 ** | −0.29 ** |  |  |
| 7. Post-Training—Familiar | 0.24 | 0.13 | −0.15 ** | 0.03 | 0.07 | 0.04 | 0.04 | 0.22 ** |  |
| 8. Post-Training—Novel | 0.31 | 0.23 | −0.01 | 0.30 ** | 0.32 ** | 0.02 | 0.06 | 0.30 ** | 0.23 ** |

M = mean; SD = standard deviation; Sex: 0 = female, 1 = male; rules/goals clarity: 0 = unclear, 1 = clear; human interaction: 0 = individuals, 1 = teams. Reliability for game experience was Cronbach’s alpha = 0.90. ** p < 0.01.
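Zero-order correlations like those reported in Table 1 are Pearson product-moment coefficients. The sketch below is illustrative only: the `mid_training` and `post_novel` vectors are invented five-person score lists, not the study data, and the helper function is a plain-Python implementation rather than any analysis code used in the article.

```python
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical proportion-correct scores for five trainees (not the study data).
mid_training = [0.10, 0.25, 0.30, 0.45, 0.55]
post_novel = [0.15, 0.20, 0.35, 0.40, 0.60]

r = pearson(mid_training, post_novel)
```

With real data, a dedicated statistics package would also supply the p-values flagged by ** in the table; this helper returns only the coefficient.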
Table 2. Hypothesis testing of MANOVA results by learning outcome.
| Predictor | B (SE) | F(1, 512) | p | R² |
|---|---|---|---|---|
| *Mid-Training Performance* |  |  |  |  |
| Age | −0.59 (0.23) | 6.77 | 0.01 | 0.01 |
| Sex | 6.28 (1.97) | 10.16 | <0.001 | 0.02 |
| GE | 3.00 (0.92) | 10.64 | <0.001 | 0.02 |
| RG | −5.66 (1.83) | 9.41 | <0.001 | 0.02 |
| HI | 12.29 (2.45) | 55.83 | <0.001 | 0.10 |
| RG×HI | 0.88 (3.40) | 0.07 | 0.80 | <0.01 |
| *Post-Training Performance—Familiar Task* |  |  |  |  |
| Age | −0.11 (0.28) | 0.16 | 0.69 | <0.01 |
| Sex | 8.09 (2.41) | 11.30 | <0.001 | 0.02 |
| GE | 4.77 (1.12) | 18.01 | <0.001 | 0.03 |
| RG | −4.14 (2.23) | <0.01 | 0.97 | <0.01 |
| HI | −6.76 (2.99) | 1.67 | 0.20 | <0.01 |
| RG×HI | 8.13 (4.16) | 3.82 | 0.05 | 0.01 |
| *Post-Training Performance—Novel Task* |  |  |  |  |
| Age | −0.55 (0.17) | 10.98 | <0.001 | 0.02 |
| Sex | −0.19 (1.46) | 0.02 | 0.90 | <0.01 |
| GE | 0.98 (0.68) | 2.07 | 0.15 | <0.01 |
| RG | −1.76 (1.35) | 0.44 | 0.51 | <0.01 |
| HI | −1.81 (1.81) | 0.50 | 0.48 | <0.01 |
| RG×HI | 1.85 (2.52) | 0.54 | 0.46 | <0.01 |

Note. N = 513. Unstandardized regression coefficients (B) are reported with standard errors in parentheses. GE = game experience; RG = rules/goals clarity; HI = human interaction; RG×HI = tested interaction term.
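The moderation tests in Table 2 regress each outcome on the dummy-coded characteristics plus their product term. A minimal sketch of that kind of test, fit by ordinary least squares on simulated data (all coefficient values and variable names below are invented for illustration, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Dummy-coded predictors: rules/goals clarity (0 = unclear, 1 = clear)
# and human interaction (0 = individuals, 1 = teams), as coded in Table 1.
rg = rng.integers(0, 2, n)
hi = rng.integers(0, 2, n)

# Simulated outcome with a built-in RG x HI interaction (coefficients invented).
y = 0.20 + 0.10 * rg - 0.05 * hi + 0.15 * rg * hi + rng.normal(0, 0.1, n)

# Design matrix: intercept, both main effects, and the product (moderation) term.
X = np.column_stack([np.ones(n), rg, hi, rg * hi])
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
b0, b_rg, b_hi, b_int = beta
```

A non-negligible `b_int` means the effect of rules/goals clarity differs between individuals and teams, which is the pattern Figure 2 depicts for the novel task; in practice its significance would be judged with the F- and p-values reported in Table 2.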
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Sanchez, D.R. Videogame-Based Training: The Impact and Interaction of Videogame Characteristics on Learning Outcomes. Multimodal Technol. Interact. 2022, 6, 19. https://doi.org/10.3390/mti6030019
