Article

Critical Thinking Skills Enhancement through System Dynamics-Based Games: Insights from the Project Management Board Game Project

1 Dipartimento di Studi Aziendali e Giuridici, University of Siena, 53100 Siena, Italy
2 Dipartimento di Scienze Umane, IUL of Florence, 50122 Firenze, Italy
3 Dipartimento di Scienze Economico-Aziendali e Diritto per l’Economia, University of Milano-Bicocca, 20126 Milano, Italy
4 School of Advanced Defense Studies (CASD), Defense Analysis and Research Institute, 00165 Rome, Italy
* Author to whom correspondence should be addressed.
Systems 2023, 11(11), 554; https://doi.org/10.3390/systems11110554
Submission received: 9 October 2023 / Revised: 3 November 2023 / Accepted: 14 November 2023 / Published: 19 November 2023
(This article belongs to the Special Issue The Systems Thinking Approach to Strategic Management)

Abstract:
This study aims to explore and discuss the role of systems thinking and system dynamics-assisted games in enhancing critical thinking skills in learners. In more detail, the study relies on the use of a system dynamics-based interactive learning environment related to project management issues, followed by systems thinking-supported debriefing sessions. The interactive learning environment was developed and used in the form of a single-player, online, computer-based game. The game was designed to mimic all the necessary planning and operational activities needed to organize a wedding ceremony. The acquisition of critical thinking skills in learners was evaluated in three main ways: (1) players’ performances were analyzed through a scoring system embedded in the game that considers several performance dimensions; (2) feedback from the players was collected and analyzed by using basic content analysis; (3) players’ performances were analyzed using five main categories of structures that are typical of project management domains, i.e., project features, the rework cycle, project control, ripple effects, and knock-on effects. The findings show that the joint use of system dynamics and systems thinking tools and principles within a gaming environment has the potential to facilitate and enhance the acquisition of critical thinking skills in learners and may also provide valid support for educators and practitioners interested in the enhancement of project management skills.

1. Introduction

For decades, the relevant literature has emphasized the important role played by critical thinking skills in decision-making processes. Originally coined by John Dewey [1] to denote an educational goal, seen as a scientific attitude or state of mind, the concept of critical thinking has subsequently been molded and used by scholars from various disciplines, such as cognitive psychology and philosophy [2,3,4].
One of the most comprehensive definitions was given by Michael Scriven and Richard Paul [5]; according to these scholars, “Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action”.
At present, critical thinking is seen as a fundamental educational goal for teaching and training programs that are increasingly required to teach, transfer, and enhance critical thinking-related skills in learners (e.g., [6]). More specifically, among the main categories of skills frequently mentioned by the relevant literature, educational programs are currently asked to enhance the following abilities in learners [7]:
  • Analyze and account for one’s own biases in judgment and experience;
  • Break down a problem into its parts to expose its underlying logic and assumptions;
  • Gather and evaluate pertinent evidence from one’s own observations and experimentation or outside sources;
  • Modify and reevaluate one’s own thinking in light of what has been learned;
  • Form a reasoned assessment to propose a solution to a problem or a more accurate understanding of the topic at hand.
Notably, the relevant literature (e.g., Kolb, and Hamilton and Klebba [8,9]) has emphasized that the acquisition and development of critical thinking skills are particularly facilitated when learners can build these skills through a process in which personal experience and experiential learning play a relevant role and in which interaction is facilitated.
Whereas previous studies have already underlined that several tools and methods (e.g., [2,10,11]) can be used to ensure the aforementioned conditions, calls for more research in the field (e.g., Cicchino [12], McDonald [13], and Efendi [14]) have also explicitly pointed to the potential of gaming and simulation tools. In this regard, it is noteworthy that it has been already demonstrated that various forms of games can be used effectively in management education and training programs, as exemplified by the work of Salas et al. [15]. Such games may include a variety of typologies, spanning from traditional board games to more technologically advanced simulation models, and even comprehensive computer management flight simulators [16,17,18].
Within this context, one typology of tool that is increasingly used in education and training is that of the “interactive learning environment” (hereafter ILE). As its name suggests (see Atkinson and Renkl [19] for a more detailed definition), an ILE is a computer-based environment aimed at facilitating the interaction between a model (in this case, a system dynamics computer-based simulation model) and a user/player (i.e., a learner) with the final goal of enhancing knowledge acquisition [20].
Ensuring risk-free and safe settings where the players engage with the underlying model and the game as well as with one another [21], ILEs facilitate the analysis of complex business issues, the creation of shared policies, the testing of mental models, an improvement in sense making and reframing, and, last but not least, the acceleration of knowledge creation and sharing (e.g., [22,23,24,25]). In addition, ILEs have the potential to support decision making and the generation of policy insights while letting participants enjoy themselves and participate fully in the process [26,27,28]. In this regard, ILEs may also provide powerful tools to analyze behavioral factors related to decision-making processes [29,30,31,32,33].
The ILE presented in this study is system dynamics (SD)-based [28,29,30,31,32], i.e., an interactive game built on the methodology theorized by Jay Forrester [34,35], which is particularly well suited to representing, simulating, and analyzing complex and dynamic domains.
With this said and for all the reasons above, this study specifically aims to investigate if and how SD-based ILEs can enhance critical thinking skills in learners, with specific regard to the project management field.
From a methodological point of view and to address the research question mentioned above, this paper relies on the experience gained from an Erasmus+ KA2 project (named the “Project Management Board game”, hereafter PMBoG) aimed at developing an SD-based ILE and using it for teaching purposes in the field of project management (PM). The paper will present the main features and some results from a simulation session. The acquisition of critical thinking skills in learners was assessed in three main ways: (1) Players’ performances were analyzed through a scoring system embedded in the game that took into account several performance dimensions. (2) Feedback from the players was collected and analyzed. In detail, players were interviewed and required to describe their learning experience, the strategies they used during the game, and the perceived learning points of their learning experience. Players’ feedback was subsequently investigated using basic content analysis. (3) Players’ performances were analyzed using the five categories of structures identified by Lyneis and Ford [36] that are typical of project management domains, i.e., project features, the rework cycle, project control, ripple effects, and knock-on effects.
Preliminary results and findings show that SD-based ILEs have the potential to facilitate and enhance critical thinking skills acquisition in learners and may also provide valid support for educators and practitioners interested in the enhancement of project management skills in learners.
This study is structured as follows: Section 2 provides a brief literature review of the concept of critical thinking, specifically describing the main categories of critical thinking-related skills and abilities and the tools that can be used to enhance them in learners. In this context, the article subsequently describes the main characteristics of interactive learning environments, briefly discussing how and why they could be used to enhance critical thinking skills—in the case of this research, when applied to analyze issues and options in the field of project management. Section 3 presents the research design and the ILE developed for the PMBoG project, particularly describing its aims, structure, and functionalities. Section 4 reports some of the results. Section 5 provides the discussion and highlights some insights stemming from the PMBoG project. The conclusion, the limitations, and some ideas for further research conclude this article.

2. A Brief Literature Review and the Theoretical Framework

2.1. The Concept of Critical Thinking

The relevant literature has long emphasized that critical thinking is progressively emerging as a fundamental cognitive skill that decision-makers should have—or learn—and practice, as emphasized by Dewey [1].
As mentioned in the introductory section of this article recalling seminal works and definitions such as the one given by Scriven and Paul [5], critical thinking can be defined as the ability that enables people to analyze, assess, synthesize, and use information in a systematic and objective way. In broad terms, this concept entails being able to use a range of mental processes, such as analysis, inference, evaluation, interpretation, explanation, and self-regulation, with the ultimate goal of supporting decision making, i.e., to arrive at well-reasoned and informed judgments about complex issues and problems [37,38]. The six attributes mentioned above can be further described as follows:
  • Analysis involves breaking down complex information into smaller parts, identifying patterns, and evaluating evidence to make a reasoned judgment.
  • Evaluation involves assessing the credibility and relevance of information, evaluating arguments, and determining the strengths and weaknesses of different viewpoints.
  • Inference involves drawing conclusions based on available evidence and making predictions about future events based on past experiences and patterns.
  • Interpretation involves understanding the meaning of information, analyzing its significance, and applying it to new situations.
  • Explanation entails communicating complex ideas or concepts clearly and concisely, using appropriate evidence and reasoning to support arguments.
  • Self-regulation involves monitoring one’s own thinking and behavior, recognizing biases and assumptions, and adjusting one’s approach based on feedback and new information.
The previous literature has emphasized that critical thinking is a vital skill for all individuals, as it allows them to think critically and creatively about the world around them, make informed decisions, and solve complex problems effectively (e.g., [39]). Specifically, previous research in the educational field has pointed to various tools as ways of facilitating or enhancing the acquisition of critical thinking skills, as shown by Behar-Horenstein and Niu [40] and by Alsaleh [41].
From an assessment point of view, previous research also emphasized that the acquisition of critical thinking skills may be evaluated with the use of various methods and tools, such as the following.
(a) Standardized tests, such as the California Critical Thinking Skills Test (CCTST) and the Watson–Glaser Critical Thinking Appraisal: these tests are used to evaluate skills such as analysis, inference, evaluation, and deductive reasoning—as discussed in Bernard et al. [42]—and have already been employed in several studies, such as the work by Alkharusi [43] focused on university students.
(b) Portfolios can be used to collect and evaluate students’ work over time. This method provides a more comprehensive view of students’ critical thinking skills development as it captures evidence of their progress and growth over an extended period, as demonstrated by the work of Coleman et al. [44] that focuses on students active in the social work education domain.
(c) Peer and self-assessment: as also discussed by Siles-González and Solano-Ruiz [45], peer and self-assessment can be effective methods for assessing critical thinking skills as they encourage students to reflect on their own thinking and provide feedback to their peers.
(d) Observation and performance tasks: observing students while they engage in tasks that require critical thinking can provide valuable information on their skills. Performance tasks can vary and include activities such as writing essays or analyzing arguments, which can be assessed using rubrics or scoring methods, as Kankaraš and Suarez-Alvarez emphasize [46].
(e) Interviews can be used to assess critical thinking skills by directly asking students to reflect on their thinking processes and reasoning, as demonstrated in the study by Jaffe et al. [47]. This method provides insights into how students approach problems, make decisions, and evaluate arguments, and allows individually eliciting information about the perceived acquisition of critical skills, as also discussed in Tiwari et al. [48].
(f) Classroom-based assessments can include a variety of methods such as questioning techniques, class discussions, and simulations. These methods provide opportunities for students to demonstrate their critical thinking skills in a classroom setting (e.g., Zepeda [49]).
Whereas each of the methods mentioned above—or a combination of them—can be used to assess critical thinking skills acquisition effectively, it is ultimately essential to choose methods that align with the learning objectives and the contexts and problems under analysis. In detail, the extant literature (such as the work by Cicchino [12], McDonald [13], and Efendi [14]) explicitly calls for research on the use of interactive games in class, with the specific aim of fostering the acquisition of critical thinking skills. Notably, the employment of systems thinking and system dynamics as the underlying methodologies able to assist the design and use of such games and, subsequently, favor the acquisition of critical thinking skills has long been advocated, starting from the well-known work by Richmond [50]. In this regard, this study focuses on the use of SD-based ILEs in the field of project management, as discussed in the following sections.

2.2. The Use of SD-Based Games and ILEs to Enhance Critical Thinking

Games have been used for centuries as a tool for entertainment, social interaction, and learning [17,18,51,52,53].
Of specific relevance to this study, games also offer several strengths when used to enhance critical thinking (as shown by Cicchino [12]), as follows.
Firstly, games require players to solve problems in real time. Players are required to make quick decisions based on available information, identify patterns, and anticipate the outcomes of their actions. These skills are vital in everyday life as they help individuals make informed decisions, avoid risky behaviors, and solve complex problems [54].
Secondly, games provide an opportunity for players to test their hypotheses, mental models, and theories [55]. Players can experiment with different strategies and tactics and analyze the outcomes of their decisions. This process of experimentation helps players develop a better understanding of how the world works and improve their decision-making skills.
Thirdly, games can help learners acquire a general knowledge about the decision-making environment they are challenged with, thereby going beyond the specific decisions they are called upon to make [56].
Finally, games promote collaboration and teamwork. Quite commonly, games require players to work together to achieve a common goal. This requires effective communication, collaboration, and a willingness to share ideas and information. These skills are essential in today’s workplace, where teamwork is a crucial aspect of an organization’s success [52].
Overall, games offer a fun and engaging way to enhance learning in and about complex issues and domains, since they provide an opportunity for players to practice decision making, problem solving, and collaboration in a safe and controlled environment. As a result, games can also be an effective tool for promoting critical thinking skills in individuals of all ages.
As already mentioned, critical thinking refers to the ability to analyze, evaluate, and synthesize information to make informed decisions. In this context, games can effectively promote critical thinking skills by engaging players in a series of challenges that require them to think creatively and strategically, experiencing first-hand the results, the impacts, the consequences, and also the side effects of their decisions and actions.
Notably, SD-based games have long proved effective and are increasingly used in education and training programs (e.g., [16,24,26,28,57]).
As theorized by Jay Forrester [34,35], system dynamics is a methodology that focuses on understanding the behavior of complex systems, using computer-based models to simulate and analyze the interactions between the various components of the system under investigation. At the core of SD are four key concepts that are also fundamental to addressing critical thinking-related matters [35,58,59]:
  • Systems are considered as a whole;
  • Emphasis is placed on the internal structure of the system as the cause of its dynamic behavior;
  • Rather than considering relationships in a model as being linear for the sake of simplicity, emphasis is placed on the non-linear character of many relationships;
  • Process delays (e.g., information delays) in social systems are considered important.
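The four concepts above can be illustrated with a minimal, self-contained simulation sketch. This is our own illustrative example, not the PMBoG model: the system is simulated as a whole, its behavior is generated entirely by its internal structure, one relationship is nonlinear, and an explicit information delay is modelled.

```python
# Minimal sketch (not the PMBoG model). All names and numbers are illustrative.

def simulate(horizon=24.0, dt=0.25):
    backlog = 100.0          # stock: tasks waiting to be processed
    perceived_backlog = 0.0  # stock: delayed information about the backlog
    delay = 3.0              # first-order information delay (time units)
    trace = []
    for _ in range(int(horizon / dt)):
        # nonlinear relationship: the completion rate saturates as the
        # backlog grows instead of rising linearly with it
        completion = 20.0 * backlog / (backlog + 50.0)
        # perception adjusts toward the true backlog with a delay
        perception_change = (backlog - perceived_backlog) / delay
        # Euler integration of both stocks
        backlog += dt * (-completion)
        perceived_backlog += dt * perception_change
        trace.append((backlog, perceived_backlog))
    return trace

trace = simulate()
```

Even in this tiny model, the behavior (a backlog that drains ever more slowly while its perceived value lags behind) emerges from the interaction of structure, nonlinearity, and delay rather than from any single equation.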
Several tools can be used when developing an SD-based game, such as causal maps (e.g., causal loop diagrams and stock and flow diagrams), quantitative simulation models, and interactive learning environments (ILEs).
This study employs an SD-based ILE that is about project management, a context that offers various characteristics that would favor the use of SD-based models and simulators for training and educational purposes.

2.3. System Dynamics, Critical Thinking, and the Field of Project Management

The field of project management may benefit greatly from the use of system dynamics principles and tools to enhance critical thinking skills in learners (e.g., [36,59,60,61,62,63]).
It is first worth clarifying what a project is and which key distinctive skills project managers should have and apply when managing a project management intervention. In broad terms, a project can be considered a series of activities and tasks (performed in parallel or in series) that [64]:
  • Have a specific objective (scope) to be completed within certain specifications (requirements);
  • Have defined start and end dates;
  • Have funding limits;
  • Consume and/or utilize resources.
Projects are overall challenging to plan and manage, even when various forms of control are used simultaneously [65], and critically depend on project conditions and the ability of project managers to properly plan, execute, and control all the various tasks and activities associated with them [66].
With specific regard to this study, it should also be underlined that project conditions and performance typically evolve over time, quite frequently because of feedback responses (some involving nonlinear relationships), time delays, and accumulations of project progress and resources. Interestingly, these factors may also generate side effects and adverse dynamics—a few of which have been well identified and explained by the relevant literature (see Sterman [59] for several examples of side effects)—or even induce the failure of the project (as discussed by Pinto and Mantel [67] or by Al-Ahmad et al. [68]).
With this said, in this work, we advocate relying on SD and, more specifically, on SD-based ILEs for PM, for several reasons.
First, with projects growing in complexity, there has been a corresponding rise in the requirement for approaches capable of handling this complexity. This is true for all projects, even the largest ones [59].
Second, SD helps project managers “represent” their work, making it clear which tasks need to be finished in what order across time (see, for example, the work by Lyneis and Ford [36]).
Third, SD offers the tools to strategically support project managers throughout all the different phases of their projects, from the design to the implementation (as discussed in Lyneis et al. [69]). In this context, various SD tools can also help discover the side effects of the actions being carried out and the challenges that project managers might be called on to face (e.g., [70]).
Additionally, SD is suitable to improve the understanding of what a project will entail, in terms of resources (e.g., financial resources, human resources, and time) needed for its completion (e.g., [61,62]).
Last, SD may facilitate the understanding of clients’ and stakeholders’ needs throughout the whole project lifecycle (as well emphasized by Rodrigues and Williams [71]).
As the underlying point of reference for this study and about the use of SD in the field of PM, the model and the ILE presented in the following sections were developed to take into account the five categories of structures that Lyneis and Ford [36] identified to describe project management domains and decision making in such contexts, i.e.: (a) project features; (b) the rework cycle; (c) project control; (d) ripple effects; and (e) knock-on effects. All these structures are represented in Figure 1.
The first category is “project features”. As mentioned by Lyneis and Ford [36] (p. 159), “Projects almost always consist of a collection of tasks that are performed in parallel and in series. Therefore, a principal feature of all system dynamics project models is the representation of development tasks or work packages as they flow through a project”. “Project features” are considered by listing and accounting for the development tasks required to carry out the project. From a modelling standpoint, development tasks are represented as a stock of tasks to be carried out that flow into the stocks of tasks carried out as soon as they have been completed due to the actions carried out by the decision-makers by using the resources at their disposal.
A “rework cycle” is often associated with this first structure. Errors are usually considered the cause underlying it: tasks completed with undetected errors accumulate as undiscovered rework which, once discovered, feeds back into the stock of tasks to be carried out, thereby requiring additional effort and new resources.
The third category of structure is related to controlling feedback. This entails performing “project control”. This step requires modeling the controlling feedback loops through which management attempts to close gaps between project performance and targets (on time, on budget, and with the desired quality and specifications). Two main methods are usually considered and modelled in this regard: project managers may move project behavior closer to targets (e.g., work overtime) or move targets toward project behavior (e.g., push forward a deadline). Notably, both methods imply costs (monetary and other types).
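The rework-cycle and project-control structures just described can be sketched in a few lines of simulation code. The following is a hedged illustration in the spirit of Lyneis and Ford's structures; the variable names, parameter values, and the simple hiring rule are our own assumptions, not elements of the PMBoG model.

```python
# Illustrative rework cycle + project control (not the PMBoG model).

def run_project(weeks=40, deadline=25, dt=1.0):
    work_to_do = 100.0         # stock: tasks not yet attempted (or re-opened)
    undiscovered_rework = 0.0  # stock: flawed tasks believed to be complete
    work_done = 0.0            # stock: tasks correctly completed
    staff = 4.0                # controllable resource (people)
    productivity = 1.0         # tasks per person per week
    error_fraction = 0.2       # share of completed tasks that are flawed
    discovery_time = 4.0       # average weeks needed to discover a flaw
    for week in range(int(weeks / dt)):
        completion = min(work_to_do / dt, staff * productivity)
        rework_discovery = undiscovered_rework / discovery_time
        # Project control: if the projected finish date overshoots the
        # deadline, management adds capacity (overtime/hiring) -- the kind
        # of well-intentioned response whose side effects would feed the
        # "ripple effect" structures.
        remaining = work_to_do + undiscovered_rework
        net_rate = max(staff * productivity * (1 - error_fraction), 1e-6)
        if week + remaining / net_rate > deadline:
            staff += 0.5
        # Euler integration of the three stocks (total tasks are conserved)
        work_to_do += dt * (rework_discovery - completion)
        undiscovered_rework += dt * (error_fraction * completion - rework_discovery)
        work_done += dt * ((1 - error_fraction) * completion)
    return work_done, undiscovered_rework

done, hidden = run_project()
```

The alternative control lever (moving the target toward behavior, e.g., pushing the deadline forward) would simply relax the `deadline` condition instead of adding staff; both responses carry costs, as noted above.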
Projects are also characterized by side effects and unintended consequences [59] of the actions carried out. Two additional categories of structures are used to model such unintended effects.
The so-called “ripple effects” are the primary side effects of well-intentioned project control efforts (e.g., policy resistance). These effects typically reduce productivity or quality (by increasing the error fraction of rework).
Additionally, the so-called “knock-on effects” are effects that may be caused by processes that produce excessive or detrimental concurrence or human factors that amplify the negative effects via channels such as morale.
With this said, whereas the previous literature has already provided good examples of how SD maps and models can be effectively used to analyze project management-related operational and decision-making contexts (e.g., [59,61,62,63,71]), further research is needed in the field of SD-based ILEs, specifically when these tools are used in educational and training programs to enhance critical thinking skills.
To this aim, the next section is devoted to presenting our research design and the main features of the SD-based PMBoG ILE.

3. Research Design and a Presentation of the SD-Based PMBoG Game

3.1. Research Design: Overview

This study relies on the experience gained by the authors developing an Erasmus+ KA2 project named “Project Management Board Game” (PMBoG).
The PMBoG project entailed the participation of various partners with the final aim of developing both a board game and a computer-based game. The two games are meant to be used with numerous categories of learners (e.g., bachelor students, MBA students, doctoral students, adult learners, etc.) in education programs about project management issues and competencies. This article refers exclusively to the stages that involved the authors in developing the SD model during the project and subsequently testing the SD-based ILE game in class.
The SD model and the ILE were developed relying on the tools provided by the software package Stella Architect. The SD computer model is at the core of the simulator, while a detailed and interactive graphical interface facilitates the users’ interactive gaming and learning experience.
The ILE-based PMBoG game can be described as:
  • Online: the PMBoG ILE is available online at the following address: https://exchange.iseesystems.com/public/barnaf/pmbog-ile/index.html#page1 accessed on 13 November 2023;
  • Single-player: the PMBoG ILE allows for one user at a time to play the game;
  • Symmetric: any player accessing the PMBoG ILE will have a set of decision-making levers at their disposal that are equal for all the users;
  • Competitive: the PMBoG ILE stimulates users to perform, that is to say, complete all the tasks successfully and with the highest score possible.
The overall task assigned to players is that of organizing a wedding ceremony and carrying out all the needed activities, on time, with the budget at their disposal, and according to the requirements they are informed about at the beginning of the simulation.
In more detail, over a time horizon of a few months, the player—acting in the game as the wedding planner—is required to satisfy all the needs of a given wedding couple. To this aim, the wedding planner is required to complete six main typologies of tasks (i.e., choosing a restaurant, location, dress, announcements, rings, and photographer) within the deadlines of a specific schedule, based on the available budget, and according to the specific couple’s requirements. The final score of the game depends on the ability of the user to reach all the goals mentioned above, e.g., completing the tasks and fulfilling the couple’s requirements.
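To make the multi-dimensional nature of such a score concrete, the following is a purely hypothetical sketch; the weights, function name, and inputs are our own assumptions for illustration and are not the game's actual scoring rules, which are detailed later in the paper.

```python
# Hypothetical scoring sketch (NOT the game's real scoring system).

def score(tasks_completed, tasks_total, budget_used, budget_total,
          requirements_met, requirements_total, on_time):
    completion = tasks_completed / tasks_total
    # only overspending is penalized; staying within budget scores fully
    overspend = max(0.0, budget_used - budget_total) / budget_total
    budget_score = max(0.0, 1.0 - overspend)
    requirement_score = requirements_met / requirements_total
    time_bonus = 0.1 if on_time else 0.0
    raw = 0.4 * completion + 0.3 * budget_score + 0.3 * requirement_score + time_bonus
    return round(min(1.0, raw) * 100)  # final score capped at 100

# e.g., all six tasks done, on budget, all requirements met, on time
best = score(6, 6, 10000, 10000, 6, 6, True)
```

A weighted scheme of this kind rewards the same three goals the game sets for the wedding planner: task completion, budget compliance, and requirement fulfilment, with timeliness as a bonus.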
Adopting the categories of possible game-design choices described by Van Daalen, Schaffernicht, and Mayer [72], the PMBoG ILE can be further characterized as follows (see Table 1).
The ILE was used in exploratory tests with various categories of learners. In this study, we report the results and the insights gathered during a simulation session that involved six participants. The players were doctoral students with a major in accounting and rather generic prior knowledge in the field of project management. The choice of doctoral students as the main category of learners is motivated by the relevant literature, specifically when active learning approaches are employed (for example, see the work by Schaller et al. [73]).
Players’ performance and the subsequent acquisition of critical thinking skills were preliminarily evaluated in three ways, as follows.
First, players’ performances were analyzed through a scoring system embedded in the game that considers several performance dimensions. As mentioned in the second section of this article and as highlighted by other studies (e.g., [46]), scoring systems may be useful to evaluate whether and to what extent participants make use of critical thinking skills to complete the tasks they are assigned. Full details about the scoring system used in this research and embedded in the game are provided subsequently in the paper.
Second, feedback from the players was collected and analyzed. In detail, players were interviewed and asked to describe their learning experience, the strategies they used during the game, and the perceived learning points of their experiences. As mentioned above and as demonstrated by other studies (e.g., Jaffe et al. [47]), interviews may provide insights into how students approach problems, make decisions, and evaluate arguments. Players’ feedback was subsequently investigated using basic content analysis principles (as explained by Krippendorf [74]) to understand if the SD-based ILE helped learners acquire and develop critical thinking skills in the peculiar context under analysis, i.e., a PM-related domain. Notably, interviews also allow individually eliciting information about the perceived acquisition of critical skills (e.g., Tiwari et al. [48]).
Third, players’ performances were analyzed using the five categories of structures identified by Lyneis and Ford [36] to describe project management domains and decision making in such contexts, i.e.: (a) project features; (b) the rework cycle; (c) project control; (d) ripple effects; and (e) knock-on effects.
Overall, by requiring students to actively build upon their prior knowledge, the research design we just discussed is also consistent with the use of qualitative research methodologies to analyze learning directly in the classroom, as emphasized by Novak and Gowin [75].
More details about the choices for the research design employed in this study are provided below.

3.2. An Overview of the PMBoG SD-Based Model and ILE

According to Sterman [59], an SD intervention is usually structured as follows:
  • Articulate the problem that needs to be addressed;
  • Formulate a dynamic hypothesis or theory about the causes of the problem;
  • Build the simulation model to test the dynamic hypothesis;
  • Test the model;
  • Design and evaluate policies.
With this said, while in broad terms the PMBoG project aimed to create a game about PM issues, dynamics, and competencies, the first practical step of our analysis was to define the specific context and problem to be addressed in order to develop the SD-based model and the ILE.
The research group initially considered several options, which were then scored against the competencies to be taught and transferred to learners. After this scoring phase, the context chosen for the project was a wedding, with the players taking on the role of wedding planners.
As mentioned, the PMBoG-related game was developed and is played as an SD-based ILE. Several meetings involving the various project units were organized to share information and initially develop the model according to a participatory perspective (e.g., Videira et al. [76]). More advanced technical details were subsequently agreed upon using group model-building techniques, as suggested by Vennix [58].
The model was developed primarily using stock-and-flow diagrams, while causal loop diagrams were used less frequently during the development phase of the ILE.
Fully developed, the model has the following main characteristics.
  • Time horizon: 7 months, with start time at −1 and stop time at 6.
  • DT: 1 month.
  • Variable counts: stocks: 117; flows: 157; converters: 628; constants: 249; equations: 536; graphicals: 0.
  • Integration method: Euler.
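To clarify what a DT of 1 month and Euler integration imply for the model's behavior, the following minimal sketch shows how a single stock is advanced over the stated time horizon (start time −1, stop time 6). The function and variable names are hypothetical; the actual PMBoG equations are far richer.

```python
# Minimal sketch of how an SD engine advances one stock with Euler
# integration: stock += inflow * DT at every step of size DT.
def euler_run(stock0, inflow_fn, start=-1.0, stop=6.0, dt=1.0):
    """Integrate a single stock over the model's time horizon."""
    t, stock = start, stock0
    trajectory = [(t, stock)]
    while t < stop:
        stock += inflow_fn(t, stock) * dt  # Euler step
        t += dt
        trajectory.append((t, stock))
    return trajectory

# Example: workload completed at a constant 2 units/month over 7 months.
run = euler_run(0.0, lambda t, s: 2.0)
```

With DT = 1 month, each press of the game's "ADVANCE Simulation" button corresponds to one such integration step.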
The model was subsequently validated using several key validation tests (as suggested by Senge and Forrester [77] and Barlas [78]), both direct structure tests (e.g., the dimensional consistency test and the direct extreme-condition test) and structure-oriented behavior tests (e.g., the extreme-condition test and the boundary adequacy test).
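A direct extreme-condition test of the kind mentioned above can be illustrated with a toy equation (the function below is a hypothetical simplification, not the model's actual formulation): at the extreme of zero workforce, no workload can be completed.

```python
# Hypothetical simplified model equation: workload units completed per month.
def completion_rate(workers, productivity, remaining_workload):
    return min(workers * productivity, remaining_workload)

# Extreme-condition checks: zero workers -> zero completion;
# zero remaining workload -> nothing left to complete.
assert completion_rate(workers=0, productivity=1.5, remaining_workload=4) == 0
assert completion_rate(workers=3, productivity=1.5, remaining_workload=0) == 0
```

Equations that violate such checks under extreme inputs would signal structural flaws, which is precisely what this family of tests is designed to expose.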

3.3. Description of the PMBoG ILE

The ILE consists of several windows that:
  • First, provide preliminary information about the PMBoG project and the game;
  • Subsequently, describe the task assigned to the player and the main instructions to follow to play and interact properly with the simulation model;
  • Lastly, allow the player to play the game and check the results of the simulation.
In total, 16 pages populate the ILE. The most relevant ones are described below.
The first two pages of the ILE provide the entry point to the game and some preliminary information about the PMBoG project and its aims. A toolbar at the bottom of each page gives the user access to the ILE and its other windows/pages. The toolbar is not identical across pages: it is customized to the page on which it is placed, with the aim of providing a fully immersive interaction between the user and the computer model.
The task assigned to the player is framed as depicted in Figure 2.
As mentioned previously, while in broad terms the PMBoG aims to teach and transfer PM competencies, the game created within it addresses a specific domain, i.e., the organization of a wedding. The description of the player’s task included on this page of the ILE immediately clarifies that the final score of the game will depend on the player’s ability to reach all the aforementioned goals.
The player is subsequently required to look for more “Detailed instructions”, i.e., a list of the main steps that they have to go through while playing the game, as follows:
  • Step 1: Choose the couple you are willing to play with and analyze carefully what that couple wishes (“Couples and cards” page). Then, “initialize” the game.
  • Step 2: Check what “special need” you will be required to address (“Couples and cards” page).
  • Step 3: Check which utility cards (e.g., an additional worker or a good musician) will be available during the game (“Couples and cards” page).
  • Step 4: Check which options are available for each category and typology of tasks (“Task options pages”: purple, yellow, and red) and start planning them. Note that only for the first round of action, you will be allowed to plan the tasks in advance of all the other activities … (through the “Planning task dashboard”).
  • Step 5: From clock time = 0, use the “Player’s Decision Dashboard” to make your decisions, such as allocating workers to specific tasks, mitigating risks, and hiring (or releasing) staff.
  • Step 6: The results of your simulation will be available across the windows of the simulator, with the final score calculated in the “Scorecard”.
This list provides a snapshot of how the game is structured and unfolds. It also makes clear that more detailed information about the game and the levers to interact with the simulation model are provided in subsequent windows of the ILE, starting from the page titled “Couples and cards” (Figure 3).
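The six steps above can also be sketched as a single session loop. All function names and state variables below are hypothetical placeholders, not the ILE's internals.

```python
# Hypothetical skeleton of one game session following the six steps above.
def play_session(decide, months=6):
    state = {"clock": -1, "budget": 10, "tasks_done": 0}  # Steps 1-3: initialize
    decide("plan", state)                                 # Step 4: planning round
    state["clock"] = 0
    while state["clock"] < months:                        # Step 5: monthly decisions
        actions = decide("act", state)
        state["tasks_done"] += actions.get("complete", 0)
        state["budget"] -= actions.get("spend", 0)
        state["clock"] += 1
    return state                                          # Step 6: read the "Scorecard"

# A trivial strategy: complete one task and spend one budget unit per month.
result = play_session(lambda phase, state: {"complete": 1, "spend": 1})
```

The key design point this mirrors is that planning happens once, before clock time 0, while operational decisions recur every simulated month.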
The “Couples and cards” page of the ILE is where the game is initialized. According to the list of steps that the user is required to go through, the player is first called upon to select the couple to play with. The user can choose between couple no. 1 (“Posh”) and couple no. 2 (“Friendly”). Each couple is represented by a card (equal to the one used in the PMBoG board game) that contains several icons and information, as described in Table 2.
The following pages of the simulator describe the task options.
Task cards are the focal point of the game and represent activities that must be completed to successfully organize the wedding. There are three typologies of cards, i.e., purple, yellow, and red.
As an example, Figure 4 displays the cards and options for the purple tasks.
The colors are used to highlight the order in which the tasks should be completed: purple ones come first, yellow ones second, and red ones last. Accordingly, each of the six main tasks that the user must complete, i.e., choosing the restaurant, the location, the dress, the announcements, the rings, and the photographer, is assigned a color.
Notably, in the PMBoG SD model, tasks are represented as an aggregate variable for reporting purposes and are also disaggregated into more detailed development phases for each of the main categories of tasks assigned to the wedding planner.
As an example, Figure 5 portrays a partial representation of the structure devoted to the management of “Restaurant no. 1” (i.e., REST 1).
During the game, all the task-related options are randomly drawn and made available, and each is identified by its color: restaurants and locations are purple tasks; dress and announcement tasks are yellow; and ring and photographer cards are red. For more straightforward identification, the various tasks are also associated with distinctive icons, as displayed in Table 3.
A set of traffic light icons (red, yellow, and green colors are visualized) is used to immediately communicate if a specific task is available (“A”), planned (“P”), and/or completed (“C”). Notably, each card is displayed on this page (and on the other pages devoted to presenting task card options) with several symbols, as described in Table 4.
Using a visualization scheme applied to all the task cards, each card displays specific information about that particular task, e.g., the amount of workload required to complete it (one grey square box means one unit of workload), its cost (one banknote equals EUR 1000), and the score associated with it (one star icon equals one point scored at the end of the game).
Additionally, a bar is included in the ILE to highlight the “clock TIME”, i.e., the time of the simulation run (e.g., clock time = 2 would mean that the game is currently at the end of month 2).
Each card has a front side and a rear side (accessible through specific command buttons placed on the bottom right corner of the screen). An example of a page from the ILE that is devoted to presenting the rear side of the task cards is displayed in Figure 6.
The “rear sides” of the task cards are explicitly dedicated to analyzing the “risks” that are associated with such tasks and that loom over the player. Risks represent a core element of any project management intervention and must be faced, mitigated, managed and, when possible, prevented and avoided.
Specifically, the ILE game allows for risk mitigation by investing in risk prevention activities, as we will discuss subsequently. On its rear side, each card displays the typology of risk that could occur, its probability, and the investment needed to mitigate such risk, thereby changing its status from a normal risk to a lowly mitigated or even a highly mitigated risk.
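The information carried by a task card (front and rear side) can be sketched as a small data structure. The field names, units, and mitigation thresholds below are illustrative assumptions; only the unit conventions (one grey box = 1 workload unit, one banknote = EUR 1000, one star = 1 point) come from the ILE's description.

```python
from dataclasses import dataclass

# Hypothetical encoding of a task card. The front side carries workload,
# cost, and score; the rear side carries the risk data. The mitigation
# thresholds below are illustrative, not the game's actual values.
@dataclass
class TaskCard:
    color: str                  # "purple", "yellow", or "red"
    workload: int               # grey boxes (1 box = 1 workload unit)
    cost_keur: int              # banknotes (1 banknote = EUR 1000)
    score: int                  # stars (1 star = 1 point)
    risk_probability: float     # rear side: probability of the risk occurring
    mitigation_invested_keur: int = 0

    def risk_level(self):
        """Classify the card's risk status after mitigation investments."""
        if self.mitigation_invested_keur >= 2:
            return "highly mitigated"
        if self.mitigation_invested_keur >= 1:
            return "lowly mitigated"
        return "normal"

card = TaskCard("purple", workload=3, cost_keur=2, score=2, risk_probability=0.3)
```

Investing in prevention then simply moves the card along the normal → lowly mitigated → highly mitigated scale described above.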
To start the game by planning and executing tasks, the user is required to choose task cards that are randomly made available (see the green light associated with icon A, i.e., available card). The user has one month of preparation before starting the game; that is to say, the first step of the simulation is considered to be the planning phase of the project.
Any planning decision is taken through the “Planning Task Dashboard” (Figure 7).
All six categories of task options and all the task colors are listed on this page. Slider bars are at the user’s disposal to choose among the available options, as displayed for each category of tasks on other pages of the ILE and as also made evident by the traffic light-like icons portrayed on this page of the game.
Task planning status- and task completion status-related icons are portrayed as well. The user advances the game (i.e., the simulation) by 1 month each time they press the command button “ADVANCE Simulation”. A new game can be started by pressing the command button “Start New Run”.
When a task is planned, it must be subsequently carried out.
All the key levers to complete a task (purple, yellow, and red tasks, as well as tasks associated with utility cards) by assigning the workforce to it and mitigating risks are available on the ILE page named “Player’s Decision Dashboard” (Figure 8).
Through the player’s decision dashboard, the user can make a series of decisions, as described in Table 5.
As already exemplified by Figure 5, the decisions made by the players are meant to complete the tasks necessary to organize the wedding on time, with the given budget, using the available workforce, and according to the couple’s wishes.
All these actions are made possible by relying on partial structures included in the SD model, such as the following one that exemplifies how workforce management is modelled within the ILE (Figure 9).
On the “Scorecard” (Figure 10), the user can check whether they successfully managed all the tasks.
Even though the scorecard is updated continuously during the game, the final score will be correctly calculated only after 6 months of simulation.
In more detail, the total points scored by a player in a game are given by a combination of partial scores, as described in Table 6.
Since the user initially chooses one of the two couples available for the game, the scorecard subsequently shows the fully calculated final score only for that specific couple. The final score and the scorecard make clear that the user’s performance depends on a combination of indicators and actions, i.e., on the ability to complete all the tasks on time, according to the wishes of the chosen couple, properly using the budget at the user’s disposal, and properly preventing and facing various risks.
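As a sketch, the aggregation logic amounts to summing partial scores. The component names and values below are illustrative (they loosely echo the example discussed in Section 4) and do not reproduce Table 6.

```python
# Hypothetical scorecard aggregation: the final score is the sum of the
# partial scores. Component names and values are illustrative only.
def final_score(partials):
    return sum(partials.values())

example = {
    "tasks_not_completed_on_time": -6,  # penalty component
    "preferred_category": +6,
    "liked_categories": +3,
    "utility_card": +1,
    "budget_left": +3,
}
```

Penalties and rewards thus offset each other, which is why a player can complete only part of the tasks and still end with a positive total.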
For a better understanding of the simulation runs and the overall experience, the ILE provides an additional page (Figure 11) dedicated to presenting a selection of graphs and data.
This page includes graphs that are useful both during the game (to visualize the outcomes of the actions already carried out and understand how to take further action) and after the game.
The list of graphs provided on this page of the ILE is described in Table 7.
These graphs allow for analyzing the “behavior mode” of a specific (or more) variable(s), and for this reason, are named “behavior over time graphs”. As explained by Ford [79] “Behavior modes are typically displayed graphically using behavior-over-time graphs (BOTG), where time is represented on the X-axis and values of the variables are represented on the Y-axis”.
Such graphs are fundamental not only to inspect the outcome of a specific game but also to understand the relationship between the dynamics shown for the various variables and the underlying systemic structure that is impacted by the user through their decisions.
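The BOTG data the ILE exposes can be thought of as time series, one per variable, from which a behavior mode can be read. The sketch below uses hypothetical variables and a deliberately crude classifier, just to make the "behavior mode" idea concrete.

```python
# Behavior-over-time data for hypothetical variables: each series is a list
# of (time, value) pairs, i.e., time on the X-axis and values on the Y-axis,
# plus a crude classifier for the behavior mode of a recorded series.
def classify_mode(values):
    diffs = [b - a for a, b in zip(values, values[1:])]
    if all(d > 0 for d in diffs):
        return "growth"
    if all(d < 0 for d in diffs):
        return "decline"
    return "mixed"

botg = {"Workforce": [(t, 2 + t) for t in range(7)],
        "Budget left": [(t, 10 - t) for t in range(7)]}
```

Relating such modes back to the stock-and-flow structures that generate them is exactly the reflective step the ILE's graph page is meant to support.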

4. Results

This section reports some preliminary results recorded during a simulation session in which doctoral students interacted with the SD-based ILE. Specifically, the following graphs show the results of one simulation played during the session selected for this article. These graphs and data are meant only to provide an example and highlight some insights.
In detail, Figure 12 displays the planning tasks dashboard, and Figure 13 portrays the player’s decision dashboard for this player.
The following two figures show the results for the player we are referring to; specifically, Figure 14 shows the scorecard, while Figure 15 portrays the BOTG graphs from this specific simulation run.
This specific player (coded as player no. 2 in the simulation session) was able to plan the tasks but was not entirely efficient in carrying out all the activities needed to complete the six tasks (only four out of six were completed within the given time horizon, thus generating a score of -6 for this category). Positive scores in the scorecard are the result of actions completed for a preferred category (+6), liked categories (+1 and +2), and the utility card (+1). In terms of budget allocation, the scorecard clarifies that the player had a budget of 2K left (+3).
Further analyzing the simulation session we are referring to, Table 8 reports the players’ feedback (summarized) about their playing experience, specifically in terms of the strategy they developed and applied during the game. Table 8 also shows the final score for each player and if they incurred specific risks during their game.

5. Discussion

According to the research design we presented in Section 3 of this paper, players’ performances and the subsequent acquisition of critical thinking skills [1,5,37,38] were assessed in three different ways.
First, players’ performances were analyzed through a scoring system embedded in the game that considers several performance dimensions. As we mentioned in the second section of this article, and as highlighted by other studies (e.g., [46]), scoring systems may be useful to evaluate whether and to what extent participants use critical thinking skills to complete the tasks they are assigned. In this research, the final score shown within the ILE provides a synthetic evaluation of the player’s overall performance, while the partial scores reflect their ability to successfully complete specific tasks on time, within the budget at their disposal, and in consideration of the couple’s requests. Considering the details included in Table 6, and to provide an example, player no. 2 (whose scores are presented in Figure 14) was partially effective in completing tasks associated with the couple’s requirements and wish list, thereby demonstrating the capacity to detect relevant information when planning a future course of action and to select appropriate options among those becoming available during the simulation.
Notably, none of the players involved in the simulation session considered for this study achieved the full score, even though all of them scored positive points. On the one hand, this demonstrates the ability of the players to interact with the system dynamics [34,35] model and the SD-based ILE [20,21,26,28] to manage the tasks they were assigned and develop appropriate policies (long advocated by authors such as Morecroft [26]); on the other hand, it shows that projects (such as the one considered in this work, i.e., the organization of a wedding ceremony) can be quite complex and challenging, thereby calling for a systemic perspective, as clearly emphasized by the relevant literature (e.g., Sterman [59] and Rumeser et al. [70]). Both these aspects point to the use of ILEs for educational purposes, specifically to assist learners in analyzing complex and dynamic domains and developing policy-making skills, a concept at the center of the academic debate within the SD community (e.g., see the works by Morecroft [26], Alessi and Kopainsky [24], and Davidsen [28]).
Second, feedback from the players taking part in the simulation session was collected and analyzed. As already stated, the research design entailed interviewing players and asking them to describe their learning experience, the strategies they used during the game, and the perceived learning points of the experience, as reported in Table 8. Players’ feedback was subsequently investigated using basic content analysis (according to the suggestions provided by Krippendorff [74]) to identify particularly relevant terms or concepts.
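A first-pass content-analysis step of this kind can be sketched as simple keyword counting over the interview answers. The keyword set and the helper below are illustrative assumptions and do not reproduce the coding scheme actually used in the study.

```python
from collections import Counter
import re

# Minimal term-frequency pass in the spirit of basic content analysis:
# count analyst-chosen keywords across a set of interview answers.
def term_frequencies(answers, keywords):
    counts = Counter()
    for text in answers:
        tokens = re.findall(r"[a-z']+", text.lower())
        counts.update(tok for tok in tokens if tok in keywords)
    return counts

feedback = ["I had to find several solutions to manage my staff and the costs",
            "a fight against time"]
freq = term_frequencies(feedback, {"time", "costs", "staff", "solutions"})
```

Frequencies of this kind would then be read back against the interview context, since content analysis proper requires interpreting terms within their surrounding discourse rather than counting alone.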
As an example, the following excerpt emphasizes one of the strengths of this approach, specifically centered on the use of “games” embedded in an ILE (e.g., [16,26,27]):
“I never thought about project management in this way. I’ve been used to reading books and calculating gaps and this experience was rather new and entertaining. I really liked it”.
This excerpt is particularly significant for demonstrating one of the main strengths related to the use of interviews in a research context, that is to say, to provide insights into how students approach problems, make decisions, and evaluate arguments. As we already pointed out, this is also emphasized by other studies, such as the work by Jaffe et al. [47].
Another excerpt from the interviews further clarifies what the players actually did during the gaming experience, specifically emphasizing the behavioral side of their learning experience, something stressed by several authors (e.g., Lane [31], Barnabè and Davidsen [32], and Lane and Rouwette [33]):
“Of course, it was a game, but I wanted badly to perform and complete my assignment. It was a fight against time, and I had to find several solutions to manage my staff and the costs to complete the game”.
Such excerpts may also be useful for individually eliciting information about the perceived acquisition of critical skills, as discussed in other studies (e.g., Tiwari et al. [48]).
Third, players’ performances were analyzed using the five categories of structures identified by Lyneis and Ford [36] to describe project management domains and decision making in such contexts, i.e.: (a) project features; (b) the rework cycle; (c) project control; (d) ripple effects; and (e) knock-on effects.
In this regard, recall that in the PMBoG model, tasks are disaggregated into more detailed development phases for each of the main categories of tasks assigned to the wedding planner (see Figure 5) and are also represented as an aggregate variable for reporting purposes. As mentioned, project managers’ performances are typically measured in terms of schedule, cost, quality, and scope; all these performance measurement areas are included in the PMBoG ILE. Notably, most of the effort that players have to put into the game entails carefully planning the activities to carry out with the resources at their disposal (some of which become available randomly during the game) and subsequently operating to close the target–performance gap. Not all the options are always available, and the planning of such tasks is a key element of the game.
As one player underlined:
“there is a trade-off between going for the best score and trying to accomplish all the tasks”.
Analyzing the game results, typical strategies entailed expanding workforce capacity (hiring new employees or even temporary workers), changing the order of execution for specific activities (thereby missing some deadlines), or lowering the bar in terms of quality and customer satisfaction, thereby settling for less preferable options. The development of these strategies can be considered rational and coherent with what has been observed in other studies, such as Sterman [59,60]. Stated differently, the players relied substantially on the existence and use of negative (controlling) feedback loops, with responses typically being proportional to the gap size. Rework cycles and side effects were mostly generated by a limited number of causes, such as the lack of risk mitigation (which may lead to errors and rework, as detailed in the last column of Table 8) and accelerated processes (through the extensive hiring of new staff, thereby boosting salaries and costs). Interestingly, behavioral factors such as pressure and stress also played a role in the simulation, causing suboptimal performances and knock-on effects (see Table 8 for more details). The potential for using an ILE to explore the behavioral side effects of in-game simulations is confirmed by other studies, such as the work by Sterman and Dogan [29] and the study by Barnabè and Davidsen [32].
Lastly, to discuss if and how the ILE facilitated the acquisition of critical thinking skills, we have to remember what was already emphasized in the second section of this paper. In broad terms, referring again to the seminal definition provided by Scriven and Paul [5], acquiring critical thinking skills entails being able to use a range of mental processes, such as analysis, inference, evaluation, interpretation, explanation, and self-regulation. These six abilities are fundamental to supporting decision making, i.e., to arrive at well-reasoned and informed judgments about complex issues and problems, something clearly emphasized by the relevant literature [37,38].
Concerning the PMBoG ILE, we can now highlight the following.
  • The ability of analysis involves breaking down complex information into smaller parts, identifying patterns, and evaluating evidence to make a reasoned judgment. The PMBoG ILE helped the players embrace the complexity characterizing a typical project management intervention, having at their disposal various sources of information and using them to plan activities and carry out actions. In this specific context, SD principles and tools supported the players in understanding how complex patterns of events, actions, and consequences formed within the game, while also identifying the role played by critical factors such as time.
  • The ability of evaluation involves assessing the credibility and relevance of information, evaluating arguments, and determining the strengths and weaknesses of different viewpoints. The ILE provided a safe environment where all the information was continuously at the players’ disposal, performances were assessed in real time, and feedback about decisions and actions was provided to the players using several tools (graphs, scores, visual aids such as traffic lights, tables, etc.).
  • In the PMBoG ILE, the skill of inference, which involves drawing conclusions based on available evidence and making predictions about future events based on past experiences and patterns, was critically linked to the players’ ability to plan activities, carry them out, assess the gap with respect to the expected results, and adopt corrective actions, mostly according to balancing (negative) feedback-driven reasoning.
  • The ability of interpretation, which entails understanding the meaning of information, analyzing its significance, and applying it to new situations, was greatly enhanced within the PMBoG ILE. Players were called on to play the role of a wedding planner, but this role might be easily shifted to the management of a variety of other projects, all of them relying on similar underlying systemic structures.
  • The ability of explanation (which involves communicating complex ideas or concepts in a clear and concise manner, using appropriate evidence and reasoning to support arguments) was not directly evident from the results of the PMBoG ILE, but it came to life when the players were interviewed and were required to explain their strategies and experiences. The excerpts we report in Table 8 and within the text can provide examples of this.
  • Last, the skill of self-regulation, i.e., the ability to monitor one’s own thinking and behavior, recognize biases and assumptions, and adjust one’s approach based on feedback and new information, was challenged within the PMBoG ILE. Players tended to approach the game relying heavily on their mental models and backgrounds, but they also became increasingly able to exploit the experience as the game progressed.

6. Conclusions, Limitations, and Ideas for Further Research

To conclude this paper, we wish to emphasize some specific learning points that emerged from the project and relate to the methodological principles and simulation tools employed here.
The starting point is to recognize that SD-based games and ILEs can represent powerful tools to support the acquisition of critical thinking skills about complex issues.
These tools combine the methodological principles theorized by Jay Forrester [34,35] with the capabilities that technology provides for developing powerful and effective ILEs (as advocated by many scholars, such as Morecroft [26] and Davidsen [20]) to support policymakers. In this context, project management is certainly a complex domain that entails the simultaneous management of several variables and policy levers, and it therefore needs to be studied from a systemic perspective, such as that of system dynamics and systems thinking, as widely demonstrated by the extant literature in the field (e.g., the works by Sterman [59,60], Rodrigues and Bowers [61,62], Lyneis and Ford [36,63], and Lyneis et al. [69]) and long emphasized by well-known scholars such as Barry Richmond [50].
Moreover, by requiring students to actively build upon their prior knowledge, the research design we employed in this study can be considered as being consistent with the use of qualitative research methodologies to analyze learning directly in classrooms, as emphasized in Novak and Gowin [75].
Interestingly, when playing within an ILE, entertainment is also fostered. Stated differently, SD-based ILEs and SD-based games provide different-from-the-usual forms of education and training, thereby stimulating the users to make decisions and take action while having fun, as suggested by several studies such as the one by Armenia et al. [18].
Additionally, in the specific case of the PMBoG project, we would like to emphasize that the SD-based ILE is available online with open access. This allows (and will increasingly allow) the extensive participation of numerous learners, who can interact easily with the underlying SD model and experiment freely with their own decisions, thereby gaining insights into PM issues and activities.
As a last insight emerging from this project, we emphasize again that SD-based simulation models and ILEs may effectively provide a core component of any project management intervention, as Figure 16 portrays.
Specifically, Figure 16 highlights that an SD simulation model may provide the core piece of the overall project management architecture, facilitating the phases of planning and monitoring (i.e., controlling) and thereby enhancing performance and learning.
Of course, this work is not without limitations.
While the modelling phases of the PMBoG project have concluded, classroom use of the SD-based ILE has only just begun. The limited results and analysis available at this stage are, therefore, the main limitations of this research.
In any case, this will allow for further extensive research and experience in classrooms and with a higher number of learners, while also employing different evaluation approaches and methods, such as adopting a pre-test/post-test experimental set-up [80] or employing a more quantitative approach to assessing the acquisition of critical thinking skills, for example using the California Critical Thinking Skills Test or the Watson–Glaser Critical Thinking Appraisal. The use of these tests is exemplified by other studies in the field [42,43].
Extending our research would also allow a better understanding of the conditions under which systems thinking- and system dynamics-based tools and games may support decision making and enhance the acquisition and/or development of critical thinking skills in knowledge-specific domains, such as the project management field or other contexts.

Author Contributions

This article can be considered the result of shared research among the authors. Conceptualization, methodology, software, and Writing-original draft etc.: F.B., S.A., S.N. and A.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Erasmus+ KA project called “Project Management Board Game” (PMBoG), project reference no. 2020-1-IT02-KA204-079724.

Data Availability Statement

Data and information about the PMBoG project can be retrieved at https://www.pmbog.eu/ (accessed on 13 November 2023). The PMBoG ILE is available online at the following address: https://exchange.iseesystems.com/public/barnaf/pmbog-ile/index.html#page1 (accessed on 13 November 2023).

Acknowledgments

The authors sincerely thank all the partners who cooperated in the different stages of the PMBoG project, thereby sharing ideas, comments, and suggestions.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Dewey, J. How We Think; D.C. Heat & Co. Publishers: Boston, MA, USA, 1910. [Google Scholar]
  2. Huitt, W. Critical thinking: An overview. Educ. Psychol. Interact. 1998, 3, 34–50. [Google Scholar]
  3. Halpern, D.F. Thought and Knowledge: An Introduction to Critical Thinking; Psychology Press, Taylor & Francis Group: New York, NY, USA; London, UK, 2013. [Google Scholar]
  4. Daniel, M.F.; Auriac, E. Philosophy, critical thinking and philosophy for children. Educ. Philos. Theory 2011, 43, 415–435. [Google Scholar] [CrossRef]
  5. Scriven, M.; Paul, R. Critical thinking. In Proceedings of the 8th Annual International Conference on Critical Thinking and Education Reform, Summer, CA, USA; 1987; Volume 7. [Google Scholar]
  6. Bezanilla, M.J.; Fernández-Nogueira, D.; Poblete, M.; Galindo-Domínguez, H. Methodologies for teaching-learning critical thinking in higher education: The teacher’s view. Think. Ski. Creat. 2019, 33, 100584. [Google Scholar]
  7. Gosner, W. “Critical Thinking”. Encyclopedia Britannica, 21 September 2023. Available online: https://www.britannica.com/topic/critical-thinking (accessed on 1 November 2023).
  8. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development; Prentice-Hall: Englewood Cliffs, NJ, USA, 1984. [Google Scholar]
  9. Hamilton, J.G.; Klebba, J.M. Experiential learning: A course design process for critical thinking. Am. J. Bus. Educ. 2011, 4, 1. [Google Scholar]
  10. Potts, B. Strategies for teaching critical thinking. Pract. Assess. Res. Eval. 1994, 4, 3. [Google Scholar]
  11. Castellano, J.F.; Lightle, S.; Baker, B. A strategy for teaching critical thinking: The sellmore case. Manag. Account. Q. 2017, 18, 1–10. [Google Scholar]
  12. Cicchino, M.I. Using game-based learning to foster critical thinking in student discourse. Interdiscip. J. Probl.-Based Learn. 2015, 9, 4. [Google Scholar] [CrossRef]
  13. McDonald, S.D. Enhanced critical thinking skills through problem-solving games in secondary schools. Interdiscip. J. E-Ski. Lifelong Learn. 2017, 13, 79–96. [Google Scholar] [CrossRef]
  14. Efendi, A. Improve critical thinking skills with informatics educational games. J. Educ. Technol. 2022, 6, 521–530. [Google Scholar]
  15. Salas, E.; Wildman, J.L.; Piccolo, R.F. Using simulation-based training to enhance management education. Acad. Manag. Learn. Educ. 2009, 8, 559–573. [Google Scholar]
  16. Lane, D.C. On a resurgence of management simulations and games. J. Oper. Res. Soc. 1995, 46, 604–625. [Google Scholar]
  17. Barnabè, F. Policy Deployment and Learning in Complex Business Domains: The Potentials of Role Playing. Int. J. Bus. Manag. 2016, 11, 15–29. [Google Scholar] [CrossRef]
  18. Armenia, S.; Barnabè, F.; Ciobanu, N.; Kulakowska, M. Interactive “Boardgame-Based” Learning Environments for Decision-makers’ Training in Managerial Education, 2020 White Paper. Available online: https://www.pmbog.eu/wp-content/uploads/2021/10/Interactive-Learning-Environments-for-education.pdf (accessed on 13 November 2023).
  19. Atkinson, R.K.; Renkl, A. Interactive example-based learning environments: Using 6 interactive elements to encourage effective processing of worked examples. Educ. Psychol. Rev. 2007, 19, 375–386. [Google Scholar] [CrossRef]
  20. Davidsen, P.I. Issues in the design and use of system-dynamics-based interactive learning environments. Simul. Gaming 2000, 31, 170–177. [Google Scholar] [CrossRef]
  21. Spector, J.M.; Davidsen, P.I. Constructing learning environments using system dynamics. J. Coursew. Eng. 1998, 1, 5–11. [Google Scholar]
  22. Papert, S. Mindstorms; Basic Books Inc.: New York, NY, USA, 1980. [Google Scholar]
  23. Schön, D. The Reflective Practitioner; Basic Books Inc.: New York, NY, USA, 1983. [Google Scholar]
  24. Alessi, S.; Kopainsky, B. System dynamics and simulation/gaming: Overview. Simul. Gaming 2015, 46, 223–229. [Google Scholar] [CrossRef]
  25. Kopainsky, B.; Alessi, S.M. Effects of structural transparency in system dynamics simulators on performance and understanding. Systems 2015, 3, 152–176. [Google Scholar] [CrossRef]
  26. Morecroft, J.D.W. System dynamics and microworlds for policymakers. Eur. J. Oper. Res. 1988, 35, 301–320. [Google Scholar]
  27. Sterman, J.D. Teaching Takes Off: Flight Simulators for Management Education. OR/MS Today 1992, 19, 40–44. [Google Scholar]
  28. Davidsen, P.I.; Spector, J.M. Critical reflections on system dynamics and simulation/gaming. Simul. Gaming 2015, 46, 430–444. [Google Scholar] [CrossRef]
  29. Sterman, J.D.; Dogan, G. “I’m not hoarding, I’m just stocking up before the hoarders get here.”: Behavioral causes of phantom ordering in supply chains. J. Oper. Manag. 2015, 39, 6–22. [Google Scholar]
  30. Kunc, M.; Malpass, J.; White, L. (Eds.) Behavioral Operational Research: Theory, Methodology and Practice; Palgrave Macmillan: London, UK, 2016. [Google Scholar]
  31. Lane, D.C. ‘Behavioural System Dynamics’: A very tentative and slightly sceptical map of the territory. Syst. Res. Behav. Sci. 2017, 34, 414–423. [Google Scholar] [CrossRef]
  32. Barnabè, F.; Davidsen, P.I. Exploring the potentials of behavioral system dynamics: Insights from the field. J. Model. Manag. 2020, 15, 339–364. [Google Scholar]
  33. Lane, D.C.; Rouwette, E.A. Towards a behavioural system dynamics: Exploring its scope and delineating its promise. Eur. J. Oper. Res. 2023, 306, 777–794. [Google Scholar]
  34. Forrester, J.W. Industrial Dynamics; The MIT Press: Cambridge, MA, USA, 1961. [Google Scholar]
  35. Forrester, J.W. Principle of Systems; The MIT Press: Cambridge, MA, USA, 1968. [Google Scholar]
  36. Lyneis, J.M.; Ford, D.N. System dynamics applied to project management: A survey, assessment, and directions for future research. Syst. Dyn. Rev. 2007, 23, 157–189. [Google Scholar]
  37. Turner, P. Critical thinking in nursing education and practice as defined in the literature. Nurs. Educ. Perspect. 2005, 26, 272–277. [Google Scholar]
  38. Bers, T. Assessing critical thinking in community colleges. New Dir. Community Coll. 2005, 130, 15–25. [Google Scholar] [CrossRef]
  39. Facione, P.A. Critical thinking: What it is and why it counts. Insight Assess. 2011, 1, 1–23. [Google Scholar]
  40. Behar-Horenstein, L.S.; Niu, L. Teaching critical thinking skills in higher education: A review of the literature. J. Coll. Teach. Learn. 2011, 8, 25–42. [Google Scholar] [CrossRef]
  41. Alsaleh, N.J. Teaching Critical Thinking Skills: Literature Review. Turk. Online J. Educ. Technol.-TOJET 2020, 19, 21–39. [Google Scholar]
  42. Bernard, R.M.; Zhang, D.; Abrami, P.C.; Sicoly, F.; Borokhovski, E.; Surkes, M.A. Exploring the structure of the Watson–Glaser Critical Thinking Appraisal: One scale or many subscales? Think. Ski. Creat. 2008, 3, 15–22. [Google Scholar] [CrossRef]
  43. Alkharusi, H.A.; Al Sulaimani, H.; Neisler, O. Predicting Critical Thinking Ability of Sultan Qaboos University Students. Int. J. Instr. 2019, 12, 491–504. [Google Scholar] [CrossRef]
  44. Coleman, H.; Rogers, G.; King, J. Using portfolios to stimulate critical thinking in social work education. Soc. Work Educ. 2002, 21, 583–595. [Google Scholar] [CrossRef]
  45. Siles-González, J.; Solano-Ruiz, C. Self-assessment, reflection on practice and critical thinking in nursing students. Nurse Educ. Today 2016, 45, 132–137. [Google Scholar] [CrossRef]
  46. Kankaraš, M.; Suarez-Alvarez, J. Assessment Framework of the OECD Study on Social and Emotional Skills; OECD Publishing: Paris, France, 2019. [Google Scholar]
  47. Jaffe, L.E.; Lindell, D.; Sullivan, A.M.; Huang, G.C. Clear skies ahead: Optimizing the learning environment for critical thinking from a qualitative analysis of interviews with expert teachers. Perspect. Med. Educ. 2019, 8, 289–297. [Google Scholar] [CrossRef] [PubMed]
  48. Tiwari, A.; Lai, P.; So, M.; Yuen, K. A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med. Educ. 2006, 40, 547–554. [Google Scholar] [CrossRef] [PubMed]
  49. Zepeda, S.J. Classroom-based assessments of teaching and learning. In Evaluating Teaching: A Guide to Current Thinking and Best Practice; Corwin Press, Sage: Thousand Oaks, CA, USA, 2006; Volume 2. [Google Scholar]
  50. Richmond, B. Systems thinking: Critical thinking skills for the 1990s and beyond. Syst. Dyn. Rev. 1993, 9, 113–133. [Google Scholar] [CrossRef]
  51. Faria, A.J. Business simulation games: Current usage levels—An update. Simul. Gaming 1998, 29, 295–308. [Google Scholar] [CrossRef]
  52. Faria, A.J.; Hutchinson, D.; Wellington, W.J.; Gold, S. Developments in business gaming: A review of the past 40 years. Simul. Gaming 2009, 40, 464–487. [Google Scholar] [CrossRef]
  53. Armenia, S.; Barnabè, F.; Pompei, A. Game-Based Learning and Decision-Making for Urban Sustainability: A Case of System Dynamics Simulations. In EURO Working Group on DSS: A Tour of the DSS Developments over the Last 30 Years; Papathanasiou, J., Zaraté, P., Freire de Sousa, J., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 325–344. [Google Scholar]
  54. Klabbers, J.H. Social problem solving: Beyond method. In Back to the Future of Gaming; Duke, R.D., Kriz, W.C., Eds.; Bertelsmann Verlag GmbH & Co. KG: Bielefeld, Germany, 2014; pp. 12–29. [Google Scholar]
  55. Palmunen, L.M.; Pelto, E.; Paalumäki, A.; Lainema, T. Formation of novice business students’ mental models through simulation gaming. Simul. Gaming 2013, 44, 846–868. [Google Scholar] [CrossRef]
  56. Wellington, W.J.; Faria, A.J.; Whiteley, T.R. Holistic cognitive strategy in a computer-based marketing simulation game: An investigation of attitudes towards the decision-making process. In Developments in Business Simulation and Experiential Learning: Proceedings of the Annual ABSEL Conference, Hawaii; 1998; Volume 25, pp. 246–252. [Google Scholar]
  57. Meadows, D. A brief and incomplete history of operational gaming in system dynamics. Syst. Dyn. Rev. 2007, 23, 199–203. [Google Scholar] [CrossRef]
  58. Vennix, J.A.M. Group Model Building: Facilitating Team Learning Using System Dynamics; Wiley: Chichester, UK, 1996. [Google Scholar]
  59. Sterman, J.D. Business Dynamics: Systems Thinking and Modeling for a Complex World; Irwin McGraw-Hill: Boston, MA, USA, 2000. [Google Scholar]
  60. Sterman, J.D. System Dynamics Modeling for Project Management; System Dynamics Group, Sloan School of Management, Massachusetts Institute of Technology: Cambridge, MA, USA, 1992; Unpublished manuscript. [Google Scholar]
  61. Rodrigues, A.; Bowers, J. System dynamics in project management: A comparative analysis with traditional methods. Syst. Dyn. Rev. 1996, 12, 121–139. [Google Scholar] [CrossRef]
  62. Rodrigues, A.; Bowers, J. The role of system dynamics in project management. Int. J. Proj. Manag. 1996, 14, 213–220. [Google Scholar] [CrossRef]
  63. Ford, D.N.; Lyneis, J.M. System Dynamics Applied to Project Management: A Survey, Assessment, and Directions for Future Research. In System Dynamics; Encyclopedia of Complexity and Systems Science Series; Dangerfield, B., Ed.; Springer: New York, NY, USA, 2020; pp. 285–314. [Google Scholar]
  64. Turner, R. Projects and their management. In Gower Handbook of Project Management; Routledge: London, UK, 2016; pp. 49–64. [Google Scholar]
  65. Leotta, A. Capitalizing and controlling development project as joint translations. The mediating role of the information technology. Manag. Control 2015, 2, 101–134. [Google Scholar] [CrossRef]
  66. Lock, D. Project Management; Routledge: London, UK, 2020. [Google Scholar]
  67. Pinto, J.K.; Mantel, S.J. The causes of project failure. IEEE Trans. Eng. Manag. 1990, 37, 269–276. [Google Scholar] [CrossRef]
  68. Al-Ahmad, W.; Al-Fagih, K.; Khanfar, K.; Alsamara, K.; Abuleil, S.; Abu-Salem, H. A taxonomy of an IT project failure: Root causes. Int. Manag. Rev. 2009, 5, 93–104. [Google Scholar]
  69. Lyneis, J.M.; Cooper, K.G.; Els, S.A. Strategic management of complex projects: A case study using system dynamics. Syst. Dyn. Rev. 2001, 17, 237–260. [Google Scholar] [CrossRef]
  70. Rumeser, D.; Emsley, M. Key challenges of system dynamics implementation in project management. Procedia-Soc. Behav. Sci. 2016, 230, 22–30. [Google Scholar] [CrossRef]
  71. Rodrigues, A.G.; Williams, T.M. System dynamics in project management: Assessing the impacts of client behaviour on project performance. J. Oper. Res. Soc. 1998, 49, 2–15. [Google Scholar] [CrossRef]
  72. Van Daalen, C.; Schaffernicht, M.; Mayer, I. System dynamics and serious games. In Proceedings of the 32nd International Conference of the System Dynamics Society, Delft, The Netherlands, 20–24 July 2014. [Google Scholar]
  73. Schaller, M.D.; Gencheva, M.; Gunther, M.R.; Weed, S.A. Training doctoral students in critical thinking and experimental design using problem-based learning. BMC Med. Educ. 2023, 23, 579. [Google Scholar] [CrossRef]
  74. Krippendorff, K. Content Analysis: An Introduction to Its Methodology; SAGE Publications: Beverly Hills, CA, USA, 1980. [Google Scholar]
  75. Novak, J.D.; Gowin, D.B. Learning How to Learn; Cambridge University Press: Cambridge, UK, 1984. [Google Scholar]
  76. Videira, N.; Antunes, P.; Santos, R.; Lopes, R. A participatory modelling approach to support integrated sustainability assessment processes. Syst. Res. Behav. Sci. 2010, 27, 446–460. [Google Scholar] [CrossRef]
  77. Senge, P.M.; Forrester, J.W. Tests for building confidence in system dynamics models. Syst. Dyn. TIMS Stud. Manag. Sci. 1980, 14, 209–228. [Google Scholar]
  78. Barlas, Y. Formal aspects of model validity and validation in system dynamics. Syst. Dyn. Rev. 1996, 12, 183–210. [Google Scholar] [CrossRef]
  79. Ford, D. A System Dynamics Glossary. Syst. Dyn. Rev. 2019, 35, 369–379. [Google Scholar] [CrossRef]
  80. Campbell, D.T.; Stanley, J.C.; Gage, N.L. Experimental and Quasi-Experimental Designs for Research; Houghton Mifflin: Boston, MA, USA, 1963. [Google Scholar]
Figure 1. Basic structures of project management interventions. Source: Lyneis and Ford [36] (p. 165).
Figure 2. “Your task” page of the SD-based PMBoG ILE.
Figure 3. “Couples and cards” page of the SD-based PMBoG ILE.
Figure 4. “Purple Task options” page (front side) of the SD-based PMBoG ILE.
Figure 5. Structure from the SD model representing the management actions related to restaurant no. 1.
Figure 6. “Purple Task options” page (rear side) of the SD-based PMBoG ILE.
Figure 7. “Planning Tasks Dashboard” page of the SD-based PMBoG ILE.
Figure 8. “Player’s Decision Dashboard” page of the SD-based PMBoG ILE.
Figure 9. Structure from the SD model representing workforce management.
Figure 10. “Scorecard” page of the SD-based PMBoG ILE.
Figure 11. “Graphs” page of the SD-based PMBoG ILE.
Figure 12. “Planning Tasks Dashboard” page for one player after the game.
Figure 13. “Player’s Decision Dashboard” page for one player after the game.
Figure 14. “Graphs” (partial picture) page for one player after the game.
Figure 15. “Scorecard” page for one player after the game.
Figure 16. Role of an SD model within a project management-related architecture.
Table 1. Choices in the PMBoG game design.
Choice | PMBoG ILE
1. Purpose: Manage a project by making decisions, planning activities, carrying out actions, measuring results, and learning.
2. Insights obtained: Context-specific competencies (in the field of PM) and systems thinking skills (generally transferrable).
3. Plot: Planning of a wedding ceremony.
4. Player(s): Various categories of learners.
5. Role(s): Wedding planner.
6. Objective in-game/incentive: Complete all the tasks on time, within the budget at the user’s disposal, and fulfilling all the requests; subsequently, achieve the highest score possible.
7. Rules: Budget constraints; time constraints; capacity constraints.
8. Representation of physical system: Fictitious, computer-based, in the form of an ILE.
9. Representation of inter-actor environment: Not present (single-player online game).
Table 2. Description of a “couple card”.
Couple Card | The Information Displayed on the Card
[couple card image] A = Name and image of the couple;
B = Preferred symbols;
C = Liked symbols;
D = Disliked symbols;
E = Perfect marriage combination.
Table 3. Icons associated with task cards.
Task Card Icon | Task Card Typology
[icon] Restaurant
[icon] Location
[icon] Dress
[icon] Announcements
[icon] Rings
[icon] Photographer
Table 4. Description of a “task card”.
Task Card | Information Displayed by the Card
[task card image] A = Card color;
B = Card task symbol;
C = Task cost;
D = Task risk flavor;
E = Task risk number;
F = Workload required;
G = Requested symbols;
H = Pre-requisites.
Table 5. Description of the decisions included in the player’s decision dashboard.
Decision | Decision Levers in the ILE | Information

Allocate the staff to selected tasks
Levers: Player’s decision REST workload; Player’s decision LOCATION workload; Player’s decision DRESS workload; Player’s decision ANNOUN workload; Player’s decision RINGS workload; Player’s decision PHOTOG workload.
The user specifies the amount of workload they want to assign to each specific task for that specific month. Each task to be completed requires a specific workload, as detailed on the front sides of the task cards. The total amount of workload that can be assigned to the six tasks is constrained by the staff at the user’s disposal (i.e., by the total workload available).

Mitigate risks
Levers: Player’s decision REST risk mitigation; Player’s decision LOCATION risk mitigation; Player’s decision DRESS risk mitigation; Player’s decision ANNOUN risk mitigation; Player’s decision RINGS risk mitigation; Player’s decision PHOTOG risk mitigation.
The user can mitigate the various risks associated with the six tasks of the game by investing money. Mitigating a risk reduces the probability that it will materialize. The amount of money to be spent to this aim is detailed on the rear sides of the task cards.

Hire junior workers
Lever: Juniors to HIRE.
The user can increase their staff by hiring new employees. Such workers are “juniors”, i.e., less productive than seniors. After a delay, the juniors become more experienced, i.e., they become seniors.

Release seniors
Lever: Seniors to RELEASE.
The user can reduce their staff by releasing senior workers. This reduces the staff and the total workload at the user’s disposal, but it also reduces costs.

Activate temporary workers
Lever: Temporary worker.
If in need of more staff (and quickly), the user can activate a temporary worker. This employee works on the project for one month only and is then automatically released.

Work on utility card-related activities
Levers: Workload on utility card 1; Workload on utility card 2; Workload on utility card 3.
Depending on the utility card randomly assigned to the user at the beginning of the game, the user will have to assign workload to the corresponding task. This action is constrained by the staff at the user’s disposal (i.e., by the total workload available).
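The staffing decisions above (hiring juniors who mature into seniors, releasing seniors, and activating one-month temporary workers) can be sketched as a simple stock-and-flow update. The maturation delay, the cohort bookkeeping, and all names below are illustrative assumptions, not the ILE's actual model; the productivity figures (1.5 for juniors and temporary workers, 3 for seniors) come from the article's own description of the game.

```python
# Minimal stock-and-flow sketch of the game's workforce structure.
# MATURATION_DELAY is an assumed parameter; the ILE may use a different value.
MATURATION_DELAY = 3  # months before a junior becomes a senior (assumed)

def step_workforce(junior_cohort, seniors, hires, releases, temp_active):
    """Advance the workforce stocks by one month.

    junior_cohort: list of months remaining until each junior matures
    seniors: current number of senior workers
    Returns the updated cohort, the senior count, and the potential workload.
    """
    # Age the junior cohort; matured juniors flow into the senior stock
    aged = [m - 1 for m in junior_cohort]
    matured = sum(1 for m in aged if m <= 0)
    junior_cohort = [m for m in aged if m > 0] + [MATURATION_DELAY] * hires
    seniors += matured - releases
    # Potential workload: juniors and temps produce 1.5 each, seniors 3 each
    workload = 1.5 * len(junior_cohort) + 3 * seniors + (1.5 if temp_active else 0.0)
    return junior_cohort, seniors, workload
```

This mirrors the aging-chain logic typical of SD workforce models (cf. Figure 9), where hiring adds to a junior stock that drains into the senior stock after a delay.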
Table 6. Description of the scorecard.
Points Scored Category | Information
+3 points: per request symbol on your completed task cards that matches a preferred symbol on the couple card.
+1 point: per request symbol on your completed task cards that matches a liked symbol on the couple card.
−1 point: per request symbol on your completed task cards that matches a disliked symbol on the couple card.
+5 points: perfect marriage bonus, if all the request symbols on your completed task cards match the perfect request on your couple card.
−3 points: per task type not completed/absent.
+3 points: to the player who has the most remaining coins.
−1 point: per every two debts (rounded).
Pre-requisite: the “special need” card must be fulfilled.
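One possible reading of the scorecard is sketched below as a scoring function. The function name, the input structure, and the use of floor division for the debt penalty (the game says "rounded") are assumptions for illustration, not the game's implementation.

```python
def compute_score(completed_symbols, couple, missing_task_types,
                  has_most_coins, debts):
    """Illustrative scorecard: symbol matches against the couple card,
    the perfect-marriage bonus, and the penalties/bonuses listed above."""
    score = 0
    for s in completed_symbols:        # request symbols on completed cards
        if s in couple["preferred"]:
            score += 3                 # +3 per preferred-symbol match
        elif s in couple["liked"]:
            score += 1                 # +1 per liked-symbol match
        elif s in couple["disliked"]:
            score -= 1                 # -1 per disliked-symbol match
    if set(couple["perfect"]).issubset(completed_symbols):
        score += 5                     # perfect marriage bonus
    score -= 3 * missing_task_types    # -3 per not completed/absent task type
    if has_most_coins:
        score += 3                     # most remaining coins bonus
    score -= debts // 2                # -1 point every two debts (floor here)
    return score
```

For example, matching one preferred and one liked symbol while hitting one disliked symbol, missing one task type, and carrying four debts yields 3 + 1 − 1 + 5 − 3 − 2 = 3 points when the perfect combination is also covered.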
Table 7. List of graphs available on the “Graphs” page of the SD-based PMBoG ILE.
Graph | Description
Tasks to be completed
Tasks completed counter
This graph shows the number of tasks to be completed (no. 6 at the beginning of the game) compared to the tasks completed by the user at any specific time of the simulation.
REST task completion signal (ILE)
LOCAT task completion signal (ILE)
This graph shows if the tasks associated with choosing a restaurant (i.e., REST) and/or a location (i.e., LOCAT) for the ceremony have been completed (value = 1), are partially completed (value = 0.5), or have not been worked on yet (value = 0).
These are the purple tasks in the game.
ANNOUN task completion signal (ILE)
DRESS task completion signal (ILE)
This graph shows if the tasks associated with making the announcements (i.e., ANNOUN) and/or choosing the dresses (i.e., DRESS) for the ceremony have been completed (value = 1), are partially completed (value = 0.5), or have not been worked on yet (value = 0).
These are the yellow tasks in the game.
RINGS task completion signal (ILE)
PHOTOG task completion signal (ILE)
This graph shows if the tasks associated with choosing the rings (i.e., RINGS) and/or a photographer (i.e., PHOTOG) for the ceremony have been completed (value = 1), are partially completed (value = 0.5), or have not been worked on yet (value = 0).
These are the red tasks in the game.
Budget: This graph shows the level of the budget at the user’s disposal at any time during the simulation.
The user starts the game with a budget of EUR 30,000.
The user can use more than the available budget, even though this will incur a penalization in the final score.
Junior staff / Senior staff / Temporary worker: This graph shows the staff at the user’s disposal at any time during the simulation. There are three typologies of workers:
- Junior workers, with a productivity of 1.5 and a salary of 2000 EUR/month;
- Senior workers, with a productivity of 3 (i.e., double that of a junior) and a salary of 3000 EUR/month;
- Temporary workers, with a productivity of 1.5 (equal to juniors) and a salary of 1000 EUR/month.
Potential workload available
Workload allocated by player
The potential workload available during the simulation is given by the staff (the no. of workers for each category of staff) multiplied by the employees’ productivity (each category of workers has a different productivity level).
The workload allocated by the player shows how the user uses the potential workload at their disposal to complete the tasks.
The comparison between these two variables makes it clear whether there is capacity not currently used by the player.
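The budget dynamics described above (a EUR 30,000 starting budget drained by salaries, task costs, and risk-mitigation spending, with overspending allowed but penalized) can be sketched as a monthly update. The function name, the task-cost and mitigation inputs, and the exact bookkeeping are illustrative assumptions; the salary figures are the ones listed in the table.

```python
# Illustrative monthly budget update; the ILE's actual accounting may differ.
SALARIES = {"junior": 2000, "senior": 3000, "temporary": 1000}  # EUR/month

def update_budget(budget, staff, task_costs=0, mitigation_spend=0):
    """staff: dict mapping worker type to headcount for the month."""
    payroll = sum(SALARIES[kind] * n for kind, n in staff.items())
    # The result may go negative: overspending is allowed in the game,
    # but the resulting debts are penalized in the final score
    return budget - payroll - task_costs - mitigation_spend

budget = 30000  # the game's starting budget (EUR)
budget = update_budget(budget, {"junior": 2, "senior": 1},
                       task_costs=4000, mitigation_spend=500)
```

With two juniors, one senior, EUR 4000 of task costs, and EUR 500 of risk mitigation, one month drains 7000 + 4500 = EUR 11,500 from the starting budget.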
Table 8. Players’ feedback.
Player No. | Strategy and Feedback | Additional Features in/for the Game | Final Score | Risks Incurred

Player 1: “I tried to fulfill the Posh couple’s wishes by completing all the required tasks on time and in the correct sequence. It wasn’t easy, because the options that became available in the ILE weren’t always the best, and I didn’t manage to carry out all the actions in parallel, properly using the workforce at my disposal. In the end, I also had to hire some temporary workers, because the pressure I felt to complete my task was very high.” Additional features: none. Final score: +5. Risks incurred: none.

Player 2: “I had fun and I think I did quite well, even though I couldn’t complete all the tasks required for the wedding. Maybe I’ve been worrying too much about managing the budget and staffing, and this has caused me to pay less attention to the time needed to complete the tasks.” Additional features: more planning time. Final score: +9. Risks incurred: none.

Player 3: “I completed the six types of tasks that were requested, but they weren’t the best possible ones, i.e., they weren’t exactly the ones associated with the couple I had chosen. Not all cards were perfect in that regard. Anyhow, I tried to complete my task and make the wedding happen. However, I spent all the available budget, and I must say that I was not able to mitigate the risks well. In the end, I’m happy, but I’d like to try the game again to improve my strategy.” Additional features: repeat the game with the same conditions. Final score: +10. Risks incurred: more budget needed.

Player 4: “I nearly completed the overall task, but I was not able to do so due to some mistakes I made. For example, I chose one wrong option during the planning phase, and this messed up my execution. I subsequently had to rush things and go for sub-optimal options. I definitely felt a bit of pressure to perform, and this made me further rush a couple of decisions. Overall, I think I learnt a lot, and the game pushed me to think about the wedding as a project, with all that this entails.” Additional features: play a competitive game against other players. Final score: +5. Risks incurred: rework cycle.

Player 5: “I tried to complete all the tasks, without succeeding. I understood that priorities, schedule, budget availability, resources, and time cannot always be managed simultaneously and that it is difficult to find an optimal strategy. Anyhow, I understood that the planning phase and the execution phase must be managed carefully, and all the resources employed accordingly.” Additional features: play again, relying on the knowledge gained in previous attempts. Final score: +2. Risks incurred: rework cycle.

Player 6: “I felt good playing the game. I really like simulations and games, and this is a nice example of a management-related game. Actually, I struggled to complete all the tasks, but I definitely tried to use all the options and resources at my disposal.” Additional features: none. Final score: +3. Risks incurred: none.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Barnabè, F.; Armenia, S.; Nazir, S.; Pompei, A. Critical Thinking Skills Enhancement through System Dynamics-Based Games: Insights from the Project Management Board Game Project. Systems 2023, 11, 554. https://doi.org/10.3390/systems11110554

