Article

Creating Interactive Scenes in 3D Educational Games: Using Narrative and Technology to Explore History and Culture

by Rafał Kaźmierczak 1,*, Robert Skowroński 2, Cezary Kowalczyk 1 and Grzegorz Grunwald 3

1 Department of Land Management and Geographic Information Systems, Institute of Spatial Management and Geography, Faculty of Geoengineering, University of Warmia and Mazury in Olsztyn, Prawocheńskiego 15, 10-724 Olsztyn, Poland
2 SBI Ltd., Wilczyńskiego 25E, 10-686 Olsztyn, Poland
3 Department of Geodesy, Institute of Geodesy and Civil Engineering, Faculty of Geoengineering, University of Warmia and Mazury in Olsztyn, Heweliusza 5, 10-724 Olsztyn, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(11), 4795; https://doi.org/10.3390/app14114795
Submission received: 24 April 2024 / Revised: 23 May 2024 / Accepted: 30 May 2024 / Published: 1 June 2024
(This article belongs to the Special Issue Virtual/Augmented Reality and Its Applications)

Abstract

Three-dimensional games are an indispensable tool in education and cultural transmission, offering users immersive experiences that facilitate learning through engagement and direct experience. The aim of this study was to design and implement an advanced cutscene sequencer in Unity 3D, targeted at educational and cultural games, to assist game developers in producing cinematic interludes, which are a key narrative element in games. The project methodology encompassed a detailed process of planning, design, and implementation. This involved the comprehensive use of various tools, including Unity 3D for game development, C# for scripting, Visual Studio for integrated development, Git for version control, Blender for 3D modeling, Substance Painter for texturing, and Audacity for audio editing. These tools collectively facilitated the development of a robust cutscene sequencer system designed to create engaging and dynamic narrative scenes. The project’s results indicate that the use of an advanced sequencer can significantly impact the efficiency and creativity of the game and educational material creation process, offering developers the opportunity to explore practically unlimited viewing perspectives. This tool enables the creation of rich and diverse visual experiences, which is crucial for engaging and educating players.

1. Introduction

Through the use of advanced graphics and interactivity, 3D games open new horizons for education and cultural transmission. The three-dimensional environments they offer allow users to have immersive experiences that can significantly facilitate the learning process through engagement and direct experience. Utilizing 3D educational games that leverage these properties contributes to the development of cognitive, motor, and social skills, making them an effective tool to support traditional teaching methods [1]. Research on the impact of 3D games on cultural education highlights their significant potential in conveying knowledge about various cultural aspects, both in the context of formal education and self-education [2,3]. Thus, 3D games are not only a modern form of entertainment but also an effective educational tool that can significantly enrich traditional teaching methods with new, dynamic forms of communication.
Three-dimensional educational games have evolved significantly over the years, providing new opportunities for interactive learning and cultural education. These games enable the development of cognitive, motor, and social skills, making them effective tools to support traditional teaching methods. Research highlights the significant potential of 3D games in conveying knowledge about various cultural aspects in both formal education and self-education contexts.
Despite numerous advantages, the implementation of 3D games in education and cultural transmission encounters challenges such as high production costs, the requirement for specialized equipment, and ensuring universal access, including for people with disabilities. Moreover, excessive gamification risks diminishing the educational value of the content.
This paper presents several major contributions and innovations in the development of an advanced cutscene sequencer. The sequencer introduces dynamic camera perspective changes, real-time manipulation of narrative elements, camera angles, and transitions, and seamless integration with existing game development tools. These features significantly enhance the efficiency and creativity of game developers, allowing for the creation of more immersive and engaging educational and cultural experiences. The proposed sequencer divides cutscenes into smaller, manageable segments, each encapsulating specific actions such as dialogue, sound, and camera movements. This modular approach not only simplifies the design process but also enhances reusability: developers can build a library of small cutscenes that can be easily combined and reused across different parts of the application, reducing redundancy and development time. Integration with tools such as Unity 3D, C#, and Blender enables rapid prototyping and iterative development, which is crucial for educational and cultural games, where rich and diverse visual experiences enhance learning and engagement. Finally, the modular design supports the creation of complex and dynamic narrative flows, making the sequencer well suited to crafting sophisticated cutscenes that adapt to various narrative requirements and enhance the overall storytelling capability and player experience in 3D educational games.
The aim of this paper is to design and implement an advanced cutscene sequencer in Unity 3D, specifically for educational and cultural games. This sequencer was created to significantly facilitate the work of game developers in the production of cinematic interludes, which are a key narrative element in games. By automating and simplifying the process of creating cutscenes, this tool aims to enable faster prototyping and implementation of changes, as well as to encourage experimentation with various narrative solutions in a short time period. Additionally, the use of a 3D environment by the sequencer opens up the possibility for creators to explore practically unlimited viewing perspectives, which is crucial for engaging and educating players through rich and diverse visual experiences. This work aims not only to present the design process and functionality of the sequencer but also to demonstrate its impact on efficiency and creativity in the process of creating games and educational materials.

2. State of the Art

In education, 3D games play a key role at various levels of teaching and in many fields of knowledge. Thanks to their flexibility, they can be adapted to the needs of students at different levels of advancement and with different interests. These games are used in teaching mathematics through interactive problem solving in a virtual world, in language learning via immersion in the environment of the language, and in natural sciences through simulations of experiments and observations of natural phenomena. Moreover, they enable the development of key skills, such as problem solving, critical thinking, and teamwork. Studies show that 3D educational games, such as the “Rashi Game”, are effective in engaging students in active exploration regarding educational content, supporting the development of critical thinking and problem-solving skills [4]. Similarly, the impact of serious 3D games on cultural education underscores their potential in teaching and preserving cultural heritage, thus facilitating an understanding of complex cultural and historical concepts through engaging experiences [1]. Additionally, the development of 3D educational games aimed at maximizing children’s memory demonstrates how the use of three-dimensional game environments can improve students’ scores on memory tests, increasing their engagement and interest in the educational material, especially in the context of learning mathematics and natural sciences [5].
Despite numerous advantages, the implementation of 3D games in education and cultural transmission encounters a range of challenges. The most significant include high production and implementation costs, the requirement for specialized equipment, and the risk of excessive gamification, which may diminish the educational value of the content. Additionally, ensuring universal access to these resources, including for people with disabilities, poses a significant challenge. The analysis of the transition from traditional methods of teaching foreign languages to the use of digital media highlights the need to preserve key elements of play in the digitization of the educational process, while also emphasizing the difficulties associated with maintaining the engaging nature of games while achieving educational goals [6]. Furthermore, studies on the impact of fantasy elements on engagement and learning effectiveness in the context of 3D educational games underscore the challenges related to incorporating these aspects into educational 3D games, drawing attention to the necessity of balancing entertainment and education [7]. The design and implementation of the 3D game “ImALeG” for learning the Amazigh language in a virtual environment illustrate the technological and design challenges faced by developers of educational games, striving to provide both engaging and educational experiences for users [8].
The advancement of technology and teaching methods underscores the growing role of 3D games in education and culture. These interactive tools are increasingly perceived not just as support for educational processes but as an integral part of digital culture. To fully leverage their potential, further research and pedagogical experiments are necessary. Recent works in this field, such as the analysis in [9], discuss the current state of 3D games, their development history, and their future prospects, pointing to the remarkable possibilities of using these technologies in education and culture. This study also highlights popular software like Unity (v. 2021.3.21f1) as a tool for creating 3D games, as well as future directions for their development [9]. Additionally, work [10] on the use of serious 3D VR games for manufacturing and logistical processes shows the promise of offering interactive and immersive educational experiences, which can significantly enrich the teaching process. Equally important is the work done in [11], which focuses on the development of authoring tools for creating individual game-based learning environments. A tool such as AdLer enables the creation of virtual educational environments that promote interactive and engaged learning experiences, showcasing the promising potential of serious games in higher education [11].
The success of 3D games in education and culture depends on many factors, among which narrative plays a crucial role. The ability of storytelling to engage players, convey knowledge and cultural values, and stimulate cognitive processes is invaluable. A high-quality narrative not only draws players into the depicted world but also inspires reflection and independent knowledge seeking, transcending the boundaries of traditional teaching and cultural transmission. In the educational context, the narrative becomes a bridge connecting entertainment with learning, and its effectiveness in conveying educational and cultural content can significantly impact the success of the game. Therefore, developing a thoughtful, coherent, and engaging narrative is fundamental to achieving educational and cultural objectives in 3D games. This synergy between narrative and the interactive and visual elements of the game is what most attracts players’ attention and allows for deeper engagement with the conveyed content. The narrative of a game can be presented in various ways, depending on the creators’ goals and the experience they want to offer players. Cutscenes are one of the most popular methods of presenting a narrative. These cinematic interludes, which pause the game action to tell part of the story or show important events, can significantly enrich the narrative. They can take the form of both scenes rendered in real-time and pre-produced films. The analysis presented in ref. [12] discusses how cutscenes can enhance players’ engagement with and emotional connection to the game world, while on the other hand, they can also hinder immersion and the overall gaming experience. The study emphasizes the importance of designing cutscenes according to player preferences to avoid disrupting their immersion. Additionally, ref. [13] explores how skillful use of virtual camera movement in cutscenes can significantly improve the overall effects in action games, as demonstrated by the comparison of Resident Evil 2 and Resident Evil 6. This work presents advances in visual storytelling and technology over a decade, highlighting how innovations in camera movement can contribute to deeper immersion and better narrative presentation.
In video games, the narrative is often conveyed through dialogues and interactions with non-player characters (NPCs). This mechanism allows players to choose their responses in conversations, enabling them to discover the story in a more interactive manner and influence its progression. Ref. [14] analyzes how dynamic dialogues with NPCs contribute to creating a more immersive gaming experience, adding depth to gameplay beyond the basic game mechanics. They highlight that intensive interactions with characters can significantly enrich the player’s experience. Meanwhile, ref. [15] presents KNUDGE, a task focused on generating NPC dialogues that accurately reflect mission specifications and character units. Their study demonstrates the efficiency of neural generation models in producing dialogues, emphasizing both their effectiveness and the areas needing future improvements to achieve even more realistic and qualitative interactions in games [15]. These studies underscore the importance of dialogues and interactions with NPCs as key elements in creating an immersive experience in games, enabling deeper player engagement and influencing the course of the narrative interactively.
Narration in video games can be conveyed not only through text and dialogues but also through the environmental elements of the game. The design of locations, background items, and even architecture can tell the story of the game world and reveal secrets without words. Ref. [16] presents a comprehensive study on the ways in which stories can be effectively conveyed through environmental elements in video games, without the use of words. The author emphasizes the importance of environmental narration in game design, showing how the environment can serve as a powerful tool for storytelling. On the other hand, ref. [17] explores the use of environmental narration in video games, analyzing how the design, creation, and presentation of virtual worlds can contribute to narration. The work shows how game environments can act as narrators, woven within the framework of the game’s narrative design, highlighting their role in creating a deep and engaging story. These studies point to the crucial role of game environment design in creating a profound and engaging narrative, allowing players to discover the history of the game world and its secrets through environmental elements alone, without the need for text or dialogues.
Many games incorporate audio logs, journals, letters, and other collectible items that players can find during exploration, enhancing the narrative and providing deeper insights into the game world. These elements often contain key information about the game world and its characters, as well as plot details that enrich the experience. A video game’s narrative can be effectively developed through a system of tasks and missions that the player must complete. Each task can reveal new parts of the story, lead to interactions with characters, or allow exploration of new locations, facilitating the gradual discovery and development of the narrative. Ref. [18] discusses a real-time task generation system that, by dynamically adjusting to the player’s actions, provides a more responsive gaming experience. This system automatically generates tasks based on the current state of the game world and narrative goals, enhancing players’ sense of autonomy within the game environment. Meanwhile, ref. [19] presents an automated mission generator based on player skills, which uses predictive player models, based on data, to tailor missions to individual needs and abilities. This approach, combining narrative structures with predictions about player outcomes, enables the generation of missions with personalized difficulty levels, tailored to the specific player. These studies highlight how the task and mission system in games can serve not only as a mechanism for narrative development but also as a tool to tailor the gaming experience to individual preferences and skills, allowing for the gradual discovery and development of the narrative.
Transmedia elements such as books, comics, films, or series are increasingly used in video games to expand the narrative beyond the boundaries of the game itself. Through the integration of various media, players have the opportunity to explore the story in diverse formats, significantly deepening their engagement with the depicted world. Ref. [20] focuses on a conceptual model and OWL ontology for representing and organizing knowledge about transmedia narration. This work emphasizes the importance of semantic technologies and standards in creating coherent and enriched consumer experiences through the integration of various media and co-creation of content. This development points to the significant role of transmedia elements in extending video game narratives to other platforms and formats. Media integration and content co-creation allow players to engage with the story on multiple levels and in various media contexts, enriching the overall player experience.
In open-world games, the narrative is often presented in a nonlinear manner, allowing players to discover the story at their own pace and according to their preferences. This mode of storytelling allows for the existence of many diverse narrative threads, which can intertwine in a unique way for each player. Ref. [21] examines the evolution of narrative mechanics in open-world games, showing how dynamic narration is incorporated into these games to enhance the immersiveness of the player’s experience. The article analyses how these games have evolved to give players greater freedom in exploring and interacting with the narrative. Ref. [22] explores a narrative model specific to video games with procedurally generated content, allowing players to dynamically transform the story in real time. This study highlights the potential of modular and adaptive narrative forms in the context of open-world games, where the player influences the development of the plot and the structure of the game world. Both works indicate the significant potential of open-world games and dynamic narration in creating a deep and immersive experience for the player, offering nonlinear approaches to storytelling that allow individual engagement and exploration of the game world by players.
Some video games experiment with more interactive means of storytelling, allowing players to influence the direction and outcome of the plot through their choices and actions. This storytelling model significantly enhances the immersive experience for the player, offering unique narrative paths and endings. Ref. [23] explores the potential of procedurally generated narration in video games, highlighting how it can increase player engagement through dynamically changing storylines. This analysis emphasizes how procedural narrative generation can provide unique, interactive experiences by adapting the story to the actions and decisions of the player. On the other hand, ref. [24] discusses the importance of extracting, processing, and generating narrative for interactive fiction and computer games. This work underscores the crucial role of narrative in engaging players and creating meaningful stories and complex worlds in video games, highlighting how interactive narrative elements allow players to engage more deeply and shape the story.
In summary, the ways that narrative can be presented in games are diverse and depend on the creativity of the creators and the technological capabilities of the platform. Technological evolution and increasing hardware capabilities open new horizons for game designers in terms of interactive storytelling.

3. Materials and Methods

Working on the cutscene sequencer project in Unity 3D (2021.3.21f1) was a complex endeavor that required the coordinated use of diverse tools and technologies. The main development environment used for building and designing the game was the Unity 3D platform. Due to its flexibility and support for a wide spectrum of game types, Unity 3D proved to be an ideal choice for projects requiring complex narrative interactions and advanced animations. The C# language was used for scripting the project, as it naturally cooperates with Unity and offers advanced programming capabilities such as object orientation, design patterns, and ease of integration with various APIs. Support for the code writing and debugging process was provided by the Integrated Development Environment (IDE) Visual Studio 2022 (v. 17.7.2), which offered comprehensive tools for code management and interface design, as well as ensuring smooth integration with Unity, significantly speeding up the creation and testing of scripts. For effective source code management and team collaboration, the version control system Git was utilized. It allowed for the tracking of changes, the creation of separate branches for different functionalities, and the safe merging of team efforts. Additionally, the GitHub (Github Desktop—3.3.18) platform served as a hub for team collaboration, enabling code sharing, issue tracking, and for changes to be proposed through the pull request system.
A significant aspect of the project work was also the creation of visual assets. Blender, as a tool for 3D modelling and animation, was used to create character models, game world objects, and animations. Substance Painter was used for creating textures, user interface elements, and other graphics that required high-quality visual presentation.
The sound design, important for building atmosphere and engaging players, required the use of audio editing software, such as Audacity. It enabled the processing of soundtracks, dialogues, and special effects.
The integration and application of a wide range of tools and technologies such as Unity 3D, C#, Visual Studio, Git, GitHub, Blender (v. 3.6), Substance Painter (v. 8.3.1), and Audacity (v. 3.5.1) enabled the development of a comprehensive cutscene sequencer system, supporting the project in creating engaging, dynamic narrative scenes. The sequencer script, written in C# and based on the Unity engine, supports linear cutscenes and dialogical interactions, providing a broad range of tools for narrative creation. The organized code uses an event system and delegates to ensure clarity and ease of modification, although the lack of safeguards against incorrect input data may pose a risk of errors. However, at the pilot stage, such input validation is not a priority. The entire process underscores the multidisciplinary nature of this comprehensive approach to working on modern video game projects.
After establishing a solid foundation with integrated tools and technologies that facilitated the development of the cutscene sequencer, particular attention was given to the visual and narrative aspects of the project, essential for an immersive player experience. A key element that links advanced programming tools with the final visual effect in the game is the UpdateCameraTransform method. Its role in managing camera perspective during cutscenes is fundamental, enabling not only smooth transitions and changes in shots but also significantly influencing how the player experiences the story. The use of this method in scripts such as AnimationCutscene.cs, SimpleCutscene.cs, and SmoothTransitionCutscene.cs allows for dynamic changes in camera settings, giving game developers the opportunity to present scenes from various perspectives and direct the player’s attention to key narrative elements. This flexibility in manipulating the camera, inspired by cinematic techniques, greatly enhances immersion, which is a testament to the advanced approach to visual narration and demonstrates how the technological capabilities of Unity 3D and the applied tools support the creation of deep, interactive experiences in video games.
In the context of the technical sophistication and visual presentation offered by the UpdateCameraTransform method in managing camera perspective, a deeper mathematical analysis of this function becomes essential. Transitioning from practical application in scripts to a detailed examination of the mechanisms controlling camera motion reveals the mathematical layer underlying these processes. This analysis provides valuable insights into the techniques used to achieve smooth and dynamic perspective changes, which directly impact the player’s experience during cutscenes. The method, being a component of key scripts such as AnimationCutscene.cs, SimpleCutscene.cs, and SmoothTransitionCutscene.cs, integrates mathematical techniques, including interpolation, easing functions, and transformation matrices. The significance of these elements underscores the complexity and precision that game developers must apply to create engaging and aesthetically rich scenes while maintaining the fluidity and naturalness of camera movements.
In the AnimationCutscene.cs script, the UpdateCameraTransform() method utilizes camera animations to control its movement, allowing for dynamic changes in position, rotation, and field of view. A crucial aspect here is the interpolation between key animation frames, which is based on the current animation time. This mechanism enables smooth camera transitions between defined states, significantly enhancing the visual quality of the presented scenes.
In SimpleCutscene.cs, the application of linear interpolation to move the camera between its starting and ending positions enhances the direct and controlled manipulation of camera movement. This method calculates the duration of the cutscene, which allows for precise determination of the camera’s movement speed, crucial for achieving the intended narrative effects.
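To make the mechanism concrete, the listing below gives a minimal, illustrative sketch of such a linear-interpolation camera update in Unity C#. The class, field, and value names are assumptions made for the example and do not reproduce the actual SimpleCutscene.cs implementation.

```csharp
using UnityEngine;

// Hypothetical, simplified sketch of the linear-interpolation variant described above.
// Field names (startPoint, endPoint, duration) are illustrative, not the project's actual API.
public class SimpleCutsceneSketch : MonoBehaviour
{
    [SerializeField] private Camera cutsceneCamera;
    [SerializeField] private Transform startPoint;
    [SerializeField] private Transform endPoint;
    [SerializeField] private float duration = 5f;

    private float elapsed;

    private void Update()
    {
        if (elapsed >= duration) return;

        elapsed += Time.deltaTime;
        float t = Mathf.Clamp01(elapsed / duration);   // normalized time in [0, 1]
        UpdateCameraTransform(t);
    }

    // Linear interpolation of position, rotation, and field of view between two key states.
    private void UpdateCameraTransform(float t)
    {
        cutsceneCamera.transform.position = Vector3.Lerp(startPoint.position, endPoint.position, t);
        cutsceneCamera.transform.rotation = Quaternion.Slerp(startPoint.rotation, endPoint.rotation, t);
        cutsceneCamera.fieldOfView = Mathf.Lerp(50f, 60f, t); // example start/end FOV values
    }
}
```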
The SmoothTransitionCutscene.cs script extends this model by using easing functions for smoother camera movement, making the camera motion more natural and aesthetically pleasing. These functions affect the speed of the animation, especially at the beginning and end of the cutscene, allowing for the creation of more complex visual effects.
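An easing variant can be sketched in the same spirit. The cubic smoothstep curve used here is one standard choice assumed for illustration; the concrete easing functions in SmoothTransitionCutscene.cs are not reproduced.

```csharp
using UnityEngine;

// Illustrative easing variant: the same interpolation, but the normalized time is
// passed through an ease-in/ease-out curve so the camera accelerates and decelerates smoothly.
// Names are assumptions; the actual SmoothTransitionCutscene.cs may use different easing functions.
public static class CameraEasingSketch
{
    // Cubic smoothstep easing: slow near t = 0 and t = 1, fastest around t = 0.5.
    public static float EaseInOut(float t) => t * t * (3f - 2f * t);

    public static void ApplyEasedTransform(Camera cam, Transform from, Transform to, float t)
    {
        float eased = EaseInOut(Mathf.Clamp01(t));
        cam.transform.position = Vector3.Lerp(from.position, to.position, eased);
        cam.transform.rotation = Quaternion.Slerp(from.rotation, to.rotation, eased);
    }
}
```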
The foundation of the discussed techniques are mathematical concepts such as interpolation, which allows for calculating intermediate values between defined camera states; easing functions, which modify the speed of the animation to achieve fluid movements; and transformation matrices, which represent the geometric transformations of the camera in three-dimensional space. These elements create the mathematical basis upon which camera motion control in cutscenes is built, highlighting the complexity of the process and its significance for the final presentation of scenes in the game. Table 1 presents the use of mathematical methods in the UpdateCameraTransform method.
The table summarizes the mathematical aspects used in the UpdateCameraTransform() method across different scripts, showcasing various approaches to controlling camera movement during cutscenes. The use of linear interpolation, time normalization, easing functions, and calculations for camera position, rotation, and field of view highlights the complexity and precision required in creating engaging scenes. Each script utilizes these mathematical tools in distinct ways to achieve its own specific visual and narrative effects, demonstrating the flexibility and benefits of mathematics in crafting experiences in video games.
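For completeness, the interpolation and easing relationships referred to in Table 1 can be written out explicitly. These are the standard textbook formulas, given here for reference rather than quoted from the project scripts.

```latex
% Linear interpolation of a camera parameter (position, rotation component, or field of view)
% between key states p_0 and p_1, with normalized time t = elapsed / duration, t \in [0, 1]:
p(t) = (1 - t)\,p_0 + t\,p_1

% Cubic smoothstep easing applied to the normalized time before interpolation:
s(t) = 3t^{2} - 2t^{3}, \qquad p_{\mathrm{eased}}(t) = \bigl(1 - s(t)\bigr)\,p_0 + s(t)\,p_1
```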
This article discusses both the time discretization techniques used in animation and camera movement control in games and the advanced second-order time discretization schemes crucial for solving differential equations. Currently, there are numerous second-order methods for time discretization in solving differential equations. These methods include approaches such as the finite difference method, the finite element method, and the finite volume method. Li, Zhang, and Yang presented a new nonlinear compact difference scheme for Burgers-type equations with a weakly singular kernel [25]. Wang et al. developed a fast compact finite difference scheme for the fourth-order diffusion-wave equation [26]. Shi and Yang estimated the pointwise errors of a conservative difference scheme for supergeneralized Burgers’ equations [27]. Additionally, Wu, Zhang, and Yang proposed a second-order finite difference method for multi-term fourth-order integral-differential equations on graded meshes [28].
In the realm of finite element methods, Zhou, Zhang, and Yang introduced a CN ADI algorithm on non-uniform meshes for three-dimensional evolution equations with multi-memory kernels in viscoelastic dynamics [29]. Yang et al. developed a second-order BDF ADI Galerkin finite element method for evolutionary equations with a nonlocal term in three-dimensional space [30]. Shi and Yang presented a two-grid method for nonlinear generalized Burgers’ equations [31]. Wang et al. successfully applied the two-grid method in time algorithms for two-dimensional nonlinear fractional PIDEs with a weakly singular kernel [32], and Zhang et al. utilized this method in time algorithms combined with difference schemes for two-dimensional nonlocal nonlinear wave equations [33].
In the context of finite volume methods, Yang and Zhang analyzed a new NFV scheme-preserving DMP for two-dimensional sub-diffusion equations on distorted meshes [34]. Furthermore, Yang and Zhang developed a conservative, positivity-preserving nonlinear FV scheme on distorted meshes for multi-term nonlocal Nagumo-type equations [35]. These studies demonstrate the wide range and depth of applications of second-order time discretization methods across different numerical approaches, highlighting their effectiveness and utility in solving complex differential equations.
A standout feature of this sequencer is the integration of the ICutsceneAction interface, which allows developers to dynamically create custom events within movie interludes. This interface makes it easy to incorporate complex interactions, such as physics-based explosions or character animations, without hard-coding them into the cutscenes’ logic. This flexibility enables the creation of highly dynamic and responsive narrative sequences that can adapt to different gameplay contexts and player interactions. The proposed movie interlude sequencer uses key technologies, including Unity 3D for game development, C# for scripting, Blender for 3D modeling, and Substance Painter for texturing. Figure 1 below illustrates the structure of our movie interlude sequencer, highlighting the interaction between various components, such as CutsceneController, CutsceneSequencer, and the ICutsceneAction interface.
The design process for the advanced cutscene sequencer involved several iterative steps, including prototyping, testing, and refinement. Initial prototypes were developed in Unity 3D, and their performance was evaluated based on criteria such as ease of use, flexibility, and impact on game performance.

4. Results

Within contemporary game development practices, the cutscene system is a fundamental mechanism for effectively conveying narrative and providing interactivity with the user. The integrity of this system is based on the synergy between key components: CutsceneEventLinker, ICutsceneAction, CutsceneAction, CutsceneController, and CutsceneSequencer, which together construct a platform for dynamic and engaging narrative sequences. Figure 2 illustrates how these individual components cooperate with each other, creating an integrated system that enables the creation of complex and engaging narratives in games. Each component plays a specific role, from monitoring game events through CutsceneEventLinker, to defining and executing actions in cutscenes, to managing and coordinating the entire process through CutsceneController and CutsceneSequencer. Thanks to this integrated structure, developers can effectively design, implement, and control the flow of cutscenes, which is crucial for creating engaging experiences in games.

4.1. Sequencer Structure

At the initial stage, CutsceneEventLinker functions as a mediator, interpreting events generated in the game environment and initiating appropriate cutscenes through the CutsceneController mechanism. This approach allows for the creation of a reactive system that adapts to player actions, enhancing the immersiveness of the experience. A fundamental element in the cutscene system structure is the ICutsceneAction interface, which serves as a contract for actions used in cutscenes, such as playing music or animations. The standardization of the ICutsceneAction interface allows for consistency and flexibility in the design and implementation of various actions, which are further specified by concrete implementations known as CutsceneActions. Each of these actions contains unique logic, allowing for the creation of diverse and engaging narrative sequences. Oversight of cutscene launch and control rests with CutsceneController, whose functionality is modulated depending on the context of use. In an autonomous variant, this controller initiates and manages cutscenes independently, while when integrated with CutsceneSequencer, actions are coordinated within a defined sequence, enabling the creation of a smooth and cohesive narrative. CutsceneSequencer, serving as the sequence coordinator, is a key component ensuring the management of the order and conditions of launching individual cutscenes. By maintaining a list of controllers, the Sequencer enables the precise coordination of narrative elements, which determines the effectiveness of the message and player engagement. The operation process of the system initiates from the detection of events by CutsceneEventLinker, which then informs CutsceneController of the need to launch the appropriate cutscene. Based on the information received, CutsceneController activates specific actions. In the context of a larger sequence managed by CutsceneSequencer, these actions are synchronized with the overall logic of the system, creating a coherent and multi-threaded narrative. The cutscene system in the game development context presents a complex architecture supporting narrative creation and user interaction. Integrated components such as CutsceneEventLinker, ICutsceneAction, CutsceneAction, CutsceneController, and CutsceneSequencer create a platform that enables game creators to effectively implement narrative assumptions and engage players in unique interactive experiences. Table 2 presents the key components of the cutscene system along with their functions and interactions.

4.1.1. CutsceneEventLinker

CutsceneEventLinker.cs, a key component in the architecture of the cutscene system, serves as an interface between dynamic events generated within the game context and the mechanisms managing cutscenes. The primary task of this component is to effectively listen for and interpret events triggered by player interactions or other game elements, such as entering a specified location, achieving a particular goal, or defeating an opponent. Moreover, CutsceneEventLinker.cs enables the definition of mappings between these events and specific actions in cutscenes, providing significant flexibility in configuring the system’s responses to changing game conditions. CutsceneEventLinker.cs was created to allow users to generate events in the game, and if they find an event interesting enough to use in a cutscene, it can be easily added.
The operation of CutsceneEventLinker.cs is based on several fundamentals. Firstly, this component registers to receive specific events, using mechanisms provided by the game engine, such as UnityEvents. When a specified event is detected, the component initiates the process of launching the associated cutscene or action sequence by calling upon a method in CutsceneController. This method activates the appropriate action, further passing control to the mechanisms of the cutscene system.
The implementation of CutsceneEventLinker.cs may also include mechanisms to handle more complex scenarios, such as event prioritization, managing conditions that trigger actions, or integration with other game systems, further emphasizing its versatility and importance in the structure of the cutscene system. By enabling dynamic responses to in-game events, CutsceneEventLinker.cs ensures the fluidity and coherence of the player’s experience, integrating narrative elements with gameplay in a way that is almost invisible to the user. Table 3 presents the logic of operation of CutsceneEventLinker.
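The listing below sketches this event-to-cutscene mapping in a minimal form, assuming a simple name-based lookup and Inspector-wired UnityEvents; the actual CutsceneEventLinker.cs may organize its mappings, priorities, and conditions differently.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hedged sketch of the event-linking idea: a gameplay event (entering an area,
// defeating an opponent, ...) is mapped to the launch of an associated cutscene.
// Field and method names are illustrative, not the project's actual API.
public class CutsceneEventLinkerSketch : MonoBehaviour
{
    [System.Serializable]
    public class EventMapping
    {
        public string eventName;       // label of the gameplay event
        public UnityEvent onTriggered; // typically wired in the Inspector to a CutsceneController's play method
    }

    [SerializeField] private EventMapping[] mappings;

    // Called by gameplay code (a trigger volume, quest system, combat logic, ...) when an event occurs.
    public void OnGameEvent(string eventName)
    {
        foreach (var mapping in mappings)
        {
            if (mapping.eventName == eventName)
            {
                mapping.onTriggered?.Invoke(); // hand control over to the cutscene system
                return;
            }
        }
    }
}
```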
In conclusion, CutsceneEventLinker.cs is an essential tool in creating complex, interactive narrative experiences in games, enabling developers to smoothly integrate cutscenes into the dynamic world of the game. Its modular structure and ability to map events to cutscene actions offer developers extensive capabilities in designing interactions and crafting narratives, which translates into higher quality and depth of immersion in virtual experiences.

4.1.2. ICutsceneAction

In the context of developing cutscene systems in computer games, the ICutsceneAction interface plays an indispensable role, providing the basis for defining and implementing a wide range of cutscene actions. Its basic function is to establish a specific contract that must be fulfilled by every action included in the cutscenes, ensuring uniformity and coherence in the system’s operation. By defining a common set of methods and properties required from actions, ICutsceneAction allows the cutscene system to efficiently manage diverse actions, regardless of their internal specificity or complexity.
The essence of ICutsceneAction is not only to define the basic contract for cutscene actions but also to ensure the modularity of the entire system. By requiring all actions to implement the same interface, the system gains flexibility, allowing the easy addition of new types of actions without the need to interfere with other components. Such architecture promotes the development and scalability of the cutscene system, opening up new possibilities for game developers in terms of creating narratives and interactions with the player. Additionally, ICutsceneAction plays a key role in facilitating control over the flow of cutscenes.
Each class implementing ICutsceneAction must provide its own implementations of the defined methods, allowing for the creation of cutscene actions of varied nature—from simple actions like displaying text to complex animation sequences. This approach ensures that these actions can be effectively managed by the system’s central components, such as CutsceneController and CutsceneSequencer, ensuring smooth transitions and coordination within the cutscenes. Table 4 presents the logic and operation scheme of the ICutsceneAction interface in detail, showing its key methods, the application of these methods, and their impact on processes in the cutscene system.
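A minimal sketch of such a contract is shown below. The member names (an Execute method and an IsFinished property) are assumptions chosen for illustration and do not necessarily match the members listed in Table 4.

```csharp
// Minimal sketch of a cutscene-action contract, assuming methods for starting an
// action and reporting completion; the actual ICutsceneAction interface in the
// project may declare different members (see Table 4).
public interface ICutsceneActionSketch
{
    void Execute();          // start the action (play a sound, show a dialog line, move the camera, ...)
    bool IsFinished { get; } // lets the controller know when it can advance to the next action
}
```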
ICutsceneAction is a key component of the cutscene system, whose modular nature and uniform interface open doors for developers to explore new forms of interactive narration. It enables easy manipulation of various cutscene actions, contributing to the creation of rich and complex scenes that enrich the player’s experience and the game’s narrative. In this way, ICutsceneAction becomes not only a tool for presenting the story but also an integral part of the game mechanics, enabling the creation of deep and immersive virtual worlds.
Unlike Unity’s Timeline, which can be cumbersome to implement across multiple platforms, the proposed sequencer is designed with cross-platform compatibility in mind. By abstracting the cutscene logic into modular components and leveraging Unity’s inherent capabilities, the sequencer ensures consistent performance and functionality across different devices and platforms. This platform-agnostic approach simplifies the deployment process and enhances the user experience by maintaining narrative coherence regardless of the target platform. Figure 3 shows a diagram of how ICutsceneAction works.

4.1.3. CutsceneAction

CutsceneAction, as a concrete implementation of the ICutsceneAction interface, plays a crucial role in the cutscene system of computer games, implementing specific behaviors and actions required to execute a cutscene. This class forms the foundation for a wide range of actions that can be triggered by the system, such as displaying dialogs, playing animations, changing camera settings, and other actions aimed at enhancing the interactivity and immersiveness of the game’s narrative. The primary purpose of CutsceneAction is to execute specific actions in a scene, such as launching specific animations or presenting dialog text. CutsceneAction inherits from ICutsceneAction the flexibility to extend new functionalities while remaining fully compatible with the overall cutscene management system. This structure allows for the creation of complex interactions with the player while influencing various aspects of the game, leading to the creation of a dynamic and engaging narrative.
The implementation of CutsceneAction is flexible and can take various forms, tailored to the specific requirements of cutscene actions. Examples such as DialogAction or AnimationAction illustrate the diverse applications of CutsceneAction, from presenting dialogs to triggering animations, showing how key methods are adapted to achieve the specific goals of these actions. Table 5 presents the logic and architecture of CutsceneAction’s operation, illustrating how this class executes specific behaviors and actions in the cutscene system.
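The listing below sketches a DialogAction-style implementation of the contract sketched above, assuming a simple timed display; the project's actual DialogAction may expose different fields and drive an in-game dialog UI rather than the console.

```csharp
using UnityEngine;

// Hypothetical DialogAction-style implementation of the ICutsceneActionSketch contract
// sketched earlier. It shows how a single concrete action can encapsulate its own logic;
// the project's DialogAction may differ in fields and behaviour.
public class DialogActionSketch : MonoBehaviour, ICutsceneActionSketch
{
    [SerializeField, TextArea] private string dialogText;
    [SerializeField] private float displayTime = 3f;

    private float endTime;

    public bool IsFinished => Time.time >= endTime;

    public void Execute()
    {
        endTime = Time.time + displayTime;
        Debug.Log($"[Cutscene dialog] {dialogText}"); // placeholder for the game's dialog UI
    }
}
```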
In conclusion, CutsceneAction is an essential component of the cutscene system, enabling the design and execution of complex and interactive narrative elements in computer games. Thanks to its close integration with ICutsceneAction and collaboration with managing components such as CutsceneController and CutsceneSequencer, CutsceneAction ensures the seamless integration of cutscene actions with gameplay, which contributes to enriching the player’s experience and enhancing the narrative value of the game.

4.1.4. CutsceneController

CutsceneController plays an executive role in the process of managing and presenting scenes. Its main function is to initiate, control, and coordinate the execution of the actions defined by CutsceneActions, allowing the smooth and efficient introduction of complex narrative sequences and interactive elements into the game. Working closely with CutsceneSequencer, which defines the order and conditions under which actions should be triggered, CutsceneController ensures that all actions are executed as intended by the creators, guaranteeing the consistency and continuity of the player experience. In addition, by being able to manage the state of individual actions, including starting, pausing, and stopping them, CutsceneController enables the narrative pace to be precisely matched to the dynamics of the game. Its role is invaluable in creating engaging and exciting scenes that enrich the narrative and deepen the immersion, while allowing the game to react flexibly to the player’s actions and keep the gameplay flowing. CutsceneController, as an integral component of the scene system, is therefore essential for the effective creation and presentation of complex narrative elements in games, providing a bridge between narrative design and interactive user experience. Table 6 shows the logic and flowchart of CutsceneController.
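A simplified, coroutine-based sketch of this controller role is given below; the sequential execution and the assumed names illustrate the idea rather than reproduce CutsceneController.cs.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Simplified controller sketch: runs the actions assigned to one cutscene in order,
// waiting for each to finish before starting the next. The coroutine-based flow and
// the names are assumptions about the design, not the project's exact implementation.
public class CutsceneControllerSketch : MonoBehaviour
{
    [SerializeField] private List<MonoBehaviour> actionComponents; // components implementing ICutsceneActionSketch

    public bool IsPlaying { get; private set; }

    public void Play()
    {
        if (!IsPlaying) StartCoroutine(RunActions());
    }

    private IEnumerator RunActions()
    {
        IsPlaying = true;
        foreach (var component in actionComponents)
        {
            if (component is ICutsceneActionSketch action)
            {
                action.Execute();
                yield return new WaitUntil(() => action.IsFinished); // wait until the action reports completion
            }
        }
        IsPlaying = false;
    }
}
```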

4.1.5. CutsceneSequencer

CutsceneSequencer distinguishes itself by coordinating and managing sequences of actions, known as CutsceneControllers, which are essential for creating complex narrative scenarios. It operates by organizing these controllers in a precisely planned order or based on defined conditions, enabling game developers to construct coherent and dynamic cutscenes that enrich the narrative and enhance the immersive experiences of players. The task of CutsceneSequencer is not only to establish the order of actions but also to manage them based on logical conditions such as player decisions or variable game events, allowing for adaptive modifications of the cutscenes in response to in-game interactions. Collaboration between CutsceneSequencer and CutsceneController facilitates the effective initiation, management, and coordination of actions, ensuring their seamless execution and integration with the game’s main storyline. The implementation of this component can take various forms, depending on the specific needs of the project, from simple action lists to complex conditional systems, enabling game creators to flexibly design cutscenes. As a result, CutsceneSequencer is a key tool for game narrative creation, allowing game designers to precisely manage action sequences and the conditions of their execution, significantly impacting the quality of the player experience and the depth and consistency of the narrative. Table 7 details the logic and operation of CutsceneSequencer within the game cutscene system.
The information presented in Table 7 highlights the complexity and essential functions of CutsceneSequencer in the process of creating scenes in games, emphasizing its role in coordinating action sequences, managing the conditions of their initiation, and collaborating with other components of the cutscene system. Thanks to this component, game developers can design dynamic and engaging narrative scenarios that are an integral part of enriching player experiences, enhancing the overall quality and consistency of narration in video games. Therefore, CutsceneSequencer is an indispensable tool in the process of creating advanced and complex narrative elements that impact the depth and immersion of the game world.
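The following sketch illustrates the purely sequential case, reusing the controller sketch from the previous subsection; conditional launching based on player decisions or game state, which the actual CutsceneSequencer supports, is omitted for brevity.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Sequencer sketch: plays a list of CutsceneControllerSketch instances one after another.
// The real CutsceneSequencer additionally manages launch conditions; only the sequential
// case is shown here, under assumed names.
public class CutsceneSequencerSketch : MonoBehaviour
{
    [SerializeField] private List<CutsceneControllerSketch> controllers; // ordered in the Inspector

    public void PlaySequence() => StartCoroutine(RunSequence());

    private IEnumerator RunSequence()
    {
        foreach (var controller in controllers)
        {
            controller.Play();
            yield return new WaitUntil(() => !controller.IsPlaying); // wait for the current cutscene to end
        }
    }
}
```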

4.2. Example Application of the Sequencer

This section presents the practical use of the sequencer for scenes in educational 3D games, with an emphasis on integrating narration and technology in the Unity 3D environment. The sequencer, being a key tool in creating dynamic and engaging narrative scenes, allows for precise management of the progress of cutscenes, from simple animations to complex interactions.
Working with the sequencer in Unity 3D begins when a capsule, simulating the player’s character, interacts with an activator object known as OpenDoorTrigger, which in Figure 4 is marked as a cube. The capsule’s entry into the cube initiates the sequencer, leading to the activation of the programmed sequence of scenes. This event starts a series of cutscene activations, which are programmed to launch sequentially, ensuring a smooth narrative transition. The initiation of the sequencer occurs by placing an activator object on the scene, which, regardless of its form, must contain a collision component that enables sequencer activation.
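The following sketch shows one plausible form of such an activator, assuming a tag check on the player's capsule and a reference to the sequencer sketch shown earlier; the actual OpenDoorTrigger script may differ.

```csharp
using UnityEngine;

// Illustrative activator in the spirit of OpenDoorTrigger: a collider marked as a trigger
// starts the sequencer when the player's capsule enters it. The tag check and field names
// are assumptions made for the example.
[RequireComponent(typeof(Collider))]
public class OpenDoorTriggerSketch : MonoBehaviour
{
    [SerializeField] private CutsceneSequencerSketch introSequencer;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))    // the capsule simulating the player's character
        {
            introSequencer.PlaySequence(); // launch the programmed sequence of cutscenes
            gameObject.SetActive(false);   // prevent the intro from re-triggering
        }
    }
}
```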
In the Unity project, a hierarchical structure for managing sequencer elements has been implemented, including key components that enable the effective creation and management of in-game narration. The central element is the IntroSequencer object, which functions as the main container for all sequences. Each sequence within this container is treated as a separate module, ensuring better organization and greater flexibility in managing individual narrative segments. Additionally, the structure includes the IntroCameras object, which is responsible for storing and managing cameras used to record individual scenes. The IntroEvents object serves as an event controller, initiating events during transitions between sequences, which is crucial for the fluidity and consistency of the narrative. An important feature is also the IntroSkip function, which allows players to optionally skip cutscenes. This significantly improves the accessibility and usability of the game, giving users more control over the course of the game. The CutsceneSequencer script, which allows for the management of the order and organization of sequences, is attached to the IntroSequencer object. In the inspector of this script, a list of CutsceneControllers is generated, on which the user can sequentially add and modify individual sequences (Figure 5). This gives game developers full control over the narrative structure, which is essential for achieving the intended visual and emotional effects in the interactive gaming environment.
In the process of implementing the cutscene sequencer in Unity 3D, each sequence, such as “Sequence 1—Animation”, is individually configured, allowing for precise adjustment of both the visual and audio aspects of the narrative. The configuration includes assigning a specific camera that performs the animation, determining the duration of the sequence, which can be automatically adjusted based on the length of the accompanying sound, and defining detailed parameters such as sound delay or visual effects, like screen dimming. Additionally, the use of the “CutsceneEventLinker” function enables the integration of events, such as changing the sky texture, which can be activated at the beginning or end of the sequence depending on the planned narrative effect.
The central element of the system is the “CutsceneSequencer” script, attached to the “IntroSequencer” object. This script generates a list of “CutsceneControllers”, on which the user can sequentially place individual sequences. Each sequence is configured in the Unity Inspector, where settings such as “CutsceneCamera” define the specific camera used in the sequence, “CutsceneDuration” determines the duration of the scene, and “CutsceneSequencer” indicates which sequencer is responsible for the scene. Sound is managed through the “Sound” and “Audio Source” fields, with the sound delay controlled by “Audio Delay”. Optionally, “Flash Image” can be added for a screen-dimming effect during transitions between scenes. This detailed configuration enables developers to create immersive and dynamic narrative scenes that enhance the gaming experience by integrating interactive elements seamlessly with the game’s storytelling.
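The Inspector settings listed above could correspond to serialized fields along the lines of the following sketch; the field types are assumptions, and the names merely mirror the labels used in the text.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the per-sequence settings described above as they might appear on a cutscene
// component in the Unity Inspector. Field names mirror the labels in the text
// (CutsceneCamera, CutsceneDuration, Sound, Audio Delay, Flash Image); the exact types
// used in the project are assumptions.
public class CutsceneSettingsSketch : MonoBehaviour
{
    [Header("Camera and timing")]
    [SerializeField] private Camera cutsceneCamera;   // camera used to film this sequence
    [SerializeField] private float cutsceneDuration;  // seconds; can instead follow the length of the sound

    [Header("Audio")]
    [SerializeField] private AudioClip sound;         // dialogue or music for the sequence
    [SerializeField] private AudioSource audioSource; // dedicated source so other game audio is unaffected
    [SerializeField] private float audioDelay;        // delay before the sound starts

    [Header("Transitions")]
    [SerializeField] private Image flashImage;        // optional screen-dimming overlay between scenes
}
```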
The camera configuration process for the first sequence occurs through manual selection and setup within the Unity space. This camera, typically configured for animation using the “AC_Camera1” file, facilitates smooth transitions between different transformation points. In the scene, “Audio Source” objects are also set up to process dialogue sounds, ensuring that these sounds do not interfere with other audio elements in the game.
The integration of events and the completion of the configuration involve adding additional events such as “ChangeSkyBox”, which is managed by “CutsceneEventLinker” and capable of being activated at the start or end of the sequence depending on narrative needs. Once all elements are properly configured, the entire sequencer system is ready for use, allowing for the creation of complex and dynamic scenes that are integral to the player’s immersive experience and effective storytelling through integrated visual and sound media. The effect of a sample implementation in the form of an intro to educational materials can be observed in video 1 (link: https://zenodo.org/records/11058107?token=eyJhbGciOiJIUzUxMiJ9.eJpZCI6IjgxYzlhMjc1LWVjOWItNDUyMy1hMjk5LTVlNWRiNmYyNjUyMiIsImRhdGEiOnt9LCJyYW5kb20iOJiZjZiZTMyNWM4NzY0ZTRjNTU2ZDAyZDg0NWYzYTE3ZSJ9.wSH6F7ZKnxpzQdeBfPOk_9GA0auTe2ZVabsQzeR2_0QNt68dXjpcrpGjR7teShjFGe2Cddw4BYYSHfGxC6IA accessed on 1 May 2024 [36]). It demonstrates the use of the sequencer in an educational context, where the setting is based on an astronomical observatory, and the role of the narrator is played by the character Nicolaus Copernicus.

5. Discussion

The cutscene system in video games is a sophisticated construction, whose comprehensive analysis reveals a wealth of possibilities and simultaneously highlights the challenges developers may encounter. Characterized by high flexibility and modularity, the system allows creators to easily add, modify, and remove scenes and actions, making it highly adaptive and conducive to dynamically changing project requirements. This approach not only streamlines the game development process but also allows for rapid response to player feedback and evolving trends in the gaming industry.
Interface consistency, ensured by the implementation of the ICutsceneAction standard, is key to the uniform management of actions and scenes. This allows developers to maintain order among the complex components of the system, invaluable in managing more intricate projects. Additionally, the integration of the CutsceneEventLinker mechanism facilitates the creation of engaging narrative interactions that enrich player immersion and engagement, allowing the system to dynamically respond to in-game events and player decisions.
An additional advantage is the advanced sequencing offered by CutsceneSequencer, which enables the design of complex narrative flows. The ability to use CutsceneController as an independent component or an integral part of a larger sequence of scenes gives creators complete control over the presentation and integration of scenes within the game context, crucial for maintaining narrative and visual coherence.
However, the cutscene system also has limitations. Its high modularity and complexity can lead to an increase in system complexity, posing challenges in management and debugging, especially in the context of larger projects. The tight integration among different system components can be challenging in the rapidly changing game development environment, where fast iterations and flexibility are crucial for success. Additionally, the implementation of conditional logic, while allowing for the creation of complex scenarios, requires developers to invest additional effort in designing and testing to ensure narrative consistency and smooth transitions between scenes.
The aspect of optimization becomes particularly significant in the context of system limitations. Resource management and scene optimization are essential to ensure high system performance and to avoid performance drops. Moreover, the effectiveness and fluidity of the system’s operation can be closely linked to the specifics of the game, requiring developers to carefully plan interactions between scenes and gameplay mechanics to maintain player experience coherence.
Seamless integration with dialogue and sound systems is another innovative aspect of this sequencer. By handling these elements natively within cutscenes, the sequencer ensures synchronized playback and interaction between visual and audio components. This holistic approach to designing cinematic interludes enhances immersion and narrative continuity. To evaluate the effectiveness of our movie interlude sequencer, we compared it with the cinematic sequencing tools built into other popular engines, such as Unity 3D and Unreal Engine. Table 8 summarizes the key differences, performance metrics, and innovative aspects of our solution.
Despite the numerous advantages of our advanced cutscene sequencer, there are some limitations. The high modularity and complexity of the system can increase the difficulty of managing and debugging, especially in larger projects. Additionally, the tight integration of various components requires careful planning and optimization to ensure high performance and fluid transitions. Future work will focus on further optimizing the system and exploring ways to simplify its management without compromising on flexibility and functionality.

6. Conclusions

In summary, the cutscene system developed in this study offers developers broad possibilities for creating engaging and immersive narrative elements while maintaining flexibility and control over the creative process. Its modular nature and ability to integrate with dynamic in-game events promote innovation and adaptation, which are invaluable in the rapidly evolving world of computer games.
The advanced cutscene sequencer provides game developers with a robust framework for integrating rich narrative elements into their games. Our findings indicate that the use of this sequencer can improve both the efficiency and creativity involved in game development. The tool’s modular design allows for easy prototyping and implementation of changes, encouraging experimentation with various narrative solutions. This flexibility is particularly beneficial in educational and cultural contexts where diverse and engaging visual experiences are crucial for effective learning and cultural transmission.
Furthermore, the sequencer’s ability to manage complex interactions and dynamic camera perspectives enables developers to create more immersive and interactive gaming experiences. The integration of real-time manipulation of narrative elements with the robust capabilities of Unity 3D, C#, and Blender showcases the potential for this sequencer to become an indispensable tool in the toolkit of game developers aiming to enhance the narrative depth and engagement of their games.
However, developers must be aware of the challenges associated with managing system complexity, optimizing performance, and ensuring the seamless integration of various components. Addressing these challenges is crucial for maintaining the high performance and smooth transitions that are essential for player immersion and engagement.
Overall, the advanced cutscene sequencer not only supports the creation of sophisticated cutscenes but also significantly contributes to the field of educational and cultural game development by providing a comprehensive, user-friendly solution for narrative creation.

Author Contributions

Conceptualization, R.K. and R.S.; data curation, R.S.; formal analysis, C.K.; funding acquisition, R.K.; investigation, G.G.; methodology, R.S.; project administration, R.K.; resources, C.K.; software, R.S.; supervision, R.K.; validation, C.K.; visualization, R.S.; writing—original draft, R.K.; writing—review and editing, G.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the University of Warmia and Mazury in Olsztyn.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

Robert Skowroński was employed by the company SBI Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Alexandri, A.; Drosos, V.; Tsolis, D.; Alexakos, C.E. The impact of 3D serious games on cultural education. In Proceedings of the 10th International Conference on Education and New Learning Technologies, Palma, Spain, 2–4 July 2018; pp. 8557–8566. [Google Scholar] [CrossRef]
  2. Drosos, V.; Alexakos, C.; Alexandri, A.; Tsolis, D. Evaluating 3D serious games on cultural education. In Proceedings of the 2018 9th International Conference on Information, Intelligence, Systems and Applications (IISA), Zakynthos, Greece, 23–25 July 2018; pp. 1–5. [Google Scholar] [CrossRef]
  3. Drosos, V.; Alexandri, A.; Tsolis, D.; Alexakos, C.A. 3D serious game for cultural education. In Proceedings of the 2017 8th International Conference on Information, Intelligence, Systems and Applications (IISA), Larnaca, Cyprus, 27–30 August 2017; pp. 1–5. [Google Scholar] [CrossRef]
  4. Floryan, M.; Woolf, B. Rashi Game: Towards an Effective Educational 3D Gaming Experience. In Proceedings of the 2011 IEEE 11th International Conference on Advanced Learning Technologies, Athens, GA, USA, 6–8 July 2011; pp. 430–432. [Google Scholar] [CrossRef]
  5. Eridani, D.; Santosa, P. The development of 3D educational game to maximize children’s memory. In Proceedings of the 2014 International Conference on Information Technology and Electrical Engineering (ICITEE), Yogyakarta, Indonesia, 7–8 October 2014; pp. 1–4. [Google Scholar] [CrossRef]
  6. Schmoll, L. The use of games in foreign language teaching: From traditional to digital. Stud. Digit. J. 2016, 5. [Google Scholar] [CrossRef]
  7. Lee, J. Effects of Fantasy and Fantasy Proneness on Learning and Engagement in a 3D Educational Game. Doctoral Dissertation, The University of Texas at Austin, Austin, TX, USA, 2015. [Google Scholar]
  8. Tazouti, Y.; Boulaknadel, S.; Fakhri, Y. ImALeG: A Serious Game for Amazigh Language Learning. Int. J. Emerg. Technol. Learn. 2019, 14, 108–117. [Google Scholar] [CrossRef]
  9. Li, J. Research and Analysis of 3D games. Highlights Sci. Eng. Technol. 2023, 31, 3847–3861. [Google Scholar] [CrossRef]
  10. Boden, A.; Buchholz, A.; Petrovic, M.; Weiper, F.J. 3D VR Serious Games for Production & Logistics. In Proceedings of the 18. AALE-Konferenz, Pforzheim, Germany, 9–11 March 2022. [Google Scholar] [CrossRef]
  11. Klopp, M.; Dörringer, A.; Eigler, T.; Bartel, P.; Hochstetter, M.; Weishaupt, A.; Geirhos, P.; Abke, J.; Hagel, G.; Elsebach, J.; et al. Development of an Authoring Tool for the Creation of Individual 3D Game-Based Learning Environments. In Proceedings of the 5th European Conference on Software Engineering Education, Seeon, Germany, 19–21 June 2023. [Google Scholar] [CrossRef]
  12. Ruan, X.; Cho, D.-M. Relation between Game Motivation and Preference to Cutscenes. Cartoon. Animat. Stud. 2014, s36, 573–592. [Google Scholar] [CrossRef]
  13. Ruan, X.; Cho, D. Virtual Camera Movement Bring the Innovation and Creativity in Action Video Games for the Player. Korea Open Access J. 2013, 13, 3544. [Google Scholar] [CrossRef]
  14. Hasani, M.F.; Udjaja, Y. Immersive Experience with Non-Player Characters Dynamic Dialogue. In Proceedings of the 2021 1st International Conference on Computer Science and Artificial Intelligence (ICCSAI), Jakarta, Indonesia, 28 October 2021. [Google Scholar] [CrossRef]
  15. Weir, N.; Thomas, R.; D’Amore, R.; Hill, K.; Van Durme, B.; Jhamtani, H. Ontologically Faithful Generation of Non-Player Character Dialogues. arXiv 2022, arXiv:2212.10618. [Google Scholar] [CrossRef]
  16. Barker, D. A story without words: Challenges crafting narrative in the videogame Rise. TEXT J. 2018, 22, 1–12. [Google Scholar] [CrossRef]
  17. Ulaş, E.S. Virtual environment design and storytelling in video games. J. Gaming Virtual Worlds 2014, 4, 75–88. [Google Scholar] [CrossRef] [PubMed]
  18. Malysheva, Y. Dynamic quest generation in Micro Missions. In Proceedings of the 2012 IEEE 4th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES), Genoa, Italy, 29–31 October 2012; pp. 1–4. [Google Scholar] [CrossRef]
  19. Zook, A.; Lee-Urban, S.; Drinkwater, M.; Riedl, M.O. Skill-based Mission Generation: A Data-driven Temporal Player Modeling Approach. In Proceedings of the Third Workshop on Procedural Content Generation in Games, Raleigh, NC, USA, 29 May–1 June 2012. [Google Scholar] [CrossRef]
  20. Sánchez, J.A.; Pérez, T. A Conceptual Model for an OWL Ontology to Represent the Knowledge of Transmedia Storytelling. In Challenges and Opportunities for Knowledge Organization in the Digital Age; Ergon-Verlag: Baden-Baden, Germany, 2018. [Google Scholar] [CrossRef]
  21. Götz, U. On the Evolution of Narrative Mechanics in Open-World Games. Media Stud. 2021, 82, 161–176. [Google Scholar] [CrossRef]
  22. Chauvin, S. A Narrative Model for Emerging Video Games. Doctoral Dissertation, CNAM, Paris, France, 2019. Available online: https://dblp.org/rec/phd/hal/Chauvin19.html (accessed on 18 May 2024).
  23. Wolf, M.J.P. The Potential of Procedurally-Generated Narrative in Video Games. In Clash of Realities 2015/16: On the Art, Technology and Theory of Digital Games, Proceedings of the 6th and 7th Conference; Transcript Verlag: Bielefeld, Germany, 2017; p. 145. [Google Scholar] [CrossRef]
  24. Valls-Vargas, J. Narrative Extraction, Processing and Generation for Interactive Fiction and Computer Games. In Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, Boston, MA, USA, 14–18 October 2013; pp. 37–40. [Google Scholar] [CrossRef]
  25. Li, C.; Zhang, H.; Yang, X. A new nonlinear compact difference scheme for a fourth-order nonlinear Burgers type equation with a weakly singular kernel. J. Appl. Math. Comput. 2024, 70, 2045–2077. [Google Scholar] [CrossRef]
  26. Wang, W.; Zhang, H.; Zhou, Z.; Yang, X. A fast compact finite difference scheme for the fourth-order diffusion-wave equation. Int. J. Comput. Math. 2024, 101, 170–193. [Google Scholar] [CrossRef]
  27. Shi, Y.; Yang, X. Pointwise error estimate of conservative difference scheme for supergeneralized viscous Burgers’ equation. Electron. Res. Arch. 2024, 32, 1471–1497. [Google Scholar] [CrossRef]
  28. Wu, L.; Zhang, H.; Yang, X.; Wang, F. A second-order finite difference method for the multi-term fourth-order integral–differential equations on graded meshes. Comput. Appl. Math. 2022, 41, 313. [Google Scholar] [CrossRef]
  29. Zhou, Z.; Zhang, H.; Yang, X. CN ADI fast algorithm on non-uniform meshes for the three-dimensional nonlocal evolution equation with multi-memory kernels in viscoelastic dynamics. Appl. Math. Comput. 2024, 474, 128680. [Google Scholar] [CrossRef]
  30. Yang, X.; Qiu, W.; Chen, H.; Zhang, H. Second-order BDF ADI Galerkin finite element method for the evolutionary equation with a nonlocal term in three-dimensional space. Appl. Numer. Math. 2022, 172, 497–513. [Google Scholar] [CrossRef]
  31. Shi, Y.; Yang, X. A time two-grid difference method for nonlinear generalized viscous Burgers’ equation. J. Math. Chem. 2024, 62, 1323–1356. [Google Scholar] [CrossRef]
  32. Wang, F.; Yang, X.; Zhang, H.; Wu, L. A time two-grid algorithm for the two dimensional nonlinear fractional PIDE with a weakly singular kernel. Math. Comput. Simul. 2022, 199, 38–59. [Google Scholar] [CrossRef]
  33. Zhang, H.; Jiang, X.; Wang, F.; Yang, X. The time two-grid algorithm combined with difference scheme for 2D nonlocal nonlinear wave equation. J. Appl. Math. Comput. 2024, 70, 1127–1151. [Google Scholar] [CrossRef]
  34. Yang, X.; Zhang, Z. Analysis of a New NFV Scheme Preserving DMP for Two-Dimensional Sub-diffusion Equation on Distorted Meshes. J. Sci. Comput. 2024, 99, 80. [Google Scholar] [CrossRef]
  35. Yang, X.; Zhang, Z. On conservative, positivity preserving, nonlinear FV scheme on distorted meshes for the multi-term nonlocal Nagumo-type equations. Appl. Math. Lett. 2024, 150, 108972. [Google Scholar] [CrossRef]
  36. Kaźmierczak, R.; Skowroński, R.; Kowalczyk, C.; Grunwald, G. An Example Implementation of the Sequencer in the Form of an Intro to Educational Materials (Attachment 1). 2024. Available online: https://zenodo.org/records/11058107?token=eyJhbGciOiJIUzUxMiJ9.eJpZCI6IjgxYzlhMjc1LWVjOWItNDUyMy1hMjk5LTVlNWRiNmYyNjUyMiIsImRhdGEiOnt9LCJyYW5kb20iOJiZjZiZTMyNWM4NzY0ZTRjNTU2ZDAyZDg0NWYzYTE3ZSJ9.wSH6F7ZKnxpzQdeB-fPOk_9GA0auTe2ZVabsQzeR2_0QNt68dXjpcrpGjR7teShjFGe2Cddw4BYYSHfGxC6IA (accessed on 1 May 2024). [CrossRef]
Figure 1. Sequencer structure.
Figure 2. Diagram of the cutscene system creation process.
Figure 3. Schematic diagram of ICutsceneAction.
Figure 4. Scene settings with activator.
Figure 5. View in the inspector of events that happen in Sequence 1.
Table 1. Mathematical methods in the UpdateCameraTransform method.

| Aspect | AnimationCutscene.cs | SimpleCutscene.cs | SmoothTransitionCutscene.cs |
|---|---|---|---|
| Method | Uses animation to control camera movement | Uses linear interpolation to move the camera from its initial to its final position | Uses easing functions to smoothly transition the camera from its initial to its final position |
| Linear interpolation | x = (1 − t) × a + t × b, where x is the position, rotation, or FOV (field of view) of the camera; a, b are parameter values in keyframes; t is the animation time | x = (1 − t) × a + t × b, where x is the camera position, a the initial position, b the final position, and t the time since the start of the cutscene | N/A |
| Time normalization | t = (currentTime − startTime)/duration, where currentTime is the current animation time, startTime the animation start time, and duration the animation duration | t = currentTime/duration, where currentTime is the time since the start of the cutscene and duration the duration of the cutscene | t = currentTime/duration, where currentTime is the time since the start of the cutscene and duration the duration of the cutscene |
| Easing functions | N/A | N/A | f(t) = t³ × (3 − 2 × t), where f(t) is the weight used for interpolation and t the time since the start of the cutscene |
| Position calculation | cameraPosition = (1 − t) × startPosition + t × endPosition | cameraPosition = (1 − t) × startPosition + t × endPosition | cameraPosition = (1 − f(t)) × startPosition + f(t) × endPosition |
| Rotation calculation | cameraRotation = (1 − t) × startRotation + t × endRotation | N/A | N/A |
| FOV calculation | cameraFOV = (1 − t) × startFOV + t × endFOV | N/A | N/A |
| Transformation update | cameraTransform.position = cameraPosition; cameraTransform.rotation = cameraRotation; camera.fieldOfView = cameraFOV | cameraTransform.position = cameraPosition | cameraTransform.position = cameraPosition |
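A worked sketch of the SmoothTransitionCutscene column of Table 1 is given below: normalized time drives the easing weight f(t) = t³ × (3 − 2 × t), which then interpolates the camera position. The component and field names are illustrative, not the project’s source code.

```csharp
using UnityEngine;

// Worked example of the Table 1 math for a smooth camera transition.
public class SmoothCameraTransition : MonoBehaviour
{
    [SerializeField] private Transform cameraTransform;
    [SerializeField] private Vector3 startPosition;
    [SerializeField] private Vector3 endPosition;
    [SerializeField] private float duration = 3f;

    private float elapsed;

    private void Update()
    {
        elapsed += Time.deltaTime;
        float t = Mathf.Clamp01(elapsed / duration);   // time normalization: t = currentTime / duration
        float w = t * t * t * (3f - 2f * t);           // easing weight f(t) from Table 1

        // cameraPosition = (1 − f(t)) × startPosition + f(t) × endPosition
        cameraTransform.position = Vector3.Lerp(startPosition, endPosition, w);
    }
}
```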
Table 2. Key components of the scene system.

| Component | Function | Interactions |
|---|---|---|
| CutsceneEventLinker.cs | Mediates between game events and the cutscene system | Listens for game events; informs CutsceneController about the need to launch a cutscene or action |
| ICutsceneAction | Defines basic behaviors for actions used in cutscenes | Standardizes the interface for various actions; enables the consistent design of cutscene actions |
| CutsceneAction | Executes specific actions within cutscenes based on ICutsceneAction | Implements unique logic for each action; interacts with CutsceneController to initiate actions |
| CutsceneController | Manages the launching and controlling of cutscenes | Can operate independently or as part of CutsceneSequencer; initiates actions defined by CutsceneAction; coordinates actions within a sequence of cutscenes |
| CutsceneSequencer | Manages the sequence of cutscenes, coordinating their launch in the proper order | Maintains a list of CutsceneControllers; manages smooth transitions and coordination between cutscenes |
Table 3. The logic behind the CutsceneEventLinker.

| Element | Description | Implementation Mechanisms | Examples of Use |
|---|---|---|---|
| Event Listening | Registration and response to events generated in the game environment | Use of UnityEvents or custom signals to detect events | Player entering a specified area |
| Mapping Events to Actions | Defining relationships between events and cutscene actions | Dictionary (e.g., Dictionary<TEvent, TAction>) mapping events to actions | Defeating an opponent initiates a dialogue cutscene |
| Initiating Cutscenes | Launching specific cutscenes or action sequences in response to detected events | Calling methods in CutsceneController to launch actions | Launching an animation upon achieving a goal |
| Communication with CutsceneController | Effective exchange of information and commands between CutsceneEventLinker and CutsceneController | Direct connections and method calls between components | Quickly initiating cutscenes in response to events |
Table 4. Overview of the ICutsceneAction interface functionality.

| Method | Method Description | Application | Impact on Process |
|---|---|---|---|
| ExecuteAction() | Activates the main functionality of the action | Launching animations, playing sounds, displaying dialogs | Initiates the cutscene action process, introducing the primary narrative or interactive element to the scene |
Table 5. Logic and architecture of CutsceneAction operation.

| Element | Characteristic | Examples of Implementation | Role in the Cutscene System |
|---|---|---|---|
| Action Execution | Implements the logic needed to perform a specified action | Animation, dialog, camera setting changes | Directly triggers interactions with the player or alters the game environment in response to the cutscene scenario |
| Flexibility | Capability to easily incorporate new features and actions | Adding new types of dialogs, interactive elements | Allows for the creation of complex cutscenes without the need to modify the existing system architecture |
| Interaction with Player and Game | Actions can affect the player and game elements, creating a dynamic narrative | Dialog choices that influence plot development, environmental changes | Enriches the player’s experience through interactive narration and the game world’s response to player actions |
| Key Methods | ExecuteAction() | DialogAction.ExecuteAction() displays dialog; AnimationAction.ExecuteAction() triggers animation | Forms the basis for controlling the flow of actions, enabling the management of state and responses to changes in the scene |
| Implementation | Tailored to the specific needs of the cutscene action | DialogAction, AnimationAction | Provides mechanisms necessary to execute diverse actions, allowing for smooth transitions and coordination |
Table 6. Logic and flow chart of CutsceneController.

| Element | Function | Operating Mechanism | Impact on Gameplay |
|---|---|---|---|
| Action Initiation | Launches specific CutsceneActions in response to game events or script instructions | CutsceneController activates actions such as animations or dialogs based on a sequence defined by CutsceneSequencer | Creates smooth narrative transitions and engaging interactive elements, enriching the player’s experience |
| State Control | Manages the state of actions (start, pause, stop) | Uses methods like Execute(), Pause(), and Stop() to control the flow of actions | Ensures that cutscenes are presented at the appropriate pace, enhancing immersion and narrative coherence |
| Coordination with CutsceneSequencer | Works with CutsceneSequencer to manage the order and conditions of action execution | Receives information from CutsceneSequencer about the order and conditions of actions, coordinating their execution | Allows dynamic adjustment of cutscenes to player decisions and other variables, creating non-linear narratives |
| Response to Player Interactions | Adapts to player actions by modifying the flow of cutscenes | Capable of changing the course of a cutscene in response to player choices or other game events | Enhances gameplay by adding interactive elements and influencing the story from the player’s perspective |
| Implementation of Conditional Mechanisms | Decides on the execution of actions based on meeting specific conditions | Uses conditional logic to trigger or change the order of actions | Introduces complexity and depth to the narrative, allowing for the creation of branched story paths |
Table 7. CutsceneSequencer in the context of a computer game scene system.

| Component | Function | Operating Mechanism | Impact on the Game |
|---|---|---|---|
| Sequence Management of Controllers | Organizes scene controllers in a predetermined order or based on defined conditions | CutsceneSequencer creates and maintains a list or queue of actions (CutsceneControllers), which are activated sequentially | Enables the creation of smooth and complex narrative scenarios, improving the quality of narration and player engagement |
| Conditional Control | Decides on the initiation of actions based on the fulfillment of specific conditions | Uses programming logic to assess conditions (e.g., player choices, game events) before initiating actions | Allows the dynamic adaptation of scenes to player actions and other game variables, creating an interactive and non-linear narrative |
| Coordination with CutsceneController | Collaborates with CutsceneController for effective initiation and management of cutscene actions | CutsceneSequencer communicates the sequence and conditions for initiating actions to CutsceneController | Ensures smooth implementation and execution of scenes, maintaining consistency of experience and integration with gameplay |
| Flow Control | Manages the flow of scenes, including their initiation, stopping, and transition to the next scene | Flow control mechanisms allow for smooth transitions between scenes and their temporal control | Maintains the appropriate pace of narration, affecting player immersion and engagement with the game’s storyline |
Table 8. Evaluation of sequencers.

| Feature | Custom Sequencer | Unreal Engine Sequencer | Unity Timeline |
|---|---|---|---|
| Dynamic Camera Changes | Yes, supports dynamic camera transitions and positions | Yes, allows for complex camera animations and transitions | Yes, supports camera animations and transitions |
| Integration with Multiple Tools | Moderate, can be extended with Unity’s existing tools and assets | High, integrates seamlessly with Unreal’s tools (e.g., Blueprints, VFX, audio) | High, integrates with Unity’s wide array of tools and assets |
| User Interface Flexibility | Basic UI with skip-button functionality, can be improved with custom UI | Advanced, user-friendly with drag-and-drop functionality | Advanced, intuitive UI with drag-and-drop and keyframe editing |
| Custom Event Handling | Yes, supports custom events with the ICutsceneAction interface | Limited to predefined events; complex custom events need scripting | Yes, but creating custom events is more complex than in the custom system |
| Ease of Use | Moderate, requires scripting knowledge for customization | High, user-friendly with visual scripting (Blueprints) | High, user-friendly with visual editing and scripting |
| Performance | Efficient, but dependent on Unity’s performance optimizations | High, optimized for performance in complex scenes | High, benefits from Unity’s performance optimization |
| Sound Integration | Yes, integrates sound with adjustable delays | Yes, advanced sound integration and mixing | Yes, supports sound integration and control |
| Modularity | Yes, modular with CutsceneControllers for different scenes; easy to rearrange and modify the order of cutscenes | Yes, highly modular and reusable components | Yes, modular and reusable components |
| Customization | High, allows for custom scripts and behavior | High, customizable with Blueprints and C++ | High, customizable with C# and custom scripts |
| VR Compatibility | Yes, supports VR with camera tracking control | Yes, supports VR and AR applications | Yes, supports VR and AR applications |
| Sequential Control | Yes, allows sequential control of cutscenes through CutsceneControllers | Yes, strong sequential control with timeline and keyframes | Yes, robust sequential control with timeline and keyframes |
| Extensibility | High, can be extended with additional features and improvements | High, highly extensible with plugins and additional modules | High, extensible with the Unity Asset Store and custom scripts |
| Debugging and Testing | Moderate, basic debugging tools, can be improved with custom logs | High, comprehensive debugging and profiling tools | High, robust debugging and profiling tools |
| Custom Actions | Yes, easy to define custom actions with the ICutsceneAction interface | Limited, requires extensive scripting or Blueprints | Yes, but more scripting is required than in the custom approach |

