Article

Evaluating a Conceptual Model for Measuring Gaming Experience: A Case Study of Stranded Away Platformer Game

by Luka Blašković, Alesandro Žužić and Tihomir Orehovački *
Faculty of Informatics, Juraj Dobrila University of Pula, Zagrebačka 30, 52100 Pula, Croatia
* Author to whom correspondence should be addressed.
Information 2023, 14(6), 350; https://doi.org/10.3390/info14060350
Submission received: 15 May 2023 / Revised: 14 June 2023 / Accepted: 16 June 2023 / Published: 18 June 2023
(This article belongs to the Special Issue Artificial Intelligence and Games Science in Education)

Abstract:
Video games have become a ubiquitous form of entertainment and have been enjoyed by people of all ages around the world. The gaming industry has evolved rapidly, with new games being released every year that push the boundaries of technology and creativity. To ensure that video games are not just technically advanced, but also enjoyable and engaging, measuring the gaming experience is essential because it helps game designers understand how players interact with the game and identify areas for its improvement. The objective of this paper is to examine an interplay of gaming experience dimensions in the context of platform video games and to determine the extent to which they contribute to players’ behavioral intentions. To fulfil this objective, an empirical study was undertaken, involving participants with diverse gaming backgrounds. They were requested to engage in the gameplay of the Stranded Away platformer game and subsequently respond to a post-use questionnaire. The psychometric features of the introduced conceptual model were evaluated with the partial least squares structural equation modeling (PLS-SEM) method. The reported findings demonstrate the importance of evaluating different facets of the gaming experience in video games and showcase the potential of the proposed model and measuring instrument as tools for game designers to enhance the overall quality of their products.

1. Introduction

Video games have emerged as a prominent form of interactive entertainment, captivating millions of players worldwide. These interactive experiences not only offer a means of escape and relaxation but also serve as a platform for social connection, skill development, and creative expression [1]. Video games have evolved a lot since their inception, expanding across genres, platforms, and target audiences. They have become an integral part of modern culture, transcending the boundaries of simple entertainment to become a powerful medium for other purposes such as storytelling, education, and collaboration [2]. As the gaming landscape diversifies, understanding the specific elements that contribute to player enjoyment, engagement, and behavioral intention is essential to creating tailored experiences for an ever-growing gaming community. The advent of virtual reality (VR) and augmented reality (AR) technologies has further revolutionized the gaming industry, offering unprecedented levels of immersion and interactivity [3]. These innovative technologies allow players to explore richly detailed environments and participate in interactive narratives that foster a deep sense of presence and engagement. Additionally, the rise of competitive gaming, or eSports, has introduced sports consumption motivation dimensions to the gaming landscape [4]. The competitive aspect highlights the importance of skill development, learning, and motivation in the gaming experience, as well as the potential for video games to serve as platforms for personal growth and self-improvement. The social aspect of gaming has also become a focal point, as multiplayer games and online communities have facilitated connections among players worldwide. As concluded by Halbrook et al. [5], the role of social interaction in gaming enjoyment, particularly regarding cooperation, competition, and communication, necessitates further exploration to better comprehend its impact on player satisfaction and engagement. Finally, the recent rapid rise of artificial intelligence (AI) has introduced new possibilities for creating more engaging and immersive gaming environments, as well as tailoring experiences to individual player preferences. However, the increased reliance on AI also presents potential challenges, such as ethical concerns and maintaining the human element in game design [6,7].
In the realm of Human–Computer Interaction (HCI), the study of user experience (UX) has emerged as an essential approach to understanding and improving the interaction between players and video games. Understanding player enjoyment and what leads to it provides crucial knowledge for game designers. As the gaming industry continues to expand, the need for sophisticated methods to evaluate these experiences becomes increasingly important. With the rapid development of gaming technology, ensuring a seamless, enjoyable, and engaging experience for players has become a top priority for game designers and developers [8]. In the broader field of HCI, UX research encompasses various aspects, including usability, accessibility, and user satisfaction [9]. These dimensions are integral for evaluating the overall interaction experience and determining how to enhance it further. However, in the context of video games, the concept of UX extends beyond these traditional HCI aspects. The unique, immersive nature of video games requires a more comprehensive understanding of the user experience, including elements such as enjoyment, engagement, immersion, and emotional response [10]. Modularity in video games is essential, as it allows for easy modification and improvement based on user feedback, a critical aspect highlighted by Drozina and Orehovački [11]. Continuous playtesting with a modular design enables developers to quickly iterate and refine various game components thus enhancing overall player contentment. This iterative approach to game development, facilitated by modularity, results in a more polished and engaging gaming experience tailored to the preferences of players. The evaluation of the gaming experience is a multifaceted process that involves not only the assessment of the game’s design and technical aspects but also the player’s behavioral intentions, cognitive and affective responses, and overall satisfaction with the game. Consequently, game designers need reliable tools and methodologies to assess these various dimensions of the gaming experience effectively. Unlike traditional software, the UX in video games must both represent and mediate the player’s values and gameplay preferences [12]. A variety of conceptual models and measurement instruments have been proposed to evaluate user experience in video games [13,14,15]. However, there is still a need for an all-inclusive, robust, and empirically validated model that effectively captures the diverse dimensions of the gaming experience.
This paper aims to address these gaps by introducing and evaluating a novel conceptual model for measuring user experience in video games, with a focus on its applicability in the context of the Stranded Away platformer game. Stranded Away is a 2D side-scrolling platformer game developed by the authors of this paper [16]. It incorporates key features of classic platformers, such as collectibles, moving platforms, various enemy types, and obstacles. Moreover, the game integrates puzzle elements, requiring players to solve riddles to progress through the levels. By examining the interplay of various gaming experience dimensions and their impact on players’ behavioral intentions, we seek to provide valuable insights for game designers and HCI researchers alike. This study also seeks to contribute to the growing body of knowledge on user experience in video games, ultimately informing the development of more effective tools and methods for evaluating and improving the quality of future gaming experiences.
The main objective of this paper is to develop an exhaustive and robust research framework for evaluating the multifaceted dimensions of user experience within the context of video games, ultimately providing valuable insights for game designers and researchers to enhance the overall quality of the gaming experiences of their audience. Therefore, we raised the following research question:
  • Which dimensions of the gaming experience are most influential in determining player enjoyment, player engagement, and their intention to continue playing the video game and recommend it to other gamers?
To address the aforementioned research question, an empirical study was conducted. By reviewing recent and pertinent literature in the field, we identified eight constructs that encompass the gaming experience. Based on these constructs, a research framework was proposed to systematically investigate the underlying dimensions of the gaming experience. The Stranded Away game served as a case study, providing a practical context for applying and evaluating the introduced conceptual model. To ensure the validity and reliability of the research model, we examined both the hypothesized relationships between constructs and the overall model’s performance using the partial least squares structural equation modeling (PLS-SEM) method.
The fundamental contributions of this work include:
  • An overview of current methods, models, and instruments for evaluating video games;
  • A measuring instrument in the form of a post-use questionnaire that can be employed for evaluating the gaming experience;
  • A set of constructs meant for assessing distinct aspects of the gaming experience;
  • A valid and reliable conceptual model that can be used for predicting player engagement, player enjoyment, and their behavioral intentions.
The remainder of the paper is structured as follows. In the next section, we examine related work underpinning our empirical study. Through a review of current advances in the field tackling gaming experience, as well as the methods, instruments, and models meant for evaluating video games, we establish a theoretical backbone for our conceptual model. The third section details our research methodology. We introduce the Stranded Away platformer game as a case study for gaming experience evaluation, propose a research framework in which the interplay among eight constructs reflects the thirteen introduced hypotheses, briefly describe the particularities of the PLS-SEM method, and elucidate our research procedure, which includes playing the Stranded Away platformer game and completing the post-use questionnaire. The findings of the empirical study are presented in the fourth section. We report the demographics of study participants, showcase the outcomes of examining the psychometric features of both the measurement and structural models, scrutinize the results of the hypotheses testing, and appraise the overall predictive capability of the research framework. The fifth section offers a discussion of the study findings. We provide insights into the confirmed relationships between the gaming experience constructs that constitute the research framework, together with the rationale for its medium predictive power. Limitations of the study, including the relatively small sample size composed of students from the same university, the exclusive use of Stranded Away as the single case study, and the focus of the evaluation on only one game genre, are, together with future research directions, explained in the sixth section. The conclusions are drawn in the last section. We underscore the importance of the proposed gaming experience dimensions and explain the benefits of our work for researchers and game developers.

2. Literature Review

Video games have become an integral part of modern entertainment, captivating players of all ages and backgrounds. As the gaming industry continues to evolve, the need for a profound understanding of the gaming experience has become paramount. The emergence of the COVID-19 pandemic has further accelerated the growth of the gaming industry, with more people seeking engaging and immersive experiences while confined to their homes [17]. Although video games represent a breakthrough in interactive entertainment, ensuring a high-quality user experience remains a challenge for game developers and designers. Therefore, it is crucial to examine the dimensions of user experience in video games and develop models that can effectively measure and enhance the overall enjoyment, engagement, and behavioral intention of players. This section provides a brief overview of current relevant studies on the methods and techniques for evaluating gaming experience as well as existing conceptual models and measurement instruments in the same respect.

2.1. Gaming Experience

The process of developing video games has become increasingly complex, prompting smaller teams to utilize custom software and game engines to help streamline their work and reduce the complexity associated with modern large-scale game engines [18]. By developing and using self-made tools tailored to their specific needs and project requirements, these smaller teams can optimize their development process and focus on creating better games. These custom tools, when designed with a deep understanding of the team’s workflow, can improve productivity, enhance collaboration, and facilitate the implementation of innovative gameplay features [19]. Consequently, smaller teams can deliver unique and engaging gaming experiences, despite the inherent challenges posed by the ever-evolving complexity of video game development. Gaming experience refers to the overall subjective perception and emotional response a player has while interacting with a game. It encompasses a wide range of elements including enjoyment, engagement, immersion, challenge, aesthetics, narrative, and social interaction [20]. The gaming experience is a complex and multifaceted concept that may vary significantly among individuals, as different players have unique preferences, playstyles, and expectations of a game [21,22]. According to Yee [23], various factors can contribute to diversity in gaming experiences, such as player demographics, cultural backgrounds, gaming history, and individual personality traits. The gaming experience is often positively correlated with video game quality. Measuring the quality of a video game is important because it allows researchers and developers to assess all the relevant aspects of the gaming experience, leading to a deeper understanding of player satisfaction. An all-encompassing measurement tool for examining video game quality would provide a detailed list of issues, making it easier for developers to improve their games and ultimately enhance the overall gaming experience [24]. Small indie games can be evaluated effectively by conducting empirical studies that explore various aspects of the gaming experience by gathering data with measuring instruments such as post-use questionnaires [25].
Jennett et al. [20] conducted a review of gaming experience research and concluded that immersion is one of the key aspects of the gaming experience, as it allows players to become deeply absorbed in the virtual worlds they explore. The sense of immersion fosters a heightened level of engagement, making the gaming experience more enjoyable and memorable for players. The authors acknowledged that immersion is a distinct concept from related ideas such as attention, flow, and fun, but its exact nature remains somewhat elusive. The current understanding suggests that immersion is a result of the convergence of various psychological processes, including attention, planning, and perception, which together produce a focused state of mind. When players reach this state, they become less aware of their surroundings and fully engrossed in the game, deriving pleasure from the immersive experience. Despite this understanding, several questions remain unanswered, including the specific psychological functions involved in immersion, the ideal balance of these functions, and how games can achieve this balance to create optimal immersion [26]. Enjoyment is another fundamental aspect of the gaming experience, as it captures the intrinsic pleasure and gravitation derived from engaging in gameplay [27]. Recent research has highlighted several factors that contribute to enjoyment in gaming, such as challenge, novelty, aesthetics, and social interaction [28]. For example, Guardini and Maninetti [29] discussed various aspects of game analytics, including the role of gameplay mechanics, visual and auditory elements, and narrative in player enjoyment. Furthermore, Mekler et al. [30] emphasized the importance of player autonomy and competence in fostering enjoyment, as these factors contribute to a sense of personal achievement and satisfaction.

2.2. Evaluation Methods, Instruments, and Models

The high competition in the video game industry has led to an increased emphasis on understanding and optimizing the player experience. To assess gaming experience, various evaluation methods and techniques have been developed and employed by researchers and game developers. Schaffer and Fang [31] pointed out the need for more empirical research on digital game enjoyment to guide interactive system design. In particular, they emphasized the importance of qualitative research and controlled experiments for understanding how facets of enjoyment in digital games interrelate and contribute to learning and behavioral outcomes. Defining a conceptual model for evaluating video games can be challenging due to the varying perceptions of game quality among participants and the diverse range of game types. Creating a universal conceptual model with quality dimensions that apply to all games becomes particularly tricky. As Orehovački and Babić [32,33] concluded in their studies on evaluating games designed for learning programming, the specific features of each genre, such as educational games, can impact one or several quality dimensions. This highlights the importance of tailoring assessment methods and models to the unique characteristics of individual games, to ensure a more accurate and relevant evaluation of the gaming experience.
A variety of quantitative measuring instruments are employed to gauge various aspects of the gaming experience, such as enjoyment, immersion, and challenge. The Game Experience Questionnaire (GEQ) is one such instrument that measures various dimensions of player experience in the form of a self-report questionnaire, including competence, sensory and imaginative immersion, flow, and negative affect [14]. In addition, the Player Experience of Need Satisfaction, more commonly known by its acronym PENS, is another measuring instrument that focuses on evaluating player satisfaction in the gaming context based on the Self-Determination Theory (SDT). It is a research-backed theory on human motivation and personality in social settings which distinguishes between autonomous and controlled motivations [34]. The PENS questionnaire tackles three psychological needs: competence, autonomy, and relatedness. It helps assess the extent to which a game fulfils these needs, thus providing valuable information on player motivation and overall satisfaction [35]. Another well-known tool for evaluating the gaming experience is the GameFlow model based on the concept of flow, which is a state of intense concentration and immersion in an activity [13]. The Game User Experience Satisfaction Scale (GUESS), developed by Phan et al. [36], has been validated in various studies and proven to be a reliable and valid measure of user experience in games, used in both academic and industry contexts to evaluate games and inform design decisions.
Self-report questionnaires rely on the player’s subjective interpretation of their emotional state and may be influenced by factors such as social desirability bias, memory recall bias, and response style bias. Physiological data can provide a more accurate measure of the player’s emotional state because it is less subject to bias and can be measured in real-time during gameplay. Mandryk and Atkins [37] introduced a method for continuously modeling emotion using physiological data, such as heart rate variability and skin conductance, to objectively quantify the player’s physiological responses during gameplay. Their research was focused on the use of physiological signals to measure player affective experience in video games. The reported findings indicate that changes in heart rate variability and electrodermal activity were associated with changes in player affective experience and suggested that those measures could be used to provide real-time feedback to game designers to improve the gaming experience. In addition, eye-tracking technology is another popular method based on physiological data that has been utilized to assess player attention and engagement by monitoring gaze patterns during gameplay. This method can offer valuable insights into how game elements capture and maintain player attention, potentially revealing areas for improvement in game design [38]. The study conducted by Karavidas et al. [39] highlighted the use of biosignal heuristic metrics as a measurement method for evaluating the user experience in video games. By incorporating physiological data, such as heart rate variability and electrodermal activity, the researchers were able to develop a more objective and real-time approach to assess the player’s emotional state and engagement during gameplay. Petri and von Wangenheim [40] proposed the MEEGA+ model that demonstrates the role of quality aspects such as aesthetics, learnability, and engagement in shaping the player’s learning experience within the context of educational games. Lastly, Consalvo and Dutton [41] pointed out that qualitative techniques such as interviews, focus groups, and think-aloud protocols can provide invaluable insights into player preferences, emotions, and cognitive processes.

2.3. Filling the Gap

Despite the growing body of literature in the field, there is still a noticeable gap in the development, application, and evaluation of conceptual models that incorporate a comprehensive set of gaming experience dimensions, particularly in the context of specific video game genres. This paper aims to address this gap by proposing and examining a novel research framework for measuring gaming experience while using the Stranded Away platformer game as a case study. By developing a post-use questionnaire tailored to the unique aspects of the game and its gameplay, this study seeks to contribute to the understanding of how various dimensions of gaming experience interrelate and impact players’ behavioral intentions. This research, therefore, has the potential to not only enhance the academic understanding of gaming experience evaluation but also to offer practical benefits for the gaming industry by providing a robust, adaptable, and thorough tool for assessing and improving the quality of video games.

3. Methodology

3.1. Case Study

Stranded Away is a 2D platformer/side-scrolling game with puzzle elements [16]. The game was created using Unity, a cross-platform development environment that offers fundamental functionalities such as a rendering engine, sound importation and utilization, physics simulation, animation capabilities, and networking support [42].
The player takes control of a mysterious space traveler who lands on the planet Athion in search of the human species following a galactic apocalypse caused by the mad scientist Dr. Hone, the game’s primary antagonist. Stranded Away incorporates several gaming concepts that make it suitable for evaluation in the context of the gaming experience. These concepts contribute to the game’s overall appeal, challenge, and enjoyment, providing a representative case study for empirical research.
Stranded Away adopts a retro pixel art graphics style, paying homage to classic video games while offering a visually appealing and nostalgic experience for players. The distinctive aesthetic of pixel art contributes to the game’s charm and sets it apart from other contemporary titles. The game features immersive audio that enhances the player’s experience by creating a captivating and engaging science fiction atmosphere. The audio design includes background music, sound effects for various actions such as shooting, terrain interaction, or object interaction, and environmental sounds that complement the game’s setting and story. All the aforementioned features draw on the results of a detailed user experience evaluation of the Stranded Away platformer game presented in [16]. The main character and his spaceship, in the form of pixel art graphics, are shown in Figure 1, which illustrates the opening scene of the Stranded Away game.
The game features a story mode that takes players on a journey through the galactic apocalypse brought on by the mad scientist. The game’s story unfolds through data files hidden throughout the game world. Players discover these files as they explore the environment, gradually revealing the narrative of the galaxy apocalypse and Dr. Hone’s nefarious plans. This method of storytelling engages players’ curiosity and encourages exploration. Additionally, an extra game mode—“The floor is lava”—provides players with new challenges and opportunities for replayability, extending the game’s overall lifespan and appeal. The level is not directly related to the game’s main narrative as it features a never-ending level, reminiscent of games like Icy Tower, in which the player must continuously jump upwards as the lava below him rises. The objective of this mode is to survive and reach the highest possible altitude.
Stranded Away includes an in-depth tutorial level that introduces players to the game’s core mechanics and gameplay elements. The tutorial ensures that players have a solid understanding of the game’s controls, core mechanics, and objectives before delving into the main story. The game presents players with a variety of enemy types, such as the lizard, turret, and toxic slug, as well as obstacles that require different strategies and tactics to overcome, such as the laser door. This diversity in gameplay challenges keeps players engaged and promotes the development of problem-solving skills as they progress through the game. This platformer incorporates puzzle elements into its gameplay, requiring players to solve riddles and complete tasks to advance through the levels. For instance, the puzzle elements revolve primarily around the use of blue energy boxes, which are integral to solving various challenges throughout the game. Players must interact with these boxes by strategically placing them on pressure plates to trigger specific logic combinations. These puzzle elements add an extra layer of depth to the game, encouraging critical thinking and providing a unique challenge for players to conquer. What is more, the game offers players a selection of different weapon types such as a blaster, rifle gun, sniper, and plasma gun, along with power-ups to use throughout their adventure. These weapons not only provide variety in gameplay but also allow players to customize their playstyle, further enhancing their engagement and enjoyment.
The combination of these diverse gaming concepts in Stranded Away provides a rich and varied gaming experience, making it an ideal case study for evaluating the dimensions of user experience in video games.

3.2. Research Framework

Video games offer a unique blend of multimedia elements that synergistically create a holistic gaming experience that encompasses multiple facets, including game mechanics, narrative, graphics, and social interaction. Players experience video games through their senses and cognitive abilities, which are stimulated and challenged in various ways throughout gameplay. They evaluate the quality of their gaming experience through a combination of tangible factors such as visual elements, audio elements, user interface sensibility, and gameplay mechanics, as well as intangible factors such as learnability, player enjoyment, and player engagement, which ultimately affect their behavioral intentions.
Audio elements (AUD) in video games refer to the various sounds and music that accompany gameplay, including background music, sound effects, voice acting, and ambient noise. Ng and Nesbitt [43] conducted an experiment in which participants played a custom-designed game with different audio conditions: no audio, non-informative audio, and informative audio. The results they reported showed that players in the informative audio condition demonstrated better performance and were able to interpret the visual elements of the game more effectively. Diegetic sound relates to audio elements that originate within the game world and have a direct connection to the visuals. The study carried out by Jørgensen [44] emphasized the role of diegetic sound in enhancing the coherence between audio and visual elements, ultimately creating a more immersive gaming experience. On the other hand, audio elements such as sound effects and music can provide important information to players, helping them understand and interact with various gameplay mechanics more effectively [45]. In that respect, we propose the following hypotheses:
H1. 
Audio elements have a positive impact on visual elements in the context of platform video games.
H2. 
Audio elements have a positive impact on gameplay mechanics in the context of platform video games.
Visual elements (VIS) in video games encompass various aspects such as graphics, animations, character design, and environmental design. Schell [46] showcased the importance of a visually appealing game environment in creating an intuitive and user-friendly interface. A well-designed visual environment can facilitate seamless navigation, interaction, and an understanding of the game’s mechanics, ultimately leading to a more enjoyable experience for players. Skillfully crafted visual elements, such as clear icons, legible text, mini maps, waypoints, and a coherent visual hierarchy, contribute to more intuitive user interfaces [29]. Misztal’s and Schild’s [47] research uncovered the importance of positive visual elements in enhancing player satisfaction, emphasizing the significant role aesthetics play in creating immersive gaming experiences. Furthermore, Birk et al. [48] have illuminated the value of avatar customization, illustrating how personalized in-game representation can foster intrinsic motivation and deepen a player’s connection to the virtual world. Building on the work of the abovementioned studies, we propose the following hypotheses:
H3. 
Visual elements have a positive impact on the user interface sensibility in the context of platform video games.
H4. 
Visual elements have a positive impact on player enjoyment in the context of platform video games.
User interface sensibility (UIS) pertains to the capacity of a video game to provide a fluid, intuitive, and immersive experience to the player. It is a multifaceted concept that includes elements of seamless navigation, interaction, and improved intuitiveness, each of which contributes in different ways to the overall gaming experience. Seamless navigation, as indicated by Isbister [49], involves easy movement and control within the game environment. It contributes to a player’s comfort and ease in traversing through the virtual world. This aspect of sensibility means that the player does not feel lost or disoriented, thereby aiding in the overall enjoyment and success within the game. The interactive aspect of a game’s user interface, as emphasized by Korhonen et al. [50], refers to how effectively players can manipulate and respond to game elements. This involves the clear communication of actions and reactions, making the player feel in control and connected to the game world. The interface should facilitate a sense of agency and feedback, enabling players to experiment, strategize, and engage with the game mechanics. Moreover, an intuitive user interface is crucial for an accessible gaming experience. As suggested by the research of Nacke et al. [45] and Sánchez et al. [51], an interface that is easy to comprehend and use aids in the learning of the game’s mechanics. It lowers the entry barrier for newcomers and decreases the cognitive load, thereby helping players to master the game’s mechanics and enjoy the gaming experience more quickly. Therefore, we hypothesize the following:
H5. 
User interface sensibility has a positive impact on gameplay mechanics in the context of platform video games.
H6. 
User interface sensibility has a positive impact on learnability in the context of platform video games.
Learnability (LRN) denotes the extent to which it is easy to become proficient in playing the game. It concerns how easily a player can learn the game mechanics, controls, rules, and strategies [52]. Games with high learnability are intuitive and provide smooth onboarding for the players, which usually involves tutorials, effective feedback mechanisms, and progressive challenges. Poor learnability can lead to player frustration and game abandonment, while good learnability can enhance the overall gaming experience by making the game more accessible and enjoyable [53].
Gameplay mechanics (GMH) point to the rules, systems, and interactions that govern the player’s experience and facilitate their progress within the game. Ermi and Mäyrä [54] explored the essential elements of gameplay experience and immersion. They emphasized the significance of thoughtfully designed gameplay mechanics in cultivating a greater sense of engagement and immersion for players. Alexiou and Schippers [55] conducted a study in which they proposed a conceptual framework that determines how digital game elements, such as gameplay mechanics, influence user experience and learning outcomes. The findings of their study suggest that thoughtfully designed gameplay mechanics can have a significant impact on a player’s intrinsic motivation, which in turn can lead to more effective learning and engagement [55]. In addition, Kiili [56] highlighted the importance of effective and skillfully designed gameplay mechanics in facilitating the learning process for players. By conducting a meta-synthesis of player types, Hamari and Tuunanen [57] found that a positive experience with gameplay mechanics can lead to increased player satisfaction and loyalty, ultimately impacting players’ behavioral intentions to recommend the game or engage with it again in the future. Finally, Vorderer et al. [58] emphasized the role of meticulously developed gameplay mechanics in contributing to a player’s overall enjoyment of the game. Drawing on the findings of the aforementioned studies, we propose the following hypotheses:
H7. 
Gameplay mechanics have a positive impact on player engagement in the context of platform video games.
H8. 
Gameplay mechanics have a positive impact on player enjoyment in the context of platform video games.
H9. 
Gameplay mechanics have a positive impact on learnability in the context of platform video games.
H10. 
Gameplay mechanics have a positive impact on player behavioral intention in the context of platform video games.
Player enjoyment (ENJ) represents the encompassing sense of pleasure and contentment that players experience during gameplay. By establishing a connection between the fulfillment of fundamental psychological needs (such as competence, autonomy, and relatedness) and heightened motivation and involvement in video games, it exerts a positive influence on player engagement [59]. Vorderer et al. [58] have argued that enjoyment is a key factor in engaging players, and they explored various aspects of enjoyment and how they contribute to increased engagement in media entertainment, including video games. Zhao and Huang [60] proposed a conceptual model that explores the factors influencing online game continuance playing. While the specific details of the model are not provided, it can be inferred that player enjoyment likely plays a significant role in determining whether a player continues to engage with an online game. Relevant studies in the field have emphasized the importance of player enjoyment in promoting player engagement. Some of them [61,62] suggest that understanding and enhancing the factors that contribute to player enjoyment can lead to increased engagement and prolonged game-playing behavior. Based on the findings of the abovementioned studies, we propose the following two hypotheses:
H11. 
Player enjoyment has a positive impact on player behavioral intention in the context of platform video games.
H12. 
Player enjoyment has a positive impact on player engagement in the context of platform video games.
Player engagement (ENG) denotes the degree of a player’s involvement, interest, and enthusiasm while interacting with a video game. Behavioral intention (BEH) indicates the extent to which players are willing to continue playing the video game and recommend it to others in their gaming community. Game designers must ensure that the player does not feel frustration or boredom when playing a game. Moreover, a game should be effective in capturing players’ attention to such an extent that they remain fully immersed and undistracted when engaging with the game [63]. Understanding and enhancing player engagement can help game developers and marketers create more captivating experiences and encourage positive behavioral intentions among players [63,64]. In that respect, we propose the following hypothesis:
H13. 
Player engagement has a positive impact on player behavioral intention in the context of platform video games.
Figure 2 presents the research model, which consists of the aforementioned constructs and hypotheses. The proposed model was conceptualized based on the literature review discussed earlier, integrating various elements and factors contributing to a holistic gaming experience. By examining the relationships between these constructs, our model aims to provide insights into the dynamics of player enjoyment, engagement, and behavioral intentions in the context of platform video games.

3.3. Apparatus

The data were gathered by employing an online post-use questionnaire distributed through Google Forms in January 2023. The questionnaire encompassed 11 items covering participant demographics (such as gender, age, year of study, frequency and duration of weekly and daily gameplay, preferred game genres, years of gaming experience, prior exposure to platform video games, and the most significant features of video games). Additionally, it consisted of 40 items designed to explore various aspects of eight constructs comprising the research framework. An open-ended item was also included to gather data on the advantages and disadvantages of the evaluated platform video game. The preliminary pool of 40 items can be found in Appendix A. Participants provided responses to questionnaire items using a five-point Likert scale (1 = strongly disagree, 5 = strongly agree).
To evaluate the reliability and validity of our research framework and test the hypothesized relationships, we applied a partial least squares structural equation modeling (PLS-SEM) statistical technique. PLS-SEM focuses on maximizing the explained variance in endogenous variables by exploring partial model relationships through a series of ordinary least square (OLS) regressions [65]. The PLS-SEM method estimates construct scores as precise linear combinations of their respective indicators [66]. We opted for PLS-SEM over covariance-based SEM (CB-SEM) due to the following key reasons: (1) PLS-SEM is well-suited for exploratory research as it does not necessitate a stringent theoretical foundation [67]; (2) PLS-SEM achieves greater statistical power in comparison with CB-SEM when working with smaller sample sizes [68]; and (3) PLS-SEM’s algorithm accounts for data that significantly deviate from a normal distribution, making parameter estimations more reliable following the central limit theorem [65].
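In general terms (a standard formulation of the method rather than anything specific to this study), each construct score is obtained as a weighted sum of its standardized indicators, ξ_j = Σ_k w_jk · x_jk, where x_jk are the manifest variables assigned to construct j and w_jk are the outer weights estimated iteratively by the PLS algorithm; the structural paths are then estimated as OLS regressions among these construct scores.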
The 10 times rule [69] suggests that the sample size should be 10 times the number of independent variables in the most complex regression within the research framework, which includes both the measurement and structural models. Another way to express this rule of thumb is to state that the minimum sample size should be 10 times the maximum number of arrowheads pointing at a latent variable in any part of the research framework. In our conceptual model, the most intricate latent variables are assessed using five manifest variables, while the maximum number of exogenous variables affecting an endogenous variable is three. As a result, the minimum required sample size for our study is 50. While the 10 times rule provides a general guideline, the determination of the minimum sample size should also consider the statistical power of the estimates. Drawing on the inverse square root method proposed by Kock and Hadaya [70], the ranges of effect sizes for a common power level of 80%, and the minimum path coefficient expected to be significant at the 5% level introduced in [71], the minimum sample size for our study is 69. Therefore, a sample size of 100 is deemed sufficient. To assess the psychometric properties of the measurement and structural models, we employed the SmartPLS 4.0.9.1 software tool [72].
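For readers who wish to reproduce the two sample-size heuristics, the following minimal Python sketch (not part of the original analysis) implements both rules. The constant 2.486 corresponds to the 5% significance level with 80% power in the inverse square root method, and the minimum path coefficient of 0.30 is an assumption made here that is consistent with the figure of 69 reported above.

```python
import math


def n_min_ten_times_rule(max_arrows: int) -> int:
    """10-times rule: at least 10 x the largest number of arrowheads
    pointing at any latent variable in the model."""
    return 10 * max_arrows


def n_min_inverse_square_root(p_min: float, z: float = 2.486) -> int:
    """Inverse square root method (Kock & Hadaya): minimum sample size
    needed to detect a path coefficient of magnitude p_min at the 5%
    significance level with 80% power (z ~= 2.486 for that setting)."""
    return math.ceil((z / abs(p_min)) ** 2)


# Values used in the paper: at most 5 indicators per construct and at most
# 3 structural predictors; p_min = 0.30 is an assumption made here.
print(n_min_ten_times_rule(max(5, 3)))   # -> 50
print(n_min_inverse_square_root(0.30))   # -> 69
```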

4. Results

4.1. Study Participants

A total of 100 subjects (68% male, 30% female, and 2% choosing not to disclose their gender) took part in the study. The sample was mainly (94%) composed of students, mostly from the Juraj Dobrila University of Pula, while the remaining 6% of study participants were not students. The age of respondents ranged from 18 to 40 years (M = 22.2, SD = 5.1). At the time the study was conducted, 75% of participants were between 18 and 23 years old. The largest share of participants (33%) were enrolled in the first year of undergraduate study, 27% were enrolled in the first year of graduate study, 18% were enrolled in the third year of undergraduate study, 15% were enrolled in the second year of graduate study, and 1% were enrolled in the second year of undergraduate study.
The gaming habits of the participants revealed that the largest group (34%) plays video games less than once a week. When they do play, the largest proportion of participants (43%) reported doing so for 2–4 h a day. On the other hand, 31% of participants stated that they play video games 2–4 times a week, while 41% indicated that they play video games less than 5 h a week. Gaming experience among the participants ranged from 0 to 30 years (M = 11, SD = 5.52).
In terms of genre preferences, action and shooter games are the most popular among study participants, as stated by over half of them (52%). A third of respondents play survival, strategy, and RPG games, while 30% of them prefer genres such as horror, racing, sports, fighting, etc. However, only 15% of study subjects reported platformers as their favorite game genre. When it comes to experience in playing platform video games, 39% of participants reported having a lot of experience, 49% stated having moderate experience, and 12% had little to no experience. Participants also shared their opinions on what they believe to be the most important aspect of video games. The largest proportion of participants (40%) indicated that fun is the most relevant aspect, followed by 31% of respondents who believe it is the story, 23% of participants who emphasized the importance of gameplay mechanics, and 6% of study subjects who prioritized graphics.

4.2. Model Assessment

The PLS-SEM path analysis algorithm calculates standardized partial regression coefficients within the structural model after approximating the parameters of the measurement model [73]. Consequently, a two-stage assessment of the psychometric properties of the proposed conceptual model was conducted. The quality of the measurement model was assessed by evaluating several factors: the reliability of indicators, internal consistency, convergent validity, and discriminant validity.
To assess indicator reliability, the standardized loadings of items with their corresponding constructs were explored. Hulland’s purification guidelines [74] suggest retaining items in the measurement model only if their standardized loadings are equal to or exceed 0.708. Items GMH4, LRN3, LRN5, UIS2, UIS5, ENG3, and BEH5 were eliminated from the measurement model and subsequent analysis since their loadings were below the recommended threshold. The confirmatory factor analysis (CFA) results, as shown in Table 1, indicate that the standardized loadings for all the remaining items in the measurement model are above the acceptable cut-off value. The standardized loadings of items comprising the measurement model ranged from 0.714 to 0.883, signifying that constructs explained between 50.98% and 77.97% of their items’ variance.
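As a reading aid, the proportion of an item’s variance explained by its construct is the square of its standardized loading, λ²; the boundary values reported above therefore follow directly from 0.714² ≈ 0.510 and 0.883² ≈ 0.780, and the 0.708 cut-off marks the point at which a construct explains at least half of an indicator’s variance (0.708² ≈ 0.50).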
The internal consistency of constructs was assessed using three indices: Cronbach’s alpha, composite reliability (rho_C), and the consistent reliability coefficient (rho_A). Cronbach’s alpha [75] serves as a lower bound estimate for construct reliability, assuming an equal weighting of items. Composite reliability [76], which considers actual item loadings, provides a more accurate internal consistency estimate compared with Cronbach’s alpha. Dijkstra and Henseler’s consistent reliability coefficient [77] is an approximately exact measure of construct reliability, acting as a middle ground between Cronbach’s alpha and composite reliability [78]. For these indices, values ranging from 0.60 to 0.70 are deemed satisfactory in exploratory research, while values between 0.70 and 0.95 indicate good internal consistency. However, values exceeding 0.95 suggest item redundancy that can negatively impact content validity [79]. Due to the inadequate phrasing of item ENG3, it was excluded from the measurement model, leading to acceptable values for all three internal consistency indices related to the player engagement construct. The same was done for the BEH5 item and the corresponding construct. As shown in Table 2, the calculated values for all three indices ranged between 0.672 and 0.916, suggesting acceptable to good internal consistency for all eight constructs in the research framework.
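For reference, the two bounding indices follow standard definitions (stated here for the reader’s convenience, not quoted from the paper): Cronbach’s alpha α = (k/(k − 1)) · (1 − Σσ²_i/σ²_t), where k is the number of items, σ²_i the variance of item i, and σ²_t the variance of the summed scale; and composite reliability ρ_C = (Σλ_i)²/[(Σλ_i)² + Σ(1 − λ²_i)], where λ_i are the standardized loadings. Because ρ_C weights items by their actual loadings instead of treating them as equal, it typically lies above α, which is why the two are read as lower and upper bounds with ρ_A in between.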
Convergent validity was examined with the Average Variance Extracted (AVE) criterion, as suggested by Hair et al. [78]. An AVE value of 0.50 or higher is deemed satisfactory because it indicates that the shared variance between a construct and its items surpasses the variance due to measurement error. As presented in the last column of Table 2, all constructs which constitute the research framework have met the requirement of this criterion thus signifying the robust convergent validity of the measurement model.
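The AVE itself is simply the mean of the squared standardized loadings of a construct’s items, AVE = (1/k) · Σλ²_i, so the 0.50 cut-off restates at the construct level the same requirement as the 0.708 loading threshold: on average, the construct should explain at least half of its indicators’ variance.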
Discriminant validity, which represents the degree to which a specific construct differs from the others in the model, was scrutinized using the Heterotrait–Monotrait (HTMT) ratio of correlations introduced by Henseler et al. [80]. This ratio is computed by dividing the average value of all the correlations of indicators measuring different constructs by the average value of the correlations of indicators measuring the same construct. For related constructs, discriminant validity is deemed absent if the HTMT value oversteps the 0.90 threshold. On the other hand, for conceptually distinct constructs, the cut-off value is reduced to 0.85 [79]. The study findings reported in Table 3 demonstrate that the HTMT values of all the constructs in the research framework are below the aforementioned respective thresholds, thereby meeting the requirement of the discriminant validity criterion.
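Stated formally (the standard definition by Henseler et al., paraphrased here), for a pair of constructs i and j the HTMT is the mean of the correlations between the items of i and the items of j, divided by the geometric mean of the average correlations among the items of i and among the items of j: HTMT_ij = r̄_hetero/√(r̄_mono,i · r̄_mono,j).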
After confirming that the quality of the measurement model was satisfactory, the appropriateness of the structural model was evaluated. This assessment involved analyzing collinearity, the significance of paths, the explanatory power of the research model, the effect size of exogenous constructs, the predictive power of the research model, and the predictive relevance of exogenous constructs.
Evaluating the structural model requires estimating numerous regression equations that depict the relationships between constructs. If two or more constructs within the structural model represent similar concepts, there is a risk of excessive collinearity, which could potentially result in skewed estimates of partial regression coefficients. The Variance Inflation Factor (VIF) is a widely used metric for detecting the presence of collinearity among predictor constructs in the structural model. While VIF values of 5 or higher indicate collinearity problems among exogenous constructs, issues may arise even with VIF values of 3 [78]. As a result, VIF values should ideally be close to or below 3. Table 4 shows that the VIF values for the predictor constructs range from 1.000 to 2.033, confirming the absence of collinearity in the structural model.
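For a predictor construct j, the VIF is computed as VIF_j = 1/(1 − R²_j), where R²_j results from regressing construct j on the other predictors of the same endogenous construct; the cut-off of 3 thus corresponds to roughly two-thirds of a predictor’s variance being shared with its co-predictors, while the largest observed value of 2.033 corresponds to about half.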
The explanatory power of the model is assessed using the coefficient of determination (R²), which illustrates the proportion of variance in endogenous constructs explained by their predictors. The acceptable R² values depend on the specific research discipline and study in question [81]. Orehovački [82] proposes that, in empirical research focused on software quality evaluation, R² values of 0.15, 0.34, and 0.46 signify weak, moderate, and substantial explanatory capacities of exogenous constructs within the research model, respectively. Adjusted R² is commonly interpreted instead of R² because it considers the size of the model [79]. The study results shown in Table 5 reveal that 67.7% of the variance in behavioral intention was accounted for by player enjoyment, player engagement, and gameplay mechanics; player enjoyment and gameplay mechanics explained 49.6% of the variance in player engagement; 30.4% of the variance in player enjoyment was accounted for by gameplay mechanics and visual elements; 27.8% of the variance in gameplay mechanics was explained by audio elements and visual elements; the user interface sensibility and gameplay mechanics accounted for 29% of the variance in learnability; 29.4% of the variance in user interface sensibility was explained by the visual elements; while 28.3% of the variance in the visual elements was accounted for by the audio elements.
The reported findings indicate that the determinants of behavioral intention and player engagement have a substantial explanatory power while the predictors of player enjoyment, gameplay mechanics, learnability, user interface sensibility, and visual elements demonstrate a weak explanatory power.
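For completeness, the adjustment mentioned above is the usual correction for model size, R²_adj = 1 − (1 − R²) · (n − 1)/(n − k − 1), where n is the sample size and k the number of predictor constructs; with n = 100 and at most three predictors per endogenous construct, the penalty is small, so the adjusted values can be read against the same thresholds as R².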
The hypothesized interplay among constructs in the research framework was examined by evaluating the magnitude and significance of the path coefficients. A bootstrapping resampling procedure was employed, utilizing asymptotic two-tailed t-statistics to evaluate the significance of the path coefficients. The number of cases equaled the sample size, while the number of bootstrap samples amounted to 5000. Table 6 presents the outcomes of hypothesis testing. The findings revealed that gameplay mechanics (β = 0.140, p < 0.05), player enjoyment (β = 0.556, p < 0.001), and player engagement (β = 0.242, p < 0.005) significantly influenced behavioral intention, thus corroborating hypotheses H10, H11, and H13, respectively. The data analysis also determined that audio elements (β = 0.256, p < 0.01) and the user interface sensibility (β = 0.403, p < 0.001) substantially impacted gameplay mechanics, hence supporting hypotheses H2 and H5. Furthermore, gameplay mechanics (β = 0.215, p < 0.05) and player enjoyment (β = 0.576, p < 0.001) contributed significantly to player engagement, thus confirming hypotheses H7 and H12. Additionally, the user interface sensibility (β = 0.233, p < 0.05) and gameplay mechanics (β = 0.399, p < 0.001) exhibited a notable effect on learnability, hence substantiating hypotheses H6 and H9, respectively. The visual elements of the video game (β = 0.274, p < 0.01) and its gameplay mechanics (β = 0.367, p < 0.001) considerably influenced player enjoyment, thus lending support to hypotheses H4 and H8. Finally, the study findings revealed that audio elements (β = 0.539, p < 0.001) are significant determinants of visual elements, which in turn (β = 0.549, p < 0.001) serve as a significant antecedent for the user interface sensibility, thereby providing support for hypotheses H1 and H3, respectively.
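To make the resampling logic concrete, the following Python sketch illustrates the general bootstrapping scheme on a toy bivariate example. It is a conceptual illustration under simplifying assumptions (a single correlation standing in for a PLS path coefficient), not a reproduction of the SmartPLS procedure used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)


def bootstrap_path(estimate, data, n_boot=5000):
    """Simple bootstrap of a single path coefficient: resample rows with
    replacement, re-estimate the coefficient on each resample, and form a
    two-tailed t-statistic from the bootstrap standard error."""
    n = len(data)
    original = estimate(data)
    boot = np.array([estimate(data[rng.integers(0, n, n)]) for _ in range(n_boot)])
    se = boot.std(ddof=1)
    return original, se, original / se  # |t| > 1.96 -> significant at the 5% level


# Toy usage: the "path" here is just the correlation (i.e., the standardized
# slope) between two simulated variables; the study itself obtained its path
# coefficients from the full PLS-SEM model in SmartPLS.
x = rng.normal(size=100)
y = 0.5 * x + rng.normal(scale=0.8, size=100)
data = np.column_stack([x, y])
corr = lambda d: np.corrcoef(d[:, 0], d[:, 1])[0, 1]
print(bootstrap_path(corr, data))
```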
The effect size (f²) represents the magnitude of the influence of an exogenous construct on an endogenous construct. An f² value of 0.02, 0.15, or 0.35 signifies a small, medium, or large effect, respectively [83]. Based on the f² values presented in Table 7, we can interpret the strength of relationships between the constructs for the given hypotheses. Audio elements considerably impact visual elements (f² = 0.409), while only having a minimal influence on gameplay mechanics (f² = 0.083). The visual aspects greatly contribute to the user interface sensibility (f² = 0.431) and marginally affect player enjoyment (f² = 0.078). The user interface sensibility moderately influences gameplay mechanics (f² = 0.207) and has a minor impact on learnability (f² = 0.060). Gameplay mechanics exert a weak effect on player engagement (f² = 0.069), a moderate influence on learnability (f² = 0.176), a negligible impact on behavioral intention (f² = 0.043), and a mild contribution to player enjoyment (f² = 0.140). Player enjoyment plays a crucial role in shaping both behavioral intention (f² = 0.486) and player engagement (f² = 0.495). Finally, player engagement has a minor, yet noticeable, effect on behavioral intention (f² = 0.093).
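The effect sizes reported above follow the standard definition f² = (R²_included − R²_excluded)/(1 − R²_included), i.e., the drop in an endogenous construct’s explained variance when the exogenous construct in question is removed from the structural model, relative to the variance that remains unexplained.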
The nonparametric cross-validated redundancy measure Q² by Stone [84] and Geisser [85], which utilizes the blindfolding reuse technique to predict endogenous construct items, has frequently been used in the literature to evaluate the predictive validity of exogenous constructs. However, since Q² integrates aspects of both out-of-sample forecasting and in-sample explanatory strength [86], it does not serve as a true measure of out-of-sample prediction [78]. To address this issue, Shmueli et al. [86,87] developed the PLSpredict algorithm as an alternative method for evaluating a model’s predictive relevance.
PLSpredict uses k-fold cross-validation, in which a fold is a subset of the total sample, and k represents the number of subsets. This method determines whether the model predicts better than the most naïve benchmark, namely the indicator means from the analysis sample, which is captured by the Q²predict statistic [78,79,86]. PLS path models with Q²predict values above 0 exhibit lower prediction errors compared with this simplest benchmark. Since Q²predict can be understood in a similar manner as Q², values surpassing 0, 0.25, and 0.5 indicate a small, medium, and large predictive relevance of the PLS path model, respectively [78].
The predictive strength of a model is usually evaluated using the root mean squared error (RMSE). However, when the distribution of prediction errors is notably non-symmetric, the mean absolute error (MAE) represents a suitable alternative [87]. This evaluation procedure involves comparing the RMSE (or MAE) values to a simple linear regression model (LM) benchmark. The outcomes from this comparison [87] could be as follows: (a) if the RMSE (or MAE) values surpass those of the simple LM benchmark across all items, this indicates the model lacks predictive strength; (b) if the majority of items in the endogenous construct exhibit larger prediction errors than the LM benchmark, it suggests the model has low predictive strength; (c) when a minority or equal number of construct items show higher prediction errors compared with the LM benchmark, it indicates the model has medium predictive strength; and (d) if none of the items demonstrate higher RMSE (or MAE) values than the LM benchmark, it infers the model has high predictive strength.
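The decision rule in (a)–(d) can be summarized in a few lines of code. The sketch below is illustrative only; the per-item MAE values in the usage example are hypothetical and not taken from Table 8.

```python
import numpy as np


def predictive_power_verdict(pls_mae, lm_mae):
    """Apply the decision rule described above to one endogenous construct:
    count the items for which the PLS-SEM prediction error (MAE here, since
    the error distributions were non-symmetric) exceeds the LM benchmark."""
    pls_mae, lm_mae = np.asarray(pls_mae), np.asarray(lm_mae)
    worse = int(np.sum(pls_mae > lm_mae))
    total = len(pls_mae)
    if worse == total:
        return "no predictive power"       # all items worse than LM
    if worse > total / 2:
        return "low predictive power"      # majority of items worse
    if worse > 0:
        return "medium predictive power"   # minority (or exactly half) worse
    return "high predictive power"         # no item worse than LM


# Hypothetical per-item MAE values, for illustration only (the actual values
# are reported in Table 8 of the paper).
print(predictive_power_verdict([0.61, 0.58, 0.64, 0.70],
                               [0.63, 0.60, 0.62, 0.72]))   # -> medium
```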
Upon visually inspecting the error histograms, it was revealed that the distribution of prediction errors is highly non-symmetric. Consequently, the predictive power evaluation was based on MAE. As displayed in Table 8’s fourth column, a minority of endogenous construct items exhibit higher PLS-SEM_MAE values when compared with the naïve LM_MAE benchmark. This indicates that the proposed model has medium predictive power.
Changes in Q²predict represent the relative influence (q²) of exogenous constructs on predicting the observed measures of endogenous constructs within the structural model. According to Henseler et al. [67], q² values of 0.02, 0.15, or 0.35 indicate that a specific exogenous construct has weak, moderate, or substantial relevance in predicting an endogenous construct, respectively. The calculation of q² values is performed as follows [83]:
$$ q^2 = \frac{Q^2_{\text{predict},I} - Q^2_{\text{predict},E}}{1 - Q^2_{\text{predict},I}} $$
Q²predict,I represents the value of Q²predict for an endogenous construct when the related exogenous construct is factored into the model calculation. On the other hand, Q²predict,E signifies the value of Q²predict for an endogenous construct when the associated exogenous construct is not considered in the model estimation. The findings presented in Table 9 indicate that player enjoyment (q² = 0.351) emerges as a robust predictor and player engagement (q² = 0.180) serves as a moderate antecedent, while gameplay mechanics (q² = 0.063) appear to be a weak determinant of a player’s intention to continue engaging with the game.

While player enjoyment (q² = 0.264) exhibits a moderate level of importance in predicting player engagement, gameplay mechanics (q² = 0.004) yield very poor predictive relevance for the same construct. Nevertheless, gameplay mechanics (q² = 0.181) display a moderate significance in forecasting player enjoyment, akin to visual elements (q² = 0.152). Audio elements (q² = 0.075) represent a weak predictor of gameplay mechanics, while the user interface sensibility (q² = 0.158) emerges as a moderate predictor in the same respect. Gameplay mechanics (q² = 0.189) exhibit a moderate degree of relevance concerning learnability, whereas the user interface sensibility (q² = 0.005) is identified as a very weak predictor of the same construct. Lastly, visual elements demonstrate a moderate level of importance (q² = 0.296) in predicting the user interface sensibility, while audio elements (q² = 0.253) prove to have a moderate significance in forecasting the quality of visual elements in the realm of platform video games.
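To make the calculation above concrete, a minimal sketch follows; it applies the same incremental form as f², only to Q²predict. The helper name and the Q²predict inputs are hypothetical, chosen for illustration rather than taken from the study.

```python
def relative_impact_q2(q2_included: float, q2_excluded: float) -> float:
    """q^2: change in an endogenous construct's Q^2_predict when a given
    exogenous construct is included in vs. omitted from the estimation."""
    return (q2_included - q2_excluded) / (1.0 - q2_included)

# Hypothetical Q^2_predict values, used for illustration only:
print(round(relative_impact_q2(q2_included=0.30, q2_excluded=0.18), 3))  # 0.171
```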

5. Discussion

The evaluation results for the proposed conceptual model show that gameplay mechanics, player engagement, and player enjoyment represent important facets of the gaming experience, since they affect players’ behavioral intentions. In the gaming industry, understanding players’ behavioral intentions is crucial for creating engaging experiences and encouraging positive actions, such as recommending the game to others, posting positive online reviews, or continuing to play the game [58,59]. Informative audio can enhance players’ performance and their interpretation of visual elements [43], while diegetic sound can create a more immersive experience [44]. The findings of our study suggest that audio elements are a strong determinant of visual aesthetics and a significant predictor of gameplay mechanics, highlighting the importance of sound design in video game development. We also found that visual elements significantly contribute to the sensibility of in-game user interfaces and to players’ enjoyment of video games, which is consistent with the results of the studies carried out by Guardini and Maninetti [29] and Schell [46]. In contrast to the findings of those studies, our research revealed that the majority of participants regarded game graphics as the least significant feature of a video game.
The outcomes of our empirical study also demonstrate that a well-designed user interface leads players to perceive gameplay mechanics as intuitive and responsive, aligning with the results of the studies carried out by Isbister [49] and Korhonen et al. [50]. Furthermore, a high-quality user interface not only enhances the understanding of gameplay mechanics but also improves the game’s learnability. This makes it easier for players to quickly grasp the game’s concepts and mechanics, resulting in an improved and more enjoyable gaming experience, which is in line with the results of existing studies [45,49,50,51]. The findings from the studies conducted by Ermi and Mäyrä [54], Alexiou and Schippers [55], Kiili [56], Hamari and Tuunanen [57], and Vorderer et al. [58] align with our research results, indicating that meticulously designed gameplay mechanics enhance player engagement and player enjoyment and improve the learnability of platform video games. Additionally, player enjoyment has been identified as a crucial factor in determining player engagement in platform video games, which is consistent with the findings of Abbasi et al. [61] and Wu and Liu [62].
The proposed research framework has demonstrated medium predictive power, which indicates that it can provide meaningful insights into the interplay of factors that shape the gaming experience and can reasonably anticipate how players are likely to perceive and respond to certain aspects of a game. However, it is important to note that the model is not infallible, and certain individual differences or contextual factors may affect the accuracy of its predictions. Individual nuances, including personal preferences, gaming environment, and gaming styles, as well as the subjective nature of user experience, can result in varied interpretations and responses.

6. Limitations and Future Work Directions

Given the empirical nature of our study, it is important to acknowledge and address several limitations that may impact the validity, generalizability, and applicability of the reported findings. The first limitation is the relatively small sample size. While the research results confirmed the hypothesized relations among dimensions of the gaming experience, the study should be replicated with a larger sample of participants to further examine the psychometric features of the measuring instrument and the introduced conceptual model.

The second limitation is the study’s focus on a single game. While this allowed for a detailed analysis of the game’s design and player experience, it limits the generalizability of the findings to other games. Furthermore, the study sample primarily comprised individuals from a single university in Croatia, predominantly students aged 18–23 years, so caution must be exercised when generalizing the results to other populations of game players. A related limitation is the study’s reliance on a single genre, the platformer; consequently, the generalizability of the findings to video games of other genres or those with different gameplay mechanics may be limited. Considering these limitations, further studies are needed that encompass games from different genres and utilize larger and more heterogeneous samples of game players. Such investigations will help validate the robustness of the reported findings and enable the drawing of sound conclusions.

An additional limitation is the study’s sole focus on the evaluation of direct links between variables. For instance, we analyzed the direct relationship between gameplay mechanics and player engagement and, separately, between player engagement and behavioral intention. While this approach provides valuable insights into how individual components interact, it neglects potential indirect effects that may also play significant roles. For example, gameplay mechanics might indirectly influence behavioral intention through player engagement, forming a mediation pathway that was not evaluated in this study (a brief illustrative sketch of how such an indirect effect could be examined is given at the end of this section). The lack of examination of indirect effects could obscure a more complex and potentially more enlightening picture of how various factors interact within the gaming experience. Future research could explore these indirect links to provide a more holistic understanding of the gaming experience.

Lastly, it is important to acknowledge that self-reported questionnaires are susceptible to response bias and social desirability bias, which may compromise the validity of our findings. To mitigate this limitation, future research could incorporate additional methods, such as observational studies or interviews, to obtain a thorough comprehension of the gaming experience. By employing multiple data collection approaches, a more nuanced interpretation of the findings can be achieved.
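For illustration, the indirect effect mentioned above is conventionally estimated as the product of the component path coefficients, with a bootstrap confidence interval around that product. The sketch below simulates this idea only: the distributions are drawn around the path coefficients reported in Table 6 purely for demonstration, and their spread is invented, whereas a real analysis would re-estimate the PLS path model on each bootstrap resample.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical bootstrap draws of the two component paths of the mediation
# pathway GMH -> ENG -> BEH; the means follow the path coefficients reported
# in Table 6, but the spread is invented for illustration purposes.
gmh_to_eng = rng.normal(loc=0.215, scale=0.09, size=5000)
eng_to_beh = rng.normal(loc=0.242, scale=0.08, size=5000)

indirect = gmh_to_eng * eng_to_beh                 # product-of-coefficients estimate
ci_low, ci_high = np.percentile(indirect, [2.5, 97.5])

print(f"indirect effect ~ {indirect.mean():.3f}, 95% CI [{ci_low:.3f}, {ci_high:.3f}]")
# A confidence interval that excludes zero would point to a mediation effect
# worth examining in future work.
```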

7. Conclusions

In the realm of video games, well-integrated audio elements markedly enhance the visual aspects of play, resulting in a more immersive and captivating gaming experience. Skillfully orchestrated audio also elevates the gameplay mechanics, amplifying the joy and interactivity within the game. Furthermore, visually appealing and thoughtfully crafted elements extend their impact to the user interface, rendering it more intuitive and user-friendly, and contribute to the immersive and enjoyable nature of the game.

Within this interplay, a well-crafted and intuitive user interface plays a vital role in supporting the gameplay mechanics, effectively increasing the game’s engagement and interactivity. Such an interface also facilitates the learning process, enabling players to swiftly comprehend the game’s mechanics and concepts. Moreover, expertly crafted gameplay mechanics serve as a catalyst for heightened player engagement and immersion; their seamless integration not only makes the game more enjoyable and immersive but also expedites learning.

Diligently created gameplay mechanics also positively influence players’ intentions to further engage with the game, and this influence is reinforced by a high level of enjoyment. A heightened sense of enjoyment not only stimulates players’ intention to delve deeper into the game but also enhances their overall engagement and immersion. As engagement intensifies, players’ intentions to continue their involvement with the game are further solidified.

In this interconnected web, higher levels of engagement act as a driving force that directly fuels players’ intentions to prolong their interaction with the game. In short, audio elements enhance the visual aspects and bolster the gameplay mechanics, visually appealing elements enrich the user interface, an intuitive user interface facilitates learning, and well-structured gameplay mechanics amplify engagement, which, coupled with a heightened level of enjoyment, fortifies players’ intentions to continue their journey within the game.
By comprehending the intricate relationships between these elements, game developers can craft games that captivate and engage players on a profound level. The findings from this study shed light on these influential factors, providing valuable insights for developers to prioritize and enhance the core elements that shape the gaming experience. Through a meticulous integration of audio and visual elements, intuitive user interfaces, and precision-engineered gameplay mechanics, developers can create games that leave a lasting impression, enticing players to embark on a thrilling and unforgettable gaming adventure.
The findings of this study hold important implications for both researchers and game developers. For researchers, the study highlights the significance of considering the holistic gaming experience and the interplay between its dimensions. It also emphasizes the need for comprehensive and multidimensional assessments when investigating the factors that contribute to the player experience. Researchers can expand upon these findings by investigating additional variables, such as emotional engagement and social interaction, as well as exploring different contexts, including diverse game genres and a heterogeneous structure of the player sample. This broader approach can provide a more thorough understanding of the gaming experience and help uncover additional factors that contribute to player engagement, enjoyment, and behavioral intention.
For developers, this study provides valuable insights into the key elements that contribute to an appealing and enjoyable gaming experience. It emphasizes the importance of investing in artfully orchestrated audio elements, visually appealing graphics, intuitive user interfaces, and engaging gameplay mechanics. By prioritizing these aspects during game design, developers can create immersive and captivating experiences that resonate with players. The findings also underscore the significance of designing games that are not only visually stimulating but also intuitive and easy to learn, ensuring that players can quickly grasp the mechanics and fully engage with the game.
Furthermore, this study highlights the role of player enjoyment, engagement, and behavioral intention in shaping the success of a video game. Developers can leverage these findings to create games that not only provide entertainment but also foster a strong sense of enjoyment and engagement among players. By understanding the factors that influence players’ intentions to continue playing and recommending the game to others, developers can design games that have a lasting impact and attract a loyal player base.
Overall, the implications of this study call for a holistic approach to video game design and research, where the interrelationships between various elements are carefully considered. By incorporating these insights into their work, researchers and developers can collaboratively advance the field and create games that captivate and engage players on a deeper level.

Author Contributions

Conceptualization, L.B., T.O. and A.Ž.; methodology, T.O., L.B. and A.Ž.; software, L.B.; validation, T.O. and L.B.; formal analysis, L.B. and T.O.; investigation, L.B. and A.Ž.; development, A.Ž. and L.B.; resources, T.O.; data curation, L.B. and T.O.; writing—original draft preparation, L.B. and A.Ž.; writing—review and editing, T.O.; visualization, L.B. and A.Ž.; supervision, T.O.; project administration, T.O.; funding acquisition, T.O. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Research Ethics Committee of Juraj Dobrila University of Pula (3 May 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

  • Gameplay mechanics (GMH)
GMH1. The game’s mechanics have been skillfully crafted.
GMH2. The game’s mechanics are intuitive.
GMH3. The game’s mechanics are dynamic.
GMH4. The game’s mechanics are responsive.
GMH5. The mechanics allowed me to interact with the game in a meaningful way.
  • Learnability (LRN)
LRN1. The game is easy to learn.
LRN2. The game’s objectives were clear.
LRN3. Controls in the game were easy to master.
LRN4. The game’s tutorial effectively explained how to play the game.
LRN5. I encountered very few obstacles while playing the game.
  • User interface sensibility (UIS)
UIS1. The game’s interface is well organized.
UIS2. The size of the in-game user interface was appropriate.
UIS3. I did not feel overwhelmed by the game’s user interface.
UIS4. The game’s user interface is easy to navigate.
UIS5. I felt informed about in-game progress and objectives while playing.
  • Visual elements (VIS)
VIS1. I found the game’s visuals appealing.
VIS2. The game’s visuals successfully portrayed the game’s setting.
VIS3. The game’s visuals reflected the genre well.
VIS4. The game’s visuals made the game stand out.
VIS5. The game’s visuals enhanced the overall gaming experience.
  • Audio elements (AUD)
AUD1. Music and sound effects captured the essence of the game’s setting.
AUD2. The sounds in the game were believable.
AUD3. The ambient sounds have contributed to the game atmosphere.
AUD4. The sound effects in the game were meticulously orchestrated.
AUD5. Sound and music enriched the overall gaming experience.
  • Player engagement (ENG)
ENG1. I was fully engaged while playing the game.
ENG2. I felt absorbed while playing the game.
ENG3. I lost track of time while playing the game.
ENG4. The game kept my attention for a longer time.
ENG5. The story in the game kept me interested.
  • Player enjoyment (ENY)
ENY1. The game is fun.
ENY2. The game is pleasant to play.
ENY3. Playing the game gives me a sense of satisfaction.
ENY4. Playing the game gives me a sense of accomplishment.
ENY5. Playing the game gives me a feeling of happiness.
  • Behavioral intention (BEH)
BEH1. I am looking forward to replaying this game in the future.
BEH2. I am inclined to recommend this game to others in my gaming community.
BEH3. I intend to share my experience with the game on social media.
BEH4. I plan to try other games from the same developer.
BEH5. I am interested in playing other games of the same genre.
  • Note that items BEH5, ENG3, GMH4, LRN3, LRN5, UIS2, and UIS5 (set in italics in the published version) were removed from the research framework because they failed to meet the requirements of reliability indices.
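For readers who wish to reuse the instrument, the retained items can be organized as a simple mapping from constructs to item codes and aggregated into construct scores. The sketch below is illustrative only: it covers two constructs, assumes numeric Likert-type ratings (the response format is not restated here), and uses invented responses; it is not part of the original study.

```python
# Partial mapping from constructs to the item codes listed above; the remaining
# constructs can be added analogously.
instrument = {
    "GMH": ["GMH1", "GMH2", "GMH3", "GMH4", "GMH5"],
    "ENY": ["ENY1", "ENY2", "ENY3", "ENY4", "ENY5"],
}

def construct_scores(responses: dict) -> dict:
    """Average one respondent's item ratings into a score per construct,
    skipping constructs with missing items."""
    return {
        construct: sum(responses[item] for item in items) / len(items)
        for construct, items in instrument.items()
        if all(item in responses for item in items)
    }

# Invented example ratings for a single respondent:
example = {"GMH1": 4, "GMH2": 5, "GMH3": 4, "GMH4": 3, "GMH5": 4,
           "ENY1": 5, "ENY2": 4, "ENY3": 4, "ENY4": 3, "ENY5": 5}
print(construct_scores(example))   # {'GMH': 4.0, 'ENY': 4.2}
```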

References

  1. Granic, I.; Lobel, A.; Engels, R.C.M.E. The Benefits of playing video games. Am. Psychol. 2014, 69, 66–78. [Google Scholar] [CrossRef]
  2. Anderson, C.A.; Dill, K.E. Video games and aggressive thoughts, feelings, and behavior in the laboratory and in life. J. Pers. Soc. Psychol. 2000, 78, 772–790. [Google Scholar] [CrossRef] [PubMed]
  3. Jang, Y.; Park, E. An adoption model for virtual reality games: The roles of presence and enjoyment. Telemat. Inform. 2019, 42, 101239. [Google Scholar] [CrossRef]
  4. Hamari, J.; Sjöblom, M. What is eSports and why do people watch it? Internet Res. 2017, 27, 2–26. [Google Scholar] [CrossRef] [Green Version]
  5. Halbrook, Y.J.; O’Donnell, A.T.; Msetfi, R.M. When and How Video Games Can Be Good: A Review of the Positive Effects of Video Games on Well-Being. Perspect. Psychol. Sci. 2019, 14, 1096–1104. [Google Scholar] [CrossRef] [PubMed]
  6. Kirkpatrick, K. Can AI Demonstrate Creativity? Commun. ACM 2023, 66, 21–23. [Google Scholar] [CrossRef]
  7. Anantrasirichai, N.; Bull, D. Artificial intelligence in the creative industries: A review. Artif. Intell. Rev. 2022, 55, 589–656. [Google Scholar] [CrossRef]
  8. Bernhaupt, R. User Experience Evaluation in Entertainment. In Evaluating User Experience in Games; Bernhaupt, R., Ed.; Springer: London, UK, 2010; pp. 3–7. [Google Scholar] [CrossRef]
  9. Kuniavsky, M. User Experience and HCI. In The Human-Computer Interaction Handbook, 2nd ed.; Sears, A., Jacko, J.A., Eds.; CRC Press: Boca Raton, FL, USA, 2007; pp. 897–917. [Google Scholar] [CrossRef]
  10. Calleja, G. In-Game: From Immersion to Incorporation; The MIT Press: Cambridge, MA, USA, 2011; pp. 30–47. [Google Scholar] [CrossRef]
  11. Drozina, A.; Orehovački, T. Creating a Tabletop Game Prototype in Unreal Engine 4. In Proceedings of the 41st International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 21–25 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1814–1819. [Google Scholar] [CrossRef]
  12. Barr, P.; Noble, J.; Biddle, R. Video games values: Human-computer interaction and games. Interact. Comput. 2007, 19, 180–195. [Google Scholar] [CrossRef]
  13. Sweetser, P.; Wyeth, P. GameFlow: A model for evaluating player enjoyment in games. Comput. Entertain. 2005, 3, 3. [Google Scholar] [CrossRef]
  14. Ijsselsteijn, W.A.; de Kort, Y.A.W.; Poels, K. The Game Experience Questionnaire. Human-Technology Interaction (HTI) Group, Eindhoven University of Technology (TU/e). 2013. Available online: https://research.tue.nl/en/publications/the-game-experience-questionnaire (accessed on 18 April 2023).
  15. Nagalingam, V.; Ibrahim, R. User Experience of Educational Games: A Review of the Elements. Procedia Comput. Sci. 2015, 72, 423–433. [Google Scholar] [CrossRef] [Green Version]
  16. Blašković, L.; Žužić, A.; Orehovački, T. Stranded Away: Implementation and User Experience Evaluation of an Indie Platformer Game Developed Using Unity Engine. In Proceedings of the 46th International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 22–26 May 2023; pp. 96–101. [Google Scholar]
  17. Barr, M.; Copeland-Stewart, A. Playing Video Games During the COVID-19 Pandemic and Effects on Players’ Well-Being. Games Cult. 2022, 17, 122–139. [Google Scholar] [CrossRef]
  18. Šag, A.; Orehovački, T. Development of 2D Game with Construct 2. In Proceedings of the 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1916–1921. [Google Scholar] [CrossRef]
  19. Sršen, M.; Orehovački, T. Developing a Game Engine in C# Programming Language. In Proceedings of the 44th International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 24–28 May 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1972–1977. [Google Scholar] [CrossRef]
  20. Jennett, C.; Cox, L.A.; Cairns, P.; Dhoparee, S.; Epps, A.; Tijs, T.; Walton, A.M. Measuring and Defining the Experience of the Immersion in Games. Int. J. Hum. Comput. 2008, 66, 641–661. [Google Scholar] [CrossRef]
  21. Drachen, A.; Canossa, A.; Yannakakis, N.A. Player modeling using self-organization in Tomb Raider: Underworld. In Proceedings of the IEEE Symposium on Computational Intelligence and Games, Milan, Italy, 7–10 September 2009; IEEE: Piscataway, NJ, USA, 2009. [Google Scholar] [CrossRef] [Green Version]
  22. Nacke, L.; Drachen, A. Towards a Framework of Player Experience Research. In Proceedings of the 2nd International Workshop on Evaluating Player Experience in Games at FDG, Bordeaux, France, 29 June–1 July 2011. [Google Scholar]
  23. Yee, N. Motivations for Play in Online Games. CyberPsychology Behav. 2006, 9, 772–775. [Google Scholar] [CrossRef] [PubMed]
  24. Bošnjak, M.; Orehovački, T. Measuring Quality of an Indie Game Developed Using Unity Framework. In Proceedings of the 41st International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 21–25 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1820–1825. [Google Scholar] [CrossRef]
  25. Šajina, R.; Orehovački, T. User Experience Evaluation of 2D Side-Scrolling Game Developed Using Overlap2D Game Editor and LibGDX Game Engine. In Proceedings of the 41st International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 21–25 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1826–1831. [Google Scholar] [CrossRef]
  26. Cairns, P.; Cox, L.A.; Nordin, I. Immersion in Digital Games: Review of Gaming Experience Research. In Handbook of Digital Games; Angelides, C.M., Agius, H., Eds.; IEEE Press: Piscataway, NJ, USA, 2014; pp. 339–362. [Google Scholar] [CrossRef]
  27. Koivisto, J.; Hamari, J. Demographic differences in perceived benefits from gamification. Comput. Hum. Behav. 2014, 35, 179–188. [Google Scholar] [CrossRef]
  28. Koivisto, J.; Hamari, J. The rise of motivational information systems: A review of gamification research. Int. J. Inform. Manag. 2019, 45, 191–210. [Google Scholar] [CrossRef]
  29. Guardini, P.; Maninetti, P. Better Game Experience Through Game Metrics: A Rally Videogame Case Study. In Game Analytics: Maximizing the Value of Player Data; Seif El-Nasr, M., Drachen, A., Canossa, A., Eds.; Springer: London, UK, 2013; pp. 325–361. [Google Scholar] [CrossRef]
  30. Mekler, D.E.; Brühlmann, F.; Touch, N.A.; Opwis, K. Towards understanding the effects of individual gamification elements on intrinsic motivation and performance. Comput. Hum. Behav. 2017, 71, 525–534. [Google Scholar] [CrossRef]
  31. Schaffer, O.; Fang, X. Digital Game Enjoyment: A Literature Review. In Lecture Notes in Computer Science, Proceedings of the 1st International Conference HCI-Games, Orlando, FL, USA, 26–31 July 2019; Fang, X., Ed.; Springer: Cham, Switzerland, 2019; Volume 11595, pp. 191–214. [Google Scholar] [CrossRef]
  32. Orehovački, T.; Babić, S. Inspecting Quality of Games Designed for Learning Programming. In Lecture Notes in Computer Science, Proceedings of the 2nd International Conference on Learning and Collaboration Technologies, LCT 2015, Los Angeles, CA, USA, 2–7 August 2015; Zaphiris, P., Ioannou, A., Eds.; Springer: Cham, Switzerland, 2015; Volume 9192, pp. 620–631. [Google Scholar] [CrossRef]
  33. Orehovački, T.; Babić, S. Evaluating the Quality of Games Designed for Learning Programming by Students with Different Educational Background: An Empirical Study. In Proceedings of the 38th International Convention on Information and Communication Technology, Electronics and Microelectronics, Opatija, Croatia, 28 May 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 963–968. [Google Scholar] [CrossRef]
  34. Deci, E.L.; Ryan, R.M. Self-determination theory. In Handbook of Theories of Social Psychology; Van Lange, P.A.M., Kruglanski, A.W., Higgins, E.T., Eds.; Sage Publications: Thousand Oaks, CA, USA, 2012; Volume 1, pp. 416–436. [Google Scholar] [CrossRef]
  35. Ryan, R.M.; Rigby, C.S.; Przybylski, A. The Motivational Pull of Video Games: A Self-Determination Theory Approach. Motiv. Emot. 2006, 30, 344–360. [Google Scholar] [CrossRef]
  36. Phan, M.K.; Keebler, J.R.; Chaparro, B.S. The Development and Validation of the Game User Experience Satisfaction Scale (GUESS). Hum. Factors 2016, 58, 1217–1247. [Google Scholar] [CrossRef]
  37. Mandryk, R.L.; Atkins, M.S. A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies. Int. J. Hum. Comput. 2007, 65, 329–348. [Google Scholar] [CrossRef]
  38. Almeida, S.; Mealha, O.; Veloso, A. Video game scenery analysis with eye tracking. Entertain. Comput. 2016, 14, 1–13. [Google Scholar] [CrossRef]
  39. Karavidas, L.; Apostolidis, H.; Tsiatsos, T. Usability Evaluation of an Adaptive Serious Game Prototype Based on Affective Feedback. Information 2022, 13, 425. [Google Scholar] [CrossRef]
  40. Petri, G.; von Wangenheim, C.G. MEEGA+: A Method for the Evaluation of the Quality of Games for Computing Education. In Proceedings of the SBGames, Rio de Janeiro, Brazil, 28–31 October 2019; pp. 1384–1387. [Google Scholar]
  41. Consalvo, M.; Dutton, M. Game analysis: Developing a methodological toolkit for the qualitative study of games. Game Stud. Int. J. Comput. Game Res. 2006, 6, 56–70. [Google Scholar]
  42. Unity Documentation. Available online: https://docs.unity3d.com/Manual/index.html (accessed on 4 April 2023).
  43. Ng, P.; Nesbitt, K. Informative Sound Design in Video Games. In Proceedings of the 9th Australasian Conference on Interactive Entertainment Matters of Life and Death, Melbourne, Australia, 30 September 2013; ACM: New York, NY, USA, 2013; pp. 1–9. [Google Scholar] [CrossRef]
  44. Jørgensen, K. How Audio Affects Player Action. In A Comprehensive Study of Sound in Computer Games, 1st ed.; Jørgensen, K., Jensen, K.B., Eds.; Edwin Mellen Press: Lewiston, NY, USA, 2009; Volume 3, pp. 120–157. [Google Scholar]
  45. Nacke, L.E.; Grimshaw, M.N.; Lindley, C.A. More than a feeling: Measurement of sonic user experience and psychophysiology in a first-person shooter game. Interact. Comput. 2010, 22, 336–343. [Google Scholar] [CrossRef]
  46. Schell, J. The Art of Game Design, 1st ed.; CRC Press: Boca Raton, FL, USA, 2008; pp. 402–421. [Google Scholar]
  47. Miszal, S.; Schild, J. Visual Delegate Generalization Frame—Evaluating Impact of Visual Effects and Elements on Player and User Experiences in Video Games and Interactive Virtual Environments. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April 2022; pp. 1–20. [Google Scholar] [CrossRef]
  48. Birk, M.V.; Atkins, C.; Bowey, J.T.; Mandryk, R.L. Fostering Intrinsic Motivation through Avatar. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7 May 2016; pp. 2982–2995. [Google Scholar] [CrossRef]
  49. Isbister, K. How Games Move Us: Emotion by Design, Reprint; The MIT Press: Cambridge, MA, USA, 2017; pp. 150–175. [Google Scholar]
  50. Korhonen, H.; Montola, M.; Arrasvuori, J. Understanding playful user experience through digital games. Int. J. Arts Technol. 2009, 2, 204–222. [Google Scholar]
  51. González Sánchez, J.L.; Zea, N.P.; Gutiérrez, F.L. From Usability to Playability: Introduction to Player-Centred Video Game Development Process. In Human Centered Design, Proceedings of First International Conference, HCD 2009, Held as Part of HCI International, San Diego, CA, USA, 19–24 July 2009; Kurosu, M., Ed.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2009; Volume 5619, pp. 65–74. [Google Scholar] [CrossRef]
  52. Poretski, L.; Tang, A. Press A to Jump: Design Strategies for Video Game Learnability. In Proceedings of the 2022 CHI Conference of human Factors in Computing Systems, New Orleans, LA, USA, 30 April–6 May 2022; pp. 1–26. [Google Scholar] [CrossRef]
  53. Atorf, D.; Kannegieser, E.; Roller, W. Study on Enhancing Learnability of a Serious Game by Implementing a Pedagogical Agent. In Games and Learning Alliance. GALA 2019. Lecture Notes in Computer Science; Liapis, A., Yannakakis, G., Gentile, M., Ninaus, M., Eds.; Springer: Cham, Switzerland, 2019; Volume 11899, pp. 158–168. [Google Scholar] [CrossRef]
  54. Ermi, L.; Mäyrä, F. Fundamental Components of the Gameplay Experience: Analysing Immersion. In DIGAREC Keynote-Lectures 2009/10; Günzel, S., Liebe, M., Mersch, D., Eds.; University Press: Potsdam, Germany, 2011; pp. 88–115. [Google Scholar]
  55. Alexiou, A.; Schippers, M.C. Digital game elements, user experience and learning: A conceptual framework. Educ. Inf. Technol. 2018, 23, 2545–2567. [Google Scholar] [CrossRef] [Green Version]
  56. Kiili, K. Digital game-based learning: Towards an experiential gaming model. Internet High. Educ. 2005, 8, 13–24. [Google Scholar] [CrossRef]
  57. Hamari, J.; Tuunanen, J. Player Types: A Meta-synthesis. Trans. Digit. Games Res. Assoc. 2014, 1, 29–53. [Google Scholar] [CrossRef]
  58. Vorderer, P.; Klimmt, C.; Ritterfeld, U. Enjoyment: At the heart of media entertainment. Commun. Theory 2004, 14, 388–408. [Google Scholar] [CrossRef]
  59. Przybylski, A.K.; Rigby, C.S.; Ryan, R.M. A motivational model of video game engagement. Rev. Gen. Psychol. 2010, 14, 154–166. [Google Scholar] [CrossRef] [Green Version]
  60. Zhao, F.; Huang, Q. A Conceptual Model of Online Game Continuance Playing. In Lecture Notes in Computer Science, Proceedings of 17th International Conference, HCI International 2015, Los Angeles, CA, USA, 2–7 August 2015; Kurosu, M., Ed.; Springer: Cham, Switzerland, 2015; Volume 9170, pp. 660–669. [Google Scholar] [CrossRef]
  61. Abbasi, A.Z.; Ting, D.H.; Hlavacs, H. Proposing a New Conceptual Model Predicting Consumer Videogame Engagement Triggered Through Playful-Consumption Experiences. In Lecture Notes in Computer Science, Proceedings of the International Conference on Entertainment Computing ICEC 2016: Entertainment Computing, Vienna, Austria, 28–30 September 2016; Wallner, G., Kriglstein, S., Hlavacs, H., Malaka, R., Lugmayr, A., Yang, H.S., Eds.; Springer: Cham, Switzerland, 2016; Volume 9926, pp. 126–134. [Google Scholar] [CrossRef] [Green Version]
  62. Wu, J.; Liu, D. The effects of trust and enjoyment on intention to play online games. J. Electron. Commer. Res. 2007, 8, 128–140. Available online: http://www.jecr.org/node/159 (accessed on 14 May 2023).
  63. Lapaš, T.; Orehovački, T. Evaluation of User Experience in Interaction with Computer Games. In Lecture Notes in Computer Science, Proceedings of the 4th International Conference on Design, User Experience, and Usability, DUXU 2015, Los Angeles, CA, USA, 2–7 August 2015; Marcus, A., Ed.; Springer: Cham, Switzerland, 2015; Volume 9188, pp. 271–282. [Google Scholar] [CrossRef]
  64. O’Brien, H.L.; Toms, E. What is User Engagement? A Conceptual Framework for Defining User Engagement with Technology. J. Assoc. Inf. Sci. Technol. 2008, 59, 938–955. [Google Scholar] [CrossRef] [Green Version]
  65. Hair, J.F.; Ringle, C.M.; Sarstedt, M. PLS-SEM: Indeed a Silver Bullet. J. Mark. Theory Pract. 2011, 19, 139–152. [Google Scholar] [CrossRef]
  66. Hair, J.F.; Sarstedt, M.; Ringle, C.M.; Mena, J.A. An assessment of the use of partial least squares structural equation modeling in marketing research. J. Acad. Mark. Sci. 2012, 40, 414–433. [Google Scholar] [CrossRef]
  67. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The use of partial least squares path modeling in international marketing. Adv. Int. Mark. 2009, 20, 277–319. [Google Scholar] [CrossRef] [Green Version]
  68. Tenenhaus, M.; Esposito Vinzi, V.; Chatelin, Y.-M.; Lauro, C. PLS path modeling. Comput. Stat. Data Anal. 2005, 48, 159–205. [Google Scholar] [CrossRef]
  69. Gefen, D.; Straub, D.; Boudreau, M.C. Structural equation modeling and regression: Guidelines for research practice. Commun. Assoc. Inf. Syst. 2000, 4, 2–77. [Google Scholar] [CrossRef] [Green Version]
  70. Kock, N.; Hadaya, P. Minimum sample size estimation in PLS-SEM: The inverse square root and gamma-exponential methods. Inf. Syst. J. 2018, 28, 227–261. [Google Scholar] [CrossRef]
  71. Hair, J.F.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 3rd ed.; Sage: Thousand Oaks, CA, USA, 2022. [Google Scholar]
  72. Ringle, C.M.; Wende, S.; Becker, J.-M. SmartPLS 4. Oststeinbek: SmartPLS GmbH. 2023. Available online: http://www.smartpls.com (accessed on 18 April 2023).
  73. Esposito Vinzi, V.; Trinchera, L.; Amato, S. PLS path modeling: From foundations to recent developments and open issues for model assessment and improvement. In Handbook of Partial Least Squares; Esposito Vinzi, V., Chin, W.W., Henseler, J., Wang, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 47–82. [Google Scholar] [CrossRef]
  74. Hulland, J. Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strateg. Manag. J. 1999, 20, 195–204. [Google Scholar] [CrossRef]
  75. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef] [Green Version]
  76. Werts, C.E.; Linn, R.L.; Jöreskog, K.G. Intraclass Reliability Estimates: Testing Structural Assumptions. Educ. Psychol. Meas. 1974, 34, 25–33. [Google Scholar] [CrossRef]
  77. Dijkstra, T.K.; Henseler, J. Consistent partial least squares path modeling. MIS Q. 2015, 39, 297–316. [Google Scholar] [CrossRef]
  78. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  79. Russo, D.; Stol, K.-J. PLS-SEM for Software Engineering Research: An Introduction and Survey. ACM Comput. Surv. 2022, 54, 1–38. [Google Scholar] [CrossRef]
  80. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef] [Green Version]
  81. Götz, O.; Liehr-Gobbers, K.; Krafft, M. Evaluation of structural equation models using the Partial Least Squares (PLS) approach. In Handbook of Partial Least Squares; Esposito Vinzi, V., Chin, W.W., Henseler, J., Wang, H., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 691–711. [Google Scholar] [CrossRef]
  82. Orehovački, T. Methodology for Evaluating the Quality in Use of Web 2.0 Applications. Ph.D. Thesis, University of Zagreb, Faculty of Organization and Informatics, Varaždin, Croatia, 2013. [Google Scholar]
  83. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1988. [Google Scholar]
  84. Stone, M. Cross-validatory choice and assessment of statistical predictions. J. R. Stat. Soc. B 1974, 36, 111–147. [Google Scholar] [CrossRef]
  85. Geisser, S. The predictive sample reuse method with applications. J. Am. Stat. Assoc. 1975, 70, 320–328. [Google Scholar] [CrossRef]
  86. Shmueli, G.; Ray, S.; Estrada, J.M.V.; Chatla, S.B. The elephant in the room: Predictive performance of PLS models. J. Bus. Res. 2016, 69, 4552–4564. [Google Scholar] [CrossRef]
  87. Shmueli, G.; Sarstedt, M.; Hair, J.F.; Cheah, J.-H.; Ting, H.; Vaithilingam, S.; Ringle, C.M. Predictive model assessment in PLS-SEM: Guidelines for using PLSpredict. Eur. J. Mark. 2019, 53, 2322–2347. [Google Scholar] [CrossRef]
Figure 1. The opening scene of the Stranded Away video game.
Figure 2. Research model with corresponding hypotheses.
Table 1. Standardized factor loadings and cross-loadings of items.

Item | AUD | BEH | ENG | ENJ | GMH | LRN | UIS | VIS
AUD1 | 0.796 | 0.294 | 0.424 | 0.330 | 0.293 | 0.237 | 0.174 | 0.292
AUD2 | 0.822 | 0.345 | 0.304 | 0.420 | 0.329 | 0.100 | 0.205 | 0.399
AUD3 | 0.744 | 0.156 | 0.170 | 0.241 | 0.326 | 0.117 | 0.374 | 0.530
AUD4 | 0.823 | 0.196 | 0.156 | 0.236 | 0.333 | 0.209 | 0.261 | 0.497
AUD5 | 0.714 | 0.169 | 0.203 | 0.213 | 0.139 | 0.060 | 0.119 | 0.270
BEH1 | 0.234 | 0.889 | 0.659 | 0.713 | 0.574 | 0.350 | 0.409 | 0.399
BEH2 | 0.260 | 0.819 | 0.573 | 0.693 | 0.452 | 0.175 | 0.285 | 0.391
BEH3 | 0.283 | 0.764 | 0.551 | 0.589 | 0.298 | 0.137 | 0.089 | 0.194
BEH4 | 0.177 | 0.768 | 0.459 | 0.570 | 0.436 | 0.244 | 0.269 | 0.294
ENG1 | 0.214 | 0.466 | 0.799 | 0.482 | 0.383 | 0.165 | 0.313 | 0.306
ENG2 | 0.120 | 0.462 | 0.795 | 0.441 | 0.374 | 0.220 | 0.234 | 0.196
ENG4 | 0.288 | 0.631 | 0.828 | 0.620 | 0.425 | 0.183 | 0.264 | 0.337
ENG5 | 0.324 | 0.630 | 0.787 | 0.619 | 0.447 | 0.266 | 0.205 | 0.250
ENY1 | 0.320 | 0.709 | 0.649 | 0.840 | 0.518 | 0.297 | 0.500 | 0.457
ENY2 | 0.276 | 0.652 | 0.601 | 0.795 | 0.537 | 0.353 | 0.497 | 0.455
ENY3 | 0.272 | 0.668 | 0.563 | 0.860 | 0.384 | 0.300 | 0.366 | 0.342
ENY4 | 0.350 | 0.591 | 0.457 | 0.771 | 0.257 | 0.184 | 0.256 | 0.360
ENY5 | 0.313 | 0.653 | 0.546 | 0.871 | 0.385 | 0.257 | 0.294 | 0.319
GMH1 | 0.367 | 0.453 | 0.416 | 0.433 | 0.841 | 0.489 | 0.440 | 0.436
GMH2 | 0.253 | 0.406 | 0.368 | 0.345 | 0.742 | 0.345 | 0.323 | 0.354
GMH3 | 0.256 | 0.421 | 0.469 | 0.438 | 0.742 | 0.285 | 0.335 | 0.459
GMH5 | 0.293 | 0.416 | 0.327 | 0.368 | 0.757 | 0.448 | 0.383 | 0.410
LRN1 | 0.031 | 0.137 | 0.069 | 0.192 | 0.299 | 0.765 | 0.271 | 0.130
LRN2 | 0.192 | 0.337 | 0.313 | 0.349 | 0.421 | 0.763 | 0.357 | 0.317
LRN4 | 0.186 | 0.166 | 0.192 | 0.235 | 0.444 | 0.797 | 0.348 | 0.219
UIS1 | 0.346 | 0.274 | 0.274 | 0.397 | 0.386 | 0.235 | 0.825 | 0.453
UIS3 | 0.243 | 0.162 | 0.110 | 0.270 | 0.335 | 0.209 | 0.800 | 0.449
UIS4 | 0.217 | 0.369 | 0.361 | 0.487 | 0.471 | 0.554 | 0.883 | 0.477
VIS1 | 0.327 | 0.381 | 0.316 | 0.405 | 0.474 | 0.362 | 0.613 | 0.833
VIS2 | 0.515 | 0.260 | 0.165 | 0.336 | 0.372 | 0.198 | 0.435 | 0.819
VIS3 | 0.412 | 0.303 | 0.255 | 0.331 | 0.376 | 0.252 | 0.440 | 0.805
VIS4 | 0.462 | 0.325 | 0.306 | 0.364 | 0.457 | 0.210 | 0.344 | 0.839
VIS5 | 0.498 | 0.373 | 0.362 | 0.488 | 0.523 | 0.197 | 0.407 | 0.814
Note that bold values (each item’s loading on its own construct) represent standardized factor loadings; the remaining values are cross-loadings.
Table 2. Convergent validity and internal consistency of constructs.

Constructs | Cronbach’s Alpha | rho_A | rho_C | AVE
Audio elements (AUD) | 0.844 | 0.862 | 0.886 | 0.610
Behavioral intention (BEH) | 0.827 | 0.840 | 0.885 | 0.659
Player engagement (ENG) | 0.818 | 0.827 | 0.879 | 0.644
Player enjoyment (ENJ) | 0.885 | 0.891 | 0.916 | 0.686
Gameplay mechanics (GMH) | 0.773 | 0.781 | 0.855 | 0.596
Learnability (LRN) | 0.672 | 0.677 | 0.818 | 0.600
User interface sensibility (UIS) | 0.789 | 0.824 | 0.875 | 0.700
Visual elements (VIS) | 0.880 | 0.882 | 0.912 | 0.676
Table 3. Heterotrait–Monotrait ratio of correlations (HTMT).

 | AUD | BEH | ENG | ENJ | GMH | LRN | UIS
BEH | 0.355
ENG | 0.369 | 0.821
ENJ | 0.427 | 0.920 | 0.781
GMH | 0.444 | 0.679 | 0.637 | 0.606
LRN | 0.263 | 0.362 | 0.328 | 0.425 | 0.687
UIS | 0.361 | 0.384 | 0.373 | 0.536 | 0.603 | 0.533
VIS | 0.589 | 0.460 | 0.397 | 0.525 | 0.649 | 0.370 | 0.653
Note that only the lower triangle of the matrix is reported; values in each row correspond, from left to right, to the constructs listed in the header.
Table 4. Results of testing collinearity among exogenous constructs in the structural model.

AUD → GMH: 1.109; AUD → VIS: 1.000
ENG → BEH: 2.026
ENJ → BEH: 2.033; ENJ → ENG: 1.360
GMH → BEH: 1.454; GMH → ENG: 1.360; GMH → ENJ: 1.409; GMH → LRN: 1.304
UIS → GMH: 1.109; UIS → LRN: 1.304
VIS → ENJ: 1.409; VIS → UIS: 1.000
Note that values are listed as exogenous construct → endogenous construct.
Table 5. Results of testing the explanatory power of the research model.

Endogenous Constructs | R² | R² Adjusted
Behavioral intention (BEH) | 0.687 | 0.677
Player engagement (ENG) | 0.506 | 0.496
Player enjoyment (ENJ) | 0.318 | 0.304
Gameplay mechanics (GMH) | 0.292 | 0.278
Learnability (LRN) | 0.304 | 0.290
User interface sensibility (UIS) | 0.301 | 0.294
Visual elements (VIS) | 0.290 | 0.283
Table 6. Results of hypotheses testing.

Hypothesis | Path Coefficient | T Statistic | p-Value | Supported?
H1. AUD → VIS | 0.539 | 8.235 | 0.000 | Yes
H2. AUD → GMH | 0.256 | 2.655 | 0.008 | Yes
H3. VIS → UIS | 0.549 | 5.982 | 0.000 | Yes
H4. VIS → ENJ | 0.274 | 2.673 | 0.008 | Yes
H5. UIS → GMH | 0.403 | 3.839 | 0.000 | Yes
H6. UIS → LRN | 0.233 | 2.080 | 0.038 | Yes
H7. GMH → ENG | 0.215 | 2.292 | 0.022 | Yes
H8. GMH → ENJ | 0.367 | 4.042 | 0.000 | Yes
H9. GMH → LRN | 0.399 | 3.682 | 0.000 | Yes
H10. GMH → BEH | 0.140 | 2.015 | 0.044 | Yes
H11. ENJ → BEH | 0.556 | 6.249 | 0.000 | Yes
H12. ENJ → ENG | 0.576 | 7.163 | 0.000 | Yes
H13. ENG → BEH | 0.242 | 2.923 | 0.003 | Yes
Table 7. Results of testing the effect size.

AUD → GMH: 0.083; AUD → VIS: 0.409
ENG → BEH: 0.093
ENJ → BEH: 0.486; ENJ → ENG: 0.495
GMH → BEH: 0.043; GMH → ENG: 0.069; GMH → ENJ: 0.140; GMH → LRN: 0.176
UIS → GMH: 0.207; UIS → LRN: 0.060
VIS → ENJ: 0.078; VIS → UIS: 0.431
Note that f² values are listed as exogenous construct → endogenous construct.
Table 8. Results of testing the predictive power of the research model.

Items | Q²predict | PLS-SEM_RMSE | PLS-SEM_MAE | LM_RMSE | LM_MAE
BEH1 | 0.048 | 1.082 | 0.904 | 1.125 | 0.934
BEH2 | 0.059 | 0.984 | 0.819 | 1.056 | 0.865
BEH3 | 0.074 | 1.385 | 1.190 | 1.387 | 1.152
BEH4 | 0.026 | 1.109 | 0.914 | 1.118 | 0.905
ENG1 | 0.038 | 0.794 | 0.606 | 0.806 | 0.622
ENG2 | 0.002 | 0.976 | 0.798 | 0.975 | 0.794
ENG4 | 0.070 | 1.083 | 0.931 | 1.096 | 0.906
ENG5 | 0.087 | 1.040 | 0.860 | 1.003 | 0.824
ENY1 | 0.090 | 0.752 | 0.598 | 0.791 | 0.642
ENY2 | 0.065 | 0.762 | 0.614 | 0.782 | 0.641
ENY3 | 0.070 | 1.047 | 0.855 | 1.065 | 0.840
ENY4 | 0.102 | 1.061 | 0.890 | 1.108 | 0.913
ENY5 | 0.089 | 1.077 | 0.894 | 1.095 | 0.890
GMH1 | 0.114 | 0.712 | 0.590 | 0.710 | 0.595
GMH2 | 0.047 | 0.835 | 0.654 | 0.855 | 0.681
GMH3 | 0.046 | 0.853 | 0.703 | 0.900 | 0.735
GMH5 | 0.070 | 0.631 | 0.528 | 0.645 | 0.532
LRN1 | 0.022 | 0.772 | 0.645 | 0.769 | 0.616
LRN2 | 0.029 | 0.844 | 0.697 | 0.885 | 0.741
LRN4 | 0.029 | 0.617 | 0.529 | 0.633 | 0.523
UIS1 | 0.098 | 0.535 | 0.494 | 0.534 | 0.464
UIS3 | 0.042 | 0.903 | 0.729 | 0.925 | 0.752
UIS4 | 0.031 | 0.849 | 0.696 | 0.882 | 0.736
VIS1 | 0.068 | 0.892 | 0.694 | 0.855 | 0.655
VIS2 | 0.234 | 0.695 | 0.507 | 0.708 | 0.495
VIS3 | 0.145 | 0.654 | 0.491 | 0.673 | 0.504
VIS4 | 0.187 | 0.763 | 0.600 | 0.789 | 0.604
VIS5 | 0.216 | 0.764 | 0.593 | 0.771 | 0.595
Note that bold values (in the published version) signify that endogenous construct items exhibit higher prediction errors, in terms of RMSE or MAE, in comparison with the naïve LM benchmark.
Table 9. Results of testing the predictive relevance of exogenous constructs.

AUD → GMH: 0.075; AUD → VIS: 0.253
ENG → BEH: 0.180
ENJ → BEH: 0.351; ENJ → ENG: 0.264
GMH → BEH: 0.063; GMH → ENG: 0.004; GMH → ENJ: 0.181; GMH → LRN: 0.189
UIS → GMH: 0.158; UIS → LRN: 0.005
VIS → ENJ: 0.152; VIS → UIS: 0.296
Note that q² values are listed as exogenous construct → endogenous construct.