Article

Applying Eye-Tracking Technology to Measure Interactive Experience Toward the Navigation Interface of Mobile Games Considering Different Visual Attention Mechanisms

School of Business Administration, Northeastern University, No. 195 Chuangxin Road, Hunnan District, Shenyang 110167, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2019, 9(16), 3242; https://doi.org/10.3390/app9163242
Submission received: 1 July 2019 / Revised: 2 August 2019 / Accepted: 6 August 2019 / Published: 8 August 2019
(This article belongs to the Special Issue Human-Computer Interaction and 3D Face Analysis)

Abstract: As an initial channel through which users learn about a mobile game, the interactive experience of the navigation interface directly affects users' first impression of the game and their subsequent behaviors and willingness to use it. This study investigates players' visual attention mechanisms toward mobile game navigation interfaces of different interactive-experience levels under free-browsing and task-oriented conditions. Eye-tracking glasses and a questionnaire were used to measure the interactive experience of mobile games. The results show that in the free-browsing condition, the fixation count, saccade count, and average saccade amplitude can be used to reflect and predict the interactive experience of a mobile game's navigation interface, while in the task-oriented condition, the fixation count, first fixation duration, dwell time ratio, and saccade count can be used. These findings suggest that, in addition to choosing appropriate eye movement indicators, players' motivations should be considered when designing a game's navigation interface.

1. Introduction

With the popularity of smart mobile devices in recent years, the number of mobile game players has increased rapidly, and mobile games have become an indispensable form of entertainment in people's daily lives [1]. The portability and casual nature of mobile games enable people to play and enjoy themselves anytime and anywhere. Mobile games have therefore become lucrative, attracting substantial investment in the mobile game industry. At present, there are more than 20,000 mobile games online, but only a small fraction have achieved great success [2]. Evidence from Appsee, a mobile user data analysis company, revealed that the average first-month player retention rate for mobile games was only 22%, and more than 50% of players abandoned a game due to a poor initial impression [3]. Generally, the navigation interface serves as the initial channel and provides primary guidance information for players. It directly affects players' first impression of the game and their subsequent behaviors and willingness to use it [4,5,6].
In earlier game design practice, designers often built the navigation interface based on their own experience and knowledge, which cannot meet players' real needs [7]. Today, more and more gamers realize the importance of other players' subjective feelings, which may subsequently influence their intention to play. To support consumer-centric design of mobile games, a great deal of research has tried to measure players' feelings while they play. However, traditional measurement methods, such as interviews, questionnaires, and focus groups, are too subjective to reflect players' psychological state and perception process during play [8]. Intrusive continuous measurements may change the state of the game experience, making it difficult to ensure that results are authentic at every moment of the experience; think-aloud is a typical intrusive technique [9]. Questionnaires and interviews are helpful for overall feedback after the game experience, but they cannot respond reliably and without loss to events and features that occurred minutes earlier. Having players review a recording of their own gameplay attempts to solve this loss of experience information: the player can think aloud during the review, which helps reveal cognitive processing and preferences, but it still cannot capture the player's in-the-moment behavior and emotional reactions [10]. A fundamental problem with these language-based methods is that emotional experience is not primarily linguistic, and translating emotional experiences into language requires cognitive effort that may itself affect the measurement.
Some scholars have tried to adopt biometric methods to measure players' emotional processing during play [11,12,13]. These methods provide designers with players' real-time emotional responses to a game's functionality and design. Moreover, biometric methods are not invasive and allow the game to be played without disruption. Accordingly, biometric measures, including eye movement, heart rate, blood pressure, facial muscle activity, and brain imaging, are widely used to capture people's emotional responses. The effectiveness of a software interface design can thus be assessed by monitoring the physiological activity players generate while browsing or using different interfaces. The interactive experience involves the interaction of multiple sensory systems with a product, such as visual, tactile, auditory, and even olfactory perception [14]. These sensory systems act as channels for receiving information when using, or expecting to use, the product. The interactive experience is the process by which the user's senses receive and process product information in a particular interactive environment, and that environment strongly influences the outcome [15]. Vision is the first channel through which people receive information, and it is also the most direct and fastest sensory system for taking in information. Most feelings are shaped by initial visual perception, and vision is the most important sense in the product-selection experience [16]. As a mature physiological measurement method, eye-tracking technology can reflect the user's intake of product information by recording eyeball data [17,18]. Eye movement data cannot lie, and tracking it makes it possible to measure a player's real-time attention allocation.
Studies have shown that vision is of great importance for the willingness to purchase products and the willingness of consumers to make consumption decisions [19,20]. Eye-tracking technology has been widely used in marketing, web design, and brand packaging, and has gradually been applied to improve the interactive experience of games in recent years [21]. For example, the shortcomings of game level design can be found according to the player’s gaze hotspot map, and the player’s eye movement track is also used to identify the invalid operation in the game [22,23].
People's visual information processing mechanisms can be divided into top-down and bottom-up [24,25]. Top-down visual attention allocation is usually accompanied by a purpose or task; an intrinsic purpose drives people to pay attention selectively [26]. Bottom-up attention allocation is usually guided by stimuli, such as visual saliency (color, contrast, and so on), object size, and visual position [27]. In this study, when players open the game and look at the navigation interface, they are in a state of free browsing, and the visual information processing mechanism is bottom-up. That is, the player's attention is guided by the navigation interface stimuli, such as the logo pattern, the contrast of the background color, and the position of the controls. Afterward, players begin to interact with the game, such as looking for the leaderboard and adding friends, which shifts their visual information processing into the top-down mechanism. In sum, the interactive experience of the navigation interface involves both visual mechanisms: the player first freely browses the interface information (bottom-up), and then relies on past or cognitive experience for purposeful exploration (top-down). The latter stage requires players to perform actual operations, so the fluency, entertainment, and usability of the navigation interface affect players' feelings and behavior [28].
To provide reasonable suggestions for the improvement and design of game interfaces, it is important to effectively measure players' dynamic emotional experience while playing. In this study, we used eye-tracking technology to measure these feelings as participants played games with navigation interfaces of different interactive-experience levels. Several eye movement indicators were identified that reflect players' emotional responses in the free-browsing and task-oriented modes and that can be used to predict the interactive experience of the game interface.

2. Method

2.1. Participants

The study complied with departmental ethics committee regulations. Through online recruitment on the website of the Human Factors Engineering Laboratory of Northeastern University, 26 students aged 24–30 years (M = 26.5, SD = 2.15) were recruited for this eye movement experiment. All participants had normal or corrected-to-normal visual acuity and no eye diseases. All participants used smartphones and had experience with mobile games, playing more than 20 min per week on average. They were not familiar with the experimental materials, which avoided effects of memory and prior experience on the experiment. Before the experiment, all participants voluntarily signed informed consent forms, and after the experiment each received a gift worth 20 RMB as a reward.

2.2. Apparatus

This experiment used the ETG (eye-tracking glasses) 2w wireless system produced by SMI, Germany. The system consisted of eye-tracking glasses, vision correction lenses, and an HP (Hewlett-Packard) data acquisition and analysis workstation. The experiment also required a smartphone (iPhone 6S Plus) to load and run the mobile games. Data acquisition was performed with iView ETG 2.2, and the data were exported with BeGaze 3.6. The experiment used a sampling rate of 120 Hz and three-point calibration (the instructor asked participants to fix their eyes on three randomly selected points on the screen and verified the calibration with additional target points).

2.3. Stimuli

The quality of the experimental material is critical to the success of this research, so the following requirements were imposed. First, the selected stimuli should be of the same type and theme. Second, the selected games should differ in interactive experience. Finally, highly popular mobile games should be avoided to prevent familiarity effects among participants. Accordingly, 16 entertainment games were selected from the mobile game sharing community TapTap: casual match puzzle games with similar elements (fruits, gems, animal avatars, building blocks, mahjong, cards, etc.) that players pair and eliminate in order to win. Ten experts, including game designers, doctoral students focusing on user experience, and senior mobile game players, were invited to expert group interviews, which finally identified four mobile games (Monster Elimination, Animal Elimination, Love Every Day, Seaside Entertainment) as eye-movement stimuli (Figure 1); Monster Elimination and Love Every Day were defined as high interactive experience, and Animal Elimination and Seaside Entertainment as low interactive experience.

2.4. Eye-Movement Measures

The overall presentation area of the navigation interface was defined as the area of interest (AOI) for analyzing eye movement data. The ETG 2w uses two eye cameras to capture infrared light reflected by the eyes and thereby obtains participants' eye movement data. It provides various indicators, such as first fixation duration, number of gaze points, fixation duration, average pupil diameter, saccade amplitude, blink count, saccade frequency, blink time, number of saccades, gaze time, etc. Following previous studies [29,30], the following indicators were selected for the AOI: first fixation duration (length of the first fixation in the AOI), dwell time ratio (proportion of time spent in the AOI), fixation count (number of fixations in the AOI longer than 200 ms), saccade count (number of eye jumps between fixation points in the AOI), and average saccade amplitude (average spatial distance between fixation points in the AOI).
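To make these definitions concrete, the five indicators could be computed from an ordered list of AOI fixations roughly as follows. This is an illustrative sketch, not the implementation used by SMI's BeGaze software; the fixation tuples and the 200 ms threshold follow the definitions above, and the sample data are hypothetical.

```python
from math import dist  # Euclidean distance, Python 3.8+

def aoi_indicators(fixations, total_time_ms):
    """Compute the five AOI indicators from an ordered list of fixations,
    each given as (x, y, duration_ms). Saccades are approximated as
    transitions between consecutive fixations within the AOI."""
    durations = [d for (_, _, d) in fixations]
    points = [(x, y) for (x, y, _) in fixations]

    fixation_count = sum(1 for d in durations if d > 200)  # fixations > 200 ms
    first_fixation_duration = durations[0]                 # length of first fixation
    dwell_time_ratio = sum(durations) / total_time_ms      # share of time in AOI
    saccade_count = len(points) - 1                        # jumps between fixations
    hops = [dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    avg_saccade_amplitude = sum(hops) / len(hops) if hops else 0.0

    return {
        "fixation_count": fixation_count,
        "first_fixation_duration": first_fixation_duration,
        "dwell_time_ratio": dwell_time_ratio,
        "saccade_count": saccade_count,
        "avg_saccade_amplitude": avg_saccade_amplitude,
    }

# Hypothetical AOI fixations over a 1 s window:
fx = [(0, 0, 250), (3, 4, 180), (3, 8, 300)]
print(aoi_indicators(fx, total_time_ms=1000))
```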

2.5. Procedure

This research adopted a within-subjects experimental design. The procedure and precautions were introduced before the experiment. Participants were asked to sit comfortably and hold the lab phone naturally. The experimenter also asked about each participant's vision, including whether they had astigmatism, were far- or near-sighted, and to what degree. Participants with astigmatism were replaced. For far- or near-sighted participants, a correction lens matching their degree was selected and fixed in place. After this, eye calibration was performed with the three-point method, with calibration error below 0.5° [31].
Experimental task: the participant opened the game and freely browsed the mobile game navigation interface for 20 s. The instructor then asked the participant to complete three tasks in the navigation interface: turning off the game sound, opening the invite-friends interface, and finding game level 30. Following these steps, the participant played the four selected games, with a three-minute rest after the second game. It took about eight minutes to finish all the tasks. Each participant then completed eight questionnaires: the participant reopened each previously experienced mobile game and used the mobile game user experience scale to separately evaluate the free-browsing mode and the task-oriented mode for each game. The experimental procedure and scenario are shown in Figure 2 and Figure 3.

3. Results

The subjective evaluation and eye movement data were analyzed with the statistical software SPSS 20.0. The relationship between subjective evaluation and eye movement data was analyzed using partial least squares (PLS). Repeated-measures ANOVA was used to analyze the stimulus categories (high and low interactive experience). Two participants' data were excluded because of missing data during the experiment, leaving 24 participants (12 males; M = 26.3; SD = 2.38) for the final analysis. Where data did not follow a normal or approximately normal distribution, the criteria and data transformation methods followed the literature [32].
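The two-level within-subjects comparisons used throughout the paper can be illustrated with a minimal sketch: with only two conditions (high vs. low interactive experience), a one-way repeated-measures ANOVA is mathematically equivalent to a paired t-test on per-participant differences, with F = t². The scores below are hypothetical, not the study's data.

```python
from scipy import stats

# Hypothetical subjective scores from the same six participants on the
# high- and low-interactive-experience interfaces (within-subjects design).
high = [82, 76, 90, 85, 79, 88]
low = [70, 68, 81, 74, 72, 77]

# With two within-subject levels, repeated-measures ANOVA reduces to a
# paired t-test on per-participant differences; the ANOVA F equals t**2.
t_stat, p_value = stats.ttest_rel(high, low)
f_stat = t_stat ** 2
print(f"t = {t_stat:.3f}, F = {f_stat:.3f}, p = {p_value:.4f}")
```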

3.1. Subjective Evaluation

After completing the eye movement experiment tasks, participants were asked to evaluate the two experimental modes of the four games using a mature mobile game experience evaluation scale from the literature [33]. The score of each dimension (feedback, immersion, challenge, social, mobile, control) was the sum of the scores of the items within that dimension, so the participant's interactive experience of the navigation interface was represented by the sum of all dimension scores. The 24 participants' scores on each game were averaged according to the level of interactive experience, and the results were analyzed by repeated-measures ANOVA for each mode. The results are shown in Table 1.
The results showed a significant difference in subjective interactive experience in both the free-browsing and task-oriented modes (p < 0.001). Participants' scores were higher when operating navigation interfaces with high interactive experience. This result also verified the validity of the game selection.

3.2. Eye-Movement Outcomes

According to the interactive experience level of the navigation interface, one-way repeated-measures ANOVA was performed on the selected eye movement indicators. The independent variable was the interactive experience level, and the mean values of the eye movement indicators in each mode were taken as the dependent variables. The results for the free-browsing and task-oriented modes are shown in Table 2 and Table 3, respectively. The heat maps for the two modes are shown in Figure 4.
As can be seen from Table 2 and Table 3, in the free-browsing mode there were significant differences in three eye movement indicators, namely fixation count, saccade count, and average saccade amplitude, between the two levels of interactive experience (p < 0.05), while no significant effects were found for first fixation duration or dwell time ratio. In the task-oriented mode, fixation count, first fixation duration, dwell time ratio, and saccade count all differed significantly between the two levels (p < 0.05), while average saccade amplitude showed no significant effect. The heat maps (Figure 4) help to understand and compare visual information processing across the two experimental modes and interactive experience levels [34]. Based on the subjective evaluations, the two games with the largest difference between the two modes were selected (Monster Elimination and Animal Elimination). For ease of observation, only one participant's data is shown (participant 1).

3.3. Regression Analysis of Eye-Movement Outcomes and Subjective Evaluation

To further measure the interactive experience and extract eye movement indicators that can predict the interactive experience of the navigation interface, the study used the PLS (partial least squares) method to establish a regression equation between subjective evaluation and eye movement metrics. PLS is a data analysis technique that finds the best functional match for a set of data by minimizing the sum of squared errors [35].
The eye movement indicators with significant effects in the two experimental modes were respectively used as independent variables, with the corresponding subjective scores as dependent variables. The PLS regression was performed in MATLAB, yielding the following regression equations between the subjective scores and the eye movement indicators:
Free browsing mode: y1 = 126.5607 − 0.0418 x1 − 0.8541 x4 − 5.3526 x5
Task-oriented mode: y2 = 38.1893 + 0.0611 x1 − 0.0216 x2 + 3.2734 x3 + 0.1231 x4
where y1 and y2 stand for the subjective scores in the free-browsing and task-oriented modes, and x1–x5 stand for the eye movement indicators: fixation count, first fixation duration, dwell time ratio, saccade count, and average saccade amplitude, respectively.
The regression equations show that the fixation count, saccade count, and average saccade amplitude in the free-browsing mode all have a negative effect on the interactive experience, while in the task-oriented mode, the fixation count, dwell time ratio, and saccade count have a positive effect and the first fixation duration has a negative effect. The PLS results were consistent with the eye movement measurements, which further validates the feasibility of eye-tracking technology for measuring the interactive experience of the navigation interface.
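For concreteness, the two fitted equations can be applied directly to a participant's indicator values; a minimal sketch with the coefficients reported above (the indicator values passed in below are hypothetical):

```python
def predict_free_browsing(x1, x4, x5):
    """Predicted subjective score in free-browsing mode.
    x1: fixation count, x4: saccade count, x5: average saccade amplitude."""
    return 126.5607 - 0.0418 * x1 - 0.8541 * x4 - 5.3526 * x5

def predict_task_oriented(x1, x2, x3, x4):
    """Predicted subjective score in task-oriented mode.
    x1: fixation count, x2: first fixation duration (ms),
    x3: dwell time ratio, x4: saccade count."""
    return 38.1893 + 0.0611 * x1 - 0.0216 * x2 + 3.2734 * x3 + 0.1231 * x4

# Hypothetical indicator values for one participant:
print(predict_free_browsing(x1=40, x4=25, x5=3.5))
print(predict_task_oriented(x1=60, x2=220, x3=0.85, x4=45))
```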
To assess the accuracy of the regression equations, four additional participants were recruited and completed the same experimental procedure. Their experimental data were compared with values calculated from the regression equations, and paired-sample t-tests were conducted on the actual and predicted values. The results are shown in Table 4 and Table 5, respectively.
According to the data in Table 4 and Table 5, the significance level between the actual and predicted values was greater than 0.05, indicating no significant difference between them. Therefore, the regression equations provide accurate and valid predictions.

4. Discussion

4.1. The Eye-Movement Indicator Reflecting Interactive Experience in the Free Browsing Mode

This experiment aimed to identify eye-movement indicators that can predict or measure the interactive experience of mobile game navigation interfaces. In the free-browsing mode, the results showed significant differences in fixation count, saccade count, and average saccade amplitude between the two levels of stimuli, but the differences in first fixation duration and dwell time ratio were not significant.
The navigation interfaces differing in interactive experience produced a significant difference in fixation count, consistent with other eye-movement experiments used to detect participants' emotional states. Alshehri [36] studied participants freely viewing cartoon clips; the fixation count differed significantly between positive and negative stimuli, with positive clips inducing a smaller fixation count. Studies have shown that fixation count is related to cognitive control, and cognitive control is related to emotional processing while viewing clips. Luo [37] used eye-tracking technology to explore the role of visual cues in multimedia, self-paced online teaching; the fixation count was used to record students' attention-guidance process, and it differed significantly depending on the presence of visual cues, being lower on learning interfaces with cues. The bottom-up attention mechanism is typically stimulus-driven, so it is reasonable to speculate that the stimuli on the navigation interface with high interactive experience brought the player more valence and arousal. The interface was clear and easy to read, which made browsing more fluent and kept the fixation count relatively low.
Research shows that saccade count is an important indicator of the player's comprehension of information [38]. When interface information is difficult to understand, it induces a higher saccade count, while easy-to-understand information induces a lower one [39]. In the free-browsing mode, the navigation interface with high interactive experience had richer design elements, a more reasonable layout, and greater overall visual appeal, which helped players better understand the game information and therefore induced a lower saccade count. The difference in saccade count can also be explained in terms of players' evaluations of an interactive interface. In the context of shopping websites, Guo [40] conducted an eye-movement experiment and found that users' emotional evaluation of a website was significantly negatively correlated with saccade count; that is, websites with high valence and arousal induced fewer saccades. A game interface with high interactive experience is bound to bring the player a high level of emotional experience, especially on first viewing: within the 20 s of free browsing, the participant formed an overall impression of the navigation interface's appearance, and the emotional differences excited by different interfaces were reflected in the subjective evaluation results. This is another reason why navigation interfaces with high interactive experience induced significantly fewer saccades than those with low interactive experience.
The average saccade amplitude is an important indicator of interface usability [41]. A higher average saccade amplitude indicates that the user can reach the goal efficiently and that the target search path encounters fewer obstacles or confusion [42]. However, in the free-browsing mode of this experiment, the average saccade amplitude induced by the high interactive experience was significantly lower than that of the low interactive experience. The main reason is that free browsing involved no specific task goals. Judging from the properties of the stimulus material itself, the controls, logos, and background images on the high-experience navigation interface were more detailed and vivid; moreover, the number of controls was high, the background was composed of multiple superimposed images, and the picture was dynamic. This layout and construction produced a large information load on the interface, and in an untargeted state the user was more likely to be attracted by its various elements [43].
Conversely, the design of the low-experience navigation interface was limited: the structure was simple, the picture static, and the visual load low. An interface with a larger information load induces a larger fixation area and a more complicated scan path during free browsing [44]. According to the calculation formula of the average saccade amplitude [45], it is reasonable to speculate that in the free-browsing mode, the high interactive experience led the user's fixation area and scan path to include more fixation points. Because the high-experience interface was more attractive, the transfer distances between its fixation points were significantly shorter than on the low-experience interface. Therefore, the average saccade amplitude differed significantly between interactive experience levels in the free-browsing condition.
The repeated-measures ANOVA indicated that the two levels of stimuli in the free-browsing mode showed no significant differences in first fixation duration or dwell time ratio. Studies have shown that first fixation duration and dwell time ratio have no significant relationship with the user's ultimate preference [46]. Knickerbocker [47] studied differences in eye-movement indicators when users read words of different emotional valence (positive, neutral, and negative); first fixation duration and dwell time ratio did not differ between the positive and negative conditions but increased significantly for neutral words. The experimental materials selected in this research were positive stimuli. Ding [48] applied eye-tracking technology to product design evaluation; in the free-browsing state, the first fixation duration and dwell time ratio induced by smartphones with different user experience showed no significant differences. The free-browsing mode in this experiment followed the same paradigm as Ding's, and according to the subjective evaluation results there were no neutral stimuli in the experimental materials. Hence, these two eye-movement indicators showed no significant differences between navigation interfaces with different interactive experience.

4.2. The Eye-Movement Indicator Reflecting Interactive Experience in the Task-Oriented Mode

The results of the task-oriented mode showed significant differences in fixation count, first fixation duration, dwell time ratio, and saccade count between the two levels of stimuli, and no significant difference in average saccade amplitude.
The fixation count induced by the high-experience navigation interface was higher than that of the low-experience one, which is contrary to the free-browsing result. Lu [49] studied the influence of interface information on users' visual search; the fixation count and fixation duration induced by an interface containing a large amount of information were significantly higher than those of an interface with less information. Goyal [50] used eye-movement data to predict consumer behavior: participants stated a product preference after browsing a set of shopping pictures, and fixation count differed significantly between preferred and non-preferred products. Users showed higher fixation counts for preferred products, so fixation count could be used to predict consumers' preference decisions. The navigation interface of a mobile game with high interactive experience contained various design elements and a larger amount of information; it attracted users' attention and increased the likelihood of preference behavior. Therefore, the user's fixation count was higher, in line with previous eye-movement studies.
Uzzaman [38] used eye-tracking technology to study differences in users' engagement during reading tasks. When users focused more on the task, their eye movements were more complicated and the saccade count was high; eye movements became simple and the saccade count low when users were absent-minded. In this experiment, participants were more likely to be immersed in the high-experience navigation interface, so they stayed focused and completed the task itself. The low-experience navigation interface had low gameplay value and little attraction, and participants found it difficult to avoid distraction and mind-wandering during the task. Therefore, the saccade count differed significantly between interactive experience levels, with the low interactive experience inducing the lower count. Krejtz [41] studied the influence of interface complexity on players' eye movements; the saccade count induced by a game interface with high visual load was significantly higher than that with low visual load, regardless of whether hints were present, and players were more immersed and had more fun without hints. For the navigation interfaces selected here, a uniform no-hint procedure was used, and users had to rely on their own experience and cognitive ability to find the targets and complete the tasks. The high interactive experience, with its abundant on-screen information, required users to invest a large visual load. Interface complexity is therefore also an important factor affecting saccade count, consistent with the conclusions of this study.
A well-designed interface enables players to quickly find the desired target, avoiding intermediate fixations along the scan path and resulting in a larger saccade amplitude [51]. If the information provided is meaningless or misleading, the average saccade amplitude will be small: the user explores the interface further before a meaningful prompt appears, producing more useless fixations. The average saccade amplitude is calculated by dividing the sum of the distances between consecutive fixations by the number of fixations minus one; all saccades contribute to this sum, with no minimum length criterion. In the task-oriented mode, there was no significant difference in average saccade amplitude between the two levels of navigation interface (p = 0.383). This shows that although users needed a broader search in the high-experience interface, the average search step sizes were similar across experience levels. The low-experience interface contained little information, so participants locked onto the task target more easily and made few fixations; the high-experience interface contained a large amount of information, but its reasonable functional layout helped users visually search for the correct target without increasing the number of fixation points.
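The calculation rule described above can be written compactly: for an ordered scan path of fixation points f1, …, fn, the average saccade amplitude is

```latex
\bar{A} = \frac{1}{n-1} \sum_{i=1}^{n-1} \left\lVert f_{i+1} - f_i \right\rVert
```

where each term is the Euclidean distance between consecutive fixations, so every saccade contributes regardless of its length.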
The results of the repeated measures ANOVA showed significant differences in the first fixation duration and dwell time ratio between the two levels of navigation interfaces in the task-oriented mode. The high interactive experience yielded a larger dwell time ratio and a shorter first fixation duration than the low one, which further demonstrated that the eye-movement indices differed significantly under the two visual information processing mechanisms. In the task-oriented mode, players need to focus on the stimuli of the search target itself [21]. Hence, the highly usable navigation interface avoided unnecessary waste of visual resources when players performed the task for the first time, resulting in a short first fixation duration. The navigation interface with a high interactive experience gives players a stronger sense of immersion, engagement, and concentration [52], so the time spent on the navigation interface is longer and the dwell time ratio is higher.

4.3. The Prediction of Interactive Experience through Eye-Tracking Indicators Should Consider Attention Patterns

The experimental results indicated clear differences between the two patterns; the same eye-movement indicator could even show diametrically opposite results (i.e., fixation count and saccade count). Specifically, in the free browsing mode, a mobile game navigation interface with a high interactive experience gave players a smooth experience, which stimulated a stronger sense of valence and arousal and resulted in a smaller fixation count. In the task-oriented mode, however, the high interactive experience interface contained a large amount of information and was highly attractive; since players' visual range was limited, the likelihood of developing a preference for stimuli other than the target increased, causing a high fixation count. As for the saccade count, the reasonable layout of interface elements in the high interactive experience interface made it easy for players to receive game information during free browsing, so the saccade count was low. The task-oriented mode required players to perform a targeted search; with the high interactive experience, players could concentrate on the task and invest a higher visual load, which resulted in a higher saccade count. Pinto [53] likewise pointed out that users allocate attention differently under bottom-up and top-down conditions: in the bottom-up condition, players are attracted by interface features that capture their vision, whereas in the top-down condition users are driven by goals and allocate more cognitive resources. In this study, the results of the regression equations established by PLS supported this difference. Therefore, when using eye movements to evaluate or predict the interactive experience of a game interface, it is unreasonable to rely on a fixed set of eye-movement indicators; visual attention should be considered first.
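The idea of regressing interactive-experience scores on eye-movement indicators via PLS can be illustrated with a minimal single-response (PLS1) sketch. This is not the authors' exact model: the indicator columns, synthetic data, and coefficients below are assumptions for illustration only.

```python
import numpy as np

def pls1(X, y, n_components=1):
    """Minimal PLS1 (single response) via NIPALS-style deflation.
    Returns coefficients B and an intercept so predictions are X @ B + intercept.
    X: (n_samples, n_indicators) eye-movement metrics; y: experience scores."""
    Xk = X - X.mean(axis=0)
    yk = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                 # weight vector: covariance direction
        w = w / np.linalg.norm(w)
        t = Xk @ w                    # latent score vector
        tt = t @ t
        p = Xk.T @ t / tt             # X loading
        q = (yk @ t) / tt             # y loading
        Xk = Xk - np.outer(t, p)      # deflate X and y
        yk = yk - q * t
        W.append(w); P.append(p); Q.append(q)
    W = np.array(W).T; P = np.array(P).T; Q = np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)
    intercept = y.mean() - X.mean(axis=0) @ B
    return B, intercept

# Hypothetical data: columns = fixation count, saccade count, saccade amplitude
rng = np.random.default_rng(7)
X = rng.normal(size=(24, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + 60.0  # synthetic scores
B, b0 = pls1(X, y, n_components=3)
pred = X @ B + b0
print(np.max(np.abs(pred - y)))  # near zero: full-rank PLS matches OLS here
```

With fewer components than indicators, PLS compresses the correlated eye-movement metrics into a small number of latent variables, which is why it suits the multicollinear indicators used here.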

5. Conclusions

A navigation interface of mobile games with a high interactive experience has a positive impact on the players' experience process. Vision is the primary channel through which players receive game information, and players' visual behavior can reflect the design of the navigation interface to a certain extent. According to the different processing mechanisms of visual information, two modes, bottom-up (free browsing) and top-down (task-oriented), were simulated in the eye-movement experiment. The results showed that the eye-movement indicators that reflect interactive experience differed between the processing mechanisms, and the same eye-movement index could even point in the opposite direction. In the free browsing mode, fixation count, saccade count, and average saccade amplitude could be used to reflect and predict the interactive experience of the navigation interface. In the task-oriented mode, fixation count, first fixation duration, dwell time ratio, and saccade count could be used to reflect and predict the interactive experience of the mobile navigation interface. Therefore, when using eye-movement indicators to evaluate or improve the design of a game interface, one should not be limited to a few fixed indicators but should first consider the user's attention drive.
Exploring players' inner visual perception of the navigation interface in different situations helps designers capture players' real needs and further improve the interactive experience of the game product. However, this study still has some limitations. The participants were all college students, and other groups were not considered; the representativeness of the sample therefore needs to be considered when generalizing the conclusions. In addition, the study did not consider the impact of gender differences on the interactive experience of mobile game navigation interfaces, nor the differences in eye movements they may cause; this should be tested in future eye-movement experiments. It is also necessary to explore ways to identify the key design features of the mobile game navigation interface that cause players' cognitive differences. Further studies should consider more physiological measures (heart rate, electroencephalogram, electromyogram, etc.) and analyze the relationship between physiological indicators and key design features, so as to guide the design of mobile games in more detail.

Author Contributions

Conceptualization, F.G. and J.-Y.J.; methodology, J.-Y.J. and J.-H.C.; statistical analysis, X.-H.T. and J.-H.C.; investigation, J.-Y.J. and X.-H.T.; data collection, J.-Y.J. and W.L.; writing—original draft, J.-Y.J., X.-H.T. and J.-H.C.; writing—review and editing, J.-Y.J., X.-H.T. and W.L.

Funding

This research was supported by the National Natural Science Foundation of China (grant no. 71471033, 71771045) and the “Double First-Class” Disciplines Construction Project of Northeastern University (grant no. 02050021940101).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rutz, O.; Aravindakshan, A.; Rubel, O. Measuring and forecasting mobile game app engagement. Int. J. Res. Mark. 2019, 36, 185–199. [Google Scholar] [CrossRef]
  2. Hsiao, K.L.; Chen, C.C. What drives in-app purchase intention for mobile games? An examination of perceived values and loyalty. Electron. Commer. Res. Appl. 2016, 16, 18–29. [Google Scholar] [CrossRef]
  3. Ncube, C.; Shaalan, K.; Alomari, K.M. Predicting Success of a Mobile Game: A Proposed Data Analytics-Based Prediction Model; Springer: Cham, Switzerland, 2018. [Google Scholar]
  4. Merikivi, J.; Tuunainen, V.; Nguyen, D. What makes continued mobile gaming enjoyable? Comput. Hum. Behav. 2017, 68, 411–421. [Google Scholar] [CrossRef]
  5. Yoon, H.; Park, S.; Lee, K.; Park, J.W.; Dey, A.K.; Kim, S.J. A case study on iteratively assessing and enhancing wearable user interface prototypes. Symmetry 2017, 9, 114. [Google Scholar] [CrossRef]
  6. Feng, Y.; Zhang, W.; Luan, P.; Liu, M. Design of Game Style Navigation APP Interface Based on User Experience. In Proceedings of the 3rd International Conference on Culture, Education and Economic Development of Modern Society (ICCESE 2019), Moscow, Russia, 1–3 March 2019; Atlantis Press: Paris France, 2019. [Google Scholar]
  7. Nagai, Y.; Georgiev, G.V. The role of impressions on users’ tactile interaction with product materials: An analysis of associative concept networks. Mater. Des. 2011, 32, 291–302. [Google Scholar] [CrossRef]
  8. Nacke, L.E. Games User Research and Physiological Game Evaluation. Game User Experience Evaluation; Springer: Cham, Switzerland, 2015; pp. 63–86. [Google Scholar]
  9. Ke, F.; Xie, K.; Xie, Y. Game-based learning engagement: A theory-and data-driven exploration. Br. J. Educ. Technol. 2016, 47, 1183–1201. [Google Scholar] [CrossRef]
  10. Roberts, V.L.; Fels, D.I. Methods for inclusion: Employing think aloud protocols in software usability studies with individuals who are deaf. Int. J. Hum.-Comput. Stud. 2006, 64, 489–501. [Google Scholar] [CrossRef]
  11. Matsuno, S.; Terasaki, T.; Aizawa, S.; Mizuno, T.; Mito, K.; Itakura, N. Physiological and Psychological Evaluation by Skin Potential Activity Measurement Using Steering Wheel While Driving. In Proceedings of the International Conference on Human-Computer Interaction, Toronto, ON, Canada, 17–22 July 2016; Springer: Cham, Switzerland, 2016; pp. 177–181. [Google Scholar]
  12. Isbister, K.; Schaffer, N. Game Usability: Advice from the Experts for Advancing the Player Experience; CRC Press: Boca Raton, FL, USA, 2008. [Google Scholar]
  13. Sim, H.; Lee, W.H.; Kim, J.Y. A Study on Emotion Classification utilizing Bio-Signal (PPG, GSR, RESP). Adv. Sci. Technol. Lett. 2015, 87, 73–77. [Google Scholar]
  14. Wang, Y.J.; Minor, M.S. Validity, reliability, and applicability of psychophysiological techniques in marketing research. Psychol. Mark. 2008, 25, 197–232. [Google Scholar] [CrossRef]
  15. Harwood, T.; Garry, T. An investigation into gamification as a customer engagement experience environment. J. Serv. Mark. 2015, 29, 533–546. [Google Scholar] [CrossRef]
  16. Schifferstein, H.N.; Desmet, P.M. The effects of sensory impairments on product experience and personal well-being. Ergonomics 2007, 50, 2026–2048. [Google Scholar] [CrossRef] [PubMed]
  17. Tsai, M.; Huang, L.; Hou, H.; Hsu, C.; Chiou, G. Visual behavior, flow and achievement in game-based learning. Comput. Educ. 2016, 98, 115–129. [Google Scholar] [CrossRef]
  18. Park, K.; Lee, D.-J.; Lee, J.; Ju, J.; Ahn, J.-H. Do Mobile Devices Change Shopping Behavior? An Eye-tracking Approach. In Proceedings of the Americas Conference on Information Systems, Cancun, Mexico, 15–17 August 2019. [Google Scholar]
  19. Vidal, M.; Bulling, A.; Gellersen, H. Pursuits: Spontaneous eye-based interaction for dynamic interfaces. Getmobile Mob. Comput. Commun. 2015, 18, 8–10. [Google Scholar] [CrossRef]
  20. Cortinez, M.; Cabeza, R.; Chocarro, R.; Villanueva, A. Attention to Online Channels Across the Path to Purchase: An Eye-Tracking Study. Electron. Commer. Res. Appl. 2019, 29, 100864. [Google Scholar] [CrossRef]
  21. Chen, Y.; Tsai, M.J. Eye-hand coordination strategies during active video game playing: An eye-tracking study. Comput. Hum. Behav. 2015, 51, 8–14. [Google Scholar] [CrossRef]
  22. Almeida, S.; Mealha, Ó.; Veloso, A. Video game scenery analysis with eye tracking. Entertain. Comput. 2016, 14, 1–13. [Google Scholar] [CrossRef]
  23. Abbaszadegan, M.; Yaghoubi, S.; MacKenzie, I.S. TrackMaze: A Comparison of Head-Tracking, Eye-Tracking, and Tilt as Input Methods for Mobile Games. International Conference on Human-Computer Interaction; Springer: Cham, Switzerland, 2018; pp. 393–405. [Google Scholar]
  24. van der Laan, L.N.; Hooge, I.T.; De Ridder, D.T.; Viergever, M.A.; Smeets, P.A. Do you like what you see? The role of first fixation and total fixation duration in consumer choice. Food Qual. Prefer. 2015, 39, 46–55. [Google Scholar] [CrossRef]
  25. Anderson, P.; He, X.; Buehler, C.; Teney, D.; Johnson, M.; Gould, S.; Zhang, L. Bottom-up and top-down attention for image captioning and visual question answering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 6077–6086. [Google Scholar]
  26. Corbetta, M.; Shulman, G.L. Control of goal-directed and stimulus-driven attention in the brain. Nat. Rev. Neurosci. 2002, 3, 201–215. [Google Scholar] [CrossRef]
  27. Orquin, J.L.; Loose, S.M. Attention and choice: A review on eye movements in decision making. Acta Psychol. 2013, 144, 190–206. [Google Scholar] [CrossRef]
  28. West, G.L.; Al-Aidroos, N.; Pratt, J. Action video game experience affects oculomotor performance. Acta Psychol. 2013, 142, 38–42. [Google Scholar] [CrossRef]
  29. Privitera, C.M.; Renninger, L.W.; Carney, T.; Klein, S.; Aguilar, M. Pupil dilation during visual target detection. J. Vis. 2010, 10, 3–14. [Google Scholar] [CrossRef] [PubMed]
  30. Rosbergen, E.; Pieters, R.; Wedel, M. Visual attention to advertising: A segment-level analysis. J. Consum. Res. 1997, 24, 305–314. [Google Scholar] [CrossRef]
  31. Duchowski, A.T. Eye Tracking Methodology: Theory and Practice; Springer: London, UK, 2007. [Google Scholar]
  32. Inal, T.C.; Serteser, M.; Coşkun, A.; Ozpinar, A.; Unsal, I. Indirect reference intervals estimated from hospitalized population for thyrotropin and free thyroxine. Croat. Med. J. 2010, 51, 124–130. [Google Scholar] [CrossRef] [PubMed]
  33. Guo, F.; Jiang, J.; Lv, W. Development of a scale for user experience in mobile games and construct validation. Ergonomics 2017, 23, 24–32. [Google Scholar]
  34. Kunanusont, K.; Lucas, S.M.; Pérez-Liébana, D. General video game ai: Learning from screen capture. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), Donostia-San Sebastián, Spain, 5–8 June 2017; pp. 2078–2085. [Google Scholar]
  35. Abdi, H. Partial least square regression (PLS regression). Encycl. Res. Methods Soc. Sci. 2003, 6, 792–795. [Google Scholar]
  36. Alshehri, M.; Alghowinem, S. An exploratory study of detecting emotion states using eye-tracking technology. In Proceedings of the IEEE Science and Information Conference, Karlsruhe, Germany, 6–10 May 2013; pp. 428–433. [Google Scholar]
  37. Luo, H.; Koszalka, T.; Zuo, M. Investigating the Effects of Visual Cues in Multimedia Instruction Using Eye Tracking. International Conference on Blended Learning; Springer: Cham, Switzerland, 2016; pp. 63–72. [Google Scholar]
  38. Uzzaman, S.; Joordens, S. The eyes know what you are thinking: Eye movements as an objective measure of mind wandering. Conscious. Cogn. 2011, 20, 1882–1886. [Google Scholar] [CrossRef]
  39. Goldberg, J.H.; Kotval, X.P. Computer interface evaluation using eye movements: Methods and constructs. Int. J. Ind. Ergon. 1999, 24, 631–645. [Google Scholar] [CrossRef]
  40. Guo, F.; Cao, Y.; Ding, Y.; Liu, W.; Zhang, X. A multimodal measurement method of users’ emotional experiences shopping online. Hum. Factors Ergon. Manuf. Serv. Ind. 2015, 25, 585–598. [Google Scholar] [CrossRef]
  41. Krejtz, K.; Biele, C.; Chrzastowski, D.; Kopacz, A.; Niedzielska, A.; Toczyski, P.; Duchowski, A. Gaze-controlled gaming: Immersive and difficult but not cognitively overloading. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, Seattle, WA, USA, 13–17 September 2014; pp. 1123–1129. [Google Scholar]
  42. Cowen, L.; Ball, L.J.; Delin, J. An Eye Movement Analysis of Web Page Usability. People and Computers XVI-Memorable Yet Invisible; Springer: London, UK, 2002; pp. 317–335. [Google Scholar]
  43. Zagermann, J.; Pfeil, U.; Reiterer, H. Measuring cognitive load using eye tracking technology in visual computing. In Proceedings of the Sixth Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization, Baltimore, MD, USA, 24 October 2016; pp. 78–85. [Google Scholar]
  44. Wang, Q.; Yang, S.; Liu, M.; Cao, Z.; Ma, Q. An eye-tracking study of website complexity from cognitive load perspective. Decis. Support Syst. 2014, 62, 1–10. [Google Scholar] [CrossRef]
  45. Meeter, M.; Van der Stigchel, S. Visual priming through a boost of the target signal: Evidence from saccadic landing positions. Atten. Percept. Psychophys. 2013, 75, 1336–1341. [Google Scholar] [CrossRef] [PubMed]
  46. Glaholt, M.G.; Reingold, E.M. Direct control of fixation times in scene viewing: Evidence from analysis of the distribution of first fixation duration. Vis. Cogn. 2012, 20, 605–626. [Google Scholar] [CrossRef]
  47. Knickerbocker, H. Release from Proactive Interference: The Impact of Emotional and Semantic Shifts on Recall Performance; State University of New York at Albany: Albany, NY, USA, 2014. [Google Scholar]
  48. Guo, F.; Ding, Y.; Liu, W.; Liu, C.; Zhang, X. Can eye-tracking data be measured to assess product design?: Visual attention mechanism should be considered. Int. J. Ind. Ergon. 2016, 53, 229–235. [Google Scholar] [CrossRef]
  49. Lu, W.; Li, M.; Lu, S.; Song, Y.; Yin, J.; Zhong, N. Visual search strategy and information processing mode: An eye-tracking study on web pages under information overload. In Proceedings of the International Symposium on Information and Automation, Guangzhou, China, 10–11 November 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 153–159. [Google Scholar]
  50. Goyal, S.; Miyapuram, K.P.; Lahiri, U. Predicting consumer’s behavior using eye tracking data. In Proceedings of the IEEE Second International Conference on Soft Computing and Machine Intelligence (ISCMI), Hong Kong, China, 23–24 November 2015; pp. 126–129. [Google Scholar]
  51. Levy, D.L.; Holzman, P.S.; Matthysse, S.; Mendell, N.R. Eye tracking and schizophrenia: A selective review. Schizophr. Bull. 1994, 20, 47–62. [Google Scholar] [CrossRef] [PubMed]
  52. Jennett, C.; Cox, A.L.; Cairns, P.; Dhoparee, S.; Epps, A.; Tijs, T.; Walton, A. Measuring and defining the experience of immersion in games. Int. J. Hum.-Comput. Stud. 2008, 66, 641–661. [Google Scholar] [CrossRef]
  53. Pinto, Y.; van der Leij, A.R.; Sligte, I.G.; Lamme, V.A.; Scholte, H.S. Bottom-up and top-down attention are independent. J. Vis. 2013, 13, 16. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The names of the four selected experimental materials.
Figure 2. The experimental process.
Figure 3. The experimental scenario.
Figure 4. The heat maps of the two experimental modes.
Table 1. Repeated measures ANOVA of interactive experience.

Mode          | High Level M (SD) | Low Level M (SD) | F       | p      | η²
Free browse   | 82.6250 (7.6389)  | 50.6042 (6.2814) | 156.059 | <0.001 | 0.872
Task-oriented | 85.9792 (8.1019)  | 51.3958 (8.0595) | 127.014 | <0.001 | 0.847
Table 2. Repeated measures ANOVA of eye-movement metrics in the free browsing mode.

Metric                    | High M (SD)        | Low M (SD)         | F      | p     | η²
Fixation count            | 49.3958 (10.6791)  | 53.0000 (10.2438)  | 4.638  | 0.042 | 0.168
First fixation duration   | 235.0688 (97.7865) | 221.7333 (84.4442) | 0.612  | 0.442 | 0.026
Dwell time ratio          | 6.0208 (1.1232)    | 5.8479 (0.8698)    | 0.976  | 0.334 | 0.041
Saccade count             | 43.9583 (7.9754)   | 49.6875 (7.1183)   | 11.995 | 0.002 | 0.343
Average saccade amplitude | 3.0188 (0.9161)    | 3.6917 (1.6101)    | 7.576  | 0.011 | 0.248
Table 3. Repeated measures ANOVA of eye-movement metrics in the task-oriented mode.

Metric                    | High M (SD)         | Low M (SD)          | F      | p     | η²
Fixation count            | 66.2292 (19.1748)   | 54.5417 (19.7082)   | 5.370  | 0.030 | 0.189
First fixation duration   | 260.5417 (135.1768) | 343.3083 (155.3116) | 4.383  | 0.048 | 0.160
Dwell time ratio          | 7.8167 (1.7653)     | 6.2438 (1.5612)     | 10.895 | 0.003 | 0.321
Saccade count             | 59.8333 (16.6640)   | 51.6458 (15.8947)   | 5.398  | 0.029 | 0.190
Average saccade amplitude | 3.2292 (0.9098)     | 3.4458 (2.4137)     | 0.383  | 0.542 | 0.016
Table 4. The test results of the actual value and predicted value in the free browsing stage.

Interactive Experience | Actual M (SD)     | Predicted M (SD)  | t     | df | p
Monster elimination    | 69.7500 (8.5000)  | 68.2341 (9.0139)  | 1.080 | 3  | 0.359
Animal elimination     | 58.7500 (10.7199) | 58.6961 (12.1273) | 0.057 | 3  | 0.958
Love everyday          | 66.2500 (6.2383)  | 65.1508 (6.8686)  | 1.508 | 3  | 0.229
Seaside entertainment  | 54.5000 (7.3258)  | 53.6606 (6.4975)  | 1.678 | 3  | 0.192
Table 5. The test results of the actual value and predicted value in the task-oriented stage.

Interactive Experience | Actual M (SD)     | Predicted M (SD)  | t     | df | p
Monster elimination    | 72.2500 (14.9304) | 71.4581 (15.9040) | 1.245 | 3  | 0.301
Animal elimination     | 61.7500 (10.5948) | 61.3745 (11.0715) | 1.000 | 3  | 0.391
Love everyday          | 73.2500 (13.4505) | 72.5496 (14.3451) | 1.104 | 3  | 0.350
Seaside entertainment  | 61.0000 (4.5461)  | 60.7878 (4.9255)  | 0.589 | 3  | 0.597
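The paired t-tests reported in Tables 4 and 5 (comparing actual and PLS-predicted interactive-experience scores, df = 3) can be sketched as follows. The sample values below are hypothetical, not the study's raw data:

```python
import math

def paired_t(actual, predicted):
    """Paired-samples t statistic for actual vs. predicted scores.
    Returns (t, df) with df = n - 1, as in Tables 4 and 5."""
    d = [a - p for a, p in zip(actual, predicted)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance of differences
    t = mean / math.sqrt(var / n)
    return t, n - 1

# Hypothetical scores for four observations (df = 3, as in the tables)
actual = [69.0, 75.0, 62.0, 73.0]
predicted = [68.1, 73.9, 61.5, 72.5]
t, df = paired_t(actual, predicted)
print(t, df)  # t close to 5.0, df = 3
```

A non-significant t (as in all rows of Tables 4 and 5) indicates that the predicted scores do not differ systematically from the actual ones, supporting the PLS models' predictive validity.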

Share and Cite

MDPI and ACS Style

Jiang, J.-Y.; Guo, F.; Chen, J.-H.; Tian, X.-H.; Lv, W. Applying Eye-Tracking Technology to Measure Interactive Experience Toward the Navigation Interface of Mobile Games Considering Different Visual Attention Mechanisms. Appl. Sci. 2019, 9, 3242. https://doi.org/10.3390/app9163242


