Article

Parallax of Head-Up Displays and Visual Safety for Driving

1 Academy for Engineering and Technology, Fudan University, Shanghai 200433, China
2 Department of Illumination Engineering & Light Source, School of Information Science and Engineering, Fudan University, Shanghai 200433, China
3 Institute for Six-Sector Economy, Fudan University, Shanghai 200433, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(24), 13189; https://doi.org/10.3390/app132413189
Submission received: 29 October 2023 / Revised: 1 December 2023 / Accepted: 4 December 2023 / Published: 12 December 2023
(This article belongs to the Special Issue Smart Lighting and Visual Safety)

Featured Application

This research reports findings on an important characteristic of head-up displays (HUDs), whose use in vehicles is becoming increasingly popular. The findings elucidate how HUD parallax affects visual task performance and induces visual fatigue, and thus help to improve driving safety.

Abstract

Head-up displays (HUDs), a novel form of virtual display, are characterized by an optical structure that forms a typical binocular virtual display system. This structure introduces binocular parallax into visual perception; in particular, when information at diverse depths is displayed on a screen, the eyes must switch between different parallax conditions, which can impair visual tasks and induce visual fatigue. Augmented reality HUDs (AR-HUDs) have a wider field of view and are therefore more susceptible to parallax effects. In this study, to determine the acceptable parallax threshold in a two-dimensional virtual display system for HUDs, especially between adjacent positions, and to provide a reference for HUD design, visual comfort and task performance were experimentally evaluated by simulating both overall and stepped parallax changes on the screen. Specifically, the effects of overall and stepped horizontal and vertical parallaxes on visual fatigue and task performance were evaluated under different conditions. The results showed that overall horizontal and vertical parallaxes had no significant effect on visual fatigue or task performance. However, stepped horizontal parallax had a significant effect on task performance (p < 0.05), and a parallax value of 3.31 mrad between adjacent positions is proposed as an acceptable threshold for stepped horizontal parallax. Stepped vertical parallax produced significant differences (p < 0.05) in concentration and fluctuation ratio, and an acceptable stepped parallax threshold of 2.24 mrad was obtained. Further, the experiments revealed that stepped vertical parallax was more likely to lead to reading misalignments, halos, and distortions. In addition, an exponential relationship between stepped parallax and the error rate of visual performance was observed, and a model was built to predict the degree of influence of horizontal stepped parallax on visual performance in virtual displays. This study provides a reference for parallax control between neighboring display icons in AR-HUDs.

1. Introduction

Head-up displays (HUDs), widely employed in the automotive industry, are a virtual display technique that projects essential information onto the vehicle's windshield. This innovation allows drivers to access a diverse array of information without redirecting their gaze downward, thereby reducing interruptions in their monitoring of the vehicle's external environment [1]. HUDs can also be integrated with autonomous driving systems to offer real-time information, providing a more convenient and safer way to interact with information while driving. Figure 1 shows a schematic diagram of the HUD optical imaging principle and light path.
HUD technology has also evolved from the traditional combiner head-up display (C-HUD) and windshield head-up display (W-HUD) to the augmented reality head-up display (AR-HUD). C-HUDs project information onto a semitransparent surface for the driver to see; W-HUDs and AR-HUDs both project information onto the windshield, where the driver sees a virtual image of the display. The main differences between a W-HUD and an AR-HUD lie in the virtual image distance (VID) and field of view (FOV), with the VID of an AR-HUD being greater than 6 m and the FOV greater than 10° [2,3,4], together with the ability to display driving information interactively. A W-HUD's displayed information is mainly concentrated in the center of the field of view, whereas an AR-HUD displays information across the whole field of view, including the edge areas. Figure 2 shows the difference between a W-HUD and an AR-HUD in terms of VID and FOV, as well as the difference in display information area.
HUD systems mainly consist of light sources, phase source projections, reflective surfaces, and glass; their configuration and the human eye’s observation of information constitute a typical binocular viewing system [5,6]. Because of the optical characteristics of the reflective mirror surface and the windshield, when light enters the left and right eyes, a spatial perceptual mismatch (horizontal and vertical) is created and, thus, constitutes horizontal and vertical parallaxes [7]. As depicted in Figure 3, the distinction between the viewing angles α and β defines the parallax, typically measured in milliradians (mrad). Figure 4 visually demonstrates the perceptual misalignment of the image between the left and right eyes. Despite the existence of a single image, binocular parallax gives rise to horizontal or vertical misalignments in binocular perception [7,8]. In the context of horizontal parallax, two distinct types exist: convergence and divergence parallaxes, contingent upon the separation distance between binocular viewing points. In the HUD system, horizontal parallax is mainly manifested as convergence parallax.
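For readers unfamiliar with the units, a minimal sketch (not part of the original study) of the small-angle geometry behind this definition is given below in Python; the offset values are hypothetical and serve only to illustrate how parallax in milliradians relates to the viewing angles α and β and to degrees.

```python
import math

def parallax_mrad(offset_left_m, offset_right_m, viewing_distance_m):
    """Binocular parallax as the difference between the left- and right-eye
    viewing angles of the same point, returned in milliradians."""
    alpha = math.atan2(offset_left_m, viewing_distance_m)   # left-eye viewing angle (rad)
    beta = math.atan2(offset_right_m, viewing_distance_m)   # right-eye viewing angle (rad)
    return (alpha - beta) * 1000.0                          # rad -> mrad

# Hypothetical example: the two eyes see the target displaced by 10 mm and 0 mm
# at a 3 m viewing distance (the distance used later in the experiment).
print(round(parallax_mrad(0.010, 0.0, 3.0), 2))   # ~3.33 mrad
print(round(math.radians(1.0) * 1000, 2))         # 1 degree ~= 17.45 mrad
```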
As a binocular system, HUDs inherently introduce binocular parallax within a binocular viewing context, which gives HUDs the capacity to impart a sense of spatial depth [7,9]. Specifically, the design of AR-HUDs requires presenting information at various screen locations, each corresponding to a distinct distance, to facilitate a hierarchical information display and layout. This is achieved by introducing parallax at different display locations [1,5]. Consequently, when drivers engage with HUD information, they transition between distinct perceptions of visual depth; the image parallax and the resulting differences in visual perception between the left and right eyes at different positions lead to visual fatigue and diminished visual performance [8]. This effect is even more pronounced when the perceptual conflicts caused by superimposing virtual images on real objects are also considered [10,11]. Horizontal and vertical parallax differ markedly in their visual-perception effects: changes in horizontal parallax mainly manifest as differences in depth perception, whereas changes in vertical parallax do not alter depth perception but change the difficulty of binocular fusion, thus causing visual fatigue [12,13]. Such effects may potentially impact driving safety.
Parallax is an important factor in the design of AR-HUD products. AR-HUD designs require more information to be displayed on the screen, occupying a larger display area. This results in a different parallax between the edges and the center of the AR-HUD screen. Some designs also exploit this difference in parallax between positions to achieve depth perception at different distances [1,5]. Parallax between neighboring positions of an image in the horizontal direction causes the eyes to undergo frequent convergence adjustments, which affects visual comfort and visual performance [14]. In the display area of an AR-HUD, there is a noticeable trend of parallax change from the center to the edges of the frame. However, HUD information is displayed as icons or small areas distributed within the HUD's display area. As a result, there is a stepped parallax perception between adjacent information display areas. In some studies on parallax in three-dimensional (3D) and two-dimensional (2D) virtual displays, only a slight effect on visual fatigue was observed in the parallax range of 0–1° (0 to 17.45 mrad) [8,15]. However, whether parallax differences between neighboring positions affect visual fatigue has not yet been studied. It is therefore important to determine how the stepped parallax of an AR-HUD causes visual fatigue and degrades drivers' visual performance.
In this study, we investigated the effect of HUD parallax on visual comfort and task performance through a human factor engineering experiment. Specifically, we designed experiments involving different conditions of horizontal parallax, vertical parallax, and stepped parallax in order to provide parallax references for AR-HUD design. The following aspects were investigated:
(1)
The effect of overall parallax variation on visual fatigue and task performance in a 2D virtual display of a HUD;
(2)
The effect of parallax variance in neighboring positions on visual fatigue and task performance in a 2D display of a HUD and the range of accepted parallax;
(3)
The correlation between parallax and visual performance and the model of this correlation.

2. Materials and Methods

2.1. Experimental Setup

The binocular parallax in a HUD system arises because, when an image is reflected by the mirror and the glass and enters the human eyes, the images received by the left and right eyes are misaligned. Although the original image is still 2D, the horizontal and vertical offsets correspond to horizontal and vertical parallax, respectively. Different parallax values can be realized by controlling the offsets in the horizontal and vertical directions, which closely resembles presenting separate scenes to the two eyes. We therefore used a 3D projector to simulate these separated binocular images, producing parallax similar to that caused by a HUD display. The image from the projector entered the eyes through shutter-type 3D glasses, and by controlling the shutters we were able to transmit the image to the left and right eyes independently. The perceived positional deviation of the images entering the left and right eyes was controlled by adjusting the positional deviation of the image input to the projector. The input image is 2D, and its parallax effect is similar to that of a HUD. Figure 5 shows the parallax projection method used in this experiment and the deviation in image position between the left and right eyes: (a) shows how viewing conditions with a parallax effect were constructed, with the left and right images projected by a projector and received through the 3D glasses; (b) shows the effect when the left- and right-eye images (for example, of a letter "d" to be observed) overlap, which creates visual depth perception [16].
Figure 6 shows photographs of the experimental site: (a) the input screen of the independent channels for the left and right eyes; (b) the overlapping binocular images after projection; and (c) an observer wearing the shutter-type 3D glasses, which present a virtual image with parallax to the user.

2.2. Evaluation Method

The experiment was conducted in a windowless room with the lights kept off. The luminance of the display surface was 220 cd/m2, the actual luminance when viewed through the glasses was 82.5 cd/m2, and the viewing distance was 3.0 m. This setup subtended a viewing angle of about 10°, similar to a typical AR-HUD projecting a virtual image 1.75 m wide at a distance of 10 m.
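As a quick sanity check of the stated geometry (a sketch under the small-angle assumption, not taken from the paper), the 10° figure can be reproduced from the quoted image width and distance; the 0.52 m screen width at 3.0 m is an assumed value chosen to give a comparable angle.

```python
import math

def subtended_angle_deg(width_m, distance_m):
    """Horizontal angle subtended by a centered image of the given width."""
    return math.degrees(2 * math.atan((width_m / 2) / distance_m))

print(round(subtended_angle_deg(1.75, 10.0), 1))  # ~10.0 deg: typical AR-HUD virtual image
print(round(subtended_angle_deg(0.52, 3.0), 1))   # ~9.9 deg: assumed screen width at 3 m
```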
The experimental parallax conditions were divided into two broad categories: horizontal parallax and vertical parallax. Within each category, there were two types of display: (1) uniform overall parallax of the screen, with seven parallax conditions, and (2) nonuniform parallax of the screen, stepped horizontally or vertically with the center as the reference and the same parallax difference between any two adjacent display positions, again with seven step levels. The details of the experiment are as follows:
(1)
Overall horizontal parallax change: the overall parallax of the image remained the same, and seven different parallax conditions were set.
(2)
Stepped horizontal parallax variation: the central and edge parallaxes of the image showed a trend of stepped variation, with the central parallax being smaller than the edge parallax; there were different levels of variation, with a total of seven levels being set.
(3)
Overall vertical parallax variation: the overall parallax of the image remained the same, and seven different parallax conditions were set.
(4)
Stepped vertical parallax variation: the central and edge parallaxes of the image showed a trend of step variation, with the central parallax being smaller than the edge parallax; the variation step level was different, with a total of seven levels being set.
The different parallax states were achieved by adjusting the deviation between the positions of the input images for the left and right eyes, which in turn was realized by shifting the input left- and right-eye images by a fixed number of pixels. The actual deviation distance of the projected image was obtained by measuring it at the projection position and then combining the result with the observation distance (the virtual image distance, VID) to calculate the parallax [7]. Table 1 lists the parallax values of the independent variables set and calculated according to these settings; it contains two types of parallax, horizontal and vertical, with seven parallax values for each type. Table 2 lists the stepped parallax conditions, representing the parallax difference between neighboring positions in the image. Figure 7 illustrates the overall and stepped parallax variation methods: (a) and (c) show different overall parallaxes, and (b) and (d) show different stepped parallaxes; these two types of parallax variation differ markedly in appearance.
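The offset-to-parallax conversion described above can be sketched as follows under the small-angle assumption; the 6 mm deviation in the example is hypothetical, while the seven levels per condition are the measured values from Tables 1 and 2.

```python
# Measured parallax levels (mrad) for the four condition types (Tables 1 and 2).
PARALLAX_LEVELS_MRAD = {
    "overall_horizontal": [0, 1.65, 3.31, 6.24, 10.61, 13.67, 20.97],
    "overall_vertical":   [0, 2.24, 4.05, 6.61, 8.96, 12.27, 15.15],
    "stepped_horizontal": [0, 0.85, 1.65, 2.45, 3.31, 4.05, 4.64],
    "stepped_vertical":   [0, 0.53, 1.71, 2.24, 2.88, 3.41, 4.05],
}

def offset_to_parallax_mrad(deviation_m, distance_m):
    """Small-angle approximation: angular parallax = lateral deviation / distance."""
    return deviation_m / distance_m * 1000.0

# Hypothetical example: a 6 mm measured deviation at the 3.0 m observation distance.
print(round(offset_to_parallax_mrad(0.006, 3.0), 2))  # 2.0 mrad
```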

2.3. Participants

The sample size was calculated using G*Power software (version 3.1.9.7) [17]. The F-test was used to estimate the required sample size (α = 0.05, power = 0.8 [18,19]), which indicated a minimum of 14 subjects. Accordingly, seventeen subjects (eight men; nine women) were recruited for the experiment. The mean age of the participants was 31 years (range 20–38 years), and their corrected visual acuity was ≥1.0, checked prior to the experiment to ensure proper viewing conditions. The achieved power for this sample size was >0.8. The experimental design was approved by the ethics committee of Fudan University (approval number FE23073R).
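For readers without G*Power, a comparable calculation can be sketched with statsmodels; because the paper does not report the assumed effect size or design correction, the effect size below is an assumption and the resulting total is illustrative only, not a reconstruction of the reported minimum of 14.

```python
# Illustrative sample-size calculation for a one-way ANOVA across 7 parallax levels.
# The effect size is assumed (Cohen's f = 0.5); the paper's G*Power inputs are not reported.
from statsmodels.stats.power import FTestAnovaPower

n_total = FTestAnovaPower().solve_power(effect_size=0.5, alpha=0.05,
                                        power=0.8, k_groups=7)
print(round(n_total))  # total sample size implied by these assumptions
```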

2.4. Experiment Procedure

The experiment was conducted in four phases, with each phase completing the testing of all conditions and all subjects for one parallax type before the next phase was started. The seven parallax conditions in each phase were presented in randomized order. Initially, the subjects underwent 20 min of dark adaptation [20,21,22], after which the visual task was initiated, and a subjective questionnaire was administered once the visual task was completed. A 10 min break was given before the subjects began the next parallax condition. Figure 8 shows a schematic of the experimental flow. The D2 test was adopted as the visual task, allowing task performance to be evaluated simultaneously [23,24]. Participants performed the visual task with a mouse on the projected image under the parallax conditions. Immediately after the visual task, the subjects filled in two subjective fatigue questionnaires: the Visual Analogue Scale to Evaluate Fatigue Severity (VAS-F) [25,26] and Heuer's Subjective Visual Fatigue Scale (VFS) [24,27,28].
Before the experiment, the subjects were trained to become familiar with the experimental procedure and practiced the D2 task to eliminate practice effects. Each subject completed the D2 task under all seven conditions, with the parallax conditions presented in randomized order. A 10 min break in darkness, with no viewing of electronic devices, was taken after each test.
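A per-subject randomization of the seven parallax conditions, as described above, could be generated along the following lines; the condition labels are the stepped horizontal values from Table 2, and the fixed seed is an illustrative choice for reproducibility.

```python
import random

STEPPED_HORIZONTAL_MRAD = [0, 0.85, 1.65, 2.45, 3.31, 4.05, 4.64]  # Table 2
N_SUBJECTS = 17

def randomized_orders(conditions, n_subjects, seed=1):
    """Independent random presentation order of all conditions for each subject."""
    rng = random.Random(seed)  # fixed seed only so the schedule is reproducible
    return [rng.sample(conditions, k=len(conditions)) for _ in range(n_subjects)]

for subject_id, order in enumerate(randomized_orders(STEPPED_HORIZONTAL_MRAD, N_SUBJECTS), 1):
    print(subject_id, order)
```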

2.5. Data Analysis

The data were analyzed using SPSS software (version 20.0; IBM Corp., Armonk, NY, USA) based on the D2 task total score TN-E (with E = E1 + E2), the omission error rate (E1), the violation error rate (E2), concentration (CP), and fluctuation ratio (FR), together with the summed ratings of the VAS-F and VFS. Based on the results of a data normality test, we used either one-way analysis of variance (ANOVA) or nonparametric tests for K independent samples.
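The branch between parametric and nonparametric tests described above can be sketched in Python with SciPy (the study itself used SPSS); the data frame, column names, and threshold are placeholders rather than the study's actual data.

```python
import pandas as pd
from scipy import stats

def compare_conditions(df: pd.DataFrame, score_col: str = "TN_E",
                       condition_col: str = "parallax_mrad", alpha: float = 0.05):
    """Shapiro-Wilk normality check per condition; one-way ANOVA if every group
    looks normal, otherwise a Kruskal-Wallis test across the parallax conditions."""
    groups = [g[score_col].to_numpy() for _, g in df.groupby(condition_col)]
    if all(stats.shapiro(g).pvalue > alpha for g in groups):
        name, res = "one-way ANOVA", stats.f_oneway(*groups)
    else:
        name, res = "Kruskal-Wallis", stats.kruskal(*groups)
    return name, res.statistic, res.pvalue
```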

3. Results

3.1. Horizontal Parallax

We evaluated two forms of change in horizontal parallax, uniform overall parallax change and stepped change in parallax between adjacent positions; the two conditions present different visual effects. Overall parallax changes alter the depth perception of the whole image. Stepped parallax changes produce a depth perception that changes stepwise in the horizontal direction, with the center of the image perceived as farthest and the two sides becoming progressively closer.
In the subjective assessment of visual fatigue, we used the sum of the individual scores of the VAS-F and VFS as the total fatigue score, with higher scores indicating greater fatigue. Figure 9 and Figure 10 show the results of the visual fatigue analysis of the subjective questionnaires. Comparative analyses of the total scores showed no significant difference among the uniform parallax conditions or among the stepped parallax conditions (p > 0.05), but the scores showed a decreasing and then increasing trend as the parallax value increased. Figure 9a shows the trend in the VAS-F scores and Figure 9b the trend in the VFS scores; both were lowest when the overall parallax was 3.31 mrad. Figure 10a shows the trend in the VAS-F scores and Figure 10b the trend in the VFS scores; both were lowest when the stepped parallax was 0.85 mrad. These results suggest that, based on the VAS-F and VFS scores, fatigue was lowest at a small, nonzero parallax. A small parallax was rated better than the no-parallax condition, indicating that horizontal parallax under certain conditions can improve perceived visual comfort, although the difference was not significant.
We analyzed the D2 task performance under the horizontal overall uniform parallax and stepped parallax conditions. The results of the D2 task for the overall parallax variation are shown in Figure 11: (a) presents the trend in TN-E, and (b) presents the trend in E. The results under the horizontal overall parallax conditions showed no significant differences, and the scores were largely consistent, with a more pronounced downward trend appearing only as the parallax approached 20.97 mrad.
Analysis of the stepped horizontal parallax D2 task scores using the Kruskal–Wallis test showed significant differences among the parallax conditions for TN, E1, E2, TN-E, CP, and FR; the data are presented in Table 3.
Figure 12 and Figure 13 present a further analysis of the variability across the stepped parallax conditions. The analysis indicated significant differences in the TN-E, CP, E1, and E2 metrics between the 4.05 mrad and 4.64 mrad stepped parallax conditions and the conditions with smaller parallax values, together with a tendency toward decreased task performance. This indicates that these two step conditions had a significant effect on the visual task. The condition with a stepped parallax of 3.31 mrad did not differ significantly from the conditions with smaller parallax values, and its results remained consistent with them.
Because the conventional direction of reading in visual tasks is horizontal, from left to right, parallax differences between adjacent positions of an image in the horizontal direction cause the eyes to undergo frequent convergence adjustments, thus affecting visual comfort and visual performance [14]. The total performance and error rate of the D2 task showed the most obvious variability and trends. The trends in the medians of the total performance and the error rate were therefore fitted as single-factor trends, yielding exponential fit functions.
Figure 14 shows the fitted relationship between the total D2 score TN-E and the horizontal stepped parallax value. The fitted correlation coefficient was R2 = 0.99, which is greater than 0.8, so the fit was accepted. The fitted curve equation is
$y = y_1 + A_1 e^{x/t_1}$,
where $y_1 = 628.38625 \pm 2.8495$, $A_1 = -1.94513 \pm 0.75004$, and $t_1 = 1.13969 \pm 0.10568$.
Figure 15 shows the fitted relationship between the D2 omission error rate E1 and the horizontal stepped parallax value. The fitted correlation coefficient was R2 = 0.98, which is greater than 0.8, so the fit was accepted. The fitted curve formula is
$y = y_2 + B_2 e^{x/t_2}$,
where $y_2 = 1.70887 \pm 0.29988$, $B_2 = 0.21981 \pm 0.10404$, and $t_2 = 1.28658 \pm 0.16341$.
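The two fitted curves above can be evaluated directly from the reported parameters, or refitted from the Table 3 medians with scipy.optimize.curve_fit; the sketch below does both for the TN-E model, and a refit from the medians may not exactly reproduce the reported coefficients.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_model(x, y0, a, t):
    """Exponential form of the fitted curves: y = y0 + a * exp(x / t)."""
    return y0 + a * np.exp(x / t)

x = np.array([0, 0.85, 1.65, 2.45, 3.31, 4.05, 4.64])        # stepped parallax (mrad)
tn_e_median = np.array([627, 635, 632, 620, 612, 565, 514])  # TN-E medians, Table 3

# Predictions of TN-E from the reported parameters.
print(np.round(exp_model(x, 628.38625, -1.94513, 1.13969), 1))

# Refit from the medians; initial guesses are taken near the reported values.
popt, _ = curve_fit(exp_model, x, tn_e_median, p0=(630.0, -2.0, 1.2))
print(np.round(popt, 3))
```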

3.2. Vertical Parallax

We analyzed the results for the overall uniform and stepped changes in vertical parallax; the subjective fatigue questionnaire results are shown in Figure 16 and Figure 17. Again, we found no significant differences in the subjective fatigue scores. For overall parallax, the VAS-F fatigue score was lowest when the overall parallax was 2.24 mrad, as shown in Figure 16a, whereas the VFS score increased as the parallax increased, as shown in Figure 16b. This suggests that an increase in parallax affects the overall perception of visual fatigue. For increasing stepped parallax, as shown in Figure 17a,b, the visual fatigue scores were relatively flat overall, with no clear trend.
For the D2 task performance, no significant difference was observed under the overall vertical parallax conditions, and there was no obvious trend in the evaluation indexes. In the analysis of the overall change in vertical parallax, there was thus no significant difference in either subjective fatigue perception or D2 task performance, and the effects of the overall change in vertical parallax were consistent in terms of visual comfort and task performance.
For stepped vertical parallax, there was no significant difference in subjective fatigue scores; for D2 task performance, there was also no significant variability in total scores or errors, but there was significant variability in CP and FR, as shown in Table 4, suggesting that the effect of stepped vertical parallax varies greatly across individual subjects. This result might be related to the fact that the visual perception produced by vertical parallax differs from that produced by variability in horizontal parallax: vertical parallax is more likely to cause a sense of aberration in the picture, and once a subject adapts to this sensation it has less impact, but individuals differ markedly in their ability to adapt. Another possible reason is the type of task: under vertical parallax, the D2 task still requires searching horizontally from left to right, which might be less affected by vertical parallax.
We made additional pairwise comparisons of concentration and fluctuation ratio; the differences are shown in Figure 18. Both metrics showed significant differences between the 4.05 mrad stepped parallax condition and the 0, 0.53, and 1.71 mrad conditions; however, no significant differences were found between the 4.05 mrad condition and conditions of 2.24 mrad or greater. The 4.05 mrad stepped parallax condition showed the worst results for both concentration and fluctuation ratio. As a reference for the maximum acceptable stepped vertical parallax, 2.24 mrad was taken as the critical value for significant differences.
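The paper does not name the post hoc procedure behind these pairwise comparisons; one common choice after a significant Kruskal–Wallis result is pairwise Mann–Whitney U tests with a Bonferroni correction, sketched below with the data passed in as a mapping from stepped-parallax level to scores.

```python
from itertools import combinations
from scipy import stats

def pairwise_mannwhitney(scores_by_condition, alpha=0.05):
    """Pairwise Mann-Whitney U tests across conditions with Bonferroni correction.
    `scores_by_condition` maps a stepped-parallax level (mrad) to its list of scores."""
    pairs = list(combinations(sorted(scores_by_condition), 2))
    corrected_alpha = alpha / len(pairs)
    results = {}
    for a, b in pairs:
        _, p = stats.mannwhitneyu(scores_by_condition[a], scores_by_condition[b],
                                  alternative="two-sided")
        results[(a, b)] = (p, p < corrected_alpha)
    return results
```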

4. Discussion

Among the four parallax variations used in this experiment, overall horizontal parallax and overall vertical parallax showed no significant differences between parallax conditions in subjective fatigue perception or D2 task performance. However, visual comfort was perceived as best when there was a small parallax, and the trend of this result is consistent with the findings of related studies on the design and analysis of HUD parallax [8,9], with the effect of horizontal parallax being the most obvious. In this study, we found the optimal horizontal parallax for visual perception to be 3.31 mrad, which is smaller than the optimal parallax of a conventional 3D display [8,9]. This result might be related to the fact that, under an overall change, the different positions of the 2D display screen are all at the same parallax level, so once the eyes adapt, their focus remains on a single plane and does not require frequent adjustment, resulting in no significant differences in visual fatigue perception or task performance. Meanwhile, as a laboratory simulation study, our results did not consider real driving scenarios but only simulated the relevant variables for observation, which might introduce some bias. However, regarding the effect of parallax alone, our results still indicate that a smaller overall horizontal parallax can improve perceived visual comfort [8].
In the stepped horizontal parallax test, the horizontal parallax showed a step change in the horizontal direction, and there were significant differences in visual task performance for different parallax values between neighboring positions: task performance worsened as the stepped parallax between neighboring positions increased. However, significant differences in task scores occurred only when the stepped parallax value was greater than 3.31 mrad, so 3.31 mrad can be taken as the maximum acceptable stepped parallax. The reason that stepped parallax can have a significant effect on visual tasks is that, under the stepped horizontal parallax condition, the eyes' focus of attention must switch frequently between different horizontal parallax planes. This increases the frequency of vergence adjustment and the eye focusing time, which increases eye fatigue and the demand on motor fusion [13,29] and makes it more difficult for the eyes to process information. This shows that, on an AR-HUD screen, the horizontal parallax at different display positions, and especially between adjacent positions, should not differ greatly; otherwise, the driver's efficiency in reading the displayed information will be affected, and visual fatigue will occur more easily.
There was no significant difference in visual fatigue perception or in D2 task total score and error rate under stepped vertical parallax. However, for concentration (CP) and fluctuation ratio (FR), significant differences were observed once the stepped parallax value increased beyond 2.24 mrad. In contrast to horizontal parallax, which causes differences in depth and position perception, vertical parallax causes the target to be perceived with dizziness, aberration, and ghosting [13]. Individuals differ markedly in their ability to tolerate such sensations, resulting in more obvious differences in the concentration and fluctuation ratios of the task scores. The vertical parallax between neighboring positions should therefore be kept within acceptable limits.
The relationship between stepped parallax and the D2 total performance or error rate can be obtained by function fitting and further quantitative characterization. This provides a computational reference for assessing the visual-perception effects of parallax, enabling proper parallax settings for a HUD to be selected to improve drivers' task performance and, therefore, driving safety. Future research can verify this model, and further mathematical descriptors can be developed to quantify the results of this research.

5. Conclusions

In this study, we experimentally modeled the effects of parallax on visual fatigue and task performance at viewing angles corresponding to an AR-HUD. Our experimental results showed that a uniform parallax across the display screen has little effect on visual fatigue and task performance, and a small horizontal parallax improves perceived visual comfort; in this case, visual fusion and vergence adjustment only need to be completed in the initial state. When there is a parallax difference between neighboring positions on the display screen, i.e., stepped parallax, it has a significant effect on visual task performance: horizontal stepped parallax increases the frequency of vergence adjustment and reduces visual performance, while vertical stepped parallax increases the difficulty of visual fusion and brings about sensations such as vertigo.
In addition, we observed an exponential relationship between horizontal stepped parallax and the error rate of visual performance, and the fitted formula can be used to predict the degree to which horizontal stepped parallax affects visual performance in virtual displays. This study provides a reference for parallax control between neighboring display icons in AR-HUDs.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app132413189/s1.

Author Contributions

Conceptualization, J.S. and Y.L.; methodology, J.S.; software, Z.F. and Z.J.; data curation, W.X.; validation, formal analysis, investigation, and writing, J.S.; review and supervision, Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the China National Key R&D Program (project number 2021YFB3202500).

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the ethics committee of Fudan University (approval number FE23073R).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in Supplementary Materials.

Conflicts of Interest

The corresponding author of this article is the guest editor of the special issue of this journal.

References

  1. Deng, N.; Ye, J.; Chen, N.; Yang, X. Towards Stereoscopic On-vehicle AR-HUD. Vis. Comput. 2021, 37, 2527–2538. [Google Scholar] [CrossRef]
  2. Yamin, P.A.; Park, J.; Kim, H.K. Towards a Human-Machine Interface Guidance for in-Vehicle Augmented Reality Head-up Displays. ICIC Express Lett. 2021, 15, 1313–1318. [Google Scholar]
  3. Blankenbach, K. Requirements and System Aspects of AR-Head-Up Displays. IEEE Consum. Electron. Mag. 2019, 8, 62–67. [Google Scholar] [CrossRef]
  4. Zhou, T.; Qiao, W.; Hua, J.; Chen, L. Status and Prospect of Augmented Reality Head-Up Display. Laser Optoelectron. Prog. 2023, 60. [Google Scholar]
  5. Kim, K.-H.; Park, S.-C. Optical System Design and Evaluation for an Augmented Reality Head-up Display Using Aberration and Parallax Analysis. Curr. Opt. Photonics 2021, 5, 660–671. [Google Scholar]
  6. Kim, K.-H.; Park, S.-C. Optical System Design for a Head-up Display through Analysis of Distortion and Biocular Parallax. Korean J. Opt. Photonics 2020, 31, 88–95. [Google Scholar]
  7. Zhao, Z.; Cheng, D.; Yang, T.; Wang, Q.; Hou, Q.; Gu, L.; Wang, Y. Design and evaluation of a biocular system. Appl. Opt. 2019, 58, 7851–7857. [Google Scholar] [CrossRef]
  8. Liu, Y.; Guo, X.; Fan, Y.; Meng, X.; Wang, J. Subjective assessment on visual fatigue versus stereoscopic disparities. J. Soc. Inf. Disp. 2021, 29, 497–504. [Google Scholar] [CrossRef]
  9. Zhou, M.; Cheng, D.; Wang, Q.; Chen, H.; Wang, Y. Design of an off-axis four-mirror system for automotive head-up display. In Proceedings of the Applied Optics and Photonics China (AOPC) Conference—MEMS, THz MEMS, and Metamaterials and AI in Optics and Photonics, Beijing, China, 30 November–2 December 2020. [Google Scholar]
  10. Kim, H.; Lee, M.; Kim, G.J.; Hwang, J.-I. The Impacts of Visual Effects on User Perception with a Virtual Human in Augmented Reality Conflict Situations. IEEE Access 2021, 9, 35300–35312. [Google Scholar] [CrossRef]
  11. Pladere, T.; Luguzis, A.; Zabels, R.; Smukulis, R.; Barkovska, V.; Krauze, L.; Konosonoka, V.; Svede, A.; Krumina, G. When virtual and real worlds coexist: Visualization and visual system affect spatial performance in augmented reality. J. Vis. 2021, 21, 17. [Google Scholar] [CrossRef] [PubMed]
  12. Silva, A.R.; Farias, M.C.Q. Perceptual quality assessment of 3D videos with stereoscopic degradations. Multimed. Tools Appl. 2019, 79, 1603–1623. [Google Scholar] [CrossRef]
  13. Sugita, N.; Sasaki, K.; Yoshizawa, M.; Ichiji, K.; Abe, M.; Homma, N.; Yambe, T. Effect of viewing a three-dimensional movie with vertical parallax. Displays 2018, 58, 20–26. [Google Scholar] [CrossRef]
  14. Erkelens, I.M.; MacKenzie, K.J. 19-2: Vergence-Accommodation Conflicts in Augmented Reality: Impacts on Perceived Image Quality. SID Symp. Dig. Tech. Pap. 2020, 51, 265–268. [Google Scholar] [CrossRef]
  15. Hirota, M.; Kanda, H.; Endo, T.; Miyoshi, T.; Miyagawa, S.; Hirohara, Y.; Yamaguchi, T.; Saika, M.; Morimoto, T.; Fujikado, T. Comparison of visual fatigue caused by head-mounted display for virtual reality and two-dimensional display using objective and subjective evaluation. Ergonomics 2019, 62, 759–766. [Google Scholar] [CrossRef] [PubMed]
  16. Wang, Y.; Potemin, I.S.; Zhdanov, A.; Bogdanov, N.; Zhdanov, D.; Livshits, I. Analysis of the visual perception conflicts in designing mixed reality systems. In Proceedings of the Optical Design and Testing Viii, Beijing, China, 11–13 October 2018; SPIE/COS Photonics Asia: Beijing, China, 2018; Volume 10815, p. 108150U. [Google Scholar]
  17. Faul, F.; Erdfelder, E.; Lang, A.-G. G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav. Res. Methods 2007, 39, 175–191. [Google Scholar] [CrossRef] [PubMed]
  18. Doshida, Y.; Itabashi, M.; Takei, T.; Takino, Y.; Sato, A.; Yumura, W.; Maruyama, N.; Ishigami, A. Reduced Plasma Ascorbate and Increased Proportion of Dehydroascorbic Acid Levels in Patients Undergoing Hemodialysis. Life 2021, 11, 1023. [Google Scholar] [CrossRef] [PubMed]
  19. Kei, C.P.; Nordin, N.A.M.; Aziz, A.F.A. The effectiveness of home-based therapy on functional outcome, self-efficacy and anxiety among discharged stroke survivors. Medicine 2020, 99, e23296. [Google Scholar] [CrossRef]
  20. Jackson, G.R.; Owsley, C.; McGwin, G. Aging and dark adaptation. Vis. Res. 1999, 39, 3975–3982. [Google Scholar] [CrossRef]
  21. Lamb, T.; Pugh, E. Dark adaptation and the retinoid cycle of vision. Prog. Retin. Eye Res. 2004, 23, 307–380. [Google Scholar] [CrossRef]
  22. Kalloniatis, M.; Luu, C. Light and Dark Adaptation; University of Utah Health Sciences Center: Salt Lake City, UT, USA, 1995. [Google Scholar]
  23. Brickenkamp, R.; Zillmer, E.A. D2 Test of Attention; U.S. Hogrefe & Huber Publishers: Cambridge, MA, USA, 1998; pp. 3–12. [Google Scholar]
  24. Hou, D.; Luo, M.; Hao, L.; Lin, Y. Applicability Analysis of Human Health Evaluation Methods in Light Environment Research. Zhaoming Gongcheng Xuebao 2021, 32, 76–93. [Google Scholar]
  25. Shahid, A.; Wilkinson, K.; Marcu, S.; Shapiro, C.M. (Eds.) Visual Analogue Scale to Evaluate Fatigue Severity (VAS-F). In STOP, THAT and One Hundred Other Sleep; Springer: New York, NY, USA, 2012; pp. 399–402. [Google Scholar]
  26. Lee, K.A.; Hicks, G.; Nino-Murcia, G. Validity and reliability of a scale to assess fatigue. Psychiatry Res. 1991, 36, 291–298. [Google Scholar] [CrossRef]
  27. Benedetto, S.; Carbone, A.; Drai-Zerbib, V.; Pedrotti, M.; Baccino, T. Effects of luminance and illuminance on visual fatigue and arousal during digital reading. Comput. Hum. Behav. 2014, 41, 112–119. [Google Scholar] [CrossRef]
  28. Heuer, H.; Hollendiek, G.; Kröger, H.; Römer, T. Rest position of the eyes and its effect on viewing distance and visual fatigue in computer display work. Z. Exp. Angew. Psychol. 1989, 36, 538–566. [Google Scholar]
  29. Fredenburg, P.; Harwerth, R.S. The relative sensitivities of sensory and motor fusion to small binocular disparities. Vis. Res. 2001, 41, 1969–1979. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of the HUD optical imaging principle and light path.
Figure 2. W-HUD projection distances and the field of view versus AR-HUD.
Figure 3. Schematic diagram of the principle of horizontal and vertical parallax.
Figure 4. The perceptual misalignment of the image entering the left and right eyes.
Figure 5. Parallax projection method and left/right eye image position deviation: (a) images projected and delivered to the left and right eyes separately through 3D glasses to present parallax similar to that caused by a HUD display; (b) the final image observed by the two eyes, i.e., the effect of overlapping the left- and right-eye images, which are misaligned in the horizontal and vertical directions.
Figure 6. Parallax experimental scene layout: (a) left and right eye independent channel input screen; (b) overlapping binocular images after projection; (c) switchable 3D glasses with parallax virtual image display effect after wearing.
Figure 7. Schematic diagram of how the overall and stepped parallax varies: (a) distribution of different horizontal overall parallaxes in the horizontal direction of the image; (b) distribution of different horizontal stepped parallaxes in the horizontal direction of the image; (c) distribution of different overall vertical parallaxes in the vertical direction of the image; (d) distribution of different stepped vertical parallaxes in the vertical direction.
Figure 8. Schematic diagram of the experimental scenario.
Figure 9. Trends in VAS-F and VFS questionnaire scores under overall parallax conditions: (a) trend in VAS-F assessment scores had the smallest fatigue score when the overall parallax was 3.31 mrad; (b) trend in VFS assessment scores had the smallest fatigue score when the overall parallax was 3.31 mrad.
Figure 10. Trends in VAS-F and VFS questionnaire scores in stepped parallax conditions: (a) trend in VAS-F assessment scores had the smallest fatigue score when the stepped parallax was 0.85 mrad; (b) trend in VFS assessment scores had the smallest fatigue score when the stepped parallax was 0.85 mrad.
Figure 11. Trends in total performance and error rate of D2 test task under horizontal overall parallax conditions: (a) trend in TN-E on the D2 performance task, with no significant differences across parallax conditions; (b) trend in E on the D2 performance task, with no significant differences across parallax conditions.
Figure 12. Results of the analysis of variance of total performance and concentration under different horizontal stepped parallax conditions. (a) When the horizontal stepped parallax value was greater than 3.31 mrad, the TN-E for D2 task performance differed significantly from conditions with smaller values. (b) When the horizontal stepped parallax value was greater than 3.31 mrad, the CP for D2 task performance differed significantly from conditions with smaller values. * p < 0.05, ** p < 0.01, *** p < 0.005, **** p < 0.001, ns: not significant.
Figure 13. Results of the variability analysis of omission error rate and violation error rate under different horizontal stepped parallax conditions. (a) When the horizontal stepped parallax value was greater than 3.31 mrad, the E1 for D2 task performance differed significantly from conditions with smaller values. (b) When the horizontal stepped parallax value was greater than 3.31 mrad, the E2 for D2 task performance differed significantly from conditions with smaller values. * p < 0.05, ** p < 0.01, *** p < 0.005.
Figure 14. Comparison of the fitted trends in overall D2 task performance with parallax step values.
Figure 15. Comparison of the fitted trend of omission error rate with parallax step value for D2 task performance.
Figure 16. Trends in VAS-F and VFS fatigue evaluation scores under different vertical overall parallax conditions: (a) the VAS-F assessment scores were lowest when the overall parallax was 2.24 mrad; (b) the VFS assessment scores were lowest when the overall parallax was 0 mrad.
Figure 17. Trends in visual fatigue score error for a vertical parallax step change: (a) trend in VAS-F assessment scores, with no significant differences across stepped parallax conditions; (b) trend in VFS assessment scores, with no significant differences across stepped parallax conditions.
Figure 18. Results of the variability analysis of concentration and fluctuation ratio under different vertical stepped parallax conditions. (a) When the vertical stepped parallax value was greater than 2.24 mrad, the CP for D2 task performance differed significantly from conditions with smaller values. (b) When the vertical stepped parallax value was greater than 2.24 mrad, the FR for D2 task performance differed significantly from conditions with smaller values. * p < 0.05, ** p < 0.01.
Table 1. Parallax values for the overall parallax modes. The parallax values were obtained by actual measurement after display. Overall parallax means that the parallax is the same at all positions in the image.

No. | Variable Name | Parallax Values (mrad)
1 | Overall horizontal parallax | 0, 1.65, 3.31, 6.24, 10.61, 13.67, 20.97
2 | Overall vertical parallax | 0, 2.24, 4.05, 6.61, 8.96, 12.27, 15.15
Table 2. Parallax difference values for the stepped parallax modes. The parallax differences were obtained by actual measurement after display. Stepped parallax means that the parallax of icons at neighboring positions in the image differs by the listed value.

No. | Variable Name | Parallax Differences (mrad)
1 | Stepped horizontal parallax | 0, 0.85, 1.65, 2.45, 3.31, 4.05, 4.64
2 | Stepped vertical parallax | 0, 0.53, 1.71, 2.24, 2.88, 3.41, 4.05
Table 3. Results of the stepped horizontal parallax D2 test task analysis. Nonparametric tests showed significant differences in each evaluation indicator; the p-values for TN-E, E1, E2, and CP were less than 0.01.
Parallax (Median) | TN | E1 | E2 | TN-E | CP | FR
0 (n = 17) | 652.000 | 1.380 | 0.180 | 627.000 | 180.000 | 5.000
0.85 (n = 17) | 652.000 | 1.640 | 0.310 | 635.000 | 185.000 | 6.000
1.65 (n = 17) | 652.000 | 1.680 | 0.300 | 632.000 | 180.000 | 4.000
2.45 (n = 17) | 644.000 | 1.580 | 0.340 | 620.000 | 176.000 | 6.000
3.31 (n = 17) | 649.000 | 2.760 | 0.520 | 612.000 | 165.000 | 8.000
4.05 (n = 17) | 627.000 | 7.750 | 1.670 | 565.000 | 129.000 | 9.000
4.64 (n = 17) | 614.000 | 11.300 | 1.760 | 514.000 | 92.000 | 12.000
Kruskal–Wallis test statistic H | 13.956 | 38.112 | 30.650 | 43.342 | 45.950 | 15.079
p-value | 0.030 * | 0.000 ** | 0.000 ** | 0.000 ** | 0.000 ** | 0.020 *
* p < 0.05, ** p < 0.01.
Table 4. Results of the nonparametric test analysis of the D2 test results under stepped vertical parallax conditions.
Parallax (Median) | TN | E1 | E2 | TN-E | CP | FR
0 (n = 17) | 642.000 | 1.850 | 0.470 | 617.000 | 180.000 | 5.000
0.53 (n = 17) | 625.000 | 2.080 | 0.610 | 603.000 | 167.000 | 6.000
1.71 (n = 17) | 631.000 | 2.110 | 0.470 | 608.000 | 174.000 | 4.000
2.24 (n = 17) | 635.000 | 2.910 | 0.910 | 608.000 | 165.000 | 6.000
2.88 (n = 17) | 619.000 | 3.180 | 0.760 | 590.000 | 157.000 | 8.000
3.41 (n = 17) | 606.000 | 4.140 | 0.560 | 566.000 | 157.000 | 9.000
4.05 (n = 17) | 630.000 | 3.960 | 1.080 | 564.000 | 146.000 | 12.000
Kruskal–Wallis test statistic H | 6.031 | 9.798 | 12.370 | 11.476 | 14.293 | 15.079
p-value | 0.420 | 0.133 | 0.054 | 0.075 | 0.027 * | 0.020 *
* p < 0.05.

Share and Cite

MDPI and ACS Style

Song, J.; Fan, Z.; Xu, W.; Ji, Z.; Lin, Y. Parallax of Head-Up Displays and Visual Safety for Driving. Appl. Sci. 2023, 13, 13189. https://doi.org/10.3390/app132413189

