Article

Augmented Reality Head-Up Display Navigation Design in Extreme Weather Conditions: Enhancing Driving Experience in Rain and Fog

College of Design, Graduate School, Hanyang University, Seoul 04763, Republic of Korea
* Author to whom correspondence should be addressed.
Electronics 2025, 14(9), 1745; https://doi.org/10.3390/electronics14091745
Submission received: 10 March 2025 / Revised: 18 April 2025 / Accepted: 20 April 2025 / Published: 25 April 2025

Abstract

This study investigates the impact of extreme weather conditions (specifically heavy rain and fog) on drivers’ situational awareness by analyzing variations in illumination levels. The primary objective is to identify optimal color wavelengths for low-light environments, thereby providing a theoretical foundation for the design of augmented reality head-up displays (AR-HUDs) in adverse weather conditions. A within-subjects experimental design was employed with 26 participants in a simulated driving environment. Participants were exposed to different illumination levels and AR-HUD colors. Eye-tracking metrics, including fixation duration, visit duration, and fixation count, were recorded alongside situational awareness ratings to assess cognitive load and information processing efficiency. The results revealed that the yellow AR-HUD significantly enhanced situational awareness and reduced cognitive load in foggy conditions. While subjective assessments indicated no substantial effect of lighting conditions, objective measurements demonstrated the superior effectiveness of the yellow AR-HUD under foggy weather. These findings suggest that yellow AR-HUD navigation icons are more suitable for extreme weather environments, offering potential improvements in driving performance and overall road safety.

1. Introduction

Head-Up Display (HUD) is a transparent display technology that projects critical information (e.g., speed, navigation) directly into the driver’s line of sight, minimizing gaze deviation to enhance situational awareness and driving safety [1]. HUD provides essential information, such as navigation, vehicle status, and environmental data, thereby significantly enhancing situational awareness [2]. Augmented Reality HUD (AR-HUD) integrates virtual elements (e.g., navigation arrows) with real-world road scenes via projection, enabling intuitive driver guidance [3]. AR-HUD further assists navigation by mitigating the effects of external environmental degradation. These systems improve driver attention and facilitate the recovery of cognitive resources, ultimately enhancing driving performance through visual interaction [4]. Research has shown that directly overlaying AR-HUD navigation and environmental information onto the driver’s line of sight can enhance both cognitive accessibility and visual clarity [5]. However, drivers frequently encounter varying road conditions, and extreme weather (e.g., fog or rain) can cause a deterioration in visual performance. This phenomenon increases driving risks, posing threats to both drivers and pedestrians [6].
Situational awareness (SA) is a multifaceted concept, encompassing the perception, understanding, and projection of elements within the environment [7]. When drivers are exposed to extreme weather conditions, such as fog or rain, the degradation of the visual environment significantly threatens the operational safety and efficiency of ground vehicles [8]. Owens and Tyrrell explained the degradation of visual functions under varying illumination conditions through the Selective Degradation Hypothesis [9]. Visual complexity, driving scenarios, road types, scene illumination, and informational elements all influence situational awareness and contribute to driver accidents [10,11]. In this sense, situational awareness is the ability to perceive, understand, and predict relevant factors in complex or dynamic environments in support of timely decision making. AR-HUD systems process real-time video images to identify weather conditions and provide early warnings for hazardous weather types, thereby reducing the risk of traffic accidents [12]. These systems are capable of detecting obstacles and displaying critical driving information, achieving an approximate recognition rate of 73% under adverse weather conditions [3]. Park proposed an AR-based HUD system specifically designed for robust obstacle recognition under adverse weather conditions. The system consists of four components: a ground obstacle detection module, an object decision module, an object recognition module, and a display module [3]. Deng proposed an innovative AR-HUD system that provides drivers with a stereoscopic scene. The system consists of two conventional HUD displays and supports parallax through additive light field decomposition; the optical paths and illumination of the two displays are precisely calibrated for the two views to enhance situational awareness in low-illumination environments [13]. Deng suggested that lane enhancement through AR-HUD could improve lateral control performance under foggy conditions [14].
Although researchers have explored methods to enhance situational awareness in rainy and foggy weather conditions to reduce unpredictable hazardous events, they have overlooked the impact of AR-HUD information design on drivers under different extreme weather conditions. The effectiveness of AR-HUD interfaces varies depending on the perceived color of the graphical elements, which can be either informative or distracting to situational awareness. Thus, this study focuses on the visual design of AR-HUD navigation graphics, specifically exploring how the color of these graphics influences situational awareness in driving environments under rain and fog extreme weather conditions.
Unlike clear weather, the brightness during foggy or rainy conditions is significantly lower than that of sunny days due to the scattering and absorption of light by water droplets and fog particles in the atmosphere [15]. Extreme weather phenomena are challenging to observe clearly due to their complex and variable nature. They are influenced by various factors, including atmospheric conditions, geographical features, and climate change, which can reduce color perception and clarity, making driving in extreme weather particularly challenging [16]. In contrast, clear weather typically improves visibility, stabilizes road conditions, and allows for safer driving speeds, thereby reducing the likelihood of accidents [17]. Tu’s study shows that emergency driving under reduced visibility leads to altered reactions, and the fidelity of driving simulators also affects the estimation of these behaviors [18]. Wet road surfaces and signal control significantly affect the severity of collisions during foggy conditions but have no effect on collision severity in clear weather [19]. Driving in heavy rain severely impacts a driver’s visibility, affecting performance under wet conditions and creating safety concerns [20]. Many studies on HUD under extreme weather conditions have focused primarily on improving driving safety by integrating advanced warning systems, enhancing visibility, and ensuring adaptive brightness control. These systems use video processing and weather forecasting models to alert drivers to hazardous weather conditions, reducing the risk of accidents in extreme weather [12]. Another approach focuses on enhancing the clarity of HUD images and their adaptability to environmental conditions. The use of micro-projectors and Digital Light Processing (DLP) technology enables HUDs to provide high brightness and resolution, ensuring that critical information is visible even in low-visibility conditions such as night driving or storms [12]. Kumar notes that adaptive brightness adjustment systems modify the visibility of the HUD based on external illumination conditions, ensuring safe operation during weather changes such as heavy rain or fog [21]. Additionally, user-centered design approaches have explored how AR-HUD can enhance driver awareness by displaying key information in real time, such as road conditions, speed, and weather hazards [22]. However, while the aforementioned studies have focused on enhancing situational awareness during extreme weather to improve driving safety, they overlook the impact of navigation information graphics on driver situational awareness under extreme weather conditions.
Studies have shown that color information can shorten reaction times at lower brightness contrast, with reaction times influenced by hue and color contrast, while other studies suggest that reaction time is also related to wavelength [23]. In imaging, the visible spectrum is used to capture images detectable by the human eye, typically ranging from 380 to 750 nm. Long-wavelength colors include red, orange, and yellow [24]. The wavelength of light significantly impacts visual perception in the driving environment, as different wavelengths affect visibility, color recognition, and glare. Shorter wavelengths (blue light) can enhance contrast and visibility under low-illumination conditions, while longer wavelengths (red light) can reduce glare and improve comfort during night driving [25]. Long’s study indicates that wavelength significantly affects dynamic visual acuity (DVA) in driving environments; in particular, long-wavelength light is more suitable for use under low illumination conditions [26]. Gabbard et al. proposed a perceptual color matching method for examining color blending in AR-HUD graphics. Their findings suggest that, under clear daylight conditions, blue and green are more robust than red [27]. Previous research has primarily focused on the effects of colors on driving behavior, attention, and subjective experience but has largely overlooked the unique advantages of wavelength colors in enhancing visual prominence under extreme weather conditions. Kumar suggested that long-wavelength colors can stand out in the soft backgrounds caused by fog to enhance the driver’s situational awareness [21]. Shorter wavelengths (blue light) are more scattered in the atmosphere, which reduces clarity and contrast, especially in fog or rain. In contrast, longer wavelengths (red light) have better penetration under such conditions [28]. The color in augmented reality applications or device interfaces has a tangible impact on usability [29]. This highlights the importance of selecting appropriate spectral intervals for lighting to enhance the distinction between colored surfaces and backgrounds [30]. This study aims to compare the impact of long-wavelength colors in AR-HUD navigation graphics on situational awareness of drivers in rain and fog conditions, exploring which colors in the long-wavelength range are more effective in helping drivers obtain navigation information efficiently while maintaining attention to the road environment, thereby reducing accidents and violations [31].
Although HUD color plays a crucial role in driver perception, it is important to consider other factors such as environmental illumination and the complexity of the HUD. These elements can interact with color choices, affecting the overall effectiveness of the HUD [32]. On clear days, solar irradiance reaches its maximum, providing optimal illumination conditions. In contrast, due to the presence of water droplets in the atmosphere, solar irradiance is reduced on foggy and rainy days [33]. During heavy rain, illumination levels are significantly reduced. A study comparing illumination levels under different weather conditions shows that during particularly heavy rainfall, illumination drops significantly [34]. Fog scatters light, causing attenuation, halos, and masking effects, which reduce visibility and illumination levels [35]. While heavy rain and fog both have significant impacts on driving scenes, they do so through different mechanisms. The illumination characteristics of rainy and foggy days are different due to variations in scattering mechanisms, atmospheric conditions, and particle sizes [36]. The average brightness levels during heavy rain and fog are influenced by various factors, including light scattering, color temperature, and road illumination conditions. Research indicates that the recognition distance of obstacles varies greatly in rainy and foggy conditions, with visibility differences reaching up to 72.86% under the same illumination conditions [37]. Therefore, considering the differences between heavy rain and fog, this study conducts an experimental analysis solely from the perspective of lighting brightness. Long-wavelength colors will be incorporated into scenarios with two different illumination levels (heavy rain and fog) to measure their impact on the driver’s situational awareness, thereby assisting the driver in more effectively perceiving information from both the interface and the surrounding environment.
In summary, research on AR-HUD primarily focuses on the influence of advanced warning systems, enhanced image clarity, graphic brightness, and functional graphics (such as road conditions, speed, and weather) on driver behavior, attention, subjective acceptance, and satisfaction. However, it overlooks whether the design of AR-HUD information influences the driver’s situational awareness during navigation in extreme weather conditions with varying illumination levels. In this regard, how should the color and brightness of AR-HUD navigation graphics be designed to accommodate different illumination levels in heavy rain or fog, enabling drivers to rapidly and effectively perceive information while improving their situational awareness to ensure safe driving? Therefore, this study aims to investigate the effects of long-wavelength colors in AR-HUD navigation graphics and different average brightness levels in rainy and foggy weather conditions on drivers’ situational awareness and user experience by analyzing situational awareness, eye-tracking metrics, and subjective evaluations of the user experience. In pursuit of these objectives, this study addresses the following research questions (RQs):
  • RQ1: How do different colors of AR-HUD designs influence perception during driving in extreme weather conditions?
  • RQ2: Does the color of the HUD affect drivers’ situational awareness depending on the scene illumination?
For RQ1, we reviewed relevant colors suitable for AR-HUD and analyzed those that enhance situational awareness. The experiment combined the SART questionnaire with eye tracking to obtain both subjective and objective quantitative results. To answer RQ2, we collected 50 photos of rain and fog in various conditions from the internet and calculated their average illumination. Then, we recreated two scenes with different illumination levels using Unity (Version 2023.2.20f1, Unity Technologies, San Francisco, CA, USA). Additionally, scenarios with weak situational awareness were included in each scene, and the interaction between colors and different illumination levels was examined through the experimental setup.

2. Methods

2.1. Experimental Participants and Equipment

The sample size for this study was estimated using G*Power 3.1.9.7 (Düsseldorf, Germany). G*Power is a statistical power analysis program used to calculate required sample sizes, effect sizes, and statistical power for various study designs [38]. A statistical power of 0.8 or higher is generally considered reliable. According to Cohen’s guidelines for interpreting the practical significance of statistical results, an effect size (f) below 0.1 is considered small, approximately 0.25 is medium, and above 0.4 is large; medium and large effect sizes are typically deemed acceptable [39]. To ensure the reliability and validity of the study, a significance level (α) of 0.05, an effect size (f) of 0.25, and a statistical power of 0.95 were set, resulting in a minimum sample size of 18 participants.
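As a rough cross-check of this kind of calculation, the minimum sample size for a one-group repeated-measures ANOVA can be approximated directly from the noncentral F distribution, as in the sketch below. The correlation among repeated measures (rho), the number of measurement levels (m), and the sphericity correction (eps) are assumptions here, since the exact G*Power settings are not reported; the resulting N is sensitive to them and will not necessarily reproduce the reported value of 18.

```python
# Sketch of a sample-size check for a one-group repeated-measures ANOVA,
# loosely mirroring G*Power's "within factors" procedure. rho, m, and eps are
# assumed values (not reported in the paper), so the output is illustrative only.
from scipy.stats import f as f_dist, ncf

def rm_anova_power(n, f_effect=0.25, m=4, rho=0.5, alpha=0.05, eps=1.0):
    """Approximate power of a repeated-measures ANOVA with m within-subject levels."""
    df1 = (m - 1) * eps
    df2 = (n - 1) * (m - 1) * eps
    lam = f_effect**2 * m * n * eps / (1.0 - rho)   # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, df1, df2)
    return 1.0 - ncf.cdf(f_crit, df1, df2, lam)

# Smallest N reaching the target power of 0.95 under the assumed settings.
n = 2
while rm_anova_power(n) < 0.95:
    n += 1
print(f"minimum N for power >= 0.95 (rho = 0.5 assumed): {n}")
```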
We recruited 26 participants with an approximately balanced gender distribution. Their ages ranged from 20 to 25 years, with a mean age of 22.04 years (SD = 1.62). All participants held valid driver’s licenses, with an average driving experience of 2.19 years (SD = 1.08). The majority had attained a bachelor’s degree and possessed a fundamental understanding of driver-assistance technology, while some had prior experience using HUD. To ensure consistency, all participants were required to have normal or corrected-to-normal vision, hold a valid driver’s license, and be right-handed. Eye movements were captured and analyzed using a Tobii desktop eye tracker.

2.2. Experimental Materials

2.2.1. AR-HUD Navigation Graphics Color and Scene Illumination Design

This study used a two-way within-subjects design. The independent variables were the long-wavelength colors of AR-HUD navigation graphics and weather illuminance levels, while the dependent variable was situational awareness. According to current research, long-wavelength colors are defined as red and yellow, and different illuminance levels were applied to rainy and foggy weather conditions. Consequently, four combinations of AR-HUD navigation graphics with varying long-wavelength colors and scene illumination levels were created.
Dynamic navigation graphics were created using Adobe After Effects (Version 24.6, Adobe Inc., San Jose, CA, USA). Based on previous studies, the AR-HUD navigation graphics were designed as flashing animated navigation graphics [40], which have been shown to be relatively robust. Long wavelengths in the visible spectrum generally refer to colors with wavelengths greater than 580 nm. Yellow light has a wavelength range of 580–595 nm, orange light ranges from 595 to 605 nm, and red light ranges from 605 to 700 nm [41]. The corresponding colors of the graphics were red (R: 255, G: 0, B: 0), orange (R: 255, G: 165, B: 0), and yellow (R: 255, G: 255, B: 0) [42]. Since orange was seldom mentioned in previous studies, this experiment tested only red and yellow within the long-wavelength color range. Figure 1 shows the effects of the two long-wavelength colors used in the AR-HUD navigation graphics.
Rainy and foggy weather conditions result in significant differences in illuminance levels due to the distinct characteristics of each type of weather. Daytime fog brightness is influenced by parameters such as droplet size and the position of the sun, which leads to different illumination levels compared to rainy conditions. Fog creates a glowing veil, reducing visibility, while rain typically causes more uniform light scattering [37]. Due to weather limitations, we were unable to study various rain and fog conditions in a short period. Therefore, 50 images were gathered from online sources, each representing different illuminance levels for rainy and foggy weather. The rainy images include light rain, moderate rain, heavy rain, and torrential rain [43,44], while the foggy images include light fog and dense fog [45]. The average illuminance levels for rainy and foggy days were calculated using the Image Color Summarizer tool [46]. Figure 2 and Figure 3 show sample images collected for foggy and rainy conditions, respectively. Table 1 presents the experimental scene illumination levels of rainy (47) and foggy (62) conditions. Thus, this study utilized a 2 × 2 factorial experimental design, resulting in four combinations: two colors (red and yellow) and two illuminance levels (rainy and foggy). Figure 4 displays the final experimental materials.
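For reference, the average brightness of such photographs can also be approximated offline. The sketch below is an assumed Pillow/NumPy stand-in for the Image Color Summarizer step, averaging per-pixel luma over hypothetical fog/ and rain/ folders; its absolute values need not match the online tool.

```python
# A minimal sketch of estimating average scene brightness for collected rain
# and fog photographs. The paper used the online Image Color Summarizer; this
# is an assumed equivalent that averages per-pixel luma (0-255).
from pathlib import Path
import numpy as np
from PIL import Image

def mean_brightness(path: Path) -> float:
    """Average luma of an image, using Rec. 601 weighting."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return float(luma.mean())

# Hypothetical folder layout: one directory of fog photos, one of rain photos.
for condition in ("fog", "rain"):
    files = sorted(Path(condition).glob("*.jpg"))
    if files:
        values = [mean_brightness(p) for p in files]
        print(f"{condition}: mean brightness {np.mean(values):.1f} over {len(files)} images")
```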

2.2.2. AR-HUD Assisted Driving Simulation Scenarios

Prior studies of situational awareness during AR-HUD-assisted driving have relied on static images [40,47], video playback [48,49,50], and simulator-based objective measurements. Due to technical limitations, we were unable to directly enhance the navigation information display within the driving simulator. Therefore, this study used video material to simulate a first-person driving scenario as realistically as possible. Subsequently, both objective and subjective data from participants were obtained through experimentation. The driving scenes and animations were created and recorded using Unreal Engine 5 [51], with navigation warning graphics overlaid onto the video scenes using Adobe After Effects. The road types in the simulation scenarios are based on real-life roads. Finally, four types of experimental materials were created and exported.
Based on current research and real-world driving conditions, rain and fog can obscure visibility and interfere with sensor data acquisition, making it more difficult to detect obstacles and lane markings [52,53]. The impaired recognition of vehicles, pedestrians, and cyclists during foggy weather can lead to collision accidents [54]. Due to the reduction in depth cues, situational awareness in foggy and rainy weather is impaired. A decrease in brightness and contrast makes warning information in the environment more difficult to perceive, potentially causing drivers to miss intersections or even cause accidents [55,56,57]. Therefore, the Ego-Car in the video drives under rainy and foggy conditions at a constant speed on an urban road containing buildings, trees, traffic lights, street lamps, road signs, vehicles, and pedestrians. In each video, the vehicle experiences four low-illumination scenarios, with the Ego-Car making turns and proceeding straight. Meanwhile, four low-situational-awareness test scenarios were set up, each occurring once.

2.3. Experimental Procedure

We recruited 26 participants with valid driving licenses and required them to carefully read and sign an informed consent form. All participants were briefly introduced to the research background and objectives upon recruitment and before entering the laboratory. Participants were asked to assume the role of the driver while watching first-person perspective videos and respond according to the navigation prompts displayed on the AR-HUD interface.
After ensuring that all participants understood the experimental procedure, vision calibration was performed using a desktop eye tracker. Following the calibration, participants were instructed to minimize body movement and maintain the same position as during the visual calibration. To ensure familiarity with the system, all participants practiced using the Tobii Pro Lab (Version 1.210) on the desktop eye tracker with a simulated video. In the formal experiment, participants read the instructions and initiated each experiment by pressing a button. The four experiments were presented in a random order. After completing each experiment, participants completed the SART [7,29] and DALI [58] questionnaires; this procedure was repeated until all four experiments were finished. The specific experimental procedure is shown in Figure 5 and Figure 6.
The experimental scenarios combined various types of roads to create a driving environment to assess whether long-wavelength colors, such as red and yellow, can improve the driver’s situational awareness when used in different illumination levels in rainy and foggy conditions. The experimental materials involved an urban scenario in which the Ego-Car made three turns: two right turns and one left turn. The Ego-Car drove on an urban road with a left-turn lane on the far right, requiring the Ego-Car to make a left turn. Afterward, the Ego-Car followed the lead car and changed lanes to the right-turn lane, turning right and proceeding straight. Finally, a pedestrian crossed in front of the Ego-Car at an intersection.

2.4. Data Collection and Analysis

To investigate the impact of long-wavelength colors in AR-HUD navigation graphics and different illumination levels in rain and fog conditions on the driver’s situational awareness, this study adopts a multidimensional evaluation method, combining both objective and subjective indicators to comprehensively analyze the effectiveness of the AR-HUD navigation system. Objective indicators primarily assess the visual attention distribution of the driver through eye-tracking data, while subjective indicators include the SART (Situational Awareness Rating Technique) scale and DALI (Driving Activity Load Index), which evaluate the cognitive load of driving activities. Higher SART scores and lower DALI scores indicate a better driving experience. By synthesizing these indicators, we aim to test the hypotheses and gain deeper insights into the impact of AR-HUD navigation systems on driving behavior under low-illumination rainy and foggy weather conditions.
By defining the AR-HUD navigation graphics as “Areas of Interest” (AOIs), this study recorded participants’ average fixation duration, average visit duration, and number of fixations [59]. Because rain and fog degrade vision relative to clear weather, fixation duration and fixation count in rainy and foggy weather were higher than in clear weather. Average fixation duration refers to the length of a single fixation on the navigation graphic; shorter fixation durations indicate that the AR-HUD navigation graphics are simple and easy to understand (400–600 ms is a common range). However, an excessively short fixation duration might indicate insufficient attention to the graphic, which could reduce navigation effectiveness. A fixation duration of more than 600 ms typically indicates that the user needs more time to process information, potentially resulting in cognitive load. Average visit duration refers to the total time spent by the participant fixating on the target area during a navigation task. A visit duration below roughly 500 ms under clear weather conditions suggests that the navigation graphic is simple and intuitive, allowing the user to quickly obtain the necessary information. Durations between 500 ms and 2 s are common in medium-complexity areas, where the user needs to quickly understand the information, which may lead to neglect of the surrounding driving environment or cause cognitive load. The number of fixations refers to the total number of times the navigation graphic area was fixated upon during the task. Under normal clear weather conditions, 1–3 fixations indicate relatively low attention, possibly suggesting that the graphic design is clear and information transmission is straightforward. A total of 5–6 fixations indicates that users are somewhat attentive to the information in that area, which is typical of moderately complex scenarios. More than 7 fixations may suggest that the user is repeatedly processing the information. Because of the extreme weather and low-illumination environments, these values may be somewhat elevated, which is within the normal range. These metrics assess the participant’s attention distribution and cognitive processing related to the navigation graphics (see Table 2).
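The sketch below illustrates how these three AOI metrics could be derived from a fixation-level export. The column names and the AOI-hit flag are assumptions for illustration; Tobii Pro Lab can also export equivalent metrics directly.

```python
# A minimal sketch of computing the three AOI metrics (average fixation
# duration, fixation count, average visit duration) from hypothetical
# fixation-level data. Column names are assumed, not Tobii's actual export schema.
import pandas as pd

# One row per fixation: duration in ms, a visit id grouping consecutive
# fixations inside the AOI, and whether the fixation landed on the AOI.
fixations = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2],
    "aoi_hit":     [True, True, False, True, True, True],
    "duration_ms": [420, 510, 380, 600, 450, 530],
    "visit_id":    [1, 1, None, 2, 1, 1],
})

aoi = fixations[fixations["aoi_hit"]]
per_participant = aoi.groupby("participant").agg(
    avg_fixation_duration_ms=("duration_ms", "mean"),
    fixation_count=("duration_ms", "size"),
)
# Average visit duration: sum fixation durations within a visit, then average over visits.
visits = aoi.groupby(["participant", "visit_id"])["duration_ms"].sum()
per_participant["avg_visit_duration_ms"] = visits.groupby(level="participant").mean()
print(per_participant)
```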
This study uses the SART scale’s three core dimensions of demand, supply, and clarity to assess the driver’s situational awareness level. Meanwhile, in order to ensure the questionnaire’s usefulness, the scale was expanded to include additional dimensions, such as information quantity, information quality, complexity, uncertainty, predictability, stress, and overall satisfaction [60,61,62]. Higher scores indicate higher situational awareness. The demand dimension evaluates the task’s demand on the individual’s attention resources, while the supply dimension assesses the available attention resources. The clarity dimension measures the individual’s understanding of the current information in the situation. Information quantity assesses the amount of information presented in the situation, while information quality evaluates whether the information is accurate, clear, and useful. Complexity measures the dynamic and complex nature of the situation, while uncertainty evaluates the ambiguity or uncertainty of the information. Predictability assesses the individual’s ability to predict future changes in the situation. Stress measures whether the task or situation causes emotional or psychological stress on the individual. Overall satisfaction evaluates the driver’s general experience using the AR-HUD. This study tests the subjective impressions of situational awareness regarding the AR-HUD color graphics under different weather illumination levels using a 7-point Likert scale, with 1 representing “strongly disagree” and 7 representing “strongly agree.” The final analysis is based on the total score of 70 points from the ten dimensions.
Lastly, the DALI was used to evaluate the cognitive load of the experimental task, with lower scores indicating lower load [57]. The index includes six evaluation dimensions: attention demand, interference, situational stress, temporal demand, visual demand, and auditory demand. In this study, auditory demand was excluded, as the interface does not include sound. Attention demand measures the degree to which the driver needs to concentrate on the task, while interference measures the conflict between experimental tasks and other activities (such as using a phone or conversing with passengers). Situational stress evaluates the psychological and emotional impact of the driving context, while temporal demand assesses the urgency of completing the task. Visual demand measures the visual attention required during driving. We used a 7-point Likert scale, with 1 representing “strongly disagree” and 7 representing “strongly agree”.
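The scoring described above amounts to a sum over the ten 7-point SART items (maximum 70, higher is better) and an arithmetic mean over the five retained DALI dimensions (lower is better). A minimal sketch with hypothetical responses from one participant follows; whether any negatively worded SART items are reverse-coded before summing is not specified here, so the sketch uses a plain sum as described in the text.

```python
# Hypothetical single-participant responses illustrating the questionnaire scoring.
sart_items = {
    "demand": 5, "supply": 6, "clarity": 6, "information_quantity": 5,
    "information_quality": 6, "complexity": 4, "uncertainty": 5,
    "predictability": 5, "stress": 4, "overall_satisfaction": 6,
}
dali_items = {
    "attention_demand": 3, "interference": 2, "situational_stress": 2,
    "temporal_demand": 3, "visual_demand": 3,
}

sart_total = sum(sart_items.values())                      # out of 70; higher = better SA
dali_score = sum(dali_items.values()) / len(dali_items)    # arithmetic mean; lower = lower load
print(f"SART total: {sart_total}/70, DALI mean: {dali_score:.2f}")
```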

3. Results

The color of AR-HUD navigation graphics significantly affects drivers’ behavior and subjective experience under rain and fog illumination conditions. This study investigates the impact of two long-wavelength colors in two different illumination scenarios by analyzing objective eye movement data, including average fixation duration, average visit duration, and fixation count, as well as subjective response data from a situation awareness questionnaire.

3.1. Situation Awareness Results

This study used the SART scale to assess the drivers’ situation awareness, with higher scores indicating a higher level of situation awareness [7]. The descriptive statistics of the scale, ranked in descending order (Figure 7, Table 3), are as follows: yellow fog (YF: 58.77 ± 5.95), yellow rain (YR: 52.65 ± 6.08), red fog (RF: 36.62 ± 6.17), and red rain (RR: 31.92 ± 5.73).
To examine the effects of color and illumination on situational awareness, a two-way repeated measures analysis of variance (ANOVA) was conducted, with color and illumination as independent variables and situational awareness as the dependent variable. Prior to analysis, the data were confirmed to follow a normal distribution; where the assumption of homogeneity of variance was not met, Welch’s correction was applied. As shown in Table 4, the main effect of color on situational awareness was statistically significant (F = 333.680, p < 0.001, ηp² = 0.769). Similarly, the main effect of illumination was also significant (F = 21.193, p < 0.001, ηp² = 0.175). These effect sizes exceed commonly accepted thresholds for large effects in human factors research. However, the interaction effect between color and illumination was not significant (F = 0.367, p = 0.546, ηp² = 0.004).
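For context, a 2 × 2 repeated-measures ANOVA of this kind can be run with statsmodels’ AnovaRM, as in the minimal sketch below. The long-format data are simulated around the reported condition means purely for illustration; the actual analysis would use each participant’s real SART totals.

```python
# A minimal sketch of the 2 x 2 repeated-measures ANOVA on SART totals.
# The data are simulated for illustration around the reported condition means.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
participants = np.repeat(np.arange(1, 27), 4)            # 26 participants x 4 conditions
colors = np.tile(["red", "red", "yellow", "yellow"], 26)
weather = np.tile(["rain", "fog", "rain", "fog"], 26)
means = {("red", "rain"): 32, ("red", "fog"): 37,
         ("yellow", "rain"): 53, ("yellow", "fog"): 59}   # approximate reported means
sart = [rng.normal(means[(c, w)], 6) for c, w in zip(colors, weather)]

data = pd.DataFrame({"participant": participants, "color": colors,
                     "weather": weather, "sart": sart})
result = AnovaRM(data, depvar="sart", subject="participant",
                 within=["color", "weather"]).fit()
print(result)  # F and p values for the two main effects and their interaction
```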
Post hoc multiple comparisons revealed that, under red AR-HUD display conditions, situational awareness scores in foggy weather were significantly higher than in rainy weather (RF: 36.62 > RR: 31.92, p = 0.006). Moreover, in foggy conditions, the yellow AR-HUD yielded significantly higher scores than the red AR-HUD (YF: 58.77 > RF: 36.62, p < 0.001). Similarly, in rainy conditions, situational awareness scores under the yellow AR-HUD were significantly higher than under the red AR-HUD (YR: 52.65 > RR: 31.92, p < 0.001). Additionally, when using a yellow AR-HUD, situational awareness scores in foggy conditions were significantly higher than in rainy conditions (YF: 58.77 > YR: 52.65, p < 0.001). Overall, the ranking of mean scores was YF > YR > RF > RR (see Table 5).

3.2. DALI Results

This study employed the DALI to assess drivers’ cognitive load. DALI is a revised version of the NASA Task Load Index (NASA-TLX), specifically designed for evaluating the workload of automobile drivers. Its fundamental framework and computational methodology are consistent with those of NASA-TLX. While the original DALI employed a weighted average for workload calculation, this approach was later deemed cumbersome. Approximately 20 years after its development, the original author officially stated that arithmetic averaging is a more sensitive measure of overall workload, as demonstrated by subsequent research. Therefore, this study adopted arithmetic averaging instead of the weighted method. Lower scores indicate reduced cognitive load and superior interface solutions [57]. The descriptive statistics of the DALI scale, ranked in descending order, are as follows: RR (5.37 ± 0.18), RF (5.25 ± 0.20), YR (3.57 ± 0.23), and YF (2.11 ± 0.24) (see Figure 8, Table 6).
To examine the effects of color and illumination on cognitive load, a two-way repeated measures ANOVA was conducted, with color and illumination as independent variables and cognitive load as the dependent variable. Normality was confirmed prior to the analysis. Table 7 shows that the main effect of color on cognitive load was significant (F = 3480.953, p < 0.001, ηp² = 0.972), as was the main effect of illumination (F = 352.646, p < 0.001, ηp² = 0.779). Additionally, a significant interaction effect was observed between color and illumination (F = 256.769, p < 0.001, ηp² = 0.720). The results far exceeded commonly accepted thresholds for large effect sizes in human factors research.
A simple effects analysis was further conducted, and the results are presented in Table 8. When using a red HUD, the cognitive load score was higher in rainy conditions than in foggy conditions, although this difference was only marginal (RR: 5.37 > RF: 5.25, p = 0.054). Under foggy conditions, the red HUD resulted in significantly higher cognitive load scores than the yellow HUD (RF: 5.25 > YF: 2.11, p < 0.001). Similarly, in rainy conditions, the red HUD led to significantly higher cognitive load scores than the yellow HUD (RR: 5.37 > YR: 3.57, p < 0.001). Furthermore, for the yellow HUD, cognitive load scores were significantly higher in rainy conditions compared to foggy conditions (YR: 3.57 > YF: 2.11, p = 0.004).

3.3. Eye-Tracking Metrics

Eye-tracking data can reflect the distribution of drivers’ attention during the driving process [29]. To examine the effects of color and illumination on eye movement behavior, a two-way repeated measures ANOVA was conducted, with color and illumination as independent variables, and average fixation duration, fixation count, and average visit duration as dependent variables. The AR-HUD navigation graphics were defined as areas of interest (AOIs), and corresponding eye movement data were recorded. Prior to analysis, data normality was verified.
Figure 9 and Table 9 illustrate the eye-tracking data collected from participants under four experimental conditions. Descriptive statistics indicate that under the YF condition, all three eye-tracking metrics—average fixation duration (M = 457.46 ms), fixation count (M = 7.04, SD = 0.77), and average visit duration (M = 1307.88 ms, SD = 323.64)—were the lowest among all conditions, falling within the normal range. A two-way ANOVA was performed to compare differences between conditions, with the significance level set at α = 0.05; the main effects were statistically significant.
The results presented in Table 10 indicate significant main effects of AR-HUD navigation graphic color (F = 73.226, p < 0.001, ηp² = 0.423) and illumination (F = 16.818, p < 0.001, ηp² = 0.144) on average fixation duration. The results far exceeded commonly accepted thresholds for large effect sizes in human factors research. However, the interaction effect between color and illumination on average fixation duration was not statistically significant (F = 0.277, p = 0.664, ηp² = 0.003).
The post hoc multiple comparisons for average fixation duration are presented in Table 11. The results indicate that the red graphics yielded significantly lower scores under foggy conditions than under rainy conditions (RF: 569.73 < RR: 734.23, p < 0.001). Similarly, the yellow graphics scored significantly lower under foggy conditions than under rainy conditions (YF: 457.46 < YR: 599.04, p < 0.001). Moreover, under foggy conditions, the yellow graphics had a significantly lower score than the red graphics (YF: 457.46 < RF: 569.73, p = 0.003). Likewise, under rainy conditions, the yellow graphics scored significantly lower than the red graphics (YR: 599.04 < RR: 734.23, p < 0.001). Overall, the ranking of scores follows the pattern RR > YR > RF > YF.
As shown in Table 12, the two-way ANOVA revealed significant main effects of AR-HUD navigation graphic color (F = 403.470, p < 0.001, ηp² = 0.801) and illumination (F = 198.904, p < 0.001, ηp² = 0.665) on fixation count. Additionally, a significant interaction effect was observed between color and illumination (F = 7.717, p = 0.007, ηp² = 0.072). The main effects of color and illumination far exceeded typical thresholds for large effects in human factors research, whereas the interaction effect (ηp²) was small. Where the data did not meet the assumption of homogeneity of variance, Welch’s correction was applied.
Further simple effects analysis results (see Table 13) indicated several significant differences. For the red display, the fixation count under rainy conditions was significantly higher than under foggy conditions (RR: 13.19 > RF: 10.15, p < 0.001). For the yellow display, the fixation count under foggy conditions was significantly lower than under rainy conditions (YF: 7.04 < YR: 9.08, p < 0.001). Additionally, under foggy conditions, the yellow display yielded a significantly lower fixation count than the red display (YF: 7.04 < RF: 10.15, p < 0.001). Similarly, under rainy conditions, the fixation count for red was significantly higher than that for yellow (RR: 13.19 > YR: 9.08, p < 0.001).
The two-way ANOVA likewise revealed significant main effects of AR-HUD navigation graphic color (F = 73.226, p < 0.001, ηp² = 0.423) and illumination (F = 16.818, p < 0.001, ηp² = 0.144) on average visit duration. The results far exceeded commonly accepted thresholds for large effect sizes in human factors research. However, the interaction between color and illumination was not statistically significant (F = 0.277, p = 0.600, ηp² = 0.003) (see Table 14).
The results of post hoc multiple comparisons showed that red graphics under foggy conditions yielded significantly shorter average visit durations than under rainy conditions (RF: 2128.42 < RR: 2601.19, p = 0.001). Similarly, yellow graphics yielded significantly shorter visit durations in fog than in rain (YF: 1307.89 < YR: 1673.12, p = 0.013). Under foggy conditions, yellow scored significantly lower than red (YF: 1307.89 < RF: 2128.42, p < 0.001), and under rainy conditions, yellow was again significantly lower than red (YR: 1673.12 < RR: 2601.20, p < 0.001). Overall, the average visit durations followed the descending order RR > RF > YR > YF (see Table 15).

4. Discussion

4.1. Discussion on Subjective Measures of SA

This study found that long-wavelength colors significantly improve drivers’ situational awareness in low-illumination conditions such as rain and fog. Among the tested colors, yellow navigation graphics were more effective than red in enhancing drivers’ situational awareness while also maintaining a manageable cognitive load. Across the different illumination conditions, cognitive load was higher in rainy conditions than in foggy conditions. However, for situational awareness, the interaction effect between color and illumination level was not significant.
The results of the SART questionnaire indicated that, compared to red graphics, yellow navigation graphics significantly enhanced drivers’ situational awareness under both rainy and foggy conditions. This finding aligns with previous research that suggests yellow, as a long-wavelength color, is more effective in low-visibility conditions [21]. The superior performance of yellow in enhancing situational awareness can be attributed to its visibility and psychological comfort, allowing drivers to process information more effectively. The significant difference in scores between yellow and red under both weather conditions emphasizes the importance of color choice in AR-HUD designs, particularly for enhancing driver safety in adverse weather conditions. No significant contrast was found regarding the illumination levels during rainy and foggy weather in the SART assessment, which may be due to the limited impact of environmental illumination changes, leading drivers to focus more on the color itself.

4.2. Discussion on DALI Scale Results

Regarding cognitive load, the results further support the effectiveness of yellow navigation graphics. Compared to red, yellow icons yielded lower scores on the DALI scale, indicating not only enhanced situational awareness but also a reduction in overall cognitive and visual workload. The lower scores in visual and temporal demands suggest that drivers can process information from yellow graphics more quickly and easily, which is crucial for maintaining safety in adverse weather conditions. The reduction in interference and stress levels associated with yellow graphics indicates that they are less distracting and more comfortable for drivers, which is vital for sustaining attention and reducing driver fatigue. In contrast, while red navigation graphics serve a warning function, their longer wavelength and lower brightness in rainy and foggy conditions may cause drivers to observe the navigation information more frequently, increasing cognitive load and visual stress. Similarly, no significant differences in illumination between rainy and foggy conditions were found in the DALI assessment.

4.3. Discussion on Eye-Tracking Metrics

Furthermore, eye-tracking data confirmed the positive impact of the yellow HUD on drivers’ attention allocation. The experimental results showed that, compared to red graphics, drivers spent less time on and had shorter visit durations for yellow navigation graphics. This suggests that yellow graphics are more effective in conveying information, allowing drivers to allocate more time to observing the road environment. The reduced fixation duration and frequency suggest that the information presented by yellow graphics is easier to comprehend and cognitively less demanding. Eye-tracking data also highlighted the challenges posed by rainy conditions, where the dynamic nature of raindrops disrupts visual clarity more than the static blurriness caused by fog. Unlike the subjective assessments, the eye-tracking data revealed that, with unchanged navigation graphics, drivers spent more time in rainy conditions than in foggy conditions in terms of average fixation duration and average visit duration. However, no significant difference in the number of fixations was found between the two weather conditions. This may be due to the continuous interference of raindrops on the windshield and the uniform scattering of light in the rainy environment [37]. In contrast, while fog reduces overall visibility, its illumination conditions are relatively stable, allowing drivers to process navigation information more efficiently.
In conclusion, the results of this study indicate that in rainy and foggy low-illumination traffic conditions, yellow AR-HUD navigation designs can significantly enhance drivers’ situational awareness, effectively reduce cognitive load, and optimize attention allocation strategies. These findings are supported by both subjective evaluations and workload measurements, as well as further validated by objective eye-tracking data. The integration of these findings suggests that AR-HUD designs should prioritize color selection, not only to enhance visibility but also to consider the psychological and cognitive impact on drivers. This provides multidimensional theoretical and practical guidance for the design of vehicle navigation systems intended for extreme weather conditions. It emphasizes the need to consider factors such as user experience, cognitive load, and visual attention distribution during the design process.

5. Conclusions and Limitations

This study systematically explored the impact of different AR-HUD colors and illumination combinations on drivers’ driving performance using simulated driving experiments and eye-tracking technology. The results revealed that in rainy and foggy road scenarios, yellow AR navigation designs, particularly under foggy conditions, significantly improved drivers’ performance in terms of situational awareness and driving cognitive load. Yellow enhanced situational awareness during driving in rainy and foggy weather, with yellow navigation graphics showing clear advantages over red ones. The comfort provided by yellow effectively guided drivers’ information processing, enabling them to perceive the surrounding environment more quickly and accurately and respond appropriately.
In terms of cognitive load, both red and yellow long-wavelength colors effectively guided drivers’ situational awareness in low-illumination environments. Additionally, the yellow HUD displayed lower workload levels across multiple dimensions, including attention demand, visual demand, and temporal demand. In eye tracking, the yellow HUD showed lower average fixation durations, fewer fixations, and shorter visit durations compared to the red HUD, indicating that yellow imposes a lower cognitive load and is better suited for processing information under extreme weather conditions. These findings are consistent with previous studies and further confirm the potential of yellow HUD navigation graphics in improving driving safety.
Regarding illumination during rainy and foggy conditions, the use of yellow and red AR-HUD required more time during rainy weather, but this was not necessarily due to illumination issues. It may instead be related to the continuous interference of raindrops on the windshield and the uniform scattering of light in the rainy environment. While fog reduced overall visibility, its illumination conditions were relatively stable, allowing drivers to process navigation information more efficiently. Additionally, the low-illumination conditions in foggy environments made yellow navigation graphics more visually prominent, further enhancing drivers’ situational awareness. This finding suggests that AR-HUD designs should be optimized based on the illumination characteristics of different weather conditions. For instance, in rainy conditions, it is important to increase the brightness and contrast of graphics to counter the interference of raindrops on visual information. In foggy conditions, adjustments to graphic color and display position can further improve the recognizability of information.
Although this study provides valuable insights, there are several limitations that require further exploration and improvement in future research. First, while the complexity of the simulated road environment was high, it could not fully replicate all variables in real-world driving. Due to equipment limitations, this study used simulated videos as experimental materials to standardize and ensure the consistency of experimental conditions. This simulation may have had some impact on the results, as it could not completely reproduce the complexity and unpredictability of real-world driving, and participants’ situational awareness ratings might have been slightly lower than in actual driving scenarios. Second, although the sample size was sufficient for statistical analysis, it was relatively small and limited to a specific age group, which may affect the generalizability of the findings. Future research should consider larger and more diverse samples, as well as real-world test scenarios, to validate these results. Furthermore, this study focused only on two colors, and exploring a broader range of colors and their combinations could provide a deeper understanding of optimal AR-HUD design. Due to weather limitations, the study could not measure real-world rain and fog illumination levels within a short time frame. Although different types of rain and fog illumination photos were collected online, they may differ slightly from actual illumination levels.
In future work, we plan to expand the sample size and adopt an interactive driving simulation to better replicate real-world driving scenarios. Our research team is currently in the process of acquiring a Logitech steering wheel set to improve the overall quality of the experiment. To enhance the interactivity of the experimental materials, Unity will be used for scenario modeling in subsequent studies. Future research will continue to explore the application of HUD under various conditions, with efforts made to collect data from real-world environments whenever possible. Where conditions permit, glasses-type eye trackers will be employed to observe situational awareness in real-time during actual driving.
Overall, this study provides valuable insights into AR-HUD navigation design and points the way for future research. As technology continues to advance and in-vehicle interactions become more widespread, optimizing AR interfaces to ensure their application potential in more complex driving scenarios will become an important research topic. Future work should focus on developing safer, more intuitive, and adaptable AR-HUD navigation systems to enhance road safety, boost drivers’ confidence, and improve their quality of life.

Author Contributions

Conceptualization, Q.Z.; methodology, Q.Z.; software, Q.Z.; validation, Z.L.; formal analysis, Q.Z.; investigation, Q.Z. and Z.L.; resources, Q.Z.; data curation, Z.L.; writing—original draft preparation, Q.Z.; writing—review and editing, Q.Z. and Z.L.; visualization, Q.Z.; supervision, Q.Z. and Z.L.; project administration, Q.Z.; funding acquisition, Q.Z. and Z.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The datasets used during the current study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
HUD      Head-Up Display
AR-HUD   Augmented Reality Head-Up Display
SA       Situational Awareness
DALI     Driving Activity Load Index
SART     Situational Awareness Rating Technique
YR       Yellow-rain
RR       Red-rain
YF       Yellow-fog
RF       Red-fog

References

  1. Thompson, L.L. A History of HUD. Available online: https://monarchhousing.org/wp-content/uploads/2007/03/hud-history.pdf (accessed on 26 January 2025).
  2. Zhang, R.; Liu, Z.; Tan, Z.; Zhang, R.; Yu, S. Research on the Influence of Vehicle Head-Up Display Warning Design on Driver Experience with Different Driving Styles. In Lecture Notes in Computer Science; Springer Nature: Cham, Switzerland, 2023; pp. 88–100. [Google Scholar] [CrossRef]
  3. Park, H.; Kim, K.H. Efficient Information Representation Method for Driver-Centered AR-HUD System. In Design, User Experience, and Usability. User Experience in Novel Technological Environments: Second International Conference, DUXU 2013, Held as Part of HCI International 2013, Las Vegas, NV, USA, July 21–26, 2013, Proceedings, Part III 2; Springer: Berlin/Heidelberg, Germany, 2013; pp. 393–400. [Google Scholar] [CrossRef]
  4. Teng, J.; Wan, F.; Kong, Y. Exploring Enhancement of AR-HUD Visual Interaction Design Through Application of Intelligent Algorithms. Int. J. Inf. Technol. Syst. Approach 2023, 16, 1–24. [Google Scholar] [CrossRef]
  5. Wu, Z.; Zhao, L.; Liu, G.; Chai, J.; Huang, J.; Ai, X. The Effect of AR-HUD Takeover Assistance Types on Driver Situation Awareness in Highly Automated Driving: A 360-Degree Panorama Experiment. Int. J. Hum.-Comput. Interact. 2023, 40, 6492–6509. [Google Scholar] [CrossRef]
  6. Strader, S. The Death Toll from Motor Vehicle Crashes Due to Weather-Related Vision Hazards Exceeds the Number of Fatalities Caused by Well-Known Hazards Such as Tornadoes, Floods, Tropical Cyclones, and Lightning. In Driving Blind: Weather-Related Vision Hazards and Fatal Motor Vehicle Crashes; American Meteorological Society: Boston, MA, USA, 2015. [Google Scholar]
  7. Endsley, M.R. Measurement of Situation Awareness in Dynamic Systems. Hum. Factors J. Hum. Factors Ergon. Soc. 1995, 37, 65–84. [Google Scholar] [CrossRef]
  8. Riegner, K.L.; Steelman, K.S. Driving in the Dust: Challenges and Lessons Learned from Field Testing in Degraded Visual Environments. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2020, 64, 1916–1920. [Google Scholar] [CrossRef]
  9. Owens, D.A.; Tyrrell, R.A. Effects of Luminance, Blur, and Age on Nighttime Visual Guidance: A Test of the Selective Degradation Hypothesis. J. Exp. Psychol. Appl. 1999, 5, 115–128. [Google Scholar] [CrossRef]
  10. Röckel, C.; Hecht, H. Regular Looks out the Window Do Not Maintain Situation Awareness in Highly Automated Driving. Transp. Res. Part F-Traffic Psychol. Behav. 2023, 98, 368–381. [Google Scholar] [CrossRef]
  11. Park, S.; Xing, Y.; Akash, K.; Misu, T.; Boyle, L.N. The Impact of Environmental Complexity on Drivers’ Situation Awareness. In Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI ’22, Seoul, Republic of Korea, 17–20 September 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 131–138. [Google Scholar] [CrossRef]
  12. Wang, D.; Kang, D. Disaster Weather Early Warning Method and Automobile AR-HUD System. CN201910769393.5A, 20 August 2019. [Google Scholar]
  13. Deng, N.; Ye, J.; Chen, N.; Yang, X. Towards Stereoscopic On-Vehicle AR-HUD. Vis. Comput. 2021, 37, 2527–2538. [Google Scholar] [CrossRef]
  14. Deng, T.; Sun, W.; Zhang, R.; Zhang, Y. Research on Interface Design of Full Windshield Head-Up Display Based on User Experience. In Advances in Usability, User Experience and Assistive Technology: Proceedings of the AHFE 2018 International Conferences on Usability & User Experience and Human Factors and Assistive Technology, Held on July 21–25, 2018, in Loews Sapphire Falls Resort at Universal Studios, Orlando, Florida, USA; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 166–173. [Google Scholar] [CrossRef]
  15. Liu, M.; Li, W.; Zhang, B.; Hao, Q.; Guo, X.; Liu, Y. Research on the Influence of Weather Conditions on Urban Night Light Environment. Sustain. Cities Soc. 2020, 54, 101980. [Google Scholar] [CrossRef]
  16. Faranda, D.; The ClimaMeter Team. ClimateMeter: Putting Extreme Weather Phenomena in Climate Perspective. In Proceedings of the EMS Annual Meeting, Barcelona, Spain, 1–6 September 2024. [Google Scholar] [CrossRef]
  17. Sui, L.; Chen, Z.; Ni, S. Study on the Distinguish Method to Ensure Countermeasures of Traffic Hidden Trouble Points in Extreme Weather. In International Conference on Chemical, Material and Food Engineering; Atlantis Press: Amsterdam, The Netherlands, 2015; pp. 608–610. [Google Scholar] [CrossRef]
  18. Tu, H.; Li, Z.; Li, H.; Zhang, K.; Sun, L. Driving Simulator Fidelity and Emergency Driving Behavior. Transp. Res. Rec. J. Transp. Res. Board 2015, 2518, 113–121. [Google Scholar] [CrossRef]
  19. Wei, F.; Cai, Z.; Liu, P.; Guo, Y.; Li, X.; Li, Q. Exploring Driver Injury Severity in Single-Vehicle Crashes under Foggy Weather and Clear Weather. J. Adv. Transport. 2021, 2021, 9939800. [Google Scholar] [CrossRef]
  20. Morden, J.N.; Caraffini, F.; Kypraios, I.; Al-Bayatti, A.H.; Smith, R. Driving in the Rain: A Survey toward Visibility Estimation through Windshields. Int. J. Intell. Syst. 2023, 2023, 9939174. [Google Scholar] [CrossRef]
  21. Kumar, B.V. Head-Up Display (HUD). Int. J. Res. Appl. Sci. Eng. Technol. 2017, V, 1222–1225. [Google Scholar] [CrossRef]
  22. Zhang, T.; Bai, X.; Liu, J. HUD Brightness Adaptive Adjustment Method. CN112233632A, 15 January 2021. [Google Scholar]
  23. Karsilar, H.; Mathot, S.; Van Rijn, H. Attention Modulates the Effects of Stimulus Brightness and Contrast on Time Perception. PsyArXiv 2024. [Google Scholar] [CrossRef]
  24. Udayakumar, N. Visible Light Imaging. In Imaging with Electromagnetic Spectrum; Springer: Berlin/Heidelberg, Germany, 2014; pp. 67–86. [Google Scholar] [CrossRef]
  25. Becerra, R.; Azurdia-Meza, C.; Palacios Játiva, P.; Soto, I.; Sandoval, J.; Ijaz, M.; Carrera, D. A Wavelength-Dependent Visible Light Communication Channel Model for Underground Environments and Its Performance Using a Color-Shift Keying Modulation Scheme. Electronics 2023, 12, 577. [Google Scholar] [CrossRef]
  26. Long, G.M.; Garvey, P.M. The Effects of Target Wavelength on Dynamic Visual Acuity under Photopic and Scotopic Viewing. Hum. Factors J. Hum. Factors Ergon. Soc. 1988, 30, 3–13. [Google Scholar] [CrossRef]
  27. Gabbard, J.L.; Smith, M.; Merenda, C.; Burnett, G.; Large, D.R. A Perceptual Color-Matching Method for Examining Color Blending in Augmented Reality Head-Up Display Graphics. IEEE Trans. Vis. Comput. Graph. 2022, 28, 2834–2851. [Google Scholar] [CrossRef]
  28. Liu, F.; Lu, Z.; Lin, X. Vision-Based Environmental Perception for Autonomous Driving. arXiv 2022, arXiv:2212.11453. [Google Scholar] [CrossRef]
  29. Wang, J.; Yang, J.; Fu, Q.; Zhang, J.; Zhang, J. A New Dynamic Spatial Information Design Framework for AR-HUD to Evoke Drivers’ Instinctive Responses and Improve Accident Prevention. Int. J. Hum. Comput. Stud. 2024, 183, 103194. [Google Scholar] [CrossRef]
  30. Zhu, Z.; Qu, X.; Jia, G. Wavelength Intervals Selection of Illumination for Separating Objects from Backgrounds in Color Vision Applications. J. Modern Opt. 2011, 58, 777–785. [Google Scholar] [CrossRef]
  31. Li, Y.; Wang, Y.; Song, F.; Liu, Y. Assessing Gender Perception Differences in Color Combinations in Digital Visual Interfaces Using Eye Tracking—The Case of HUD. Int. J. Hum.-Comput. Interact. 2024, 40, 6591–6607. [Google Scholar] [CrossRef]
  32. Cao, Y.; Li, L.; Yuan, J.; Jeon, M. Head-up Displays Improve Drivers’ Performance and Subjective Perceptions with the In-Vehicle Gesture Interaction System. Int. J. Hum.-Comput. Interact. 2024, 1–15. [Google Scholar] [CrossRef]
  33. Chowdhury, A.; Samal, S.; Rout, T.; Maharana, M.K. Effect of Irradiance during Foggy Days and Clear Climatic Conditions in Maximum Power Point in PV Characteristics for Photovoltaic Systems. In Proceedings of the 2018 4th International Conference on Electrical Energy Systems (ICEES), Chennai, India, 7–9 February 2018; pp. 67–71. [Google Scholar] [CrossRef]
  34. Atkins, W.R.G.; Jenkins, P.G. Reduction in Light Accompanying Exceptionally Heavy Rain. Nature 1953, 172, 79. [Google Scholar] [CrossRef]
  35. Dumont, E. Extended Photometric Model of Fog Effects on Road Vision. In Proceedings of the 16th Biennial Symposium on Visibility and Simulation, Iowa City, IA, USA, 2–4 June 2002. [Google Scholar]
  36. Wang, C. Real-Time Rendering of Daylight Sky Scene for Virtual Environment; Springer: Berlin/Heidelberg, Germany, 2007; pp. 294–303. [Google Scholar] [CrossRef]
  37. Park, W.; Park, K.; Jeong, J. Verification of the Applicability of Obstacle Recognition Distance as a Measure of Effectiveness of Road Lighting on Rainy and Foggy Roads. Appl. Sci. 2024, 14, 1595. [Google Scholar] [CrossRef]
  38. Candemir, C. A Practical Estimation of the Required Sample Size in fMRI Studies. Mugla J. Sci. Technol. 2023, 9, 56–63. [Google Scholar] [CrossRef]
  39. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Routledge: London, UK, 2013. [Google Scholar]
  40. Hou, G.; Dong, Q.; Wang, H. The Effect of Dynamic Effects and Color Transparency of AR-HUD Navigation Graphics on Driving Behavior Regarding Inattentional Blindness. Int. J. Hum.-Comput. Interact. 2024, 1–12. [Google Scholar] [CrossRef]
  41. Jacobson, B.A.; Leung, M.; Van de Ven, A.P. High Color-Saturation Lighting Devices with Enhanced Long Wavelength Illumination. U.S. Patent 9,530,944, 27 December 2016. [Google Scholar]
  42. Wu, K. A HUD-Based Platform for Environmental Sensing and Measurement: Design and Implementation. In Proceedings of the 2023 IEEE XVI International Scientific and Technical Conference Actual Problems of Electronic Instrument Engineering (APEIE), Novosibirsk, Russia, 10–12 November 2023; pp. 1010–1015. [Google Scholar] [CrossRef]
  43. Caloiero, T.; Coscarelli, R.; Ferrari, E.; Sirangelo, B. Temporal Analysis of Rainfall Categories in Southern Italy (Calabria Region). Environ. Process. 2017, 4 (Suppl. S1), 113–124. [Google Scholar] [CrossRef]
  44. Mashros, N.; Ben-Edigbe, J. Determining the Quality of Highway Service Caused by Rainfall. In Institution of Civil Engineers-Transport; Thomas Telford Ltd.: London, UK, 2014; Volume 167, pp. 334–342. [Google Scholar] [CrossRef]
  45. Li, Z.; Zhang, S.; Meng, F.; Zhang, L. Research on Image Subject Accessing Model Under Foggy Weather. In Electronics, Communications and Networks; IOS Press: Amsterdam, The Netherlands, 2024; pp. 598–607. [Google Scholar] [CrossRef]
  46. Yamada, D.; Mukada, N.; Myint, M.; Lwin, K.N.; Matsuno, T.; Minami, M. Docking Experiment in Dark Environments Using Active/Lighting Marker and HSV Correlation. In Proceedings of the 2018 OCEANS—MTS/IEEE Kobe Techno-Oceans (OTO), Kobe, Japan, 28–31 May 2018; pp. 1–9. [Google Scholar] [CrossRef]
  47. Perera, R.; Marin, R.; Ruiz, A. Static–Dynamic Multi-Scale Structural Damage Identification in a Multi-Objective Framework. J. Sound Vib. 2013, 332, 1484–1500. [Google Scholar] [CrossRef]
  48. Watanabe, H.; Yoo, H.; Tsimhoni, O.; Green, P. The Effect of Hud Warning Location on Driver Responses. In Proceedings of the Sixth World Congress on Intelligent Transport Systems, Toronto, ON, Canada, 8–12 November 1999. [Google Scholar]
  49. Menu, J.-P.R. Head-up/Head-down Transition: Measurement of Transition Times. Aviat. Space Environ. Med. 1986, 57, 218–222. [Google Scholar]
  50. Wang, B.; Ma, H.; Wang, H.; Su, X.; Wu, J. HUD Image Test Equipment. CN206132356U, 26 April 2017. [Google Scholar]
  51. Unreal Engine. 2018. Available online: https://www.unrealengine.com/en-US/what-is-unreal-engine-4 (accessed on 26 January 2025).
  52. Qiu, Y.; Lu, Y.; Wang, Y.; Yang, C. Visual Perception Challenges in Adverse Weather for Autonomous Vehicles: A Review of Rain and Fog Impacts. In Proceedings of the 2024 IEEE 7th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), Chongqing, China, 20–22 September 2024; pp. 1342–1348. [Google Scholar] [CrossRef]
  53. Kenk, M.A.; Elsaidy, M.; Hassaballah, M.; Mansour, M.B. Driving Perception in Challenging Road Scenarios: An Empirical Study. In Proceedings of the 2023 20th ACS/IEEE International Conference on Computer Systems and Applications (AICCSA), Giza, Egypt, 4–7 December 2023; pp. 1–6. [Google Scholar] [CrossRef]
  54. He, S.; Yan, X.D.; Pang, H.T.; Zhao, J. Effects of Fog Conditions on Driving Behaviors—Crash Avoidance Driving Behaviors. J. Transp. Inf. Saf. 2014, 5, 126–129. [Google Scholar]
  55. Dong, B.; Chen, A.; Zhang, Y.; Zhang, Y.; Zhang, M.; Zhang, T. The Foggy Effect of Egocentric Distance in a Nonverbal Paradigm. Sci. Rep. 2021, 11, 14398. [Google Scholar] [CrossRef]
  56. Zhou, B.; Feng, Z.; Liu, J.; Huang, Z.; Gao, Y. A Method to Enhance Drivers’ Hazard Perception at Night Based on “Knowledge-Attitude-Practice” Theory. Accid. Anal. Prev. 2024, 200, 107565. [Google Scholar] [CrossRef] [PubMed]
  57. Prasolenko, O. Impact of Road Traffic on Driver Reaction Time. Munic. Econ. Cities 2020, 6, 169–172. [Google Scholar] [CrossRef]
  58. Pauzié, A. A Method to Assess the Driver Mental Workload: The Driving Activity Load Index (DALI). IET Intel. Transp. Syst. 2008, 2, 315–322. [Google Scholar] [CrossRef]
  59. Serrell, B. Paying Attention: The Duration and Allocation of Visitors’ Time in Museum Exhibitions. Curator Mus. J. 1997, 40, 108–125. [Google Scholar] [CrossRef]
  60. Pew, R.W. The State of Situation Awareness Measurement: Heading toward the next Century. In Situational Awareness; Routledge: London, UK, 2017; pp. 459–474. [Google Scholar]
  61. Selcon, S.J.; Taylor, R.M. Evaluation of the Situational Awareness Rating Technique (SART) as a Tool for Aircrew Systems Design. In Situational Awareness in Aerospace Operations (AGARD-CP-478); NATO-AGARD: Neuilly-Sur-Seine, France, 1990. [Google Scholar]
  62. Vidulich, M.A.; Tsang, P.S. The Confluence of Situation Awareness and Mental Workload for Adaptable Human–Machine Systems. J. Cogn. Eng. Decis. Mak. 2015, 9, 95–97. [Google Scholar] [CrossRef]
Figure 1. Dynamic effects of two long-wavelength colors of AR-HUD navigation graphics.
Figure 2. Foggy weather image illuminance collection.
Figure 3. Rainy weather image illuminance collection.
Figure 4. Effects of different scene illumination levels in AR-HUD navigation graphic simulated usage scenarios.
Figure 5. The procedure of the experiment.
Figure 6. The procedure of the eye-tracking experiment.
Figure 7. Comparison of SART evaluations. *** p ≤ 0.001.
Figure 8. Comparison of DALI score of cognitive load. ns p > 0.05, *** p ≤ 0.001.
Figure 9. Comparison of illumination effects on AR-HUD colors in average fixation duration (milliseconds), fixation count (number of times), and average visit duration (milliseconds). ns p > 0.05, * p < 0.05, ** p < 0.01, *** p ≤ 0.001.
Table 1. Effects of different illuminance levels in AR-HUD navigation graphic usage scenarios.
Rainy and Foggy Weather Scene Simulation
| Rainy day | Foggy day |
| --- | --- |
[Simulated rainy-day and foggy-day scene images at the different illuminance levels; images not reproduced here.]
Table 2. Eye movement evaluation metrics under clear weather conditions.
| Category | Poor (High Cognitive Load) | Good | Poor (Insufficient Attractiveness) |
| --- | --- | --- | --- |
| Average fixation duration | >600 ms | 400–600 ms | <400 ms |
| Average visit duration | >2000 ms | 500–2000 ms | <500 ms |
| Fixation count | >7 times | 5–6 times | 1–3 times |
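The bands in Table 2 can be applied directly when screening new eye-tracking records. The following minimal sketch is not part of the study materials; the function names and the example values at the end are purely illustrative.

```python
# Illustrative sketch only: classify eye-movement metrics against the
# Table 2 bands. Function names and example values are hypothetical.

def rate_fixation_duration(ms: float) -> str:
    """Average fixation duration: >600 ms poor (high load), 400-600 ms good, <400 ms poor."""
    if ms > 600:
        return "poor (high cognitive load)"
    if ms >= 400:
        return "good"
    return "poor (insufficient attractiveness)"

def rate_visit_duration(ms: float) -> str:
    """Average visit duration: >2000 ms poor (high load), 500-2000 ms good, <500 ms poor."""
    if ms > 2000:
        return "poor (high cognitive load)"
    if ms >= 500:
        return "good"
    return "poor (insufficient attractiveness)"

def rate_fixation_count(n: int) -> str:
    """Fixation count: >7 poor (high load), 5-6 good, 1-3 poor (insufficient attractiveness)."""
    if n > 7:
        return "poor (high cognitive load)"
    if 5 <= n <= 6:
        return "good"
    if 1 <= n <= 3:
        return "poor (insufficient attractiveness)"
    return "unclassified (falls between the Table 2 bands)"

# Example with illustrative values (not study data).
print(rate_fixation_duration(520))  # good
print(rate_visit_duration(1800))    # good
print(rate_fixation_count(9))       # poor (high cognitive load)
```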
Table 3. The impact of long-wavelength colors on drivers’ situational awareness in low-illumination rainy and foggy weather.
The Effects of Color and Illumination on SA (Mean ± SD)
| Subjective indicator | YF | YR | RF | RR | F | p |
| --- | --- | --- | --- | --- | --- | --- |
| SART | 58.77 ± 5.95 | 52.65 ± 6.08 | 36.62 ± 6.17 | 31.92 ± 5.73 | 118.413 | 0.000 *** |
*** p ≤ 0.001.
Table 4. Two-way repeated measures ANOVA results for the effects of color and illumination on SART.
| Independent Variable | df | Dependent Variable | F | p | ηp² |
| --- | --- | --- | --- | --- | --- |
| Color | 1 | SART | 333.680 | 0.000 *** | 0.769 |
| Illumination | 1 | SART | 21.193 | 0.000 *** | 0.175 |
| Color × Illumination | 1 | SART | 0.367 | 0.546 | 0.004 |
R² = 0.780, *** p ≤ 0.001.
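For readers who wish to reproduce an analysis of this form, the sketch below shows how a two-way repeated-measures ANOVA on SART scores could be run with statsmodels. It is not the authors' analysis script; the column names and the simulated long-format data are assumptions for illustration only.

```python
# Illustrative sketch only (not the authors' code): a two-way repeated-measures
# ANOVA of SART scores with within-subject factors color and illumination,
# analogous in structure to Table 4. Column names and data are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
condition_means = {("yellow", "fog"): 58.77, ("yellow", "rain"): 52.65,
                   ("red", "fog"): 36.62, ("red", "rain"): 31.92}

# Long format: one row per participant x color x illumination cell.
rows = [
    {"participant": p, "color": c, "illumination": i,
     "SART": rng.normal(mean, 6.0)}
    for p in range(1, 27)                      # 26 simulated participants
    for (c, i), mean in condition_means.items()
]
df = pd.DataFrame(rows)

result = AnovaRM(df, depvar="SART", subject="participant",
                 within=["color", "illumination"]).fit()
print(result)  # F, degrees of freedom, and p for each factor and the interaction
```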
Table 5. Post hoc multiple comparisons of SART scores under different AR-HUD display conditions.
| (I) Condition | (J) Condition | (I) Mean | (J) Mean | Mean Difference (I–J) | p |
| --- | --- | --- | --- | --- | --- |
| RF | RR | 36.62 | 31.92 | 4.692 | 0.006 ** |
| RF | YF | 36.62 | 58.77 | −22.154 | 0.000 *** |
| RF | YR | 36.62 | 52.65 | −16.038 | 0.000 *** |
| RR | YF | 31.92 | 58.77 | −26.846 | 0.000 *** |
| RR | YR | 31.92 | 52.65 | −20.731 | 0.000 *** |
| YF | YR | 58.77 | 52.65 | 6.115 | 0.000 *** |
** p < 0.01, *** p ≤ 0.001.
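A comparable post hoc procedure can be sketched with pairwise paired t-tests and a Bonferroni correction, as below. This is only an illustration: the scores are simulated from the Table 3 means and SDs, not the study data, and the correction method actually used by the authors is not restated here.

```python
# Illustrative sketch only: Bonferroni-corrected pairwise paired t-tests over
# the four display conditions, analogous in structure to Table 5.
from itertools import combinations

import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.multitest import multipletests

# Simulated SART scores for 26 participants, roughly matching the Table 3
# means/SDs; NOT the study data.
rng = np.random.default_rng(0)
means_sds = {"YF": (58.77, 5.95), "YR": (52.65, 6.08),
             "RF": (36.62, 6.17), "RR": (31.92, 5.73)}
scores = {cond: rng.normal(m, sd, size=26) for cond, (m, sd) in means_sds.items()}

pairs = list(combinations(scores, 2))
raw_p = [ttest_rel(scores[a], scores[b]).pvalue for a, b in pairs]
reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")

for (a, b), p, sig in zip(pairs, adj_p, reject):
    print(f"{a} vs {b}: adjusted p = {p:.4f}{' *' if sig else ''}")
```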
Table 6. The impact of long-wavelength colors on DALI in low-illumination rainy and foggy weather.
The Effects of Color and Illumination on DALI (Mean ± SD)
| Subjective indicator | RR | RF | YR | YF | F | p |
| --- | --- | --- | --- | --- | --- | --- |
| DALI | 5.37 ± 0.18 | 5.25 ± 0.20 | 3.57 ± 0.23 | 2.11 ± 0.24 | 118.413 | 0.000 *** |
*** p ≤ 0.001.
Table 7. Results of two-way repeated measures ANOVA for the effects of color and illumination on DALI.
| Independent Variable | df | Dependent Variable | F | p | ηp² |
| --- | --- | --- | --- | --- | --- |
| Color | 1 | DALI | 3480.953 | 0.000 *** | 0.972 |
| Illumination | 1 | DALI | 352.646 | 0.000 *** | 0.779 |
| Color × Illumination | 1 | DALI | 256.769 | 0.000 *** | 0.720 |
R² = 0.976, *** p ≤ 0.001.
Table 8. Simple effects analysis of DALI scores under different color and illumination conditions.
| Color | Illumination | Mean Difference | SE | t | p | Cohen's d |
| --- | --- | --- | --- | --- | --- | --- |
| R | F – R | −0.115 | 0.059 | −1.948 | 0.054 | −0.540 |
| Y | F – R | −1.458 | 0.059 | −24.609 | 0.000 *** | −6.825 |
| R – Y | F | 3.142 | 0.059 | 53.050 | 0.000 *** | 14.713 |
| R – Y | R | 1.800 | 0.059 | 30.388 | 0.000 *** | 8.428 |
*** p ≤ 0.001.
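The effect sizes in Tables 8 and 13 are reported as Cohen's d. One common formulation for a within-subject contrast divides the mean of the per-participant difference by the standard deviation of that difference; the sketch below illustrates only this definition with simulated values. The exact formula used for Table 8 is not restated in this back matter, so the output is not expected to reproduce the reported values.

```python
# Illustrative sketch only: Cohen's d for a paired (within-subject) contrast,
# defined here as mean(a - b) / sd(a - b). Data are simulated, not study data.
import numpy as np

def paired_cohens_d(a, b) -> float:
    """Cohen's d for paired samples: mean difference divided by the SD of the differences."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return diff.mean() / diff.std(ddof=1)

rng = np.random.default_rng(1)
dali_rf = rng.normal(5.25, 0.20, 26)  # red icons in fog (simulated ratings)
dali_rr = rng.normal(5.37, 0.18, 26)  # red icons in rain (simulated ratings)
print(round(paired_cohens_d(dali_rf, dali_rr), 3))
```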
Table 9. Eye-tracking data of drivers using long-wavelength colors in low-illumination conditions during rain and fog.
The Effects of Color and Illumination on Eye-Tracking Data (Mean ± SD)
| Objective indicator | RF | RR | YF | YR | F | p |
| --- | --- | --- | --- | --- | --- | --- |
| Average fixation duration (ms) | 569.73 ± 79.61 | 734.23 ± 169.63 | 457.46 ± 72.89 | 599.04 ± 176.99 | 18.776 | 0.000 *** |
| Fixation count (times) | 10.15 ± 1.12 | 13.19 ± 0.98 | 7.04 ± 0.77 | 9.08 ± 0.74 | 203.364 | 0.000 *** |
| Average visit duration (ms) | 2128.42 ± 503.47 | 2601.19 ± 435.20 | 1307.88 ± 323.64 | 1673.12 ± 733.51 | 30.107 | 0.000 *** |
*** p ≤ 0.001.
Table 10. Results of two-way ANOVA for the effects of color and illumination on average fixation duration.
| Independent Variable | Dependent Variable | F | p | ηp² |
| --- | --- | --- | --- | --- |
| Color | Average fixation duration | 73.226 | 0.000 *** | 0.423 |
| Illumination | Average fixation duration | 16.818 | 0.000 *** | 0.144 |
| Color × Illumination | Average fixation duration | 0.277 | 0.664 | 0.003 |
*** p ≤ 0.001.
Table 11. Post hoc multiple comparisons of fixation duration under different conditions.
| (I) Condition | (J) Condition | (I) Mean | (J) Mean | Mean Difference (I–J) | p |
| --- | --- | --- | --- | --- | --- |
| RF | RR | 569.73 | 734.23 | −164.500 | 0.000 *** |
| RF | YF | 569.73 | 457.46 | 112.269 | 0.003 ** |
| RF | YR | 569.73 | 599.04 | −29.308 | 0.432 |
| RR | YF | 734.23 | 457.46 | 276.769 | 0.000 *** |
| RR | YR | 734.23 | 599.04 | 135.192 | 0.000 *** |
| YF | YR | 457.46 | 599.04 | −141.577 | 0.000 *** |
** p < 0.01, *** p ≤ 0.001.
Table 12. Results of two-way ANOVA for the effects of color and illumination on fixation count.
| Independent Variable | Dependent Variable | F | p | ηp² |
| --- | --- | --- | --- | --- |
| Color | Fixation count | 403.470 | 0.000 *** | 0.801 |
| Illumination | Fixation count | 198.904 | 0.000 *** | 0.665 |
| Color × Illumination | Fixation count | 7.717 | 0.007 ** | 0.072 |
** p < 0.01, *** p ≤ 0.001.
Table 13. Simple effects analysis of fixation count under different color and illumination conditions.
| Color | Illumination | Mean Difference | SE | t | p | Cohen's d |
| --- | --- | --- | --- | --- | --- | --- |
| R | F – R | −3.038 | 0.255 | −11.937 | 0.000 *** | −3.311 |
| Y | F – R | −2.038 | 0.255 | −8.008 | 0.000 *** | −2.221 |
| R – Y | F | 3.115 | 0.255 | 12.239 | 0.000 *** | 3.395 |
| R – Y | R | 4.115 | 0.255 | 16.168 | 0.000 *** | 4.484 |
*** p ≤ 0.001.
Table 14. Results of two-way ANOVA for the effects of color and illumination on average visit duration.
| Independent Variable | Dependent Variable | F | p | ηp² |
| --- | --- | --- | --- | --- |
| Color | Average visit duration | 73.226 | <0.001 *** | 0.423 |
| Illumination | Average visit duration | 16.818 | <0.001 *** | 0.144 |
| Color × Illumination | Average visit duration | 0.277 | 0.600 | 0.003 |
*** p ≤ 0.001.
Table 15. Post hoc multiple comparisons of visit duration under different conditions.
| (I) Condition | (J) Condition | (I) Mean | (J) Mean | Mean Difference (I–J) | p |
| --- | --- | --- | --- | --- | --- |
| RF | RR | 2128.423 | 2601.192 | −472.769 | 0.001 *** |
| RF | YF | 2128.423 | 1307.885 | 820.538 | 0.000 *** |
| RF | YR | 2128.423 | 1673.115 | 455.308 | 0.002 ** |
| RR | YF | 2601.192 | 1307.885 | 1293.308 | 0.000 *** |
| RR | YR | 2601.192 | 1673.115 | 928.077 | 0.000 *** |
| YF | YR | 1307.885 | 1673.115 | −365.231 | 0.013 * |
* p < 0.05, ** p < 0.01, *** p ≤ 0.001.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

