1. Introduction
Head-Up Display (HUD) is a transparent display technology that projects critical information (e.g., speed, navigation) directly into the driver’s line of sight, minimizing gaze deviation and thereby enhancing situational awareness and driving safety [1]. By presenting essential information such as navigation, vehicle status, and environmental data, HUD significantly improves situational awareness [2]. Augmented Reality HUD (AR-HUD) integrates virtual elements (e.g., navigation arrows) with the real-world road scene via projection, enabling intuitive driver guidance [3]. AR-HUD also supports navigation by mitigating the effects of a degraded external environment; such systems improve driver attention, facilitate the recovery of cognitive resources, and ultimately enhance driving performance through visual interaction [4]. Research has shown that directly overlaying AR-HUD navigation and environmental information onto the driver’s line of sight can enhance both cognitive accessibility and visual clarity [5]. However, drivers frequently encounter varying road conditions, and extreme weather (e.g., fog or rain) degrades visual performance, increasing driving risk and posing threats to both drivers and pedestrians [6].
Situational awareness (SA) is a multifaceted concept encompassing the perception, comprehension, and projection of elements within the environment [7]; in the driving context, it is the ability to perceive, understand, and predict relevant factors in complex or dynamic environments in order to support timely decision making. When drivers are exposed to extreme weather such as fog or rain, the degradation of the visual environment significantly threatens the operational safety and efficiency of ground vehicles [8]. Owens and Tyrrell explained the degradation of visual functions under varying illumination conditions with the Selective Degradation Hypothesis [9]. Visual complexity, driving scenarios, road types, scene illumination, and informational elements all influence situational awareness and contribute to driver accidents [10,11]. AR-HUD systems can process real-time video images to identify weather conditions and provide early warnings for hazardous weather, thereby reducing the risk of traffic accidents [12]. Such systems are capable of detecting obstacles and displaying critical driving information, achieving a recognition rate of approximately 73% under adverse weather conditions [3]. Park proposed an AR-based HUD system specifically designed for robust obstacle recognition in adverse weather; it consists of four components: a ground obstacle detection module, an object decision module, an object recognition module, and a display module [3]. Deng proposed an AR-HUD system that presents drivers with a stereoscopic scene: two conventional HUD displays support parallax through additive light field decomposition, with the optical paths and illumination of the two displays precisely calibrated for the two views to enhance situational awareness in low-illumination environments [13]. Deng also suggested that lane enhancement through AR-HUD could improve lateral control performance under foggy conditions [14].
Although researchers have explored methods to enhance situational awareness in rainy and foggy weather and thereby reduce unpredictable hazardous events, they have overlooked how the design of AR-HUD information affects drivers under different extreme weather conditions. The effectiveness of an AR-HUD interface varies with the perceived color of its graphical elements, which can either support or distract from situational awareness. This study therefore focuses on the visual design of AR-HUD navigation graphics, specifically exploring how their color influences situational awareness when driving in rain and fog.
Unlike clear weather, foggy or rainy conditions exhibit significantly lower brightness than sunny days because water droplets and fog particles in the atmosphere scatter and absorb light [15]. Extreme weather phenomena are difficult to observe clearly because of their complex and variable nature; they are influenced by atmospheric conditions, geographical features, and climate change, which reduce color perception and clarity and make driving in extreme weather particularly challenging [16]. In contrast, clear weather typically improves visibility, stabilizes road conditions, and allows for safer driving speeds, thereby reducing the likelihood of accidents [17]. Tu’s study shows that emergency driving under reduced visibility leads to altered reactions, and that the fidelity of driving simulators also affects the estimation of these behaviors [18]. Wet road surfaces and signal control significantly affect the severity of collisions in foggy conditions but have no effect on collision severity in clear weather [19]. Driving in heavy rain severely restricts a driver’s visibility, degrading performance on wet roads and creating safety concerns [20]. Most studies of HUD under extreme weather have focused on improving driving safety by integrating advanced warning systems, enhancing visibility, and ensuring adaptive brightness control. Such systems use video processing and weather forecasting models to alert drivers to hazardous weather, reducing the risk of accidents in extreme weather [12]. Another approach focuses on enhancing the clarity of HUD images and their adaptability to environmental conditions: micro-projectors and Digital Light Processing (DLP) technology enable HUDs to provide high brightness and resolution, ensuring that critical information remains visible even in low-visibility conditions such as night driving or storms [12]. Kumar notes that adaptive brightness adjustment systems modify the visibility of the HUD based on external illumination, ensuring safe operation during weather changes such as heavy rain or fog [21]. Additionally, user-centered design approaches have explored how AR-HUD can enhance driver awareness by displaying key information, such as road conditions, speed, and weather hazards, in real time [22]. However, while these studies aim to enhance situational awareness during extreme weather and thus improve driving safety, they overlook the impact of navigation graphics on driver situational awareness under such conditions.
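As an illustration of the adaptive brightness concept mentioned above, the following minimal Python sketch maps an ambient illuminance reading to a HUD luminance target by linear interpolation between a night-time floor and a daylight ceiling. The break points, luminance range, and example sensor values are hypothetical and would need to be tuned to a specific HUD unit; this is a sketch of the idea, not any particular system’s implementation.

```python
def hud_luminance(ambient_lux: float,
                  lux_low: float = 10.0,      # hypothetical night-time ambient level (lux)
                  lux_high: float = 10000.0,  # hypothetical bright-daylight level (lux)
                  nits_min: float = 150.0,    # hypothetical dimmest HUD luminance (cd/m^2)
                  nits_max: float = 12000.0   # hypothetical brightest HUD luminance (cd/m^2)
                  ) -> float:
    """Map ambient illuminance (lux) to a HUD luminance target (cd/m^2).

    Clamps to the configured range and interpolates linearly in between,
    so the HUD dims at night and brightens in daylight, rain, or fog haze.
    """
    if ambient_lux <= lux_low:
        return nits_min
    if ambient_lux >= lux_high:
        return nits_max
    t = (ambient_lux - lux_low) / (lux_high - lux_low)
    return nits_min + t * (nits_max - nits_min)


# Example: overcast heavy rain (~1000 lux) vs. dense fog at dusk (~50 lux)
print(hud_luminance(1000.0))  # brighter HUD to stay legible through rain streaks
print(hud_luminance(50.0))    # dimmer HUD to avoid glare in low-illumination fog
```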
Studies have shown that color information can shorten reaction times at lower brightness contrast, with reaction times influenced by hue and color contrast, while other studies suggest that reaction time is also related to wavelength [23]. In imaging, the visible spectrum, typically 380 to 750 nm, covers light detectable by the human eye; long-wavelength colors include red, orange, and yellow [24]. The wavelength of light strongly affects visual perception in the driving environment, since different wavelengths influence visibility, color recognition, and glare: shorter wavelengths (blue light) can enhance contrast and visibility under low-illumination conditions, while longer wavelengths (red light) can reduce glare and improve comfort during night driving [25]. Long’s study indicates that wavelength significantly affects dynamic visual acuity (DVA) in driving environments; in particular, long-wavelength light is more suitable for use under low-illumination conditions [26]. Gabbard et al. proposed a perceptual color matching method for examining color blending in AR-HUD graphics and found that, under clear daylight conditions, blue and green are more robust than red [27]. Previous research has primarily examined the effects of color on driving behavior, attention, and subjective experience, but has largely overlooked the advantage of long-wavelength colors in enhancing visual prominence under extreme weather conditions. Kumar suggested that long-wavelength colors can stand out against the soft backgrounds produced by fog and thus enhance the driver’s situational awareness [21]. Shorter wavelengths (blue light) are scattered more strongly in the atmosphere, which reduces clarity and contrast, especially in fog or rain, whereas longer wavelengths (red light) penetrate such conditions better [28]. The color used in augmented reality applications or device interfaces has a tangible impact on usability [29], which highlights the importance of selecting appropriate spectral intervals for lighting to enhance the distinction between colored surfaces and backgrounds [30]. This study therefore compares the impact of long-wavelength colors in AR-HUD navigation graphics on drivers’ situational awareness in rain and fog, exploring which colors in the long-wavelength range best help drivers obtain navigation information efficiently while maintaining attention to the road environment, thereby reducing accidents and violations [31].
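To make the wavelength argument above concrete, the short sketch below compares the relative scattering of representative blue, yellow, and red wavelengths using the Rayleigh approximation, in which scattered intensity is proportional to the inverse fourth power of wavelength. This approximation holds for particles much smaller than the wavelength (haze, fine aerosols); large fog and rain droplets scatter more uniformly (Mie regime), so the ratios are only indicative of why long-wavelength graphics tend to retain contrast in scattering media. The wavelength values chosen for each color are illustrative assumptions.

```python
# Relative Rayleigh scattering (proportional to wavelength**-4) for display colors.
# Valid for particles much smaller than the wavelength; large fog/rain droplets
# scatter more uniformly (Mie regime), so treat these ratios as indicative only.
wavelengths_nm = {"blue": 450, "yellow": 580, "red": 650}

reference = wavelengths_nm["red"]
for color, wl in wavelengths_nm.items():
    relative = (reference / wl) ** 4  # scattering strength relative to red light
    print(f"{color:6s} ({wl} nm): scattered ~{relative:.2f}x as strongly as red")

# Typical output: blue is scattered roughly 4x more strongly than red and yellow
# roughly 1.6x, one reason long-wavelength navigation graphics are expected to
# remain more visible through atmospheric scattering.
```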
Although HUD color plays a crucial role in driver perception, other factors such as environmental illumination and the complexity of the HUD must also be considered, since they can interact with color choices and affect the overall effectiveness of the HUD [32]. On clear days, solar irradiance reaches its maximum, providing optimal illumination; on foggy and rainy days, water droplets in the atmosphere reduce solar irradiance [33]. During heavy rain, illumination levels drop markedly: a study comparing illumination under different weather conditions shows a significant decrease during particularly heavy rainfall [34]. Fog scatters light, causing attenuation, halos, and masking effects that reduce visibility and illumination levels [35]. While heavy rain and fog both strongly affect the driving scene, they do so through different mechanisms, and their illumination characteristics differ because of variations in scattering mechanisms, atmospheric conditions, and particle sizes [36]. The average brightness levels during heavy rain and fog are influenced by various factors, including light scattering, color temperature, and road illumination. Research indicates that the recognition distance of obstacles varies greatly between rainy and foggy conditions, with visibility differences reaching up to 72.86% under the same illumination conditions [37]. Therefore, in view of these differences between heavy rain and fog, this study analyzes the two weather types solely from the perspective of scene brightness: long-wavelength colors are presented in scenarios with two different illumination levels (heavy rain and fog) to measure their impact on the driver’s situational awareness, thereby helping the driver perceive information from both the interface and the surrounding environment more effectively.
In summary, research on AR-HUD has focused primarily on how advanced warning systems, enhanced image clarity, graphic brightness, and functional graphics (such as road conditions, speed, and weather) influence driver behavior, attention, subjective acceptance, and satisfaction. It overlooks whether the design of AR-HUD information influences the driver’s situational awareness during navigation in extreme weather with varying illumination levels. The open question is how the color and brightness of AR-HUD navigation graphics should be designed for the different illumination levels of heavy rain and fog, so that drivers can rapidly and effectively perceive information while maintaining the situational awareness needed for safe driving. This study therefore investigates the effects of long-wavelength colors in AR-HUD navigation graphics, under the different average brightness levels of rainy and foggy weather, on drivers’ situational awareness and user experience, using situational awareness measures, eye-tracking metrics, and subjective evaluations. In pursuit of these objectives, this study addresses the following research questions (RQs):
RQ1: How do different colors of AR-HUD designs influence perception during driving in extreme weather conditions?
RQ2: Does the color of the HUD affect drivers’ situational awareness depending on the scene illumination?
For RQ1, we reviewed colors suitable for AR-HUD and analyzed those that enhance situational awareness. The experiment combined the SART questionnaire with eye tracking to obtain both subjective and objective quantitative results. To answer RQ2, we collected 50 photos of rain and fog under various conditions from the internet and calculated their average illumination. We then recreated two scenes with different illumination levels using Unity (Version 2023.2.20f1, Unity Technologies, San Francisco, CA, USA). Scenarios in which situational awareness tends to be weak were included in each scene, and the interaction between colors and illumination levels was examined through the experimental setup.
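As a minimal sketch of how the average illumination of the collected weather photos could be estimated, the Python snippet below computes the mean relative luminance (BT.709 weights, with an approximate gamma linearization) of each image and averages it per weather category. The folder paths are hypothetical, and mean pixel luminance is only a proxy for scene brightness rather than a calibrated illuminance measurement.

```python
from pathlib import Path

import numpy as np
from PIL import Image


def mean_relative_luminance(image_path: Path) -> float:
    """Approximate mean relative luminance of an image, in [0, 1]."""
    srgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float64) / 255.0
    linear = srgb ** 2.2  # approximate sRGB-to-linear conversion
    luminance = 0.2126 * linear[..., 0] + 0.7152 * linear[..., 1] + 0.0722 * linear[..., 2]
    return float(luminance.mean())


def average_illumination(folder: Path) -> float:
    """Average the per-image mean luminance over all photos in a folder."""
    values = [mean_relative_luminance(p)
              for p in sorted(folder.glob("*"))
              if p.suffix.lower() in {".jpg", ".jpeg", ".png"}]
    return float(np.mean(values)) if values else float("nan")


# Hypothetical folders holding the collected rain and fog reference photos.
rain_level = average_illumination(Path("photos/rain"))
fog_level = average_illumination(Path("photos/fog"))
print(f"rain: {rain_level:.3f}  fog: {fog_level:.3f}")
```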
5. Conclusions and Limitations
This study systematically explored how different combinations of AR-HUD color and illumination affect driving performance, using simulated driving experiments and eye-tracking technology. The results revealed that in rainy and foggy road scenarios, the yellow AR navigation design in the fog condition significantly improved drivers’ performance, particularly in terms of situational awareness and cognitive load while driving. Yellow enhanced situational awareness in rainy and foggy weather, with yellow navigation graphics showing clear advantages over red ones. The comfort provided by yellow effectively guided drivers’ information processing, enabling them to perceive the surrounding environment more quickly and accurately and to respond appropriately.
In terms of cognitive load, both long-wavelength colors, red and yellow, effectively guided drivers’ situational awareness in low-illumination environments. The yellow HUD also showed lower workload levels across multiple dimensions, including attention demand, visual demand, and temporal demand. In the eye-tracking data, the yellow HUD produced shorter average fixation durations, fewer fixations, and shorter access times than the red HUD, indicating that yellow imposes a lower cognitive load and supports information processing better under extreme weather conditions. These findings are consistent with previous studies and further confirm the potential of yellow HUD navigation graphics for improving driving safety.
Regarding illumination in rainy and foggy conditions, processing the yellow and red AR-HUD graphics required more time in rainy weather, but this was not necessarily due to illumination alone; it may instead be related to the continuous interference of raindrops on the windshield and the uniform scattering of light in the rainy environment. While fog reduced overall visibility, its illumination conditions were relatively stable, allowing drivers to process navigation information more efficiently. In addition, the low-illumination conditions in foggy environments made yellow navigation graphics more visually prominent, further enhancing drivers’ situational awareness. These findings suggest that AR-HUD designs should be optimized for the illumination characteristics of different weather conditions: in rain, the brightness and contrast of graphics should be increased to counter the interference of raindrops on visual information, while in fog, adjustments to graphic color and display position can further improve the recognizability of information.
Although this study provides valuable insights, there are several limitations that require further exploration and improvement in future research. First, while the complexity of the simulated road environment was high, it could not fully replicate all variables in real-world driving. Due to equipment limitations, this study used simulated videos as experimental materials to standardize and ensure the consistency of experimental conditions. This simulation may have had some impact on the results, as it could not completely reproduce the complexity and unpredictability of real-world driving, and participants’ situational awareness ratings might have been slightly lower than in actual driving scenarios. Second, although the sample size was sufficient for statistical analysis, it was relatively small and limited to a specific age group, which may affect the generalizability of the findings. Future research should consider larger and more diverse samples, as well as real-world test scenarios, to validate these results. Furthermore, this study focused only on two colors, and exploring a broader range of colors and their combinations could provide a deeper understanding of optimal AR-HUD design. Due to weather limitations, the study could not measure real-world rain and fog illumination levels within a short time frame. Although different types of rain and fog illumination photos were collected online, they may differ slightly from actual illumination levels.
In future work, we plan to expand the sample size and adopt an interactive driving simulation to better replicate real-world driving scenarios. Our research team is currently in the process of acquiring a Logitech steering wheel set to improve the overall quality of the experiment. To enhance the interactivity of the experimental materials, Unity will be used for scenario modeling in subsequent studies. Future research will continue to explore the application of HUD under various conditions, with efforts made to collect data from real-world environments whenever possible. Where conditions permit, glasses-type eye trackers will be employed to observe situational awareness in real-time during actual driving.
Overall, this study provides valuable insights into AR-HUD navigation design and points the way for future research. As technology continues to advance and in-vehicle interactions become more widespread, optimizing AR interfaces to ensure their application potential in more complex driving scenarios will become an important research topic. Future work should focus on developing safer, more intuitive, and adaptable AR-HUD navigation systems to enhance road safety, boost drivers’ confidence, and improve their quality of life.