Article

AEA-RDCP: An Optimized Real-Time Algorithm for Sea Fog Intensity and Visibility Estimation

1 Department of Information and Communication Engineering, Hoseo University, 20, Hoseo-ro79beon-gil, Baebang-eup, Asan-si 31499, Republic of Korea
2 Korea Electronics Technology Institute (KETI), 25, Saenari-ro, Bundang-gu, Seongnam-si 13509, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(17), 8033; https://doi.org/10.3390/app14178033
Submission received: 14 August 2024 / Revised: 30 August 2024 / Accepted: 5 September 2024 / Published: 8 September 2024
(This article belongs to the Section Marine Science and Engineering)

Abstract

Sea fog reduces visibility to less than 1 km and is a major cause of maritime accidents, particularly affecting the navigation of small fishing vessels; because it forms when warm, moist air moves over cold water, it is difficult to predict. Traditional visibility measurement tools are costly and limited in their real-time monitoring capabilities, which has led to the development of video-based algorithms using cameras. This study introduces the Approximating and Eliminating the Airlight–Reduced DCP (AEA-RDCP) algorithm, designed to address the issue in existing video-based sea fog intensity measurement algorithms where sunlight reflections are mistakenly recognized as fog, thereby improving performance. The dataset used in the experiments is divided into two types: images unaffected by sunlight and maritime images heavily influenced by sunlight. The AEA-RDCP algorithm enhances the previously researched RDCP algorithm by effectively eliminating the influence of atmospheric light, utilizing the initial stages of the Dark Channel Prior (DCP) process to generate the Dark Channel image. While the DCP algorithm is typically used for dehazing, this study employs it only up to the generation of the Dark Channel, reducing computational complexity. The generated image is then used to estimate visibility against a threshold for fog density estimation. This maintains accuracy while reducing computational demands, allowing the real-time monitoring of sea conditions, enhancing maritime safety, and helping to prevent accidents.

1. Introduction

Maritime accidents can occur due to various causes, but those caused by fog are particularly frequent. Low visibility conditions, where visibility decreases to less than 1 km, are one of the main causes of maritime accidents [1,2]. Sea fog is a common type of mobile fog that occurs when warm, moist air moves over a colder surface or body of water. This phenomenon changes rapidly and occurs locally, appearing and disappearing quickly, which makes it difficult to accurately assess conditions in specific areas. These characteristics emphasize the importance of real-time monitoring to ensure maritime safety [3,4,5].
Visibility measurement equipment used to assess fog intensity is essential for ensuring maritime safety [6]. This equipment uses optical sensors that emit infrared light and measure the amount of light scattered by aerosols and the atmosphere. However, such equipment is quite expensive. Additionally, visibility data obtained at sea can contain significant errors. Consequently, visibility data are recommended to be measured by well-trained observers following the guidelines of the World Meteorological Organization (WMO) [7]. In South Korea, there are 23 observer measurement centers and 291 meteorological measurement systems in operation [8]. According to a 2018 study, comparing data collected by trained observers and visibility meters showed that visibility meters detect fog more frequently, with errors being more pronounced at sea [9]. For example, measured data sometimes show abrupt changes in fog intensity, which does not align with the typical gradual thickening or thinning of fog. Such issues can lead to conflicts of interest, such as frequent fog disrupting vessel operations and affecting the livelihoods of small fishing vessel operators [10].
Advancements in fog density estimation and real-time monitoring technologies are crucial for preventing maritime accidents and ensuring the safety of the marine environment. Recent studies have focused on using video-based algorithms to detect fog conditions instead of traditional visibility meters [11]. These video-based fog intensity measurement algorithms utilize cameras to capture marine images, enhancing accuracy by excluding sky regions that could interfere with fog detection. By applying the Dark Channel Prior (DCP) technique [12,13], these algorithms improve computational efficiency by calculating the dark channel ratio within the image, enabling the rapid real-time estimation of fog intensity. This approach helps monitor locally occurring fog conditions in real time, reduces costs, and enhances maritime navigation safety by preventing accidents.
However, estimating fog intensity using images alone presents several challenges. For example, while the human eye can distinguish between sunlight reflections on the water and actual fog on clear days, video-based algorithms may mistakenly classify bright regions caused by sunlight or atmospheric scattering as fog [14]. To address this issue, this paper proposes a new fog intensity measurement algorithm, AEA-RDCP, which removes the influence of atmospheric light (such as sunlight) to enable accurate fog detection even in images with severe atmospheric scattering. Additionally, the study employs two datasets to estimate sea fog density under various conditions: one consisting of images unaffected by atmospheric light and another heavily influenced by it. The proposed algorithm offers the same cost-effectiveness as existing video-based fog intensity measurement algorithms (RDCP) while improving accuracy and reducing the risk of accidents due to incorrect detection.
This paper is organized as follows: Section 2 reviews the relevant literature focusing on the DCP technique and introduces the existing DCP-based fog intensity measurement algorithm (RDCP). Section 3 explains the details of the AEA-RDCP algorithm. Section 4 compares the performance of AEA-RDCP and RDCP and presents the experimental evaluation results of the AEA-RDCP algorithm.

2. Literature Review

2.1. Visibility Estimation Studies

The research on fog has largely been concentrated on terrestrial environments, with considerable effort dedicated to developing fog removal techniques rather than methodologies for visibility estimation. Fog removal on land is critical across various domains such as traffic safety, architectural applications, and environmental monitoring. When fog is present, visibility can be severely limited, which poses risks in multiple scenarios including driving, construction, and environmental assessment. Addressing these issues is particularly important in fields where safety and operational efficiency are paramount. Similarly, in the maritime industry, precise real-time visibility estimation is as crucial as the fog removal techniques used on land. Limited visibility due to fog can lead to maritime accidents, highlighting the need for accurate real-time visibility information to ensure safety and efficiency in marine operations. Therefore, reliable visibility estimation technologies are essential for effectively managing these challenges [15,16,17,18,19].
In this context, Q. Zhu proposed an innovative method for estimating fog density using the Color Attenuation Prior (CAP) technique [20]. This method involves calculating the color attenuation ratio for each pixel in an RGB image to determine the fog density. Typically, objects that are farther away in a foggy image experience greater color attenuation, which allows for an estimation of fog density based on these variations. The CAP algorithm is capable of rapidly and efficiently determining fog concentration, making it useful in various fields where fast and accurate fog density estimation is required.
Additionally, F. Outay conducted research to address visibility issues caused by fog by analyzing video surveillance footage from airports [21]. This study developed a sophisticated model using deep-learning algorithms to accurately estimate visibility under foggy conditions. The model was integrated into airport video surveillance systems to provide real-time visibility monitoring, ensuring consistent performance across different weather conditions and enhancing overall airport safety and operational stability.
In the realm of fog-related research, the DCP algorithm has made significant contributions [14,15]. The DCP algorithm has been extensively used in dehazing research to restore images where visibility has been compromised due to fog. It has also been applied to visibility estimation studies. The DCP algorithm is based on the empirical observation that, in fog-free images, at least one of the color channels—Red (R), Green (G), or Blue (B)—typically exhibits very low intensity values. This principle is known as the DCP, and the channel with the lowest intensity value is referred to as the DC. The mathematical expression for this principle is as follows in Equation (1):
J^{dark}(x) = \min_{y \in \Omega(x)} \Big( \min_{c \in \{r,g,b\}} J^{c}(y) \Big)    (1)
where J^c is a color channel of J, and Ω(x) denotes a local patch of n × n pixels centered at x; the pixel y ranges over this patch. The DCP algorithm can be used for fog removal or intensity estimation. The main processes of the DCP algorithm include the following:
  • Generation of the DC Image: Creating the DC image using the minimum values per channel from local patches;
  • Estimation of Atmospheric Light: Calculating the atmospheric light value using the brightest pixels in the DC image;
  • Estimation of the Transmission Map: Calculating an initial transmission map, then refining it to obtain the final transmission map;
  • Calculation of the Restored Image: Using the transmission map and atmospheric light value to restore the original image.
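As a concrete illustration of the first step, a minimal NumPy sketch of DC image generation (per-pixel channel minimum followed by a patch-wise minimum, i.e., erosion) might look as follows; the function name and default patch size are illustrative, not taken from the original implementation:

```python
import numpy as np

def dark_channel(image: np.ndarray, patch: int = 15) -> np.ndarray:
    """Generate the Dark Channel of an H x W x 3 RGB image (values 0-255).

    Step 1: take the per-pixel minimum across the R, G, B channels.
    Step 2: take the minimum over a patch x patch neighbourhood (erosion),
            implemented here with edge padding and a sliding window.
    """
    min_rgb = image.min(axis=2)                 # per-pixel channel minimum
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode="edge")  # replicate border pixels
    windows = np.lib.stride_tricks.sliding_window_view(padded, (patch, patch))
    return windows.min(axis=(2, 3))             # local-patch minimum
```

In a fog-free scene the result is dominated by low values, whereas fog lifts the Dark Channel toward high values, which is the property the fog intensity algorithms discussed below exploit.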
Bae [22] proposed an enhanced method by combining the DCP with distance information from fixed objects in coastal areas, aiming to estimate comprehensive visibility over extensive regions, such as ports. This approach improves the accuracy of the DCP algorithm by leveraging the distance data of fixed objects, thereby providing a more accurate visibility estimation.
Yang [23] further improved the DCP algorithm by integrating it with the Grayscale Image Entropy (GIE) and Support Vector Machine (SVM) techniques to estimate the visibility range more precisely. This integrated approach is employed to determine the visibility in current road and traffic conditions, which helps in setting appropriate speed limits to enhance traffic safety and reduce congestion. The DCP algorithm has been rigorously validated over an extended period and has been embedded in various industrial products, such as car cameras and road CCTVs, making it a highly reliable method for estimating fog intensity. Consequently, research has been conducted using the DCP as a foundational algorithm to develop advanced video-based fog intensity measurement algorithms specifically for estimating sea fog intensity.

2.2. Image-Based Sea Fog Strength Measurement Algorithm Using DC (RDCP)

The Image-based Sea Fog Intensity Measurement Algorithm, known as RDCP, leverages the initial processes of the DCP method to extract DC values from images. This algorithm estimates the intensity of sea fog by calculating the percentage of DC values that fall below a predefined threshold, derived from empirical data. The formula for calculating the RDCP percentage is as follows in Equation (2):
\mathrm{DCP\ percentage}\ (\%) = \frac{\text{number of pixels whose DC value is less than the threshold}}{\text{total number of pixels in the image}} \times 100    (2)
RDCP research revealed a significant difference in the distribution of pixel values between foggy and clear images. As shown in Figure 1, extracting the DC values from images taken on clear days and from those with dense fog and plotting them on a graph shows a noticeable rise in pixel values at a specific threshold. This threshold is then used to measure fog intensity by calculating the proportion of pixels below it. For this comparison, 10 images each from clear and foggy conditions were used; as illustrated in Figure 2, foggy images contain almost no pixels with DC values below 100, whereas clear images have relatively many pixels below this threshold. Based on this finding, a threshold of 100 (on a scale of 0 to 255) was established. This value is not fixed and may be adjusted to local environmental conditions. A higher percentage of pixels with DC values below 100 indicates clearer weather, while a lower percentage signifies denser fog [11,24].
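Equation (2) reduces to a few lines of NumPy. This sketch assumes the DC image has already been computed; the default threshold of 100 follows the RDCP study and, as noted above, may be tuned to local conditions:

```python
import numpy as np

def rdcp_percentage(dc_image: np.ndarray, threshold: int = 100) -> float:
    """Percentage of Dark Channel pixels whose value is below the threshold.

    A high percentage indicates clear weather; a low percentage, dense fog.
    """
    below = np.count_nonzero(dc_image < threshold)
    return 100.0 * below / dc_image.size
```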
In the RDCP methodology, as depicted in Figure 3, the sky region is excluded from the analysis based on the horizon line. This exclusion ensures that the percentage of DC values below the threshold is calculated only from the sea region of the image. This step is crucial in order to prevent the DCP algorithm from mistakenly identifying bright objects or light sources, which are common on clear, sunny days, as fog. By focusing on the sea region, this approach reduces the influence of atmospheric light, allowing for a more accurate estimation of fog density and visibility [25].

2.3. Effects of Atmospheric Light on Visibility Estimation

Research on visibility estimation has employed various methodologies, each with distinct advantages and limitations. A recurring challenge across these studies is the accurate differentiation of effects caused by environmental factors such as sunlight, which can manifest as shadows, reflections, or highlights in images. These effects often confound the algorithms’ ability to precisely identify and quantify fog or haze [14,26,27,28].
The CAP algorithm [21], for instance, is effective for estimating fog density by analyzing the attenuation of color in RGB images. However, it struggles with distinguishing between the effects of fog and those caused by lighting variations, particularly those induced by sunlight. Sunlight can increase the overall brightness of an image, create intense highlights, and cast shadows, all of which exhibit characteristics that differ from the color attenuation due to fog. These lighting effects can lead to erroneous fog detection and miscalculations of fog density, as CAP is not designed to account for these nuances.
Deep-learning-based approaches for visibility estimation also encounter significant challenges [22]. These models are highly dependent on the quality and diversity of their training data. If the training datasets are limited to specific geographic regions or weather conditions, the models may lack the generalization capability needed to accurately perform in different environments. Furthermore, adverse weather conditions such as strong light, shadows, or rain can further degrade the accuracy of these models, as they may not be adequately represented in the training data.
Similar challenges are present in visibility estimation algorithms utilizing the DCP [13]. For instance, Bae’s study highlighted that strong sunlight could cause certain areas of the image to become overly bright, complicating the accurate calculation of the DC [22]. This excessive brightness can obscure the true nature of the fog, leading to the underestimation of its intensity. Yang’s research also pointed out that glare from sunlight could distort the image contrast, affecting the Grayscale Image Entropy (GIE) values used for visibility estimation, thereby reducing the algorithm’s accuracy [23].
The RDCP approach attempts to mitigate some of these issues by focusing on the distribution of DC values below a specific threshold and excluding the sky region from the analysis [11]. This refinement aims to reduce the influence of sunlight and other bright light sources, which can create misleading signals in the data. However, as illustrated in Figure 4, areas affected by strong sunlight or other external light sources can still introduce inaccuracies in the calculation of the DC. These light sources can elevate the intensity levels in certain areas, mimicking effects typically associated with fog or haze [29].
Despite the exclusion of the sky region, there remains a significant need to further mitigate the influence of atmospheric light to enhance the accuracy of sea fog detection. Therefore, this paper proposes additional refinements to the RDCP algorithm. Specifically, it suggests the exclusion of pixels influenced by atmospheric light from the DC calculation. This refinement aims to minimize the impact of extraneous lighting conditions, thereby improving the precision of visibility estimation and fog detection. By refining the RDCP algorithm in this manner, the study seeks to provide more reliable and accurate visibility assessments, particularly in complex maritime environments where atmospheric conditions can vary greatly.

3. Approximating and Eliminating the Airlight–Reduced DCP (AEA-RDCP)

3.1. AEA-RDCP

In this paper, we introduce a novel algorithm called Approximating and Eliminating the Airlight–Reduced Dark Channel Prior (AEA-RDCP), designed to address the issue of mistaking sunlight-reflecting sea areas for fog in image-based fog intensity measurement systems. The AEA-RDCP algorithm builds upon the initial steps of the DCP methodology and aims to provide accurate estimations of fog density and visibility. A comprehensive flowchart outlining the proposed algorithm's processes is presented in Figure 5.
To explain the proposed method, we first apply the initial Dark Channel Prior (DCP) process to images captured with a standard camera. This process involves calculating DC values, which are essential for estimating fog density and visibility. By focusing on DC values only, the algorithm reduces complexity and computational demands. A critical step in the AEA-RDCP method is excluding the sky region from the analysis: the DCP algorithm often struggles to differentiate bright objects, such as clouds or direct sunlight, from fog, so removing the sky region mitigates this issue and improves fog detection accuracy.
Next, when extracting the DC image from the sea area, the algorithm processes pixels influenced by atmospheric light. These pixels, which exceed a set threshold, are darkened to prevent distortion in the DC percentage calculation due to reflected sunlight. After minimizing the influence of atmospheric light, the algorithm calculates the ratio of DC values to assess fog conditions. A higher proportion of DC values below 100 indicates clear weather, while a lower proportion signals the presence of fog. This method enables real-time fog monitoring, and, by excluding atmospheric light, it improves the accuracy of visibility estimation, helping to prevent maritime accidents.
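Putting the steps above together, the AEA-RDCP flow might be sketched as follows. This is an illustrative reading of the description, not the authors' implementation: the horizon row, the thresholds, and the decision to both darken airlight pixels and exclude them from the percentage are assumptions, and the patch-wise erosion of the Dark Channel is omitted for brevity.

```python
import numpy as np

def aea_rdcp_percentage(image: np.ndarray, horizon_row: int,
                        light_threshold: float,
                        dc_threshold: float = 100.0) -> float:
    """Sketch of the AEA-RDCP pipeline (illustrative names and defaults).

    1. Crop away the sky region above the horizon line.
    2. Mark pixels whose mean RGB exceeds the atmospheric-light threshold.
    3. Build a simplified Dark Channel (per-pixel minimum; erosion omitted)
       and darken the marked airlight pixels.
    4. Return the percentage of the remaining DC pixels below dc_threshold.
    """
    sea = image[horizon_row:].astype(float)    # 1. exclude the sky region
    keep = sea.mean(axis=2) < light_threshold  # 2. False = airlight pixel
    dc = sea.min(axis=2)
    dc[~keep] = 0.0                            # 3. darken airlight pixels
    kept = dc[keep]                            # 4. exclude them from the ratio
    return 100.0 * np.count_nonzero(kept < dc_threshold) / max(kept.size, 1)
```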

3.2. Example of Setting and Applying Ambient Light Estimation Thresholds

To accurately estimate the atmospheric light, it is essential that we adhere to the estimation process as outlined in the DCP algorithm. The DCP algorithm employs an atmospheric scattering model to determine both the atmospheric light and the transmission map, which are fundamental for restoring images compromised by fog and for the removal of haze. This model serves as a physical representation that aids in reconstructing images captured under foggy conditions. It does so by delineating the relationship between the observed pixel values in the image, the actual scene radiance (i.e., the true scene without fog), the atmospheric light, and the transmission map. The equation for the atmospheric scattering model is expressed as follows in Equation (3) [30]:
I(x) = J(x)\,t(x) + A\,(1 - t(x))    (3)
  • I(x) is the observed pixel value of the image;
  • J(x) is the actual pixel value of the scene without fog;
  • t(x) is the transmission map, which is related to the depth of the scene and indicates how much the pixel has been attenuated by atmospheric particles;
  • A is the atmospheric light, representing the brightest part of the scene.
In the DCP algorithm, the atmospheric light (A) is estimated by selecting the pixels corresponding to the top 0.1% of values in the DC image. These brightest pixels are the most strongly influenced by atmospheric scattering, which enables a precise estimation of the atmospheric light value [31]. The choice of the top 0.1% is not arbitrary: it is derived from empirical studies and experimental observations and has been validated in multiple works, including that of C. O. Ancuti et al. [32], who discuss effective methods for local airlight estimation in image dehazing that balance accuracy and computational efficiency. The consistent application of this criterion across studies supports the reliability of this methodology for image restoration under foggy conditions [33].
To estimate atmospheric light based on the well-established criterion of using the top 0.1% of pixels in the DC image, this study calculates the threshold value following a structured process, as shown in the pseudocode in Algorithm 1. The steps are as follows:
  • Dataset Selection: A dataset of 100 images with a resolution of 3840 × 2160 pixels was selected for threshold estimation, as illustrated in Figure 6. These images were taken under strong sunlight conditions and were carefully curated from the AI Hub dataset, focusing on days with abundant sunlight across various times of the day [34].
  • DC Image Creation: For each image, the minimum value among the R, G, and B channels of every pixel was chosen to generate the DC image, which captures the impact of fog.
  • Brightest Pixel Selection: The top 0.1% brightest pixels in the DC image were identified.
  • Patch Extraction: Patches of 301 × 301 pixels surrounding these brightest pixels were extracted to ensure accurate estimation by excluding pixels affected by atmospheric light.
  • Threshold Calculation: The average values of the R, G, and B channels within these patches were calculated [35]. These averages were then used as the threshold for atmospheric light estimation.
Algorithm 1. Process of obtaining the threshold value of the ambient light estimation pixel.
1. Prepare Sunlight Images: 100 images
  - Load up to 100 images from the image folder.
2. Generate Dark Channel Image:
For each image do
  - Select the minimum value among the R, G, B channels for each pixel to create the dark channel image.
  - Apply erosion operation to the dark channel image to obtain the final dark channel image.
3. Select the brightest pixel among the top 0.1% pixels in the Dark Channel image:
  - Select the top 0.1% brightest pixels in the dark channel image.
  - Find the brightest pixel among them.
4. Extract a patch around the brightest pixel and calculate the average R, G, B values in this patch:
  - Extract a patch of fixed size around the coordinates of the brightest pixel.
  - Calculate the average R, G, B values of all pixels in this patch.
5. Determine the calculated average values as the threshold for estimating atmospheric light pixels:
  - Use the calculated average R, G, B values as the atmospheric light estimate.
The choice of patch size (301 × 301 pixels) considered the overall image resolution and helped to refine the estimation process by effectively filtering out atmospheric light effects, improving accuracy [35].
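The steps of Algorithm 1 can be sketched in Python as follows. This is a sketch under the description above: the erosion of the DC image is reduced to the per-pixel channel minimum, a single patch per image is used, and the function and variable names are illustrative.

```python
import numpy as np

def airlight_threshold(images, patch: int = 301, top_fraction: float = 0.001):
    """Estimate the atmospheric-light threshold from sunlight images.

    For each image: build the DC image, locate the brightest pixel among the
    top 0.1% of DC values, extract a patch x patch window around it (clamped
    at the image borders), and average the R, G, B values inside that window.
    The per-image averages are then averaged into the final threshold.
    """
    half = patch // 2
    patch_means = []
    for img in images:
        dc = img.min(axis=2)                     # step 2: DC image
        flat = dc.ravel()
        k = max(1, int(flat.size * top_fraction))
        top = np.argpartition(flat, -k)[-k:]     # step 3: top 0.1% pixels
        brightest = top[np.argmax(flat[top])]
        r, c = divmod(int(brightest), dc.shape[1])
        r0, r1 = max(0, r - half), min(dc.shape[0], r + half + 1)
        c0, c1 = max(0, c - half), min(dc.shape[1], c + half + 1)
        window = img[r0:r1, c0:c1]               # step 4: surrounding patch
        patch_means.append(window.reshape(-1, 3).mean(axis=0))
    return np.mean(patch_means, axis=0)          # step 5: averaged R, G, B
```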
Figure 7 illustrates a comparison between two methods: (a) setting the threshold value based on the average pixel values of the top 0.1%, where the pixels identified as atmospheric light are set to 0, and (b) setting the threshold value based on the average pixel values within the designated patch size around the top 0.1% of pixels, also setting the atmospheric light pixels to 0. Comparing these two methods, (a) and (b), demonstrates that using a patch size setting more effectively reduces the influence of atmospheric light, setting the pixel values to 0. This approach helps accurately identify and mitigate the effects of atmospheric scattering on image quality. The formula used for setting the pixel values to 0 when estimating atmospheric light is defined as follows in Equation (4):
p'(x, y) = \begin{cases} p(x, y), & \text{if } \mathrm{mean}(p(x, y)) < T \\ 0, & \text{otherwise} \end{cases}    (4)
Here, p(x, y) denotes the RGB value of the pixel at coordinate (x, y) in the original image, and p'(x, y) denotes the RGB value of the pixel at the same coordinate in the transformed image. mean(p(x, y)) is the average of the R, G, and B values of the pixel, and T is the atmospheric light estimation threshold. The average value of each pixel is first calculated; if it is less than the threshold, the pixel value is retained, and if it is greater than or equal to the threshold, the pixel value is set to 0. This method effectively reduces the influence of atmospheric light in images of sea regions.
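A direct NumPy rendering of Equation (4); it assumes the image and the threshold T are expressed on the same intensity scale:

```python
import numpy as np

def suppress_airlight(image: np.ndarray, threshold: float) -> np.ndarray:
    """Apply Equation (4): keep a pixel when its mean RGB value is below the
    atmospheric-light threshold T; otherwise set all three channels to 0."""
    out = image.copy()
    out[image.mean(axis=2) >= threshold] = 0
    return out
```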
Additionally, Figure 8 shows an example image where the patch size is set to 301 × 301 pixels. In this figure, the x-axis and y-axis represent the dimensions of the image, and the blue-highlighted areas indicate the patch size surrounding the top 0.1% brightest pixels in the DC image. These patches, located around the top 0.1% brightest pixels, are shown to be influenced by atmospheric light. To accurately identify pixels affected by atmospheric light, the average values of the R, G, and B channels within the designated patches around the top 0.1% brightest pixels are calculated [35]. The threshold for determining atmospheric light is set at 0.95. This approach precisely identifies and excludes pixels affected by atmospheric light from further analysis, thereby improving the accuracy of fog density estimation. By setting the threshold in this manner, errors caused by reflected sunlight or other external light sources are effectively reduced, enhancing the reliability of detecting atmospheric light effects. As a result, this method provides more accurate and reliable data for subsequent processing and analysis, ensuring that the true conditions of the scene are more faithfully represented.

4. Experiments and Evaluations

4.1. Experimental Dataset

In most image-based fog research, datasets typically consist of pairs of foggy and fog-free images of the same scene, often created using artificially synthesized images. However, datasets specifically for fog density studies are relatively scarce compared to road image datasets. To address this limitation and advance fog density research in marine environments, this study introduces two types of real-time marine image datasets.
Dataset (a), shown in Figure 9, consists of images with minimal atmospheric light interference, captured from four different ports equipped with buoys. These ports were carefully selected to reduce the impact of atmospheric light, ensuring that high-quality data could be used for analysis. In contrast, Dataset (b), shown in Figure 10, comprises images taken from several ports significantly affected by atmospheric light, intended for comparative analysis with Dataset (a). All images were captured in South Korea. The images in Dataset (a) were taken from the ports of Ganghwa, Pyeongtaek, Baengnyeongdo, and Jindo, marked with blue dots on the map, while Dataset (b) includes images captured from Jeju Port and Mukho Port, marked with red dots. The geographic locations of these ports are visually represented in Figure 11, helping to understand the distribution of the datasets.
Dataset (a) contains a total of 320 images, 80 from each of the four regions. It is accompanied by a detailed fog classification table categorizing the images based on visibility conditions: images from each location are classified into four fog intensity labels, No-fog (20 images), Low-fog (20 images), Mid-fog (20 images), and Dense-fog (20 images). These classifications are systematically presented in Table 1. In this study, the RDCP fog intensity criteria were adhered to, ensuring consistency with the established standards for fog intensity measurement.
Furthermore, this approach involves removing the sky region to enhance the accuracy of fog density assessment. By applying these criteria, we aim to maintain a high level of precision in evaluating fog intensity, ensuring that our analysis results are both reliable and accurate.
Figure 9 displays images from the Baengnyeongdo region in Dataset (a), categorized by fog intensity: the panels show No-fog, Low-fog, Mid-fog, and Dense-fog in turn. Figure 10 presents examples from Dataset (b), showcasing images from clear days with abundant ambient light, as seen in Figure 10a–j.
In Figure 12a, it is evident that the day is clear with minimal influence from atmospheric light, showing almost no fog. On the other hand, in Figure 12b, you can see elements that could be mistaken for fog within the image due to the significant influence of atmospheric light. The results of this observation are represented as Dark Channel Prior (DCP) percentages in Table 2. As shown in the table, Dataset (a) has a DCP percentage of 63%, while Dataset (b) shows 40%, indicating that atmospheric light negatively impacts the camera’s performance. Therefore, we proceed with an experiment comparing the RDCP and AEA-RDCP algorithms using Datasets (a) and (b). This setup allows for a comprehensive comparison of the existing fog intensity measurement algorithm (RDCP) and the proposed algorithm under various conditions, particularly in assessing the performance of the proposed algorithm on images taken on clear days.

4.2. Dataset without the Influence of Atmospheric Light: Performance Comparison and Evaluation with RDCP

In this study, experiments were conducted using Dataset (a), which consists of images classified by fog intensity and is not influenced by atmospheric light, to compare the performance of the existing DCP-based fog intensity measurement algorithm with the proposed AEA-RDCP algorithm. The dataset used for the experiments comprises a total of 320 images, each with a resolution of 1280 × 720 pixels. Before comparing the fog detection performance of the two algorithms, we first observed how the DC percentage values change when the sky region is removed from sea images to assess the impact of external light sources.
Figure 13 compares the fog intensity levels in images from the four regions (Baengnyeongdo, Ganghwa, Pyeongtaek, and Jindo) when the sky region is not removed (a) versus when it is removed (b). The graph specifically shows data from the Baengnyeongdo region. The x-axis represents the image index, and the y-axis represents the DC percentage value for each image, i.e., the ratio of DC image pixels below the threshold value.
In Figure 13, Graph (a) shows that, on clear days, the average DC percentage value is approximately 62%, which is the highest value observed. In contrast, Graph (b) shows that the average DC percentage for images on clear days is close to 100%. This result indicates that, by removing the sky region, the ambiguity in the DC values of the image is eliminated, suggesting that images with the sky region removed are less affected by external light sources. Therefore, the DC percentage values below the threshold, determined using fog intensity measurement algorithms that remove the sky region, can be more clearly distinguished according to fog intensity, and it is sufficient for estimating fog intensity in images without atmospheric light influence.
The main goal of the experiment is to assess the effectiveness of the proposed AEA-RDCP algorithm in measuring fog density in images without atmospheric light influence. To do this, the experiment compares the DC percentage values obtained from images with the sky region removed and images processed with the AEA-RDCP algorithm, evaluating the performance of both algorithms. The experiment utilized Dataset (a), collected from four different regions to minimize the influence of atmospheric light. In each trial, the DC percentage values from images with the sky region removed were compared with those from images processed by the AEA-RDCP algorithm. Pixels exceeding the threshold for atmospheric light estimation were set to zero and excluded from the DC distribution calculation, ensuring more accurate fog density measurements.
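The airlight-exclusion step described above can be sketched as follows. This is a hedged reconstruction: the airlight threshold of 230 is a placeholder (the paper derives its threshold from a calibration dataset), and the function names are illustrative.

```python
import numpy as np

def exclude_airlight(dark_channel_img, airlight_threshold=230):
    """Zero pixels above the atmospheric-light threshold and return a
    mask of the pixels kept for the DC distribution."""
    mask = dark_channel_img <= airlight_threshold    # True for pixels kept
    cleaned = np.where(mask, dark_channel_img, 0)    # zero the airlight pixels
    return cleaned, mask

def dc_percentage_excluding_airlight(dark_channel_img, fog_threshold=100,
                                     airlight_threshold=230):
    """DC percentage computed only over pixels not attributed to airlight."""
    cleaned, mask = exclude_airlight(dark_channel_img, airlight_threshold)
    kept = cleaned[mask]
    if kept.size == 0:
        return 0.0
    return 100.0 * np.count_nonzero(kept < fog_threshold) / kept.size

# A clear sea scene with a bright sun-glint band: without exclusion the
# glint pixels (value 250) depress the DC percentage; with exclusion the
# estimate recovers.
dc = np.full((100, 100), 40, dtype=np.uint8)   # dark sea surface
dc[:20, :] = 250                               # sun-glint region
plain = 100.0 * np.count_nonzero(dc < 100) / dc.size
print(plain)                                   # 80.0
print(dc_percentage_excluding_airlight(dc))    # 100.0
```

Excluding the zeroed pixels from the denominator matters: if they were merely set to 0 but still counted, they would wrongly register as "foggy" pixels below the fog threshold.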
The interpretation of the results is presented through Figure 14, Figure 15, Figure 16 and Figure 17. In these figures, Graph (a) represents the results for images with the sky region removed, while Graph (b) shows the results after applying the AEA-RDCP algorithm. The comparison of the two graphs reveals that the DC percentage values align with the criteria outlined in Table 1 and are nearly identical for both algorithms. This similarity is attributed to the minimal influence of external light sources, such as sunlight reflections, in Dataset (a) from the four regions. Therefore, it is confirmed that the AEA-RDCP algorithm is just as effective as the RDCP algorithm in measuring fog density in images without atmospheric light influence.

4.3. Dataset Affected by Atmospheric Light: AEA-RDCP Performance Evaluation

This experiment’s primary objective was to assess the effectiveness of the proposed AEA-RDCP algorithm, which sets pixels influenced by atmospheric light to 0, compared with the RDCP algorithm, which only removes the sky region, in accurately measuring fog density in images significantly affected by atmospheric light. The experiment was conducted using Dataset (b), which consists of images with a resolution of 1280 × 720 pixels that are sensitive to atmospheric light interference. To evaluate the performance of each algorithm, the experiment estimated the proportion of DC image pixels below a threshold value of 100.
Figure 18 presents the results for 20 images within the dataset, showing the DC percentage values for both the RDCP and AEA-RDCP algorithms. The RDCP algorithm mitigates atmospheric light effects primarily by removing the sky region, whereas the AEA-RDCP algorithm further reduces these effects by setting the estimated atmospheric light pixels to 0. This comparison highlights the strengths of each approach in handling images with significant atmospheric light interference. The graphs show that the DC percentage values below the threshold are generally higher in images processed with the AEA-RDCP algorithm than in those where only the sky region was removed. Specifically, the sky-region-removal method yields a maximum DC percentage of approximately 90%, a minimum of 63%, and an average of 81.6%. In contrast, the AEA-RDCP algorithm achieves a maximum of nearly 100%, a minimum of 74%, and an average of 94%. This indicates that the AEA-RDCP algorithm improves the DC percentage by about 12 percentage points on average compared to the RDCP algorithm.
Additionally, Figure 19 provides a detailed analysis by selecting the image with the most significant difference in DC percentage between the two methods. This figure illustrates (a) the original DC percentage, (b) the DC percentage after removing the sky region, and (c) the DC percentage after removing both the sky region and mitigating the effects of atmospheric light. This comparison visually emphasizes the impact of each processing step and shows that the AEA-RDCP algorithm offers more accurate fog density estimation in images influenced by atmospheric light.
Figure 19a illustrates how, without preprocessing, the DC percentage decreases significantly due to the influence of external light sources, even on a clear day. In Figure 19b, after removing the sky region, the DC percentage increases by more than 40 percentage points, indicating a substantial improvement in measurement reliability. Finally, when the atmospheric light pixels are set to 0, the DC percentage approaches nearly 100%. This outcome demonstrates that the AEA-RDCP algorithm effectively mitigates the influence of external light sources, enhancing the accuracy of fog detection while reducing computational complexity. Table 3 compares the performance metrics of the RDCP and AEA-RDCP algorithms for the image in Figure 19, confirming that the AEA-RDCP algorithm achieves superior performance in both computation speed and fog detection accuracy.
Removing only the sky region yields accurate and stable DC percentage values for fog detection in ordinary images, but it is insufficient for images heavily influenced by external light sources such as sunlight. The experimental results on datasets affected by atmospheric light demonstrate that the threshold for atmospheric light pixels estimates atmospheric light effectively. In a comparison with the existing sky-region-removal method on two real-time raw image datasets, the AEA-RDCP algorithm consistently maintains accuracy under various conditions and improves fog density estimation accuracy by minimizing the impact of sunlight.

4.4. Visibility Estimation

To validate the accuracy of the AEA-RDCP algorithm in estimating sea fog density, a set of images captured between 6:00 AM and 6:00 PM was prepared. These images, presented in Figure 20, comprise a dataset captured at 10-minute intervals. The dataset illustrates the dissipation of dense fog over time, starting with heavy fog that gradually clears. The proposed algorithm was applied to this dataset to analyze the changes in the DCP percentage over time.
Figure 21 visually represents these changes, with the x-axis indicating time and the y-axis showing DCP percentages. The analysis reveals that, as time progresses, fog density decreases and DCP percentage values increase, demonstrating that the algorithm effectively tracks fog density over time. This confirms the proposed algorithm’s high reliability in accurately estimating fog density in response to temporal changes.
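The mapping from DCP percentage to fog intensity and visibility follows directly from the thresholds in Table 1; the function name below is our own.

```python
def classify_fog(dc_percent):
    """Map a DC percentage to fog intensity and visibility, per Table 1."""
    if dc_percent >= 75:
        return "No-fog", "1500 m or more"
    elif dc_percent >= 50:
        return "Low-fog", "1000-1500 m"
    elif dc_percent >= 25:
        return "Mid-fog", "500-1000 m"
    else:
        return "Dense-fog", "0-500 m"

# Applied to the time series of Figure 21, rising DC percentages map to
# progressively lighter fog classes as the fog dissipates.
print(classify_fog(94))  # ('No-fog', '1500 m or more')
print(classify_fog(18))  # ('Dense-fog', '0-500 m')
```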

5. Conclusions

Using visibility equipment to assess fog conditions at sea can lead to errors due to the lack of visual information, and there are limitations in accurately understanding conditions in the middle of the ocean. To address this issue, this study proposes a method for the real-time monitoring of fog changes at low cost by applying the proposed algorithm to cameras installed on buoys or lighthouses at sea. The proposed AEA-RDCP algorithm aims to improve the accuracy of fog detection by minimizing the influence of atmospheric light.
The existing RDCP algorithm estimates visibility by removing the sky region but fails to completely eliminate the effects of atmospheric light. AEA-RDCP overcomes this limitation by using the initial stage of the DCP algorithm to determine a threshold for pixels influenced by atmospheric light, setting pixel values above this threshold to zero and thereby minimizing the impact of atmospheric light.
The experimental results demonstrate that the AEA-RDCP algorithm provides superior fog detection accuracy in various marine environments compared to the RDCP algorithm, which only removes the sky region. This indicates that the proposed method significantly contributes to eliminating the influence of atmospheric light. For example, in images significantly affected by atmospheric light, the AEA-RDCP algorithm showed approximately 12% higher DC percentage values than the RDCP algorithm, proving that AEA-RDCP effectively reduces the influence of atmospheric light and enhances fog detection accuracy.
In conclusion, the AEA-RDCP algorithm is an efficient method for accurately estimating fog intensity and visibility in marine environments at low cost. Traditional visibility equipment is expensive and requires high maintenance costs, whereas the camera-based approach provides reliable information from numerous marine activity points without additional costs. This is particularly beneficial for small fishing vessel operators, as it reduces economic burdens.
However, the proposed algorithm still faces challenges in dark environments, such as evening or nighttime conditions [36]. In low-light conditions, achieving sufficient illumination can be difficult, reducing the accuracy of fog detection. Therefore, future research should explore integrating additional lighting devices or thermal cameras to address these limitations. Additionally, verifying the proposed algorithm under various weather conditions and marine environments is essential in order to contribute to the development of real-time marine safety systems.

Author Contributions

Conceptualization, T.-H.I.; methodology, T.-H.I.; software, S.-H.H.; validation, S.-H.H. and T.-H.I.; formal analysis, T.-H.I.; investigation, S.-H.H.; resources, S.-H.H. and K.-W.K.; data curation, S.-H.H.; writing–original draft preparation, S.-H.H.; writing–review and editing, T.-H.I.; visualization, S.-H.H.; supervision, T.-H.I.; project administration, T.-H.I.; funding acquisition, T.-H.I. and K.-W.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was partly supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP)—Innovative Human Resource Development for Local Intellectualization program grant funded by the Korean government (MSIT) (IITP-2024-RS-2024-00436765). This research was supported by Korea Institute of Marine Science & Technology Promotion (KIMST), funded by the Ministry of Oceans and Fisheries, grant number 20210636.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Koračin, D.; Dorman, C.E.; Mejia, J.; McEvoy, D. Marine Fog: Challenges and Advancements in Observations, Modeling, and Forecasting; Springer: Berlin/Heidelberg, Germany, 2024.
  2. Jeon, H.-K.; Kim, S.; Edwin, J.; Yang, C.-S. Sea Fog Identification from GOCI Images Using CNN Transfer Learning Models. Electronics 2020, 9, 311.
  3. Shao, N.; Lu, C.; Jia, X.; Wang, Y.; Li, Y.; Yin, Y.; Zhu, B.; Zhao, T.; Liu, D.; Niu, S.; et al. Radiation fog properties in two consecutive events under polluted and clean conditions in the Yangtze River Delta, China: A simulation study. Atmos. Chem. Phys. 2023, 23, 9873–9890.
  4. Wang, Y.; Lu, C.; Niu, S.; Lv, J.; Jia, X.; Xu, X.; Xue, Y.; Zhu, L.; Yan, S. Diverse dispersion effects and parameterization of relative dispersion in urban fog in eastern China. J. Geophys. Res. Atmos. 2023, 128, e2022JD037514.
  5. Yang, D.; Zhu, Z.; Ge, H.; Qiu, H.; Wang, H.; Xu, C. A Lightweight Neural Network for the Real-Time Dehazing of Tidal Flat UAV Images Using a Contrastive Learning Strategy. Drones 2024, 8, 314.
  6. Liang, C.W.; Chang, C.C.; Liang, J.J. The impacts of air quality and secondary organic aerosols formation on traffic accidents in heavy fog–haze weather. Heliyon 2023, 9, e14631.
  7. World Meteorological Organization. Guide to Meteorological Instruments and Methods of Observation; World Meteorological Organization: Geneva, Switzerland, 2017; p. 1177.
  8. Korea Open MET Data Portal. Available online: https://data.kma.go.kr/climate/fog/selectFogChart.do?pgmNo=706 (accessed on 25 November 2023).
  9. Lee, H.K.; Shu, M.S. A Comparative Study on the Visibility Characteristics of Naked-Eye. Atmosphere 2018, 28, 69–83.
  10. The Korea Economic Daily: Ongjin County Council Urged the Ministry of Oceans and Fisheries to Ease the Visibility-Related Regulations. 18 October 2021. Available online: https://www.hankyung.com/society/article/202110280324Y (accessed on 25 November 2023).
  11. Hwang, S.-H.; Park, S.-K.; Park, S.-H.; Kwon, K.-W.; Im, T.-H. RDCP: A Real Time Sea Fog Intensity and Visibility Estimation Algorithm. J. Mar. Sci. Eng. 2024, 12, 53.
  12. Wang, S.; Wang, S.; Jiang, Y.; Zhu, H. Discerning Reality through Haze: An Image Dehazing Network Based on Multi-Feature Fusion. Appl. Sci. 2024, 14, 3243.
  13. He, K.; Sun, J.; Tang, X. Single image haze removal using dark channel prior. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 33, 2341–2353.
  14. Narasimhan, S.G.; Nayar, S.K. Contrast Restoration of Weather Degraded Images. IEEE Trans. Pattern Anal. Mach. Intell. 2003, 25, 713–724.
  15. Wang, Y.; Qiu, Z.; Zhao, D.; Ali, M.A.; Hu, C.; Zhang, Y.; Liao, K. Automatic Detection of Daytime Sea Fog Based on Supervised Classification Techniques for FY-3D Satellite. Remote Sens. 2023, 15, 2283.
  16. Narendra, K.; Chandrasekaran, M. Performance evaluation of various dehazing techniques for visual surveillance applications. Signal Image Video Process. 2016, 10, 267–274.
  17. Tarel, J.-P.; Hautière, N. Fast visibility restoration from a single color or gray level image. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Kyoto, Japan, 29 September–2 October 2009; pp. 2201–2208.
  18. Chiang, J.Y.; Chen, Y.-C. Underwater image enhancement by wavelength compensation and dehazing. IEEE Trans. Image Process. 2012, 21, 1755–1769.
  19. Li, B.; Peng, X.; Wang, Z.; Xu, J.; Feng, D. AOD-Net: All-in-One Dehazing Network. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 4770–4778.
  20. Zhu, Q.; Mai, J.; Shao, L. A fast single image haze removal algorithm using color attenuation prior. IEEE Trans. Image Process. 2015, 24, 3522–3533.
  21. Outay, F.; Taha, B.; Chaabani, H.; Kamoun, F.; Werghi, N.; Yasar, A. Estimating ambient visibility in the presence of fog: A deep convolutional neural network approach. Pers. Ubiquitous Comput. 2021, 25, 51–62.
  22. Bae, T.W.; Han, J.H.; Kim, K.J.; Kim, Y.T. Coastal Visibility Distance Estimation Using Dark Channel Prior and Distance Map Under Sea-Fog: Korean Peninsula Case. Sensors 2019, 19, 4432.
  23. Yang, L. Comprehensive Visibility Indicator Algorithm for Adaptable Speed Limit Control in Intelligent Transportation Systems. Ph.D. Thesis, University of Guelph, Guelph, ON, Canada, 2018.
  24. Ryu, E.J.; Lee, H.C.; Cho, S.Y.; Kwon, K.W.; Im, T.H. Sea Fog Level Estimation based on Maritime Digital Image for Protection of Aids to Navigation. J. Korean Soc. Internet Inf. 2021, 22, 25–32.
  25. Jeon, H.S.; Park, S.H.; Im, T.H. Grid-Based Low Computation Image Processing Algorithm of Maritime Object Detection for Navigation Aids. Electronics 2023, 12, 2002.
  26. Tan, R.T. Visibility in Bad Weather from a Single Image. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Anchorage, AK, USA, 23–28 June 2008; pp. 1–8.
  27. Meng, G.; Wang, Y.; Duan, J.; Pan, C.; Yang, X. Efficient Image Dehazing with Boundary Constraint and Contextual Regularization. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Sydney, Australia, 1–8 December 2013; pp. 617–624.
  28. Li, W.; Liu, Y.; Ou, X.; Wu, J.; Guo, L. Enhancing Image Clarity: A Non-Local Self-Similarity Prior Approach for a Robust Dehazing Algorithm. Electronics 2023, 12, 3693.
  29. Liu, S.; Li, Y.; Li, H.; Wang, B.; Wu, Y.; Zhang, Z. Visual Image Dehazing Using Polarimetric Atmospheric Light Estimation. Appl. Sci. 2023, 13, 10909.
  30. Fattal, R. Single image dehazing. ACM Trans. Graph. 2008, 27, 1–9.
  31. Ancuti, C.O.; Ancuti, C.; De Vleeschouwer, C. Effective Local Airlight Estimation for Image Dehazing. In Proceedings of the 2018 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; pp. 2850–2854.
  32. Ancuti, C.; Ancuti, C.O.; De Vleeschouwer, C.; Bovik, A.C. Day and Night-Time Dehazing by Local Airlight Estimation. IEEE Trans. Image Process. 2020, 29, 6264–6275.
  33. Berman, D.; Avidan, S. Non-Local Image Dehazing. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 1674–1682.
  34. AI Hub. Available online: https://aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=175 (accessed on 8 July 2023).
  35. Yang, S.; Liang, H.; Wang, Y.; Cai, H.; Chen, X. Image Inpainting Based on Multi-Patch Match with Adaptive Size. Appl. Sci. 2020, 10, 4921.
  36. An, J.; Son, K.; Jung, K.; Kim, S.; Lee, Y.; Song, S.; Joo, J. Enhancement of Marine Lantern's Visibility under High Haze Using AI Camera and Sensor-Based Control System. Micromachines 2023, 14, 342.
Figure 1. An example of a comparative image set for calculating the DCP percentage: (a) No-fog image; and (b) Dense-fog image.
Figure 2. Image comparison from Figure 1: Threshold estimation graph for fog classification.
Figure 3. Example of an image with the sky region removed: (a) Non-Crop Image; and (b) Crop Image.
Figure 4. Example of an image with significant atmospheric light influence in the sea region.
Figure 5. Processing of the proposed fog intensity measurement algorithm.
Figure 6. Dataset for obtaining the atmospheric light estimation threshold.
Figure 7. Comparison of images when pixel values are set to 0: (a) Example of an image with pixel values of the top 0.1% set to 0; (b) Example of an image with peripheral patches of the top 0.1% set to 0.
Figure 8. Example of patch size application using a blue frame.
Figure 9. Dataset (a): Example image sets of Baengnyeongdo classified by sea fog strength: No-fog, Low-vis, Fog, and Dense-fog.
Figure 10. Dataset (b): Image set with strong influence of atmospheric light.
Figure 11. Location of the image sets: Blue for Dataset (a), Red for Dataset (b).
Figure 12. Example images of clear days from the datasets in Figure 9 and Figure 10: (a) image on a clear day without atmospheric light interference; and (b) image on a clear day with significant atmospheric light interference.
Figure 13. Comparison of graphs for the original image and the image with the sky region removed: (a) Non-Crop Image; and (b) Crop Image.
Figure 14. Image set without the effect of atmospheric light—graph comparison between the crop image and the airlight-excluded image (Baengnyeong Island): (a) Crop Image; and (b) Crop Image/Excluded Airlight.
Figure 15. Image set without the effect of atmospheric light—graph comparison between the crop image and the airlight-excluded image (Pyeongtaek): (a) Crop Image; and (b) Crop Image/Excluded Airlight.
Figure 16. Image set without the effect of atmospheric light—graph comparison between the crop image and the airlight-excluded image (Ji Island): (a) Crop Image; and (b) Crop Image/Excluded Airlight.
Figure 17. Image set without the effect of atmospheric light—graph comparison between the crop image and the airlight-excluded image (Ganghwa Island): (a) Crop Image; and (b) Crop Image/Excluded Airlight.
Figure 18. Image set with the effect of atmospheric light: graph comparison between the Crop Image and the Crop Image excluding atmospheric light.
Figure 19. Comparison of graphs of images with strong effects of atmospheric light: (a) Original Image, (b) Crop Image, and (c) Excluded Airlight.
Figure 20. An image set showing the changes in sea fog conditions over time.
Figure 21. DCP percentage graph over time.
Table 1. Sea fog strength classification according to visibility.

Sea Fog Intensity | Visibility (Meters) | DC Percent (%)
No-fog | 1500~ | 75~100
Low-fog | 1000~1500 | 50~75
Mid-fog | 500~1000 | 25~50
Dense-fog | 0~500 | 0~25

Table 2. Comparison of DCP percentages in original images using example images.

Dataset | DCP Percent (%)
Dataset (a) | 63
Dataset (b) | 40

Table 3. Performance comparison of the RDCP and AEA-RDCP algorithms.

Metric | RDCP | AEA-RDCP
DCP percent | 85% | 100%
Processing time | 11 ms | 9 ms
Pixel count | 92,160 | 92,160
State | No-fog | No-fog