Article

Three-Color Balancing for Color Constancy Correction

Department of Computer Science, Tokyo Metropolitan University, 6-6 Asahigaoka, Tokyo 191-0065, Japan
*
Author to whom correspondence should be addressed.
J. Imaging 2021, 7(10), 207; https://doi.org/10.3390/jimaging7100207
Submission received: 17 August 2021 / Revised: 22 September 2021 / Accepted: 2 October 2021 / Published: 6 October 2021
(This article belongs to the Special Issue Intelligent Media Processing)

Abstract
This paper presents a three-color balance adjustment for color constancy correction. White balancing is a typical adjustment for color constancy in an image, but lighting effects remain on colors other than white. Cheng et al. proposed multi-color balancing to improve on white balancing by mapping multiple target colors into corresponding ground truth colors. However, three problems have not yet been discussed: choosing the number of target colors, selecting the target colors, and minimizing error, which causes the computational complexity to increase. In this paper, we first discuss the number of target colors for multi-color balancing. From our observation, when the number of target colors is three or more, the best performance of multi-color balancing is almost the same regardless of the number of target colors, and it is superior to that of white balancing. Moreover, if the number of target colors is three, multi-color balancing can be performed without any error minimization. Accordingly, we propose three-color balancing. In addition, the combination of three target colors is discussed to achieve color constancy correction. In an experiment, the proposed method not only outperforms white balancing but also has almost the same performance as Cheng's method with 24 target colors.

1. Introduction

A change in illumination affects the pixel values of an image taken with an RGB digital camera because the values are determined by spectral information such as the spectrum of the illumination [1,2,3]. In the human visual system, it is well known that illumination changes (i.e., lighting effects) are reduced, and this ability keeps the overall color perception of a scene constant [4]. In contrast, since cameras do not intrinsically have this ability, white balancing is applied to images [5]. Without it, tasks such as image segmentation and object recognition may suffer from color distortion caused by lighting effects [2,3,6,7,8,9,10].
Applying white balancing requires a two-step procedure: estimating a white region with remaining lighting effects (i.e., a source white point) and mapping the estimated white region into the ground truth white without lighting effects. Many studies have focused on estimating a source white point in images [11,12,13,14,15,16,17,18,19]. However, even when white regions are accurately estimated, colors other than white still include lighting effects. Therefore, various methods for reducing lighting effects on multiple colors have been investigated as in [5,20,21,22,23,24,25,26,27,28,29,30,31]. For example, von Kries’s [20] and Bradford’s [21] chromatic adaptation transforms were proposed to address this problem under the framework of white balancing.
In contrast, Cheng et al. [5] proposed a multi-color balancing for reducing lighting effects on both chromatic and achromatic colors. In Cheng’s method, multiple colors with remaining lighting effects, called “target colors” in this paper, are used for designing a matrix that maps the target colors into ground truth ones. Although Cheng’s method contributes more to improving color constancy correction than white balancing, it still has three problems: choosing the number of target colors, selecting the combination of target colors, and minimizing error which causes the computational complexity to increase.
In this paper, we propose a three-color balance adjustment. This paper has two contributions. The first is to show that if the combination of three target colors that provides the lowest mean error under three-color balancing is chosen, three-color balancing has almost the same performance as Cheng's method with 24 target colors. The second is to show that there are multiple appropriate combinations of three target colors that offer a low mean error even under various lighting conditions. On this basis, some example combinations of three target colors are recommended that maintain the performance of three-color balancing under general lighting conditions. As a result, the proposed three-color balancing improves color constancy correction more than white balancing does. Furthermore, the proposed method gives new insight into selecting three target colors in multi-color balancing.
In experiments, three-color balancing is demonstrated to have almost the same performance as Cheng’s method with 24 target colors, and it outperforms conventional white balance adjustments. Additionally, the computational complexity was tested under various numbers of target colors, and the proposed method achieved faster processing than Cheng’s method with error minimization algorithms.

2. Related Work

Here, we summarize how illumination changes affect colors in a digital image and then review white balancing and Cheng's multi-color balancing.

2.1. Lighting Effects on Digital Images

On the basis of the Lambertian model [1], pixel values of an image taken with an RGB digital camera are determined by three elements: the spectrum of the illumination $E(\lambda)$, the spectral reflectance of objects $S(\lambda)$, and the camera spectral sensitivity $R_C(\lambda)$ for color $C \in \{R, G, B\}$, where $\lambda$ spans the visible spectrum in the range $[400, 720]$ nm (see Figure 1) [2,3]. A pixel value $P_{\mathrm{RGB}} = (r_P, g_P, b_P)^\top$ in the camera RGB color space is given componentwise by
$$P_C = \int_{400}^{720} E(\lambda)\, S(\lambda)\, R_C(\lambda)\, d\lambda. \quad (1)$$
Equation (1) means that a change in illumination E ( λ ) affects the pixel values in an image. In the human visual system, the changes (i.e., lighting effects) are reduced, and the overall color perception is constant regardless of illumination differences, known as chromatic adaptation [4]. To mimic this human ability as a computer vision task, white balancing is typically performed as a color adjustment method.
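To make Equation (1) concrete, the following NumPy sketch numerically integrates toy spectra over the visible range; the flat illuminant, linear reflectance, and Gaussian sensitivity curves are illustrative assumptions, not measured data.

```python
import numpy as np

def camera_response(E, S, R, wavelengths):
    """Numerically integrate Equation (1): P_C = integral of E(l) S(l) R_C(l) dl.

    E, S: 1-D arrays sampled at `wavelengths` (illuminant spectrum and
    surface reflectance); R: (3, N) array of camera sensitivities for
    C in {R, G, B}. Returns the pixel value (r, g, b)."""
    return np.trapz(E * S * R, wavelengths, axis=1)

# Toy spectra sampled every 10 nm on [400, 720] nm.
lam = np.arange(400, 721, 10.0)
E = np.ones_like(lam)                               # flat illuminant
S = np.linspace(0.2, 0.8, lam.size)                 # reflectance rising toward red
R = np.stack([np.exp(-((lam - mu) / 40.0) ** 2)     # Gaussian sensitivities with
              for mu in (600.0, 540.0, 460.0)])     # assumed R, G, B peaks

P = camera_response(E, S, R, lam)
print(P)  # pixel value before any balancing; changing E changes P
```

Re-running with a different illuminant `E` changes `P`, which is exactly the lighting effect that the balancing methods below try to remove.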

2.2. White Balance Adjustments

By using white balancing, lighting effects on a white region in an image are accurately corrected if the white region under illumination is correctly estimated. Many studies on color constancy have focused on correctly estimating white regions, and a variety of automatic algorithms are available [11,12,13,14,15,17,18].
White balancing is performed by
$$P_{\mathrm{WB}} = M_{\mathrm{WB}} P_{\mathrm{XYZ}}. \quad (2)$$
$M_{\mathrm{WB}}$ in Equation (2) is given as
$$M_{\mathrm{WB}} = M_A^{-1} \begin{pmatrix} \rho_D/\rho_S & 0 & 0 \\ 0 & \gamma_D/\gamma_S & 0 \\ 0 & 0 & \beta_D/\beta_S \end{pmatrix} M_A, \quad (3)$$
where $P_{\mathrm{XYZ}} = (X_P, Y_P, Z_P)^\top$ is a pixel value of an input image $I_{\mathrm{XYZ}}$ in the XYZ color space [32], and $P_{\mathrm{WB}} = (X_{\mathrm{WB}}, Y_{\mathrm{WB}}, Z_{\mathrm{WB}})^\top$ is that of a white-balanced image $I_{\mathrm{WB}}$ [33]. The $3 \times 3$ matrix $M_A$ is decided in accordance with an assumed chromatic adaptation transform [33]. $(\rho_S, \gamma_S, \beta_S)^\top$ and $(\rho_D, \gamma_D, \beta_D)^\top$ are calculated from a source white point $(X_S, Y_S, Z_S)^\top$ in an input image and a ground truth white point $(X_D, Y_D, Z_D)^\top$ as
$$\begin{pmatrix} \rho_S \\ \gamma_S \\ \beta_S \end{pmatrix} = M_A \begin{pmatrix} X_S \\ Y_S \\ Z_S \end{pmatrix} \quad \text{and} \quad \begin{pmatrix} \rho_D \\ \gamma_D \\ \beta_D \end{pmatrix} = M_A \begin{pmatrix} X_D \\ Y_D \\ Z_D \end{pmatrix}. \quad (4)$$
We call using the $3 \times 3$ identity matrix as $M_A$ "white balancing with XYZ scaling" in this paper. In addition, von Kries's [20] and Bradford's [21] chromatic adaptation transforms were proposed for reducing lighting effects on all colors under the framework of white balancing. For example, under von Kries's model, $M_A$ is given as
$$M_A = \begin{pmatrix} 0.4002 & 0.7076 & -0.0808 \\ -0.2263 & 1.1653 & 0.0457 \\ 0.0000 & 0.0000 & 0.9182 \end{pmatrix}. \quad (5)$$
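As a minimal sketch of Equations (3)–(5), the snippet below builds $M_{\mathrm{WB}}$ from a source and a destination white point using the von Kries matrix; the white-point values are hypothetical examples, not values from the paper.

```python
import numpy as np

# Von Kries chromatic adaptation matrix M_A from Equation (5).
M_A = np.array([[ 0.4002, 0.7076, -0.0808],
                [-0.2263, 1.1653,  0.0457],
                [ 0.0000, 0.0000,  0.9182]])

def white_balance_matrix(src_white, dst_white, M_A):
    """Build M_WB of Equations (3)-(4): map the source white point
    (X_S, Y_S, Z_S) onto the ground-truth white (X_D, Y_D, Z_D) by
    scaling in the cone-response space defined by M_A."""
    rho_s = M_A @ np.asarray(src_white, float)   # (rho_S, gamma_S, beta_S)
    rho_d = M_A @ np.asarray(dst_white, float)   # (rho_D, gamma_D, beta_D)
    return np.linalg.inv(M_A) @ np.diag(rho_d / rho_s) @ M_A

# Hypothetical white points: a warm source white and a D65-like destination.
src = (0.95, 1.00, 0.80)
dst = (0.9505, 1.0000, 1.0890)
M_WB = white_balance_matrix(src, dst, M_A)

# The source white is mapped exactly onto the destination white;
# other colors are only approximately corrected, as Section 2.3 notes.
print(M_WB @ np.asarray(src, float))
```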

2.3. Cheng’s Multi-Color Balancing

White balancing is a method that maps a source white point in an image into a ground truth one as in Equation (3). In other words, colors other than white are not considered in this mapping, although the goal of color constancy is to remove lighting effects on all colors. To address this problem, various chromatic adaptation transforms such as those of von Kries [20], Bradford [21], CAT02 [22], and the latest CAM16 [23] have been proposed to reduce lighting effects on all colors under the framework of white balancing.
In addition, Cheng et al. proposed a method [5] that considers both achromatic and chromatic colors to further relax the limitation of white balancing. In their method, multiple colors are used instead of white. Let $T_1, T_2, T_3, \ldots, T_n$, with $T_i = (X_{T_i}, Y_{T_i}, Z_{T_i})^\top$, be $n$ colors with remaining lighting effects in the XYZ color space. In this paper, these colors are called "target colors." Additionally, let $G_1, G_2, G_3, \ldots, G_n$, with $G_i = (X_{G_i}, Y_{G_i}, Z_{G_i})^\top$, be the $n$ ground truth colors corresponding to the target colors. To calculate a linear transform matrix, let two matrices $T$ and $G$ be
$$T = \begin{pmatrix} X_{T_1} & X_{T_2} & X_{T_3} & \cdots & X_{T_n} \\ Y_{T_1} & Y_{T_2} & Y_{T_3} & \cdots & Y_{T_n} \\ Z_{T_1} & Z_{T_2} & Z_{T_3} & \cdots & Z_{T_n} \end{pmatrix} \quad \text{and} \quad G = \begin{pmatrix} X_{G_1} & X_{G_2} & X_{G_3} & \cdots & X_{G_n} \\ Y_{G_1} & Y_{G_2} & Y_{G_3} & \cdots & Y_{G_n} \\ Z_{G_1} & Z_{G_2} & Z_{G_3} & \cdots & Z_{G_n} \end{pmatrix}, \quad (6)$$
respectively. In Cheng's method, $n$ was set to 24 (i.e., all patches in a color rendition chart) [5]. If $n$ is greater than three, $T$ and $G$ are no longer square, so the inverse of $T$ is not uniquely determined. Hence, the Moore–Penrose pseudoinverse [5,34] is used, and the optimal linear transform matrix $M^+$ is given as
$$M^+ = G T^\top (T T^\top)^{-1}. \quad (7)$$
However, as noted by Funt et al. [35], because the illumination across the target colors is generally not uniform, calculating $M^+$ with Equation (7) alone is insufficient to perform calibration under such circumstances. Hence, with $M^+$ from Equation (7) as the initial value, the method in [35] is also used to minimize the sum of angular errors:
$$M = \operatorname*{arg\,min}_{M} \sum_{i=1}^{n} \cos^{-1} \frac{(M T_i) \cdot G_i}{\lVert M T_i \rVert \, \lVert G_i \rVert}. \quad (8)$$
As with white balancing, a pixel value corrected by Cheng's method, $P_{\mathrm{CNG}}$, is given as
$$P_{\mathrm{CNG}} = M P_{\mathrm{XYZ}}, \quad (9)$$
where M is calculated as in Equations (6)–(8).
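The closed-form part of this pipeline, Equation (7), can be sketched as follows; the iterative angular-error refinement of Equation (8) [35] is a separate step omitted here, and the colors and the diagonal "lighting" matrix are simulated toy data.

```python
import numpy as np

def cheng_matrix(T, G):
    """Least-squares mapping of Equation (7): M+ = G T^T (T T^T)^{-1}.

    T, G: (3, n) matrices whose columns are the n target colors and the
    corresponding ground-truth colors in XYZ. For n > 3, T is not square,
    so the Moore-Penrose pseudoinverse is used instead of a plain inverse."""
    return G @ T.T @ np.linalg.inv(T @ T.T)

# Toy data: n = 5 ground-truth colors distorted by a known diagonal cast.
rng = np.random.default_rng(0)
G = rng.uniform(0.1, 1.0, size=(3, 5))      # ground-truth colors
D = np.diag([1.2, 1.0, 0.7])                # simulated lighting effect
T = D @ G                                   # target colors with the cast

M_plus = cheng_matrix(T, G)
# Because the simulated cast is exactly linear, M+ recovers D^{-1} and
# maps every target color back onto its ground truth.
print(np.allclose(M_plus @ T, G))
```

On real data the illumination is not a single linear transform, which is why the refinement of Equation (8) is needed on top of this closed form.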
However, this method still has three problems that have not been discussed:
(i)
How the number of target colors is decided.
(ii)
How the combination of n target colors is selected.
(iii)
How the computational complexity of Cheng’s method is reduced.
Additionally, as for (iii), if the number of target colors is increased, the computational complexity of Cheng’s method will be increased due to the use of Equation (8).

3. Proposed Method

In this section, we investigate the relationship between the number of target colors and the performance of Cheng’s multi-color balancing. From the investigation, we point out that if the combination of three target colors that offers the lowest mean error in three-color balancing is chosen, the three-color balancing will have almost the same performance as Cheng’s method with 24 target colors. Accordingly, we propose a three-color balance adjustment that maps three target colors into corresponding ground truth colors without minimizing error. Additionally, the selection of three target colors is discussed, and we recommend some example combinations of three target colors, which can be used under general illumination. Finally, the procedure of the proposed method is summarized.

3.1. Number of Target Colors

In this section, we discuss the relationship between the number of target colors and the performance of color constancy correction.
White balancing with XYZ scaling and Cheng’s multi-color balancing were applied to the images in Figure 2b,c, respectively, where n was set to 1 for white balancing, and it was set to 3, 4, 5, and 24 for Cheng’s method. Additionally, corresponding ground truth colors were selected from Figure 2a.
Figure 3 shows box plots of experimental results under various conditions. In each adjustment, all combinations of $n$ target colors were chosen from the 24 patches in the color rendition chart. Therefore, the number of combinations of $n$ target colors was $\binom{24}{n}$. For example, when $n = 1$, there are 24 combinations of target colors.
For every combination of target colors, the performance of each adjustment was evaluated by using the mean value of reproduction angular errors for the 24 patches. The reproduction angular error is given by
$$Err = \frac{180}{\pi} \cos^{-1} \frac{P \cdot Q}{\lVert P \rVert \, \lVert Q \rVert} \ [\mathrm{deg}], \quad (10)$$
where P is a mean-pixel value of an adjusted patch, and Q is that of the corresponding ground truth one [36]. In this experiment, P corresponds to an adjusted patch in Figure 2b,c, and Q corresponds to a patch in Figure 2a. From the figure, two properties are summarized as follows.
(i)
Cheng’s method had a lower minimum mean error than white balancing.
(ii)
When $n \geq 3$, Cheng's method had almost the same minimum mean error as that of $n = 24$.
Moreover, when $n = 3$ is chosen and $T$ and $G$ have full rank, $M^+$ reduces to $M$, so Equation (8) is not required. Accordingly, in this paper, we propose selecting $n = 3$.
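The reproduction angular error of Equation (10) used for this evaluation can be sketched as:

```python
import numpy as np

def reproduction_angular_error(P, Q):
    """Equation (10): angle in degrees between an adjusted patch color P
    and its ground-truth color Q; 0 deg means a perfect reproduction."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cos = np.dot(P, Q) / (np.linalg.norm(P) * np.linalg.norm(Q))
    # Clip guards against tiny floating-point overshoot outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# A scaled copy of a color has (near-)zero angular error, since the
# metric is intensity-independent, while orthogonal colors are 90 deg apart.
print(reproduction_angular_error([0.4, 0.5, 0.6], [0.8, 1.0, 1.2]))  # ~0
print(reproduction_angular_error([1, 0, 0], [0, 1, 0]))              # 90.0
```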

3.2. Proposed Three-Color Balancing

Figure 4 shows an overview of the proposed three-color balancing. In a manner like white balancing, a three-color balanced pixel $P_{\mathrm{3CB}}$ is given as
$$P_{\mathrm{3CB}} = M_{\mathrm{3CB}} P_{\mathrm{XYZ}}. \quad (11)$$
Let $T_1 = (X_{T_1}, Y_{T_1}, Z_{T_1})^\top$, $T_2 = (X_{T_2}, Y_{T_2}, Z_{T_2})^\top$, and $T_3 = (X_{T_3}, Y_{T_3}, Z_{T_3})^\top$ be three target colors in the XYZ color space. Note that the location of each target color is assumed to be known; alternatively, target colors can be estimated with a color estimation algorithm, although such algorithms cannot yet reliably estimate a wide variety of colors [37,38,39,40]. Additionally, let $G_1 = (X_{G_1}, Y_{G_1}, Z_{G_1})^\top$, $G_2 = (X_{G_2}, Y_{G_2}, Z_{G_2})^\top$, and $G_3 = (X_{G_3}, Y_{G_3}, Z_{G_3})^\top$ be the corresponding ground truth colors, respectively. Then, $M_{\mathrm{3CB}}$ satisfies
$$G = M_{\mathrm{3CB}} T, \quad (12)$$
where
$$T = \begin{pmatrix} X_{T_1} & X_{T_2} & X_{T_3} \\ Y_{T_1} & Y_{T_2} & Y_{T_3} \\ Z_{T_1} & Z_{T_2} & Z_{T_3} \end{pmatrix} \quad \text{and} \quad G = \begin{pmatrix} X_{G_1} & X_{G_2} & X_{G_3} \\ Y_{G_1} & Y_{G_2} & Y_{G_3} \\ Z_{G_1} & Z_{G_2} & Z_{G_3} \end{pmatrix}. \quad (13)$$
When both $T$ and $G$ have full rank, $M_{\mathrm{3CB}}$ is designed as
$$M_{\mathrm{3CB}} = G T^{-1}. \quad (14)$$
By applying M 3 CB in Equation (14) to every pixel value in input image I XYZ , balanced image I 3 CB is obtained, where target colors in I XYZ are mapped into ground truth ones.
The proposed method can be considered a special case of Cheng's multi-color balancing. If the number of target colors is three and both $T$ and $G$ have full rank, $T^{-1}$ is uniquely determined, and $M_{\mathrm{3CB}}$ can be designed without any error minimization algorithm.
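A minimal sketch of Equation (14), with an illustrative ground-truth trio and a simulated color cast standing in for real chart measurements:

```python
import numpy as np

def three_color_balance_matrix(T, G):
    """Equation (14): M_3CB = G T^{-1} for full-rank 3x3 T and G.
    Columns of T are the three target colors; columns of G are the
    corresponding ground-truth colors (both in XYZ)."""
    if min(np.linalg.matrix_rank(T), np.linalg.matrix_rank(G)) < 3:
        raise ValueError("T and G must both have full rank")
    return G @ np.linalg.inv(T)

# Illustrative ground-truth colors (columns) and a simulated cast.
G = np.array([[0.9, 0.2, 0.3],
              [0.8, 0.5, 0.2],
              [0.7, 0.1, 0.6]])
cast = np.array([[1.1, 0.05, 0.0],
                 [0.0, 0.95, 0.0],
                 [0.0, 0.0,  0.8]])
T = cast @ G                      # target colors with remaining lighting

M = three_color_balance_matrix(T, G)
# All three target colors are mapped exactly onto their ground truths,
# with no pseudoinverse and no iterative error minimization.
print(np.allclose(M @ T, G))
```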

3.3. Selection of Three Target Colors

Under one light source, the optimal combination of three target colors has to be selected by testing all conceivable combinations, as discussed in Section 3.1. However, because illumination changes continuously in real situations, it is impractical to repeat this selection for every light source. Accordingly, in this section, appropriate combinations of three target colors, which offer a low mean reproduction error, are recommended by experimentally testing all combinations of three target colors on 500 images taken under various light sources.
We used the ColorChecker dataset prepared by Hemrit et al. [41], in which pixel values of the 24 patches in a color rendition chart were recorded under 551 illumination conditions. Additionally, the ground truth colors selected in Figure 2a were used for this discussion. Using 500 images in the dataset, all combinations of three colors were tested, and the performance of the proposed method with each combination was evaluated in terms of the mean reproduction error for the 24 patches. From the experiment, the combination of three colors (indexes 6, 9, and 14 in Figure 5) for which the proposed method had the minimum of the 500 mean $Err$ values was selected.
Additionally, we used the other 51 images from the dataset to evaluate the effectiveness of the combination of three target colors. In Table 1, three-color balancing was compared with Cheng’s method and white balancing with XYZ scaling, referred to as 3CB (6, 9, 14), Cheng (1–24), and WB (XYZ), respectively. The selected combination shown in Figure 5 was used for three-color balancing, and all of the 24 target colors were chosen for Cheng’s method. For white balancing, the white patch (index 6) was selected.
From Table 1, 3CB (6, 9, 14) had almost the same performance as Cheng (1–24). In addition, 3CB (6, 9, 14) and Cheng (1–24) outperformed WB (XYZ) for almost all chromatic colors (index 7–24). In other words, WB (XYZ) did not reduce lighting effects on colors other than achromatic colors (index 1–6).
Figure 6 shows box plots of three-color balancing with four more combinations of three target colors in addition to 3CB (6, 9, 14), where 3CB (6, 9, 14), 3CB (11, 14, 24), 3CB (4, 11, 13), 3CB (6, 14, 16), and 3CB (11, 13, 24) had the first, second, third, fourth, and fifth smallest mean reproduction error, respectively.
From the figure, the five combinations of three target colors had almost the same result. This means that the effectiveness of three-color balancing can be maintained even when (6, 9, 14) is replaced with other combinations. Therefore, for example, we recommend (6, 9, 14) and (11, 14, 24) from Figure 6 as the top two combinations of target colors, which can maintain high performance under general illumination.

3.4. Procedure of Three-Color Balancing

Assuming that three colors selected from the 24 colors in Figure 2a are used as ground truth colors, the procedure of three-color balancing is as follows.
(i)
Select three ground truth colors from Figure 2a. Let them be G 1 , G 2 , and G 3 , respectively.
(ii)
Prepare three objects in a camera frame, in which each object color corresponds to one of the ground truth colors. Compute three target colors T 1 , T 2 , and T 3 from the region of each object, respectively.
(iii)
Apply three-color balancing by using T 1 , T 2 , and T 3 and G 1 , G 2 , and G 3 , following Section 3.2.
Using (6, 9, 14) or (11, 14, 24) as a combination is recommended in step (i), as discussed in Section 3.3. Additionally, if the ground truth colors other than those from Figure 2a are used, the selection of target colors discussed in Section 3.3 should be carried out by using the newly determined ground truth colors.
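Steps (i)–(iii) can be sketched end to end as follows; the patch colors are illustrative numbers only, not the actual values of patches (6, 9, 14) from Figure 2a.

```python
import numpy as np

def apply_three_color_balancing(image_xyz, T1, T2, T3, G1, G2, G3):
    """Steps (i)-(iii): build M_3CB from three target/ground-truth pairs
    (Equation (14)) and apply it to every pixel of an XYZ image (H, W, 3)."""
    T = np.column_stack([T1, T2, T3]).astype(float)
    G = np.column_stack([G1, G2, G3]).astype(float)
    M = G @ np.linalg.inv(T)
    # Per-pixel matrix multiply: row vector p -> p @ M^T equals M @ p^T.
    return image_xyz @ M.T

# Hypothetical target colors measured from three chart regions in the
# camera frame, and their assumed ground-truth XYZ values.
T1, T2, T3 = [0.9, 0.9, 0.7], [0.5, 0.3, 0.2], [0.2, 0.3, 0.6]
G1, G2, G3 = [0.95, 1.0, 1.09], [0.45, 0.3, 0.15], [0.15, 0.25, 0.7]

img = np.tile(np.asarray(T1, float), (4, 4, 1))   # a flat patch of color T1
out = apply_three_color_balancing(img, T1, T2, T3, G1, G2, G3)
print(np.allclose(out[0, 0], np.asarray(G1)))     # T1 is mapped onto G1
```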

4. Experiment

We conducted experiments to confirm the effectiveness of the proposed method.

4.1. Evaluation of Reducing Lighting Effects

In this experiment, the effectiveness of three-color balancing with the recommended combinations was demonstrated by using various colors and lighting conditions, where these experimental conditions were different from those in Section 3.3. Figure 7 shows images taken under three different light sources, where 10 more color regions were added to the color rendition chart in Figure 2, and the images included 34 color regions numbered from 1 to 34. Additionally, the lighting condition in Figure 7a is the same as that of Figure 2a, which means that the color regions numbered from 1 to 24 in Figure 7a correspond to those of Figure 2a.
Representative RGB values were computed as the mean-pixel value of a color region in Figure 7. Figure 8 and Figure 9 show images adjusted by using various methods.
The proposed method was applied to the input images and compared with white balancing and Cheng's method (Cheng (1–24)), where XYZ scaling (WB (XYZ)) and von Kries's model (WB (von Kries)) were applied as white balance adjustments. Target colors were selected from the color rendition chart indexed from 1 to 24 in Figure 7b,c. For white balancing, the white region (index 6) was chosen. Additionally, for Cheng's method, all of the 24 color regions were used as target colors. For the proposed method, two combinations of three target colors determined in Section 3.3 were selected: (6, 9, 14) and (11, 14, 24), referred to as 3CB (6, 9, 14) and 3CB (11, 14, 24), respectively.
In Table 2 and Table 3, the reproduction angular errors from Equation (10) are shown to objectively compare the corrected images. From the tables, the proposed method had almost the same performance as Cheng's method with 24 target colors in terms of the mean value and standard deviation. Additionally, the proposed method and Cheng's method outperformed conventional white balance adjustments, although the mean value of 3CB (6, 9, 14) had a higher error than that of WB (von Kries) in Table 2. Therefore, the effectiveness of the proposed method for color constancy correction was confirmed.

4.2. Evaluation of Computational Complexity

In Cheng's method, all of the 24 patches in a color rendition chart are used as target colors. Additionally, when $n \geq 4$, the computational complexity of Cheng's method increases due to the use of Equation (8). In contrast, that of three-color balancing is low because Equation (8) is not required.
To evaluate the complexity, we implemented white balancing with XYZ scaling, three-color balancing, and Cheng’s method on a computer with a 3.6 GHz processor and a main memory of 16 Gbytes (see Table 4).
Note that the runtime of the equations shown in Section 2.2, Section 2.3, and Section 3.2 was only measured for white balancing, Cheng’s method, and three-color balancing, respectively. In other words, for example, image loading runtimes were excluded from the measurement. Figure 10 shows the runtime result in the case that each method was applied to the 51 test images in Section 3.3.
As shown in Figure 10, while the proposed three-color balancing and white balancing had almost the same average runtime, Cheng’s method required longer runtimes to minimize error. Moreover, in Cheng’s method, if the number of target colors increases, it will result in a long runtime.

5. Discussion on the Rank of T and G

In the proposed three-color balancing, when T and G have full rank, M 3 CB can be designed as in Equation (14). If T and G violate the rank constraints, their eigenvalues will include an almost zero value due to rank deficiency [42]. Figure 11 shows the relationship between three eigenvalues of T or G and the performance of three-color balancing under various combinations of three colors. In the figure, for visualization, the product of the three eigenvalues was calculated for the horizontal axis:
$$\lambda_T = |\lambda_{T_1}| \times |\lambda_{T_2}| \times |\lambda_{T_3}| \quad \text{and} \quad \lambda_G = |\lambda_{G_1}| \times |\lambda_{G_2}| \times |\lambda_{G_3}|, \quad (15)$$
where λ T 1 ,   λ T 2 , and λ T 3 denote eigenvalues calculated from T , and λ G 1 ,   λ G 2 , and λ G 3 denote those from G . λ T was calculated by using various combinations of three target colors selected from Figure 2a, and λ G was calculated by using corresponding ground truth colors from Figure 2b.
In a manner like Figure 3, the performance of three-color balancing for every combination was evaluated in terms of the mean reproduction error for the 24 patches. From the figure, we confirmed that if one eigenvalue is nearly zero, i.e., the product of the eigenvalues is nearly zero, three-color balancing results in a high mean $Err$ value. This means that the rank deficiency caused by a combination of linearly dependent color vectors significantly degrades the overall color correction performance. Thus, the following results were verified:
(i)
If the product of eigenvalues is small, the mean value of reproduction errors will be high.
(ii)
There are many combinations of three target colors at which three-color balancing results in a high error.
(iii)
Analysis based on eigenvalues enables us to select three target colors without a large dataset as in Section 3.3.
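The eigenvalue-product screening of Equation (15) can be sketched as below; the two 3×3 color matrices are constructed examples (one well spread, one nearly collinear), not combinations from Figure 2a.

```python
import numpy as np

def eigenvalue_product(T):
    """Equation (15): product of the absolute eigenvalues of a 3x3 color
    matrix. A near-zero value signals rank deficiency, i.e., three nearly
    linearly dependent colors that make Equation (14) ill-conditioned."""
    return float(np.prod(np.abs(np.linalg.eigvals(T))))

# A well-spread combination of three colors (columns)...
T_good = np.array([[0.9, 0.1, 0.2],
                   [0.1, 0.8, 0.2],
                   [0.1, 0.1, 0.9]])
# ...versus a nearly collinear one: the third color is almost the
# sum of the first two, so the matrix is close to rank 2.
T_bad = T_good.copy()
T_bad[:, 2] = T_good[:, 0] + T_good[:, 1] + 1e-6

print(eigenvalue_product(T_good))   # comfortably away from zero
print(eigenvalue_product(T_bad))    # near zero: avoid such a trio
```

This check needs only the candidate colors themselves, which is why it can replace the large-dataset search of Section 3.3 when pre-screening combinations.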

6. Conclusions

In this paper, we proposed a three-color balance adjustment for color constancy. While conventional white balancing cannot perfectly adjust colors other than white, multi-color balancing, including three-color balancing, improves color constancy correction by using multiple colors. Additionally, this paper discussed the choice of the number n of target colors and the selection of three target colors for the proposed three-color balancing. When the number of target colors is three or more, the best performance of color constancy correction is almost the same regardless of n. Additionally, for three-color balancing, a combination of three target colors that achieves a low reproduction error was determined with a large dataset. Moreover, no error minimization algorithm is required by the proposed method, which contributes to reducing computational complexity. Experimental results indicated that the proposed three-color balancing not only outperformed conventional white balancing but also had almost the same performance as multi-color balancing with 24 target colors.

Author Contributions

Conceptualization, T.A., Y.K., S.S. and H.K.; methodology, T.A.; software, T.A. and Y.K.; validation, T.A., Y.K., S.S. and H.K.; formal analysis, T.A., Y.K., S.S. and H.K.; investigation, T.A.; resources, Y.K., S.S. and H.K.; data curation, T.A.; writing—original draft preparation, T.A.; writing—review and editing, H.K.; visualization, T.A. and H.K.; supervision, Y.K., S.S. and H.K.; project administration, H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lambert, J. Photometria, Sive de Mensura et Gradibus Luminis, Colorum et Umbrae; Klett: Augsburg, Germany, 1760. [Google Scholar]
  2. Seo, K.; Kinoshita, Y.; Kiya, H. Deep Retinex Network for Estimating Illumination Colors with Self-Supervised Learning. In Proceedings of the IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech), Nara, Japan, 9–11 March 2021; pp. 1–5. [Google Scholar]
  3. Chien, C.C.; Kinoshita, Y.; Shiota, S.; Kiya, H. A retinex-based image enhancement scheme with noise aware shadow-up function. In Proceedings of the International Workshop on Advanced Image Technology (IWAIT), Singapore, 6–9 January 2019; Volume 11049, pp. 501–506. [Google Scholar]
  4. Fairchild, M.D. Color Appearance Models, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2013. [Google Scholar]
  5. Cheng, D.; Price, B.; Cohen, S.; Brown, M.S. Beyond White: Ground Truth Colors for Color Constancy Correction. In Proceedings of the IEEE Conference on Computer Vision (ICCV), Santiago, Chile, 13–16 December 2015; pp. 298–306. [Google Scholar]
  6. Afifi, M.; Brown, M.S. What Else Can Fool Deep Learning? Addressing Color Constancy Errors on Deep Neural Network Performance. In Proceedings of the IEEE Conference on Computer Vision (ICCV), Seoul, Korea, 27 October–2 November 2019; pp. 243–252. [Google Scholar]
  7. Kinoshita, Y.; Kiya, H. Scene Segmentation-Based Luminance Adjustment for Multi-Exposure Image Fusion. IEEE Trans. Image Process. 2019, 28, 4101–4116. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Akimoto, N.; Zhu, H.; Jin, Y.; Aoki, Y. Fast Soft Color Segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 8274–8283. [Google Scholar]
  9. Kinoshita, Y.; Kiya, H. Hue-Correction Scheme Based on Constant-Hue Plane for Deep-Learning-Based Color-Image Enhancement. IEEE Access 2020, 8, 9540–9550. [Google Scholar] [CrossRef]
  10. Kinoshita, Y.; Kiya, H. Hue-correction scheme considering CIEDE2000 for color-image enhancement including deep-learning-based algorithms. APSIPA Trans. Signal Inf. Process. 2020, 9, e19. [Google Scholar] [CrossRef]
  11. Land, E.H.; McCann, J.J. Lightness and Retinex Theory. J. Opt. Soc. Am. 1971, 61, 1–11. [Google Scholar] [CrossRef] [PubMed]
  12. Buchsbaum, G. A spatial processor model for object colour perception. J. Frankl. Inst. 1980, 310, 1–26. [Google Scholar] [CrossRef]
  13. van de Weijer, J.; Gevers, T.; Gijsenij, A. Edge-Based Color Constancy. IEEE Trans. Image Process. 2007, 16, 2207–2214. [Google Scholar] [CrossRef] [Green Version]
  14. Cheng, D.; Prasad, D.K.; Brown, M.S. Illuminant estimation for color constancy: Why spatial-domain methods work and the role of the color distribution. J. Opt. Soc. Am. 2014, 31, 1049–1058. [Google Scholar] [CrossRef]
  15. Afifi, M.; Punnappurath, A.; Finlayson, G.D.; Brown, M.S. As-projective-as-possible bias correction for illumination estimation algorithms. J. Opt. Soc. Am. 2019, 36, 71–78. [Google Scholar] [CrossRef]
  16. Barron, J.T.; Tsai, Y.T. Fast Fourier Color Constancy. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 886–894. [Google Scholar]
  17. Afifi, M.; Price, B.; Cohen, S.; Brown, M.S. When Color Constancy Goes Wrong: Correcting Improperly White-Balanced Images. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019; pp. 1535–1544. [Google Scholar]
  18. Afifi, M.; Brown, M.S. Deep White-Balance Editing. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 1397–1406. [Google Scholar]
  19. Hernandez-Juarez, D.; Parisot, S.; Busam, B.; Leonardis, A.; Slabaugh, G.; McDonagh, S. A Multi-Hypothesis Approach to Color Constancy. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 2270–2280.
  20. von Kries, J. Beitrag zur Physiologie der Gesichtsempfindung. Arch. Anat. Physiol. 1878, 2, 503–524.
  21. Lam, K.M. Metamerism and Colour Constancy. Ph.D. Thesis, University of Bradford, Bradford, UK, 1985.
  22. Moroney, N.; Fairchild, M.D.; Hunt, R.W.; Li, C.; Luo, M.R.; Newman, T. The CIECAM02 Color Appearance Model. In Proceedings of the Color and Imaging Conference, Scottsdale, AZ, USA, 12–15 November 2002; pp. 23–27.
  23. Li, C.; Li, Z.; Wang, Z.; Xu, Y.; Luo, M.R.; Cui, G.; Melgosa, M.; Brill, M.H.; Pointer, M. Comprehensive color solutions: CAM16, CAT16, and CAM16-UCS. Color Res. Appl. 2017, 42, 703–718.
  24. Finlayson, G.D.; Drew, M.S.; Funt, B.V. Spectral sharpening: Sensor transformations for improved color constancy. J. Opt. Soc. Am. 1994, 11, 1553–1563.
  25. Finlayson, G.D.; Drew, M.S.; Funt, B.V. Color constancy: Enhancing von Kries adaption via sensor transformations. In Proceedings of Human Vision, Visual Processing, and Digital Display IV, San Jose, CA, USA, 1–4 February 1993; Volume 1913, pp. 473–484.
  26. Finlayson, G.D.; Drew, M.S. Constrained least-squares regression in color spaces. J. Electron. Imaging 1997, 6, 484–493.
  27. Susstrunk, S.E.; Holm, J.M.; Finlayson, G.D. Chromatic adaptation performance of different RGB sensors. In Proceedings of Color Imaging: Device-Independent Color, Color Hardcopy, and Graphic Arts VI, San Jose, CA, USA, 23–26 January 2001; Volume 4300, pp. 172–183.
  28. Chong, H.; Zickler, T.; Gortler, S. The von Kries Hypothesis and a Basis for Color Constancy. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Rio de Janeiro, Brazil, 14–21 October 2007; pp. 1–8.
  29. Funt, B.; Jiang, H. Nondiagonal color correction. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Barcelona, Spain, 14–17 September 2003; pp. 481–484.
  30. Huang, C.; Huang, D. A study of non-diagonal models for image white balance. In Proceedings of Image Processing: Algorithms and Systems XI, Burlingame, CA, USA, 4–7 February 2013; Volume 8655, pp. 384–395.
  31. Akazawa, T.; Kinoshita, Y.; Kiya, H. Multi-Color Balance for Color Constancy. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Anchorage, AK, USA, 19–22 September 2021; pp. 1369–1373.
  32. CIE. Commission Internationale de l’Éclairage Proceedings; Cambridge University Press: Cambridge, UK, 1932.
  33. Chromatic Adaptation. Available online: http://www.brucelindbloom.com/index.html?Eqn_ChromAdapt.html (accessed on 11 December 2020).
  34. Harville, D.A. Matrix Algebra from a Statistician’s Perspective; Springer: Berlin/Heidelberg, Germany, 1997.
  35. Funt, B.; Bastani, P. Intensity Independent RGB-to-XYZ Colour Camera Calibration. In Proceedings of the International Colour Association, Taipei, Taiwan, 22–25 September 2012; pp. 128–131.
  36. Finlayson, G.D.; Zakizadeh, R. Reproduction Angular Error: An Improved Performance Metric for Illuminant Estimation. In Proceedings of the British Machine Vision Conference, Nottingham, UK, 1–5 September 2014.
  37. Finlayson, G.; Funt, B.; Barnard, K. Color constancy under varying illumination. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Cambridge, MA, USA, 20–23 June 1995; pp. 720–725.
  38. Kawakami, R.; Ikeuchi, K. Color estimation from a single surface color. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Miami, FL, USA, 22–24 June 2009; pp. 635–642.
  39. Hirose, S.; Takemura, K.; Takamatsu, J.; Suenaga, T.; Kawakami, R.; Ogasawara, T. Surface color estimation based on inter- and intra-pixel relationships in outdoor scenes. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), San Francisco, CA, USA, 13–18 June 2010; pp. 271–278.
  40. Kinoshita, Y.; Kiya, H. Separated-Spectral-Distribution Estimation Based on Bayesian Inference with Single RGB Camera. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Anchorage, AK, USA, 19–22 September 2021; pp. 1379–1383.
  41. Hemrit, G.; Finlayson, G.D.; Gijsenij, A.; Gehler, P.; Bianco, S.; Funt, B.; Drew, M.; Shi, L. Rehabilitating the ColorChecker Dataset for Illuminant Estimation. Color Imaging Conf. 2018, 2018, 350–353.
  42. Lay, D.C. Linear Algebra and Its Applications, 4th ed.; Pearson: London, UK, 2012.
Figure 1. Imaging pipeline of RGB digital camera.
Figure 2. Color rendition charts under different light sources. (a) Ground truth color rendition chart, (b) color rendition chart under artificial light source, and (c) color rendition chart under daylight.
Figure 3. Box plots of mean values of reproduction errors for 24 patches. (a) Image in Figure 2b. (b) Image in Figure 2c. Boxes span from the first quartile (Q1) to the third quartile (Q3). Whiskers show the maximum and minimum values within the range [Q1 − 1.5(Q3 − Q1), Q3 + 1.5(Q3 − Q1)]. The red band inside each box indicates the median.
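The whisker rule stated in the box-plot captions can be computed directly from the data. A minimal sketch (not from the paper; function and variable names are ours):

```python
import numpy as np

def box_whiskers(values):
    """Whisker positions used in the box plots: the most extreme data
    points lying within [Q1 - 1.5*IQR, Q3 + 1.5*IQR], where IQR = Q3 - Q1."""
    v = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(v, [25, 75])
    iqr = q3 - q1
    lower = v[v >= q1 - 1.5 * iqr].min()  # smallest point inside the fence
    upper = v[v <= q3 + 1.5 * iqr].max()  # largest point inside the fence
    return lower, upper
```

Points falling outside the returned range would be drawn as outliers.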
Figure 4. Overview of proposed method.
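As described in the abstract, when exactly three target colors are used, the matrix mapping them to their ground-truth colors is determined by a square linear system, so no error minimization is needed. A minimal sketch of this idea (assuming linearly independent target colors; names are ours, not the paper's):

```python
import numpy as np

def three_color_balance_matrix(targets, truths):
    """3x3 correction matrix M satisfying M @ t_i = g_i for the three
    target colors t_i and their ground-truth colors g_i.

    targets, truths: (3, 3) arrays whose COLUMNS are RGB colors.
    With exactly three linearly independent target colors, M @ T = G is
    a square system solved exactly by M = G @ inv(T)."""
    T = np.asarray(targets, dtype=float)
    G = np.asarray(truths, dtype=float)
    return G @ np.linalg.inv(T)

def apply_balance(image, M):
    """Apply M to every RGB pixel of an (H, W, 3) float image."""
    return np.clip(image @ M.T, 0.0, 1.0)
```

With more than three target colors the system is overdetermined and requires least-squares fitting; three colors make the solution exact and cheap.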
Figure 5. Example of the selected three colors, which are highlighted by yellow squares. Each patch is numbered from 1 to 24, and patches indexed by 6 (white), 9 (red), and 14 (yellow green) were selected.
Figure 6. Box plot of three-color balancing with five combinations of target colors. Boxes span from the first quartile (Q1) to the third quartile (Q3). Whiskers show the maximum and minimum values within the range [Q1 − 1.5(Q3 − Q1), Q3 + 1.5(Q3 − Q1)]. The band inside each box indicates the median.
Figure 7. Images taken under three different light sources. (a) Ground truth image, (b) image taken under warm white light, and (c) image taken under daylight.
Figure 8. Images adjusted from Figure 7b. (a) WB (XYZ), (b) WB (von Kries), (c) 3CB (6, 9, 14), (d) 3CB (11, 14, 24), and (e) Cheng (1–24).
Figure 9. Images adjusted from Figure 7c. (a) WB (XYZ), (b) WB (von Kries), (c) 3CB (6, 9, 14), (d) 3CB (11, 14, 24), and (e) Cheng (1–24).
Figure 10. Runtimes for applying white balancing, three-color balancing, and Cheng's method to 51 images, averaged over three runs. Target-color counts of 1, 3, and 4 or more correspond to white balancing, three-color balancing, and Cheng's method, respectively.
Figure 11. Product of the three eigenvalues versus mean reproduction error (Err) over 24 patches. (a) Eigenvalues calculated from T, designed under all C(24, 3) = 2024 combinations of three target colors. (b) Eigenvalues calculated from G, designed from the corresponding ground truth colors.
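Figure 11 relates the product of the three eigenvalues of the 3 × 3 matrices built from the chosen colors to the resulting error. Since the product of a matrix's eigenvalues equals its determinant, a value near zero indicates nearly linearly dependent target colors and hence an ill-conditioned correction. A small sketch of this computation (our names, not the paper's):

```python
import numpy as np

def eigenvalue_product(colors):
    """Product of the three eigenvalues of the 3x3 matrix whose columns
    are the chosen RGB colors. Equal to det(T); near-zero values mean
    the three colors are close to linearly dependent."""
    T = np.asarray(colors, dtype=float)
    # Complex eigenvalues of a real matrix come in conjugate pairs,
    # so their product is real up to rounding error.
    return float(np.prod(np.linalg.eigvals(T)).real)
```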
Table 1. Mean (Mean) and standard deviation (Std) of Err values for every patch (deg). Each color index corresponds to those in Figure 5.
| Color Index | 3CB (6, 9, 14) Mean / Std | Cheng (1–24) Mean / Std | WB (XYZ) Mean / Std |
|---|---|---|---|
| 1 | 2.5141 / 2.4884 | 2.1365 / 2.0672 | 1.5416 / 1.4212 |
| 2 | 1.3788 / 1.0894 | 1.2562 / 0.7173 | 0.9078 / 0.6043 |
| 3 | 0.7882 / 0.6443 | 0.4805 / 0.3028 | 0.5971 / 0.4157 |
| 4 | 0.4329 / 0.2943 | 0.7205 / 0.6967 | 0.3046 / 0.1843 |
| 5 | 0.3216 / 0.2180 | 0.7507 / 0.7153 | 0.2194 / 0.1317 |
| 6 | 0.0000 / 0.0000 | 0.7220 / 0.8707 | 0.0000 / 0.0000 |
| 7 | 10.8271 / 2.5743 | 9.8906 / 2.9990 | 17.4603 / 2.4351 |
| 8 | 2.5571 / 1.4876 | 2.4314 / 1.6288 | 10.1905 / 1.8228 |
| 9 | 0.0000 / 0.0000 | 1.3855 / 1.3684 | 7.6581 / 2.1020 |
| 10 | 2.9289 / 0.9281 | 2.4285 / 1.3505 | 9.6169 / 1.3484 |
| 11 | 1.6106 / 1.5112 | 0.9028 / 1.0507 | 4.9940 / 1.2577 |
| 12 | 3.6375 / 1.6300 | 2.7855 / 1.5475 | 7.9601 / 2.1248 |
| 13 | 3.0252 / 0.9058 | 2.4637 / 0.9592 | 9.5734 / 1.2001 |
| 14 | 0.0000 / 0.0000 | 0.7588 / 0.5271 | 11.3374 / 1.5411 |
| 15 | 6.2178 / 1.4381 | 5.4511 / 1.3088 | 10.3233 / 1.5883 |
| 16 | 1.6448 / 0.8870 | 1.7227 / 0.6843 | 5.6875 / 1.2939 |
| 17 | 4.0888 / 1.3687 | 3.5478 / 1.4519 | 10.2029 / 1.4913 |
| 18 | 3.1200 / 0.9860 | 2.5865 / 1.1576 | 7.4177 / 1.0589 |
| 19 | 2.0731 / 0.6608 | 2.0568 / 0.9368 | 5.6920 / 1.2514 |
| 20 | 2.9204 / 0.8567 | 3.4227 / 0.9647 | 5.4026 / 0.7180 |
| 21 | 4.1982 / 1.6043 | 3.4816 / 1.2573 | 8.1818 / 0.9399 |
| 22 | 4.2825 / 2.0237 | 3.8826 / 1.8114 | 3.6448 / 1.1974 |
| 23 | 2.7941 / 1.2718 | 2.0937 / 0.8969 | 6.9380 / 0.8475 |
| 24 | 1.5294 / 1.1186 | 1.3261 / 0.8383 | 3.1651 / 0.4340 |
| Total Average | 2.6205 / 1.0828 | 2.4452 / 1.1712 | 6.2090 / 1.1421 |
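The Err metric reported in Tables 1–3 is the reproduction angular error of Finlayson and Zakizadeh [36]. A hedged sketch of its standard formulation follows (how the paper evaluates it per color patch may differ in detail; names are ours):

```python
import numpy as np

def reproduction_angular_error(est, truth):
    """Reproduction angular error in degrees: the angle between the
    'reproduced white' truth/est (element-wise division) and the ideal
    achromatic direction (1, 1, 1)/sqrt(3)."""
    w = np.asarray(truth, dtype=float) / np.asarray(est, dtype=float)
    cos = w.sum() / (np.linalg.norm(w) * np.sqrt(3.0))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
```

Unlike the recovery angular error, this metric is invariant to the overall scale of the estimate, which is why a perfectly corrected patch scores exactly 0.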
Table 2. Reproduction angular error (Err) for every object (deg) in Figure 7b. Color indices correspond to those in Figure 7. Indices in parentheses indicate the target colors used in each adjustment.
| Color Index | Pre-Correction | WB (XYZ) (6) | WB (von Kries) (6) | 3CB (6, 9, 14) | 3CB (11, 14, 24) | Cheng (1–24) |
|---|---|---|---|---|---|---|
| 1 | 21.3566 | 0.6031 | 0.5168 | 0.7235 | 1.9737 | 0.3500 |
| 2 | 21.0226 | 0.6441 | 0.5657 | 0.8669 | 1.6121 | 0.0000 |
| 3 | 21.1741 | 0.6068 | 0.5224 | 0.8709 | 1.6202 | 0.0870 |
| 4 | 21.1792 | 0.4491 | 0.3957 | 0.6052 | 1.8573 | 0.2267 |
| 5 | 21.3249 | 0.1997 | 0.1569 | 0.3129 | 2.1111 | 0.4601 |
| 6 | 21.3115 | 0.0000 | 0.0000 | 0.0000 | 2.3715 | 0.6490 |
| 7 | 18.7388 | 5.5467 | 1.4326 | 6.1466 | 3.9072 | 1.7944 |
| 8 | 10.4722 | 3.3503 | 5.1517 | 7.6731 | 8.2366 | 0.7390 |
| 9 | 10.3560 | 0.8337 | 0.6933 | 0.0000 | 1.6542 | 2.7531 |
| 10 | 5.8589 | 9.1107 | 9.4896 | 2.5561 | 3.1016 | 8.3993 |
| 11 | 25.5364 | 2.6138 | 3.7142 | 1.9130 | 0.0000 | 1.1718 |
| 12 | 22.3124 | 7.7740 | 6.0355 | 10.8480 | 7.3155 | 5.9673 |
| 13 | 4.9612 | 8.1911 | 8.6001 | 3.0538 | 4.0687 | 8.4872 |
| 14 | 6.4692 | 8.7257 | 9.2496 | 0.0000 | 0.0000 | 6.6441 |
| 15 | 29.2100 | 1.5731 | 2.2792 | 4.1710 | 1.1836 | 0.6999 |
| 16 | 15.3806 | 2.4103 | 2.3149 | 1.8067 | 1.6961 | 1.1918 |
| 17 | 25.9012 | 4.6039 | 1.6651 | 7.3691 | 4.3680 | 2.2146 |
| 18 | 5.7546 | 4.3316 | 4.7537 | 0.5211 | 1.9654 | 5.5090 |
| 19 | 15.1058 | 2.3229 | 1.9819 | 5.3252 | 5.2026 | 1.9077 |
| 20 | 16.3699 | 1.5752 | 1.0558 | 3.2633 | 3.6401 | 0.8342 |
| 21 | 25.3728 | 4.2875 | 2.5926 | 7.4420 | 3.8807 | 2.6824 |
| 22 | 14.5741 | 0.8098 | 1.6973 | 7.7538 | 8.2749 | 2.4076 |
| 23 | 26.6287 | 2.4779 | 1.3060 | 6.0191 | 2.7302 | 1.5043 |
| 24 | 19.9906 | 3.9184 | 3.9094 | 3.4344 | 0.0000 | 2.1738 |
| 25 | 6.2529 | 9.0649 | 9.4082 | 2.0700 | 2.4467 | 8.0168 |
| 26 | 11.9770 | 3.1357 | 2.9722 | 4.0141 | 2.5103 | 1.0617 |
| 27 | 19.8988 | 7.5012 | 4.1785 | 9.5476 | 6.9960 | 4.9360 |
| 28 | 10.9416 | 9.0790 | 10.3906 | 3.7961 | 3.0529 | 4.8839 |
| 29 | 17.2530 | 7.2625 | 3.8732 | 8.6359 | 6.3031 | 4.4513 |
| 30 | 25.1127 | 2.6489 | 0.9564 | 6.6488 | 4.6798 | 2.7824 |
| 31 | 6.2739 | 10.9320 | 11.4670 | 0.9847 | 0.6948 | 8.0793 |
| 32 | 21.7234 | 2.7743 | 3.0355 | 0.1029 | 1.4113 | 1.4194 |
| 33 | 25.9394 | 3.8417 | 2.1041 | 4.7243 | 2.1929 | 0.5683 |
| 34 | 16.6815 | 11.8302 | 9.5212 | 14.5191 | 11.5521 | 9.7971 |
| Mean | 17.3064 | 4.2656 | 3.7643 | 4.0506 | 3.3709 | 3.0838 |
| Std | 7.1416 | 3.4288 | 3.4384 | 3.6313 | 2.6588 | 2.9065 |
Table 3. Reproduction angular error (Err) for every object (deg) in Figure 7c. Color indices correspond to those in Figure 7. Indices in parentheses indicate the target colors used in each adjustment.
| Color Index | Pre-Correction | WB (XYZ) (6) | WB (von Kries) (6) | 3CB (6, 9, 14) | 3CB (11, 14, 24) | Cheng (1–24) |
|---|---|---|---|---|---|---|
| 1 | 0.8814 | 6.3543 | 6.2129 | 5.5332 | 5.7248 | 5.6021 |
| 2 | 4.4683 | 0.7990 | 0.7815 | 0.7063 | 1.0852 | 0.6697 |
| 3 | 5.1001 | 0.2931 | 0.2865 | 0.2713 | 0.6299 | 0.1274 |
| 4 | 5.1878 | 0.2623 | 0.2638 | 0.2576 | 0.5557 | 0.1277 |
| 5 | 5.2214 | 0.2175 | 0.2176 | 0.2181 | 0.4927 | 0.1185 |
| 6 | 5.1013 | 0.0000 | 0.0000 | 0.0000 | 0.3959 | 0.1461 |
| 7 | 2.1857 | 3.2256 | 1.5804 | 0.2953 | 1.1222 | 0.1872 |
| 8 | 2.4892 | 1.7274 | 2.8237 | 1.2644 | 1.9037 | 1.1336 |
| 9 | 8.6557 | 6.8164 | 6.8918 | 0.0000 | 2.0909 | 0.6231 |
| 10 | 2.9023 | 1.9264 | 1.3637 | 2.1141 | 2.6862 | 1.9947 |
| 11 | 10.1624 | 4.8819 | 5.3403 | 1.0879 | 0.0000 | 1.4796 |
| 12 | 3.2927 | 3.9580 | 3.3680 | 0.3680 | 0.4023 | 0.5244 |
| 13 | 2.9283 | 1.8933 | 1.4585 | 3.0098 | 3.9102 | 2.7644 |
| 14 | 2.2600 | 0.6336 | 0.3951 | 0.0000 | 0.0000 | 0.0139 |
| 15 | 6.0898 | 1.6595 | 2.3525 | 1.7127 | 1.9511 | 1.8289 |
| 16 | 7.6931 | 4.4924 | 4.5682 | 0.8175 | 2.1341 | 0.3307 |
| 17 | 1.7278 | 4.0290 | 2.9039 | 1.6969 | 2.2152 | 1.7242 |
| 18 | 5.8159 | 4.4339 | 4.2020 | 0.4470 | 1.6816 | 0.0214 |
| 19 | 7.3147 | 4.1785 | 4.0182 | 1.1471 | 0.3458 | 1.3105 |
| 20 | 4.8872 | 1.2518 | 1.1024 | 1.1486 | 1.7719 | 0.9867 |
| 21 | 4.8807 | 1.8282 | 1.2599 | 0.6607 | 0.8241 | 0.5465 |
| 22 | 6.3007 | 3.4931 | 3.6341 | 3.8759 | 4.0238 | 3.8070 |
| 23 | 5.8905 | 0.6857 | 0.5215 | 0.7623 | 0.6639 | 0.7583 |
| 24 | 3.1728 | 2.4708 | 2.5870 | 0.5151 | 0.0000 | 0.6320 |
| 25 | 3.0191 | 1.8642 | 1.1719 | 1.4724 | 1.9303 | 1.4009 |
| 26 | 12.4886 | 11.0326 | 11.0700 | 4.3092 | 2.7438 | 4.9001 |
| 27 | 2.4211 | 6.1262 | 4.9035 | 3.3014 | 3.7525 | 3.3507 |
| 28 | 0.3467 | 4.6077 | 4.9793 | 2.9638 | 2.2566 | 2.9399 |
| 29 | 4.4097 | 1.9139 | 1.1850 | 2.4622 | 2.0271 | 2.4150 |
| 30 | 9.8741 | 4.2972 | 4.6985 | 2.3209 | 2.5076 | 2.1553 |
| 31 | 0.2509 | 1.6257 | 2.6189 | 0.4321 | 0.3443 | 0.5715 |
| 32 | 12.6669 | 8.5415 | 8.8171 | 3.2976 | 2.2430 | 3.8179 |
| 33 | 1.1494 | 4.0987 | 2.5416 | 1.7988 | 2.5675 | 1.7067 |
| 34 | 3.2594 | 2.4334 | 1.7380 | 1.7225 | 1.0925 | 1.6765 |
| Mean | 4.8381 | 3.1780 | 2.9958 | 1.5291 | 1.7081 | 1.5410 |
| Std | 3.1739 | 2.5181 | 2.5712 | 1.3827 | 1.3281 | 1.4438 |
Table 4. Machine specs used for evaluating runtime.
| Processor | Intel Core i7-7700 3.60 GHz |
|---|---|
| Memory | 16 GB |
| OS | Windows 10 Home |
| Software | MATLAB R2020a |
Share and Cite

MDPI and ACS Style

Akazawa, T.; Kinoshita, Y.; Shiota, S.; Kiya, H. Three-Color Balancing for Color Constancy Correction. J. Imaging 2021, 7, 207. https://doi.org/10.3390/jimaging7100207
