1. Introduction
In modern digital photogrammetry, images taken with digital cameras are generally used to model objects in space. Most of these are commercially available digital cameras with lenses and digital sensors of varying quality. It is also important that the object being analysed is evenly illuminated by daylight during the modelling process [
1]. At the same time, the development of photogrammetry directs its attention to new areas that require the use of artificial illumination of objects. So far, the problem of variable illumination of the scene and the photographed object has gone rather unnoticed in the literature [
2].
Experiments on modelling objects in variable lighting were performed by Stech et al. [
3], where they demonstrated that monochromatic light improves modelling results for different surfaces of the modelled objects. The article did not, however, identify the main factors behind this result. The papers in [
4,
5] also note the problem of scene lighting in the context of UAV data acquisition. The work of Roncella et al. [
6] clearly points out the need for modelling in low-light conditions. It mentions such cases as nighttime monitoring and underground surveys. The modelling of structures in mines, caves, or tunnels [
7] can be added here, where this necessarily has to be performed under artificial lighting conditions [
8,
9]. Experiments with nighttime photogrammetry are also presented in [
10], where it was shown that the light spectrum has a significant impact on the auto-calibration process performed by commercial photogrammetric software. That work identified which interior orientation coefficients of the camera are calculated incorrectly, which in turn leads to incorrect positions of points on the reconstructed object. These conclusions were also confirmed in [
11], which describes an experiment carried out under laboratory conditions showing the effect of artificial lighting and its colour on the final photogrammetric model. In the standard feature detection process, the variable light spectrum of the scene is ignored, since a greyscale image is used for identification anyway. This raises the first question: what is the impact of the variable light spectrum of the scene on the localisation of a feature and, consequently, on the photogrammetric process?
Chromatic aberration, one of the defects of optical systems, becomes particularly visible in the variable light spectrum of the scene and is directly dependent on the wavelength [
12]. Standard calibration models used in the photogrammetric process mainly focus on compensating for monochromatic aberrations (Seidel aberrations), particularly distortion. Models that take into account the polychromatic nature of the illumination spectrum are still in the minority [
13]. The paper in [
14] highlights the significant impact of chromatic aberration on the calibration results of a non-metric camera and presents a method for compensating for it. Interestingly, in the experiment presented in that work, the colour of the illumination was changed by placing different filters on the lamp illuminating the object. In this way, the authors directly induced calibration errors solely by changing the wavelength of the light illuminating the object. This experiment proved that the spectrum of the electromagnetic radiation illuminating the object affects the geometric calibration process. These conclusions were also confirmed in [
11,
15,
16], where experiments performed in a laboratory environment were described. In the work of Kaufmann et al. [
17], a method was developed that eliminates the visible chromatic aberration effect in images of typical architectural scenes, which consequently improves the interpretive quality of the study. Separated RGB channels obtained from images of a calibration pattern were used to study the aberrations. In [
18], the authors presented an interesting experiment illustrating how chromatic aberration degrades image quality, and demonstrated a significant dependence on the quality of the lens and photogrammetric camera. To improve precision, they located features separately in the three colour channels and then, in the computational process, treated this information as coming from three separate cameras at a single position. This solution significantly improved the precision of feature point localisation. The authors did not, however, use a variable light spectrum of the scene. A method for the detection and elimination of chromatic aberration using wavelet analysis was proposed by Fryskowska et al. [
19]. As the authors showed, wavelets have the advantage of allowing analysis at multiple resolutions and can quantify the signal's local characteristics in both the spatial and frequency domains. In that work, the authors illuminated the calibration patterns with direct sunlight and did not provide details of the demosaicing method or the greyscale calculation. In the context of correcting chromatic aberration and improving the interpretive quality of real images, the method of Rudakova et al. [
20] also represents a significant achievement in this field. It is worth noting that chromatic aberration changes significantly under underwater imaging conditions, as demonstrated by the studies of Helmholz et al. [
21,
22] and Menna et al. [
23].
In digital photogrammetry, the image is captured by a digital matrix, saved in raw data format (RAW) and later converted to standardised files (e.g., JPG, TIFF) [
24]. Typically, this process, called demosaicing, is performed automatically by the camera software. The majority of modern digital cameras use sensors based on a Bayer filter (CFA—colour filter array) [
25], from which the demosaicing operation produces a colour image consisting of three arrays representing the colours red, green, and blue (RGB). With a CFA, each pixel carries an actual measurement for only one component; the other two must be interpolated from measurements taken at slightly different locations, i.e., neighbouring pixels. The demosaicing process is most easily understood as an extension of interpolation for a greyscale image. Two related interpolation problems must therefore be solved: reconstructing the missing half of the G pixels and reconstructing the missing three-quarters of the R and B pixels (
Figure 1). It is worth noting that the demosaicing problem becomes more complex for new multispectral arrays that use some type of extended Bayer filter with multiple colours, an emerging multispectral filter array (MSFA) type [
26].
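The two interpolation problems described above can be illustrated with a minimal bilinear demosaicing sketch. This is an illustrative implementation, not the paper's method: an RGGB Bayer layout is assumed, and the mask/kernel construction is a textbook formulation of bilinear CFA interpolation.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(raw, _pattern="RGGB"):
    """Bilinear demosaicing of a single-channel Bayer mosaic (RGGB assumed)."""
    h, w = raw.shape
    # Binary masks marking which sensor sites carry each colour component.
    r_mask = np.zeros((h, w)); g_mask = np.zeros((h, w)); b_mask = np.zeros((h, w))
    r_mask[0::2, 0::2] = 1                       # R on even rows, even cols
    g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask[1::2, 1::2] = 1                       # B on odd rows, odd cols
    # Interpolation kernels: G sites have 4 orthogonal G neighbours at missing
    # pixels; R/B are averaged from 2 or 4 neighbours depending on position.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    r = convolve2d(raw * r_mask, k_rb, mode="same")
    g = convolve2d(raw * g_mask, k_g,  mode="same")
    b = convolve2d(raw * b_mask, k_rb, mode="same")
    return np.dstack([r, g, b])
```

At measured sites the kernels return the raw value unchanged; at missing sites they average the nearest neighbours of the corresponding colour, which is exactly the "reconstruct half of G, three-quarters of R and B" problem stated above (border pixels are approximate due to zero padding).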
In the study by Reznicek et al. [
27], the authors analysed the influence of the image characterisation process on the quality of photogrammetric models. It was noted that the demosaicing process affects the number of photogrammetric targets detected, which also depends strongly on the detector settings for such targets; consequently, it results in different quality of photogrammetric measurements. The process of converting RAW to JPG affects many radiometric parameters of the image (contrast, black level, white balance, etc.), so it can be concluded that the radiometric characteristics of such an image will affect feature detectors and, consequently, the subsequent photogrammetric processes. This finding is supported by [
28], where the influence of different algorithms of the demosaicing process on the digital correlation of images was demonstrated. At the scale at which this research was conducted, this influence can be considered significant. In the work of Shortis et al. [
29], the authors also note the influence of the demosaicing process on the quality of the photogrammetric process and propose a new correction approach based on tunable Gaussian filters. Similar conclusions are observed in [
30], which highlights that the demosaicing algorithms used in digital cameras, owing to limited computational resources and a trade-off between quality and speed of data delivery, degrade the data and thus the resulting photogrammetric models. Remarkably, this work shows that using only the green channel, even with the simplest bilinear interpolation algorithm, gave very accurate results. The paper analysed 14 different demosaicing algorithms. Currently, various new demosaicing algorithms based on artificial neural networks are being developed, which can significantly improve image quality [
31,
32,
33,
34].
Since the demosaicing algorithm is itself a kind of interpolation between spectral channels, it follows that the choice of algorithm will affect the values of the individual colour components and, in combination with chromatic aberration, can intensify or weaken its effect. The conclusion is that the selection of the demosaicing algorithm alone can modify the impact of chromatic aberration under different light spectra.
In the photogrammetric process, feature localisation and homologous point detection are extremely important. Depending on its type, a feature detector will mostly use a greyscale image to localise a feature, which entails a data reduction from an RGB image to a greyscale image; alternatively, only one spectral channel, usually green (G), is used to detect the feature [
18]. This operation simplifies subsequent calculations but can affect the quality of feature localisation and of the photogrammetric processing. Such an approach ignores the influence of, for example, chromatic aberration, which becomes apparent especially in the two outer spectral channels, blue and red (B, R), and which has a direct impact on the calibration results of a non-metric camera, particularly for a scene illuminated by an incomplete light spectrum [
20]. For the detection of chromatic aberration in greyscale images, see the method developed by Fryskowska et al. [
19].
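The mechanism can be illustrated numerically. In the deliberately simplified sketch below (an assumption for illustration, not the paper's model), lateral chromatic aberration is approximated as a rigid one-pixel shift of the red channel; the shift survives the weighted greyscale conversion and displaces the effective edge position that a feature detector sees.

```python
import numpy as np

# Ideal step edge, identical in all three channels before aberration.
edge = np.zeros(16)
edge[8:] = 1.0

# Toy aberration model: the red channel is laterally displaced by 1 pixel.
r = np.roll(edge, 1)
g, b = edge, edge

# Rec. 709 weighted greyscale conversion.
grey = 0.2126 * r + 0.7152 * g + 0.0722 * b

# At the original edge position the red contribution is still zero, so an
# intermediate value appears there: the edge in `grey` is no longer a clean
# step, which is the kind of sub-pixel displacement a detector picks up.
```

The same experiment with pure green illumination (R and B near zero) leaves the greyscale edge unchanged, which anticipates the behaviour reported later for green lighting.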
The greyscale image can be calculated using different methods [
35]. Photogrammetric software for feature localisation usually uses the green channel (G), precisely to minimise the negative effect of chromatic aberration and to improve the signal-to-noise ratio (SNR), as this channel has the most favourable ratio [
27]. In the general case, converting a colour image to a greyscale image is a complicated process, and the converted greyscale image may lose the contrasts, sharpness, shadows, and structure of the colour image [
35]. To minimise data loss, researchers have proposed various methods for calculating the new image [
36,
37,
38,
39], and some standards are already present in standardisation documents [
40,
41,
42], mainly based on weighted averages. Thus, according to recommendation 601 [
41], the greyscale can be calculated for each point in the RGB image according to the following formula:
GS = 0.299·R + 0.587·G + 0.114·B,(1)
According to recommendation 2100 [
42], the greyscale is calculated according to the following equation:
GS = 0.2627·R + 0.6780·G + 0.0593·B,(2)
whereas, according to recommendation 709 [
40], the greyscale is calculated according to the following formula:
GS = 0.2126·R + 0.7152·G + 0.0722·B.(3)
In addition, the greyscale can be calculated from the arithmetic mean:
GS = (R + G + B)/3,(4)
or using the average of the maximum and minimum values of RGB:
GS = (max(R, G, B) + min(R, G, B))/2.(5)
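For reference, the five conversion methods just described (the two ITU weighted averages from Recommendations 601, 2100, and 709, the arithmetic mean, and the max/min average) can be written as a compact sketch. The function and method names are illustrative choices, not identifiers from the paper.

```python
import numpy as np

# Greyscale conversion methods; coefficients are those published in
# ITU-R Recommendations BT.601, BT.2100 and BT.709.
GREYSCALE_METHODS = {
    "itu601":  lambda r, g, b: 0.299 * r + 0.587 * g + 0.114 * b,
    "itu2100": lambda r, g, b: 0.2627 * r + 0.6780 * g + 0.0593 * b,
    "itu709":  lambda r, g, b: 0.2126 * r + 0.7152 * g + 0.0722 * b,
    "mean":    lambda r, g, b: (r + g + b) / 3.0,
    "minmax":  lambda r, g, b: (np.maximum(np.maximum(r, g), b)
                                + np.minimum(np.minimum(r, g), b)) / 2.0,
}

def to_greyscale(rgb, method="itu709"):
    """Convert an (..., 3) RGB array to greyscale with the chosen method."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return GREYSCALE_METHODS[method](r, g, b)
```

A saturated red pixel makes the differences between the methods immediately visible: the ITU weights suppress red, while the arithmetic mean and max/min average keep much more of its energy.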
As presented above, every element of the photogrammetric process, from image acquisition under specific lighting conditions through digital development into a JPG file to greyscale calculation, influences feature localisation and, consequently, the quality of photogrammetric models. In this paper, twelve different scene illumination settings, five demosaicing methods, and five greyscale calculation methods were analysed, together with their impact on Harris feature localisation [
43]. The following demosaicing methods were tested in the experiment: nearest neighbour interpolation (neighbour), bilinear interpolation (bilinear), smooth hue transition interpolation (smooth_hue), median-filtered bilinear interpolation (median), and gradient-based interpolation (gradient), described in detail in [
44,
45]. The methods investigated for calculating the greyscale are shown above in Formulas (1)–(5). The lighting settings included polychromatic and monochromatic light, as described in the next section.
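Since the analysis centres on the Harris detector, a minimal corner-response sketch may help fix ideas. This is a generic textbook formulation, not the implementation used in the experiment; a simple box window is used for brevity where practical implementations typically use a Gaussian window.

```python
import numpy as np

def harris_response(img, k=0.04, win=1):
    """Harris corner response R = det(M) - k*trace(M)^2 for a greyscale image."""
    iy, ix = np.gradient(img.astype(float))          # central-difference gradients
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy

    def window_sum(a):
        # Sum of each product image over a (2*win+1)^2 box window.
        p = np.pad(a, win, mode="edge")
        h, w = a.shape
        return sum(p[i:i + h, j:j + w]
                   for i in range(2 * win + 1) for j in range(2 * win + 1))

    sxx, syy, sxy = window_sum(ixx), window_sum(iyy), window_sum(ixy)
    return sxx * syy - sxy * sxy - k * (sxx + syy) ** 2
```

The response is strongly positive at corners (both eigenvalues of the structure matrix large), negative along straight edges, and near zero in flat regions; the sub-pixel position of the response maximum is exactly what the spectrum-dependent processing chain discussed above can shift.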
This article presents the following new contributions in the field of close-range digital photogrammetry:
It is presented how the light spectrum of the modelled scene influences the location of a feature on an image. The influence of both polychromatic and monochromatic illumination is demonstrated. A recommendation on the light spectrum of the modelled scene was formulated.
The impact of demosaicing algorithms on the localisation of a feature on an image of a scene illuminated in a variable spectrum was presented. The algorithm that guarantees the smallest feature deviation was identified.
The effect of greyscale calculation algorithms on the localisation of a feature under different light spectra of the scene is investigated, and a recommended method for greyscale calculation is provided.
It has been shown conclusively, and confirmed by laboratory experiments, that the illumination spectrum of a scene matters in digital photogrammetry, and recommendations concerning the illumination spectrum have been formulated.
3. Results
Figure 4 presents the statistics of the position of all features in the image in relation to the light spectrum.
As the analysis reveals, across all cases the lowest median was achieved for green lighting (median = 0.40 pix), followed by HMI 5600 K (median = 0.44 pix) and Daylight 5000 K (median = 0.46 pix) (
Table 3). This means that for this type of lighting, the Harris feature detection was closest to the reference point. Moreover, green lighting also yields the lowest standard deviation, which indicates highly stable feature detection under this illumination. For green lighting, the effect of lighting colour is the smallest, which also confirms the technical assumption made in [
27], where only the green channel input was used instead of converting the RGB image to greyscale.
Figure 5 shows the analysis of the location of the feature in relation to the light spectrum and the demosaicing method. As can be seen, in each case the demosaicing method resulted in a different location of the examined feature, which confirms its influence on the Harris feature detector. Analysing the position of the points with respect to the demosaicing method only (
Table 4), the lowest median feature position deviation was achieved for the smooth hue demosaicing method under HMI (5600 K) lighting = 0.322 pix, followed by Daylight (5000 K) = 0.363 pix and Green = 0.369 pix. The data for
Figure 5 are presented in tabular form in
Table A1.
Figure 6 presents an analysis of the feature location in relation to the light spectrum and the greyscale calculation method. As can be observed, in each case, the greyscale calculation method resulted in a different location of the feature under study, which proves its effect on the Harris feature detector. When analysing the location of the points with respect to the greyscale calculation method alone (
Table 5), the lowest median deviation of the location of the feature was achieved for the itu709 = 0.59 pix method. The data for
Figure 6 are presented in tabular form in
Table A2.
When the greyscale conversion method was analysed, on average, itu709 achieved the lowest medians (
Table 5) for all cases. However, when analysing the total data grouped into demosaicing and greyscale calculation methods, the absolute lowest median results were achieved for the case of HMI—5600 K (
Table 6) combined with the mean greyscale calculation method. The results in
Table 6 are sorted by median value from lowest to highest; only the first 10 cases are shown. The complete table of results is presented in
Table A3. A closer analysis reveals that, for full-spectrum light, the mean greyscale calculation method achieves the lowest feature deviation values. Since greyscale calculation from a single channel (green) contributes little, and green gave the lowest values only for the monochromatic light type, the statistical results presented in
Table 5 indicate a different best-performing method (itu709). It should be clearly noted that for polychromatic light, the lowest deviation in feature position was achieved with the mean greyscale calculation method. For monochromatic light, the choice of greyscale calculation method is not significant, as the values recorded in the other channels are negligible, so both the weighted and arithmetic means have almost no effect.
It is also worth noting that for polychromatic light, as shown in
Table 6, the influence of the demosaicing and greyscale algorithms is greatest, and their appropriate selection allows the feature position deviation to be reduced significantly. For monochromatic green light, the algorithms have no significant influence; the differences between them are essentially random, and the median values are constant across all algorithms used.
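The grouping behind Tables 3–6 amounts to computing the median deviation per factor level; with the results in long format, this is a straightforward aggregation. The DataFrame below is purely hypothetical example data illustrating the shape of such an analysis, not the paper's measurements.

```python
import pandas as pd

# Hypothetical long-format results: one row per detected feature, with its
# deviation from the reference position (pix) and the processing settings.
df = pd.DataFrame({
    "lighting":  ["Green", "Green", "HMI 5600K", "HMI 5600K",
                  "Daylight 5000K", "Daylight 5000K"],
    "demosaic":  ["smooth_hue", "bilinear"] * 3,
    "greyscale": ["mean", "itu709"] * 3,
    "deviation": [0.40, 0.41, 0.32, 0.45, 0.36, 0.47],   # made-up values
})

# Median deviation per lighting type (the statistic reported in Table 3).
by_light = df.groupby("lighting")["deviation"].median()

# Median per lighting + demosaicing + greyscale combination, sorted from
# lowest to highest (the presentation used in Table 6).
by_combo = (df.groupby(["lighting", "demosaic", "greyscale"])["deviation"]
              .median().sort_values())
```

Sorting the combined grouping directly exposes which lighting/algorithm combination minimises the median deviation, which is how the "lowest median" statements in the text can be reproduced from raw per-feature measurements.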
4. Discussion
This research shows that the light spectrum influences the location of a feature in an image. Furthermore, it demonstrates that the positions of the feature are influenced by all intermediate algorithms used in the photogrammetric process. This effect should be considered in precision digital photogrammetry and during the development of photogrammetric algorithms and applications.
Based on the results presented, some recommendations can be derived in the context of the preferred light spectrum of the scene lighting and the proposed algorithms for demosaicing and greyscale conversion.
In the field of polychromatic light, the most appropriate spectrum would be HMI 5600 K, followed by Daylight 5000 K. This is also confirmed by photogrammetric practice, which recommends performing calibration in the daylight spectrum. If artificial lighting has to be used, a spectrum of 5600 K or 5000 K, which guarantees the lowest feature deviation, should be chosen. It is worth noting that the spectral characteristics of lighting at this colour temperature do not promote any extreme value in the red region. For warmer colours, i.e., below 5000 K, the influence of the red channel in the light spectrum is clearly visible, resulting in a greater feature deviation. The red channel's contribution to the spectrum increases the significance of chromatic aberration and shifts the focal length for a given wavelength towards the red focal point. In subsequent digital processing steps, this phenomenon interferes with the algorithms and results in feature deviation. It can be minimised in a simple way by an appropriate selection of algorithms in the production cycle, as shown in this paper.
Within the polychromatic light domain, the most appropriate choice for the Harris detector would be a demosaicing method based on the smooth hue algorithm and a greyscale calculation method of the mean type. It should be noted that, for polychromatic light, a significant effect of the demosaicing method and of the greyscale calculation was observed. Furthermore, the feature deviation correlates strongly with the CRI (colour rendering index) (
Table 1): the higher the index, the smaller the deviation of the feature; accordingly, for CRIs above 96, the lowest feature deviation was achieved in the cases studied. The use of these algorithms minimises the impact of chromatic aberration and provides the lowest feature deviation among all those studied here.
In the field of monochromatic light, very good results were achieved for green lighting, where the scene was illuminated only with green LEDs with the spectrum presented earlier. Many studies and algorithms use only the green channel for greyscale calculation. The present research confirms that lighting in the green spectrum minimises the impact of chromatic aberration, which could already be deduced from earlier studies; what is new is that such lighting also minimises the impact of the demosaicing and greyscale calculation algorithms. The results for this spectrum were almost identical for all algorithms used. Thus, in cases where colour and RGB reconstruction of the scene are not important, e.g., for purely geometric reconstruction of an object, green lighting appears to be the right choice. Moreover, the Bayer filter mosaic contains the most green pixels, distributed symmetrically across the sensor, so the camera is most sensitive in this channel; this also minimises the impact of the demosaicing algorithm (pixel symmetry) and significantly reduces the impact of chromatic aberration for low-quality lenses. The author believes that these findings are very important for future research and for accurate modelling with digital photogrammetry techniques.
This study was limited to investigating the effect of the light spectrum on the locations of Harris features. The Harris detector is commonly used in photogrammetry, especially in calibration, so its features form a good representative group. Other features used in photogrammetry, especially for the detection of homologous points, may respond differently to the light spectrum. Testing other feature detectors is outside the scope of this study but is planned for the future. This naturally requires a change of test fields and further experiments, but the research plan presented here appears to be suitable for this purpose.
Another limitation of the research is the set of demosaicing and greyscale calculation algorithms used. The author selected the most popular of them, which are, owing to their simplicity, often used in popular software. The point was to prove that the selection of these algorithms influences feature location and cannot be ignored in digital photogrammetry. Furthermore, the author believes that in precision photogrammetry, the correct selection of both the light spectrum and the specific algorithms is important and should be taken into account. The recommendations for the algorithms to be used follow clearly from the results achieved.
The author has not modelled the phenomenon presented in this study, as it depends on a very large number of variables. Determining chromatic aberration correction coefficients for a given lens and a specific camera model, taking into account the algorithms and feature detectors used and, more challenging still, the light spectrum, is beyond the scope of this research. The implementation of correction factors and the development of a new calibration method for non-metric cameras in the context of the light spectrum are planned for future research.
In terms of practical recommendations, it should be noted that current photogrammetric software does not allow the demosaicing and greyscale calculation algorithms to be changed; these are assumed a priori by the developers. Such software is a kind of black box that does not reveal in detail which algorithms it uses, although in most cases they are well known and described. Popular cameras likewise offer no choice of demosaicing algorithm, providing a ready-made JPEG file. Where there is a practical need to improve the quality of precision photogrammetry products, the author recommends applying the algorithms indicated above before the software computes the model: first, a JPEG should be created from RAW using the specified demosaicing method, and then, within the software, the greyscale should be recalculated using the method specified here. The software known to the author does not offer such options, and with this article, manufacturers are encouraged to introduce these changes. In the absence of the possibility of changing the algorithms, the author recommends illuminating the scene with 5600 K or green light, which will minimise feature deviation and improve the quality of the models.
5. Conclusions
The research shows which of the popular demosaicing and greyscale calculation algorithms produce the least feature deviation, in the context of the variable light spectrum of the scene.
The lowest feature deviation in the polychromatic light domain was obtained for light with colour temperatures of 5000 K and 5600 K, which is a practical recommendation for illuminating a scene with artificial light. In this domain, the lowest feature deviation was guaranteed by smooth-hue demosaicing and mean greyscale calculation. From a practical point of view, choosing lighting with a colour temperature between 5000 and 5600 K together with these algorithms will guarantee the lowest feature deviation, a more precise camera calibration and, consequently, a more accurate model.
In the monochromatic light domain, the case is slightly different. The lowest feature deviation was achieved for light in the green spectrum, and the demosaicing and greyscale calculation algorithms had no effect on the feature deviation. From a practical point of view, if the demosaicing and greyscale calculation algorithms cannot be adapted, the scene can be illuminated with green light, which will improve the precision of the feature location and minimise the negative effects of chromatic aberration.
The author did not model the phenomena and does not propose correction methods, as this is beyond the scope of this research, but is planning to carry this out in future research and publish the results. Due to the complex nature of the phenomena and the variety of equipment used in photogrammetry, it seems that modern artificial intelligence methods can play a large role here.