Article

The Effect of Varying the Light Spectrum of a Scene on the Localisation of Photogrammetric Features

by
Pawel Burdziakowski
Department of Geodesy, Faculty of Civil and Environmental Engineering, Gdansk University of Technology, Gabriela Narutowicza 11/12, 80-233 Gdansk, Poland
Remote Sens. 2024, 16(14), 2644; https://doi.org/10.3390/rs16142644
Submission received: 13 June 2024 / Revised: 4 July 2024 / Accepted: 14 July 2024 / Published: 19 July 2024

Abstract

In modern digital photogrammetry, an image is usually registered via a digital matrix with an array of colour filters. From the registration of the image until feature points are detected on the image, the image is subjected to a series of calculations, i.e., demosaicing and conversion to greyscale, among others. These algorithms respond differently to a varying light spectrum of the scene, which consequently changes the localisation of features. In this study, the effect of scene illumination on the localisation of a feature in an image is presented. The demosaicing and greyscale conversion algorithms that produce the largest and smallest deviation of the feature from the reference point were assessed. Twelve different illumination settings, ranging from polychromatic to monochromatic light, were developed and tested, and five different demosaicing algorithms and five different methods of converting a colour image to greyscale were analysed. A total of 300 different cases were examined. As the study shows, the lowest deviation in the polychromatic light domain was achieved for light with a colour temperature of 5600 K and 5000 K, while in the monochromatic light domain, it was achieved for green light. Demosaicing methods have a significant effect on the localisation of a feature: the smallest feature deviation was achieved for smooth hue-type demosaicing, while for greyscale conversion, it was achieved for the mean type. For monochromatic light, the demosaicing and greyscale conversion methods had no effect. The article discusses the problem and concludes with recommendations and suggestions on illuminating the scene with artificial light and on the choice of algorithms, in order to achieve the highest accuracy using photogrammetric methods.

1. Introduction

In modern digital photogrammetry, images taken with digital cameras are generally used to model objects in space. The majority of these cameras are commercially available digital cameras with a variety of lens and digital sensor qualities. It is also important that the object being analysed be evenly illuminated by daylight during the modelling process [1]. At the same time, photogrammetry is expanding into new areas that require artificial illumination of objects. So far, the problem of variable illumination of the scene and the photographed object has gone rather unnoticed in the literature [2].
Experiments on modelling objects under variable lighting were performed by Stech et al. [3], who demonstrated that monochromatic light improves modelling results for different surfaces of the modelled objects; the article did not identify the main factors behind this result. The papers in [4,5] also note the problem of scene lighting in the context of UAV data acquisition. The work of Roncella et al. [6] clearly points out the need for modelling in low-light conditions, mentioning cases such as nighttime monitoring and underground surveys. The modelling of structures in mines, caves, or tunnels [7] can be added here, which necessarily has to be performed under artificial lighting [8,9]. Experiments with nighttime photogrammetry are also presented in [10], where it was shown that the light spectrum has a significant impact on the auto-calibration process performed by commercial photogrammetric software. That work identified which coefficients of the camera's intrinsic orientation elements are incorrectly calculated, which induces incorrect positions of points of the reconstructed object. These conclusions were also confirmed in [11], which describes a laboratory experiment showing the effect of artificial lighting and its colour on the final photogrammetric model. In the standard feature detection process, the variable light spectrum of the scene is ignored, since a greyscale image is used for feature identification anyway. This raises the first question: what is the impact of the variable light spectrum of the scene on the localisation of a feature and, consequently, on the photogrammetric process?
Chromatic aberration, one of the defects of optical systems, becomes particularly visible under a variable light spectrum of the scene and is directly dependent on the wavelength [12]. Standard calibration models used in the photogrammetric process mainly focus on compensating for monochromatic (Seidel) aberrations, particularly distortion. Models that take into account the polychromatic nature of the illumination spectrum are still in the minority [13]. The paper in [14] highlights the significant impact of chromatic aberration on the calibration results of a non-metric camera and presents a method for compensating for this problem. Interestingly, in that experiment, the colour of the illumination was changed by using different filters on the lamp illuminating the object. In this way, the authors directly induced errors, producing inaccuracies in the calibration process solely by changing the wavelength of the light illuminating the object. This experiment proved that the spectrum of the electromagnetic radiation illuminating the object affects the geometric calibration process. These conclusions were also confirmed in [11,15,16], which describe experiments performed in a laboratory environment. In the work of Kaufmann et al. [17], a method was developed that eliminates the visible chromatic aberration effect in images of typical architectural scenes, which consequently improves the interpretive quality of the study; separated RGB channels obtained from each image of a calibration pattern were used to study the aberrations. In [18], the authors presented an interesting experiment that illustrates the image-quality-degrading effect of chromatic aberration and demonstrated its significant dependence on the quality of the lens and photogrammetric camera. To improve the quality, they used an interesting solution: the three channels were used separately to locate the features, and the computational process then used this information by simulating three separate cameras at one position. This solution significantly improved the precision of the location of feature points; however, the authors did not vary the light spectrum of the scene. A method for the detection and elimination of chromatic aberration using wavelet analysis was proposed by Fryskowska et al. [19]. As the authors showed, wavelets have the advantage of allowing analysis at multiple resolutions and can quantify the signal's local characteristics in both the spatial and frequency domains. In that paper, the authors illuminated the calibration matrices with direct sunlight and did not provide details of the demosaicing method or the greyscale calculation. In the context of correcting chromatic aberration and improving the interpretive quality of real images, the method of Rudakova et al. [20] also presents a significant achievement in this field. It is worth noting that chromatic aberration changes significantly under underwater imaging conditions, as demonstrated by the studies of Helmholz et al. [21,22] and Menna et al. [23].
In digital photogrammetry, the image is captured by a digital matrix, saved in raw data format (RAW), and later converted to standardised files (e.g., JPG, TIFF) [24]. Typically, this process, called demosaicing, is performed automatically by the camera software. The majority of modern digital cameras use sensors based on a Bayer colour filter array (CFA) [25], from which the demosaicing operation produces a colour image consisting of three arrays representing the colours red, green, and blue (RGB). With a CFA, only one colour component is actually measured at each pixel; the other components must be interpolated from measurements taken at slightly different locations, i.e., at neighbouring pixels. Demosaicing can thus be described as an extension of interpolation for a greyscale image. Two related interpolation problems must be solved: reconstructing the missing half of the G samples and reconstructing the missing three-quarters of the R and B samples (Figure 1). It is worth noting that the demosaicing problem becomes more complex for new multispectral arrays that use a type of extended Bayer filter with multiple colours, the emerging multispectral filter array (MSFA) [26].
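To make the two interpolation problems concrete, the following minimal sketch (assuming an RGGB Bayer layout; the function name and kernel formulation are illustrative, not a camera's actual implementation) performs bilinear demosaicing by masking the mosaic into per-colour planes and convolving each with a bilinear kernel:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicing of a single-channel Bayer mosaic (RGGB layout assumed)."""
    h, w = raw.shape
    # Binary masks marking where each colour was actually sampled.
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1

    # Bilinear kernels: G is averaged from 4 axial neighbours; R/B from
    # axial or diagonal neighbours depending on the pixel position.
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    # At sampled positions the centre weight returns the measured value;
    # elsewhere the kernel averages the surrounding measured samples.
    r = convolve(raw * r_mask, k_rb, mode="mirror")
    g = convolve(raw * g_mask, k_g, mode="mirror")
    b = convolve(raw * b_mask, k_rb, mode="mirror")
    return np.dstack([r, g, b])
```

The same masking scheme underlies the other interpolation-based methods considered later; only the interpolation rule changes.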
In the study by Reznicek et al. [27], the authors analysed the influence of raw image pre-processing on the quality of photogrammetric models. It was noted there that the demosaicing process affects the number of photogrammetric targets detected, which also strongly depends on the detector settings for such targets; consequently, this results in different qualities of photogrammetric measurements. The process of converting RAW to JPG affects many radiometric parameters of the image (contrast, black level, white balance, etc.), so it can be concluded that the radiometric characteristics of such an image will affect the feature detectors and, consequently, further photogrammetric processes. This finding is supported by [28], where the influence of different demosaicing algorithms on digital image correlation was demonstrated; at the scale at which that research was conducted, this influence can be considered significant. In the work of Shortis et al. [29], the authors also note the influence of the demosaicing process on the quality of the photogrammetric process and propose a new correction approach based on tunable Gaussian filters. Similar conclusions are observed in [30], which highlights that the algorithms used in digital cameras, constrained by limited computational resources and a trade-off between quality and speed of data delivery, degrade the data and thus the resulting photogrammetric models. Remarkably, this work, which analysed 14 different demosaicing algorithms, highlights that using only the green channel, even with the simplest bilinear interpolation algorithm, offered very accurate results. Currently, various new demosaicing algorithms based on artificial neural networks are being developed, which can significantly improve image quality [31,32,33,34].
Since demosaicing is itself a form of interpolation across spectral channels, it can be deduced that the choice of algorithm will affect the values of the individual colour components and, in combination with chromatic aberration, can intensify or weaken its effect. The conclusion is that the selection of the demosaicing algorithm alone can modify the impact of the chromatic aberration described earlier under different light spectra.
In the photogrammetric process, feature localisation and homologous point detection are extremely important. Most feature detectors localise a feature on a greyscale image, which entails a data reduction: either the RGB image is converted to greyscale, or a single spectral channel, usually green (G), is used to detect the feature [18]. This operation simplifies subsequent calculations but can affect the quality of the feature localisation and the quality of the photogrammetric processing. Such an approach omits the influence of, for example, chromatic aberration, which becomes apparent especially in the two outer spectral channels, blue and red (B, R); this has a direct impact on the calibration results of a non-metric camera, particularly when the scene is illuminated by an incomplete light spectrum [20]. For the detection of chromatic aberration in greyscale images, see the method developed by Fryskowska et al. [19].
The greyscale image can be calculated using different methods [35]. Photogrammetric software for feature localisation usually uses the green channel (G), precisely to minimise the negative effect of chromatic aberration and to improve the signal-to-noise ratio (SNR), as this channel has the most favourable ratio [27]. In the general case, converting a colour image to a greyscale image is a complicated process, and the converted greyscale image may lose the contrasts, sharpness, shadows, and structure of the colour image [35]. To minimise data loss, researchers have proposed various methods for calculating the new image [36,37,38,39], and some standards, mainly based on weighted averages, are already present in standardisation documents [40,41,42]. Thus, according to recommendation 601 [41], the greyscale can be calculated for each point in the RGB image according to the following formula:
$I_i = 0.2989\,R + 0.5870\,G + 0.1140\,B.$ (1)
According to recommendation 2100 [42], the greyscale is calculated according to the following equation:
$I_i = 0.2627\,R + 0.6780\,G + 0.0593\,B,$ (2)
whereas, according to recommendation 709 [40], the greyscale is calculated according to the following formula:
$I_i = 0.2126\,R + 0.7152\,G + 0.0722\,B.$ (3)
In addition, the greyscale can be calculated from the arithmetic mean:
$I_i = \frac{R + G + B}{3},$ (4)
or using the average of the maximum and minimum values of RGB:
$I_i = \frac{\max(R, G, B) + \min(R, G, B)}{2}.$ (5)
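For illustration, Formulas (1)-(5) translate directly into NumPy. The sketch below is an assumed implementation; the method names are chosen to match the labels used later in the results tables (rec601, itu709, itu2100, mean, light):

```python
import numpy as np

def to_greyscale(rgb, method="rec601"):
    """Convert an RGB image (H x W x 3, float) to greyscale, Formulas (1)-(5)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    if method == "rec601":    # Formula (1), ITU-R BT.601
        return 0.2989 * r + 0.5870 * g + 0.1140 * b
    if method == "itu2100":   # Formula (2), ITU-R BT.2100
        return 0.2627 * r + 0.6780 * g + 0.0593 * b
    if method == "itu709":    # Formula (3), ITU-R BT.709
        return 0.2126 * r + 0.7152 * g + 0.0722 * b
    if method == "mean":      # Formula (4), arithmetic mean
        return (r + g + b) / 3.0
    if method == "light":     # Formula (5), average of max and min ("lightness")
        return (np.maximum.reduce([r, g, b]) + np.minimum.reduce([r, g, b])) / 2.0
    raise ValueError(f"unknown method: {method}")
```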
As presented before, each element of the photogrammetric process, from the acquisition of the image under a specific lighting condition to its digital development into a JPG file and the greyscale calculation, influences the feature location and, consequently, the quality of the photogrammetric models. In this paper, twelve different scene illumination settings, five different demosaicing methods, and five greyscale calculation methods were analysed, together with their impact on Harris feature localisation [43]. The following demosaicing methods were tested in the experiment: nearest neighbour interpolation (neighbour), bilinear interpolation (bilinear), smooth hue transition interpolation (smooth hue), median-filtered bilinear interpolation (median), and gradient-based interpolation (gradient), described in detail in [44,45]. The methods investigated for calculating the greyscale are given in Formulas (1)-(5) above. The lighting settings included polychromatic and monochromatic light, as described in the next section.
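Because smooth hue transition interpolation turns out below to give the smallest feature deviation, a brief sketch of its core idea may help: in one classical formulation, the green channel is interpolated bilinearly first, and the chrominance ratios R/G and B/G (the "hue") are then interpolated instead of the raw R and B samples. This is a minimal illustration under an assumed RGGB layout (the epsilon guard is an implementation detail of this sketch), not necessarily the exact variant evaluated in the experiment:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_smooth_hue(raw, eps=1e-6):
    """Smooth hue transition demosaicing (RGGB assumed): interpolate G
    bilinearly, then interpolate the R/G and B/G ratios rather than R and B."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1

    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    g = convolve(raw * g_mask, k_g, mode="mirror")
    # Interpolate the hue (chrominance ratio) instead of the raw channel,
    # then restore the colour sample by multiplying back with the green plane.
    r = g * convolve(raw * r_mask / (g + eps), k_rb, mode="mirror")
    b = g * convolve(raw * b_mask / (g + eps), k_rb, mode="mirror")
    return np.dstack([r, g, b])
```

Interpolating ratios rather than raw samples keeps hue transitions smooth across edges, which is why this family of methods interacts less destructively with chromatic aberration than plain per-channel interpolation.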
This article presents the following new contributions in the field of close-range digital photogrammetry:
  • The influence of the light spectrum of the modelled scene on the localisation of a feature in an image is presented, for both polychromatic and monochromatic illumination, and a recommendation on the light spectrum of the modelled scene is formulated.
  • The impact of demosaicing algorithms on the localisation of a feature in an image of a scene illuminated with a variable spectrum is presented, and the algorithm that guarantees the smallest feature deviation is identified.
  • The effect of greyscale calculation algorithms on the localisation of a feature under different light spectra of the scene is described and investigated, and a recommended method for greyscale calculation is provided.
  • It is shown conclusively, and confirmed by laboratory experiments, that the illumination spectrum of a scene matters in digital photogrammetry, and recommendations regarding the illumination spectrum are formulated.

2. Materials and Methods

This paper focuses on assessing which of the interpolation algorithms used during demosaicing and greyscale conversion most affects the position of a feature in the image through its response to the spectrum of light illuminating the object; in other words, which part of the process influences the feature position the most and, conversely, which is the most resistant to changes in the spectrum and leaves the feature position unchanged. Feature position detection in the image is used both in the calibration process and in finding homologous points, so it has a significant impact on the overall photogrammetric process. The Harris corner detector [43] was used as the feature detector; it is very popular, especially in the calibration of non-metric cameras, where it is used to detect the corners of the calibration matrix.
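For reference, the following minimal sketch shows Harris-based corner localisation with subpixel refinement in OpenCV; the parameter values are illustrative assumptions, not the settings used in the experiment:

```python
import cv2
import numpy as np

def detect_harris_corners(grey, max_corners=200):
    """Detect Harris corner candidates in a greyscale image and refine them
    to subpixel accuracy."""
    grey32 = np.float32(grey)
    # Harris-based corner candidates at integer pixel positions.
    corners = cv2.goodFeaturesToTrack(
        grey32, maxCorners=max_corners, qualityLevel=0.01,
        minDistance=10, useHarrisDetector=True, k=0.04)
    # Subpixel refinement: the feature deviations reported in this study
    # are fractions of a pixel, so integer positions are not enough.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 0.001)
    cv2.cornerSubPix(grey32, corners, (5, 5), (-1, -1), criteria)
    return corners.reshape(-1, 2)
```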

2.1. Testing Station

Figure 2 shows the test station plan and an example image. The test station consists of a Sony ILCE-6000 camera (Sony Corporation, Tokyo, Japan) with a MeiKe MK 35 mm f/1.7 lens (HongKong Meike Digital Technology Co., Ltd., Chengdu, China), with which the images were taken, and a studio lighting lamp with an adjustable light spectrum, equipped with RGBW (red, green, blue, white) LEDs. The spectrum was measured with a UPRtek MK350D spectrometer (Nanotech—Uprtek, Heerlen, The Netherlands) placed on a tripod in front of the lamp. The photographed object was a foamed polyvinyl chloride (PVC) calibration board with a printed chessboard pattern and a Spyder Checkr 24-colour array (Datacolor, Lawrenceville, NJ, USA). Only the geometric data from the calibration matrix were considered in this experiment; data from the colour array were not used for the tests presented here. All devices, i.e., the lamp, camera, and spectrometer, were connected via USB cables to a PC, and the data they recorded were stored directly in the PC's memory. This solution allows the camera and lamp settings to be controlled and the spectrometer readings to be synchronised.
The chequerboard calibration matrix is one of the more popular matrices for calibrating non-metric cameras. Popular photogrammetric software uses this type of matrix to calibrate cameras, and the methods based on it are described in [46]. These algorithms use the Harris detector in most cases [14,47,48,49,50].

2.2. Lighting Settings

Twelve different lamp settings were planned, ranging from full white-light illumination from white LEDs with different colour temperatures, through a colour mix from RGB LEDs, to single R, G, and B colours. Details of the illumination are given in Table 1. Each setting was measured with the spectrometer and the parameters were recorded in the device's memory. The values shown in Table 1 represent the measurement results from the spectrometer. Each setting is named after the preset suggested by the lamp manufacturer.

2.3. Experiment Workflow

Table 2 presents a summary of all the tested lighting settings, interpolation methods, greyscale calculations, and the feature detector used. For each lighting setting, all interpolation methods were checked and, in sequence, for each interpolation method, the different greyscale calculations were verified. The combination of 12 lighting settings, 5 demosaicing methods, and 5 greyscale calculations yielded 300 cases, which were subsequently evaluated.
Importantly, for the stability of the data and the reproducibility of the experiment, 10 images were taken for each lighting setting. The camera was fixed on a tripod and triggered remotely to minimise changes in position; to ensure reliability, the average position of each feature was extracted from these 10 images for a given case (Figure 3). The ISO, aperture, and focal length settings remained constant for the entire experiment and were set only once at the start of data collection.
After the testing images were taken, 12 datasets with 10 images in each set were created. One set corresponded to one lighting setting. In each image, an ROI (region of interest) limited only to the calibration matrix (chequerboard) was determined and corners were detected in this region using the Harris method. The feature detection positions in the image were the input for further statistical calculations, which are presented in the next section of this article.
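A sketch of the per-case aggregation described above (the function name, and the assumption that the detected corners of each image are already matched row-by-row to the reference points, are illustrative):

```python
import numpy as np

def feature_deviation_stats(corner_sets, reference):
    """corner_sets: list of 10 (N x 2) arrays of detected corner positions,
    one per image, with rows matched to the (N x 2) reference points.
    Returns the mean, standard deviation, and median deviation in pixels."""
    mean_pos = np.mean(np.stack(corner_sets), axis=0)   # average over 10 images
    dev = np.linalg.norm(mean_pos - reference, axis=1)  # Euclidean deviation
    return dev.mean(), dev.std(), np.median(dev)
```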

3. Results

Figure 4 presents the statistics of the position of all features in the image in relation to the light spectrum.
As the analysis reveals, the lowest median over all cases was achieved for green lighting (median = 0.40 pix), followed by the HMI case at 5600 K (median = 0.44 pix) and Daylight at 5000 K (median = 0.46 pix) (Table 3). This means that for these types of lighting, the Harris feature detection was closest to the reference point. Moreover, green lighting also shows the lowest standard deviation, which indicates a high stability of feature detection under this lighting. For green lighting, the effect of lighting colour is the lowest, which also confirms the technical assumption made in [27], where only the green channel was used as input instead of converting the RGB image to greyscale.
Figure 5 shows the analysis of the location of the feature in relation to the light spectrum and the demosaicing method. As can be seen, in each case, the demosaicing method resulted in a different location of the feature under examination, which confirms its influence on the Harris feature detector. Analysing the position of the points with respect to the demosaicing method only (Table 4), the lowest median feature deviation was achieved for the smooth hue demosaicing method under HMI (5600 K) lighting (0.322 pix), followed by Daylight (5000 K) (0.363 pix) and Green (0.369 pix). The data for Figure 5 are presented in tabular form in Table A1.
Figure 6 presents an analysis of the feature location in relation to the light spectrum and the greyscale calculation method. As can be observed, in each case, the greyscale calculation method resulted in a different location of the feature under study, which proves its effect on the Harris feature detector. When analysing the location of the points with respect to the greyscale calculation method alone (Table 5), the lowest median deviation of the feature location was achieved for the itu709 method (0.59 pix). The data for Figure 6 are presented in tabular form in Table A2.
When the greyscale conversion method alone was analysed, itu709 achieved the lowest median on average over all cases (Table 5). However, when analysing the data grouped by both demosaicing and greyscale calculation method, the lowest median overall was achieved for the HMI—5600 K case with the mean greyscale calculation method (Table 6). The results in Table 6 are sorted by median value, from lowest to highest, for the 10 best cases only; the complete table of results is presented in Table A3. A closer analysis reveals that the mean greyscale calculation method achieves the lowest deviations in feature detection for full-spectrum light. Since the greyscale calculation from a single channel (green) contributes little, and green had the lowest values for monochromatic light only, the statistics aggregated in Table 5 show a different winner (itu709). It should be clearly noted that for polychromatic light, the lowest deviation in feature position was achieved with the mean greyscale calculation method. For monochromatic light, the choice of greyscale calculation method is not significant, as the values recorded in the other channels are negligible, so both the weighted and the arithmetic means are almost unaffected.
It is also worth noting that for polychromatic light, as shown in Table 6, the influence of the demosaicing and greyscale algorithms is greatest, and their appropriate use allows the feature position deviation to be minimised significantly. For monochromatic green light, the algorithms have no significant influence, and what influence they have is rather random; the median values for monochromatic green light are constant across all the algorithms used.

4. Discussion

This research shows that the light spectrum influences the location of a feature in an image. Furthermore, it demonstrates that the positions of the feature are influenced by all intermediate algorithms used in the photogrammetric process. This effect should be considered in precision digital photogrammetry and during the development of photogrammetric algorithms and applications.
Based on the results presented, some recommendations can be derived in the context of the preferred light spectrum of the scene lighting and the proposed algorithms for demosaicing and greyscale conversion.
In the field of polychromatic light, the most appropriate spectrum would be HMI—5600 K, followed by Daylight—5000 K. This is also confirmed by photogrammetric practice, which recommends performing calibrations in the daylight spectrum. If artificial lighting has to be used, a spectrum of 5600 K or 5000 K, which guarantees the lowest feature deviation, should be chosen. It is worth noting that the spectral characteristics of lighting at these colour temperatures do not promote any extreme value in the red region. In warmer colours, i.e., below 5000 K, the influence of the red channel and of this light spectrum is clearly visible, resulting in a greater feature deviation. The red channel's increased share of the spectrum raises the significance of chromatic aberration and shifts the focal length for a given wavelength towards the red focal point. This phenomenon interferes with the algorithms in the subsequent digital processing steps and results in feature deviation. It can be minimised in a simple way by selecting the algorithms in the production cycle, as shown in this paper.
Within the polychromatic light domain, the most appropriate choice for the Harris detector would be a demosaicing method based on the smooth hue algorithm and a greyscale calculation method of the mean type. It should be noted that, for polychromatic light, a significant effect of the demosaicing method and greyscale calculation was observed. Furthermore, the feature deviation correlates strongly with the CRI (colour rendering index) (Table 1): the higher the index, the smaller the feature deviation, and for CRIs above 96, the lowest feature deviation was achieved among the cases studied. The use of these algorithms minimises the impact of chromatic aberration and provides the lowest feature deviation among all those studied here.
In the field of monochromatic light, very good results were achieved for green lighting, where the scene was illuminated only with green LEDs with the spectrum presented earlier. Many studies and algorithms use only the green channel for greyscale calculations. The research confirms that lighting in the green spectrum minimises the impact of chromatic aberration, which could already be deduced from earlier studies; what is new is that such lighting also minimises the impact of the demosaicing and greyscale calculation algorithms. The results for this spectrum were almost identical for all the algorithms used. Thus, in cases where colour and the reconstruction of the scene in RGB are not important, e.g., when only a precise geometric reconstruction of an object is required, green lighting seems to be the right choice. Moreover, as is well known, the Bayer filter design has the most green pixels, and they are symmetrically distributed across the matrix. The sensor is therefore most sensitive in this channel, which also minimises the impact of the demosaicing algorithm (owing to pixel symmetry) and significantly reduces the impact of chromatic aberration for low-quality lenses. The author believes that these findings are very important for future research and accurate modelling with digital photogrammetry techniques.
This study was limited to investigating the effect of the light spectrum on the locations of Harris features. The Harris detector is commonly used in photogrammetry, especially for calibration, so its features form a good representative group. Other features used in photogrammetry, especially for the detection of homologous points, may respond differently to the light spectrum. Testing other feature detectors is beyond the scope of this study but is planned for the future. This naturally requires a change of test fields and further experiments, but the research plan presented here appears suitable for this purpose.
Another limitation of the research is the set of demosaicing and greyscale calculation algorithms used. The author selected the most popular of these, which, owing to their simplicity, are often used in popular software. The point was to prove that the selection of these algorithms influences feature location and cannot be ignored in digital photogrammetry. Furthermore, the author believes that in precision photogrammetry, the correct selection of both the light spectrum and the specific algorithms is important and should be taken into account. The recommendations on which algorithms to use follow clearly from the results achieved.
The author has not modelled the phenomenon presented in this study, as it depends on a very large number of variables. The determination of chromatic aberration correction coefficients for a given lens and a specific camera model, considering the algorithms and feature detectors used and, more challengingly, the light spectrum, is beyond the scope of this research. The implementation of correction factors and the development of a new method for calibrating a non-metric camera in the context of the light spectrum are planned for future research.
In terms of a practical recommendation, it should be noted that currently available photogrammetric software does not allow the demosaicing and greyscale calculation algorithms to be changed; these are fixed a priori by the developers. Software of this type is a kind of black box that does not reveal in detail which algorithms it uses, although in most cases they are well known and described. Likewise, popular cameras do not offer a choice of demosaicing algorithm, providing a ready-made JPEG file. Where there is a practical need to improve the quality of precision photogrammetry products, the author recommends using the algorithms mentioned, but this must be done before the software calculates the model: first, a JPEG should be created from the RAW file using the demosaicing method specified here, and then the greyscale should be recalculated, within the software, using the method specified here as well. The software known to the author does not offer such possibilities, and with this article, the manufacturers are encouraged to make these changes. In the absence of the possibility of changing the algorithms, the author recommends illuminating the scene with 5600 K or green light, which will minimise feature deviation and improve the quality of the models.

5. Conclusions

The research shows which of the popular demosaicing and greyscale calculation algorithms produce the least feature deviation, in the context of the variable light spectrum of the scene.
The lowest feature deviation in the polychromatic light domain was obtained for light with colour temperatures of 5000 K and 5600 K, which is a practical recommendation for illuminating a scene with artificial light. In this domain, the lowest feature deviation was guaranteed by demosaicing algorithms of the smooth hue type and greyscale calculation of the mean type. From a practical point of view, choosing lighting with a colour temperature between 5000 and 5600 K together with these algorithms will guarantee the lowest feature deviation, more precise camera calibration and, consequently, a more accurate model.
In the monochromatic light domain, the case is slightly different. The lowest feature deviation was achieved for light in the green spectrum, and the demosaicing and greyscale calculation algorithms had no effect on the feature deviation. From a practical point of view, if the demosaicing and greyscale calculation algorithms cannot be adapted, the scene can be illuminated with green light, which will improve the precision of the feature location and minimise the negative effects of chromatic aberration.
The author did not model the phenomena and does not propose correction methods, as this is beyond the scope of this research, but is planning to carry this out in future research and publish the results. Due to the complex nature of the phenomena and the variety of equipment used in photogrammetry, it seems that modern artificial intelligence methods can play a large role here.

Funding

This research received no external funding.

Data Availability Statement

The raw data taken during the experiment are published in Burdziakowski, P. (2024). Calibration images under different lighting conditions—static for feature localisation (1–) [dataset]. Gdansk University of Technology. https://doi.org/10.34808/sb8g-tm87.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

Table A1. Feature position by light spectrum and demosaicing method. Values in pixels.

Light Spectrum Setting | Demosaicing Method | Mean | Standard Deviation | Median
Tungsten—3200 K | neighbour | 1.02 | 0.58 | 0.88
Tungsten—3200 K | bilinear | 0.89 | 0.37 | 0.82
Tungsten—3200 K | smooth hue | 0.90 | 0.53 | 0.79
Tungsten—3200 K | median | 0.89 | 0.37 | 0.82
Tungsten—3200 K | gradient | 0.89 | 0.43 | 0.82
Modeling—3500 K | neighbour | 1.02 | 0.69 | 0.91
Modeling—3500 K | bilinear | 0.68 | 0.38 | 0.60
Modeling—3500 K | smooth hue | 0.66 | 0.49 | 0.56
Modeling—3500 K | median | 0.68 | 0.38 | 0.60
Modeling—3500 K | gradient | 0.65 | 0.35 | 0.58
White Halogen—4300 K | neighbour | 1.03 | 0.72 | 0.90
White Halogen—4300 K | bilinear | 0.56 | 0.32 | 0.51
White Halogen—4300 K | smooth hue | 0.54 | 0.48 | 0.43
White Halogen—4300 K | median | 0.56 | 0.32 | 0.51
White Halogen—4300 K | gradient | 0.54 | 0.32 | 0.46
Daylight—5000 K | neighbour | 1.06 | 0.77 | 0.85
Daylight—5000 K | bilinear | 0.51 | 0.36 | 0.44
Daylight—5000 K | smooth hue | 0.52 | 0.54 | 0.36
Daylight—5000 K | median | 0.51 | 0.36 | 0.44
Daylight—5000 K | gradient | 0.50 | 0.39 | 0.41
HMI—5600 K | neighbour | 1.09 | 0.77 | 0.88
HMI—5600 K | bilinear | 0.50 | 0.36 | 0.41
HMI—5600 K | smooth hue | 0.50 | 0.56 | 0.32
HMI—5600 K | median | 0.50 | 0.36 | 0.41
HMI—5600 K | gradient | 0.47 | 0.40 | 0.35
Blue | neighbour | 1.35 | 1.30 | 1.03
Blue | bilinear | 0.81 | 0.98 | 0.49
Blue | smooth hue | 1.03 | 0.92 | 0.85
Blue | median | 0.81 | 0.98 | 0.49
Blue | gradient | 0.71 | 0.86 | 0.44
Green | neighbour | 0.65 | 0.55 | 0.50
Green | bilinear | 0.48 | 0.41 | 0.38
Green | smooth hue | 0.48 | 0.42 | 0.37
Green | median | 0.48 | 0.41 | 0.38
Green | gradient | 0.53 | 0.44 | 0.42
Red | neighbour | 1.65 | 1.03 | 1.52
Red | bilinear | 1.40 | 0.96 | 1.22
Red | smooth hue | 1.24 | 0.89 | 1.00
Red | median | 1.40 | 0.96 | 1.22
Red | gradient | 1.46 | 1.00 | 1.21
RGB | neighbour | 1.35 | 0.82 | 1.17
RGB | bilinear | 0.61 | 0.39 | 0.52
RGB | smooth hue | 0.61 | 0.49 | 0.49
RGB | median | 0.61 | 0.39 | 0.52
RGB | gradient | 0.59 | 0.43 | 0.45
G + B | neighbour | 1.09 | 0.74 | 0.95
G + B | bilinear | 0.69 | 0.45 | 0.63
G + B | smooth hue | 0.72 | 0.53 | 0.59
G + B | median | 0.69 | 0.45 | 0.63
G + B | gradient | 0.76 | 0.60 | 0.59
R + B | neighbour | 1.62 | 1.11 | 1.55
R + B | bilinear | 1.01 | 0.73 | 0.87
R + B | smooth hue | 0.92 | 0.66 | 0.76
R + B | median | 1.01 | 0.73 | 0.87
R + B | gradient | 0.86 | 0.61 | 0.72
R + G | neighbour | 1.21 | 0.67 | 1.04
R + G | bilinear | 0.75 | 0.49 | 0.64
R + G | smooth hue | 0.76 | 0.51 | 0.62
R + G | median | 0.75 | 0.49 | 0.64
R + G | gradient | 0.72 | 0.43 | 0.60
Table A2. Feature position by light spectrum and greyscale calculation method. Values in pixels.

Light Spectrum Setting | Grey Method | Mean | Standard Deviation | Median
Tungsten—3200 K | rec601 | 0.87 | 0.38 | 0.79
Tungsten—3200 K | itu709 | 0.87 | 0.40 | 0.80
Tungsten—3200 K | itu2100 | 0.89 | 0.43 | 0.81
Tungsten—3200 K | mean | 0.92 | 0.45 | 0.82
Tungsten—3200 K | light | 1.03 | 0.61 | 0.90
Modeling—3500 K | rec601 | 0.72 | 0.46 | 0.60
Modeling—3500 K | itu709 | 0.72 | 0.44 | 0.61
Modeling—3500 K | itu2100 | 0.74 | 0.47 | 0.63
Modeling—3500 K | mean | 0.71 | 0.51 | 0.55
Modeling—3500 K | light | 0.80 | 0.58 | 0.63
White Halogen—4300 K | rec601 | 0.64 | 0.49 | 0.49
White Halogen—4300 K | itu709 | 0.64 | 0.45 | 0.51
White Halogen—4300 K | itu2100 | 0.66 | 0.48 | 0.54
White Halogen—4300 K | mean | 0.61 | 0.53 | 0.44
White Halogen—4300 K | light | 0.69 | 0.53 | 0.55
Daylight—5000 K | rec601 | 0.61 | 0.53 | 0.44
Daylight—5000 K | itu709 | 0.61 | 0.50 | 0.47
Daylight—5000 K | itu2100 | 0.64 | 0.52 | 0.50
Daylight—5000 K | mean | 0.59 | 0.59 | 0.38
Daylight—5000 K | light | 0.66 | 0.61 | 0.46
HMI—5600 K | rec601 | 0.59 | 0.53 | 0.42
HMI—5600 K | itu709 | 0.60 | 0.50 | 0.46
HMI—5600 K | itu2100 | 0.62 | 0.52 | 0.48
HMI—5600 K | mean | 0.58 | 0.60 | 0.37
HMI—5600 K | light | 0.67 | 0.66 | 0.45
Blue | rec601 | 1.14 | 1.08 | 0.89
Blue | itu709 | 0.58 | 1.12 | 0.00
Blue | itu2100 | 0.53 | 1.11 | 0.00
Blue | mean | 1.20 | 0.77 | 1.05
Blue | light | 1.27 | 0.82 | 1.13
Green | rec601 | 0.45 | 0.35 | 0.37
Green | itu709 | 0.43 | 0.34 | 0.35
Green | itu2100 | 0.43 | 0.34 | 0.36
Green | mean | 0.82 | 0.61 | 0.66
Green | light | 0.49 | 0.44 | 0.38
Red | rec601 | 1.52 | 0.86 | 1.39
Red | itu709 | 1.85 | 1.22 | 1.94
Red | itu2100 | 1.87 | 1.03 | 1.85
Red | mean | 0.96 | 0.58 | 0.86
Red | light | 0.94 | 0.55 | 0.88
RGB | rec601 | 0.74 | 0.57 | 0.57
RGB | itu709 | 0.73 | 0.56 | 0.60
RGB | itu2100 | 0.77 | 0.58 | 0.65
RGB | mean | 0.70 | 0.59 | 0.50
RGB | light | 0.82 | 0.71 | 0.58
G + B | rec601 | 0.69 | 0.51 | 0.58
G + B | itu709 | 0.73 | 0.55 | 0.62
G + B | itu2100 | 0.74 | 0.56 | 0.61
G + B | mean | 0.69 | 0.51 | 0.57
G + B | light | 1.11 | 0.67 | 0.98
R + B | rec601 | 1.10 | 0.82 | 0.91
R + B | itu709 | 1.16 | 0.98 | 0.91
R + B | itu2100 | 1.21 | 0.93 | 0.98
R + B | mean | 0.85 | 0.59 | 0.70
R + B | light | 1.12 | 0.74 | 0.95
R + G | rec601 | 0.76 | 0.46 | 0.65
R + G | itu709 | 0.70 | 0.42 | 0.60
R + G | itu2100 | 0.74 | 0.45 | 0.63
R + G | mean | 0.95 | 0.64 | 0.80
R + G | light | 1.04 | 0.68 | 0.86
Table A3. Feature position by case, i.e., by light spectrum, demosaicing method, and greyscale calculation method. Values in pixels.

Light Spectrum Setting | Demosaicing Method | Grey Method | Mean | Standard Deviation | Median
Tungsten—3200 K | neighbour | rec601 | 0.97 | 0.54 | 0.84
Tungsten—3200 K | neighbour | itu709 | 0.93 | 0.54 | 0.80
Tungsten—3200 K | neighbour | itu2100 | 0.95 | 0.56 | 0.82
Tungsten—3200 K | neighbour | mean | 1.06 | 0.53 | 1.01
Tungsten—3200 K | neighbour | light | 1.20 | 0.66 | 1.10
Tungsten—3200 K | bilinear | rec601 | 0.86 | 0.30 | 0.79
Tungsten—3200 K | bilinear | itu709 | 0.85 | 0.27 | 0.81
Tungsten—3200 K | bilinear | itu2100 | 0.87 | 0.30 | 0.81
Tungsten—3200 K | bilinear | mean | 0.90 | 0.41 | 0.81
Tungsten—3200 K | bilinear | light | 0.98 | 0.50 | 0.92
Tungsten—3200 K | smooth hue | rec601 | 0.86 | 0.44 | 0.78
Tungsten—3200 K | smooth hue | itu709 | 0.88 | 0.51 | 0.78
Tungsten—3200 K | smooth hue | itu2100 | 0.90 | 0.55 | 0.79
Tungsten—3200 K | smooth hue | mean | 0.89 | 0.49 | 0.79
Tungsten—3200 K | smooth hue | light | 0.96 | 0.63 | 0.84
Tungsten—3200 K | median | rec601 | 0.86 | 0.30 | 0.79
Tungsten—3200 K | median | itu709 | 0.85 | 0.27 | 0.81
Tungsten—3200 K | median | itu2100 | 0.87 | 0.30 | 0.81
Tungsten—3200 K | median | mean | 0.90 | 0.41 | 0.81
Tungsten—3200 K | median | light | 0.98 | 0.50 | 0.92
Tungsten—3200 K | gradient | rec601 | 0.82 | 0.24 | 0.78
Tungsten—3200 K | gradient | itu709 | 0.86 | 0.31 | 0.82
Tungsten—3200 K | gradient | itu2100 | 0.88 | 0.35 | 0.82
Tungsten—3200 K | gradient | mean | 0.85 | 0.36 | 0.79
Tungsten—3200 K | gradient | light | 1.04 | 0.70 | 0.87
Modeling—3500 K | neighbour | rec601 | 1.00 | 0.66 | 0.88
Modeling—3500 K | neighbour | itu709 | 0.90 | 0.62 | 0.79
Modeling—3500 K | neighbour | itu2100 | 0.93 | 0.63 | 0.80
Modeling—3500 K | neighbour | mean | 1.11 | 0.74 | 1.02
Modeling—3500 K | neighbour | light | 1.17 | 0.76 | 1.06
Modeling—3500 K | bilinear | rec601 | 0.67 | 0.35 | 0.60
Modeling—3500 K | bilinear | itu709 | 0.67 | 0.33 | 0.61
Modeling—3500 K | bilinear | itu2100 | 0.70 | 0.35 | 0.63
Modeling—3500 K | bilinear | mean | 0.63 | 0.40 | 0.54
Modeling—3500 K | bilinear | light | 0.71 | 0.47 | 0.60
Modeling—3500 K | smooth hue | rec601 | 0.64 | 0.48 | 0.55
Modeling—3500 K | smooth hue | itu709 | 0.67 | 0.51 | 0.57
Modeling—3500 K | smooth hue | itu2100 | 0.71 | 0.58 | 0.58
Modeling—3500 K | smooth hue | mean | 0.61 | 0.43 | 0.52
Modeling—3500 K | smooth hue | light | 0.68 | 0.45 | 0.57
Modeling—3500 K | median | rec601 | 0.67 | 0.35 | 0.60
Modeling—3500 K | median | itu709 | 0.67 | 0.33 | 0.61
Modeling—3500 K | median | itu2100 | 0.70 | 0.35 | 0.63
Modeling—3500 K | median | mean | 0.63 | 0.40 | 0.54
Modeling—3500 K | median | light | 0.71 | 0.47 | 0.60
Modeling—3500 K | gradient | rec601 | 0.61 | 0.26 | 0.56
Modeling—3500 K | gradient | itu709 | 0.67 | 0.29 | 0.60
Modeling—3500 K | gradient | itu2100 | 0.68 | 0.29 | 0.60
Modeling—3500 K | gradient | mean | 0.57 | 0.28 | 0.53
Modeling—3500 K | gradient | light | 0.73 | 0.53 | 0.61
White Halogen—4300 K | neighbour | rec601 | 1.03 | 0.72 | 0.95
White Halogen—4300 K | neighbour | itu709 | 0.92 | 0.66 | 0.78
White Halogen—4300 K | neighbour | itu2100 | 0.96 | 0.68 | 0.82
White Halogen—4300 K | neighbour | mean | 1.20 | 0.76 | 1.06
White Halogen—4300 K | neighbour | light | 1.02 | 0.74 | 0.84
White Halogen—4300 K | bilinear | rec601 | 0.55 | 0.29 | 0.51
White Halogen—4300 K | bilinear | itu709 | 0.57 | 0.29 | 0.51
White Halogen—4300 K | bilinear | itu2100 | 0.60 | 0.32 | 0.56
White Halogen—4300 K | bilinear | mean | 0.48 | 0.30 | 0.44
White Halogen—4300 K | bilinear | light | 0.62 | 0.37 | 0.59
White Halogen—4300 K | smooth hue | rec601 | 0.53 | 0.52 | 0.41
White Halogen—4300 K | smooth hue | itu709 | 0.55 | 0.48 | 0.44
White Halogen—4300 K | smooth hue | itu2100 | 0.57 | 0.52 | 0.46
White Halogen—4300 K | smooth hue | mean | 0.45 | 0.37 | 0.37
White Halogen—4300 K | smooth hue | light | 0.60 | 0.50 | 0.45
White Halogen—4300 K | median | rec601 | 0.55 | 0.29 | 0.51
White Halogen—4300 K | median | itu709 | 0.57 | 0.29 | 0.51
White Halogen—4300 K | median | itu2100 | 0.60 | 0.32 | 0.56
White Halogen—4300 K | median | mean | 0.48 | 0.30 | 0.44
White Halogen—4300 K | median | light | 0.62 | 0.37 | 0.59
White Halogen—4300 K | gradient | rec601 | 0.51 | 0.27 | 0.44
White Halogen—4300 K | gradient | itu709 | 0.58 | 0.31 | 0.50
White Halogen—4300 K | gradient | itu2100 | 0.58 | 0.31 | 0.49
White Halogen—4300 K | gradient | mean | 0.42 | 0.25 | 0.37
White Halogen—4300 K | gradient | light | 0.61 | 0.43 | 0.50
Daylight—5000 K | neighbour | rec601 | 1.05 | 0.74 | 0.87
Daylight—5000 K | neighbour | itu709 | 0.95 | 0.72 | 0.76
Daylight—5000 K | neighbour | itu2100 | 0.99 | 0.74 | 0.81
Daylight—5000 K | neighbour | mean | 1.21 | 0.81 | 1.07
Daylight—5000 K | neighbour | light | 1.11 | 0.82 | 0.88
Daylight—5000 K | bilinear | rec601 | 0.51 | 0.33 | 0.43
Daylight—5000 K | bilinear | itu709 | 0.52 | 0.31 | 0.47
Daylight—5000 K | bilinear | itu2100 | 0.56 | 0.35 | 0.50
Daylight—5000 K | bilinear | mean | 0.44 | 0.38 | 0.36
Daylight—5000 K | bilinear | light | 0.54 | 0.41 | 0.44
Daylight—5000 K | smooth hue | rec601 | 0.49 | 0.54 | 0.34
Daylight—5000 K | smooth hue | itu709 | 0.53 | 0.54 | 0.40
Daylight—5000 K | smooth hue | itu2100 | 0.56 | 0.55 | 0.41
Daylight—5000 K | smooth hue | mean | 0.44 | 0.47 | 0.29
Daylight—5000 K | smooth hue | light | 0.57 | 0.58 | 0.38
Daylight—5000 K | median | rec601 | 0.51 | 0.33 | 0.43
Daylight—5000 K | median | itu709 | 0.52 | 0.31 | 0.47
Daylight—5000 K | median | itu2100 | 0.56 | 0.35 | 0.50
Daylight—5000 K | median | mean | 0.44 | 0.38 | 0.36
Daylight—5000 K | median | light | 0.54 | 0.41 | 0.44
Daylight—5000 K | gradient | rec601 | 0.48 | 0.34 | 0.39
Daylight—5000 K | gradient | itu709 | 0.55 | 0.38 | 0.47
Daylight—5000 K | gradient | itu2100 | 0.54 | 0.37 | 0.45
Daylight—5000 K | gradient | mean | 0.39 | 0.33 | 0.31
Daylight—5000 K | gradient | light | 0.55 | 0.49 | 0.43
HMI—5600 K | neighbour | rec601 | 1.04 | 0.72 | 0.79
HMI—5600 K | neighbour | itu709 | 0.96 | 0.68 | 0.82
HMI—5600 K | neighbour | itu2100 | 1.00 | 0.69 | 0.85
HMI—5600 K | neighbour | mean | 1.19 | 0.81 | 1.02
HMI—5600 K | neighbour | light | 1.27 | 0.86 | 1.11
HMI—5600 K | bilinear | rec601 | 0.49 | 0.32 | 0.41
HMI—5600 K | bilinear | itu709 | 0.50 | 0.30 | 0.46
HMI—5600 K | bilinear | itu2100 | 0.53 | 0.33 | 0.46
HMI—5600 K | bilinear | mean | 0.45 | 0.39 | 0.35
HMI—5600 K | bilinear | light | 0.53 | 0.45 | 0.42
HMI—5600 K | smooth hue | rec601 | 0.47 | 0.54 | 0.30
HMI—5600 K | smooth hue | itu709 | 0.51 | 0.56 | 0.35
HMI—5600 K | smooth hue | itu2100 | 0.55 | 0.59 | 0.36
HMI—5600 K | smooth hue | mean | 0.43 | 0.54 | 0.26
HMI—5600 K | smooth hue | light | 0.52 | 0.58 | 0.33
HMI—5600 K | median | rec601 | 0.49 | 0.32 | 0.41
HMI—5600 K | median | itu709 | 0.50 | 0.30 | 0.46
HMI—5600 K | median | itu2100 | 0.53 | 0.33 | 0.46
HMI—5600 K | median | mean | 0.45 | 0.39 | 0.35
HMI—5600 K | median | light | 0.53 | 0.45 | 0.42
HMI—5600 K | gradient | rec601 | 0.45 | 0.36 | 0.32
HMI—5600 K | gradient | itu709 | 0.50 | 0.38 | 0.40
HMI—5600 K | gradient | itu2100 | 0.52 | 0.40 | 0.41
HMI—5600 K | gradient | mean | 0.37 | 0.29 | 0.28
HMI—5600 K | gradient | light | 0.52 | 0.51 | 0.36
Blue | neighbour | rec601 | 1.46 | 1.49 | 1.22
Blue | neighbour | itu709 | 1.30 | 1.53 | 0.00
Blue | neighbour | itu2100 | 1.27 | 1.53 | 0.00
Blue | neighbour | mean | 1.26 | 0.81 | 1.04
Blue | neighbour | light | 1.46 | 0.92 | 1.40
Blue | bilinear | rec601 | 1.06 | 1.06 | 0.86
Blue | bilinear | itu709 | 0.25 | 0.82 | 0.00
Blue | bilinear | itu2100 | 0.22 | 0.77 | 0.00
Blue | bilinear | mean | 1.25 | 0.79 | 1.10
Blue | bilinear | light | 1.28 | 0.81 | 1.17
Blue | smooth hue | rec601 | 1.03 | 0.68 | 0.85
Blue | smooth hue | itu709 | 1.03 | 1.12 | 0.79
Blue | smooth hue | itu2100 | 0.86 | 1.19 | 0.00
Blue | smooth hue | mean | 1.08 | 0.70 | 0.93
Blue | smooth hue | light | 1.14 | 0.76 | 0.97
Blue | median | rec601 | 1.06 | 1.06 | 0.86
Blue | median | itu709 | 0.25 | 0.82 | 0.00
Blue | median | itu2100 | 0.22 | 0.77 | 0.00
Blue | median | mean | 1.25 | 0.79 | 1.10
Blue | median | light | 1.28 | 0.81 | 1.17
Blue | gradient | rec601 | 1.09 | 0.91 | 0.93
Blue | gradient | itu709 | 0.06 | 0.40 | 0.00
Blue | gradient | itu2100 | 0.06 | 0.43 | 0.00
Blue | gradient | mean | 1.17 | 0.75 | 1.03
Blue | gradient | light | 1.18 | 0.76 | 1.05
Green | neighbour | rec601 | 0.56 | 0.41 | 0.48
Green | neighbour | itu709 | 0.51 | 0.39 | 0.40
Green | neighbour | itu2100 | 0.51 | 0.38 | 0.40
Green | neighbour | mean | 0.99 | 0.70 | 0.81
Green | neighbour | light | 0.67 | 0.65 | 0.46
Green | bilinear | rec601 | 0.41 | 0.29 | 0.36
Green | bilinear | itu709 | 0.39 | 0.30 | 0.33
Green | bilinear | itu2100 | 0.39 | 0.29 | 0.33
Green | bilinear | mean | 0.82 | 0.58 | 0.66
Green | bilinear | light | 0.40 | 0.30 | 0.34
Green | smooth hue | rec601 | 0.42 | 0.34 | 0.36
Green | smooth hue | itu709 | 0.40 | 0.31 | 0.33
Green | smooth hue | itu2100 | 0.39 | 0.31 | 0.33
Green | smooth hue | mean | 0.77 | 0.59 | 0.62
Green | smooth hue | light | 0.41 | 0.33 | 0.35
Green | median | rec601 | 0.41 | 0.29 | 0.36
Green | median | itu709 | 0.39 | 0.30 | 0.33
Green | median | itu2100 | 0.39 | 0.29 | 0.33
Green | median | mean | 0.82 | 0.58 | 0.66
Green | median | light | 0.40 | 0.30 | 0.34
Green | gradient | rec601 | 0.43 | 0.37 | 0.34
Green | gradient | itu709 | 0.48 | 0.38 | 0.38
Green | gradient | itu2100 | 0.48 | 0.39 | 0.38
Green | gradient | mean | 0.68 | 0.55 | 0.56
Green | gradient | light | 0.58 | 0.44 | 0.46
Red | neighbour | rec601 | 1.89 | 0.90 | 1.88
Red | neighbour | itu709 | 1.80 | 1.41 | 2.05
Red | neighbour | itu2100 | 2.12 | 1.10 | 2.19
Red | neighbour | mean | 1.20 | 0.58 | 1.15
Red | neighbour | light | 1.25 | 0.53 | 1.24
Red | bilinear | rec601 | 1.44 | 0.80 | 1.32
Red | bilinear | itu709 | 1.93 | 1.19 | 2.08
Red | bilinear | itu2100 | 1.82 | 0.99 | 1.83
Red | bilinear | mean | 0.91 | 0.54 | 0.81
Red | bilinear | light | 0.88 | 0.54 | 0.80
Red | smooth hue | rec601 | 1.29 | 0.84 | 1.10
Red | smooth hue | itu709 | 1.69 | 1.08 | 1.71
Red | smooth hue | itu2100 | 1.55 | 0.96 | 1.38
Red | smooth hue | mean | 0.83 | 0.59 | 0.69
Red | smooth hue | light | 0.82 | 0.46 | 0.76
Red | median | rec601 | 1.44 | 0.80 | 1.32
Red | median | itu709 | 1.93 | 1.19 | 2.08
Red | median | itu2100 | 1.82 | 0.99 | 1.83
Red | median | mean | 0.91 | 0.54 | 0.81
Red | median | light | 0.88 | 0.54 | 0.80
Red | gradient | rec601 | 1.56 | 0.86 | 1.38
Red | gradient | itu709 | 1.91 | 1.20 | 1.89
Red | gradient | itu2100 | 2.02 | 1.05 | 2.02
Red | gradient | mean | 0.95 | 0.56 | 0.87
Red | gradient | light | 0.86 | 0.55 | 0.77
RGB | neighbour | rec601 | 1.31 | 0.77 | 1.11
RGB | neighbour | itu709 | 1.21 | 0.79 | 1.06
RGB | neighbour | itu2100 | 1.25 | 0.78 | 1.06
RGB | neighbour | mean | 1.37 | 0.77 | 1.28
RGB | neighbour | light | 1.58 | 0.92 | 1.53
RGB | bilinear | rec601 | 0.61 | 0.35 | 0.53
RGB | bilinear | itu709 | 0.60 | 0.33 | 0.55
RGB | bilinear | itu2100 | 0.64 | 0.36 | 0.57
RGB | bilinear | mean | 0.55 | 0.37 | 0.45
RGB | bilinear | light | 0.64 | 0.50 | 0.51
RGB | smooth hue | rec601 | 0.58 | 0.46 | 0.45
RGB | smooth hue | itu709 | 0.62 | 0.51 | 0.52
RGB | smooth hue | itu2100 | 0.67 | 0.55 | 0.54
RGB | smooth hue | mean | 0.53 | 0.40 | 0.39
RGB | smooth hue | light | 0.68 | 0.50 | 0.52
RGB | median | rec601 | 0.61 | 0.35 | 0.53
RGB | median | itu709 | 0.60 | 0.33 | 0.55
RGB | median | itu2100 | 0.64 | 0.36 | 0.57
RGB | median | mean | 0.55 | 0.37 | 0.45
RGB | median | light | 0.64 | 0.50 | 0.51
RGB | gradient | rec601 | 0.58 | 0.41 | 0.41
RGB | gradient | itu709 | 0.63 | 0.44 | 0.51
RGB | gradient | itu2100 | 0.64 | 0.44 | 0.52
RGB | gradient | mean | 0.49 | 0.37 | 0.36
RGB | gradient | light | 0.59 | 0.45 | 0.47
G + B | neighbour | rec601 | 1.06 | 0.72 | 0.94
G + B | neighbour | itu709 | 1.11 | 0.80 | 0.95
G + B | neighbour | itu2100 | 1.09 | 0.78 | 0.97
G + B | neighbour | mean | 0.76 | 0.53 | 0.66
G + B | neighbour | light | 1.43 | 0.71 | 1.35
G + B | bilinear | rec601 | 0.57 | 0.29 | 0.56
G + B | bilinear | itu709 | 0.60 | 0.32 | 0.57
G + B | bilinear | itu2100 | 0.61 | 0.33 | 0.57
G + B | bilinear | mean | 0.61 | 0.42 | 0.55
G + B | bilinear | light | 1.07 | 0.60 | 0.98
G + B | smooth hue | rec601 | 0.61 | 0.48 | 0.51
G + B | smooth hue | itu709 | 0.68 | 0.55 | 0.60
G + B | smooth hue | itu2100 | 0.72 | 0.58 | 0.58
G + B | smooth hue | mean | 0.66 | 0.45 | 0.53
G + B | smooth hue | light | 0.94 | 0.53 | 0.80
G + B | median | rec601 | 0.57 | 0.29 | 0.56
G + B | median | itu709 | 0.60 | 0.32 | 0.57
G + B | median | itu2100 | 0.61 | 0.33 | 0.57
G + B | median | mean | 0.61 | 0.42 | 0.55
G + B | median | light | 1.07 | 0.60 | 0.98
G + B | gradient | rec601 | 0.63 | 0.48 | 0.51
G + B | gradient | itu709 | 0.66 | 0.43 | 0.59
G + B | gradient | itu2100 | 0.68 | 0.48 | 0.57
G + B | gradient | mean | 0.79 | 0.68 | 0.55
G + B | gradient | light | 1.03 | 0.78 | 0.76
R + B | neighbour | rec601 | 1.72 | 1.02 | 1.71
R + B | neighbour | itu709 | 1.75 | 1.37 | 2.00
R + B | neighbour | itu2100 | 1.79 | 1.29 | 2.05
R + B | neighbour | mean | 1.15 | 0.66 | 1.03
R + B | neighbour | light | 1.67 | 0.96 | 1.63
R + B | bilinear | rec601 | 1.03 | 0.70 | 0.93
R + B | bilinear | itu709 | 1.09 | 0.87 | 0.96
R + B | bilinear | itu2100 | 1.15 | 0.79 | 1.05
R + B | bilinear | mean | 0.76 | 0.52 | 0.63
R + B | bilinear | light | 1.05 | 0.64 | 0.92
R + B | smooth hue | rec601 | 0.90 | 0.67 | 0.71
R + B | smooth hue | itu709 | 1.00 | 0.69 | 0.83
R + B | smooth hue | itu2100 | 1.02 | 0.74 | 0.86
R + B | smooth hue | mean | 0.84 | 0.65 | 0.64
R + B | smooth hue | light | 0.85 | 0.51 | 0.77
R + B | median | rec601 | 1.03 | 0.70 | 0.93
R + B | median | itu709 | 1.09 | 0.87 | 0.96
R + B | median | itu2100 | 1.15 | 0.79 | 1.05
R + B | median | mean | 0.76 | 0.52 | 0.63
R + B | median | light | 1.05 | 0.64 | 0.92
R + B | gradient | rec601 | 0.80 | 0.59 | 0.64
R + B | gradient | itu709 | 0.87 | 0.66 | 0.71
R + B | gradient | itu2100 | 0.91 | 0.68 | 0.78
R + B | gradient | mean | 0.72 | 0.50 | 0.60
R + B | gradient | light | 0.98 | 0.57 | 0.89
R + G | neighbour | rec601 | 1.24 | 0.54 | 1.16
R + G | neighbour | itu709 | 1.02 | 0.48 | 0.93
R + G | neighbour | itu2100 | 1.13 | 0.49 | 1.03
R + G | neighbour | mean | 1.48 | 0.81 | 1.31
R + G | neighbour | light | 1.21 | 0.84 | 0.92
R + G | bilinear | rec601 | 0.63 | 0.31 | 0.60
R + G | bilinear | itu709 | 0.60 | 0.30 | 0.55
R + G | bilinear | itu2100 | 0.62 | 0.34 | 0.58
R + G | bilinear | mean | 0.89 | 0.57 | 0.76
R + G | bilinear | light | 1.03 | 0.66 | 0.88
R + G | smooth hue | rec601 | 0.67 | 0.39 | 0.59
R + G | smooth hue | itu709 | 0.63 | 0.41 | 0.55
R + G | smooth hue | itu2100 | 0.66 | 0.45 | 0.55
R + G | smooth hue | mean | 0.81 | 0.51 | 0.72
R + G | smooth hue | light | 1.00 | 0.66 | 0.82
R + G | median | rec601 | 0.63 | 0.31 | 0.60
R + G | median | itu709 | 0.60 | 0.30 | 0.55
R + G | median | itu2100 | 0.62 | 0.34 | 0.58
R + G | median | mean | 0.89 | 0.57 | 0.76
R + G | median | light | 1.03 | 0.66 | 0.88
R + G | gradient | rec601 | 0.65 | 0.37 | 0.53
R + G | gradient | itu709 | 0.68 | 0.39 | 0.56
R + G | gradient | itu2100 | 0.68 | 0.40 | 0.57
R + G | gradient | mean | 0.70 | 0.40 | 0.58
R + G | gradient | light | 0.91 | 0.50 | 0.76

References

  1. Schenk, T. Introduction to Photogrammetry. Ohio State Univ. Columb. 2005, 106, 1. [Google Scholar]
  2. Liu, Y.; Han, K.; Rasdorf, W. Assessment and Prediction of Impact of Flight Configuration Factors on UAS-Based Photogrammetric Survey Accuracy. Remote Sens. 2022, 14, 4119. [Google Scholar] [CrossRef]
  3. Stech, A.; Hudec, R.; Kamencay, P.; Polak, L.; Kufa, J. A Novel Method for 3D Photogrammetry Modeling Using Different Wavelengths. In Proceedings of the 2023 33rd International Conference Radioelektronika (RADIOELEKTRONIKA), Pardubice, Czech Republic, 19–20 April 2023; pp. 1–6. [Google Scholar]
  4. Wang, Y.; Yang, Z.; Kootstra, G.; Khan, H.A. The Impact of Variable Illumination on Vegetation Indices and Evaluation of Illumination Correction Methods on Chlorophyll Content Estimation Using UAV Imagery. Plant Methods 2023, 19, 51. [Google Scholar] [CrossRef]
  5. Sun, B.; Li, Y.; Huang, J.; Cao, Z.; Peng, X. Impacts of Variable Illumination and Image Background on Rice LAI Estimation Based on UAV RGB-Derived Color Indices. Appl. Sci. 2024, 14, 3214. [Google Scholar] [CrossRef]
  6. Roncella, R.; Bruno, N.; Diotri, F.; Thoeni, K.; Giacomini, A. Photogrammetric Digital Surface Model Reconstruction in Extreme Low-Light Environments. Remote Sens. 2021, 13, 1261. [Google Scholar] [CrossRef]
  7. Attard, L.; Debono, C.J.; Valentino, G.; Di Castro, M. Tunnel Inspection Using Photogrammetric Techniques and Image Processing: A Review. ISPRS J. Photogramm. Remote Sens. 2018, 144, 180–188. [Google Scholar] [CrossRef]
  8. Benton, D.J.; Chambers, A.J.; Raffaldi, M.J.; Finley, S.A.; Powers, M.J. Close-Range Photogrammetry in Underground Mining Ground Control. In Proceedings of the Remote Sensing System Engineering VI, San Diego, CA, USA, 28 August–1 September 2016; Ardanuy, P.E., Puschell, J.J., Eds.; SPIE: Bellingham, WA, USA, 2016; Volume 9977, p. 997707. [Google Scholar]
  9. Slaker, B.A.; Mohamed, K.M. A Practical Application of Photogrammetry to Performing Rib Characterization Measurements in an Underground Coal Mine Using a DSLR Camera. Int. J. Min. Sci. Technol. 2017, 27, 83–90. [Google Scholar] [CrossRef] [PubMed]
  10. Burdziakowski, P.; Bobkowska, K. UAV Photogrammetry under Poor Lighting Conditions—Accuracy Considerations. Sensors 2021, 21, 3531. [Google Scholar] [CrossRef] [PubMed]
  11. Bobkowska, K.; Burdziakowski, P.; Szulwic, J.; Zielinska-Dabkowska, K.M. Seven Different Lighting Conditions in Photogrammetric Studies of a 3D Urban Mock-Up. Energies 2021, 14, 8002. [Google Scholar] [CrossRef]
  12. Neale, W.T.; Hessel, D.; Terpstra, T. Photogrammetric Measurement Error Associated with Lens Distortion. In Proceedings of the SAE 2011 World Congress & Exhibition, Detroit, MI, USA, 12–14 April 2011; SAE International: Warrendale, PA, USA, 2011. [Google Scholar]
  13. Luhmann, T.; Fraser, C.; Maas, H.-G. Sensor Modelling and Camera Calibration for Close-Range Photogrammetry. ISPRS J. Photogramm. Remote Sens. 2016, 115, 37–46. [Google Scholar] [CrossRef]
  14. Li, W.; Klein, J. Multichannel Camera Calibration. In Proceedings of SPIE, Burlingame, CA, USA, 3–7 February 2013; Volume 8660, p. 866002. [Google Scholar]
  15. Matsuoka, R.; Asonuma, K.; Takahashi, G.; Danjo, T.; Hirana, K. Evaluation of Correction Methods of Chromatic Aberration in Digital Camera Images. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, I-3, 49–55. [Google Scholar] [CrossRef]
  16. Lichti, D.D.; Jarron, D.; Shahbazi, M.; Helmholz, P.; Radovanovic, R. Investigation into the Behaviour and Modelling of Chromatic Aberrations in Non-Metric Digital Cameras. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W18, 99–106. [Google Scholar] [CrossRef]
  17. Kaufmann, V.; Ladstädter, R. Elimination of Color Fringes in Digital Photographs Caused by Lateral Chromatic Aberration. In Proceedings of the CIPA XX International Symposium, Turin, Italy, 26 September–1 October 2005; p. 25. [Google Scholar]
  18. Luhmann, T.; Hastedt, H.; Tecklenburg, W. Modelling of Chromatic Aberration for High Precision Photogrammetry. In Proceedings of the Commission V Symp. on Image Engineering and Vision Metrology, Proc. ISPRS, Dresden, Germany, 25–27 September 2006; Volume 36, pp. 173–178. [Google Scholar]
  19. Fryskowska, A.; Kedzierski, M.; Wojtkowska, M.; Grochala, A. A Novel Method of Chromatic Aberration Detection and Correction Using Wavelet Analysis. In Proceedings of the 2017 Baltic Geodetic Congress (BGC Geomatics), Gdansk, Poland, 22–25 June 2017; pp. 18–24. [Google Scholar]
  20. Rudakova, V.; Monasse, P. Precise Correction of Lateral Chromatic Aberration in Images. In Proceedings of Image and Video Technology; Klette, R., Rivera, M., Satoh, S., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 12–22. [Google Scholar]
  21. Helmholz, P.; Lichti, D.D. Investigation of Chromatic Aberration and Its Influence on the Processing of Underwater Imagery. Remote Sens. 2020, 12, 3002. [Google Scholar] [CrossRef]
  22. Helmholz, P.; Lichti, D.D. Assessment of Chromatic Aberrations for GoPro 3 Cameras in Underwater Environments. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, IV-2/W5, 575–582. [Google Scholar] [CrossRef]
  23. Menna, F.; Nocerino, E.; Remondino, F. Optical Aberrations in Underwater Photogrammetry with Flat and Hemispherical Dome Ports. In Proceedings of the Videometrics, Range Imaging, and Applications XIV, Munich, Germany, 26–27 June 2017; Remondino, F., Shortis, M.R., Eds.; SPIE: Bellingham, WA, USA, 2017; Volume 10332, p. 1033205. [Google Scholar]
  24. Cardaci, A.; Azzola, P.; Bianchessi, M.; Folli, R.; Rapelli, S. Comparative Analysis Among Photogrammetric 3D Models RAW Data vs RGB Images. In Proceedings of the Geomatics and Geospatial Technologies, Genoa, Italy, 1–2 July 2021; Enrico, B.-M., Zamperlin, P., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 271–282. [Google Scholar]
  25. Luhmann, T.; Robson, S.; Kyle, S.; Boehm, J. Close-Range Photogrammetry and 3D Imaging; Walter de Gruyter: Berlin, Germany, 2014. [Google Scholar]
  26. Bian, L.; Wang, Y.; Zhang, J. Generalized MSFA Engineering With Structural and Adaptive Nonlocal Demosaicing. IEEE Trans. Image Process. 2021, 30, 7867–7877. [Google Scholar] [CrossRef] [PubMed]
  27. Reznicek, J.; Luhmann, T.; Jepping, C. Influence of Raw Image Preprocessing and Other Selected Processes on Accuracy of Close-Range Photogrammetric Systems According to VDI 2634. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B5, 107–113. [Google Scholar] [CrossRef]
  28. Papanikolaou, A.; Garbat, P.; Kujawinska, M. Metrological Evaluation of the Demosaicking Effect on Colour Digital Image Correlation with Application in Monitoring of Paintings. Sensors 2022, 22, 7359. [Google Scholar] [CrossRef] [PubMed]
  29. Shortis, M.R.; Seager, J.W.; Harvey, E.S.; Robson, S. Influence of Bayer Filters on the Quality of Photogrammetric Measurement. In Proceedings of the Videometrics VIII, San Jose, CA, USA, 18–20 January 2005; Beraldin, J.-A., El-Hakim, S.F., Gruen, A., Walton, J.S., Eds.; SPIE: Bellingham, WA, USA, 2005; Volume 5665, p. 56650H. [Google Scholar]
  30. Stamatopoulos, C.; Fraser, C.S.; Cronk, S. Accuracy Aspects of Utilizing Raw Imagery in Photogrammetric Measurement. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B5, 387–392. [Google Scholar] [CrossRef]
  31. Guo, Y.; Zhang, X.; Jin, G. Towards a Novel Generative Adversarial Network-Based Framework for Remote Sensing Image Demosaicking. Remote Sens. 2024, 16, 2283. [Google Scholar] [CrossRef]
  32. Heinze, T.; von Löwis, M.; Polze, A. Joint Multi-Frame Demosaicing and Super-Resolution with Artificial Neural Networks. In Proceedings of the 2012 19th International Conference on Systems, Signals and Image Processing (IWSSIP), Vienna, Austria, 11–13 April 2012; pp. 540–543. [Google Scholar]
  33. Park, Y.; Lee, S.; Jeong, B.; Yoon, J. Joint Demosaicing and Denoising Based on a Variational Deep Image Prior Neural Network. Sensors 2020, 20, 2970. [Google Scholar] [CrossRef]
  34. Ma, X.; Tan, M.; Zhang, S.; Liu, S.; Sun, J.; Han, Y.; Li, Q.; Yang, Y. A Snapshot Near-Infrared Hyperspectral Demosaicing Method with Convolutional Neural Networks in Low Illumination Environment. Infrared Phys. Technol. 2023, 129, 104510. [Google Scholar] [CrossRef]
  35. Saravanan, C. Color Image to Grayscale Image Conversion. In Proceedings of the 2010 Second International Conference on Computer Engineering and Applications, Bali, Indonesia, 19–21 March 2010; Volume 2, pp. 196–199. [Google Scholar]
  36. Rasche, K.; Geist, R.; Westall, J. Re-Coloring Images for Gamuts of Lower Dimension. Comput. Graph. Forum. 2005, 24, 423–432. [Google Scholar] [CrossRef]
  37. Neumann, L.; Cadik, M.; Nemcsics, A. An Efficient Perception-Based Adaptive Color to Gray Transformation. In Proceedings of the Computational Aesthetics in Graphics, Visualization, and Imaging, Alberta, Canada, 28 May–2 June 2006; Cunningham, D.W., Meyer, G., Neumann, L., Eds.; The Eurographics Association: Goslar, Germany, 2007. [Google Scholar]
  38. Bala, R.; Eschbach, R. Spatial Color-to-Grayscale Transform Preserving Chrominance Edge Information. Color Imaging Conf. 2004, 12, 82. [Google Scholar] [CrossRef]
  39. Wan, Y.; Xie, Q. A Novel Framework for Optimal RGB to Grayscale Image Conversion. In Proceedings of the 2016 8th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 27–28 August 2016; Volume 2, pp. 345–348. [Google Scholar]
40. International Telecommunication Union. Parameter Values for the HDTV Standards for Production and International Programme Exchange. In Recommendation ITU-R BT.709-6; International Telecommunication Union: Geneva, Switzerland, 2015. [Google Scholar]
41. International Telecommunication Union. Studio Encoding Parameters of Digital Television for Standard 4:3 and Wide-Screen 16:9 Aspect Ratios. In Recommendation ITU-R BT.601-7; International Telecommunication Union: Geneva, Switzerland, 2011. [Google Scholar]
42. International Telecommunication Union. Image Parameter Values for High Dynamic Range Television for Use in Production and International Programme Exchange. In Recommendation ITU-R BT.2100-2; International Telecommunication Union: Geneva, Switzerland, 2018. [Google Scholar]
  43. Harris, C.G.; Stephens, M.J. A Combined Corner and Edge Detector. In Proceedings of the Alvey Vision Conference, Manchester, UK, 31 August–2 September 1988. [Google Scholar]
  44. Popescu, A.C.; Farid, H. Exposing Digital Forgeries in Color Filter Array Interpolated Images. IEEE Trans. Signal Process. 2005, 53, 3948–3959. [Google Scholar] [CrossRef]
  45. Zhang, R.; Isola, P.; Efros, A.A.; Shechtman, E.; Wang, O. The Unreasonable Effectiveness of Deep Features as a Perceptual Metric. In Proceedings of the 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 586–595. [Google Scholar]
  46. Bouguet, J.-Y. Camera Calibration Toolbox for Matlab (1.0). CaltechDATA 2022. [Google Scholar] [CrossRef]
  47. Zhang, Y.-J. Camera Calibration. In 3-D Computer Vision: Principles, Algorithms and Applications; Zhang, Y.-J., Ed.; Springer Nature: Singapore, 2023; pp. 37–65. ISBN 978-981-19-7580-6. [Google Scholar]
48. Wang, Q.; Fu, L.; Liu, Z. Review on Camera Calibration. In Proceedings of the 2010 Chinese Control and Decision Conference, Xuzhou, China, 26–28 May 2010; pp. 3354–3358. [Google Scholar]
49. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  50. Zhang, Z. Flexible Camera Calibration By Viewing a Plane From Unknown Orientations. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; Volume 1, pp. 666–673. [Google Scholar]
Figure 1. RGGB-type colour filter array (CFA).
Figure 2. (a) Scheme of the testing station, (b) example image.
Figure 3. (a) Positions of the features on the image matrix; (b) close-up of a single feature: the red star marks the average position calculated from 10 images, and the blue dots mark the feature positions on individual images.
Figure 4. Feature position by light spectrum (the central mark on each box indicates the median, and the bottom and top edges of the box indicate the 25th and 75th percentiles, respectively; the whiskers extend to the most extreme data points not considered outliers, and outliers are plotted individually with a dot marker).
Figure 5. Feature position by light spectrum and demosaicing method.
Figure 6. Feature position by light spectrum and grey calculation method.
Table 1. Details of lighting settings.
| Setting Name | Spectrum | CIE 1931 | CCT (K) | Illuminance (lx) | CRI |
|---|---|---|---|---|---|
| Tungsten (3200 K) | (spectrum plot) | (chromaticity plot) | 3135 | 326.6 | 95.6 |
| Modelling Lamp (3500 K) | (spectrum plot) | (chromaticity plot) | 3370 | 378.4 | 95.1 |
| White Halogen (4300 K) | (spectrum plot) | (chromaticity plot) | 3985 | 501.9 | 95.0 |
| Horizon daylight (5000 K) | (spectrum plot) | (chromaticity plot) | 4616 | 388.7 | 96.0 |
| HMI5600 (5600 K) | (spectrum plot) | (chromaticity plot) | 5180 | 368.7 | 96.6 |
| Blue | (spectrum plot) | (chromaticity plot) | 0 | 173.7 | 0.0 |
| Green | (spectrum plot) | (chromaticity plot) | 7845 | 327.4 | 0.0 |
| Red | (spectrum plot) | (chromaticity plot) | 0 | 88.0 | 0.0 |
| Blue Green Red | (spectrum plot) | (chromaticity plot) | 8459 | 286.5 | 72.7 |
| Blue Green | (spectrum plot) | (chromaticity plot) | 16,642 | 293.0 | 29.2 |
| Blue Red | (spectrum plot) | (chromaticity plot) | 3773 | 79.9 | 0.0 |
| Green Red | (spectrum plot) | (chromaticity plot) | 5016 | 319.7 | 50.8 |
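The CCT values in Table 1 are derived from the CIE 1931 chromaticity coordinates plotted in the third column. A common closed-form estimate is McCamy's cubic approximation; the sketch below is illustrative only (the paper does not state which formula its measurement software applies), and it is meaningful only for chromaticities near the Planckian locus, which is why the near-monochromatic Blue and Red settings report no usable CCT.

```python
def mccamy_cct(x: float, y: float) -> float:
    """Estimate correlated colour temperature (K) from CIE 1931 (x, y)
    chromaticity with McCamy's cubic approximation. Reliable only for
    chromaticities close to the Planckian locus."""
    n = (x - 0.3320) / (0.1858 - y)  # inverse slope relative to the epicentre
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Example: the D65 white point, which should come out near 6500 K.
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505
```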
Table 2. Overview of the algorithm parameters used.
| Light Spectrum Setting | Interpolation Method [44,45] | RGB to Greyscale | Detector |
|---|---|---|---|
| Tungsten (3200 K) | neighbour | rec601 (Equation (1)) | Harris [43] |
| Modelling Lamp (3500 K) | bilinear | itu709 (Equation (2)) | |
| White Halogen (4300 K) | smooth_hue | itu2100 (Equation (3)) | |
| Horizon daylight (5000 K) | median | mean (Equation (4)) | |
| HMI5600 (5600 K) | gradient | light (Equation (5)) | |
| Blue | | | |
| Green | | | |
| Red | | | |
| Red + Green + Blue | | | |
| Green + Blue | | | |
| Red + Blue | | | |
| Red + Green | | | |
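Equations (1)–(5) named in Table 2 are given in the body of the paper; for orientation, rec601, itu709, and itu2100 are the standard luma weightings of the cited ITU recommendations, while mean and light are simple channel combinations. Below is a minimal sketch of the greyscale step followed by Harris corner detection; the function names and detector parameters are illustrative assumptions, not the author's code.

```python
import numpy as np
import cv2

def to_greyscale(rgb: np.ndarray, method: str = "rec601") -> np.ndarray:
    """Convert an H x W x 3 RGB image (float, in [0, 1]) to greyscale
    using the weightings named in Table 2."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    if method == "rec601":    # ITU-R BT.601 luma
        return 0.299 * r + 0.587 * g + 0.114 * b
    if method == "itu709":    # ITU-R BT.709 luma
        return 0.2126 * r + 0.7152 * g + 0.0722 * b
    if method == "itu2100":   # ITU-R BT.2100 luma
        return 0.2627 * r + 0.6780 * g + 0.0593 * b
    if method == "mean":      # unweighted channel average
        return (r + g + b) / 3.0
    if method == "light":     # lightness: midpoint of channel extremes
        return (rgb.max(axis=-1) + rgb.min(axis=-1)) / 2.0
    raise ValueError(f"unknown method: {method}")

# Harris corner response on the greyscale image (detector column of Table 2);
# blockSize, ksize, and k are typical defaults, not the study's settings.
rgb = np.random.rand(480, 640, 3)             # stand-in for a demosaiced image
grey = to_greyscale(rgb, "itu709").astype(np.float32)
response = cv2.cornerHarris(grey, blockSize=2, ksize=3, k=0.04)
features = np.argwhere(response > 0.01 * response.max())
```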
Table 3. Position of features by light spectrum. Values in pixels.
| Light Spectrum Setting | Mean | Standard Deviation | Median |
|---|---|---|---|
| Tungsten (3200 K) | 0.92 | 0.46 | 0.82 |
| Modelling Lamp (3500 K) | 0.74 | 0.50 | 0.60 |
| White Halogen (4300 K) | 0.65 | 0.50 | 0.51 |
| Horizon daylight (5000 K) | 0.62 | 0.55 | 0.46 |
| HMI5600 (5600 K) | 0.61 | 0.57 | 0.44 |
| Blue | 0.94 | 1.04 | 0.67 |
| Green | 0.52 | 0.45 | 0.40 |
| Red | 1.43 | 0.98 | 1.23 |
| Red + Green + Blue | 0.75 | 0.60 | 0.58 |
| Green + Blue | 0.79 | 0.58 | 0.65 |
| Red + Blue | 1.08 | 0.83 | 0.88 |
| Red + Green | 0.84 | 0.56 | 0.68 |
Table 4. Position of the features by the demosaicing method. Values in pixels.
| Demosaicing Method | Mean | Standard Deviation | Median |
|---|---|---|---|
| neighbour | 1.18 | 0.88 | 0.97 |
| bilinear | 0.74 | 0.62 | 0.60 |
| smooth hue | 0.74 | 0.65 | 0.56 |
| median | 0.74 | 0.62 | 0.60 |
| gradient | 0.72 | 0.62 | 0.57 |
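Among the interpolation methods compared above, bilinear is the simplest to state: each missing colour sample is the average of the nearest recorded samples of the same colour on the Bayer grid, which can be written as three small convolutions. The sketch below, for the RGGB layout of Figure 1, is an illustrative implementation and not the code used in the study (which follows [44,45]).

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear_rggb(mosaic: np.ndarray) -> np.ndarray:
    """Bilinear demosaicing of an H x W RGGB Bayer mosaic (float).
    Returns an H x W x 3 RGB image."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0  # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0  # B at odd rows/cols
    g_mask = 1.0 - r_mask - b_mask                       # G on the quincunx

    # Kernels that average the nearest same-colour neighbours; at pixels
    # where the colour was recorded, they pass the value through unchanged.
    k_rb = np.array([[0.25, 0.5, 0.25],
                     [0.50, 1.0, 0.50],
                     [0.25, 0.5, 0.25]])
    k_g = np.array([[0.00, 0.25, 0.00],
                    [0.25, 1.00, 0.25],
                    [0.00, 0.25, 0.00]])

    r = convolve(mosaic * r_mask, k_rb, mode="mirror")
    g = convolve(mosaic * g_mask, k_g, mode="mirror")
    b = convolve(mosaic * b_mask, k_rb, mode="mirror")
    return np.stack([r, g, b], axis=-1)
```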
Table 5. Feature position by grey calculation method. Values in pixels.
| Grey Method | Mean | Standard Deviation | Median |
|---|---|---|---|
| rec601 | 0.82 | 0.69 | 0.62 |
| itu709 | 0.80 | 0.78 | 0.59 |
| itu2100 | 0.82 | 0.76 | 0.61 |
| mean | 0.80 | 0.61 | 0.63 |
| light | 0.89 | 0.68 | 0.71 |
Table 6. Summary of the 10 lowest median results for cases grouped by spectrum, demosaicing method, and greyscale calculation method (see Appendix A for complete table). Values in pixels.
| Light Spectrum Setting | Demosaicing Method | Grey Method | Mean | Standard Deviation | Median |
|---|---|---|---|---|---|
| HMI—5600 K | smooth hue | mean | 0.43 | 0.54 | 0.26 |
| HMI—5600 K | gradient | mean | 0.37 | 0.29 | 0.28 |
| Daylight—5000 K | smooth hue | mean | 0.44 | 0.47 | 0.29 |
| HMI—5600 K | smooth hue | rec601 | 0.47 | 0.54 | 0.30 |
| Daylight—5000 K | gradient | mean | 0.39 | 0.33 | 0.31 |
| HMI—5600 K | gradient | rec601 | 0.45 | 0.36 | 0.32 |
| HMI—5600 K | smooth hue | light | 0.52 | 0.58 | 0.33 |
| Green | bilinear | itu2100 | 0.39 | 0.29 | 0.33 |
| Green | median | itu2100 | 0.39 | 0.29 | 0.33 |
| Green | smooth hue | itu2100 | 0.39 | 0.31 | 0.33 |
| Green | bilinear | itu709 | 0.39 | 0.30 | 0.33 |
| Green | median | itu709 | 0.39 | 0.30 | 0.33 |
| Green | smooth hue | itu709 | 0.40 | 0.31 | 0.33 |