1. Introduction
Under natural conditions, the ground illuminance produced by direct and scattered sunlight during the day is approximately 10,000 lux. At night, the illuminance on the ground under a full moon at the zenith is around 0.1 lux, while on a clear night with only faint starlight it is approximately 0.01 lux [
1,
2]. Environments with ground illuminance below 0.1 lux are generally referred to as low-light environments. Information acquisition in low-light environments is of significant importance in both military and civilian applications [
3]. The experience of several contemporary regional wars bears out that large-scale attacks and critical battles often occur at night, and obtaining comprehensive battlefield intelligence around the clock is a key factor for success in modern warfare. To adapt to the unpredictability of war and increase the temporal and spatial coverage of operations, night vision technology has become essential. Aerial reconnaissance night vision technology currently falls into two main categories: low-light imaging and infrared thermal imaging. Both offer high concealment during reconnaissance and can image around the clock. Compared with infrared thermal imaging, low-light imaging can achieve results close to daylight quality at night and offers the advantage of high resolution. As a result, it has received increasing attention in recent years [
4].
Traditional low-light night vision systems generally provide monochromatic grayscale images, in which each pixel carries only a single intensity value. The human eye is less sensitive to monochromatic images, so targets that require color differentiation can easily be overlooked, which directly affects target detection and identification in military reconnaissance. Color information can effectively improve perceived image contrast, because the human eye exhibits higher sensitivity, target recognition rates, and reaction speeds for color images. Color information in low-light environments can be divided into pseudo-color and true color [
5,
6]. Pseudo-color imaging mainly relies on fusing images from different wavelength bands, such as fusing low-light images with thermal infrared images or with near-infrared spectral images. However, such fused images cannot accurately reflect the true color of objects and may hinder target interpretation. Low-light true-color imaging technology uses filters to spectrally divide visible light at night and then employs high-sensitivity detectors to capture images in the different wavelength bands. These images are subsequently fused to obtain low-light night vision images whose colors are similar or close to those observed under daylight conditions. True-color low-light images can effectively enhance the richness and depth of information acquired at night, increasing the reliability of target interpretation [
7,
8].
To obtain large-swath-width, high-resolution true-color images in low-light environments, this study focuses on a low-light true-color system based on an RGB filter wheel. The basic approach uses a large-aperture telephoto lens and multi-line sequential exposure through an RGB filter wheel to separate the light reflected from the target into three spectral bands, which are then synthesized in real time into true-color images. In addition, to achieve a large coverage area in a single image, optical stitching of multiple detector fields of view is investigated. The performance of the developed low-light true-color imaging system is validated, showing that all system indicators meet the requirements. The low-light imaging quality is excellent, with vibrant, high-fidelity colors, indicating that the system can meet the demands of airborne low-light true-color imaging.
3. RGB Filter Wheel Design
The reflected natural light from the target is filtered through the RGB color filters to obtain RGB single-channel images, which can be further fused to approximate a natural-looking color image. The R, G, and B color filters are installed on the filter wheel and rotated sequentially around its axis to switch between filters, thereby capturing the target’s RGB color images [
17,
18]. The selected wavelength bands and transmittance of each color filter are as follows:
R band: 620–680 nm, transmittance 96%, other wavelengths blocked.
G band: 500–540 nm, transmittance 96%, other wavelengths blocked.
B band: 440–480 nm, transmittance 96%, other wavelengths blocked.
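As a quick sanity check on the channel design, the three passbands can be encoded and verified to be mutually non-overlapping, so that each channel sees a disjoint slice of the visible spectrum. This is a minimal sketch; the data structure and helper name are ours, not from the paper:

```python
# RGB filter-wheel passbands from the design (nm); transmittance is the
# in-band value, and all other wavelengths are blocked.
BANDS = {
    "R": (620, 680),
    "G": (500, 540),
    "B": (440, 480),
}
TRANSMITTANCE = 0.96

def bands_overlap(bands):
    """Return True if any two passbands overlap in wavelength."""
    edges = sorted(bands.values())  # sort by lower band edge
    return any(hi > lo for (_, hi), (lo, _) in zip(edges, edges[1:]))

print(bands_overlap(BANDS))  # → False: guard bands separate all channels
```

The gaps between bands (480–500 nm and 540–620 nm) act as guard bands, which simplifies the blocking requirement on each filter.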
The selection of these RGB filter wavelengths not only considers the requirements for low-light true-color imaging but also extends the application of low-light night vision in the field of low-light remote sensing for object discrimination [
18].
The RGB filter wheel mainly consists of a support bracket, three color filters, bearings, drive motors, encoders, and Hall effect switches. Its three-dimensional model is shown in
Figure 6. According to the requirements of the imaging system’s spectral channels, the filter wheel contains three filters and is located between the optical objective group and the imaging components. The bearings are preloaded and the shaft holes are matched during assembly to restrict radial movement of the filter wheel, ensuring that the rotational tilt angle of the filter wheel meets the requirements.
A Hall effect switch is mounted at the starting position of each of the three filters on the wheel. When a filter reaches the aperture position, the switch triggers the exposure and provides a synchronous exposure signal to the image sensor. The operational mode is illustrated in
Figure 7.
For every rotation of the filter wheel, the detector undergoes three exposures, obtaining RGB color images that are merged into a single frame. The maximum frame rate of the detector is 25 frames per second, and the output frame rate of the color image is 8 frames per second. Therefore, the rotation speed (
n) of the filter wheel is calculated as:
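A plausible reconstruction of this expression, assuming the wheel completes exactly one revolution per fused color frame at the 8 fps output rate, is:

```latex
n = f_{\text{color}} = 8\ \text{r/s} = 480\ \text{r/min}
```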
To ensure that each color channel receives its full exposure as the filter wheel rotates, the dwell time of the aperture within a given color filter segment must exceed the exposure time, and the filter must be large enough to cover the aperture throughout the exposure. The calculation model is shown in
Figure 8.
Based on the previous analysis, with a maximum exposure time of 22.5 ms for the detector, the angle (
θ) through which the filter wheel rotates during the exposure time is:
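Assuming the rotation speed n = 8 r/s implied by the 8 fps color output, the swept angle follows as:

```latex
\theta = 360^{\circ}\, n\, t_{\exp}
       = 360^{\circ} \times 8\ \text{s}^{-1} \times 0.0225\ \text{s}
       = 64.8^{\circ}
```

Since 64.8° is less than the 120° sector occupied by each filter, the exposure can fit within a single color segment.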
The maximum radius
OB of the filter wheel is given by:
where
OO1 is the distance between the optical axis and the rotational center of the filter wheel and
d is the maximum radius of the aperture, which is limited by the exit pupil diameter of the optical system.
According to geometric relationships, the arc length swept by the aperture within the exposure time must be smaller than the arc length of a 120° arc with a radius of
OO1, i.e.:
where
θ1 is the angle subtended by the aperture at the rotational center of the filter wheel and
b is the half-width of the filter wheel bracket.
θ1 can be determined from the following equation:
By solving Equations (5)–(7) simultaneously, the maximum size of the filter can be obtained.
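The simultaneous solution can be sketched numerically. The values below are illustrative assumptions (not the paper's design data), and the constraint used — the sweep angle during exposure plus the aperture's angular width must fit within one 120° sector — is our reading of the geometry:

```python
from math import sin, radians

# Illustrative values (not from the paper): aperture radius d, and the
# angle theta swept by the wheel during the 22.5 ms exposure at 8 r/s.
d = 20.0        # mm, maximum aperture radius (assumed)
theta = 64.8    # deg, wheel rotation during one exposure
sector = 120.0  # deg, angular width of one color filter segment

# Constraint: the aperture must stay inside one filter segment for the
# whole exposure, i.e. theta + 2*theta1 <= sector, with
# sin(theta1) = d / OO1 (theta1 = half-angle subtended by the aperture).
theta1_max = (sector - theta) / 2.0          # deg
OO1_min = d / sin(radians(theta1_max))       # mm, min axis-to-center distance
OB_min = OO1_min + d                         # mm, min filter-wheel radius

print(f"theta1 <= {theta1_max:.1f} deg")
print(f"OO1 >= {OO1_min:.1f} mm, OB >= {OB_min:.1f} mm")
```

With these assumed numbers, the aperture may subtend at most 27.6° as a half-angle, which sets a lower bound on both the axis-to-center distance OO1 and the wheel radius OB.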
4. Optical Stitching of the Reflective Prism
A large format is crucial for the low-light imaging system to ensure a wide swath width and high resolution. Four reflective prisms are arranged like the four quadrants of a larger square. This arrangement divides the imaging beam into four parts, which are captured by four CMOS detectors positioned around the reflective prisms, as shown in the schematic diagram of the equivalent image plane in
Figure 9.
The direct stitching of the detector chips is challenging, and it is almost impossible to achieve seamless integration. Therefore, the four detectors need to be dispersed in three-dimensional space to effectively utilize the image plane. A four-panel reflective prism is employed to implement the strategy of dividing the light paths into four channels. The reflective surfaces of the prisms are inclined at 45° to the optical axis. The four light paths pass through the reflective prism assembly and project onto four groups of high-sensitivity detector components in four directions in three-dimensional space [
19,
20,
21]. The working principle of the reflective prism assembly is illustrated in
Figure 10.
The size and height of the reflective prism vary with its position in the optical path. The relationship between the prism size and the optical path is depicted in
Figure 11. The edge of the 45° reflective prism nearer the exit pupil is longer than the edge farthest from it, so the effective reflective surface is trapezoidal [
22,
23].
A calculation model is established to determine the relationship between the edge rays of the field of view and the size of the reflective prism, as shown in
Figure 12.
Here,
R is the optical exit pupil radius,
D is the size of a single detector,
L is the back-working distance of the optics, and
L1 is the distance between the placement of the reflective prism and the detector. Based on geometric relationships, the length of the reflective prism in the
Y direction is given by:
The
Y dimension of the reflective prism determines the relative positions of the two edges
F1F4 and
F2F3, and their different positions will result in different widths in the
X direction. According to geometric relationships, the lengths of the reflective prism in the
X direction are:
For ease of engineering implementation, the projected surfaces of the reflective prism along the optical axis direction are designed as rectangles, with dimensions that cover the theoretically calculated maximum size. Moreover, the design of the reflective prism incorporates a 3% overlap between the stitched fields of view.
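The trapezoidal shape can be illustrated with a simple beam-envelope model: the converging beam narrows linearly from the exit-pupil radius toward the field extent at the image plane, so the prism edge nearer the pupil must be longer. All numbers below are assumed for illustration, not the paper's design data, and the linear-envelope model is a simplification:

```python
# Assumed (illustrative) optical parameters, matching the symbols in the text.
R  = 30.0    # mm, exit-pupil radius
D  = 20.0    # mm, size of a single detector
L  = 120.0   # mm, back working distance (exit pupil to image plane)
L1 = 40.0    # mm, distance from the prism's rear face to the detector

def envelope(z):
    """Beam-envelope half-width at axial distance z behind the exit pupil,
    blending linearly from pupil radius R to field half-extent D/2."""
    return R * (1 - z / L) + (D / 2) * (z / L)

z_rear  = L - L1   # 80 mm: prism edge nearer the image plane
z_front = 50.0     # mm: prism edge nearer the exit pupil (assumed position)

w_front = envelope(z_front)   # half-width at the longer (near-pupil) edge
w_rear  = envelope(z_rear)    # half-width at the shorter (far) edge
print(f"near edge: {2 * w_front:.1f} mm, far edge: {2 * w_rear:.1f} mm")
```

The front edge comes out wider than the rear edge, reproducing the trapezoidal effective reflective surface described above; the rectangular envelope used in the actual design simply circumscribes this trapezoid plus the 3% field overlap.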
6. Experimental Verification and Results
To verify the effectiveness of the RGB filter wheel in fusing low-light images into true-color images, experiments were conducted using a large-aperture short-focus lens matched with a single high-sensitivity detector. The filter wheel and the experimental prototype are shown in
Figure 14.
The experiments were conducted in a darkroom with environmental illuminance adjustable from 0.01 to 10,000 lux, using color targets. The acquired monochrome images and the fused true-color images are shown in
Table 4. Even in low-light conditions of 0.01 lux, the RGB channel images obtained with the filter wheel maintain high resolution and contrast; at higher illuminance levels, the images are clear with sharp detail. After fusion, the color information of the targets is restored, yielding clear, high-resolution, richly saturated true-color images. This demonstrates that the low-light imaging system based on the RGB filter wheel has excellent low-light imaging capability and a high dynamic range.
Based on the research and design results above, the entire system was manufactured and assembled. The stitched fields of view were aligned using collimators and resolution targets. The adjustment was carried out at the site shown in
Figure 15. The overlap ratio was controlled within 3% for each stitched field. Image stitching software was used to ensure a stitching accuracy of within three pixels between adjacent fields.
Furthermore, field experiments were conducted to evaluate the imaging capability of the low-light imaging system. The tests took place under overcast, moonless conditions on a coastal beach, away from urban artificial light. The ground illuminance, measured with a high-precision illuminance meter, was 0.01 lux. A comparison was made between images acquired by a conventional camera and by the low-light imaging system, as shown in
Figure 16.
To further validate the performance of the integrated large-format low-light imaging system, aerial dynamic tests were conducted in December 2021 at 5:30 a.m. at Neifu Airport in Shaanxi Province, China, with ground illuminance ranging from 1 to 10 lux.
Figure 17 shows a single frame captured during the aerial tests. The results demonstrate that the low-light imaging system produces high-contrast, high-definition images in low-light airborne environments while providing RGB true-color output. The fields of view in the aerial images are stitched with high accuracy, and each image covers a large area, with buildings, farmland, and trees clearly visible and sharply delineated. The calculated ground resolution reached 0.2 m per pixel at an altitude of 2 km, indicating the system’s excellent long-range, large-format imaging capability. The flight tests also verified the stability and reliability of the system’s airborne imaging.
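The reported 0.2 m per pixel at 2 km altitude is consistent with the standard ground-sample-distance relation GSD = p·H/f. The pixel pitch and focal length below are illustrative assumptions chosen only to show the relation, not the system's actual parameters:

```python
# Ground sample distance check: GSD = pixel_pitch * altitude / focal_length.
p = 5.5e-6    # m, detector pixel pitch (assumed for illustration)
f = 0.055     # m, lens focal length (assumed for illustration)
H = 2000.0    # m, flight altitude (from the flight test)

gsd = p * H / f
print(f"GSD = {gsd:.2f} m/pixel")  # → GSD = 0.20 m/pixel
```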
7. Conclusions
This study designed, developed, and tested a large-format low-light true-color imaging system for airborne platforms. To capture color information under extremely low-light conditions, a technique combining an RGB filter wheel, high-sensitivity CMOS detectors, and multi-line sequential exposure was proposed to achieve the fusion of true-color low-light images. An SNR calculation model for the low-light imaging system was established, and its low-light imaging capability was thoroughly evaluated. The relationship between the filter wheel’s rotation speed, sector size, and exposure strategy was determined, and the filter wheel parameters were calculated. The calculation method for the structural parameters of the four-panel reflective prism was studied, and mathematical expressions for its geometric parameters were provided. Based on these results, the design of the large-format low-light imaging system based on the RGB filter wheel was realized. Laboratory and field experiments confirmed that the system produces high-contrast true-color images in low-light conditions of 0.01 lux, and all performance indicators meet the design requirements. The research findings and design methodology presented in this study have reference value for the design of airborne low-light imaging equipment.