Article

Design and Experiment of High-Resolution Multispectral Polarization Imaging System

1 National and Local Joint Engineering Research Center of Space Optoelectronics Technology, Changchun University of Science and Technology, Changchun 130022, China
2 College of Opto-Electronic Engineering, Changchun University of Science and Technology, Changchun 130022, China
3 College of Computer Science and Technology, Changchun University of Science and Technology, Changchun 130022, China
4 College of Electronic Information Engineering, Changchun University of Science and Technology, Changchun 130022, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(21), 10712; https://doi.org/10.3390/app122110712
Submission received: 27 September 2022 / Revised: 13 October 2022 / Accepted: 20 October 2022 / Published: 22 October 2022

Abstract

This paper addresses target detection in complex backgrounds. A mathematical and physical model of the polarization of the basic target and background components is constructed through simulation and modeling, and an image fusion experiment is carried out based on the two-dimensional wavelet transform. A high-resolution polarization imaging instrument was developed, and static experiments were carried out with polarization, infrared, and visible cameras. The results show that using polarization imaging detection technology to detect physical evidence targets in complex backgrounds can effectively improve target contrast, enrich the target information, improve image quality, and increase target detection accuracy.

1. Introduction

As the demand for target acquisition continues to grow, higher requirements are placed on the capability and accuracy of target identification and information acquisition. Compared with traditional optical detection methods, multidimensional composite detection technology not only allows information such as the target intensity distribution and physical and chemical properties to be acquired simultaneously but also greatly increases the total amount of information obtained. At the same time, it improves the accuracy of target detection and identification, which is particularly significant in complex environments, and has therefore been widely used in many fields.
To better identify targets in complex background environments, we have developed a high-resolution multispectral polarimetric imaging system that achieves accurate target identification by exploiting the polarization contrast between targets and backgrounds; it can play an important role in public security surveillance, target identification, and the early warning and detection of military targets.

2. Study of the Polarization Characteristics of Targets and the Polarization Difference between Target and Background

At present, the Torrance-Sparrow (TS) model based on the BRDF is mainly used to model the polarization characteristics of the rough metallic and nonmetallic surfaces of typical physical evidence targets. The model assumes that the target surface is composed of a large number of mirror-like microfacets and that the reflection of light incident on the material is the superposition of the specular reflection components $dL_{r,s}$ and diffuse reflection components $dL_{r,d}$ of all facets:
$$dL_r = dL_{r,s} + dL_{r,d}$$
The specular reflection component $dL_{r,s}$ of a single facet is calculated from the Fresnel formulas. The facet orientations are randomly distributed, and the angle between a facet normal and the macroscopic surface normal follows a Gaussian distribution:
$$P(\theta) = \frac{\exp\!\left(-\dfrac{\tan^2\theta}{2\sigma^2}\right)}{2\pi\sigma^2\cos^3\theta}$$
where $\theta$ is the angle between the microfacet normal and the macroscopic surface normal, and $\sigma$ is the surface roughness.
The diffuse reflection component is given by:
$$dL_{r,d} = a L_i \cos\theta_i$$
where $a$ is the diffuse reflection coefficient, obtained by experimental measurement [1,2,3], $L_i$ is the radiance of the incident light, and $\theta_i$ is the angle between the incident light and the macroscopic normal.
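As a minimal sketch (not the authors' code) of the quantities defined above, the following evaluates the Gaussian facet-angle distribution $P(\theta)$ and the Lambertian diffuse term; the roughness and diffuse coefficient are assumed example values.

```python
# Sketch of the TS-model ingredients above: the Gaussian microfacet angle
# distribution P(theta) and the diffuse term a * L_i * cos(theta_i).
# sigma and a are assumed example values, not measured ones.
import numpy as np

def facet_distribution(theta, sigma):
    """P(theta): density of the angle between a microfacet normal and the
    macroscopic surface normal, for surface roughness sigma (radians)."""
    return np.exp(-np.tan(theta) ** 2 / (2.0 * sigma ** 2)) / (
        2.0 * np.pi * sigma ** 2 * np.cos(theta) ** 3
    )

def diffuse_radiance(l_i, theta_i, a):
    """dL_{r,d} = a * L_i * cos(theta_i): the unpolarized Lambertian term."""
    return a * l_i * np.cos(theta_i)

# A smoother surface (smaller sigma) concentrates facet normals near the
# macroscopic normal, i.e. a narrower and stronger specular lobe.
angles = np.deg2rad([0.0, 10.0, 30.0, 60.0])
print(facet_distribution(angles, sigma=0.2))
print(diffuse_radiance(l_i=1.0, theta_i=np.deg2rad(40.0), a=0.3))
```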
Various researchers have revised the TS model for specific applications, improving its adaptability [4,5]. In the pBRDF, a polarization-corrected model based on the TS model, diffuse reflection is treated as purely Lambertian scattering, which is unpolarized, while specular reflection produces a polarization effect. Shadowing and masking factors are introduced to account for the reflection attenuation caused by the geometric relationship between adjacent microfacets. The bidirectional reflectance distribution function (BRDF) is defined as:
$$f = \frac{dL_r}{dE_i}$$
where $L_r$ is the radiance of the outgoing light and $E_i$ is the irradiance of the incident light. Assuming that the microfacets are identical V-grooves, that their normals follow the Gaussian distribution $P(\theta)$, and that the shadowing/masking factor is $G$, the polarization BRDF (pBRDF) is obtained as:
$$f_{j,k}(\theta_i,\theta_r,\phi_r,\phi_i) = \frac{G \cdot P(\theta)}{4\cos\theta\,\cos\theta_r\,\cos\theta_i}\, M_{j,k}(\theta_i,\theta_r,\phi_r,\phi_i) + a\, d\omega_i$$
where $\theta_i$ and $\theta_r$ are the angles between the incident and outgoing light and the macroscopic normal, respectively; $P(\theta)$ is the Gaussian distribution function; $M_{j,k}(\theta_i,\theta_r,\phi_r,\phi_i)$ are the 16 Mueller matrix elements characterizing the polarization behavior of the surface reflection process; and $\phi_r$ and $\phi_i$ are the azimuth angles of the outgoing and incident light, respectively.
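The pBRDF above requires the full set of Mueller matrix elements $M_{j,k}$. As a simplified, hedged illustration of why the specular term polarizes light, the sketch below computes only the Fresnel intensity reflectances and the resulting degree of polarization for unpolarized light reflected from a dielectric; the real refractive index n is an example parameter, not a value from the paper.

```python
# Illustrative sketch only: degree of polarization of specular reflection
# from a smooth dielectric with an assumed real refractive index n, using
# the Fresnel reflectances r_s and r_p (not the paper's full Mueller model).
import numpy as np

def fresnel_dop(theta_i, n):
    """DoP of specularly reflected unpolarized light at incidence theta_i (rad)."""
    theta_t = np.arcsin(np.sin(theta_i) / n)  # Snell's law
    rs = (np.cos(theta_i) - n * np.cos(theta_t)) / (np.cos(theta_i) + n * np.cos(theta_t))
    rp = (n * np.cos(theta_i) - np.cos(theta_t)) / (n * np.cos(theta_i) + np.cos(theta_t))
    Rs, Rp = rs ** 2, rp ** 2                 # intensity reflectances
    return abs(Rs - Rp) / (Rs + Rp)

# Near Brewster's angle (about 56 degrees for n = 1.5) the reflected light
# is almost completely linearly polarized.
print(fresnel_dop(np.deg2rad(56.0), n=1.5))
```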

3. Research on Polarization Detection Imaging Based on Micro-Polarizer

Polarized light can be divided into linearly polarized, circularly polarized, and elliptically polarized light according to the trajectory traced by the tip of the electric field vector of the light wave. There are two quantitative representations: the Jones vector and the Stokes vector [6,7,8,9]. The Jones vector is mainly used to represent completely polarized light, while the Stokes vector can represent completely polarized, partially polarized, and unpolarized light. In the detection of man-made and metallic targets, the full polarization state of the light reflected from the target contains very important characteristic information.
The Stokes vector method uses a matrix to describe the polarization state of a beam of light. The specific form is:
$$\mathbf{s} = \begin{bmatrix} I \\ Q \\ U \\ V \end{bmatrix} = \begin{bmatrix} I_{0^\circ} + I_{90^\circ} \\ I_{0^\circ} - I_{90^\circ} \\ I_{45^\circ} - I_{135^\circ} \\ I_{\mathrm{R}} - I_{\mathrm{L}} \end{bmatrix}$$
where $I$ is the total intensity of the incident light, $Q$ is the intensity difference between the 0° and 90° directions, $U$ is the intensity difference between the 45° and 135° directions, and $V$ is the intensity difference between right-handed and left-handed circularly polarized light.
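A minimal sketch (array names are illustrative, not from the paper) of how the linear Stokes components and the derived degree and angle of linear polarization (DoLP, AoLP) could be computed from four intensity images taken behind 0°, 45°, 90°, and 135° polarizers:

```python
# Sketch of the Stokes computation above; i0, i45, i90 and i135 are assumed
# to be co-registered float images. The circular component V requires an
# additional retarder measurement and is omitted here.
import numpy as np

def stokes_from_angles(i0, i45, i90, i135):
    s0 = i0 + i90                                   # total intensity I
    s1 = i0 - i90                                   # Q
    s2 = i45 - i135                                 # U
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)                 # angle of polarization
    return s0, s1, s2, dolp, aolp
```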
The traditional polarization measurement technique requires acquiring four images in different polarization directions and then processing them, so real-time measurement is difficult to achieve. As shown in Figure 1, the polarization directions within each adjacent 2 × 2 unit of the micro-polarizer array are 0°, 45°, 90°, and 135°, so that four images with different polarization detection directions can be obtained from a single frame by interpolation averaging, thus realizing real-time polarization measurement.
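The sketch below illustrates how the four polarization-direction sub-images might be extracted from a single division-of-focal-plane frame; the particular 2 × 2 pixel layout is an assumption (it differs between detectors) and is not taken from the paper.

```python
# Sketch of demosaicking a micro-polarizer (division-of-focal-plane) frame.
# The assumed super-pixel layout is: 90/45 on even rows, 135/0 on odd rows.
import numpy as np

def split_polarization_mosaic(raw):
    """Return quarter-resolution sub-images (I0, I45, I90, I135)."""
    i90  = raw[0::2, 0::2].astype(np.float64)
    i45  = raw[0::2, 1::2].astype(np.float64)
    i135 = raw[1::2, 0::2].astype(np.float64)
    i0   = raw[1::2, 1::2].astype(np.float64)
    return i0, i45, i90, i135
```

The resulting sub-images can then be passed to the Stokes computation sketched above; the interpolation averaging back to full resolution is omitted here for brevity.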

4. Research on Multi-Dimensional Information Fusion and Enhancement Processing Algorithm

4.1. Principle of Two-Dimensional Discrete Wavelet Transform

Figure 2 shows the filter-bank representation of the two-dimensional Mallat fast algorithm. As can be seen from Figure 2a, two-dimensional wavelet decomposition proceeds as follows [10,11,12,13]: first, a one-dimensional wavelet decomposition is applied to each row of the image; then, a one-dimensional wavelet decomposition is applied to each column of the low-frequency and high-frequency data produced by the row decomposition. Four sub-band images are finally obtained: $c_{j-1}$, $d_{j-1}^{1}$, $d_{j-1}^{2}$ and $d_{j-1}^{3}$. $c_{j-1}$ is obtained by row low-pass and column low-pass filtering, so it contains the low-frequency approximation of the image. $d_{j-1}^{1}$ is obtained by row low-pass and column high-pass filtering and is denoted $d_{j-1}^{LH}$ (here L stands for low-pass filtering and H for high-pass filtering); it corresponds to the high-frequency components in the vertical direction, that is, the horizontal edge details. $d_{j-1}^{2}$ is obtained by row high-pass and column low-pass filtering and is denoted $d_{j-1}^{HL}$; it corresponds to the high-frequency components in the horizontal direction, that is, the vertical edge details. $d_{j-1}^{3}$ is obtained by row high-pass and column high-pass filtering and is denoted $d_{j-1}^{HH}$; it contains the high-frequency components in the diagonal direction, that is, the diagonal edge details.
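A minimal sketch of one level of this decomposition using the PyWavelets library; the 'db2' wavelet is an example choice, not necessarily the one used in the paper.

```python
# One level of the 2-D Mallat decomposition and its inverse with PyWavelets.
# cA is the low-frequency (LL) approximation; (cH, cV, cD) are the
# horizontal, vertical and diagonal detail sub-bands.
import numpy as np
import pywt

image = np.random.rand(256, 256)            # stand-in for a source image
cA, (cH, cV, cD) = pywt.dwt2(image, 'db2')
reconstructed = pywt.idwt2((cA, (cH, cV, cD)), 'db2')
print(np.allclose(image, reconstructed))    # perfect reconstruction holds
```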

4.2. Fusion Rule Based on the Maximum Absolute Value of the Coefficient

After the wavelet transform, the low-frequency sub-band coefficients still reflect the average energy of the image, while the high-frequency sub-band coefficients reflect details such as edges and textures in different directions [14,15,16,17]. The low-frequency coefficients are fused by weighted averaging, and the high-frequency coefficients are fused by taking the maximum absolute value, so that the high-frequency part containing the significant details of the image is emphasized; this helps to improve the visual effect of the fused image and to preserve the significant details of the source images. Let source images A and B be decomposed by a J-layer wavelet transform to obtain the coefficients $\{cA, dA_j^{\epsilon}\}$ and $\{cB, dB_j^{\epsilon}\}$, and let the coefficients corresponding to the fused image F be $\{cF, dF_j^{\epsilon}\}$, where $cX$ denotes the low-frequency scale coefficients of image X at the J-th layer and $dX_j^{\epsilon}$ denotes the high-frequency wavelet coefficients of image X at layer $j$ in direction $\epsilon$.
The weighted average fusion criterion of low-frequency subbands is:
$$cF(m,n) = \frac{1}{2}\left[cA(m,n) + cB(m,n)\right]$$
where (m, n) denotes the position of the low-frequency subband coefficients.
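Under these definitions, a single-level sketch of the fusion rule (average the low-frequency coefficients, keep the high-frequency coefficient with the larger absolute value) might look as follows; a multi-level decomposition would apply the same rule at every layer.

```python
# Sketch of the Section 4.2 fusion rule: weighted average of the LL
# coefficients and maximum-absolute-value selection for the detail
# sub-bands, followed by the inverse transform. Single level for brevity.
import numpy as np
import pywt

def fuse_wavelet(img_a, img_b, wavelet='db2'):
    cA_a, details_a = pywt.dwt2(img_a, wavelet)
    cA_b, details_b = pywt.dwt2(img_b, wavelet)
    cA_f = 0.5 * (cA_a + cA_b)                      # low-frequency average
    details_f = tuple(
        np.where(np.abs(da) >= np.abs(db), da, db)  # max-absolute-value rule
        for da, db in zip(details_a, details_b)
    )
    return pywt.idwt2((cA_f, details_f), wavelet)
```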

4.3. Experimental Simulation

The simulation targets a green grass field and a green-coated board with low contrast between them. The two are placed together so that they must be distinguished and identified, and the polarization fusion method is used to improve the contrast of the green-coated board. The image fusion process based on the two-dimensional wavelet transform is shown in Figure 3. The simulation results are shown in Figure 4, where the S0 image is the total-intensity image of the target and the DOP image is the degree-of-polarization image.
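As a hedged illustration of this step, the fuse_wavelet sketch from Section 4.2 could be applied directly to the total-intensity image and the degree-of-polarization image; the variable names and the normalization to a common range are assumptions.

```python
# Hypothetical usage of the fuse_wavelet() sketch above; s0 and dolp are
# assumed to be co-registered float arrays scaled to [0, 1].
fused = fuse_wavelet(s0 / s0.max(), dolp)
```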

5. Design of High-Resolution Multispectral Polarization Imaging Scheme

5.1. Composition and Function

The system composition is shown in Figure 5. The system consists of four parts: power subsystem, electronics subsystem, optical detection subsystem and servo subsystem. The power supply subsystem is used to power and drive the device; the electronics subsystem is used for video image processing and storage; the optical detection subsystem is used to obtain multi-dimensional information about the target; and the servo subsystem is used for image stabilization and cross-roll pitch control.
The optical detection subsystem includes a visible polarization imaging module, an infrared imaging module, and a laser active illumination module. The visible polarization module is used to obtain the polarization information of the visible band of the target during the day; the infrared imaging module is used to obtain the long-wave infrared radiation information of the target at night; the laser active illumination module is used to assist the illumination in the low-light environment to enhance the imaging effect.

5.2. Workflow

The specific workflow of the system is as follows:
(1)
The power subsystem supplies power to the imaging and illumination modules in the electronics and optical detection subsystems;
(2)
The optical detection subsystem images the target to obtain its visible-light polarization and infrared images;
(3)
The servo subsystem provides image stabilization, pitch, and rotation for the optical detection subsystem to realize scanning imaging;
(4)
The electronics subsystem fuses and stores the acquired images.

6. Development and Experiment of a High-Resolution Polarization Imaging Instrument

6.1. System Development

The visible-light micro-polarization detector and the infrared micro-polarization detector were procured, and the pod mounted on the UAV platform was designed and manufactured. The whole system was assembled, aligned, and placed under electronic control, and the pitch and rotation functions were initially realized. The physical diagram of the system is shown in Figure 6.

6.2. Static Experiment

The procedure was as follows: we fixed the polarization camera, infrared camera, and visible camera to the ceiling and adjusted the aperture and focal length to make the images clear; we fixed sand, a sand mixture, and soil to a wooden board, divided it into nine areas, and made sure that all nine areas were within the field of view of the three cameras; the nine areas were first photographed without objects, from which the angle of polarization (AOP), degree of linear polarization (DOLP), and other images of the background were obtained.
We then used knives, cloth, saws, mineral water bottles, hammers, foam boards, scissors, bricks, and wooden boards as targets, changed their order once for each group, and shot three images of the nine items against the three different backgrounds.
We next replaced the target items with nine others, including meat, bones, saws, stones, slippers, cloth, steel pipes, knives, and hammers, again changed their order once for each group, and shot three images of the nine items against the three different backgrounds. Bullets, knives, and remote controllers were then placed in three target areas, and images of these small objects were taken with the three cameras. Finally, we added a large amount of grass to shade the background and obtained images of the target objects under shading. The experiments were conducted under two conditions: natural illumination during the day and LED illumination at night.

Target Polarization Image Analysis

Under natural light illumination, the contrast maps of the physical evidence targets are shown in Figure 7.
Under LED illumination at night, the contrast maps of the physical evidence targets are shown in Figure 8.

6.3. Analysis of Verification Results

Under natural light during the day, the infrared image and the intensity image differ little. After the polarization image and the intensity image are fused, the contrast is greatly improved; fusing the polarization, infrared, and intensity images improves the contrast further.
When illuminated by LED light at night, the imaging effect of the intensity image is very poor, the infrared image is much clearer, and the polarization image is better than the intensity image. After the polarization image and the intensity image are fused, the image contrast is significantly improved.
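The paper does not state the exact contrast measure behind these comparisons; the sketch below uses a common region-based definition (the normalized difference of the mean target and background intensities) as an assumed stand-in.

```python
# Sketch of a target/background contrast measure; the masks and the
# particular definition are assumptions, not taken from the paper.
import numpy as np

def region_contrast(image, target_mask, background_mask):
    """Normalized difference of mean target and background intensities."""
    mt = image[target_mask].mean()
    mb = image[background_mask].mean()
    return abs(mt - mb) / (mt + mb)
```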

7. Conclusions

The calculated contrast values support target detection, recognition, and processing under complex conditions. Compared with conventional intensity imaging, polarization images have obvious advantages: texture details that cannot be seen in intensity images are clearly visible in polarization images. Therefore, using polarization imaging detection technology to detect physical evidence targets in complex backgrounds can effectively improve target contrast, enrich the target information, improve image quality, and increase target detection accuracy, providing an effective means of searching for physical evidence targets at incident scenes. Because of the long wavelength of infrared radiation, infrared cameras can observe targets normally both during the day and at night; therefore, under low-illumination conditions, when the visible-light camera cannot work normally, the infrared camera can be used to image the observed target.

Author Contributions

Data curation, Q.F. and Y.Z.; formal analysis, H.S., C.W. and J.D.; investigation, J.Z. and H.S.; methodology, Q.F.; project administration, J.D. and H.J.; resources, Y.L.; software, Z.L.; supervision, Y.L. and H.J.; validation, J.L.; visualization, S.Z. and W.Y.; writing—original draft, Q.F.; writing—review and editing, W.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Changchun University of Science and Technology, China (Huilin Jiang), and by the National Natural Science Foundation of China, grants 61890963, 61890960, and 62127813.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The study did not report any data.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, M.; Xu, W.; Sun, Z.; Jia, A.; Xiu, P.; Chen, W.; Li, L.; Zheng, C.; Li, J. Degree of polarization modeling based on modified microfacet pBRDF model for material surface. Opt. Commun. 2019, 453, 124390. [Google Scholar] [CrossRef]
  2. Zhang, Y.; Zhang, Y.; Zhao, H.; Wang, Z. Improved atmospheric effects elimination method for pBRDF models of painted surfaces. Opt. Express 2017, 25, 16458–16475. [Google Scholar] [CrossRef] [PubMed]
  3. Shell, J.R., II. Polarimetric Remote Sensing in the Visible to Near Infrared; Rochester Institute of Technology: Rochester, NY, USA, 2005. [Google Scholar]
  4. Song, D.W.; Han, T.; Shang, S.; Li, D.; Sun, W.; Fan, X. Study on microwave scattering of rough sea surface based on the corrected TS model. In Proceedings of the IET International Radar Conference 2013, Xi’an, China, 14–16 April 2013. [Google Scholar]
  5. Wang, S.; Zhang, X.; Wang, Z.; Lv, D. On-line fuzzy identification of thermal systems based on an improved TS model. In Proceedings of the 2010 International Conference on Modelling, Identification and Control, IEEE, Innsbruck, Austria, 15–17 February 2010; pp. 47–52. [Google Scholar]
  6. Liao, Y.B. Polarized Optics; Science Press: Beijing, China, 2003; pp. 45–63. (In Chinese) [Google Scholar]
  7. Bai, S.K.; Duan, J.; Lu, Y. Experimental study on polarization imaging characteristics of various materials. J. Appl. Opt. 2016, 34, 510–516. (In Chinese) [Google Scholar]
  8. Duan, J.; Fu, Q.; Mo, C.; Zhu, Y.; Liu, D. Review of polarization imaging for international military application. In Proceedings of the ISPDI 2013-Fifth International Symposium on Photoelectronic Detection and Imaging. International Society for Optics and Photonics, Beijing, China, 25 June 2013; Volume 8908, p. 13. [Google Scholar]
  9. Gruev, V.; Ortu, A.; Lazarus, N.; Spiegel, J.V.D. Fabrication of a dual-tier thin film micropolarization array. Opt. Express 2007, 15, 4994–5007. [Google Scholar] [CrossRef] [PubMed]
  10. Zhang, X.; Zhang, R. The technology research in decomposition and reconstruction of image based on two-dimensional wavelet transform. In Proceedings of the 2012 9th International Conference on Fuzzy Systems and Knowledge Discovery, Chongqing, China, 29–31 May 2012; pp. 1998–2000. [Google Scholar]
  11. Othman, G.; Zeebaree, D.Q. The applications of discrete wavelet transform in image processing: A review. J. Soft Comput. Data Min. 2020, 1, 31–43. [Google Scholar]
  12. Fu, M.; Liu, H.; Yu, Y.; Chen, J.; Wang, K. DW-GAN: A discrete wavelet transform GAN for nonhomogeneous dehazing. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 203–212. [Google Scholar]
  13. Hill, P.R.; Canagarajah, C.N.; Bull, D.R. Image Fusion Using Complex Wavelets. In Proceedings of the British Machine Vision Conference BMVC 2002, Cardiff, UK, 2–5 September 2002; pp. 1–10. [Google Scholar]
  14. Liu, Y.; Liu, S.; Wang, Z. A general framework for image fusion based on multi-scale transform and sparse representation. Inf. Fusion 2015, 24, 147–164. [Google Scholar] [CrossRef]
  15. Zhao, Y.; Zhang, L.; Zhang, D.; Pan, Q. Object separation by polarimetric and spectral imagery fusion. Comput. Vis. Image Underst. 2009, 113, 855–866. [Google Scholar] [CrossRef]
  16. Lavigne, D.A.; Breton, M. A new fusion algorithm for shadow penetration using visible and midwave infrared polarimetric images. In Proceedings of the 13th International Conference on Information Fusion, Edinburgh, UK, 26–29 July 2010. [Google Scholar]
  17. Zhang, Q.; Gao, J.; Fan, Z.G.; Wang, Z.W.; Yan, Y. A foggy image reconstruction method using target and atmospheric polarization information. Opt. Precis. Eng. 2019, 40, 9. [Google Scholar]
Figure 1. Schematic diagram of the micro-polarizer array system.
Figure 2. Filter bank representation of two-dimensional Mallat fast algorithm. (a) Wavelet decomposition. (b) Wavelet reconstruction.
Figure 3. Image fusion process based on two-dimensional wavelet transform.
Figure 4. Simulation results. (a) S0 image; (b) DOP image; (c) fused image.
Figure 5. Schematic diagram of system composition.
Figure 6. The physical diagram of the system.
Figure 7. Intensity, polarization, infrared, and fusion images under natural light. (a) intensity image; (b) DoLP image; (c) Infrared image; (d) Intensity Polarization Fusion Image; (e) Intensity, polarization, and infrared fusion images.
Figure 8. Intensity, polarization, infrared, and fused images under LED light at night. (a) intensity image; (b) DoLP image; (c) Infrared image; (d) Intensity Polarization Fusion Image; (e) Intensity, polarization, and infrared fusion images.
