Article

Research into Image Point Positioning Compensation of a High-Orbit Thermal Infrared Staring Camera

Hui Xiao, Chenying Li, Qinghong Sheng *, Bo Wang, Jun Li, Jianguo Ma, Fan Wu and Wei Zhou
1 School of Environmental Science, Nanjing Xiaozhuang University, Nanjing 211171, China
2 College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
3 Department of Geographical Sciences, University of Maryland, College Park, MD 20742, USA
* Author to whom correspondence should be addressed.
Appl. Sci. 2023, 13(17), 9688; https://doi.org/10.3390/app13179688
Submission received: 4 July 2023 / Revised: 19 August 2023 / Accepted: 25 August 2023 / Published: 27 August 2023
(This article belongs to the Section Aerospace Science and Engineering)

Abstract

The line of sight of the high-orbit thermal infrared staring camera is concentrated and sensitive to temperature changes, so its image point positioning is key to ensuring the geometric quality of the thermal infrared staring image and eliminating the errors of the imaging system in orbit. According to the geometric imaging characteristics of the high-orbit thermal infrared staring camera, internal and external positioning compensation models based on the two-dimensional pointing angle under temperature change are proposed in this paper, and four experimental schemes of image point positioning compensation are designed. The experimental results show that the proposed method compensates the image point positioning error effectively.

1. Introduction

Infrared thermal imaging plays an increasingly important role in military and civilian fields because of its good concealment, strong anti-interference capability, strong target recognition ability and all-weather operation. An infrared thermal imager uses photoelectric technology to passively detect the infrared radiation emitted by an object [1], calculates the temperature of each point on the object's surface, displays different temperatures in different colors, and converts them into images and graphics that can be distinguished by human vision. An infrared thermal imager can break through the limits of human vision and detect objects in a completely dark environment. It can detect objects even in the presence of smoke and dust and requires no illumination source, so it can be used 24 h per day. Compared with low-orbit satellites, high-orbit satellites have a larger observation range and an unmatched advantage in temporal resolution due to their high orbital altitude, and they can complete more complex observation tasks, such as moving-target search and tracking and the periodic monitoring of targets. As an important link in the space-based thermal infrared observation system, high-orbit thermal infrared detectors are used for tactical intelligence collection, battlefield situational awareness, forest fire positioning and so on, so their imaging quality is closely related to the observation quality. In space-based early warning, the geosynchronous orbit satellite is mainly used for the detection and monitoring of the boost phase of ballistic missiles. It is equipped with an infrared scanning detector and a high-resolution infrared staring detector. The working mechanism is that the infrared scanning detector detects the tail flame of the ballistic missile; after the missile target is discovered, the task is handed over to the infrared staring detector, which stares at and tracks the target, determines the boost-phase trajectory, and achieves accurate detection of the target. Therefore, it is necessary to study the imaging of the geosynchronous orbit early warning satellite. The on-orbit imaging process of the high-orbit camera, similar to that of other types of optical cameras, is actually a process in which multiple on-board payloads, such as attitude measurement, orbit measurement, and time synchronization measurement, work together. In this process, various errors arise in the external measurement results. In addition, due to the complex operating environment of the geosynchronous satellite, the camera itself also exhibits a certain degree of internal distortion. Therefore, the geometric processing of the high-orbit image is very important.
In recent years, high-precision geometric processing technologies for middle- and low-orbit satellites and aerial images have become well developed [2], including geometric imaging models, in-orbit geometric calibration models and sensor rectification models [3,4,5]. Xiu et al. [6] developed a line-of-sight (LOS) control flow and image motion compensation algorithm based on the imaging optical path. Kornus et al. [7] conducted an in-depth study of the laboratory calibration and in-orbit calibration of the MOMS-2P sensor, building a corresponding in-orbit geometric calibration model for each CCD array that mainly corrected the internal geometric distortion of the camera, including the offset of the principal point, the deviation of the focal length, and the bending and rotation of the CCD, and finally obtaining a uniform solution through bundle adjustment. Gachet [8] and others carried out the in-orbit geometric calibration of the SPOT5 HRG and HRS cameras using high-precision 50 cm resolution reference data provided by a ground calibration site; they used quintic polynomial calibration models to compensate for the errors in the along-track and across-track directions, ultimately reducing the internal distortion error of the camera to less than 0.1 pixel. Cai et al. [9] established different positioning models and analyzed the calculation principles of aerial remote sensing image positioning algorithms. Jacobsen [10] and others set 15 additional parameters according to the characteristics of India's IRS-1C satellite camera and verified the impact of each additional parameter on the positioning accuracy by designing experiments with different combinations of additional parameters. Japan's ALOS satellite [11] was calibrated as a whole using a self-calibrating block adjustment with additional parameters. Su Wenbo [12] studied the in-orbit geometric calibration of space linear array CCD sensors, established a detection model of the exterior orientation elements with the help of SPOT5 HRS image auxiliary data, and calibrated the sensor using self-calibrating bundle adjustment, achieving good calibration accuracy. Wang Chongyang [13] proposed a simplified line-of-sight vector correction model in a non-pointing-angle form for in-orbit geometric calibration and carried out experiments with simulation data, proving that the model did not need to solve the satellite-to-Earth distance and, with a simplified calculation process, had stronger applicability.
Most of the above research is aimed at the in-orbit geometric calibration of low-orbit linear array remote sensing satellites observing the ground. At present, there has been little research into the error compensation of high-orbit infrared cameras. For the geosynchronous orbit satellite, solar illumination is uneven and the satellite temperature changes dramatically [14], with the temperature varying with both a long annual period and a short diurnal period [15]. The infrared camera collects the temperature information of sensitive targets from geosynchronous orbit, and because of the high orbital altitude, the concentrated imaging line of sight, and the severe temperature changes, its geometric imaging model needs to be studied [16]. The geosynchronous orbit satellite is affected by drastic temperature changes, which cause the relative installation relationship between the star sensor and the early warning camera to change; this change in the installation angle is directly reflected as an attitude measurement error. Therefore, we have carried out a detailed study of the influence of temperature on the camera distortion error.
In addition, most of the objects observed by this type of satellite appear as point features, and there is almost no ground control point information available during real-time geometric processing, which makes accurate motion parameter estimation for high-value targets very difficult [17]. Therefore, according to the geometric imaging characteristics and error sources of the high-orbit thermal infrared staring camera, this paper proposes a geometric imaging model based on the two-dimensional pointing angle under temperature change. It can effectively support two-dimensional trajectory reconstruction and provide more accurate observation trajectories for target parameter estimation.

2. Virtual Control Points Calibration Scheme Based on Image Space Uniformity

2.1. Compensation Scheme

In this paper, we construct an error compensation model for the image point positioning of early warning area array cameras according to the two-dimensional pointing angle model, including an external parameter error compensation model and an internal parameter error compensation model.
The external parameter compensation model is as follows:
$$
\begin{bmatrix} \tan(\psi_x(x,y)) \\ \tan(\psi_y(x,y)) \\ 1 \end{bmatrix}
= \lambda\, R_{INS}^{CAM}(\alpha,\beta,\theta)(t)
\begin{bmatrix} Q_X \\ Q_Y \\ Q_Z \end{bmatrix}
$$
in which
$$
\begin{bmatrix} Q_X \\ Q_Y \\ Q_Z \end{bmatrix}
= R_{J2000}^{INS}\, R_{WGS84}^{J2000}
\begin{bmatrix} X - X_S \\ Y - Y_S \\ Z - Z_S \end{bmatrix},
\qquad
R_{INS}^{CAM}(\alpha,\beta,\theta)(t)
= \begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{bmatrix},
$$
wherein $(x, y)$ is the probe coordinate corresponding to the image point on the image; $\psi_x(x,y)$ and $\psi_y(x,y)$ are the pointing angles of the line of sight in the x and y directions of the camera coordinate system; $(X_S, Y_S, Z_S)$ is the position of the satellite in the Earth-fixed coordinate system; $(X, Y, Z)$ is the position of the object point corresponding to the image point in the Earth-fixed coordinate system; $R_{WGS84}^{J2000}$ is the rotation matrix from the Earth-fixed coordinate system to the J2000 coordinate system; $R_{INS}^{CAM}(\alpha,\beta,\theta)(t)$ is the time-varying rotation matrix between the star sensor and the camera under different temperature conditions; $(\alpha, \beta, \theta)$ is the relative installation angle between the star sensor and the camera, which is the object of compensation; and $\lambda$ is a scale factor.
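To make the structure of this model concrete, the following minimal sketch (Python/NumPy) evaluates the right-hand side for a synthetic geometry: the satellite-to-object vector is rotated through the J2000 and star-sensor frames into the camera frame, and the two pointing-angle tangents are read off after dividing by the third component. The function names, the assumed rotation order of the installation angles, and all numerical values are illustrative placeholders, not the parameters of the actual camera.

```python
import numpy as np

def installation_rotation(alpha, beta, theta):
    """Rotation R_INS^CAM(alpha, beta, theta), built here as Rz(theta) @ Ry(beta) @ Rx(alpha).
    The rotation order/convention is an assumption made for this sketch."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    ct, st = np.cos(theta), np.sin(theta)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, ca, sa], [0.0, -sa, ca]])
    Ry = np.array([[cb, 0.0, -sb], [0.0, 1.0, 0.0], [sb, 0.0, cb]])
    Rz = np.array([[ct, st, 0.0], [-st, ct, 0.0], [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

def pointing_tangents(obj_wgs84, sat_wgs84, R_wgs84_j2000, R_j2000_ins, R_ins_cam):
    """Evaluate the external model: return (tan(psi_x), tan(psi_y)) of the line of sight."""
    q = R_j2000_ins @ R_wgs84_j2000 @ (obj_wgs84 - sat_wgs84)  # [Q_X, Q_Y, Q_Z]
    v = R_ins_cam @ q                                          # camera-frame LOS, up to the scale lambda
    return v[0] / v[2], v[1] / v[2]

# Placeholder geometry: satellite-to-object vector roughly along the third camera axis,
# identity attitude chains, and small installation angles; values are illustrative only.
sat = np.array([0.0, 0.0, 42_164_000.0])            # m
obj = np.array([55_000.0, -20_000.0, 6_378_137.0])  # m
R_ins_cam = installation_rotation(1e-4, -2e-4, 3e-4)
tan_px, tan_py = pointing_tangents(obj, sat, np.eye(3), np.eye(3), R_ins_cam)
print(tan_px, tan_py)
```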
The calibration results of the installation angle at different temperatures are calculated using the simulated data samples. We establish the variation curve of the installation angle over time and then apply it to the compensation of the external parameters. Finally, the image point positioning errors before and after compensation are calculated directly to verify the accuracy of the external parameter error compensation.
The internal parameter compensation model is as follows:
$$
\left( \frac{x}{f}, \frac{y}{f}, 1 \right)^T = \left( \tan(\psi_x(x,y)), \tan(\psi_y(x,y)), 1 \right)^T
$$
$$
\begin{cases}
\tan(\psi_x(x,y)) = \dfrac{x}{f} = xa_0 + xa_1 l + xa_2 s + xa_3 l s + xa_4 l^2 + xa_5 s^2 + xa_6 l^2 s + xa_7 l s^2 + xa_8 l^3 + xa_9 s^3 \\
\tan(\psi_y(x,y)) = \dfrac{y}{f} = ya_0 + ya_1 l + ya_2 s + ya_3 l s + ya_4 l^2 + ya_5 s^2 + ya_6 l^2 s + ya_7 l s^2 + ya_8 l^3 + ya_9 s^3
\end{cases}
$$
in which $xa_0 \sim xa_9$ and $ya_0 \sim ya_9$ are the internal compensation parameters, and $l$ and $s$ denote the line and sample coordinates of the probe element.
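As an illustration of how this cubic pointing-angle polynomial is evaluated over the detector, a short sketch follows (Python/NumPy). The coefficient values are placeholders chosen to mimic an ideal camera, and the 100-pixel evaluation grid mirrors the grid used later in Section 2.3; neither corresponds to the real calibration parameters.

```python
import numpy as np

def pointing_angle_poly(l, s, xa, ya):
    """Evaluate the cubic pointing-angle polynomial at probe coordinates (l, s).
    xa, ya hold the ten internal compensation coefficients xa0..xa9 / ya0..ya9."""
    basis = np.array([np.ones_like(l), l, s, l * s, l**2, s**2,
                      l**2 * s, l * s**2, l**3, s**3])
    return np.tensordot(xa, basis, axes=1), np.tensordot(ya, basis, axes=1)

# Placeholder coefficients mimicking an ideal camera: tan(psi) grows linearly with l or s
# and all higher-order (distortion) terms are zero; not the real calibration values.
xa = np.array([0.0, 1e-5, 0, 0, 0, 0, 0, 0, 0, 0])
ya = np.array([0.0, 0, 1e-5, 0, 0, 0, 0, 0, 0, 0])
l, s = np.meshgrid(np.arange(0.0, 10001.0, 100.0), np.arange(0.0, 10001.0, 100.0))
tan_px, tan_py = pointing_angle_poly(l, s, xa, ya)
print(tan_px.shape, float(tan_px.max()), float(tan_py.max()))
```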
For the internal parameter compensation, the two-dimensional pointing angle corresponding to each probe element on the area array camera can be computed from the solved internal calibration parameters, and the deviation between this pointing angle and the two-dimensional pointing angle obtained after adding the distortion parameters can be counted, so that the calibrated pointing angles can be displayed intuitively. Alternatively, the two-dimensional pointing angle of the probe element can be calculated from the solved internal calibration parameters and compared with the image point coordinates calculated from the original internal parameters, so as to compare the image point positioning errors before and after calibration.

2.2. Calibration Principle

To solve the problem of motion parameter estimation errors for high-value point targets caused by the lack of ground control points during real-time geometric processing [18], this paper proposes a calibration method based on virtual control points with image space uniformity: the image points of the target are projected through the perspective center onto the Earth's surface, and the resulting sequence of surface projection points is taken as the virtual control points, as shown in Figure 1.
At the observation time, the image points of the virtual control point sequence coincide exactly with the image points of the target at that time, so a series of virtual control points is generated from the obtained intersection points. Given the image plane coordinates $(x, y)$ of the target and the internal and external orientation elements at the observation time, the spatial coordinates $(X, Y, Z)$ of the corresponding virtual control points in the Earth-fixed coordinate system can be obtained from the geometric imaging model:
$$
\frac{x}{f} = \frac{a_1 X + b_1 Y + c_1 Z}{a_3 X + b_3 Y + c_3 Z}, \qquad
\frac{y}{f} = \frac{a_2 X + b_2 Y + c_2 Z}{a_3 X + b_3 Y + c_3 Z}
$$
in which
$$
\begin{cases}
a_1 = \cos\varphi\cos\kappa - \sin\varphi\sin\omega\sin\kappa \\
a_2 = -\cos\varphi\sin\kappa - \sin\varphi\sin\omega\cos\kappa \\
a_3 = -\sin\varphi\cos\omega \\
b_1 = \cos\omega\sin\kappa \\
b_2 = \cos\omega\cos\kappa \\
b_3 = -\sin\omega \\
c_1 = \sin\varphi\cos\kappa + \cos\varphi\sin\omega\sin\kappa \\
c_2 = -\sin\varphi\sin\kappa + \cos\varphi\sin\omega\cos\kappa \\
c_3 = \cos\varphi\cos\omega
\end{cases}
$$
where $\varphi$ is the pitch angle, $\omega$ is the roll angle, and $\kappa$ is the yaw angle.
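The sign pattern of these rotation elements is easy to get wrong in an implementation, so the brief sketch below (Python/NumPy, with illustrative attitude angles) builds the element matrix and checks its orthonormality before forming the collinearity ratios; the angle values and the object-space vector are arbitrary assumptions.

```python
import numpy as np

def collinearity_matrix(phi, omega, kappa):
    """Elements a1..c3 above, arranged so that the rows [a1 b1 c1], [a2 b2 c2], [a3 b3 c3]
    give the numerators and denominator of the collinearity equations when applied to (X, Y, Z)."""
    cp, sp = np.cos(phi), np.sin(phi)
    co, so = np.cos(omega), np.sin(omega)
    ck, sk = np.cos(kappa), np.sin(kappa)
    a1 = cp * ck - sp * so * sk;  a2 = -cp * sk - sp * so * ck;  a3 = -sp * co
    b1 = co * sk;                 b2 = co * ck;                  b3 = -so
    c1 = sp * ck + cp * so * sk;  c2 = -sp * sk + cp * so * ck;  c3 = cp * co
    return np.array([[a1, b1, c1], [a2, b2, c2], [a3, b3, c3]])

# Illustrative attitude angles (radians) and an arbitrary object-space vector.
M = collinearity_matrix(np.deg2rad(0.5), np.deg2rad(-0.3), np.deg2rad(1.2))
assert np.allclose(M @ M.T, np.eye(3))                  # sanity check: the matrix is orthonormal
num = M @ np.array([1200.0, -800.0, -35_786_000.0])
x_over_f, y_over_f = num[0] / num[2], num[1] / num[2]   # cf. the collinearity equations above
print(x_over_f, y_over_f)
```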
From the generated virtual control points, $N$ points are selected as calibration points, with the object space coordinates of the calibration points denoted $(X_i, Y_i, Z_i)$ and the corresponding image point coordinates denoted $(x_i, y_i)$, $i = 1, 2, 3, \ldots, N$. The calibration model is then decomposed into the pointing angle error along the x direction and the pointing angle error along the y direction according to the two-dimensional pointing angle model:
$$
\begin{cases}
J(M_W, W_N) = \dfrac{a_1 Q_X + b_1 Q_Y + c_1 Q_Z}{a_3 Q_X + b_3 Q_Y + c_3 Q_Z} - \tan(\psi_x(x,y)) \\[3mm]
K(M_W, W_N) = \dfrac{a_2 Q_X + b_2 Q_Y + c_2 Q_Z}{a_3 Q_X + b_3 Q_Y + c_3 Q_Z} - \tan(\psi_y(x,y))
\end{cases}
$$
The initial values of the extrinsic calibration parameters are determined according to the initial installation matrix of the camera, and the initial values of the intrinsic calibration parameters are determined according to the parameters of the theoretical two-dimensional pointing angle model when the camera has no internal distortion.
The current value of the intrinsic calibration parameter $W_N$ is regarded as the true value, and the extrinsic calibration parameter $M_W$ is regarded as the unknown parameter to be determined. For each virtual calibration control point, the above equation is linearized to establish the corresponding error equation:
$$
V_i = A_i X - L_i
$$
in which
$$
A_i = \begin{bmatrix} \dfrac{\partial J_i}{\partial M_W} \\[2mm] \dfrac{\partial K_i}{\partial M_W} \end{bmatrix}
= \begin{bmatrix} \dfrac{\partial J_i}{\partial \alpha} & \dfrac{\partial J_i}{\partial \beta} & \dfrac{\partial J_i}{\partial \theta} \\[2mm]
\dfrac{\partial K_i}{\partial \alpha} & \dfrac{\partial K_i}{\partial \beta} & \dfrac{\partial K_i}{\partial \theta} \end{bmatrix},
\qquad
X = dX = \begin{bmatrix} d\alpha \\ d\beta \\ d\theta \end{bmatrix},
\qquad
L_i = \begin{bmatrix} J(M_W^i, W_N^i) \\ K(M_W^i, W_N^i) \end{bmatrix}.
$$
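A minimal sketch of how such an error equation can be solved iteratively is shown below (Python/NumPy). The Jacobian is formed by forward differences and the correction is obtained by least squares; the residual function here is a synthetic stand-in for J and K, so the code illustrates only the solution structure, not the authors' actual solver.

```python
import numpy as np

def solve_external(residual_fn, mw0, n_iter=5, eps=1e-7):
    """Iteratively solve V = A dX - L for the installation angle corrections.
    residual_fn(mw) stacks the misclosures [J_1, K_1, ..., J_N, K_N] for all virtual points."""
    mw = np.asarray(mw0, dtype=float)
    for _ in range(n_iter):
        r = residual_fn(mw)
        A = np.zeros((r.size, mw.size))
        for j in range(mw.size):                       # forward-difference Jacobian columns
            d = np.zeros_like(mw)
            d[j] = eps
            A[:, j] = (residual_fn(mw + d) - r) / eps
        dx, *_ = np.linalg.lstsq(A, -r, rcond=None)    # least-squares correction dX
        mw = mw + dx
    return mw

# Synthetic check: the toy residuals vanish at "true" installation angles of (72, -72, 72) arcsec.
true_mw = np.deg2rad(np.array([72.0, -72.0, 72.0]) / 3600.0)
def toy_residuals(mw, pts=np.linspace(-1.0, 1.0, 40)):
    da, db, dt = mw - true_mw        # stand-in for J and K; not the actual imaging model
    return np.concatenate([da + 0.3 * dt * pts, db - 0.2 * da * pts])

est = solve_external(toy_residuals, mw0=np.zeros(3))
print(np.rad2deg(est) * 3600.0)      # approximately [72, -72, 72] arcsec
```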
When solving the intrinsic calibration parameters, the extrinsic calibration parameter $M_W$ obtained above is regarded as the true value, and the intrinsic calibration parameter $W_N$ is regarded as the unknown parameter to be determined. For each virtual calibration control point, the above equation is linearized to establish the corresponding error equation:
$$
V_i = B_i Y - L_i
$$
in which
$$
B_i = \begin{bmatrix} \dfrac{\partial J_i}{\partial W_N} \\[2mm] \dfrac{\partial K_i}{\partial W_N} \end{bmatrix} = [B_{1i}, B_{2i}],
\qquad
Y = [Y_1, Y_2]^T = dY,
$$
$$
B_{1i} = \begin{bmatrix}
\dfrac{\partial J_i}{\partial xa_0} & \dfrac{\partial J_i}{\partial xa_1} & \cdots & \dfrac{\partial J_i}{\partial xa_9} \\[2mm]
\dfrac{\partial K_i}{\partial xa_0} & \dfrac{\partial K_i}{\partial xa_1} & \cdots & \dfrac{\partial K_i}{\partial xa_9}
\end{bmatrix},
\qquad
B_{2i} = \begin{bmatrix}
\dfrac{\partial J_i}{\partial ya_0} & \dfrac{\partial J_i}{\partial ya_1} & \cdots & \dfrac{\partial J_i}{\partial ya_9} \\[2mm]
\dfrac{\partial K_i}{\partial ya_0} & \dfrac{\partial K_i}{\partial ya_1} & \cdots & \dfrac{\partial K_i}{\partial ya_9}
\end{bmatrix},
$$
$$
L_i = \begin{bmatrix} J(M_W^i, W_N^i) \\ K(M_W^i, W_N^i) \end{bmatrix}.
$$
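Because the pointing-angle polynomial is linear in the twenty coefficients xa0–xa9 and ya0–ya9, holding the external parameters fixed reduces this step to an ordinary linear least-squares fit. The sketch below (Python/NumPy, with synthetic pointing-angle observations and a hypothetical normalization of the grid coordinates) shows that structure; it is not the flight processing code.

```python
import numpy as np

def cubic_basis(l, s):
    """Row of the design matrix B for one probe element (l, s): the ten cubic basis terms."""
    return np.array([1.0, l, s, l * s, l**2, s**2, l**2 * s, l * s**2, l**3, s**3])

def fit_internal(l, s, tan_px, tan_py):
    """Solve for xa0..xa9 and ya0..ya9 by linear least squares against observed tangents."""
    B = np.vstack([cubic_basis(li, si) for li, si in zip(l, s)])
    xa, *_ = np.linalg.lstsq(B, tan_px, rcond=None)
    ya, *_ = np.linalg.lstsq(B, tan_py, rcond=None)
    return xa, ya

# Synthetic grid and synthetic "observed" tangents generated from known coefficients.
rng = np.random.default_rng(0)
l, s = np.meshgrid(np.arange(0.0, 10001.0, 500.0), np.arange(0.0, 10001.0, 500.0))
l, s = l.ravel() / 1e4, s.ravel() / 1e4          # normalization keeps the system well conditioned
xa_true, ya_true = rng.normal(scale=1e-3, size=10), rng.normal(scale=1e-3, size=10)
B = np.vstack([cubic_basis(li, si) for li, si in zip(l, s)])
xa_est, ya_est = fit_internal(l, s, B @ xa_true, B @ ya_true)
print(np.max(np.abs(xa_est - xa_true)), np.max(np.abs(ya_est - ya_true)))  # both very small
```

Normalizing the grid coordinates before building the cubic basis is a design choice of this sketch, made only to keep the least-squares system well conditioned; it is not prescribed by the calibration model above.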

2.3. Step-by-Step Compensation Scheme

Due to the strong correlation between the internal and external parameters, if the internal and external parameters are compensated at the same time, an ill-conditioned equation will be generated. Accordingly, this paper adopts the strategy of step-by-step error compensation, alternately solving the internal and external parameter values of the error equation, so as to carry out more accurate error compensation for image point positioning. According to the different order of solving parameters, the compensation schemes can be divided into the following two kinds: internal compensation before external compensation, and external compensation before internal compensation. The scheme of “the external first and then the internal” compensates the external parameters of the camera, updates the internal parameter compensation model in real time using the external parameters after compensation, and then compensates the internal parameters using the updated internal parameter compensation model. On the other hand, the scheme of “the internal first and then the external” compensates the camera’s internal parameters first, updates the external parameter compensation model in real time using the compensated internal parameters, and then compensates the external parameters using the updated external parameter compensation model.
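The alternation can be summarized by the following schematic sketch (Python), in which solve_external and solve_internal stand in for the error-equation solutions of Section 2.2; the toy solvers in the usage example are assumptions that merely mimic the correlation between the two parameter sets.

```python
def stepwise_compensation(solve_external, solve_internal, mw0, wn0,
                          order="external_first", n_rounds=3):
    """Alternately refine the external parameters MW and internal parameters WN,
    holding the other set fixed, instead of solving the ill-conditioned joint system.
    solve_external(wn) -> MW and solve_internal(mw) -> WN stand in for Section 2.2."""
    mw, wn = mw0, wn0
    for _ in range(n_rounds):
        if order == "external_first":
            mw = solve_external(wn)   # compensate installation angles with WN fixed
            wn = solve_internal(mw)   # then refit the polynomial coefficients with MW fixed
        else:                         # "internal_first"
            wn = solve_internal(mw)
            mw = solve_external(wn)
    return mw, wn

# Toy usage: each solver pulls its parameter toward a fixed "true" value, with a weak
# coupling term mimicking the correlation between internal and external parameters.
mw_true, wn_true = 1.0, 2.0
mw, wn = stepwise_compensation(
    solve_external=lambda wn: mw_true + 0.1 * (wn - wn_true),
    solve_internal=lambda mw: wn_true + 0.1 * (mw - mw_true),
    mw0=0.0, wn0=0.0, order="internal_first")
print(mw, wn)   # converges toward (1.0, 2.0) as the alternation proceeds
```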
To better analyze the accuracy of parameter compensation, in the above external-then-internal scheme the accuracy evaluated after the first (external) step reflects only the external parameter compensation; similarly, in the internal-then-external scheme, the accuracy evaluated after the internal step reflects only the internal parameter compensation. In effect, four compensation schemes are established (as shown in Figure 2): external before internal, internal before external, internal compensation only, and external compensation only.
The specific compensation process is shown in the following steps:
  • The spatial motion model of the target and the rigorous imaging geometric model of the area array camera are established. The generated three-dimensional motion trajectory of the target in space is observed, and the imaging simulation results of the motion trajectory on the two-dimensional image are obtained.
  • In the two-dimensional imaging process, three-dimensional errors are added to the attitude parameters and orbital position parameters, and distortion errors are added to the camera parameters. The calculated position is used as the positioning result before error compensation.
  • The grid points with uniform and regular distribution are selected on the camera array. The pixel coordinates (0, 0) in the upper left corner are taken as the starting grid points, and the pixel coordinates (10,000, 10,000) in the lower right corner are taken as the ending grid points. The interval between the grid points in the X direction and the Y direction is 100 pixels.
  • According to the pixel size design parameters, the pixel coordinates of these grid points are transformed into the coordinates in the camera coordinate system, and the two-dimensional pointing angle in the camera coordinate system is calculated as the initial coordinate value.
  • For the image point positioning error caused by the internal orientation elements, the actual two-dimensional pointing angle coordinates of each grid point in the camera coordinate system are calculated, and the difference between the actual two-dimensional pointing angle and the initial pointing angle is used to reflect the distortion inside the camera, and the internal parameter compensation model is constructed to calculate the internal calibration parameters.
  • For the image point positioning error caused by the external orientation elements, the relative installation angle change caused by the temperature change is used to compensate the external parameters, and the external parameter compensation model is constructed to calculate the external calibration parameters.
  • After calculating the internal and external calibration parameters, the error compensation is carried out according to the two schemes proposed in this paper. The positioning results before and after error compensation are compared with the theoretical values to obtain the optimal compensation scheme.
For external parameter compensation, the image point positioning error before and after compensation is directly calculated. For internal parameter compensation, the two-dimensional pointing angle corresponding to the probe on the area array camera is calculated with the solved intrinsic calibration parameters, and the deviation between the two-dimensional pointing angle of the probe and the two-dimensional pointing angle after adding distortion parameters is counted. For the temperature error, the calibration results of the installation angle at different temperatures are calculated, the change curve of the installation angle and time is established, polynomial fitting is performed, and then it is applied to the compensation of external parameters.
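As a hedged illustration of the last step, the sketch below (Python/NumPy) fits a low-order polynomial to synthetic installation-angle samples over one day and evaluates it at an arbitrary imaging epoch; the sinusoidal data, noise level, and polynomial degree are assumptions standing in for the calibrated samples behind Figure 3.

```python
import numpy as np

# Synthetic hourly installation-angle samples (arcsec) over one day: a diurnal sinusoid plus
# noise stands in for the temperature-driven variation of Figure 3; not measured data.
t_hours = np.arange(0.0, 24.0, 1.0)
alpha_obs = 72.0 + 5.0 * np.sin(2.0 * np.pi * t_hours / 24.0) \
            + np.random.default_rng(1).normal(scale=0.2, size=t_hours.size)

# Low-order polynomial fit of installation angle versus time; the degree is a free choice.
coeffs = np.polyfit(t_hours, alpha_obs, deg=5)
alpha_fit = np.polyval(coeffs, t_hours)
print(f"fit RMS = {np.sqrt(np.mean((alpha_fit - alpha_obs) ** 2)):.3f} arcsec")

# The fitted curve is then evaluated at the imaging epoch to predict the external correction.
print("predicted alpha at t = 10.5 h:", np.polyval(coeffs, 10.5))
```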

3. Experimental Results and Analysis

3.1. Two-Dimensional Pointing Angle Experiment

The orbital altitude of the geosynchronous orbit satellite studied in this paper is 36,000 km. Although the research object of this paper is a three-axis-stabilized geosynchronous orbit satellite, its ambient temperature changes drastically [14]. The drastic temperature differences cause the camera's installation bracket to deform significantly, and the installation angle of the camera changes accordingly, producing the same effect as an attitude error. To characterize the resulting image point positioning error, we collected observation data from different seasons and different times of day over a year and established an empirical error model of the camera installation angle related to temperature and time (as shown in Figure 3).
According to the camera’s internal distortion parameters (as shown in Table 1), assuming that there is no external parameter error, the actual two-dimensional pointing angle coordinates of each grid point in the camera coordinate system are calculated, and the difference between the two-dimensional pointing angle and the initial coordinate value is calculated, as shown in Figure 4. It can be seen that after the distortion parameter is added, the difference of the two-dimensional probe pointing angle of the camera can represent the geometric distortion of the camera.
According to the designed internal distortion parameters of the camera, it is assumed that there is no external parameter error; that is, there is only internal geometric distortion. The actual two-dimensional pointing angle coordinates of each grid point in the camera coordinate system are calculated, and the difference between the actual two-dimensional pointing angle and the initial pointing angle is used to reflect the distortion inside the camera. According to the internal parameter solution method introduced in Section 2.1, internal error compensation is performed on the distorted two-dimensional pointing angles, and the difference between the compensated two-dimensional pointing angle and the initial pointing angle is obtained (as shown in Figure 5). It can be clearly seen that the difference of the two-dimensional pointing angle after compensation is reduced, and the fitting accuracy reaches the millimeter level, which verifies the effectiveness of the internal error compensation scheme proposed in this paper.
Figure 4 and Figure 5 show the effectiveness of the two-dimensional pointing angle model proposed in this paper in solving the internal compensation parameters.

3.2. Internal and External Parameter Error Compensation Experiment

Next, the relative installation angle errors listed in Table 2 are added.
The probe pointing angles of the virtual grid points in the camera coordinate system are calculated and then converted to the body coordinate system through the set relative installation angles to obtain the pointing angle deviation in the body coordinate system, as shown in Figure 6.
The internal and external parameters calculated by each of the four schemes are used to compute the pointing angles of the compensated grid points in body coordinates, which are then differenced with the pointing angles calculated from the true internal and external parameters, as shown in Figure 7, Figure 8, Figure 9 and Figure 10. The internal and external compensation parameters obtained from the four schemes are then used for the compensation calculation of the image point positioning of the target track. The positioning errors after compensation are shown in Table 3.
In the two-dimensional imaging process, errors in three directions are added to the attitude parameters and orbital position parameters, and distortion errors are added to the camera parameters; the resulting positions serve as the positioning results before error compensation. The positioning results before and after error compensation are compared with the theoretical values to determine the optimal compensation scheme (as shown in Table 3).
It can be seen from the experimental results that the error compensation accuracy of the four schemes decreases in the following order: internal before external, external before internal, external only, and internal only. The difference between the internal-before-external and external-before-internal schemes is not significant, which shows that after multiple iterations the internal and external parameters tend to become stable; nevertheless, the scheme in which internal compensation is carried out first, followed by external compensation, is still slightly better than the reverse order. This is due to the different methods of solving the compensation error, and it can be concluded that it is preferable to carry out the external parameter compensation last.
In addition, the error compensation effect of the external-only scheme is much better than that of the internal-only scheme, which indicates that a large external parameter error remains after internal parameter compensation alone; in that case the positioning error does not decrease but increases. This is because the internal and external errors are correlated when both exist, so they partially offset each other. After the internal parameter compensation, most of the positioning error caused by distortion is eliminated, but the error caused by the external parameters becomes more prominent, so the external parameters still need to be compensated. Moreover, after the external parameter compensation it can be seen that the external parameters absorb some of the internal parameter errors; that is, there is partial substitution between the internal and external parameters.

3.3. Temperature Installation Angle Experiment

According to the image point positioning error compensation process designed above, two schemes are used to compensate the image point positioning under different temperature conditions: external parameter compensation only, and combined internal and external compensation. At different times of the day, and thus under different temperature conditions, the same target trajectory is imaged in staring mode, and the camera distortion is the same each time; the specific distortion parameters are listed in Table 1. The external relative installation angle parameters of each experiment are solved and differenced with the original relative installation angles to obtain the curve of the installation angle error over time, which is compared with the originally added relative installation angle error. The compensation results of the two schemes under different temperature conditions are compared with the installation angles of the original simulation, and the results are shown in Figure 11.
Although temperature has a strong effect on the relative installation angle, the variation law of the relative installation angle with time can be recovered after compensation, and the internal-before-external compensation scheme performs better than external compensation only. Even when the relative installation angle error is large, both schemes can recover the true relative installation angle well, which also verifies the correctness and effectiveness of the two compensation schemes designed above.

4. Conclusions

According to the periodicity of the temperature of the high-orbit thermal infrared staring camera, a two-dimensional internal and external positioning compensation model based on temperature change is proposed in this paper. Four experimental schemes of image point positioning compensation were designed: internal parameter error compensation only, external parameter error compensation only, internal before external, and external before internal. The internal and external parameters calculated by the four schemes were used to compute the pointing angles of the compensated grid points in body coordinates, which were differenced with the pointing angles calculated from the real internal and external parameters (Figure 7, Figure 8, Figure 9 and Figure 10). The internal and external compensation parameters obtained from each scheme were then used for the compensation calculation of the target trajectory image point positioning, and the positioning error after compensation was compared with the positioning result before error correction and with the ideal value (as shown in Table 3). The experimental results show that the compensation accuracy was highest when internal compensation was performed first, followed by external compensation. The phase of the external parameter compensation under different temperature conditions was consistent with that of the added installation error. Two schemes were used to compensate the image point positioning under different temperature conditions, namely external parameter compensation only and combined internal and external compensation (as shown in Figure 11), and the compensation results were compared with the installation angles of the original simulation. Both schemes recovered the variation law of the relative installation angle with time well, and the combined internal and external compensation scheme was better than external parameter compensation only. Even when the relative installation angle error was large, both schemes could recover the real relative installation angle well.
Despite the success of our approach, we acknowledge that there are still some limitations and challenges to be addressed in future research. For instance, in solving the internal and external compensation parameters, because the virtual control points provide redundant observations, least squares estimation was adopted; however, this method requires initial values for the estimated parameters, and the error equation must be Taylor-expanded and linearized, which introduces a certain theoretical error. Therefore, in follow-up research, a compensation parameter solution method that does not require linearization should be established in order to improve the accuracy and practicability of parameter compensation. In the internal and external calibration compensation experiments, although this paper proposes a two-dimensional pointing angle model to replace the internal distortion model of the early warning area array camera, it does not fully resolve the correlation between the internal and external parameters, which leaves the solution results less than ideal. Therefore, for on-orbit application, a parameter compensation model preferable to the two-dimensional pointing angle model should be sought to minimize the influence of the strong correlation between internal and external parameters.
In general, the internal and external compensation model proposed in this paper had a good compensation effect on the target image point positioning error, which provides a useful reference for its on-orbit application. External parameter compensation was carried out under different temperature conditions, and a model of the variation of the installation angle with time was established; its phase was consistent with that of the added installation error, which further demonstrates the validity and correctness of the proposed internal and external parameter compensation model. The compensation of the target image point positioning error caused by the various errors in the imaging process was carried out successfully. The results of this paper can provide accurate trajectory reconstruction for the subsequent motion parameter estimation of high-value point targets. This paper focuses on the compensation of the internal and external parameter errors of an area array camera, whose main error components are internal geometric distortion and installation angle error; therefore, the proposed error compensation method for the internal and external parameters of the imaging model can be applied to satellites of all orbit altitudes and types.

Author Contributions

Conceptualization, H.X., J.L. and J.M.; Methodology, B.W., F.W. and W.Z.; Writing—original draft, C.L.; Writing—review & editing, Q.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China, 42271448.

Data Availability Statement

The data that support the findings of this study are available upon request from the corresponding author, Qinghong Sheng. The data are not publicly available owing to their sensitive information, which may compromise the privacy of the research participants.

Acknowledgments

The authors would like to thank the editors and anonymous reviewers for their valuable comments, which helped improve this paper. This work was supported by the National Natural Science Foundation of China (No. 42271448).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhou, B.; Zhang, X.; Li, H. Study on Air Bubble Defect Evolution in Wind Turbine Blade by Infrared Imaging with Rheological Theory. Appl. Sci. 2019, 9, 4742. [Google Scholar] [CrossRef]
  2. Dong, Y.; Fan, D.; Ma, Q.; Ji, S.; Ouyang, H. Automatic on-orbit geometric calibration framework for geostationary optical satellite imagery using open access data. Int. J. Remote Sens. 2019, 40, 6154–6184. [Google Scholar] [CrossRef]
  3. Wang, D. Single Star Early Warning Algorithm and Error Analysis Based on Prior Information Fusion. Master’s Thesis, National University of Defence Technology, Changsha, China, 2015. [Google Scholar]
  4. Pi, Y.; Xie, B.; Yang, B.; Zhang, Y. In-orbit geometric calibration method of linear array push-sweep optical satellite with sparse control points. J. Surv. Mapp. 2019, 48, 216–225. [Google Scholar]
  5. Cao, H.; Tao, P.; Li, H.; Shi, J. Bundle adjustment of satellite images based on an equivalent geometric sensor model with digital elevation model. ISPRS J. Photogramm. Remote Sens. 2019, 156, 169–183. [Google Scholar] [CrossRef]
  6. Xiu, J.; Huang, P.; Li, J.; Zhang, H.; Li, Y. Line of Sight and Image Motion Compensation for Step and Stare Imaging System. Appl. Sci. 2020, 10, 7119. [Google Scholar] [CrossRef]
  7. Kornus, W.; Lehner, M.; Schroeder, M. Geometric in-flight calibration of the stereoscopic line-CCD scanner MOMS-2P. ISPRS J. Photogramm. Remote Sens. 2000, 55, 59–71. [Google Scholar] [CrossRef]
  8. Gachet, R. SPOT5 In-flight Commission: Inner Orientation of HRG and HRS Instruments. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2004, 35, 535–539. [Google Scholar]
  9. Cai, Y.; Zhou, Y.; Zhang, H.; Xia, Y.; Qiao, P.; Zhao, J. Review of Target Geo-Location Algorithms for Aerial Remote Sensing Cameras without Control Points. Appl. Sci. 2022, 12, 12689. [Google Scholar] [CrossRef]
  10. Srivastava, P.K.; Alurkar, M.S. Inflight calibration of IRS-1C imaging geometry for data products. ISPRS J. Photogramm. Remote Sens. 1997, 52, 215–221. [Google Scholar] [CrossRef]
  11. Tadono, T.; Shimada, M.; Watanabe, M.; Hashimoto, T.; Iwata, T. Calibration and Validation of PRISM Onboard ALOS. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2004, 35, 13–18. [Google Scholar]
  12. Su, W. Research on Techniques for On-Orbit Geometric Calibration of Space Linear CCD Sensor. Ph.D. Thesis, PLA Information Engineering University, Zhengzhou, China, 2010. [Google Scholar]
  13. Wang, C. Research on High Precision Ground Positioning Algorithm of Space Linear Array Satellite Images. Master’s Thesis, PLA Information Engineering University, Zhengzhou, China, 2016. [Google Scholar]
  14. Nishio, Y.; Tohyama, F.; Onishi, N. The sensor temperature characteristics of a fluxgate magnetometer by a wide-range temperature test for a Mercury exploration satellite. Meas. Sci. Technol. 2007, 18, 2721. [Google Scholar] [CrossRef]
  15. Guo, Y. Study on the variation of in-orbit temperature parameters of stationary orbit satellite. Eng. Spacecr. 2011, 20, 76–81. [Google Scholar]
  16. Tian, Y.; Sun, A.; Luo, N.; Gao, Y. Aerial image mosaicking based on the 6-DoF imaging model. Int. J. Remote Sens. 2020, 41, 74–89. [Google Scholar] [CrossRef]
  17. Chen, J.; Zhang, B.; Tang, X.; Li, G.; Zhou, X.; Hu, L.; Dou, X. On-Orbit Geometric Calibration and Accuracy Validation for Laser Footprint Cameras of GF-7 Satellite. Remote Sens. 2022, 14, 1408. [Google Scholar] [CrossRef]
  18. Liu, Y.; Mo, F.; Zhang, Y.; Xie, J.; Li, Q.; Hu, F.; Liu, C. Validation of preliminary geometric positioning accuracy for China’s civil high-resolution surveying and mapping satellites: Ziyuan-3 01 and Gaofen-7 panchromatic imagery. Remote Sens. Lett. 2021, 12, 521–530. [Google Scholar] [CrossRef]
Figure 1. Schematic diagram of ground virtual control points.
Figure 2. Design of internal and external parameter compensation process. (a) Internal compensation before external compensation; (b) external compensation before internal compensation.
Figure 3. Curve of relative installation angle and time change.
Figure 4. Pointing angle difference diagram after camera distortion. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 5. Pointing angle difference diagram after internal parameter compensation. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 6. Parameter changes in four compensation schemes. In the figure, ‘ICO’ refers to internal compensation only, ‘ECO’ refers to external compensation only, ‘ICBEC’ refers to internal compensation before external compensation, and ‘ECBIC’ refers to external compensation before internal compensation. (a) xa0–xa4 parameter change; (b) xa5–xa9 parameter change; (c) ya0–ya4 parameter change; and (d) ya5–ya9 parameter change.
Figure 7. Pointing angle difference diagram only after internal compensation scheme. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 8. Pointing angle difference diagram only after external compensation scheme. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 9. Pointing angle difference diagram of the compensation scheme of the internal first and then the external. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 10. Pointing angle difference diagram of the compensation scheme of the external first and then the internal. (a) Pointing angle difference in X direction; (b) pointing angle difference in Y direction.
Figure 11. The relationship between the relative installation angle and time change of the two schemes. (a) Compensation result of relative installation angle α; (b) compensation result of relative installation angle β; (c) compensation result of relative installation angle θ.
Table 1. Target image simulation parameters.

Parameter | Value
Image principal point Δx/pixel | −0.0015
Image principal point Δy/pixel | −0.0015
Focal length Δf/mm | 1
Radial distortion k1/pixel−2 | 5 × 10−11
Eccentric distortion p1/pixel−1 | 6 × 10−8
Eccentric distortion p2/pixel−1 | −3 × 10−8
Area array tilt φx/arcsec | −8
Area array tilt φy/arcsec | −8
Area array rotation β/arcsec | −8
Table 2. Relative installation angle error parameters.

Installation Angle Error | Value
α/arcsec | 72
β/arcsec | −72
θ/arcsec | 72
Table 3. Positioning error after four compensation schemes.

Positioning Error | Internal Only | External Only | External before Internal | Internal before External | Before Error Compensation
X direction/pixel | −27.459 | −2.268 | −1.859 | −0.608 | −28.414
Y direction/pixel | −38.781 | −3.895 | −2.313 | −0.875 | −32.701
Total error/pixel | 47.518 | 4.507 | 2.967 | 1.065 | 43.321