Article

A Novel Non-Coaxial Image Motion Compensation Method for Airborne Infrared Area-Array Whisk-Broom Camera Under Backward Squint Conditions

Jiarong Jin, Guicheng Han and Yueming Wang
1 Key Laboratory of Space Active Opto-Electronics Technology, Shanghai Institute of Technical Physics, Chinese Academy of Sciences, Shanghai 200083, China
2 University of Chinese Academy of Sciences, Beijing 100049, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(9), 4619; https://doi.org/10.3390/app15094619
Submission received: 15 February 2025 / Revised: 11 April 2025 / Accepted: 21 April 2025 / Published: 22 April 2025
(This article belongs to the Section Earth Sciences)

Abstract: Three-dimensional (3D) imaging technology enables the simultaneous capture of two-dimensional surface features, depth information, and the spatial structure of the target area, offering broad applications in airborne imaging. Airborne area-array whisk-broom cameras are widely used at low-to-medium altitudes, supporting imaging at high speed-to-height ratios thanks to their ability to achieve wide-field coverage through scanning. Currently, most airborne area-array whisk-broom imaging systems employ a vertical downward view, which limits their ability to fully capture the 3D characteristics of the target area. To overcome this limitation, this study proposes a backward-squint area-array wide-field whisk-broom imaging scheme. However, under such whisk-broom scanning conditions, a misalignment exists between the equivalent rotation axis of the image motion compensation mirror after optical path deflection and the roll scanning axis of the camera. To resolve this problem, we propose an accurate calculation model for non-coaxial image motion compensation. We conducted theoretical analysis and simulation experiments to validate the proposed method, achieving a stabilization accuracy better than 0.65 μrad per compensation cycle during a 45° backward squint and a 90° scanning width. Our research advances airborne area-array whisk-broom imaging technology by proposing a novel backward-squint imaging scheme and an innovative non-coaxial image motion compensation model, which significantly enhance wide-field squint imaging and 3D modeling capabilities.

1. Introduction

In recent years, airborne remote sensing imaging technology has advanced rapidly [1] and has been widely applied in various fields, including geology [2,3], oceanography [4], agriculture and forestry [5,6], and urban studies [7,8]. As imaging tasks become increasingly complex, the use of three-dimensional (3D) imaging technology for remote sensing has become an important research focus [9,10]. 3D imaging technology captures both the two-dimensional surface features and the depth and spatial structure of a target region, providing richer and more detailed environmental data [11]. This enables precise terrain analysis, target recognition, and tracking, driving an increasing demand for such technology. Currently, in the field of 3D remote sensing imaging, the predominant approach is to stitch images acquired from multiple area-array cameras [12]. This method achieves 3D imaging by integrating multiple camera fields of view and processing the data using 3D modeling software [13]. While the method is primarily used for visible-light imaging with a wide field of view, it requires an additional stabilization platform, which increases both system size and complexity.
In infrared imaging, the single-frame imaging field of view is limited by the constrained size of area-array detectors. Area-array cameras can achieve wide-swath imaging through whisk-broom scanning [14], making it an effective and lightweight approach to expanding the imaging field of view. Iyengar et al. [15] and Lareau et al. [16] used a step-and-stare imaging method, where a rotating mechanism captures images at multiple positions perpendicular to the flight path, enabling wide-field imaging. Wang et al. [17] further increased the swath width and improved the scanning efficiency by implementing whisk-broom scanning. However, most area-array cameras use downward-looking rotational scanning, primarily capturing downward-looking images; hence, they are unable to acquire multi-angle 3D information of the target.
During airborne imaging, scanning motion, flight motion, platform vibrations, and airflow disturbances cause relative motion between the camera and the imaged targets. This relative motion induces image motion, resulting in blurring during exposure. Thus, line-of-sight (LOS) stabilization [18] and image motion compensation [19,20,21] are essential, as their effectiveness directly impacts imaging quality. Held et al. [22] developed a rotation model based on coordinate transformation to convert from the geodetic coordinate system to the aircraft coordinate system. By repeatedly applying rotation matrices, they determined the actual LOS direction and computed image motion. Yang et al. [23] further refined the image motion solution at the focal plane; however, its computational complexity made it unsuitable for high-dynamic response compensation in airborne platforms. Additionally, Wang et al. [24] applied an orthogonal decomposition compensation method, utilizing double-angle relationships and optical reversibility. This method used two separate reflective mirrors for LOS stabilization and image motion compensation. A 45° pitch mirror compensated for forward image motion and pitch variations, while a compensation mirror corrected scanning image motion and roll variations, ensuring a lightweight, compact system design. Although computationally simple, this method was limited by coupling constraints. At large pitch angles, the equivalent compensation axis became non-coaxial with the aircraft roll axis, rendering it unsuitable for backward-squint whisk-broom imaging. Therefore, in backward-squint whisk-broom imaging, the coordinate transformation method is limited by computational complexity, while the orthogonal decomposition method is inapplicable.
To address the limitation of vertical downward whisk-broom cameras in acquiring complete 3D information of target scenes, this paper proposes a backward-squint wide-swath whisk-broom imaging scheme. In addition, we propose a LOS path planning method and a novel non-coaxial image motion compensation method based on the aircraft coordinate system to overcome the limitations of traditional image motion compensation methods under the backward-squint scanning conditions. First, we plan the LOS path in the navigation coordinate system to compensate for forward image motion. Next, considering attitude disturbances, we determine the roll angle of the two-axis camera and the pitch mirror angle in the aircraft coordinate system. Finally, we establish a non-coaxial image motion compensation model under backward-squint conditions and propose a high-precision method for correcting non-coaxial image motion. We conducted theoretical analysis and real-time simulations of the LOS path and ground projection trajectory to validate the accuracy of the compensation method under backward-squint conditions. The findings of this paper provide significant support for advancing airborne area-array whisk-broom imaging technology in wide-swath backward-squint imaging and 3D modeling applications.

2. Coordinate System

2.1. Ground (G) and Navigation (N) Coordinate System

The navigation coordinate system is defined as a right-handed inertial frame, with its origin N located at the center of mass of the aircraft platform. The ground coordinate frame G is defined as a local-level Cartesian frame tangent to the Earth’s surface at the imaging region, as illustrated in Figure 1. The ground coordinate system shares the same axis orientations as the navigation coordinate system, where the x-axis points toward true north, the y-axis points toward true east, and the z-axis aligns with the local vertical direction. In this system, G represents a reference point located on the geodetic plane.

2.2. Aircraft Body (B) and Camera (C) Coordinate System

The aircraft body coordinate system is defined as follows: the Bx-axis points in the direction of the aircraft’s forward flight, the By-axis points to the right wing of the aircraft, and the Bz-axis points directly downward. The navigation coordinate system, the aircraft body coordinate system, and the camera coordinate system share the same origin, located at the aircraft’s center of mass.
The aircraft’s attitude is provided by a position and orientation system (POS), which integrates an inertial measurement unit (IMU). Given an initial yaw angle ψ0 of the aircraft, the navigation coordinate system undergoes a sequential clockwise rotation around the z-, y-, and x-axes by ψpos − ψ0 (yaw angle), θpos (pitch angle), and ϕpos (roll angle) to form the aircraft body coordinate system. Since the imaging system employs a two-axis gimbal structure, where the roll axis is the outer frame and the pitch axis is the inner frame, the aircraft body coordinate system is further transformed into the camera coordinate system. This transformation involves first rotating clockwise around the y-axis by the camera pitch angle θ, followed by a clockwise rotation around the x-axis by the camera roll angle ϕ. The real-time LOS direction of the imaging system aligns with the Cz-axis, as illustrated in Figure 2.
According to the definition of rotational relationships, the rotation matrix $R_N^B$ from the navigation coordinate system to the aircraft body coordinate system is as follows [20]:
$$
R_N^B = R_{Nx}(\phi_{pos})\,R_{Ny}(\theta_{pos})\,R_{Nz}(\psi_{pos}-\psi_0)
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi_{pos} & \sin\phi_{pos} \\ 0 & -\sin\phi_{pos} & \cos\phi_{pos} \end{bmatrix}
\begin{bmatrix} \cos\theta_{pos} & 0 & -\sin\theta_{pos} \\ 0 & 1 & 0 \\ \sin\theta_{pos} & 0 & \cos\theta_{pos} \end{bmatrix}
\begin{bmatrix} \cos(\psi_{pos}-\psi_0) & \sin(\psi_{pos}-\psi_0) & 0 \\ -\sin(\psi_{pos}-\psi_0) & \cos(\psi_{pos}-\psi_0) & 0 \\ 0 & 0 & 1 \end{bmatrix}
\tag{1}
$$
Similarly, the rotation matrix $R_B^C$ from the aircraft body coordinate system to the camera coordinate system is as follows:
$$
R_B^C = R_{Cx}(\phi)\,R_{Cy}(\theta)
= \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & \sin\phi \\ 0 & -\sin\phi & \cos\phi \end{bmatrix}
\begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{bmatrix}
= \begin{bmatrix} \cos\theta & 0 & -\sin\theta \\ \sin\theta\sin\phi & \cos\phi & \cos\theta\sin\phi \\ \sin\theta\cos\phi & -\sin\phi & \cos\theta\cos\phi \end{bmatrix}
\tag{2}
$$
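For readers who want to experiment with the transformation chain, the following Python/NumPy sketch implements Equations (1) and (2) under the rotation conventions stated above. It is a minimal illustration; the function and variable names are ours, not taken from the authors' implementation.

```python
import numpy as np

def rot_x(a):
    """Frame rotation about the x-axis by angle a (rad)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, s],
                     [0.0, -s, c]])

def rot_y(a):
    """Frame rotation about the y-axis by angle a (rad)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

def rot_z(a):
    """Frame rotation about the z-axis by angle a (rad)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0],
                     [-s, c, 0.0],
                     [0.0, 0.0, 1.0]])

def R_N_to_B(phi_pos, theta_pos, psi_pos, psi_0):
    """Equation (1): navigation -> aircraft body, z-y-x sequence."""
    return rot_x(phi_pos) @ rot_y(theta_pos) @ rot_z(psi_pos - psi_0)

def R_B_to_C(phi, theta):
    """Equation (2): pitch about y, then roll about x."""
    return rot_x(phi) @ rot_y(theta)
```

Both matrices are orthogonal, so the inverse transformations are simply their transposes.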

3. LOS Path Planning

3.1. LOS Correction Model

Most existing area-array whisk-broom imaging systems use downward-looking cameras, with the camera pitch angle close to 0°. During whisk-broom imaging, the LOS ground trajectory remains approximately along the same horizontal line, as shown by the light blue curve in Figure 3a. Under the proposed backward-squint imaging conditions, the camera pitch angle increases significantly to 45°. Consequently, during whisk-broom imaging, the LOS deviates substantially from the vertical flight path, as illustrated by the light blue curve in Figure 3b.
During aircraft flight, image motion is induced by both forward motion and scanning deviations along the wingspan direction. Considering these factors, a LOS path planning method for vertical flight path correction under backward-squint conditions is established, as shown in Figure 4.
Considering a single whisk-broom scanning cycle, the camera performs uniform angular motion around the roll axis. Based on geometric relationships, the scanning angular velocity ω is determined by the pitch tilt angle θ, the total scanning field of view ϕTFOV, the field overlap ratio ρ, the instantaneous field of view β, the detector's vertical pixel count n, the scanning efficiency η, and the speed-to-height ratio v/H, and can be expressed as the following equation:
$$
\omega = \frac{\phi_{TFOV}}{\eta(1-\rho)\left[\tan\!\left(\theta + \tfrac{1}{2}n\beta\right) - \tan\!\left(\theta - \tfrac{1}{2}n\beta\right)\right]}\cdot\frac{v}{H}
\tag{3}
$$
When the aircraft’s fuselage is aligned with the nadir point, the camera is backward squinted at 45° toward the ground origin; this moment is defined as t = 0, at which the horizontal distance between the aircraft and the ground origin is L = H·tan 45° = H. Based on the geometric relationships illustrated in Figure 4, the roll planning angle ϕN, the effective imaging height H′, the horizontal distance L between the aircraft and the ground point, and the pitch planning angle θN are expressed by the following equations:
$$
\begin{cases}
\phi_N(t) = \omega t \\[2pt]
L(t) = H\tan\theta + vt \\[2pt]
H'(t) = \dfrac{H}{\cos\phi_N(t)} \\[8pt]
\theta_N(t) = \arctan\dfrac{L(t)}{H'(t)}
\end{cases}
\tag{4}
$$
Substituting ϕN and θN into Equation (2) yields the normalized LOS direction vector $v_{LOS}^{N}$:
$$
v_{LOS}^{N} = \begin{bmatrix} -\sin\theta_N \\ \cos\theta_N\sin\phi_N \\ \cos\theta_N\cos\phi_N \end{bmatrix}
\tag{5}
$$
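The planning chain of Equations (3)–(5) is straightforward to prototype. The sketch below is a minimal Python/NumPy version; all parameter names, and the use of arctan2 for numerical robustness, are our own choices rather than the authors' code.

```python
import numpy as np

def scan_rate(phi_tfov, eta, rho, beta, n, theta, v_over_h):
    """Equation (3): whisk-broom scanning angular velocity (rad/s)."""
    footprint = np.tan(theta + 0.5 * n * beta) - np.tan(theta - 0.5 * n * beta)
    return phi_tfov / (eta * (1.0 - rho) * footprint) * v_over_h

def plan_los(t, omega, H, v, theta):
    """Equations (4) and (5): planning angles and LOS unit vector.

    t = 0 corresponds to the nadir crossing, where the camera squints
    backward at the ground origin by the pitch tilt angle theta.
    """
    phi_n = omega * t                      # roll planning angle
    L = H * np.tan(theta) + v * t          # horizontal distance to ground point
    H_eff = H / np.cos(phi_n)              # effective imaging height
    theta_n = np.arctan2(L, H_eff)         # pitch planning angle
    v_los_n = np.array([-np.sin(theta_n),
                        np.cos(theta_n) * np.sin(phi_n),
                        np.cos(theta_n) * np.cos(phi_n)])
    return phi_n, theta_n, v_los_n
```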

3.2. Camera Planning Angles Calculation

The vector $v_{LOS}^{N}$ represents the LOS direction in the navigation coordinate system without attitude disturbances. However, during aircraft flight, external disturbances cause variations in the attitude angles, necessitating the transformation of this LOS vector into the aircraft coordinate system for direction control. The LOS direction vector in the navigation coordinate system is transformed to the aircraft coordinate system using the rotation matrix $R_N^B$ as follows:
$$
v_{LOS}^{B} = R_N^B\, v_{LOS}^{N} = R_{Nx}(\phi_{pos})\,R_{Ny}(\theta_{pos})\,R_{Nz}(\psi_{pos}-\psi_0)\, v_{LOS}^{N}
\tag{6}
$$
Since $v_{LOS}^{N}$ is a unit vector and the rotation matrix $R_N^B$ is orthogonal, the transformed vector $v_{LOS}^{B}$ remains a unit vector. According to Equation (2), $v_{LOS}^{B}$ can be expressed in terms of the camera roll angle ϕC and the camera pitch angle θC:
$$
v_{LOS}^{B} = \begin{bmatrix} -\sin\theta_C \\ \cos\theta_C\sin\phi_C \\ \cos\theta_C\cos\phi_C \end{bmatrix}
\tag{7}
$$
Substituting Equations (5) and (7) into Equation (6) and solving the matrix equation, we obtain the following results:
$$
\begin{cases}
\theta_C = \arcsin\!\left[ R_N^B(1,1)\sin\theta_N - R_N^B(1,2)\cos\theta_N\sin\phi_N - R_N^B(1,3)\cos\theta_N\cos\phi_N \right] \\[10pt]
\phi_C = \arctan\dfrac{-R_N^B(2,1)\sin\theta_N + R_N^B(2,2)\cos\theta_N\sin\phi_N + R_N^B(2,3)\cos\theta_N\cos\phi_N}{-R_N^B(3,1)\sin\theta_N + R_N^B(3,2)\cos\theta_N\sin\phi_N + R_N^B(3,3)\cos\theta_N\cos\phi_N}
\end{cases}
\tag{8}
$$
Since ϕN and θN vary over time, and the aircraft’s three-axis attitude angles are continuously updated, real-time calculations can determine the target roll and pitch angles required for the camera system. These target values are then applied to the roll scanning motor and pitch mirror to achieve stable LOS pointing.
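As a concrete illustration of Equations (6)–(8), the command angles can be read directly off the components of the transformed unit vector. The following is a sketch under the sign conventions of Equation (7); using arctan2 rather than a plain arctangent is our addition, to keep the roll command well-defined across the full scan.

```python
import numpy as np

def camera_planning_angles(R_nb, theta_n, phi_n):
    """Equation (8): camera pitch/roll commands under attitude disturbance.

    R_nb is the navigation-to-body matrix of Equation (1); theta_n and
    phi_n are the planning angles from Equation (4).
    """
    v_los_n = np.array([-np.sin(theta_n),
                        np.cos(theta_n) * np.sin(phi_n),
                        np.cos(theta_n) * np.cos(phi_n)])
    v_los_b = R_nb @ v_los_n                   # Equation (6)
    theta_c = np.arcsin(-v_los_b[0])           # first component is -sin(theta_C)
    phi_c = np.arctan2(v_los_b[1], v_los_b[2]) # tan(phi_C) = v[1] / v[2]
    return theta_c, phi_c
```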

4. Non-Coaxial Image Motion Compensation Method

Most dual-axis area-array cameras operate under 90° downward-looking or wingspan-sweep side-viewing conditions. The fast compensation mirror, used for image motion compensation, is fixed inside the aircraft’s roll gimbal and moves with the roll gimbal during whisk-broom scanning. After the compensation mirror rotates, the optical path is reflected by the pitch mirror to compensate for roll-induced image motion. According to optical relationships, the equivalent compensation axis of the compensation mirror is perpendicular to the LOS. Under 90° downward-looking scanning conditions, this equivalent compensation axis coincides with the aircraft’s roll axis, allowing direct one-half compensation for roll deviations.
However, under the backward-squint whisk-broom scanning conditions discussed in this paper, the pitch mirror has a significant tilt angle and undergoes rotational changes with roll movements. Consequently, the equivalent compensation axis also changes due to the interactions between the pitch mirror’s motion and the roll gimbal’s rotation. As a result, the equivalent rotation axis of the compensation mirror no longer coincides with the aircraft’s roll axis, as illustrated in Figure 5. Therefore, traditional composite-axis image stabilization methods are not suitable for backward-squint imaging. Thus, a new non-coaxial image motion compensation method is required to address this challenge.

4.1. Image Motion Compensation Vector Model

The vector relationship for image motion compensation is illustrated in Figure 6. The axes Bx, By, and Bz correspond to the three axes of the aircraft coordinate system. In the camera coordinate system, the LOS vector $v_{LOS}^{C}$ and the equivalent compensation rotation axis vector $v_{Comp}^{C}$ correspond to the Cz and Cx directions, respectively; thus $v_{LOS}^{C} = (0, 0, 1)^T$ and $v_{Comp}^{C} = (1, 0, 0)^T$.
The LOS vector and the equivalent compensation rotation axis vector are transformed into the aircraft coordinate system using the rotation matrix $R_B^C$ as follows:
$$
v_{LOS}^{B} = R_B^C\, v_{LOS}^{C} = \begin{bmatrix} -\sin\theta \\ \cos\theta\sin\phi \\ \cos\theta\cos\phi \end{bmatrix},
\qquad
v_{Comp}^{B} = R_B^C\, v_{Comp}^{C} = \begin{bmatrix} \cos\theta \\ \sin\theta\sin\phi \\ \sin\theta\cos\phi \end{bmatrix}
\tag{9}
$$
When exposure starts, the initial roll and pitch angles of the camera system are ϕstart and θstart. During the exposure, the roll angle rotates by Δϕ, changing to ϕstart + Δϕ, while the pitch mirror simultaneously adjusts the pitch angle to θ. The expressions for the initial LOS vector $v_{LOS}^{start}$, the real-time LOS vector $v_{LOS}$, and the compensation rotation axis vector $v_{Comp}$ are given as follows:
$$
v_{LOS}^{start} = \begin{bmatrix} -\sin\theta_{start} \\ \cos\theta_{start}\sin\phi_{start} \\ \cos\theta_{start}\cos\phi_{start} \end{bmatrix},
\quad
v_{LOS} = \begin{bmatrix} -\sin\theta \\ \cos\theta\sin(\phi_{start}+\Delta\phi) \\ \cos\theta\cos(\phi_{start}+\Delta\phi) \end{bmatrix},
\quad
v_{Comp} = \begin{bmatrix} \cos\theta \\ \sin\theta\sin(\phi_{start}+\Delta\phi) \\ \sin\theta\cos(\phi_{start}+\Delta\phi) \end{bmatrix}
\tag{10}
$$
The compensation encoding angle κ is fed back by its corresponding encoder. According to the double-angle relationship, the actual compensation angle is 2κ. The corrected LOS direction after a 2κ rotation by the compensation mirror can be calculated using Rodrigues’ rotation formula:
$$
v_{rot} = \cos(2\kappa)\,v_{LOS} + \left[1-\cos(2\kappa)\right](v_{Comp}\cdot v_{LOS})\,v_{Comp} + \sin(2\kappa)\,(v_{Comp}\times v_{LOS})
\tag{11}
$$
Substituting Equation (10) into Equation (11) yields:
$$
v_{rot} = \begin{bmatrix}
-\sin\theta\cos(2\kappa) \\
\sin(\phi_{start}+\Delta\phi)\cos\theta\cos(2\kappa) - \cos(\phi_{start}+\Delta\phi)\sin(2\kappa) \\
\cos(\phi_{start}+\Delta\phi)\cos\theta\cos(2\kappa) + \sin(\phi_{start}+\Delta\phi)\sin(2\kappa)
\end{bmatrix}
\tag{12}
$$
After the combined action of the compensation mirror rotation and the pitch mirror adjustment, the LOS should return to its initial position, which gives $v_{rot} = v_{LOS}^{start}$. This constraint can be expressed by the following equations:
$$
\sin\theta\cos(2\kappa) = \sin\theta_{start}
\tag{13}
$$
$$
\sin(\phi_{start}+\Delta\phi)\cos\theta\cos(2\kappa) - \cos(\phi_{start}+\Delta\phi)\sin(2\kappa) = \cos\theta_{start}\sin\phi_{start}
\tag{14}
$$
$$
\cos(\phi_{start}+\Delta\phi)\cos\theta\cos(2\kappa) + \sin(\phi_{start}+\Delta\phi)\sin(2\kappa) = \cos\theta_{start}\cos\phi_{start}
\tag{15}
$$
At this point, the compensation problem is transformed into an angle-solving problem, where θstart, ϕstart, and Δϕ are known parameters in the constraint equations. The pitch angle θ and the compensation encoding angle κ can be obtained through iterative numerical solution. However, this approach requires significant computational resources, making it unsuitable for real-time onboard processing.
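Before simplifying, the vector model itself can be verified numerically: the closed form of Equation (12) should agree with a direct evaluation of Rodrigues' formula, Equation (11). A minimal Python/NumPy check, with arbitrary illustrative angle values of our choosing:

```python
import numpy as np

def rodrigues(v, k, angle):
    """Equation (11): rotate v about the unit axis k by `angle`."""
    c, s = np.cos(angle), np.sin(angle)
    return c * v + (1 - c) * np.dot(k, v) * k + s * np.cross(k, v)

# here phi1 stands for phi_start + delta_phi
theta, phi1, two_kappa = np.radians(45.0), np.radians(11.0), np.radians(0.7)
v_los = np.array([-np.sin(theta), np.cos(theta) * np.sin(phi1),
                  np.cos(theta) * np.cos(phi1)])
v_comp = np.array([np.cos(theta), np.sin(theta) * np.sin(phi1),
                   np.sin(theta) * np.cos(phi1)])

# closed-form components of Equation (12)
closed = np.array([
    -np.sin(theta) * np.cos(two_kappa),
    np.sin(phi1) * np.cos(theta) * np.cos(two_kappa) - np.cos(phi1) * np.sin(two_kappa),
    np.cos(phi1) * np.cos(theta) * np.cos(two_kappa) + np.sin(phi1) * np.sin(two_kappa),
])
assert np.allclose(rodrigues(v_los, v_comp, two_kappa), closed)
```

Note that $v_{Comp}\cdot v_{LOS} = 0$, so the middle term of Rodrigues' formula vanishes identically here.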

4.2. Precise Image Motion Compensation Method

The constraint Equations (13)–(15) form a nonlinear system; however, since there are only two variables, κ and θ, the system can be simplified into a single-variable equation by successively eliminating variables. From Equation (13), the following relation can be obtained:
$$
\cos(2\kappa) = \frac{\sin\theta_{start}}{\sin\theta}
\tag{16}
$$
Substituting this into Equations (14) and (15) yields the following relationship:
$$
\begin{cases}
\sin(2\kappa) = \dfrac{\sin(\phi_{start}+\Delta\phi)\cos\theta\,\dfrac{\sin\theta_{start}}{\sin\theta} - \cos\theta_{start}\sin\phi_{start}}{\cos(\phi_{start}+\Delta\phi)} \\[14pt]
\sin(2\kappa) = \dfrac{\cos\theta_{start}\cos\phi_{start} - \cos(\phi_{start}+\Delta\phi)\cos\theta\,\dfrac{\sin\theta_{start}}{\sin\theta}}{\sin(\phi_{start}+\Delta\phi)}
\end{cases}
\tag{17}
$$
Combining the two expressions in Equation (17) and eliminating $\sin(2\kappa)$, an equation containing only the variable θ can be obtained. The final solution is as follows:
$$
\theta = \arctan\frac{\tan\theta_{start}}{\cos(\Delta\phi)},
\qquad
2\kappa = \arccos\frac{\sin\theta_{start}}{\sin\!\left[\arctan\dfrac{\tan\theta_{start}}{\cos(\Delta\phi)}\right]}
\tag{18}
$$
As seen from Equation (18), the exact analytical solutions for θ and κ depend only on θstart and Δϕ, and are independent of the initial roll angle ϕstart. Although this result provides an exact solution, the compensation angle κ involves nested trigonometric functions, making the computation complex. Therefore, further simplification of this method is required.
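In code, the exact solution of Equation (18) is only a few operations; the computational burden on a flight processor lies in the nested trigonometric evaluations rather than code size. A minimal sketch:

```python
import numpy as np

def exact_compensation(theta_start, dphi):
    """Equation (18): exact pitch angle and compensation angle 2*kappa (rad)."""
    theta = np.arctan(np.tan(theta_start) / np.cos(dphi))
    two_kappa = np.arccos(np.sin(theta_start) / np.sin(theta))
    return theta, two_kappa
```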

4.3. Simplified Image Motion Compensation Method

Considering that the roll rotation angle Δϕ during camera exposure is small, the corresponding compensation angle 2κ is also small. Thus, a small-angle approximation can be applied for simplification:
$$
\begin{cases}
\sin(\Delta\phi)\approx\Delta\phi, \quad \cos(\Delta\phi)\approx 1 \\
\sin(2\kappa)\approx 2\kappa, \quad \cos(2\kappa)\approx 1 \\
\sin(\phi_{start}+\Delta\phi)\approx\sin\phi_{start}+\Delta\phi\cos\phi_{start} \\
\cos(\phi_{start}+\Delta\phi)\approx\cos\phi_{start}-\Delta\phi\sin\phi_{start}
\end{cases}
\tag{19}
$$
Substituting the approximate results of Equation (19) into Equations (13) and (14) yields the following expressions:
$$
\begin{cases}
\sin\theta \approx \sin\theta_{start} \\[4pt]
(\sin\phi_{start}+\Delta\phi\cos\phi_{start})\cos\theta\cos(2\kappa) - (\cos\phi_{start}-\Delta\phi\sin\phi_{start})\sin(2\kappa) \approx \cos\theta_{start}\sin\phi_{start}
\end{cases}
\tag{20}
$$
Neglecting the second-order term $\Delta\phi\sin\phi_{start}\sin(2\kappa)$ and solving, the approximate solution is as follows:
$$
\theta = \theta_{start},
\qquad
2\kappa = \Delta\phi\cos\theta_{start}
\tag{21}
$$
At this point, the simplified approximate solution has been obtained. Compared to Equation (18), it offers significant computational advantages. However, due to the approximation-induced errors, further simulation analysis is required to verify the compensation accuracy of the simplified approach.
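The deviation analysis of Section 5.1 can be reproduced with a few lines comparing Equations (18) and (21). A minimal sketch follows; the sampled Δϕ values are our choice:

```python
import numpy as np

def exact(theta_start, dphi):
    # Equation (18)
    theta = np.arctan(np.tan(theta_start) / np.cos(dphi))
    return theta, np.arccos(np.sin(theta_start) / np.sin(theta))

def simplified(theta_start, dphi):
    # Equation (21): small-angle approximation
    return theta_start, dphi * np.cos(theta_start)

theta_start = np.radians(45.0)
for dphi_deg in (0.5, 1.0, 1.28, 1.5):
    dphi = np.radians(dphi_deg)
    th_e, tk_e = exact(theta_start, dphi)
    th_s, tk_s = simplified(theta_start, dphi)
    print(f"dphi = {dphi_deg:4.2f} deg: "
          f"mirror deviation = {abs(np.degrees(tk_s - tk_e)):.1e} deg, "
          f"pitch deviation = {abs(np.degrees(th_e - th_s)):.1e} deg")
```

At θstart = 45°, this yields a mirror deviation of about 6 × 10−5° at Δϕ = 1.5° and a pitch deviation of about 7 × 10−3° (0.5 pixels at a 250 μrad instantaneous field of view) at Δϕ = 1.28°, matching the figures quoted in Section 5.1.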

5. Experiments and Discussion

To verify the LOS path planning and image motion compensation method, a series of simulation experiments were conducted. The applicability conditions of the compensation method were determined, and its limitations were analyzed.

5.1. Deviation Analysis of Simplified Method

Equations (18) and (21) provide the exact solution and the small-angle approximated solution for non-coaxial image motion compensation, respectively. Simulation calculations are required to compare and verify the compensation accuracy and deviation of the simplified approximation method.
As the roll axis of the platform rotates from −45° to +45°, the camera pitch planning angle θN ranges from 32° to 45°, as determined by the LOS path planning model in Equation (4). Considering attitude disturbances and scanning speed, the roll variation Δϕ during the exposure period remains below 1°. When the aircraft is aligned with the nadir point, the initial camera pitch angle θstart is 45°. As the roll variation angle Δϕ increases from 0° to 1.5°, the planning curves of the compensation angle 2κ and pitch angle θ for the simplified approximation algorithm and the exact compensation algorithm are shown in Figure 7a,b, while the compensation deviations are shown in Figure 7c,d.
As shown in Figure 7, after applying the simplified approximation algorithm, the maximum compensation deviation of the compensation mirror during the roll motion is 6 × 10−5°, corresponding to approximately 1 μrad. With an instantaneous field of view of 250 μrad, the calculated compensation deviation is better than 0.004 pixels, meeting the compensation requirements. Since the simplified compensation algorithm assumes that the camera pitch angle remains equal to its initial value, there exists a certain approximation deviation in the pitch angle during the compensation period. This deviation reaches 0.5 pixels, equivalent to 0.007°, at a roll motion angle of 1.28°. Therefore, as long as the roll motion remains below 1.28° during exposure, the simplified algorithm can achieve effective image motion compensation.
Figure 7 shows that when the initial pitch angle is fixed, the approximation deviation of the compensation mirror and pitch mirror is positively correlated with the roll variation angle. Therefore, we further analyze the effect of the initial camera pitch angle θstart on the simplified compensation algorithm when the roll variation angle Δϕ is fixed. When Δϕ = 1.28° and θstart varies from 30° to 45°, planning curves of the compensation angle 2κ and pitch angle θ for the simplified approximation algorithm and the exact compensation algorithm are shown in Figure 8a,b, while the compensation deviation curves are shown in Figure 8c,d.
As shown in Figure 8, when the roll variation angle is fixed, the approximation deviation of the compensation mirror and pitch mirror is positively correlated with the absolute value of the initial pitch angle, although the variation is small. The compensation deviation of the compensation mirror remains around 3 × 10−5°, corresponding to an accuracy better than 0.002 pixels, while the compensation deviation of the pitch mirror is approximately 7 × 10−3°, corresponding to an accuracy better than 0.5 pixels. In summary, during camera scanning imaging, the maximum compensation deviation occurs when imaging at a 45° squint view of the ground point. However, as long as the roll variation angle remains below 1.28° during image motion compensation, the simplified approximation compensation method can be applied to achieve backward-squint image motion compensation.
Figure 9 presents a 3D visualization of the computation deviation for the backward-squint image motion compensation approximation algorithm and the exact algorithm as functions of two variables. The color gradient from yellow to blue indicates an increasing compensation deviation. The 3D visualization clearly shows that the primary influencing factor of compensation deviation is the roll variation angle Δϕ, while the initial pitch angle has a relatively smaller effect, which is consistent with the single-variable simulation validation discussed above.
In practical airborne operations, considering both computational efficiency and compensation accuracy, a combination of the two formulas can be used:
$$
\theta = \arctan\frac{\tan\theta_{start}}{\cos(\Delta\phi)},
\qquad
2\kappa = \Delta\phi\cos\theta_{start}
\tag{22}
$$

5.2. Image Motion Compensation Simulation

Using a two-axis frame camera as the experimental subject, the aircraft operates at an altitude of 3000 m with a speed-to-height ratio of 0.04 and a roll scanning rate of 40°/s. The actual attitude variation curve of a single flight strip during aircraft operation, shown in Figure 10, is used as input to perform LOS stabilization simulation for the proposed image motion compensation algorithm. The exposure start point is set at a 45° backward-squint ground point, where the roll planning angle is 0° and the pitch planning angle is 45°, with an exposure stabilization time of 30 ms for each exposure.
Equation (8) enables the calculation of the camera planning angles under attitude disturbances, providing both the real-time and planned values of the camera roll angle ϕC and pitch angle θC, as shown in Figure 11. It can be observed that due to attitude disturbances, the pitch angle and roll angle of the camera must be adjusted accordingly. The next step is to compute the compensation angles using the image motion compensation algorithm to stabilize the LOS.
After applying the backward-squint image motion compensation algorithm in Equation (22), the comparison between the compensated LOS vector and the target LOS normalized vector is shown in Figure 12. The two vectors exhibit a high degree of overlap, which is consistent with the previous computational analysis.
Figure 13 shows the trajectory of the central LOS ground projection point during whisk-broom imaging exposure. The image indicates that the displacement in the x-direction during exposure is 1.37 × 10−4 m, while the displacement in the y-direction is 2.78 × 10−3 m. Considering the 45° backward-squint imaging condition, the actual imaging altitude is given by 3000/cos 45° = 4242 m. Consequently, the central LOS stabilization deviation of the proposed image motion compensation algorithm is calculated as 0.032 μrad in the x-direction and 0.65 μrad in the y-direction, which is better than 0.01 pixels. This demonstrates the accuracy of the image motion compensation algorithm in stabilizing the central LOS.
Figure 14 illustrates the updates of the aircraft position and the real-time ground projection point over six successive steps within a single compensation cycle during backward-squint imaging when the aircraft roll angle is 0°. Due to the forward motion of the aircraft, the camera’s imaging position shifts incrementally in the negative x-direction, as shown in the magnified schematic on the left. The magnified schematic on the right presents the trajectory variations of the ground projection boundary points.
To analyze the influence experienced by each ground projection boundary point during the scan compensation process, Figure 15a–d presents the ground projection trajectories of the four boundary points in the figure. These correspond to the upper, right, left, and lower boundary points, respectively. During the 30 ms compensation cycle, the displacement along the x and y directions ranges from approximately 5 to 7 m. The motion trajectories of the boundary points indicate that the displacement is primarily driven by rotational scanning. This is due to the fact that the scanning speed at this time is 40°/s (0.7 rad/s), while the speed-to-height ratio is 0.04 rad/s. Under such conditions, rotational motion is identified as the primary cause of boundary point displacement. Even with compensation, a certain degree of image rotation still occurs. This issue is inherent in two-axis cameras during the scanning process and is discussed in detail in Section 5.5.
The time-varying ground projection coordinates of each boundary point during the compensation period were recorded at 5 ms intervals, as listed in Table 1.
Based on these coordinates, the deviations at each time step were calculated, and the root mean square (RMS) deviations are summarized in Table 2. The maximum RMS deviation within the 5 ms intervals was 1.4445 m, and the minimum was 1.0950 m. Averaged over the four boundary points, the deviation was 1.26 m per 5 ms interval.
The ground displacement deviations of each boundary point were converted into angular deviations and then projected onto the image plane based on the focal length. The residual image motion deviations after compensation are summarized in Table 3. At a scanning speed of 40°/s and a speed-to-height ratio of 0.04 s−1, during backward-squint imaging at a 45° viewing angle, the maximum image motion deviation speed on the sensor plane was 3.06 μm/ms. If the permitted image motion was one-half of a pixel, the maximum allowable integration time was 1.96 ms.
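The conversion pipeline behind Table 3 (ground displacement → angular deviation → focal-plane motion) reduces to a short helper. In the sketch below, the slant range to each boundary point and the camera focal length are inputs supplied by the caller; neither value is stated explicitly for this configuration in the text, so both should be treated as placeholders.

```python
def focal_plane_rate(ground_dev_m, slant_range_m, focal_length_mm, interval_ms=5.0):
    """Convert a ground displacement accumulated over `interval_ms`
    into a focal-plane image motion rate in micrometers per millisecond."""
    angular_rad = ground_dev_m / slant_range_m        # small-angle approximation
    motion_um = angular_rad * focal_length_mm * 1e3   # displacement on the sensor
    return motion_um / interval_ms
```

For example, for point (b) in Table 3, dividing the 1.4445 m ground deviation by the corresponding slant range reproduces the listed 0.3125 mrad per 5 ms, and scaling by the focal length yields the 3.06 μm/ms focal-plane rate.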
Considering a total scanning field of view of 90°, the scanning interval was set from −40° to 40°. Figure 16 shows the ground-projected FOV corresponding to scanning angles from 0° to 40°. It can be observed that the central viewing line remains aligned along the same horizontal direction, demonstrating stable pointing performance.
To further analyze the compensation performance at different scanning angles under the given pitch tilt condition, the same method was applied to evaluate the compensation effect at each angle. The image-plane motion deviation speeds of the four ground boundary points were plotted, as shown in Figure 17a. It can be observed that the image motion deviations of the upper boundary point a and the lower boundary point d increased with the scanning angle, while those of the left point c and right point b decreased. The residual image motion deviation after compensation ranged from 1.9 μm/ms to 3.1 μm/ms. Figure 17b illustrates the relationship between the maximum allowable exposure time and the scanning angle. Influenced by the residual image motion of boundary points a and d, the maximum exposure time at a scanning angle of 45°, with a scanning speed of 40°/s and a speed-to-height ratio of 0.04, was determined to be 1.7 ms. Overall, the variation in scanning angle introduced a residual deviation fluctuation of approximately 0.6 μm/ms.
In the previous analysis, the rotational component of the scanning motion was identified as the primary contributor to the compensation deviation. Figure 18 further illustrates the impact of different scanning speeds on the performance of the proposed compensation method. As shown in the figure, the residual image motion on the focal plane gradually decreases with reduced scanning speed, allowing for longer exposure durations. Figure 18e shows that when the scanning speed is 20°/s, the maximum residual image motion on the focal plane is 1.9 μm/ms, allowing for a maximum exposure time of 3.2 ms. At a reduced scanning speed of 10°/s, as illustrated in Figure 18g, the residual further decreases to about 1 μm/ms, extending the maximum exposure time to approximately 6 ms. These results demonstrate that lower scanning speeds significantly improve the compensation effect by reducing residual image motion and extending allowable exposure time.
Through the above analysis, it can be concluded that the aircraft platform velocity also has a certain impact on the proposed compensation method. Under fixed tilt angle and scanning speed conditions, simulations were conducted across a typical range of speed-to-height ratios (0–0.1) commonly used in airborne imaging. The variations in residual image motion compensation deviations on the focal plane with respect to the speed-to-height ratio are shown in Figure 19. As the ratio increases, the residual deviations of the right boundary point b and the left boundary point c gradually increase, while those of the upper point a and the lower point d gradually decrease. The overall variation remains within 0.3 μm/ms, indicating that the influence of platform speed on compensation performance is relatively small when the scanning speed is high. These results demonstrate the robustness of the proposed method in compensating for forward motion-induced image shifts.
To further investigate the impact of the pitch tilt angle on the effectiveness of image motion compensation, simulations were conducted within the typical range of 20° to 45°, under identical parameter conditions and a fixed scanning speed of 40°/s. Figure 20 illustrates the variation of focal plane residual deviation and maximum exposure time with respect to the pitch tilt angle under different scanning angles.
As shown in the figure, the residual deviation on the focal plane exhibits noticeable changes with varying pitch tilt angles. When the pitch tilt angle is 20°, the residual deviation across different scanning angles is approximately 1.2 μm/ms, allowing for a maximum exposure time of about 4.5 ms. As the pitch tilt angle increases to 45°, the residual deviation rises to around 3 μm/ms, and the corresponding maximum exposure time is reduced to approximately 2 ms. Through comparative analysis, it is evident that the variation in scanning angle has a significantly smaller impact on compensation performance than the variation in pitch tilt angle.
In conclusion, these simulation results validate the effectiveness of the proposed compensation algorithm and comprehensively analyze the effects of scanning angle, scanning speed, speed-to-height ratio, and pitch tilt angle on compensation performance. Among these factors, the pitch tilt angle and scanning speed are identified as the key limiting factors, as they have the most significant impact on residual deviation and allowable exposure time. By selecting an appropriate pitch tilt angle and scanning speed, the compensation performance can be notably improved. The algorithm enables image motion compensation under wide-field backward-squint imaging conditions with a flight altitude of 3000 m, a roll scanning rate of 40°/s, and a scanning field of view of 90°.

5.3. Comparison of Compensation Methods

A set of typical imaging parameters was used to assess the performance of the proposed backward-squint image motion compensation method, following the configuration used in a previous study [23]: focal length f = 100 mm, flight altitude H = 2000 m, platform velocity v = 150 m/s, and a field of view of 33.4° × 33.4°. In that study, the roll scanning angle was 30° and the pitch tilt angle was 20°, and image motion compensation was implemented using a three-axis stabilized platform. The reported maximum residual image-plane velocity after compensation was 2.6 mm/s.
Under the same conditions, the aircraft’s position and its ground projection are illustrated in Figure 21.
Figure 22 shows the position variation of the four boundary points of the ground projection over time after compensation using the proposed method.
The corresponding coordinates of the four ground boundary projection points over time in Figure 22 are listed in Table 4.
The root mean square (RMS) deviation of the four boundary points in Table 4 is calculated at 5 ms intervals, as shown in Table 5. The maximum RMS deviation among the four ground projection boundary points is 0.6047 m, and the minimum is 0.0923 m. By projecting the displacement at each time step onto the line-of-sight vector, the image motion angle deviation corresponding to every 5 ms ranges from 0.0371 mrad to 0.1305 mrad. After conversion to the imaging plane, the image motion on the focal plane every 5 ms is between 3.4 μm and 12.0 μm, with a maximum image motion compensation deviation velocity of 2.4 mm/s.
As summarized in Table 6, the proposed image motion compensation method achieves a compensation accuracy of 2.4 mm/s, which represents an improvement of approximately 7.7% compared to conventional methods under the same conditions. In addition, it employs a two-axis frame with internal mirror-based compensation, making the design structurally more lightweight and offering better portability.

5.4. Computational Performance Analysis

To evaluate the computational performance of the proposed method, all numerical operations were implemented and tested on a DSP using single-precision floating-point arithmetic. The main computation for determining the LOS pointing angles consists of scanning angle calculation, attitude transformation, and the final solution of the camera angles ϕC and θC. Based on the average of multiple test runs, the total execution time per update cycle is approximately 271 microseconds, consisting of 38.4 μs, 102.0 μs, and 130.0 μs for the three steps, respectively.
In addition, the image motion compensation is implemented through a position-based model that calculates angular displacement as a function of roll angle variation. The computation time for this step is approximately 66.8 microseconds per cycle, as measured on the same DSP platform. The execution time distribution for each computational step is summarized in Figure 23. This lightweight numerical structure supports real-time execution alongside other control tasks.
Considering a typical control frequency of 1 kHz, which corresponds to a 1 ms processing window, the total computational load of the proposed method (approximately 338 μs) occupies about one-third of the available cycle time. This indicates that the method is computationally efficient and suitable for real-time embedded deployment in airborne imaging systems.
The numerical precision of the computed camera angles ϕc and θc was also examined through a simulation test with a scanning angular velocity of 0.3 rad/s over a duration of 0–2 s. As shown in Figure 24a, the computed values (red discrete points) match the high-precision reference curve (blue line) at every 0.2 s interval. Figure 24b presents the relative error between each computed value and the reference. The root-mean-square error is 4.25 × 10−8 rad, equivalent to 0.04 µrad, which satisfies the accuracy requirements for most airborne stabilization and imaging tasks.
In this test, all variables and operations were implemented using single-precision floating-point arithmetic. Given that the computational process involves trigonometric evaluations, further improvements in execution speed may be achieved in practical engineering applications through fixed-point conversion or lookup-table optimization; based on typical optimization strategies in embedded implementations, the execution time can potentially be reduced by a factor of two to three. Such optimizations are expected to further improve the real-time capability and computational efficiency of the proposed method.

5.5. Limitations

For a two-axis camera, unlike the horizontal displacement of the imaging field of view during downward-viewing scanning, the imaging field of view under backward-squint conditions also rotates around the LOS due to camera roll motion, as shown in Figure 25. Since the proposed image motion compensation method is based on LOS stabilization, it cannot compensate for image rotation. Therefore, the maximum applicable rotational scanning speed is determined through calculation.
Consider two consecutive image frames with optical axes OC and OD, where the camera pitch angle is θ, the roll variation angle is Δϕ, and the imaging altitude is H. The corresponding image rotation angle α is represented by ∠AFB in the figure. Based on geometric relationships, it can be derived as follows:
$$
\begin{cases}
AF = H\tan\theta + H\cot\theta = \dfrac{2H}{\sin 2\theta} \\[8pt]
AS = \dfrac{1}{2}AB = OA\sin\dfrac{\Delta\phi}{2} = \dfrac{H}{\cos\theta}\sin\dfrac{\Delta\phi}{2} \\[8pt]
\sin\dfrac{\alpha}{2} = \dfrac{AS}{AF} = \sin\dfrac{\Delta\phi}{2}\sin\theta
\end{cases}
\tag{23}
$$
Under the small-angle approximation, the image rotation angle α is given by Equation (24), which is the product of the roll variation angle and the sine of the camera pitch angle.
$$
\alpha = \Delta\phi\sin\theta
\tag{24}
$$
For an area-array detector with m × n pixels, given a camera whisk-broom scanning angular velocity ω and an exposure time Δt, the image rotation during exposure is α = ωΔt·sinθ, and the resulting image motion at the detector corners should remain within 0.5 pixels to ensure image stability. This constraint can be expressed as follows:
$$
\frac{1}{2}\alpha\sqrt{m^2+n^2} \le 0.5
\;\;\Rightarrow\;\;
\Delta t \le \frac{1}{\omega\sin\theta\sqrt{m^2+n^2}}
\tag{25}
$$
According to Equation (25), both the scanning speed and the pitch tilt angle jointly determine the allowable exposure time. When the scanning speed is fixed, a smaller pitch tilt angle results in a longer maximum exposure time; conversely, at a fixed pitch tilt angle, reducing the scanning speed extends the allowable exposure time. For an infrared camera equipped with a 640 × 512 area-array detector, under a scanning speed of 40°/s (i.e., 0.7 rad/s) and a pitch tilt angle of 45°, the maximum exposure time constrained by image rotation is calculated to be 2.47 ms. When the tilt angle is reduced to 20°, the maximum allowable exposure time increases to 5.11 ms. These values closely match the simulated results presented in Section 5.2 regarding the influence of pitch tilt angle variation. Due to forward flight effects and attitude disturbances, the actual compensated exposure time is slightly reduced, although it still approaches the theoretical limit, which further validates the accuracy of the proposed compensation method.
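Equation (25) is easy to evaluate for the detector considered here. A minimal sketch reproducing the exposure limits quoted above:

```python
import numpy as np

def max_exposure_ms(omega_deg_s, theta_deg, m=640, n=512):
    """Equation (25): exposure limit (ms) set by uncompensated image rotation."""
    omega = np.radians(omega_deg_s)    # scanning angular velocity (rad/s)
    theta = np.radians(theta_deg)      # pitch tilt angle
    return 1e3 / (omega * np.sin(theta) * np.hypot(m, n))

print(max_exposure_ms(40.0, 45.0))  # -> about 2.47 ms
print(max_exposure_ms(40.0, 20.0))  # -> about 5.11 ms
```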

6. Conclusions

This study proposes a novel backward-squint wide-field whisk-broom imaging scheme to enable an area-array camera to acquire multi-angle 3D information of the target scene. By conducting reciprocal flights along adjacent flight strips, the aircraft captures forward-looking, downward-looking, and backward-looking images, providing comprehensive 3D information. To overcome the challenges of traditional image motion compensation methods under backward-squint whisk-broom conditions, a novel LOS stabilization and non-coaxial image motion compensation method is introduced. An exact compensation algorithm and a simplified approximate compensation algorithm are derived, with an analysis of the compensation deviation in the approximate method. By comparing computational efficiency and compensation accuracy, an engineering-oriented compensation strategy integrating both algorithms is proposed. Simulation experiments incorporating attitude disturbances and realistic aircraft motion confirm that the proposed method achieves an LOS stabilization accuracy better than 0.65 μrad per compensation cycle. This result demonstrates its effectiveness in addressing image motion compensation challenges under squint scanning conditions across various two-axis frame imaging systems. The method supports stable compensation for a wide range of pitch tilt angles up to 45° and is applicable to different scanning speeds. For instance, when applied to a 640 × 512 infrared detector, it enables stable exposures of approximately 1.8 ms at 40°/s, and 3.3 ms at 20°/s, under a 45° backward-squint configuration.
Despite these advancements, the proposed method has certain limitations. The image rotation issue in backward-squint whisk-broom imaging remains a challenge, as the current compensation method primarily focuses on LOS stabilization and non-coaxial image motion compensation. Future work will focus on developing effective image rotation compensation strategies and achieving real-time implementation on airborne platforms. The proposed novel whisk-broom imaging and compensation method contributes significantly to advancing airborne area-array whisk-broom imaging technology, particularly in wide-field squint imaging and 3D modeling applications.

Author Contributions

Conceptualization, J.J. and G.H.; methodology, J.J.; software, J.J.; validation, J.J.; writing—original draft preparation, J.J.; writing—review and editing, J.J. and Y.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China, grant number 2023YFC3107602.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author. In accordance with the laws and regulations of the People’s Republic of China governing scientific research, data sharing is subject to approval. Interested parties may contact the corresponding author for access, and data will be provided upon approval.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Pepe, M.; Fregonese, L.; Scaioni, M. Planning Airborne Photogrammetry and Remote-Sensing Missions with Modern Platforms and Sensors. Eur. J. Remote Sens. 2018, 51, 412–436.
2. Casagli, N.; Intrieri, E.; Tofani, V.; Gigli, G.; Raspini, F. Landslide Detection, Monitoring and Prediction with Remote-Sensing Techniques. Nat. Rev. Earth Environ. 2023, 4, 51–64.
3. Han, W.; Zhang, X.; Wang, Y.; Wang, L.; Huang, X.; Li, J.; Wang, S.; Chen, W.; Li, X.; Feng, R.; et al. A Survey of Machine Learning and Deep Learning in Remote Sensing of Geological Environment: Challenges, Advances, and Opportunities. ISPRS J. Photogramm. Remote Sens. 2023, 202, 87–113.
4. Amani, M.; Moghimi, A.; Mirmazloumi, S.M.; Ranjgar, B.; Ghorbanian, A.; Ojaghi, S.; Ebrahimy, H.; Naboureh, A.; Nazari, M.E.; Mahdavi, S.; et al. Ocean Remote Sensing Techniques and Applications: A Review (Part I). Water 2022, 14, 3400.
5. Terentev, A.; Dolzhenko, V.; Fedotov, A.; Eremenko, D. Current State of Hyperspectral Remote Sensing for Early Plant Disease Detection: A Review. Sensors 2022, 22, 757.
6. Arab, S.T.; Islam, M.; Shamsuzzoha; Alam, K.F.; Muhsin, N.; Noguchi, R.; Ahamed, T. A Review of Remote Sensing Applications in Agriculture and Forestry to Establish Big Data Analytics. In Remote Sensing Application: Regional Perspectives in Agriculture and Forestry; Ahamed, T., Ed.; Springer Nature: Singapore, 2022; pp. 1–24. ISBN 978-981-19-0213-0.
7. Shahtahmassebi, A.R.; Li, C.; Fan, Y.; Wu, Y.; Lin, Y.; Gan, M.; Wang, K.; Malik, A.; Blackburn, G.A. Remote Sensing of Urban Green Spaces: A Review. Urban For. Urban Green. 2021, 57, 126946.
8. Ma, X.; Man, Q.; Yang, X.; Dong, P.; Yang, Z.; Wu, J.; Liu, C. Urban Feature Extraction within a Complex Urban Area with an Improved 3D-CNN Using Airborne Hyperspectral Data. Remote Sens. 2023, 15, 992.
9. Gao, L.; Shi, W.; Zhu, J.; Shao, P.; Sun, S.; Li, Y.; Wang, F.; Gao, F. Novel Framework for 3D Road Extraction Based on Airborne LiDAR and High-Resolution Remote Sensing Imagery. Remote Sens. 2021, 13, 4766.
10. Wen, D.; Huang, X.; Bovolo, F.; Li, J.; Ke, X.; Zhang, A.; Benediktsson, J.A. Change Detection From Very-High-Spatial-Resolution Optical Remote Sensing Images: Methods, Applications, and Future Directions. IEEE Geosci. Remote Sens. Mag. 2021, 9, 68–101.
11. Jurado, J.M.; López, A.; Pádua, L.; Sousa, J.J. Remote Sensing Image Fusion on 3D Scenarios: A Review of Applications for Agriculture and Forestry. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102856.
12. Zhou, Q.; Duan, Y.; Liu, X.; Zhao, X.; Dong, J.; Zhang, H. AFC-900 Large-Format Aerial Frame Camera: Design Principles and Photogrammetric Processing. In Proceedings of the Twelfth International Conference on Information Optics and Photonics, Xi’an, China, 23–26 July 2021; Volume 12057, pp. 208–215.
13. Gruber, M.; Schachinger, B.; Mostafa, M. The Next Generation Vexcel Imaging 3D City Modelling Using Directly Georeferenced Data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 43, 309–316.
14. Dong, L.; Liu, F.; Han, M.; You, H. Mosaicing Technology for Airborne Wide Field-of-View Infrared Image. Appl. Sci. 2023, 13, 8977.
15. Iyengar, M.; Lange, D. The Goodrich 3rd Generation DB-110 System: Operational on Tactical and Unmanned Aircraft. In Proceedings of the Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications III, Kissimmee, FL, USA, 19–20 April 2006; Volume 6209, pp. 75–88.
16. Lareau, A.G. Flight Demonstration of the CA-261 Step Frame Camera. In Proceedings of the Airborne Reconnaissance XXI, San Diego, CA, USA, 29–30 July 1997; Volume 3128, pp. 17–28.
17. Wang, C.; Hang, G.; Jin, X.; Liu, J.; Zhou, C.; Wang, Y. A New Method for LOS Path Planning and Overlap Rate Setting of Airborne Area-Array Whisk-Broom Imaging. J. Infrared Millim. Waves 2023, 42, 383–390.
18. Hilkert, J. A Comparison of Inertial Line-of-Sight Stabilization Techniques Using Mirrors. In Proceedings of the SPIE Defense and Security Symposium 2004, Orlando, FL, USA, 12–16 April 2004; Volume 5430, pp. 13–22.
19. Hayakawa, T.; Watanabe, T.; Ishikawa, M. Real-Time High-Speed Motion Blur Compensation System Based on Back-and-Forth Motion Control of Galvanometer Mirror. Opt. Express 2015, 23, 31648.
20. Xiu, J.; Huang, P.; Li, J.; Zhang, H.; Li, Y. Line of Sight and Image Motion Compensation for Step and Stare Imaging System. Appl. Sci. 2020, 10, 7119.
21. Hurak, Z.; Rezac, M. Image-Based Pointing and Tracking for Inertially Stabilized Airborne Camera Platform. IEEE Trans. Control Syst. Technol. 2012, 20, 1146–1159.
22. Held, K.J.; Robinson, B.H. TIER II Plus Airborne EO Sensor LOS Control and Image Geolocation. In Proceedings of the 1997 IEEE Aerospace Conference, Snowmass, CO, USA, 13 February 1997; Volume 2, pp. 377–405.
23. Yang, Y.; Yu, C.; Wang, Y.; Hua, N.; Kuang, H. Imaging Attitude Control and Image Motion Compensation Residual Analysis Based on a Three-Axis Inertially Stabilized Platform. Appl. Sci. 2021, 11, 5856.
24. Wang, Y.; Han, G.; Qi, H.; Ma, Y.; Jiang, B.; Liu, M.; Yao, B.; Shu, R. Investigation of Image Motion Compensation Technique Based on Real-Time LOS Tracking. J. Infrared Millim. Waves 2015, 34, 757–762.
Figure 1. Ground coordinate system and navigation coordinate system.
Figure 2. Aircraft body coordinate system and camera coordinate system.
Figure 3. Scanning trajectory deviation in the wingspan direction: (a) vertical downward view whisk-broom; (b) backward squint whisk-broom.
Figure 4. Schematic diagram of vertical flight path LOS correction.
Figure 5. Imaging system under backward-squint conditions.
Figure 6. LOS vector relationship for image motion compensation.
Figure 7. Comparison of 45° camera pitch angle: (a) compensation angle; (b) pitch angle; (c) compensation angle deviation; (d) pitch angle deviation.
Figure 8. Comparison of 1.28° camera roll angle: (a) compensation angle; (b) pitch angle; (c) compensation angle deviation; (d) pitch angle deviation.
Figure 9. 3D variation of deviation for approximate and exact algorithms with roll angle and initial pitch angle: (a) compensation angle deviation; (b) pitch angle deviation.
Figure 10. Aircraft attitude angle: (a) yaw; (b) pitch; (c) roll.
Figure 11. Real-time and planned angle values after attitude disturbance: (a) pitch; (b) roll.
Figure 12. Comparison of the compensated and target LOS vector.
Figure 13. Trajectory curve of the ground projection point of the central LOS.
Figure 14. Aircraft position and ground projection point trajectory for backward-squint imaging with a roll angle of 0°.
Figure 15. Ground projection positions of boundary points: (a) upper; (b) right; (c) left; (d) lower.
Figure 16. Ground-projected field of view at roll angles from 0° to 40°.
Figure 17. Focal plane compensation residuals and maximum exposure time under varying scanning angles: (a) residual image motion velocity; (b) maximum exposure duration.
Figure 18. Relationship between focal plane residual deviation and maximum exposure time under different scanning speeds: (a) ω = 40°/s; (b) ω = 35°/s; (c) ω = 30°/s; (d) ω = 25°/s; (e) ω = 20°/s; (f) ω = 15°/s; (g) ω = 10°/s; (h) ω = 5°/s.
Figure 19. Variations in focal plane residual compensation deviation under different scanning speeds and speed-to-height ratios: (a) ω = 40°/s; (b) ω = 30°/s; (c) ω = 20°/s; (d) ω = 10°/s.
Figure 20. Focal plane residual deviation and maximum exposure time versus pitch tilt angle under different scanning angles: (a) ϕ = 0°; (b) ϕ = 10°; (c) ϕ = 20°; (d) ϕ = 30°; (e) ϕ = 40°.
Figure 21. Aircraft position and ground projection for backward-squint imaging with a roll angle of 30° and a pitch angle of 20°.
Figure 22. Position variation of ground boundary projection points over time: (a) upper; (b) right; (c) left; (d) lower.
Figure 23. Execution time distribution.
Figure 24. Computation accuracy verification: (a) computed results vs. reference values; (b) computed errors.
Figure 25. Schematic diagram of image rotation in backward-squint imaging.
Table 1. Ground projection trajectories of boundary points over the compensation period ((x, y) coordinates in meters).

| Time (s) | Point (a) | Point (b) | Point (c) | Point (d) |
|---|---|---|---|---|
| 0 | (2571.311, 240.951) | (3500.160, 281.122) | (2571.311, −240.951) | (3500.160, −281.122) |
| 0.005 | (2572.030, 241.769) | (3501.352, 280.310) | (2570.436, −240.173) | (3499.182, −281.998) |
| 0.010 | (2572.752, 242.586) | (3502.542, 279.495) | (2569.562, −239.393) | (3498.199, −282.872) |
| 0.015 | (2573.481, 243.406) | (3503.734, 278.673) | (2568.688, −238.609) | (3497.209, −283.748) |
| 0.020 | (2574.222, 244.234) | (3504.936, 277.839) | (2567.807, −237.814) | (3496.202, −284.632) |
| 0.025 | (2574.955, 245.049) | (3506.120, 277.015) | (2566.940, −237.029) | (3495.207, −285.502) |
| 0.030 | (2575.683, 245.855) | (3507.291, 276.196) | (2566.083, −236.252) | (3494.219, −286.362) |
Table 2. RMS deviations of ground projection boundary points (m per 5 ms interval).

| Time (s) | Point (a) | Point (b) | Point (c) | Point (d) |
|---|---|---|---|---|
| 0.005 | 1.0891 | 1.4426 | 1.1712 | 1.3131 |
| 0.010 | 1.0906 | 1.4422 | 1.1705 | 1.3148 |
| 0.015 | 1.0970 | 1.4478 | 1.1747 | 1.3225 |
| 0.020 | 1.1112 | 1.4630 | 1.1866 | 1.3399 |
| 0.025 | 1.0958 | 1.4425 | 1.1692 | 1.3214 |
| 0.030 | 1.0859 | 1.4288 | 1.1573 | 1.3097 |
| RMS (m) | 1.0950 | 1.4445 | 1.1716 | 1.3203 |
Table 3. Compensation deviations.

| RMS | Point (a) | Point (b) | Point (c) | Point (d) |
|---|---|---|---|---|
| Distance deviation (m/5 ms) | 1.0950 | 1.4445 | 1.1716 | 1.3203 |
| Angular deviation (mrad/5 ms) | 0.2764 | 0.3125 | 0.2960 | 0.2859 |
| Focal plane deviation (μm/ms) | 2.7062 | 3.0594 | 2.8980 | 2.7991 |
Table 4. Time-varying coordinates of ground projection boundary points during compensation ((x, y) coordinates in meters).

| Time (s) | Point (a) | Point (b) | Point (c) | Point (d) |
|---|---|---|---|---|
| 0 | (304.304, 1950.614) | (1721.270, 2544.267) | (66.847, 450.027) | (1347.972, 542.396) |
| 0.005 | (304.614, 1950.379) | (1721.722, 2544.668) | (66.684, 450.012) | (1347.956, 542.487) |
| 0.010 | (304.924, 1950.144) | (1722.174, 2545.068) | (66.521, 449.997) | (1347.940, 542.578) |
| 0.015 | (305.234, 1949.909) | (1722.627, 2545.469) | (66.357, 449.983) | (1347.924, 542.669) |
| 0.020 | (305.544, 1949.674) | (1723.080, 2545.869) | (66.194, 449.968) | (1347.908, 542.760) |
| 0.025 | (305.854, 1949.439) | (1723.534, 2546.270) | (66.031, 449.953) | (1347.892, 542.851) |
| 0.030 | (306.163, 1949.204) | (1723.988, 2546.671) | (65.867, 449.939) | (1347.875, 542.942) |
Table 5. RMS deviations of ground projection boundary point displacements every 5 ms.

| | Point (a) | Point (b) | Point (c) | Point (d) |
|---|---|---|---|---|
| RMS distance deviation (m) | 0.3899 | 0.6047 | 0.1639 | 0.0923 |
| RMS angular deviation (mrad) | 0.1305 | 0.0963 | 0.0798 | 0.0371 |
| Image motion on focal plane (μm) | 12.0 | 8.9 | 7.4 | 3.4 |
Table 6. Comparison of image motion compensation methods.

| Method | Proposed Method | Traditional Method |
|---|---|---|
| System structure | Two-axis frame | Three-axis frame |
| Compensation type | Mirror-based compensation | Three-axis frame compensation |
| Compensation accuracy | 2.4 mm/s | 2.6 mm/s |