Technical Note

Axis Estimation of Spaceborne Targets via Inverse Synthetic Aperture Radar Image Sequence Based on Regression Network

College of Electronic Science and Technology, National University of Defense Technology, Changsha 410000, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(22), 4148; https://doi.org/10.3390/rs16224148
Submission received: 26 September 2024 / Revised: 3 November 2024 / Accepted: 4 November 2024 / Published: 7 November 2024
(This article belongs to the Special Issue Recent Advances in Nonlinear Processing Technique for Radar Sensing)

Abstract

Axial estimation is an important task in monitoring non-cooperative space targets in orbit, with inverse synthetic aperture radar (ISAR) imaging serving as a fundamental means of supporting this process. However, most existing axial estimation methods rely on manually extracting and matching features of key corner points or linear structures in the images, which can degrade estimation accuracy. To address these issues, this paper proposes an axial estimation method for spaceborne targets via ISAR image sequences based on a regression network. Firstly, taking the ALOS satellite as an example, its Computer-Aided Design (CAD) model is constructed through a prior analysis of its structural features. Target echoes are then generated using electromagnetic simulation software, followed by imaging processing, analysis of the imaging characteristics, and the determination of axial labels. Finally, in contrast to traditional classification approaches, this study introduces a straightforward yet effective regression network designed specifically for ISAR image sequences. This network replaces the classification loss with a loss function based on the minimum mean square error, enabling adaptive feature extraction and estimation of the axial parameters. The effectiveness of the proposed method is validated through both electromagnetic simulations and experimental data.

1. Introduction

For most rigid spacecraft, the in-orbit attitude can be expressed by axial directions, such as those of the main axis and the solar panel axis, which are important features for describing the motion of space targets in orbit. Axial estimation has a wide range of applications, including fault rescue, intent inversion, spacecraft docking, and re-entry prediction [1]. Specifically, the attitude of a space target can be characterized by the yaw, pitch, and roll angles in an orbital coordinate system [2]. Currently, adaptive optics telescopes and inverse synthetic aperture radars (ISARs) are the primary surveillance technologies employed for axial estimation. In practice, however, the quality of images obtained from adaptive optics can be compromised by various factors, including sunlight, the Earth's shadow, atmospheric turbulence, and adverse weather conditions. In contrast, ISARs possess all-weather imaging capability and are not influenced by lighting conditions, making them an effective means of estimating the axial direction of spacecraft. Nevertheless, unlike optical images, ISAR images lack texture information and suffer from angular scintillation [3]. Accordingly, estimating the axial direction of spacecraft from a sequence of ISAR images remains a challenging task due to various environmental factors and technological limitations.
Most existing methods for axial estimation based on ISAR image sequences rely on manually extracting and matching features of key corner points or linear structures in the images. Ferrara et al. [4,5,6] estimate the axial direction of the target by reconstructing the 3D structure from a sequence of multi-view ISAR images. This approach requires the projection matrices corresponding to scattering points at various attitudes, and the parallax angles must be acquired over long observation intervals, which complicates practical application. In addition, it struggles to reconstruct three-dimensional structures when the signal-to-noise ratio (SNR) is low and therefore requires a high SNR to achieve reliable results. Morita et al. [7,8] employ the factorization method for axial estimation, constructing a trajectory matrix from dominant scatterers extracted from the ISAR image sequence; the axial parameters of the space target are then obtained by decomposing this matrix. The disadvantage of this method is its reliance on a strong scattering center, i.e., a point or region on the spacecraft that reflects a significant amount of radar energy and thus forms a dominant feature in the ISAR image. Such reliance makes the method vulnerable to errors when the strong scattering center is not clearly visible or is corrupted by noise, and it limits algorithmic robustness, since the estimation accuracy heavily depends on the presence and clarity of that scattering center. Du et al. [9] propose a joint ISAR and optical-camera estimation approach that requires accurate feature extraction. However, some features are not easily observed due to self-occlusion, and the accuracy of feature extraction is also affected by the quality of the ISAR data. Therefore, rapidly and accurately estimating the axial orientation of a space target is particularly significant.
Compared with traditional feature extraction methods, both target recognition and detection based on neural networks have shown advantages in precision and efficiency [10]. However, few axial estimation methods based on deep learning exist. To address these issues, this paper proposes a satellite axial estimation method based on an ISAR image sequence and a regression network, transforming traditional manual feature extraction into neural-network-based regression optimization. The proposed method can be applied directly to axial regression from ISAR image sequences, without reconstructing a 3D model. This end-to-end approach effectively exploits the feature extraction capability of neural networks to improve upon the accuracy of traditional methods based on manual feature extraction [11]. Furthermore, the estimation speed achieves real-time performance, which effectively extends the existing axial estimation methods.
Firstly, taking the ALOS satellite as an example, echoes from space targets at varying attitudes are obtained by varying the pitch and yaw angles. The axial orientation corresponding to each ISAR image is determined based on the imaging coordinate system. A network structure for axial estimation applicable to ISAR image sequences is then constructed, and the corresponding hyperparameters are set for network training. After training, the network can adaptively estimate the axial direction of spacecraft. Finally, experimental results validate the feasibility of achieving axial estimation through neural networks, providing valuable technical support for refined situational awareness of moving targets [12].

2. An Axial Estimation Method Based on Neural Networks

The specific process can be divided into three main parts: the first is the definition of the spacecraft attitude in orbit; the second is the construction of a characteristic dataset and the establishment of axial labels; the third is the construction, training, and evaluation of the regression network. The overall framework of axial estimation based on the regression network is shown in Figure 1.

2.1. Definition of the Spacecraft Attitude in Orbit

In this part, the orbit coordinate system and the body coordinate system are introduced, and the in-orbit spacecraft attitude is then defined in terms of pitch and yaw angles. As shown in Figure 2, the two coordinate systems are defined as follows:
(1) Orbit coordinate system $O_i X_p Y_p Z_p$

As shown in Figure 2, the coordinate origin is $O_i$, with the direction from the spacecraft toward the center of the Earth defining the $Y_p$ axis. The $X_p$ axis points in the negative normal direction of the orbit plane. The direction of the $Z_p$ axis is determined using the right-hand rule and corresponds to the motion direction of the spacecraft.
Figure 2. Definition of the orbit and body coordinate system.
(2) Body coordinate system $O_t X_s Y_s Z_s$

As shown in Figure 2, the coordinate origin is $O_t$. The $Z_s$ axis points along the main axis of the spacecraft, the $Y_s$ axis points along the solar panel, and the $X_s$ axis is determined using the right-hand rule.
The attitude of the spacecraft in orbit can be represented by the body coordinate system and the orbit coordinate system, specifically through the pitch and yaw angles, as shown in Figure 3. $O_t Z_s$ is the direction of the main axis of the space target, and $O_t Y_s$ is the direction of the solar panels. $\angle Z_s O_t Z$ and $\angle X_p O_t Z$ indicate the pitch and yaw angles, respectively.
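For concreteness, the mapping from a (yaw, pitch) pair to a unit axis-direction vector can be sketched as follows. The convention used here, axis = (cos(yaw)·cos(pitch), sin(yaw)·cos(pitch), sin(pitch)), is an assumption; it reproduces the axis-direction triples reported later in Table 1, but the paper does not state the mapping explicitly, and the helper name is ours.

```python
import math

def axis_from_yaw_pitch(yaw_deg: float, pitch_deg: float):
    """Convert (yaw, pitch) in degrees to a unit axis-direction vector.

    Assumed convention: (cos(yaw)cos(pitch), sin(yaw)cos(pitch), sin(pitch)),
    which matches the axis-direction values listed in Table 1.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(yaw) * math.cos(pitch),
            math.sin(yaw) * math.cos(pitch),
            math.sin(pitch))

# Example: sample 1 of Table 1, (yaw, pitch) = (95 deg, 10 deg)
print(axis_from_yaw_pitch(95, 10))   # approx (-0.09, 0.98, 0.17)
```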

2.2. Imaging Characteristics and Dataset Construction

Taking the ALOS satellite as an example, its in-orbit schematic is illustrated in Figure 4a. The CAD model of the satellite is established in Figure 4b, with the fundamental structure including the main body, the surveying instrument, the solar panels, the optical lenses, and the phased-array antenna. After obtaining the CAD model, system parameters for the radar echo must be determined through simulation and testing. Subsequently, for the ISAR imaging of space targets, the range resolution is determined by the bandwidth of the radar, as follows:
$\rho_r = \frac{c}{2B}$
where $c$ is the speed of light and $B$ is the signal bandwidth. Meanwhile, the azimuth resolution is determined by the relative rotation of the radar with respect to the target:
$\rho_a = \frac{\lambda}{2\theta}$
where $\lambda$ is the wavelength and $\theta$ is the relative rotation angle. To avoid geometric distortion in the imaging result, the range resolution is usually made equal to the azimuth resolution. Therefore, the required rotation angle (synthetic aperture) should satisfy
$\theta = \frac{B}{f_c}$
where $f_c$ is the carrier frequency.
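The three relations above can be checked numerically. The short sketch below (the function names are ours) uses the Ka-band parameters quoted in Section 3.1 as an example.

```python
# Helper functions reproducing the resolution relations above.
C = 3e8  # speed of light (m/s)

def range_resolution(bandwidth_hz: float) -> float:
    """rho_r = c / (2B)."""
    return C / (2.0 * bandwidth_hz)

def azimuth_resolution(wavelength_m: float, rotation_rad: float) -> float:
    """rho_a = lambda / (2 * theta)."""
    return wavelength_m / (2.0 * rotation_rad)

def required_rotation(bandwidth_hz: float, carrier_hz: float) -> float:
    """theta = B / f_c, the rotation that makes rho_a equal to rho_r."""
    return bandwidth_hz / carrier_hz

# Example with the Ka-band parameters of Section 3.1 (f_c = 35 GHz, B = 3 GHz):
print(range_resolution(3e9))         # 0.05 m, i.e., 5 cm range resolution
print(required_rotation(3e9, 35e9))  # required rotation angle in radians
```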
A representative imaging result is given in Figure 4c, with the specific parameters described in Section 3.1. The imaging result closely matches the real structure of the target, which validates the effectiveness of the electromagnetic simulation and imaging processing.
To intuitively analyze the imaging characteristics of the ALOS satellite, Figure 5 illustrates the imaging results obtained at various attitude angles: yaw angles ranging from 30° to 150° in 30° intervals, and pitch angles ranging from 15° to 75° in 15° intervals. Taking a pitch angle of 45° as an example, the solar panels of the ALOS satellite appear approximately rectangular, although some image distortion can be observed as the yaw angle changes; this reflects the projection relationship between the target and the radar line of sight at this attitude. It is also evident that, as the yaw angle varies, the ALOS satellite effectively rotates around the X-axis of the target coordinate system. When the yaw angle is fixed at 120°, varying the pitch angle shows that the ALOS satellite rotates around the Y-axis of the target coordinate system.
For a given 3D target, one imaging result can serve as a reference; when an image from another angle is provided, the human eye can directly infer the 3D axial direction. When sequential imaging results are available, the target attitude can be further constrained by comparing the differences between these images. However, given the complexity and precision required for estimating the target axial direction, automated algorithms are essential. Based on this observation, this paper develops a regression network for the automatic estimation of the target axial direction.

2.3. Regression Network Construction and Training

Based on the ISAR images generated in the previous step, the dataset for the regression network can be constructed. Inspired by references [13,14], a regression network is designed whose input is the ISAR image sequence $X_1, X_2, X_3$ and whose regression targets are the center yaw angle $Y_c$ and the center pitch angle $P_c$. For the regression label of each sample, the axial direction of the center ISAR image is taken as the true value. The designed network structure is shown in Figure 6. For each view, the backbone network consists of four modules, each comprising convolution, batch normalization, activation, and max pooling. The convolutional kernel sizes of these four layers are 5, 3, 3, and 3, and the numbers of output channels are 8, 16, 32, and 64, respectively.
Since neighboring images in the ISAR image sequence are similar, as shown in Figure 7, the backbone networks for the three viewpoints share weights, which minimizes the number of network parameters. Assume the dimension of the input data is $H \times W \times C$, where $H$, $W$, and $C$ represent the height, width, and number of channels of the image, respectively. The output obtained from each viewpoint image through the backbone network is then $\frac{H}{16} \times \frac{W}{16} \times 64$. These three outputs are concatenated along the channel dimension to obtain the backbone features of the ISAR image sequence, with a size of $\frac{H}{16} \times \frac{W}{16} \times 192$. This feature is then processed by a $1 \times 1$ convolutional layer, yielding an output feature of size $\frac{H}{16} \times \frac{W}{16} \times 64$; the $1 \times 1$ kernel reduces computational complexity while reducing feature redundancy. Finally, the yaw and pitch angles are predicted separately by fully connected layers with an output dimension of 1.
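The architecture described above can be sketched in PyTorch as follows. The kernel sizes, channel counts, weight sharing, channel-wise concatenation, 1×1 fusion, and scalar output heads follow the text; details not stated in the paper (padding, ReLU activation, pooling stride, and single-channel input) are our assumptions.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch, kernel):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel, padding=kernel // 2),  # convolution
        nn.BatchNorm2d(out_ch),                                  # batch normalization
        nn.ReLU(inplace=True),                                   # activation
        nn.MaxPool2d(2),                                         # max pooling (halves H and W)
    )

class AxisRegressionNet(nn.Module):
    def __init__(self, in_size=512, in_ch=1):
        super().__init__()
        # Weight-shared backbone: kernel sizes 5, 3, 3, 3 and channels 8, 16, 32, 64.
        self.backbone = nn.Sequential(
            conv_block(in_ch, 8, 5),
            conv_block(8, 16, 3),
            conv_block(16, 32, 3),
            conv_block(32, 64, 3),
        )
        self.fuse = nn.Conv2d(3 * 64, 64, kernel_size=1)   # 1x1 conv: 192 -> 64 channels
        feat = (in_size // 16) ** 2 * 64                    # flattened feature length
        self.yaw_head = nn.Sequential(nn.Flatten(), nn.Linear(feat, 1))
        self.pitch_head = nn.Sequential(nn.Flatten(), nn.Linear(feat, 1))

    def forward(self, x1, x2, x3):
        # The same (weight-shared) backbone processes each view of the sequence.
        f = [self.backbone(x) for x in (x1, x2, x3)]
        f = torch.cat(f, dim=1)   # concatenate along the channel dimension (192 channels)
        f = self.fuse(f)          # reduce redundancy with the 1x1 convolution
        return self.yaw_head(f), self.pitch_head(f)

# Example forward pass with a batch of two 512x512 single-channel image triplets.
net = AxisRegressionNet()
x = [torch.randn(2, 1, 512, 512) for _ in range(3)]
yaw, pitch = net(*x)
print(yaw.shape, pitch.shape)  # torch.Size([2, 1]) torch.Size([2, 1])
```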
For this regression network, the loss function consists of two components and can be expressed as follows:
$L = L_p + \lambda L_y$
where $\lambda$ is the balance coefficient of the loss function, which is set to 0.5 according to the training loss variations. $L_y$ and $L_p$ represent the losses for the yaw and pitch angles, respectively, which are defined based on the mean squared error as
$L_y = \frac{1}{N}\sum_{n}(\hat{Y}_c - Y_c)^2$
$L_p = \frac{1}{N}\sum_{n}(\hat{P}_c - P_c)^2$
where $\hat{Y}_c$ and $\hat{P}_c$ represent the estimated yaw and pitch angles, respectively, which are output by the regression network after processing the ISAR image sequence, and $N$ denotes the total number of training samples.
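A compact sketch of this combined loss (the function and variable names are ours), assuming the predictions and labels are given as tensors:

```python
import torch

def axis_regression_loss(yaw_pred, pitch_pred, yaw_true, pitch_true, lam=0.5):
    """Combined loss L = L_p + lambda * L_y with MSE terms; lambda = 0.5 as in the text."""
    loss_y = torch.mean((yaw_pred - yaw_true) ** 2)      # yaw-angle mean squared error
    loss_p = torch.mean((pitch_pred - pitch_true) ** 2)  # pitch-angle mean squared error
    return loss_p + lam * loss_y
```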

3. Results and Performance Analysis

To validate the effectiveness of the proposed method, relevant experiments were designed and analyzed in three parts. The first part establishes the CAD model of the ALOS satellite; the target echoes are then obtained using electromagnetic simulation software, followed by imaging processing, analysis of the imaging features, and the determination of axial labels, after which the regression network adaptively extracts features and estimates the axial parameters. The second part employs the Grad-CAM technique to visualize the network prediction process through heat maps, providing interpretability of the regression network. The third part introduces measured data to further verify the effectiveness and robustness of the proposed method in practical environments.

3.1. Axial Estimation Utilizing Regression Networks

The ALOS satellite is used as the target for this study, with its typical structure and imaging results illustrated in Figure 1 and Figure 4. The radar simulation parameters are as follows: the carrier frequency is in the Ka band (35 GHz), and the bandwidth is approximately 3 GHz, which achieves a range resolution of 5 cm. The azimuth resolution is likewise set to 5 cm, resulting in a synthetic aperture length of 2.5°. The instantaneous attitude of the satellite target is defined by Euler angles, specifically the yaw and pitch angles, which are determined by the angles between the body coordinate system of the target and its orbit coordinate system. Considering the symmetry of the ALOS satellite, the yaw angle is set to range from 0° to 180° and the pitch angle from 0° to 90°. One ISAR image is obtained at every 5° step in yaw and pitch, resulting in a total of 703 ISAR images.
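A quick sanity check of the dataset size, consistent with the 703 images stated above:

```python
# Yaw from 0 to 180 deg and pitch from 0 to 90 deg, both sampled every 5 deg,
# give 37 x 19 = 703 attitude combinations, i.e., 703 ISAR images.
yaw_angles = range(0, 181, 5)     # 37 values
pitch_angles = range(0, 91, 5)    # 19 values
print(len(yaw_angles) * len(pitch_angles))  # 703
```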
The height and width of the images for each of the three views are both 512 pixels. Note that the images have been resized from their original dimensions after imaging, which helps reduce the number of network parameters. The dimension of the output is 2 × 1, representing the yaw and pitch angles. The dataset is divided into training, validation, and test sets in a ratio of 7:1:2. To improve model robustness, data augmentation techniques such as translation, contrast enhancement, and noise addition are applied. Specifically, translation shifts the image horizontally and vertically by a number of pixels randomly drawn from −15 to 15. Contrast enhancement applies a gamma transform to the image, with the gamma value randomly drawn from 0.5 to 1.5. Noise addition adds pixel-wise Gaussian noise with a probability drawn from 0.3 to 0.7. It should be emphasized that rotation and flipping are not adopted, to avoid introducing ambiguity into the axis labels. The initial learning rate is set to 0.004, with a reduction factor of 0.9 applied every 30 training epochs, and the total number of training epochs is 200. The Adaptive Moment Estimation (Adam) optimizer is used, with the gradient decay parameter and the second-moment decay parameter set to 0.9 and 0.999, respectively.
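The augmentation and optimizer settings above can be summarized in the following sketch. The image intensity range, the noise standard deviation, and the stand-in model are assumptions not specified in the paper.

```python
import random
import numpy as np
import torch

def augment(img: np.ndarray) -> np.ndarray:
    """img: 2-D float array scaled to [0, 1] (the intensity range is our assumption)."""
    # Random translation by -15..15 pixels in each direction (wrap-around via roll
    # is a simplification of a true shift with padding).
    dx, dy = random.randint(-15, 15), random.randint(-15, 15)
    img = np.roll(img, shift=(dy, dx), axis=(0, 1))
    # Contrast enhancement: gamma transform with gamma drawn from [0.5, 1.5].
    gamma = random.uniform(0.5, 1.5)
    img = np.clip(img, 0.0, 1.0) ** gamma
    # Pixel-wise Gaussian noise, applied with a probability drawn from [0.3, 0.7]
    # (the noise standard deviation 0.05 is a placeholder).
    if random.random() < random.uniform(0.3, 0.7):
        img = img + np.random.normal(0.0, 0.05, size=img.shape)
    # Rotation and flipping are deliberately omitted to avoid axis-label ambiguity.
    return np.clip(img, 0.0, 1.0)

# Optimizer and schedule: Adam with betas (0.9, 0.999), initial lr 0.004,
# multiplied by 0.9 every 30 epochs, for 200 epochs in total.
model = torch.nn.Linear(64, 2)  # stand-in for the regression network sketched earlier
optimizer = torch.optim.Adam(model.parameters(), lr=0.004, betas=(0.9, 0.999))
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.9)
```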
Firstly, three representative test samples of ISAR image sequences are taken as examples. The intermediate (center) ISAR imaging results for these samples are shown in Figure 8a. The true values for these samples are listed in the first column of Table 1, and the corresponding attitudes are visualized using CAD models in Figure 8b. The predicted yaw and pitch angles, obtained by training the model with ISAR sequences of three images at 5° intervals, are listed in the second column of Table 1, together with the corresponding true and predicted axis directions. The predictions are highly consistent with the true values, with a small overall mean error.
Based on these initial findings, further experiments were conducted to investigate the effect of larger angular intervals between sequential ISAR images. Imaging results with pitch and yaw angles separated by 10° were used for the training and validation sets, while the test set consisted of intermediate angles located within the intervals of 0–10°, 10–20°, and so on. The experimental results are shown in Table 2, where the mean errors of the estimated pitch and yaw angles remain within acceptable limits. Despite the larger angular spacing, the network demonstrates strong robustness, indicating its ability to adapt to the substantial angular variations encountered during the training phase.
In addition, experiments were conducted under varying signal-to-noise ratio (SNR) conditions, specifically at levels of 0, 5, and 10, as shown in Table 3. The imaging results corresponding to the same attitude at different SNR levels are presented in Figure 9. At an SNR of 0, the prediction error exhibits a slight increase, but the overall decline in the estimation accuracy of the yaw angle is minimal, demonstrating that the method retains its robustness in noisy environments. When the SNR is increased to 5 and 10, the discrepancies between the predicted results and the true values further decrease, with average errors of 0.24° and 0.63°, respectively. These results confirm the effectiveness of the method and indicate the limited impact of SNR on the accuracy of axial estimation.
To quantitatively assess the prediction accuracy of the yaw and pitch angles, the yaw error, pitch error, and mean estimation error are selected as quantitative metrics. In addition, the accuracy of an instantaneous axial estimation method based on a single ISAR image is compared. The results are shown in Table 4. Compared with instantaneous axial estimation based on a single ISAR image, the average estimation error based on a sequence of ISAR images is reduced by about 0.6°, which effectively improves the accuracy of the axial estimation method. Furthermore, once the network has been trained, the prediction time for each image sequence is approximately 0.06 s.
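These metrics can be computed as in the sketch below. Treating the per-angle errors as mean absolute errors and the overall error as their average is our reading of the tables, but it reproduces the "average estimation error" column in Tables 4 and 5.

```python
import numpy as np

def estimation_errors(yaw_pred, yaw_true, pitch_pred, pitch_true):
    """Return (mean yaw error, mean pitch error, mean estimation error) in degrees."""
    yaw_err = np.mean(np.abs(np.asarray(yaw_pred) - np.asarray(yaw_true)))
    pitch_err = np.mean(np.abs(np.asarray(pitch_pred) - np.asarray(pitch_true)))
    return yaw_err, pitch_err, 0.5 * (yaw_err + pitch_err)

# Example consistency check: the sequence-based errors in Table 4 (yaw 2.10 deg,
# pitch 1.05 deg) average to (2.10 + 1.05) / 2 = 1.575, reported as 1.58 deg.
```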
To further evaluate the performance of the proposed method under different loss functions, the mean absolute error (MAE) and Huber losses were introduced in addition to the original mean squared error (MSE) loss, allowing a comparison of their accuracy and robustness in estimating the yaw and pitch angles. The experimental results in Figure 10 indicate that, while the three loss functions exhibit similar overall trends, the MSE loss achieves the best accuracy: it attains the lowest average error in both yaw and pitch estimation, demonstrating high stability and applicability across different conditions and angular variations.
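In a PyTorch implementation, switching among these criteria amounts to selecting a different standard loss module, for example:

```python
import torch.nn as nn

# The three losses compared in Figure 10, applied to yaw and pitch outputs alike.
criteria = {
    "mse": nn.MSELoss(),      # mean squared error (used in the proposed method)
    "mae": nn.L1Loss(),       # mean absolute error
    "huber": nn.HuberLoss(),  # Huber loss
}
loss_fn = criteria["mse"]
```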
Furthermore, we considered the impact of pitch angle variation on estimation performance. The ability of the regression network to estimate yaw and pitch angles is evaluated by selecting pitch angles from 15° to 75° at 15° intervals while varying the yaw angle at each pitch angle. The experimental results are presented in Table 5, indicating that the average estimation error is largest at a pitch angle of 75° and smallest at 60°. To further analyze the estimation performance across different yaw angles, the yaw angle range is divided into 15° intervals, resulting in a total of 12 intervals; the estimation error for each interval is shown in Figure 11. The maximum estimation error reaches 7.5° when the pitch angle is 75° and the yaw angle lies between 165° and 180°, which significantly increases the average estimation error. This is because, at this extreme attitude, critical components in the imaging results are obscured, making it a challenging sample for estimation.

3.2. Visualization of the Network Prediction Process

Based on the Grad-CAM technique, the predicted yaw and pitch values are back-propagated to generate heat maps of the regional features extracted by each convolutional layer, providing partial interpretability of the proposed network. The individual convolutional layers of the proposed regression network are arranged sequentially from top to bottom in Figure 12. The following conclusions can be drawn from these observations:
(1) The first four convolutional layers serve as the backbone for primary feature extraction. Owing to the shared weights, the feature responses are concentrated at the edges of the three images, which results in some responses corresponding to empty regions in certain ISAR images. Moreover, the shallow layers of the backbone still extract local features, such as component corner points and component line edges.
(2) The feature responses of the convolutional layers dedicated to yaw and pitch are holistic, focusing in particular on the directional features at the ends of prominent components, which significantly affects the accuracy of the subsequent axial estimation.
Figure 12. Feature visualization with different convolutional layers. (a) First convolutional layer; (b) Second convolutional layer; (c) Third convolutional layer; (d) Fourth convolutional layer; (e) Convolutional layers for yaw estimation; (f) Convolutional layers for pitch estimation.
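For reference, a minimal Grad-CAM sketch adapted to a regression output is given below. The hook mechanics, the choice of target layer, and the normalization are our assumptions; the idea is to back-propagate the chosen scalar output (yaw or pitch) to a selected convolutional layer and weight its feature maps by the spatially averaged gradients.

```python
import torch
import torch.nn.functional as F

def grad_cam_regression(model, target_layer, inputs, head=0):
    """Grad-CAM heat map for a scalar regression head (head=0: yaw, head=1: pitch)."""
    feats, grads = {}, {}
    h1 = target_layer.register_forward_hook(lambda m, i, o: feats.update(a=o))
    h2 = target_layer.register_full_backward_hook(lambda m, gi, go: grads.update(a=go[0]))
    try:
        outputs = model(*inputs)          # (yaw, pitch) predictions
        scalar = outputs[head].sum()      # back-propagate the chosen regression output
        model.zero_grad()
        scalar.backward()
        fmap, grad = feats["a"], grads["a"]
        weights = grad.mean(dim=(2, 3), keepdim=True)          # channel-wise weights
        cam = F.relu((weights * fmap).sum(dim=1, keepdim=True))  # weighted feature sum
        cam = F.interpolate(cam, size=inputs[0].shape[-2:], mode="bilinear",
                            align_corners=False)
        return cam / (cam.amax(dim=(2, 3), keepdim=True) + 1e-8)  # normalize to [0, 1]
    finally:
        h1.remove()
        h2.remove()

# Example usage with the AxisRegressionNet sketch from Section 2.3:
# cam = grad_cam_regression(net, net.fuse, [torch.randn(1, 1, 512, 512) for _ in range(3)])
```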

3.3. Real-Data Experiments

Here, we describe the setup of our recent satellite imaging experiment, in which the radar and the satellite were separated by a distance of 780 m. Figure 13 presents optical images of the satellite at different attitudes. The approximate dimensions of the target satellite are 6 m in length, 1.4 m in width, and 6 m in height. The main experimental parameters are listed in Table 6, where "Frequency step length" denotes the carrier-frequency step $\Delta f$ between adjacent subpulses in a burst. Although the experimental frequency band differs from the simulation frequency band, the core idea of the processing algorithm is consistent with the proposed approach.
Due to limitations of the experimental setup, the pitch angle of the satellite is fixed at 0°, while the yaw angle ranges from 0° to 360° at intervals of 1.5° to form the dataset. Figure 14 shows typical imaging results at selected yaw angles. Owing to the high resolution of the terahertz band, the structural features and key components of the satellite target are clearly presented. The same 7:1:2 split as for the simulated data was used for training, validation, and testing. In the initial experiment, images captured at 1.5° intervals were used to validate the network, with the results displayed in the first row of Table 7. An additional experiment was then conducted using images at 3° intervals for training and validation, while images at 1.5° intervals served as the test set to assess the network's accuracy for angles not encountered during training. The results, presented in the second row of Table 7, show a noticeable decrease in accuracy compared with the first experiment. This reduction highlights the limited generalization of the network when estimating target orientations not covered by the training set, and underlines the challenge of achieving high accuracy with measured data compared with simulated data, especially for unseen angles; we are committed to pursuing further improvements.

4. Conclusions

An axial estimation framework based on a regression network with ISAR image sequences is proposed in this paper, capable of real-time estimation of the axial direction of space targets. Firstly, taking the ALOS satellite as an example, its echo data for arbitrary attitudes are obtained using a CAD model and electromagnetic simulation software, and ISAR image and label libraries are established according to the system resolution requirements. Subsequently, an axial estimation network tailored to ISAR image sequences is constructed; in contrast to traditional methods based on classification and detection, the yaw and pitch angles are directly output by this network under a minimum mean squared error criterion. Finally, the interpretability of the regression network is examined through feature visualization, which supports the design of improved axial estimation networks. The estimation errors in the yaw and pitch directions are 2.10° and 1.05°, respectively, with an average estimation error of 1.58°. After training, the network takes approximately 0.06 s to process a single ISAR image sequence. In addition to the simulated data, testing with real measured data further validated the effectiveness of the proposed framework in practical environments, demonstrating its robustness in estimating target orientation. Despite the challenges posed by real-world conditions, the framework achieved reliable results, reinforcing its potential for operational applications.

Author Contributions

Conceptualization, W.G. and Q.Y.; methodology, W.G.; software, W.G.; validation, W.G., Q.Y. and C.L.; formal analysis, W.G.; investigation, H.W.; resources, H.W.; data curation, W.G.; writing—original draft preparation, W.G.; writing—review and editing, W.G.; visualization, C.L.; supervision, H.W.; project administration, H.W.; funding acquisition, Q.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Science and Technology Innovation Program of Hunan Province and the National Natural Science Foundation of China under Grants 62201591 and 62035014.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Koshkin, N.; Korobeynikova, E.; Shakun, L.; Strakhova, S.; Tang, Z.H. Remote sensing of the EnviSat and Cbers-2B satellites rotation around the centre of mass by photometry. Adv. Space Res. 2016, 58, 358–371. [Google Scholar] [CrossRef]
  2. Kou, P.; Liu, Y.; Zhong, W.; Tian, B.; Wu, W.; Zhang, C. Axial Attitude Estimation of Spacecraft in Orbit Based on ISAR Image Sequence. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 7246–7258. [Google Scholar] [CrossRef]
  3. Xu, Z.; Ai, X.; Zhao, F.; Xiao, S. Attitude Estimation for Linear-Type Targets Based on Bistatic Full-Polarization Information. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4023005. [Google Scholar] [CrossRef]
  4. Mcfadden, F.E. Three-dimensional reconstruction from ISAR sequences. In Proceedings of the AeroSense 2002, Orlando, FL, USA, 1–5 April 2002; pp. 58–67. [Google Scholar]
  5. Ferrara, M.; Arnold, G.; Stuff, M. Shape and motion reconstruction from 3D-to-1D orthographically projected data via object-image relations. IEEE Trans. Pattern Anal. Mach. Intell. 2009, 31, 1906–1912. [Google Scholar] [CrossRef] [PubMed]
  6. Ferrara, M.; Arnold, G.; Parker, J.T.; Stuff, M. Robust estimation of shape invariants. In Proceedings of the IEEE Radar Conference 2012, Atlanta, GA, USA, 7–11 May 2012; pp. 0167–0172. [Google Scholar]
  7. Morita, T.; Kanade, T. A sequential factorization method for recovering shape and motion from image streams. IEEE Trans. Pattern Anal. Mach. Intell. 1997, 19, 858–867. [Google Scholar] [CrossRef]
  8. Zhou, J.; Shi, Z.; Fu, Q. Three-dimensional scattering center extraction based on wide aperture data at a single elevation. IEEE Trans. Geosci. Remote Sens. 2015, 53, 1638–1655. [Google Scholar] [CrossRef]
  9. Du, R.; Liu, L.; Bai, X. Instantaneous attitude estimation of spacecraft utilizing joint optical-and-ISAR observation. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5112114. [Google Scholar] [CrossRef]
  10. Huo, K.; Kou, P.; Liu, Y. Attitude estimation method of spacecraft in orbit with complex structure using sequential ISAR images. Syst. Eng. Electron. 2023, 45, 2438–2445. [Google Scholar]
  11. Wang, C.; Jiang, L.; Ren, X.; Zhong, W.; Wang, Z. Automatic Instantaneous Attitude Estimation Framework for Spacecraft Based on Colocated Optical/ISAR Observation. IEEE Geosci. Remote Sens. Lett. 2024, 21, 3502005. [Google Scholar] [CrossRef]
  12. Wang, C.; Jiang, L.; Li, M.; Ren, X.; Wang, Z. Slow-Spinning Spacecraft Cross-Range Scaling and Attitude Estimation Based on Sequential ISAR Images. IEEE Trans. Aerosp. Electron. Syst. 2023, 59, 7469–7485. [Google Scholar] [CrossRef]
  13. Fan, L.; Wang, H.; Yang, Q.; Deng, B. THz-ViSAR-Oriented fast indication and imaging of rotating targets based on nonparametric method. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5217515. [Google Scholar] [CrossRef]
  14. Fan, L.; Wang, H.; Yang, Q.; Deng, B. High-Quality airborne terahertz video SAR imaging based on Echo-Driven robust motion compensation. IEEE Trans. Geosci. Remote Sens. 2024, 62, 2001817. [Google Scholar] [CrossRef]
Figure 1. The overall framework of axial estimation.
Figure 3. Definition of the yaw and pitch angle.
Figure 4. ALOS satellite modeling and typical electromagnetic simulation imaging. (a) In-orbit schematic of the ALOS satellite; (b) CAD model; (c) Schematic of imaging result.
Figure 5. Imaging results of the ALOS satellite at typical attitude angles.
Figure 6. Architecture of the regression network.
Figure 7. The sequence of ISAR images whose pitch angles are all 70° and yaw angles are 155°, 160°, and 165° from left to right.
Figure 8. Typical ISAR imaging results with corresponding CAD models. (a) Typical ISAR imaging results; (b) CAD models.
Figure 9. The imaging results under varying SNR levels. (a) SNR: 0; (b) SNR: 5; (c) SNR: 10.
Figure 10. Yaw and pitch angle estimation errors for different loss functions.
Figure 11. Mean estimation errors for various yaw angle intervals at different pitch angles. (a) 15°; (b) 30°; (c) 45°; (d) 60°; (e) 75°.
Figure 13. Real images of the satellite.
Figure 14. The imaging results at six different yaw angles. (a) yaw: 0°; (b) yaw: 45°; (c) yaw: 75°; (d) yaw: 90°; (e) yaw: 100°; (f) yaw: 115°.
Table 1. Analysis of axial estimates for three samples.

| Sample | True Value (°) (Yaw, Pitch) | Prediction (°) (Yaw, Pitch) | True Value of Axis Direction | Prediction of Axis Direction |
|---|---|---|---|---|
| 1 | (95, 10) | (92.94, 11.64) | (−0.09, 0.98, 0.17) | (−0.05, 0.98, 0.20) |
| 2 | (55, 20) | (56.12, 20.20) | (0.54, 0.77, 0.34) | (0.52, 0.78, 0.35) |
| 3 | (160, 70) | (158.85, 70.13) | (−0.32, 0.12, 0.94) | (−0.32, 0.12, 0.94) |
Table 2. Average errors of pitch and yaw angles under different dataset intervals.

| Dataset Interval (°) | Average Pitch Angle Error (°) | Average Yaw Angle Error (°) |
|---|---|---|
| 5 | 1.05 | 2.10 |
| 10 | 3.23 | 5.61 |
Table 3. Average errors of pitch and yaw angles at different SNR levels.

| SNR | Average Pitch Angle Error (°) | Average Yaw Angle Error (°) |
|---|---|---|
| 0 | 1.5308 | 3.3392 |
| 5 | 1.4534 | 2.9668 |
| 10 | 1.2980 | 2.6946 |
Table 4. The axial estimation error based on an ISAR image sequence and a single ISAR image.

| Method | Yaw (°) | Pitch (°) | Average Estimation Error (°) |
|---|---|---|---|
| Estimation error of a single ISAR image | 2.53 | 1.84 | 2.19 |
| Estimation error of the ISAR sequence | 2.10 | 1.05 | 1.58 |
Table 5. The axial estimation error at different pitch angles.

| Pitch Angle (°) | Yaw Error (°) | Pitch Error (°) | Average Estimation Error (°) |
|---|---|---|---|
| 15 | 1.99 | 1.17 | 1.58 |
| 30 | 2.13 | 0.98 | 1.56 |
| 45 | 1.68 | 0.94 | 1.31 |
| 60 | 1.37 | 0.92 | 1.14 |
| 75 | 2.48 | 1.35 | 1.91 |
Table 6. Main experimental parameters.

| Parameter | Value |
|---|---|
| Carrier frequency | 0.22 THz |
| PRF | 10,000 |
| Bandwidth | 20 GHz |
| Frequency step length | 8.3 MHz |
| Pulse width | 5 μs |
Table 7. Comparison of average yaw angle error in different experiments.

| Experiment | Average Yaw Angle Error (°) |
|---|---|
| Experiment 1: 1.5° interval | 5.5963 |
| Experiment 2: 3° interval | 6.9973 |