Article

Research on Full-Sky Star Identification Based on Spatial Projection and Reconfigurable Navigation Catalog

by Siyao Wu 1,2, Ting Sun 1,2,*, Fei Xing 3, Haonan Liu 1,2, Jiahui Song 1,2 and Shijie Yu 1,2
1 School of Instrument Science and Opto-Electronics Engineering, Beijing Information Science and Technology University, Beijing 100192, China
2 Laboratory of Intelligent Microsystems, Beijing Information Science and Technology University, Beijing 100192, China
3 Department of Precision Instrument, Tsinghua University, Beijing 100084, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(9), 1553; https://doi.org/10.3390/rs17091553
Submission received: 13 March 2025 / Revised: 24 April 2025 / Accepted: 25 April 2025 / Published: 27 April 2025

Abstract
A star tracker is widely used as a high-precision attitude measurement device for spacecraft. It calculates attitude by extracting the magnitudes and positions of stars detected by a CCD/CMOS sensor and matching them with stars in the star catalog. Traditional star identification methods typically require the selection of specific anchor stars, which may cause insufficient identification accuracy because the number of stars used in the rough search is limited. In this paper, we propose a star identification method based on spatial projection, which starts with preprocessing. A method for online expansion and reconstruction of the star catalog is then proposed, which provides more stored star data. After rough recognition and a coordinate system transformation, the final identification is realized in the polar coordinate system. All the star points in the star image are identified, and the attitude information is obtained at the same time. The performance of the identification method is verified by real night sky experiments. Stray light experiments are also carried out to demonstrate good noise immunity. Compared with the traditional subgraph isomorphism method, the proposed method makes it easier to adjust the number of recognizable stars in the field of view and achieves better recognition of specific areas. The method is of great significance for future tasks such as attitude measurement, celestial navigation, remote sensing measurement, and space target observation and tracking.

Graphical Abstract

1. Introduction

Star identification is an important process for attitude determination systems in astronomical navigation [1], especially in remote sensing measurement, lunar exploration engineering, asteroid exploration, space science observation, rendezvous and docking, satellite networking, non-radio navigation, and other processes that require an absolute orientation result. Taking a high-precision star tracker as an example, star identification is needed after star spot extraction and high-precision centroiding [2,3]. The star coordinates in the celestial coordinate system are matched with the coordinates of the imaged star spots in the star tracker coordinate system to obtain the transformation matrix (attitude information) between the two coordinate systems.
The evaluation of the star identification algorithm usually includes identification accuracy, recall rate, calculation speed, storage size, and so on. At present, the main existing star image identification algorithms are divided into two major types: subgraph isomorphism and pattern recognition [4].
The triangle algorithm [5,6,7] is a classical algorithm for subgraph isomorphism. The triangle algorithm mainly uses the constant angular distances formed by at least three stars between the obtained star spots and the star information stored in the catalog. The triangle algorithm produces relatively accurate results, but after the stars for matching are recognized, secondary recognition is required to match other stars. In addition, the triangle algorithm depends on the search table of angular distance. With the increase in the number of stars, the search table becomes extremely large, which brings difficulties to the search process. The classical triangle algorithm (Liebe, 1992) [8] uses a single CCD camera and a star catalog to identify triangular features. It supports initial coarse attitude determination without prior knowledge. However, it relies on fixed quantization of features and does not validate the effectiveness of the recognition method in real scenarios.
On the basis of the above methods, new recognition algorithms are constantly proposed. The polygon angular match algorithm (Kosik, 1991) [9] is a high-efficiency vector search method and verifies its superiority by statistical analysis. However, the algorithm has high complexity and limited adaptability to star field density variations. The van Bezooijen algorithm (DeAntonio et al., 1993) [10] achieves star recognition by brightness sorting and connection matching, with experimental validation. However, the algorithm suffers from a direction matching error and false radiation problem, which affects the recognition accuracy. Improvements in the subgraph isomorphism algorithm also include the pyramid algorithm (Mortari, 2004) [11] and the simplest general subgraph (Liu, 2021) [12]. The purpose is to improve the matching process of angular distance information. Sun et al. (2024) introduced an iterative voting mechanism to enhance cyclic feature matching [13], but the computational complexity increased quadratically with the star count. He et al. (2025) developed a partitioned catalog voting method that reduces database storage by 96% through celestial sphere zoning [14], but its uniform partitioning strategy limits its adaptability to sparse star fields.
In addition, researchers pursue faster search speeds with faster database search techniques, like the k-vector technique (Spratling and Mortari, 2011) [15], hashing techniques (Gerhard, 2016) [16], and the K-L transformation algorithm (Zhao, 2016) [17]. The purpose is to improve search and storage costs, reduce identification time, and improve processing speeds.
The grid method [18] belongs to a pattern recognition method, which divides the star spots by coordinates into grids and compares them with stored star information. The grid method can identify all stars in the FOV at one time. But due to the limitation of the grid division, the recognition accuracy is not as good as that of the angular distance method, which may lead to the problem of false recognition. Especially when the imaging environment is complex, with an interfering background, the applicability of the current recognition method needs to be improved.
Research has also been conducted on pattern recognition algorithms, such as the modified grid algorithm (Lee, 2007; Na, 2009; Wang, 2024) [19,20,21], the mixed identification strategy (Silani, 2006) [22], the image-based identification algorithm (Yoon, 2011) [23], the hidden Markov model-based algorithm (Sun, 2016) [24], and the singular value method (SVM) algorithm (Wei, 2019) [25]. Recent innovations in pattern representation show promising directions: Kim (2024) combined singular values with Gaussian mixture models (GMM) to generate pseudo-planar features [26], demonstrating 98.2% robustness against five false stars, but the method required intensive SVD computations for multiple star subsets. These researchers aim to explore new patterns to achieve accurate star matching and fast search strategies, but with high complexity.
The polar coordinate-based methods also belong to the pattern recognition algorithm, which has shown potential in addressing rotation invariance challenges. Zhang et al. (2008) proposed a radial and cyclic feature extraction method by partitioning star patterns into angular sectors and radial rings [27]. While this approach achieved 97.57% identification accuracy, its performance degraded significantly under high positional noise due to fixed quantization scales. Wei et al. (2009) introduced a log-polar transform algorithm to map star coordinates into rotation-invariant polar domains [28]. However, their method required extensive preprocessing for pattern vector alignment and exhibited computational bottlenecks when handling dense star fields. Fang et al. (2020) developed a geometric verification-based method combining angular distances and spatial projections [29]. While achieving 98.4% accuracy with missing stars, their approach required redundant database storage for multi-pattern validation. Dai et al. (2025) proposed a dynamic distance ratio matching (DDRM) algorithm, which adjusts radial quantization scales based on local star density [30]. Although DDRM improved noise tolerance, it still relied on predefined magnitude thresholds, limiting flexibility in sparse star fields. These limitations highlight the need for adaptive spatial mapping techniques under variable noise conditions while minimizing computational overhead.
With the development of artificial intelligence, a series of new algorithms have appeared, like hierarchical mixture-of-experts models (Crain et al., 2000) [31], adaptive ant colony algorithm (Quan, 2010) [32], neural network-based algorithm (Ma, 2019) [33]. They provide a new approach for star image recognition but have not been widely used at present. Deep learning approaches, such as BSC-Net (Li et al., 2022) and MBS-Net (Liu et al., 2023), introduced convolutional networks for background suppression in complex star images [34,35]. While these methods achieved superior signal-to-noise ratios, their reliance on extensive real-world training data and high computational loads limits real-time applicability.
Traditional methods primarily use image coordinate systems to describe the geometric distribution of surrounding stars. This approach has inherent limitations because it relies on unreliable information to determine the orientation of Cartesian coordinate systems. In contrast, polar coordinate systems inherently separate rotation-invariant radial features from cyclic features during rotation, enabling radial features to serve as robust descriptors independent of other unreliable parameters. This star pattern definition demonstrates superior accuracy and computational efficiency.
Even among algorithms based on polar coordinates, most adopt fixed patterns formed by a predetermined number of stars for optimal matching. Such implementations remain vulnerable to significant position errors during initial identification, especially under stray light or sensor noise interference, due to the limited number of stars. To address this problem, we propose a method based on spatial reprojection. The primary star is identified first, followed by a planar projection that converts three-dimensional attitude parameters into two-dimensional vectors. Subsequent star matching uses the attitude matrix for iterative refinement, taking full advantage of information from all detected stars. This integration of full pattern recognition with real-time attitude estimation achieves simultaneous star identification and attitude determination.
Conventional fixed navigation catalogs employ predetermined magnitude thresholds. This configuration becomes problematic in celestial regions with sparse star fields, where the limited number of detectable stars compromises the accuracy of attitude determination. To solve this problem, we propose a reconfigurable navigation catalog strategy. The method first identifies the current celestial region through initially detected stars, then selectively appends region-specific stars based on required magnitude thresholds and density criteria during catalog reconstruction, eliminating the need for full catalog regeneration. The framework enhances computational efficiency and catalog flexibility, accommodating diverse application requirements.
Aiming at the above problems, this paper proposes a full-sky star identification method based on spatial projection and a reconfigurable navigation catalog. Full-sky identification refers to a method capable of robustly identifying stars across arbitrary celestial regions by dynamically reconfiguring the navigation catalog. Our method is adapted to the complex imaging environment with high recognition accuracy. It is also possible to arbitrarily change the navigation star density according to the sky area to avoid very few extracted stars in the field of view, and realize more precise recognition and improve the attitude accuracy. Therefore, improved anti-interference performance under stray light, star catalog reconstruction for specific areas, and synchronous identification and attitude calculation are three main advantages.

2. Principles and Methods

The necessary information for general star identification includes: positions of captured star spots on the image detector, navigation star catalog, and parameters of the system. Based on the above information, the proposed identification algorithm is divided into five main parts, as shown in Figure 1. Step 1 is preprocessing; Step 2 is rough recognition; Step 3 is coordinate transformation; Step 4 is final recognition; Step 5 is synchronous identification and attitude calculation.
The proposed method integrates spatial projection with catalog reconfiguration. Spatial projection implements rotation-invariant feature extraction in polar coordinates, reducing the problems of rotational misalignment and sensitivity to positional noise in the Cartesian coordinate system. Catalog reconstruction dynamically adjusts star density in sparse regions, ensuring sufficient stars for identification without overloading storage. This synergy allows for simultaneous high accuracy and adaptability, which is unique compared to fixed-catalog methods.
The algorithm has two branches: the star image processing branch and the star catalog processing branch. The star image processing branch starts with the star spot positions. The star catalog processing branch starts with the navigation star catalog. They both go through five steps. In comparison, the star catalog processing branch requires more computing resources. We will introduce each branch at each step.
Step 1: Preprocessing. For the star image processing branch, the star extraction results require reordering (comprehensive sorting according to brightness and distance information), and then the main star is selected. For the star catalog processing branch, the search table T1 is constructed based on the original navigation star catalog. Each star in the star catalog is taken as the main star, and the stars within the field of view around the main star are selected and recorded. The star serial numbers and the distances d_si from the main star after projection are calculated to form the search table, which needs to be stored. Each row in T1 represents the star serial numbers around one main star, and the serial numbers are the row numbers of the selected neighboring stars in the original navigation catalog. In addition, the preprocessing for the star catalog is omitted if the search table T1 is already stored for the same detection condition.
We also proposed a method for online expansion and reconstruction of the star catalog in Step 1, which provides more stored star data.
Step 2: Rough recognition. For the star image processing branch, the main star P is taken as the center, and the distance values d_pi from each star spot on the image plane to the center are calculated, which are quantized with a certain scale to form a digital string BP. For the star catalog processing branch, the distance values d_si in the above step are quantized with the same scale to form the digital string BS. The similarity between BP and the different BS strings formed by the candidate main stars is calculated and sorted. The top-ranked stars are taken as the candidate main stars to complete the initial recognition. The result of the initial recognition greatly reduces the computational cost in the subsequent processing.
Step 3: Coordinate transformation. For the star image processing branch, the two-dimensional coordinates of star spots are converted to polar coordinates and sorted by polar angle. For the star catalog processing, the coordinates in the celestial coordinate system of stars in each row of the search table T1 are used to obtain the fitted plane based on the focal length, pixel size, etc. Then, the main star and another star in the FOV are used to establish a framework to realize the transformation from the projection plane to the celestial coordinate system. The three-dimensional coordinates of star spots can be converted into two-dimensional coordinates on the projection plane and, then, additionally converted to polar coordinates and reordered.
Step 4: Final identification. The similar radius values R in the above step are extracted from the polar coordinates. By subtracting the angles of the corresponding items, we can quickly determine whether each spot is a true star and obtain its serial number. So far, all stars in the field of view have been identified.
Step 5: Synchronous identification and attitude calculation. The attitude information is obtained by multiplying the above matrices.
The calculation steps and details of the algorithm are shown in Figure 2.

3. Detailed Program and Experimental Analysis

We use real night sky experiments to capture star images under various working conditions. The parameters of the star tracker are shown in Table 1.

3.1. Star Identification Algorithm

In the star image processing, the process runs from Step 1 to Step 5, as shown in Figure 2. The processing results of each step are given in this section to clearly show the algorithm principles and details. The star sensor is fixed on the spacecraft. The simulation was performed using MATLAB R2023b on a CPU at 2.5 GHz.
Step 1: Preprocessing.
Since the celestial coordinates of each star remain fixed on the celestial sphere, their positions are uniquely determined by the parameters of right ascension and declination. The right ascension and declination of a star are expressed as ( α i , δ i ) . According to the relationship between spherical and Cartesian coordinates, the direction vector v i of the navigation star in inertial space is given by Equation (1).
v_i = \begin{bmatrix} \cos\alpha_i \cos\delta_i \\ \sin\alpha_i \cos\delta_i \\ \sin\delta_i \end{bmatrix}   (1)
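As a concrete illustration, the conversion in Equation (1) can be sketched in a few lines of Python; the function name and the example angles below are illustrative, not taken from the paper.

```python
import numpy as np

def radec_to_unit_vector(alpha, delta):
    """Convert right ascension/declination (radians) to the unit direction
    vector of a navigation star in the celestial frame, as in Equation (1)."""
    return np.array([
        np.cos(alpha) * np.cos(delta),
        np.sin(alpha) * np.cos(delta),
        np.sin(delta),
    ])

# Example: a hypothetical star at alpha = 30 deg, delta = 45 deg
v = radec_to_unit_vector(np.radians(30.0), np.radians(45.0))
print(v, np.linalg.norm(v))  # the norm is 1 by construction
```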
The position information of extracted star points in the star image needs to be updated in real time: star spot positions ( x i , y i ) .
At the same time, the star tracker needs a reference original star catalog for star image recognition. The star catalog information (input) is shown in Table 2.
Table 3 is formed from the star catalog in Table 2 for the purpose of searching for star clusters in the FOV. It remains unchanged unless the navigation star catalog or the field of view changes. Therefore, it only needs to be calculated and stored in advance, and it does not need to be recalculated each time. A star is selected as a neighboring star of the main star if its angular distance from the main star is within θ_FOV, the field of view. The “…” in the tables of this paper represents other entries of the same type that are not listed.
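A minimal sketch of how such a search table could be built from the catalog unit vectors is shown below; the function and variable names are assumptions for illustration, and the angular-distance test uses the dot product of unit vectors.

```python
import numpy as np

def build_search_table(unit_vectors, fov_rad):
    """Sketch of building search table T1: each catalog star is treated as a
    main star, and the serial numbers of all stars within one field of view
    of it are recorded."""
    cos_fov = np.cos(fov_rad)
    table = {}
    for i, v in enumerate(unit_vectors):
        cosines = unit_vectors @ v              # cos(angular distance) to every star
        neighbors = np.where(cosines >= cos_fov)[0]
        table[i] = neighbors[neighbors != i].tolist()  # exclude the main star itself
    return table
```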
Step 2: Rough recognition.
Figure 3 shows the BP calculation of the star image processing branch. The main star P is taken as the center, and the distance values d_pi from each star spot on the image plane to the center are calculated, which are quantized with a certain scale to form a digital string BP. This must be updated for each frame.
Table 4 shows the number of stars within a radius range, which is separated according to the method shown in Figure 3.
Table 5 shows the BS calculation of the star catalog processing branch. The distance values d_si in the above step are quantized with the same scale to form the digital string BS. This can be stored in advance, as long as the star catalog and the field of view of the star tracker remain unchanged.
The above BP and BS are compared. An AND operation is performed on the corresponding quantized radius bins, and the results are then summed. A larger sum represents a higher similarity between the digital strings BP and BS. So far, the rough identification is completed, and the initial identification result is obtained. We now have the candidate serial numbers of the main star (there may be 1–20 candidate stars). Note that this step requires only radius information, not specific position information, so the processing is not computationally intensive.
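The rough-recognition score can be sketched as follows; we interpret the “AND then addition” rule as ANDing the occupancy of corresponding radius bins and summing the matches, which is an assumption about the exact operation, and the function names are illustrative.

```python
import numpy as np

def quantize_radial_pattern(distances, scale, n_bins):
    """Quantize main-star-centered distances into radial bins and count the
    stars per bin, forming the digital string (BP or BS)."""
    bins = np.minimum((np.asarray(distances) / scale).astype(int), n_bins - 1)
    return np.bincount(bins, minlength=n_bins)

def pattern_similarity(bp, bs):
    """AND the occupancy of corresponding bins and sum the result; a larger
    value means BP and BS are more similar."""
    return int(np.sum((bp > 0) & (bs > 0)))
```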
Step 3: Coordinate transformation.
For the star image processing branch, the coordinate transformation is needed from the Cartesian coordinate system (StarPos) to the polar coordinate system (ComPos), as shown in Figure 4.
The directional vector w i of the navigation star in the sensor coordinate system can be calculated by Equation (2).
w_i = \frac{1}{\sqrt{(x_i - x_0)^2 + (y_i - y_0)^2 + f^2}} \begin{bmatrix} x_i - x_0 \\ y_i - y_0 \\ f \end{bmatrix}   (2)
where (x_0, y_0) is the intersection of the main optical axis of the star sensor with the photodetector, (x_i, y_i) are the coordinates of the star projected on the photodetector, and f is the focal length of the star sensor.
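A sketch of Equation (2) in Python follows; the sign convention of the third component depends on the camera model and is an assumption here.

```python
import numpy as np

def spot_to_unit_vector(x, y, x0, y0, f):
    """Unit direction vector of an imaged star in the sensor frame from its
    detector coordinates, the principal point (x0, y0), and the focal
    length f (all in the same length unit, e.g., mm), per Equation (2)."""
    w = np.array([x - x0, y - y0, f])  # sign of the f component is an assumed convention
    return w / np.linalg.norm(w)
```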
For the star catalog processing branch, some more procedures are needed. The stars are projected on the image plane (assuming that the main star is imaged at the image plane center), as shown in Figure 5. Then, the coordinates of imaged stars can be calculated by Equation (3).
\overrightarrow{o_c w_i} = \overrightarrow{o w_i} - \overrightarrow{o o_c}   (3)
Therefore, the three-dimensional coordinates (point cluster) of the imaged stars relative to the main point in the celestial coordinate system are obtained. But this cannot be easily transformed to the polar coordinate system. Therefore, the following processes are conducted.
For each candidate star, a processing cycle is required as follows.
(a): The three-dimensional coordinates of the point cluster are used to fit a plane and determine the plane equation, as shown in Figure 6. Plane fitting uses singular value decomposition (SVD): the centroid of the star cluster is subtracted, and SVD is applied to the centered data. The normal vector of the plane is the right-singular vector corresponding to the smallest singular value (the third column of V), and the transformation RM defined in step (c) aligns the celestial frame with the projection plane.
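A minimal sketch of the SVD plane fit described in (a), with illustrative names:

```python
import numpy as np

def fit_plane_svd(points):
    """Least-squares plane fit to an (N, 3) point cluster via SVD. Returns the
    cluster centroid and the unit plane normal, i.e., the right-singular
    vector associated with the smallest singular value."""
    points = np.asarray(points, dtype=float)
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[2]  # third right-singular vector = plane normal
    return centroid, normal
```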
(b): Taking the connecting line between another star and the main star as the X-axis, the Y-axis is in the image plane, and the Z-axis forms a right-hand spiral rule with X and Y. Then, the vectors of the three axes are obtained, and this coordinate system is named as the first-projection coordinate system X V , Y V , Z V .
(c): Considering the concept of attitude matrix, the transformation RM from the celestial coordinate system to the defined coordinate system is obtained, as described in Equation (4):
R_M = \begin{bmatrix} X_V \cdot e_x & Y_V \cdot e_x & Z_V \cdot e_x \\ X_V \cdot e_y & Y_V \cdot e_y & Z_V \cdot e_y \\ X_V \cdot e_z & Y_V \cdot e_z & Z_V \cdot e_z \end{bmatrix}^{T}   (4)
where e x , e y , e z is the base vector of the original coordinate system.
(d): By the transformation matrix, the three-dimensional coordinates of the point cluster are transformed into two-dimensional coordinates on the plane, as described in Equation (5):
u = f \frac{v_x}{v_z}, \quad v = f \frac{v_y}{v_z}   (5)
where [v_x, v_y, v_z]^T = R_M v_i. These coordinates are further converted to polar coordinates, as described in Equation (6):
R = \sqrt{u^2 + v^2}, \quad \theta = \arctan\left(\frac{v}{u}\right)   (6)
where R represents the distance between the star point and the center of the image (projection origin), and θ represents the angle of the star point relative to the center of the image. The distribution of star points in the polar coordinate system (denoted as ComSelect) is shown in Figure 7.
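Steps (c) and (d) can be summarized in a short sketch, assuming R_M is already available from Equation (4); the names are illustrative, and the second plane coordinate is written w here to avoid clashing with the vector v_i.

```python
import numpy as np

def project_to_polar(v_i, R_M, f):
    """Rotate a celestial-frame direction v_i into the first-projection frame
    with R_M, project it onto the plane (Equation (5)), and convert the plane
    coordinates to polar form (Equation (6))."""
    vx, vy, vz = R_M @ v_i
    u, w = f * vx / vz, f * vy / vz
    R = np.hypot(u, w)
    theta = np.degrees(np.arctan2(w, u))  # arctan2 keeps the correct quadrant
    return R, theta
```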
(e): The distance values R from the imaged stars to the main star are compared between ComPos and ComSelect; values of R that agree within a certain range are recorded and saved as R_1 and R_2 (with corresponding rows aligned).
Step 4: Final identification.
The corresponding angles of MarkPos and MarkSelect within similar R are subtracted, and the subtraction results θ_Mark are calculated. If the θ_Mark values agree to within 0.3°, the corresponding spots are considered to be real stars, and their serial numbers are obtained. Table 6 shows the comparison results between polar coordinates.
As shown in Figure 8, if the spots in MarkPos and MarkSelect with Flag 1 are plotted, the figure is obtained.
However, it is not intuitive to observe the relationship of the above spots because of the angle difference θ_Mark. In this regard, we compensate for the angle θ_MeanAT, which means rotating the first-projection coordinate system around the Z-axis.
The angles θ_Mark in Table 6 are rounded, and the frequency of occurrence of each rounded angle is counted. The angle with the highest frequency of occurrence is designated as θ_rotate. The θ_Mark values that deviate from θ_rotate by less than 0.3° are marked as Δθ_k, and their mean value is calculated, as expressed in Equation (7).
\theta_{MeanAT} = \frac{1}{K} \sum_{k=1}^{K} \Delta\theta_k   (7)
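A sketch of this angle-compensation step, under the assumption that the mode of the rounded θ_Mark values is used as θ_rotate exactly as described above:

```python
import numpy as np
from collections import Counter

def mean_rotation_angle(theta_mark, tol=0.3):
    """Round the angle differences, take the most frequent rounded value as
    theta_rotate, and average the differences within tol degrees of it
    (Equation (7))."""
    theta_mark = np.asarray(theta_mark, dtype=float)
    theta_rotate = Counter(np.round(theta_mark).astype(int)).most_common(1)[0][0]
    close = theta_mark[np.abs(theta_mark - theta_rotate) < tol]
    return close.mean()
```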
Then, we obtain the second-projection coordinate system, and we can see in Figure 9 that the extracted spot positions coincide with the result calculated by the star catalog.
Of course, not all extracted points can be successfully identified as star points after angle compensation. As shown in Figure 9, there are still some cases where the positions marked with blue asterisks do not coincide with the positions marked with red circles. This discrepancy arises primarily due to the limited number of navigation stars stored in the catalog and the possible presence of noise in the set of extracted points. Furthermore, not all the positions of the stars calculated by the catalog processing branch can be extracted because the energy of some of the navigational stars in the catalog does not meet the detection threshold.
The identified star serial numbers are marked in the corresponding positions, and the result is shown in Figure 10. The positions marked with blue asterisks represent the polar coordinate positions obtained by the star image processing branch, the positions marked with red circles represent the polar coordinate positions obtained by the star catalog processing branch, and the green numbers represent the serial numbers in the navigation star catalog of the stars marked as Flag 1 in Table 6.
Only if the number of identified stars is greater than 2 is the identification considered successful and this process completed. Figure 11 shows the successful star identification results in the star image. The numbers on the star image represent the star serial numbers in the star catalog.
Step 5: Synchronous identification and attitude calculation.
RM is obtained in Step 3, as described by Equation (4). RZ is a rotation matrix around the Z-axis determined by θ_MeanAT, which can be calculated by Equation (8).
R_Z = \begin{bmatrix} \cos\theta_{MeanAT} & \sin\theta_{MeanAT} & 0 \\ -\sin\theta_{MeanAT} & \cos\theta_{MeanAT} & 0 \\ 0 & 0 & 1 \end{bmatrix}   (8)
RS is a transformation matrix from the second-projection coordinate system to the star tracker coordinate system, which can be calculated by Equation (9).
R_S = \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix}   (9)
After considering the transformation matrix RM, RZ, and RS, the attitude matrix gRT from the celestial coordinate system to the star tracker coordinate system can be obtained, as shown in Equation (10).
g_{RT} = R_S R_Z R_M   (10)
Then, the quaternion qRT can be calculated, as shown in Equation (11).
q_{RT}(1,4) = 0.5\sqrt{g_{RT}(1,1) + g_{RT}(2,2) + g_{RT}(3,3) + 1}
q_{RT}(1,1) = \frac{g_{RT}(2,3) - g_{RT}(3,2)}{4\, q_{RT}(1,4)}
q_{RT}(1,2) = \frac{g_{RT}(3,1) - g_{RT}(1,3)}{4\, q_{RT}(1,4)}
q_{RT}(1,3) = \frac{g_{RT}(1,2) - g_{RT}(2,1)}{4\, q_{RT}(1,4)}   (11)
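The composition of gRT and the quaternion conversion can be sketched as follows; the sign conventions of RZ and RS follow the reconstruction above and should be treated as assumptions.

```python
import numpy as np

def attitude_and_quaternion(R_M, theta_mean_at_deg):
    """Compose gRT = RS * RZ * RM (Equation (10)) and convert it to a
    quaternion with the scalar part last (Equation (11))."""
    c = np.cos(np.radians(theta_mean_at_deg))
    s = np.sin(np.radians(theta_mean_at_deg))
    R_Z = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])  # assumed sign convention
    R_S = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    g_RT = R_S @ R_Z @ R_M

    q = np.empty(4)
    q[3] = 0.5 * np.sqrt(g_RT[0, 0] + g_RT[1, 1] + g_RT[2, 2] + 1.0)
    q[0] = (g_RT[1, 2] - g_RT[2, 1]) / (4.0 * q[3])
    q[1] = (g_RT[2, 0] - g_RT[0, 2]) / (4.0 * q[3])
    q[2] = (g_RT[0, 1] - g_RT[1, 0]) / (4.0 * q[3])
    return g_RT, q
```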

3.2. Regression Verification

We conduct inverse verification using gRT, the star catalog, and the system parameters to calculate the theoretical star positions. Then, we compare them with the identification results above. It can be seen from Figure 12 and Figure 13 that the identified stars are all true stars and that the calculated gRT is a reliable attitude result. The red circles and cyan numbers in Figure 12 represent the corresponding inverse verification results.
Green asterisks in Figure 13 represent the successfully extracted stars, and green numbers represent the star sequence numbers of the extracted stars computed by catalog inversion. The red circles represent the positions of the corresponding inverse verification stars. The cyan numbers represent other inversion results that have not been extracted.

3.3. Experiment and Analysis with Interference

Figure 14 shows the identification results of the proposed algorithm under the influence of different types of stray light. Some of the extracted results are not identified (such as in (c)), possibly because the extracted points themselves are noise, or because no corresponding information is stored in the star catalog. Some stars in the catalog are not extracted (such as in (a), (c), and (e)), which is also a normal phenomenon, possibly due to a low SNR. However, in general, the recognition results are satisfactory in the presence of sunlight, cloud cover, and moonlight interference, which indicates that the proposed algorithm has sufficient anti-interference capability.
We capture star images using another star tracker at different exposure times. The parameters of the star tracker are shown in Table 7.
The recognition results are shown in Figure 15. It can be seen that the proposed recognition algorithm is well adapted under different focal lengths and exposure times.

3.4. Reconstruction of the Navigation Catalog

For regions of the sky with few stars, the required limiting magnitude of the navigation stars is modified. The star density is thereby increased, and the search table is conveniently updated so that more of the extracted star spots can be identified.
If the limiting magnitude used to filter navigation stars is too small (i.e., only brighter stars are retained) when constructing a navigation star catalog, some of the extracted star points will not be recognized because they are not stored in the navigation star catalog. Therefore, we propose a dynamic star catalog reconstruction method to densify the star points in a certain region, so that the extracted star points can be recognized by the star identification algorithm.
In the same field of view, the extent of right ascension covered gradually increases as the visual axis moves toward the north or south pole. For a star to lie in a certain field of view, its right ascension and declination need to satisfy certain conditions: assuming that the coordinates of the star in the celestial coordinate system are (α_i, δ_i), the direction of the visual axis of the star sensor is (α, δ), and the field of view angle is (FOV_x, FOV_y), the range of values for α_i is
\alpha - \frac{FOV_x}{2\cos\delta} \le \alpha_i \le \alpha + \frac{FOV_x}{2\cos\delta}   (12)
The value range of δ i is
\delta - \frac{FOV_y}{2} \le \delta_i \le \delta + \frac{FOV_y}{2}   (13)
After determining the direction of the visual axis of the star sensor, the coordinate range of the stars in the celestial coordinate system is determined by Equations (12) and (13). The range of the filtered area is determined by the above process. Then, an adaptive adjustment of the magnitude threshold of the filtered stars is realized by setting the number of stars to be retained.
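A sketch of the region selection defined by Equations (12) and (13) follows; the function is illustrative, works in degrees, and ignores right-ascension wrap-around near 0°/360° for brevity.

```python
import numpy as np

def stars_in_region(catalog_radec, boresight_radec, fov_deg):
    """Indices of catalog stars whose (alpha_i, delta_i) fall inside the FOV
    around the boresight (alpha, delta), per Equations (12) and (13).
    catalog_radec is an (N, 2) array in degrees; fov_deg is (FOV_x, FOV_y)."""
    alpha, delta = boresight_radec
    fov_x, fov_y = fov_deg
    a, d = catalog_radec[:, 0], catalog_radec[:, 1]
    in_ra = np.abs(a - alpha) <= fov_x / (2.0 * np.cos(np.radians(delta)))
    in_dec = np.abs(d - delta) <= fov_y / 2.0
    return np.where(in_ra & in_dec)[0]
```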
The preprocessed catalog is used as the original navigation catalog, and the magnitude-filtered thresholds of the stars in a particular region are adaptively adjusted to reconfigure the original navigation catalog. The specific procedure is shown in Figure 16.
After increasing the magnitude threshold in specific regions, newly added stars in the star catalog are assigned new identifiers and processed. Each of these added stars is designated as a primary star, and its adjacent stars are identified and recorded in an auxiliary catalog table. Then, the information on each primary star is added to the entries of its adjacent stars in the auxiliary table, while keeping the original identifiers in the auxiliary table unchanged.
The above process realizes the update of the search table T1 in Table 3. Table 8 shows the schematic process of star catalog reconstruction, and Figure 17 shows the distribution of the star point positions after updating the star catalog.
We conducted the experiment of star catalog reconstruction. For regions that need more navigation stars, we only need to add the corresponding navigation star information.
The comparison of identification results before and after star catalog reconstruction is shown in Figure 18.

4. Discussion

In a real night sky observation experiment, as shown in Figure 19, the star tracker is fixed on the turntable and is pointing to the zenith.
About 5000 star images are captured in the real night sky experiment. We sample every 10 images and conduct star extraction and identification with the proposed method. The identification accuracy, recall rate, and processing time are compared with the commonly used method of Xing et al. [36], which is based on the navigation star domain and the k-vector, has good accuracy and calculation speed, and has been applied to more than 200 on-orbit star trackers.
Three indexes are evaluated in the test using the real data: identification accuracy, recall rate, and processing time [37].
Identification accuracy refers to the proportion of stars that are truly stars in the celestial coordinate system out of all the stars recognized as stars by the algorithm, which can be expressed as
Accuracy = \frac{TP}{TP + FP}
where TP (True Positive) refers to the number of correctly identified stars, and FP (False Positive) refers to the number of incorrectly identified stars.
In the definition of identification accuracy, the numerator is the intersection of the identified stars and the regression calculation results, and the denominator is the number of identified stars obtained by the algorithm. This indicates the proportion of real stars among the identified stars. The higher the accuracy, the more reliable the identification and the less often noise is mistaken for a star in the identification result. Figure 20 shows the comparison of identification accuracy between the two methods.
The recall rate refers to the proportion of all real stars in the field of view that are correctly recognized by the algorithm, which can be expressed as
Recall = \frac{TP}{TP + FN}
where FN (False Negative) refers to the number of unidentified stars.
In the definition of the recall rate, the numerator is the intersection of the extraction results and the regression results, and the denominator is the number of stars calculated from the star catalog. The recall rate indicates how many of the real stars in the field of view are successfully identified by the method: the higher the recall rate, the more of the extracted results are identified. The comparison of recall rates between the two methods is shown in Figure 21.
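For reference, the two indexes can be computed from the identified and regression-verified star ID sets as sketched below (illustrative names, not the authors' code):

```python
def accuracy_and_recall(identified_ids, true_ids):
    """Identification accuracy = TP / (TP + FP) and recall = TP / (TP + FN),
    given the star IDs returned by the algorithm and the true star IDs from
    the regression (inverse) verification."""
    identified, true = set(identified_ids), set(true_ids)
    tp = len(identified & true)   # correctly identified stars
    fp = len(identified - true)   # false identifications
    fn = len(true - identified)   # real stars that were missed
    accuracy = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, recall
```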
In the evaluation of time, because the program is mainly used for the verification of principle and not for the optimization of engineering, we have a rough estimate of the operation time and obtain the relative relationship between the two methods, as shown in Figure 22.
From the above results, we can see that the proposed method in this paper has great advantages in terms of identification accuracy, recall rate, and calculation time, especially in identification accuracy. The identification accuracy of the proposed method in this paper has a 99.6% probability of reaching 1 (compared with 59.2% of the previous method), and the recall rate has an almost 100% probability of reaching 1 (compared with 66.2% of the previous method). The k-vector method may cause identification errors mainly because the number of stars used in the 4-star search is limited, and the setting of the threshold makes it possible to identify the results as other false stars. Usually, secondary identification and verification are used to avoid this problem. The proposed method in this paper uses all the information of stars at one time in the identification process, so the identification accuracy is high and the identification results are reliable.
It can be seen from the above figure that some star points are extracted in the star image extraction process, but they are not identified in the star identification process. There may be two reasons for this, as mentioned in the above discussions. On the one hand, it may be that the stars themselves are not real stars, but noise. The second reason is that there are no corresponding stars in the star catalog, so they cannot be identified. In the second case, if we really want to identify these previously unidentifiable star points, we will inevitably need to expand our star catalog. In this process, the advantages of our algorithm are shown. When expanding the star catalog, the contents that need to be modified are limited to the information of several stars in the related FOV in T1, while other parts remain unchanged. This has been explained in Table 3. After the expansion of the star catalog, the originally unidentifiable star points are successfully identified, which is important for the final accuracy of the attitude determination.

5. Conclusions

This research presents a star identification method based on spatial projection and a reconfigurable navigation catalog. Compared with a traditional method based on initial identification and re-identification using angular distance, this method identifies all the stars in the FOV at one time. Compared with the traditional pattern recognition method, the accurate spatial relationship between stars is also considered in the recognition.
The advantages of the method have been verified in the experiments and discussions. (1) Improved anti-interference performance under stray light. A series of star images with serious stray light was studied experimentally, and the stars were effectively identified, indicating the good anti-noise ability of the algorithm. (2) Star catalog reconstruction for specific areas is easy to implement. The paper presents a star catalog expansion and reconstruction method. In previous methods, if a star is added, the angular distances between this star and all other stars need to be recalculated and reordered to generate a completely new search table, which is complex. The proposed method only needs to modify the search table in the rows of several adjacent stars, so it is very convenient to modify the density of stars in a certain area. This has practical significance for future applications. (3) Synchronous identification and attitude calculation. The method completes star image recognition and attitude calculation at the same time, which is different from conventional methods.
In conclusion, the method suits missions requiring high-precision attitude determination in variable star fields. Future work will explore more efficient methods of noise suppression to minimize false extraction of star points. The development of an on-board real-time processing version is expected as well.

Author Contributions

Conceptualization, S.W. and T.S.; methodology, S.W. and T.S.; software, S.W., T.S. and H.L.; validation, F.X.; investigation, S.W., T.S. and J.S.; resources, T.S., F.X. and S.Y.; data curation, S.W. and T.S.; writing—original draft preparation, S.W., T.S. and F.X.; writing—review and editing, H.L., J.S. and S.Y.; visualization, S.W.; supervision, F.X.; project administration, S.Y.; funding acquisition, T.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant No. 62375022).

Data Availability Statement

The synthetic data supporting the results of this study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors thank Beijing Information Science and Technology University and Tsinghua University for providing real data to fully validate the performance of the algorithm, and we thank TY-Space Technology (Beijing) Ltd. for providing NST4S-A2 star tracker for our experiments. We are also very grateful to all the editors and reviewers for their hard work on this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Spratling, B.B., IV; Mortari, D. A survey on star identification algorithms. Algorithms 2009, 2, 93–107. [Google Scholar] [CrossRef]
  2. Sun, T.; Xing, F.; Bao, J.; Zhan, H.; Han, Y.; Wang, G.; Fu, S. Centroid determination based on energy flow information for moving dim point targets. Acta Astronaut. 2022, 192, 424–433. [Google Scholar] [CrossRef]
  3. Zhan, H.; Xing, F.; Bao, J.; Sun, T.; Chen, Z.; You, Z.; Yuan, L. Analyzing the effect of the intra-pixel position of small PSFs for optimizing the PL of optical subpixel localization. Engineering 2023, 27, 140–149. [Google Scholar] [CrossRef]
  4. Padgett, C.; Kreutz-Delgado, K.; Udomkesmalee, S. Evaluation of star identification techniques. J. Guid. Control Dyn. 1997, 20, 259–267. [Google Scholar] [CrossRef]
  5. Scholl, M. Star field identification algorithm. Opt. Lett. 1993, 18, 399–401. [Google Scholar] [CrossRef]
  6. Scholl, M. Experimental demonstration of a star field identification algorithm. Opt. Lett. 1993, 18, 402–404. [Google Scholar] [CrossRef]
  7. Scholl, M. Star field identification for autonomous attitude determination. J. Guid. Control Dyn. 1995, 18, 61–65. [Google Scholar] [CrossRef]
  8. Liebe, C.C. Pattern recognition of star constellations for spacecraft applications. IEEE Aerosp. Electron. Syst. Mag. 1992, 7, 34–41. [Google Scholar] [CrossRef]
  9. Kosik, J.C. Star pattern identification aboard an inertially stabilized aircraft. J. Guid. Control Dyn. 1991, 14, 230–235. [Google Scholar] [CrossRef]
  10. DeAntonio, L.; Udomkesmalee, S.; Alexander, J.; Blue, R.; Dennison, E.; Sevaston, G.; Scholl, M. Star-tracker based, all-sky, autonomous attitude determination. SPIE Proc. Space Guid. Control Track. 1993, 1949, 204–215. [Google Scholar]
  11. Mortari, D.; Samaan, M.A.; Bruccoleri, C.; Junkins, J.L. The pyramid star identification technique. Navigation 2004, 51, 171–183. [Google Scholar] [CrossRef]
  12. Liu, H.; Wei, X.; Li, J.; Wang, G. A star identification algorithm based on simplest general subgraph. Acta Astronaut. 2021, 183, 11–22. [Google Scholar] [CrossRef]
  13. Sun, Q.; Niu, Z.; Li, Y.; Wang, Z. A Robust High-Accuracy Star Map Matching Algorithm for Dense Star Scenes. Remote Sens. 2024, 16, 2035. [Google Scholar] [CrossRef]
  14. He, X.; Zhang, L.; He, J.; Mu, Z.; Lv, Z.; Wang, J. A Voting-Based Star Identification Algorithm Using a Partitioned Star Catalog. Appl. Sci. 2025, 15, 397. [Google Scholar] [CrossRef]
  15. Spratling, B.B.; Mortari, D. The K-Vector ND and its Application to Building a Non-Dimensional Star Identification Catalog. J. Astronaut. Sci. 2011, 58, 261–274. [Google Scholar] [CrossRef]
  16. Gerhard, J. A Geometric Hashing Technique for Star Pattern Recognition. Master’s Thesis, West Virginia University, Morgantown, WV, USA, 2016. [Google Scholar]
  17. Zhao, Y.; Wei, X.; Li, J.; Wang, G. Star identification algorithm based on K–L transformation and star walk formation. IEEE Sens. J. 2016, 16, 5202–5210. [Google Scholar] [CrossRef]
  18. Padgett, C.; Kreutz-Delgado, K. A grid algorithm for autonomous star identification. IEEE Trans. Aerosp. Electron. Syst. 1997, 33, 202–213. [Google Scholar] [CrossRef]
  19. Lee, H.; Bang, H. Star Pattern Identification Technique by Modified Grid Algorithm. IEEE Trans. Aerosp. Electron. Syst. 2007, 43, 1112–1116. [Google Scholar]
  20. Na, M.; Zheng, D.; Jia, P. Modified grid algorithm for noisy all-sky autonomous star identification. IEEE Trans. Aerosp. Electron. Syst. 2009, 45, 516–522. [Google Scholar] [CrossRef]
  21. Wang, T.; Wang, G.; Wei, X.; Li, Y. A star identification algorithm for rolling shutter exposure based on Hough transform. Chin. J. Aeronaut. 2024, 37, 319–330. [Google Scholar] [CrossRef]
  22. Silani, E.; Lovera, M. Star identification algorithms: Novel approach & comparison study. IEEE Trans. Aerosp. Electron. Syst. 2006, 42, 1275–1288. [Google Scholar]
  23. Yoon, H.; Lim, Y.; Bang, H. New star-pattern identification using a correlation approach for spacecraft attitude determination. J. Spacecraft Satell. 2011, 48, 182–186. [Google Scholar] [CrossRef]
  24. Sun, L.; Jiang, J.; Zhang, G.; Wei, X. A discrete HMM-based feature sequence model approach for star identification. IEEE Sens. J. 2016, 16, 931–940. [Google Scholar] [CrossRef]
  25. Wei, X.; Wen, D.; Song, Z.; Xi, J. Star identification algorithm based on oriented singular value feature and reliability evaluation method. Int. J. Aeronaut. Space Sci. 2019, 62, 265–274. [Google Scholar] [CrossRef]
  26. Kim, K. A New Star Identification Using Patterns in the Form of Gaussian Mixture Models. Adv. Space Res. 2024, 74, 319–331. [Google Scholar] [CrossRef]
  27. Zhang, G.; Wei, X.; Jiang, J. Full-sky autonomous star identification based on radial and cyclic features of star pattern. Image Vision Comput. 2008, 26, 891–897. [Google Scholar] [CrossRef]
  28. Wei, X.; Zhang, G.; Jiang, J. Star identification algorithm based on log-polar transform. J. Aerosp. Comput. Inf. Commun. 2009, 6, 483–490. [Google Scholar] [CrossRef]
  29. Fang, Y.; Jiang, J.; Zou, Y. A star identification algorithm based on geometric verification. In Proceedings of the 2020 2nd International Conference on Robotics, Intelligent Control and Artificial Intelligence, Shanghai, China, 17–19 October 2020. [Google Scholar] [CrossRef]
  30. Dai, Y.; Shi, C.; Ben, L.; Zhu, H.; Zhang, R.; Wu, S.; Shan, S.; Xu, Y.; Zhou, W. Star Identification Algorithm Based on Dynamic Distance Ratio Matching. Remote Sens. 2025, 17, 62. [Google Scholar] [CrossRef]
  31. Crain, T.; Bishop, R.; Ely, T. Event detection and identification during autonomous interplanetary navigation. AIAA J. 2000, 38, 62–69. [Google Scholar]
  32. Quan, W.; Fang, J. A star recognition method based on the adaptive ant colony algorithm for star sensors. Sensors 2010, 10, 1955–1966. [Google Scholar] [CrossRef]
  33. Ma, W.; Zhang, J.; Wu, Y.; Jiao, L.; Zhu, H.; Zhao, W. A Novel Two-Step Registration Method for Remote Sensing Images Based on Deep and Local Features. Remote Sens. 2019, 57, 4834–4843. [Google Scholar] [CrossRef]
  34. Li, Y.; Niu, Z.; Sun, Q.; Xiao, H.; Li, H. BSC-Net: Background Suppression Algorithm for Stray Lights in Star Images. Remote Sens. 2022, 14, 4852. [Google Scholar] [CrossRef]
  35. Liu, L.; Niu, Z.; Li, Y.; Sun, Q. Multi-Level Convolutional Network for Ground-Based Star Image Enhancement. Remote Sens. 2023, 15, 3292. [Google Scholar] [CrossRef]
  36. Xing, F.; You, Z.; Dong, Y. A rapid star identification algorithm based-on navigation star domain and K vector. J. Astronaut. 2010, 31, 2302–2308. [Google Scholar]
  37. Sun, Q.; Liu, L.; Niu, Z.; Li, Y.; Zhang, J.; Wang, Z. A Practical Star Image Registration Algorithm Using Radial Module and Rotation Angle Features. Remote Sens. 2023, 15, 5146. [Google Scholar] [CrossRef]
Figure 1. A flowchart of the method.
Figure 2. Calculation steps and details of the algorithm.
Figure 3. Digital string BP calculation of the star image processing branch. The numbers represent the distance from the division radius range to the center.
Figure 4. Coordinate transformation from the Cartesian coordinate system to the polar coordinate system for star spots on the image plane.
Figure 5. The calculation of the coordinates of imaging star spots in the celestial coordinate system.
Figure 6. Plane fitting result based on the three-dimensional coordinates of the point cluster.
Figure 7. Position coordinates are replaced by polar coordinates upon applying the transformation matrix.
Figure 8. Polar coordinate results of star imaging processing and star catalog processing. The blue asterisks represent star imaging processing, and the red circles represent star catalog processing.
Figure 9. Polar coordinate results of star imaging processing and star catalog processing after considering the angular rotation. The blue asterisks represent star imaging processing, and the red circles represent star catalog processing.
Figure 10. The identification result diagram with star serial numbers. The blue asterisks represent star imaging processing, the red circles represent star catalog processing, and the green numbers are the serial numbers in the star catalog.
Figure 11. The star image with successful star identification results.
Figure 12. Theoretical positions by inverse calculation of gRT, star catalog, and the parameters.
Figure 13. A comparison between the identified star spots and the inverse calculated star spots.
Figure 14. The identification results of the proposed algorithm under the influence of different types of stray light. (a,c,e) are the results under the polar coordinate system under the interference of sunlight, cloud cover and moonlight, where blue asterisks are extracted as star point positions, red circles are the result of the star catalog processing branch, and green numbers represent the identified results with Flag 1. (b,d,f) are the identification results in the two-dimensional star image, and green asterisks and numbers represent the positions and identified star serial number.
Figure 15. The identification results of the proposed algorithm under the influence of different exposure times. (a,c) are the results in the polar coordinate system under the exposure times of 20 ms and 40 ms, where blue asterisks are extracted as star point positions, red circles are the result of the star catalog processing branch, and green numbers represent the identified results. (b,d) are the identification results in the two-dimensional star image, and green asterisks and numbers represent the positions and the identified star serial number.
Figure 16. A flowchart of the star magnitude adaptive adjustment process.
Figure 17. The star image processing branch with added stars. The red asterisks represent the newly added stars in the star catalog.
Figure 18. A comparison of identification results before and after the star catalog reconstruction. The green asterisks represent the positions of extracted star points; the red circles represent the positions and star serial numbers of the corresponding inverse verification results; red circles and numbers are the star point information that is not identified in (a) but is identified in (b) after the star catalog reconstruction.
Figure 19. (a) is the appearance of the star tracker; (b) is the real night sky experiment set.
Figure 20. Identification accuracy result comparison.
Figure 21. Recall rate result comparison.
Figure 22. A relative relationship of the calculation time between two methods (the ratio of Xing’s method to the proposed method).
Table 1. Parameters of the star tracker in experiments.
Parameters | Value
Focal length (mm) | 49.74
Position of principal point (mm) | (7.583, 7.584)
Image plane size (pixels × pixels) | 1024 × 1024
FOV (deg) | 17 × 17
Table 2. Original navigation star catalog.
Serial Number of the Main Star in the Star Catalog | X in the Celestial Coordinate System | Y in the Celestial Coordinate System | Z in the Celestial Coordinate System | Magnitude
1 | 0.013361 | −0.012375 | −0.999834 | 5.32
2 | 0.009891 | −0.040586 | −0.999127 | 4.73
… | … | … | … | …
4000 | 0.010127 | 0.010127 | 0.010127 | 1.73
Table 3. Built and stored search table T1 of the star catalog.
Serial Number of the Main Star in the Star Catalog | Number of Stars Around the Main Star | Serial Number of Selected Neighboring Star 1 in the Catalog | Serial Number of Selected Neighboring Star 2 in the Catalog | Serial Number of Selected Neighboring Star 3 in the Catalog | …
1 | 43 | 2 | 3 | 4 | …
2 | 56 | 1 | 3 | 4 | …
… | … | … | … | … | …
4000 | 39 | 3961 | 3962 | 3963 | …
Table 4. The number of stars within a radius range.
Quantization Separated by Radius | Number of Stars Within the Radius Range
1 | 0
2 | 0
3 | 1
4 | 2
… | …
N | 2
Table 5. A comparison example between BP and BS.
Quantization Separated by Radius | BP | BS (Candidate Main Star 1) | BS (Candidate Main Star 2) | BS (Candidate Main Star 3)
1 | 0 | 0 | 1 | 1
2 | 0 | 1 | 0 | 0
3 | 1 | 1 | 0 | 0
4 | 2 | 2 | 1 | 0
… | … | … | … | …
N | 2 | 3 | 0 | 1
Similarity | | high | low | low
Table 6. A comparison between polar coordinates.
MarkPos R_1 (pixel) | MarkPos θ_1 (°) | MarkSelect R_2 (pixel) | MarkSelect θ_2 (°) | Result of Subtraction θ_Mark (°) | Flag | Identified Star Serial Number
478.669 | 54.852 | 480.782 | 28.280 | 26.57 | 1 | 2643
478.669 | 54.852 | 473.539 | 29.835 | 25.02 | 0 | /
478.669 | 54.852 | 479.881 | 43.264 | 11.59 | 0 | /
191.941 | 55.585 | 191.827 | 29.085 | 26.50 | 1 | 2811
371.228 | 117.583 | 372.755 | 90.968 | 26.61 | 1 | 2784
374.100 | 120.584 | 372.755 | 90.968 | 29.62 | 0 | /
374.100 | 120.584 | 377.643 | 215.005 | −94.42 | 0 | /
442.825 | 131.990 | 444.105 | 105.361 | 26.63 | 1 | 2817
Table 7. Parameters of the star tracker in experiments.
Parameters | Value
Focal length (mm) | 18.50
Position of principal point (mm) | (3.101, 2.496)
Image plane size (pixels × pixels) | 1040 × 1292
FOV (deg) | 15 × 19
Table 8. A schematic process of star catalog reconstruction.
Serial Number of the Main Star in the Star Catalog | Number of Stars Around the Main Star | Serial Numbers of Selected Neighboring Stars in the Catalog
1 | 43 (44) | 2, 3, 4, …, 4001
2 | 56 (58) | 1, 3, 4, …, 4001, 4002
3 | 57 (59) | 1, 2, 6, …, 4001, 4002
… | … | …
4000 | 39 | 3961, 3962, 3963, …, 3998, 3999
4001 | 47 | 1, 2, 3, …, 4002, 4003
4002 | 63 | 2, 3, 6, …, 4001, 4003