Article

Active Decision Support System for Observation Scheduling Based on Image Analysis at the BOROWIEC SLR Station

by
Tomasz Suchodolski
Centrum Badań Kosmicznych Polskiej Akademii Nauk (CBK PAN), Bartycka 18A, 00-716 Warszawa, Poland
Sensors 2022, 22(20), 8040; https://doi.org/10.3390/s22208040
Submission received: 13 September 2022 / Revised: 16 October 2022 / Accepted: 18 October 2022 / Published: 21 October 2022
(This article belongs to the Section Intelligent Sensors)

Abstract

The dynamic exploration of orbits in the LEO-to-GEO region for telecommunication services, science, industry and defense requires monitoring of the trajectories of orbital objects for the safety of spacecraft traffic and, in the case of deorbiting, for the safety of ground infrastructure. First of all, trajectory monitoring is needed to avoid collisions, as well as to calibrate satellite on-board devices. This is mainly carried out by radar measurements, passive optical acquisition and active laser measurements. The number of orbital objects is increasing rapidly, while the number of tracking stations remains relatively small. This leads to a situation in which each tracking station must select which objects will be subject to the measurement task. In the case of Satellite Laser Ranging (SLR) or a passive optical set-up, weather conditions are an important factor in enabling measurement of the orbital object trajectory. This paper presents an innovative observation scheduling support system based on the analysis of images obtained from an Allsky camera. Information on the degree of cloud cover and the position of the Sun/Moon, combined with graphical projections of the ephemeris trajectories of orbital objects, allows the measurement efficiency to be increased. The presented solution is part of a larger set of improvements carried out by the author, which lead to the upgrade of SLR stations in terms of new technologies and safety of use.

1. Introduction

The issue of laser ranging to artificial satellites dates back to 1964, when the first measurement of the distance between an SLR station and the Beacon Explorer-B satellite was made [1]. Historically, laser stations were used to measure the trajectories of geodynamic satellites [2,3], scientific platforms and robotic rovers located on the surface of the Moon [4,5], as well as to determine the positions of the first navigation constellations [6,7]. As technology has advanced, laser measurements have been used to calibrate altimetric satellites, whose correct operation depends on accurate information about their location [8,9]. SLR measurements contribute to research on the implementation and definition of the International Terrestrial Reference Frame (ITRF), pole movement, the coordinates of the geocenter, the Earth's gravitational field and the coordinate and velocity changes of the SLR station positions, and they allow the positions of objects orbiting Earth to be determined with centimeter accuracy [10,11]. The greatest advantage of the laser technique is the direct and absolute measurement of the range between the SLR observatory and a given orbital object (satellites, rocket bodies and space debris).
Over the years, the growing number of satellites launched into orbit has led to a situation in which there is more space debris in orbit than active, operating satellites. According to the European Space Agency Space Debris Office (ESA SDO) report [12], there is a sharp increase in the number of orbital objects, resulting in an avalanche of debris objects accompanying the main space missions. Among them, we can distinguish rocket bodies, payload fragmentation debris and rocket fragmentation debris (Table 1). Although these elements are irrelevant from the point of view of the orbital mission, they pose a high risk of possible collisions. An important factor is that such elements are uncontrollable, because they do not have mechanisms that would allow their orbit trajectory to be changed. Therefore, when a collision becomes probable, it is not possible to correct their orbits. Moreover, not all operational satellites have the possibility to correct their trajectory; it should be mentioned that small or inactive satellites do not have such maneuvering mechanisms. This makes it necessary to also track such objects to prevent possible collisions. In connection with the above, there is also a need to determine the trajectories of objects such as space debris or defunct satellites.
Unfortunately, accurate information about the trajectory of an orbital object is not always sufficient. In the case of rocket bodies, information about the orientation of such elements is also important. It may turn out that knowing the exact trajectory of the rocket segment is insufficient, since the uncertainty of its orientation may lead to a collision. The difference between the diameter of a rocket segment (e.g., 3 m) and its length (e.g., 8 m) and its specific orientation in space (along or perpendicular) may result in erroneous assumptions for the collision probability calculations. Another important effect is the possibility of object rotation. The rotation effect can lead to the fragmentation of the orbital object and the creation of another pool of debris objects. All these uncertainties can be reduced by laser ranging to an orbital object. Short laser pulses (ps to ns) enable accurate measurement of the object’s position, and with in-depth data analysis they also provide information about the object’s orientation [13] or its rotation (spin) [14]. Therefore, the use of laser technology in recent years has changed the pool of objects that need to be tracked.
Most of the existing SLR stations support only scientific missions and navigation constellations. Of the 40 operational stations, only a small number are able to successfully track objects such as space debris. Due to the increasing number of scientific missions, the list of objects for laser tracking exceeds 100 objects per station. In the case of low Earth orbit (LEO) objects, multiple passes of a satellite over a station often occur, which increases the observation load of the given station. Therefore, the way the observation planning process is carried out is important in order to maximize the operational capacity of the SLR station. While a priori ephemeris analysis of the passes over a given SLR station is possible, the weather conditions for the next few hours are not known at that stage. It often happens that partial cloud cover makes it impossible to observe a given object. It follows that the planning system needs functionality that takes the current weather conditions into account. This paper presents an innovative decision support module in the SLR observation planning system, based on the postprocessing and analysis of images taken from a fish-eye camera and the available ephemerides of orbital objects. This solution is part of a series of improvements made by the author, described partially in [15].

2. Satellite Laser Ranging Process

Laser ranging to an orbital object is carried out using a laser source with a short pulse width (at the ps or ns level) and a specified repetition rate (at the Hz or kHz level). The laser beam is transmitted towards the theoretical orbital position of the object ($\kappa$) in order to obtain signal echoes ($E_x(t)$). The time interval $\Delta t_p$ between the moment the laser pulse is sent ($t_{START}$) and the moment the photon event scattered from the object is recorded in the telescope's photon detector ($t_{STOP}$) determines the two-way distance to the object in the time domain (Figure 1).
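To make this relation concrete, the following minimal Python sketch converts a measured two-way flight time into a one-way range; the values are illustrative, and corrections such as atmospheric refraction or calibration delays, which a real SLR processing chain applies, are deliberately omitted.

```python
# Minimal sketch: one-way range from the two-way flight time dt_p = t_STOP - t_START.
# Refraction, calibration delays and relativistic corrections are omitted here.

C = 299_792_458.0  # speed of light in vacuum [m/s]

def range_from_flight_time(t_start: float, t_stop: float) -> float:
    """Return the one-way station-to-object range in kilometres (epochs in seconds)."""
    dt_p = t_stop - t_start            # two-way flight time [s]
    return C * dt_p / 2.0 / 1000.0     # half the light-travel distance, in km

# Example: a flight time of ~5.3 ms corresponds to a LEO-regime range of ~795 km.
print(f"{range_from_flight_time(0.0, 5.3e-3):.1f} km")
```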
Laser measurements are made for satellites, rocket bodies and space debris from the regions from LEO (low Earth orbit) to GEO (geostationary orbit), so the distance between the ground station and the orbital object ranges mainly from about 350 km to 42,000 km. The tracked objects differ in terms of their purpose, so they also differ in the shape and materials from which they are made. This determines the method of their observation and the selection of laser sources to obtain effective results.

2.1. Types and Characteristics of Tracked Orbital Objects

The orbital objects tracked by SLR stations can be broken down into two main categories: cooperative and uncooperative targets. Cooperative targets, mainly active satellites, are equipped with systems of cubic prisms (retroreflectors) mounted on the outer casing, whose task is the low-loss reflection of the laser beam emitted from the SLR station. Uncooperative targets are therefore all other objects that are not equipped with cubic prisms (satellites, rocket bodies and space debris), as well as decommissioned satellites that carry retroreflectors but have lost the ability to maintain spatial orientation (an incorrect orientation makes it impossible to reflect the beam back towards the station). While ranging to cooperative targets does not require highly energetic laser beams thanks to the mounted retroreflectors, uncooperative objects require high-energy laser sources to obtain echoes scattered from the orbital object [16,17,18]. This means that the observation of such different objects requires the use of two different laser sources. It is also dictated by the safety of instruments located on scientific satellites (e.g., Earth observation). In a simple approximation, high-energy lasers emit radiation only towards inactive targets. The use of two different laser sources for the observation of two different types of objects must therefore be taken into account in the observation planning process.
Another key element in the observation planning process is how the reflection characteristics of cooperative and uncooperative targets change with increasing distance, depending on the type of orbit. The Borowiec SLR station (LASBOR) tracks cooperative and uncooperative targets in the LEO regime and cooperative targets up to the MEO region (medium Earth orbit). For example, Figure 2 shows the pass of a typical cooperative object (the LARETS geodetic satellite) from the LEO region over the LASBOR station. As can be seen, the range varies from 710 km to 2130 km depending on the pass elevation angle.
Objects equipped with retroreflectors do not perturb the ranging process in terms of the required laser pulse energy level. However, for uncooperative objects such as rocket bodies, such a change in distance may make it impossible to carry out the measurement due to the lack of signal echoes. For the LASBOR station, the resulting range limit for uncooperative targets is 1000 km for RCS = 10 (radar cross-section). Effects related to light refraction are another issue, but they are not the subject of this paper. Analyzing the above limits, it can be concluded that these limitations must be taken into account in the process of observation scheduling (Figure 3).
On the basis of subsequent measurements, elevation angle limits were established for the LEO region: 20 degrees above the horizon for cooperative objects and 30 degrees for uncooperative targets.
Laser ranging to MEO targets has a different specificity. The pass of the orbital object over the station takes a relatively long time (tens of minutes to hours). Therefore, unlike ranging to LEO objects, whose passes last a few minutes, observations of MEO objects can be split into several slots overnight. Another aspect is that, in the MEO region, ranging is possible only to cooperative targets. Figure 4 shows the change in the distance and elevation angle during the COSMOS 2475 tracking process. Due to the high orbital altitude, the distance to the object changes less in percentage terms than it does for LEO objects (19,276 km–23,553 km).
An important aspect is that a low observation elevation angle of the orbital object implies a decrease in measurement efficiency due to the (nadir-facing) location of the retroreflector panel and its acceptance cone. Taking into account the above facts, the process of scheduling the observation of MEO objects can be limited to passes starting from 50 degrees above the horizon. The advantage of such a solution is the release of observation slots for relatively short observations of objects from the LEO region, as shown in Figure 5.
Determining the boundary conditions during the observation planning process increases the number of observation slots, taking into account the orbital region (LEO–MEO) and the type of tracked object (cooperative or uncooperative). Such a priori scheduling enables an orderly process of observation of orbital objects. The only unknown variable is the weather conditions that will occur when the observations are made. Full cloud cover or rainfall makes the measuring process completely impossible, while in the case of partial cloud cover it is possible to select objects for which tracking can still be performed.
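As an illustration of how these boundary conditions could be encoded, the sketch below applies the elevation limits quoted above (20° for cooperative and 30° for uncooperative LEO targets, 50° for MEO passes, which are limited to cooperative targets); the function and argument names are assumptions and do not come from the station's planning software.

```python
# Illustrative sketch of the a priori boundary conditions described above:
# LEO cooperative passes above 20 deg, LEO uncooperative above 30 deg,
# MEO passes (cooperative targets only) above 50 deg.

def pass_is_schedulable(region: str, cooperative: bool, max_elevation_deg: float) -> bool:
    if region == "LEO":
        return max_elevation_deg >= (20.0 if cooperative else 30.0)
    if region == "MEO":
        return cooperative and max_elevation_deg >= 50.0
    return False  # other regions are outside the scope of this sketch

# Example: a rocket body culminating at 25 deg is rejected, a cooperative
# LEO target culminating at 65 deg is accepted.
print(pass_is_schedulable("LEO", cooperative=False, max_elevation_deg=25))  # False
print(pass_is_schedulable("LEO", cooperative=True, max_elevation_deg=65))   # True
```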

2.2. Weather Conditions and Operational Time

In the present configuration, the SLR LASBOR tracks orbital objects within the nautical night. This implies a variability in the number of daily observing hours over the year. Weather conditions, which strongly depend on the season in Europe, are another aspect. Unfortunately, during the longest nautical nights rainfall and full cloud cover make it difficult to track objects effectively. Figure 6 shows the length of the nautical night over the year and the percentage of weather conditions suitable for tracking operation.
As shown, the winter season is characterized by a high degree of cloud cover, which implies a relatively low number of observation hours at night (e.g., in January, 4.13 cloudless hours of the statistical night corresponds to an observation rate of 31%). The months from March to September are characterized by the best weather conditions and observation results despite the shorter nights. In connection with the above, there is a need to optimize the planning process in the remaining months. It often happens that only part of the sky is covered with clouds; this fact should also be taken into account while observations are being made in real time.

2.3. Observation Load

Currently, the list of objects tracked by the SLR LASBOR station contains over 200 orbital objects. It can be assumed that half of the objects are cooperative targets (LEO–MEO); the rest are rocket bodies from the LEO region. This results in the generation of extensive lists of scheduled measurements. For example, for the night of 2021/12/15, the scheduled observation list included 249 passes. As shown in Table 2, scheduling 200 tracked objects from different orbital regions causes overlapping passes. This situation occurs despite the introduction of boundary conditions for the observation of a given type of object at a given orbital altitude, as described in Section 2.1.
Therefore, selecting a specific object for the tracking process is a difficult issue [19]. Initially, the selection may be based on the time of the pass over the station and knowledge of the object's reflection characteristics, as well as the time elapsed since the last measurement. At this stage, the operator of the SLR system has no knowledge of the weather conditions described in Section 2.2.

3. Active Decision Support System for Observation Scheduling and Observation Safety

Appropriate planning of the observations (Section 2.3), taking into account the specificity of orbital objects (Section 2.1), enables the operator to select objects and time slots that generate the highest measurement efficiency in terms of the time length of the pass over the SLR station (Table 2). The only unknown for the system operator is the current weather conditions (Section 2.2). For night time operations, a simple Allsky camera combined with the low light pollution index at the station location does not by itself bring any benefit (e.g., cloud fractions are not illuminated). Short exposure times do not show cloud fractions or stars with low brightness, while exposure times that are too long introduce noise and highlight bright stars despite light cloud cover. Regardless of the above, the operator's analysis of such an image in an illuminated room is a demanding task that requires attention. This situation forced the development of an image postprocessing system with which the operator can clearly and easily assess the weather conditions above the station. Additionally, the existing ephemerides of the objects can be projected onto the image acquired from the Allsky camera in the form of object trajectories.

3.1. Data Structures

Due to the large number of tracked objects, the catalog of objects ($sat_{cat}$), ephemeris information ($sat_{ephe}$), the acquired measurement data ($sat_{meas}$) and other derived data ($sat_x$) necessary to carry out the process are stored in an SQL database environment, the processing of which is thus ordered and carried out in accordance with the following scheme:
$$\{(sat_{cat}, sat_{ephe}, sat_{plan}, sat_{meas}, \ldots, sat_n) \mid q_1, \ldots, q_k\},$$
where $sat_{*}$ are attribute relations and $q_1, \ldots, q_k$ are relational algebra operations of the laser management system. This method of data organization, a fragment of which is shown in Figure 7, also allows for the subsequent analysis of the collected information, which can be used, for example, to generate the parameters of new orbits or to characterize given objects.
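As a purely illustrative example of how one such selection $q_k$ might be issued from Python against a MySQL back end (the paper mentions MySQL as the database environment), the snippet below joins a hypothetical object catalog with its ephemeris entries; the table and column names are assumptions and do not reflect the station's actual schema.

```python
# Hypothetical example of one selection q_k over the scheduling-chain relations;
# table/column names are illustrative, not the station's real schema.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="slr", password="***", database="lasbor"  # placeholder credentials
)
cur = conn.cursor()
cur.execute(
    """
    SELECT c.norad_id, c.name, e.epoch, e.x, e.y, e.z
    FROM sat_cat AS c
    JOIN sat_ephe AS e ON e.norad_id = c.norad_id
    WHERE e.epoch BETWEEN %s AND %s
    ORDER BY c.norad_id, e.epoch
    """,
    ("2021-12-15 16:00:00", "2021-12-16 06:00:00"),
)
rows = cur.fetchall()  # (norad_id, name, epoch, X, Y, Z) tuples for the coming night
conn.close()
```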
The process of tracking an orbital object $\kappa \in sat_{cat}$ in the time domain ($t$) requires the conversion of the ephemeris information $sat_{ephe}(\kappa)$ about the theoretical spatial position $X_\kappa, Y_\kappa, Z_\kappa$ of the orbital object into the azimuth $\theta_\kappa(t)$ and elevation $\phi_\kappa(t)$ angles that enable the follow-up movement of the telescope mount behind this object (Figure 8).
$$sat_{ephe}(\kappa) \rightarrow X_\kappa(t), Y_\kappa(t), Z_\kappa(t) \rightarrow \theta_\kappa(t), \phi_\kappa(t).$$
The role of the management software is, inter alia, to process the information about the theoretical position of the orbital object $\kappa \in sat_{cat}$ in the time domain ($t$) into control data vectors $V_{Az/El} = [v, a, \theta, \phi, \ldots]$ for the drives of the Az/El orthogonal axes of the telescope mount and to control this process. By obtaining information about the azimuth and elevation angles in the time domain, it is possible to visualize the object's trajectory by comparing these data with the weather data.
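A minimal sketch of this coordinate conversion is given below, assuming the ephemeris position is expressed in an Earth-fixed (ECEF) frame and neglecting light-time and refraction corrections; the function name and the requirement to supply the station position are illustrative and are not taken from the LASBOR control software.

```python
# Sketch: convert a geocentric (ECEF) position X, Y, Z of object kappa into
# topocentric azimuth theta and elevation phi for a station whose ECEF position
# and geodetic latitude/longitude are given. Corrections are intentionally omitted.
import numpy as np

def ecef_to_az_el(obj_ecef_m, station_ecef_m, station_lat_rad, station_lon_rad):
    """Return (azimuth, elevation) in degrees for an ECEF target position."""
    dx = np.asarray(obj_ecef_m, dtype=float) - np.asarray(station_ecef_m, dtype=float)
    sl, cl = np.sin(station_lat_rad), np.cos(station_lat_rad)
    so, co = np.sin(station_lon_rad), np.cos(station_lon_rad)
    # Rotate the ECEF line-of-sight vector into the local East-North-Up frame.
    east  = -so * dx[0] + co * dx[1]
    north = -sl * co * dx[0] - sl * so * dx[1] + cl * dx[2]
    up    =  cl * co * dx[0] + cl * so * dx[1] + sl * dx[2]
    azimuth = np.degrees(np.arctan2(east, north)) % 360.0          # theta_kappa(t)
    elevation = np.degrees(np.arctan2(up, np.hypot(east, north)))  # phi_kappa(t)
    return azimuth, elevation
```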

3.2. Image Processing

The assumption was to use publicly available software for the analysis and processing of the optical data obtained from the Allsky camera. While the entire station management software was created in C# in conjunction with MySQL, the external support software uses different environments. For processing the data in the form of images obtained from the Allsky camera, it was decided to use OpenCV with Python scripts. The OpenCV library provides an extensive set of image processing functions that meets the needs of this solution. The proposed implementation does not require real-time processing, but the available functions shortened the time needed to create the active decision support system.
The aim of the task is to determine the degree of cloud cover during the day and at night. Therefore, a different course of action has to be implemented during the day time, with sunlight, and a different one during the night time, which is strictly dependent on the light pollution. Simple operations were chosen to construct the image processing algorithm, which simplifies calibration and modification of the actions taken in changing weather conditions. For operations carried out during the day, the algorithm was implemented according to Figure 9. Due to the fact that most of the available cameras have detectors with a different number of pixels in the $X:Y$ axes (e.g., 1944 px:2592 px), the first operation is to trim these axes to equal values ($\bar{X}:\bar{Y}$, $\bar{X} \equiv \bar{Y}$). In order to remove high-frequency content in the form of noise or edges, Gaussian blurring $G(x)$ is used. The purpose of the day time operations is not to detect shapes but only changes in background brightness. The image prepared in this way can be binarized by specifying the activation threshold $F_h(x)$. The image is thus divided into areas with two degrees of brightness, understood as overcast or the lack thereof. Finally, as the upper graphical layer, ephemeris data are added in the form of the satellite pass trajectory built from the azimuth $\theta_\kappa(t)$ and elevation $\phi_\kappa(t)$ angles. By comparing the binarized image with the pass trajectory, it is possible to estimate the percentage of satellite visibility in order to perform a laser measurement. Taking into account all satellite passes at the given moment of time (e.g., Table 2), it is possible to choose the pass with the longest visibility.
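The sketch below outlines this day time chain with OpenCV, in the spirit of Figure 9; the kernel size, the threshold value and the assumption that clouds appear brighter than clear sky in the grayscale image are illustrative choices, not the station's calibrated parameters.

```python
# Sketch of the day-time processing chain: crop to a square, Gaussian blur G(x),
# binarize with threshold F_h(x), then score a projected pass trajectory.
import cv2
import numpy as np

def daytime_cloud_mask(frame: np.ndarray, threshold: int = 180) -> np.ndarray:
    """Crop to a square, blur and binarize an Allsky frame (white = assumed overcast)."""
    h, w = frame.shape[:2]
    side = min(h, w)                                   # trim X:Y axes to equal values
    y0, x0 = (h - side) // 2, (w - side) // 2
    square = frame[y0:y0 + side, x0:x0 + side]
    gray = cv2.cvtColor(square, cv2.COLOR_BGR2GRAY) if square.ndim == 3 else square
    blurred = cv2.GaussianBlur(gray, (21, 21), 0)      # suppress noise and edges
    _, mask = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY)
    return mask

def pass_visibility(mask: np.ndarray, trajectory_px: list) -> float:
    """Percentage of (x, y) trajectory pixels that fall on cloud-free (black) areas."""
    clear = [1 for (x, y) in trajectory_px if mask[y, x] == 0]
    return 100.0 * len(clear) / max(len(trajectory_px), 1)
```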
The night time processing mode (Figure 10) differs in the way the data are handled. Due to the possible variation in the cloud base (mainly dependent on the season), and therefore in the constant background brightness (light pollution), the night time image analysis is best performed by star detection. The second approach is the differential analysis of consecutive images (Figure 11). Even in the case of full cloud cover, the system operator can observe the background changes and estimate the overcast level depending on the wind (cumulative differential image changes).
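A minimal sketch of the star-counting variant (in the spirit of Figure 10) is shown below; the gain, threshold and kernel values are illustrative and would need the seasonal calibration discussed above.

```python
# Sketch of night-mode star detection: blur, brighten, binarize, dilate and
# count bright spots. Parameter values are illustrative only.
import cv2
import numpy as np

def count_visible_stars(gray: np.ndarray) -> int:
    """Return an approximate number of detected stars in a grayscale night frame."""
    blurred = cv2.GaussianBlur(gray, (3, 3), 0)              # suppress single-pixel noise
    brightened = cv2.convertScaleAbs(blurred, alpha=4.0)     # gain-only brightening
    _, binary = cv2.threshold(brightened, 200, 255, cv2.THRESH_BINARY)
    dilated = cv2.dilate(binary, np.ones((5, 5), np.uint8), iterations=2)  # highlight stars
    n_labels, _ = cv2.connectedComponents(dilated)
    return n_labels - 1                                       # subtract the background label
```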

4. Experiment

4.1. Test Platform

The assumption of the project was to use cheap, generally available components in the form of a camera and a Raspberry Pi single-board computer. Initially, a camera dedicated to the Raspberry Pi with a resolution of 1944 px:2592 px was used. Unfortunately, the small pixel size of this camera prevented correct acquisition at night, because the long exposures required resulted in noise. Therefore, the astronomical ASI120 monochrome camera [20] (960 px:1280 px) with a pixel size of 3.75 μm was selected. The results of the postprocessing, in the form of images, were made available via an Apache web server so as to be accessible from any SLR station computer.

4.2. Active Decision Support System Operational Test

Functional tests were carried out in day and night mode. In accordance with the scheme of operation shown in Figure 9, successive transformations of the image were performed in the day time (Figure 12, Figure 13, Figure 14 and Figure 15). It should be noted that the calibration frames known from astronomy in the form of flat, bias or dark frames were not used.
For that moment of time, there were 3 objects from the tracking list (Etalon-1, ERS-1, ADEOS-1) within the measurement range of the SLR station. As shown in Figure 15, the percentage value of visibility for each object in relation to the current cloud cover was estimated (where green markings mean favorable conditions). These values were, respectively: Etalon-1 = 0%, ERS-1 = 100% and ADEOS-1 = 55%. The validity of the estimate depends on the predicted duration of the satellite’s pass in relation to the speed of cloud movement and is only illustrative for the given moment of time.
Additionally, as a supplementary task, it was decided to test the cumulative differential image algorithm (Figure 11), the result of which is presented alongside the original image in Figure 16 and Figure 17. This method of processing revealed cloud fractions that are seemingly invisible to the human eye.
The estimation of the degree of cloud cover in the night mode differs significantly. As described in Section 3.2, the most unequivocal method is star detection. In order to minimize image noise, which can lead to misidentification of stars, relatively short exposure times were used. This method of acquisition makes analysis difficult for a human but is not a problem for the processing algorithms. Only two brighter stars can be seen in the unprocessed and cropped image shown in Figure 18, and analyzing such a dark image would require considerable effort from the operator. In accordance with the scheme of operation shown in Figure 10, after the blur operation, the image was brightened (Figure 19), and the stars were highlighted using the dilate process (Figure 20). At this point, the number of visible stars rises to 16, and they are easily seen. The last operation on the image marks the cloudless areas in the form of circles and adds the graphical projection of the trajectories of the objects' passes over the SLR station.
For that moment of time, there were 5 objects from the tracking list (the rocket body SL3-13121 and the satellites GLONASS 105, GLONASS 132, Cryosat-2 and HY-2C). As shown in Figure 21, the percentage value of visibility for each object in relation to the current cloud cover was estimated (where green markings mean favorable conditions). These values were, respectively: SL3-13121 = 30%, GLONASS 105 = 100%, GLONASS 132 = 19%, Cryosat-2 = 47% and HY-2C = 30%. The validity of the estimate depends on the predicted duration of the satellite's pass in relation to the circles around the stars, whose diameters were selected arbitrarily, and is only illustrative for the given moment of time. Field flattening (circles) was also omitted due to the lack of the target lens (ultimately a 180° field of view).
Additionally, a test of the cumulative differential image algorithm was performed for the night mode, the steps of which are shown in Figure 22. Figure 23 and Figure 24 show two night images with cloud fractions taken 6 min apart. As can be seen, the cloud fractions are subtle. From the point of view of the system operator, it is not possible to discern the changes in cloud cover; since the station currently operates only in night mode, analyzing such images in a lit room is practically impossible.
The basis of the algorithm (Figure 11) is image acquisition every minute and a cumulative differential comparison every $n$ minutes (in the tests, $n := 6$). Figure 22a–f shows the results of such differential comparisons. The self-differential comparison (a) suppresses the details, which confirms the correct operation of the algorithm. With each successive image (b–f) compared with the base image (the first one), the differences are emphasized. After $n := 6$ images have been processed, a new base image enters the FIFO buffer as the reference. If each captured image were analyzed separately (Figure 23 and Figure 24), it would be difficult or even impossible to determine the cloud cover level in a lit operator's room. The algorithm shown in Figure 11 highlights any background changes that occur over time (Figure 22b–f), rendering these changes as white fractions. This way of presenting data does not require increased attention from the station personnel, who can focus on other tasks.
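The sketch below illustrates this cumulative differential scheme in Python/OpenCV; the buffer handling is a simplified approximation of the behavior described above, and the class and parameter names are illustrative rather than taken from the station software.

```python
# Sketch of the cumulative differential comparison: every new frame is differenced
# against a base frame, differences are accumulated, and the base is replaced
# after n frames (n := 6 as in the tests).
from typing import Optional
import cv2
import numpy as np

class CumulativeDiff:
    def __init__(self, n: int = 6):
        self.n = n
        self.base: Optional[np.ndarray] = None
        self.accum: Optional[np.ndarray] = None
        self.count = 0

    def update(self, gray: np.ndarray) -> np.ndarray:
        """Feed one grayscale frame per minute; return the cumulative difference image."""
        if self.base is None or self.count >= self.n:
            self.base, self.accum, self.count = gray, np.zeros_like(gray), 0  # new reference
        diff = cv2.absdiff(gray, self.base)       # per-frame change w.r.t. the base image
        self.accum = cv2.max(self.accum, diff)    # moving cloud fractions show up as white
        self.count += 1
        return self.accum
```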

5. Conclusions

The process of laser ranging, with the continuous growth in the number of orbital objects and a limited number of SLR stations, is a complex issue. After decades of operational support for scientific missions, there is an additional need to monitor space debris objects. Despite advanced observation planning algorithms, the weather factor is the main element of decision uncertainty, which affects measurement efficiency. This paper proposed solutions to improve the effectiveness of measuring the distance to orbital objects, taking into account the weather conditions at the Borowiec SLR station. The presented algorithms for the active decision support system for observation scheduling led to an increase in the effectiveness of measuring the positions of orbital objects. Until now, SLR Borowiec did not have an Allsky system, so high observation effectiveness was obtained only during cloudless nights. In other cases, the station operator had to manually check the condition of the cloud cover, or, in the case of partial cloud cover, some measurements were ineffective. The application of the presented solution simultaneously provides information on weather conditions and enables the selection of passes that generate maximum measurement efficiency. An additional advantage of this solution is the ability to support decisions in 24/7 operations, although the station currently works only in night mode. A separate issue is the parameterization of the algorithms, which depends on the specific setup. Initially, the tests were carried out on a camera dedicated to the Raspberry Pi, which enabled the analysis of ∼176° of the sky. Then, the camera was changed to an ASI120, with a lens enabling a ∼120° field of view. Therefore, the field flatness parameterization was partly abandoned until the target lens is used. Due to the changing weather conditions depending on the season, the complete system parameterization will take many months. Variable background brightness dependent on light pollution will force the use of additional algorithms for changing the camera's sensitivity. Additional improvements to the algorithms are possible, which may take into account air traffic using the ADS-B signal.

Author Contributions

Conceptualization, T.S.; methodology, T.S.; software, T.S.; validation, T.S.; formal analysis, T.S.; investigation, T.S.; resources, T.S.; data curation, T.S.; writing—original draft preparation, T.S.; writing—review and editing, T.S.; visualization, T.S.; supervision, T.S.; project administration, T.S.; funding acquisition, T.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Plotkin, H. Reflection of ruby laser radiation from Explorer XXII. Proc. IEEE 1965, 53, 301–302. [Google Scholar] [CrossRef]
  2. Kucharski, D.; Kirchner, G.; Koidl, F.; Cristea, E. 10 Years of LAGEOS-1 and 15 years of LAGEOS-2 spin period determination from SLR data. Adv. Space Res. 2009, 43, 1926–1930. [Google Scholar] [CrossRef]
  3. Kucharski, D.; Kirchner, G.; Lim, H.; Koidl, F. Spin parameters of High Earth Orbiting satellites ETALON-1 and ETALON-2 determined from kHz Satellite Laser Ranging data. Adv. Space Res. 2014, 54, 2309–2317. [Google Scholar] [CrossRef]
  4. Alley, C.O.; Bowman, S.R.; Rayner, J.D.; Wang, B.C.; Yang, F.M. First Lunar Ranging Results from The University of Maryland Research Station at the 1.2M Telescope of the GSFC. In Proceedings of the 6th International Workshop on Laser Ranging Instrumentation, Antibes, France, 22–26 September 1986; p. 625. [Google Scholar]
  5. Chapront, J.; Chapront-Touze, M.; Francou, G. A new determination of lunar orbital parameters, precession constant and tidal acceleration from LLR measurements. Astron. Astrophys. 2002, 387, 700. [Google Scholar] [CrossRef] [Green Version]
  6. Zhu, S.; Reigber, C.; Kang, Z. Apropos laser tracking to GPS satellites. J. Geod. 1997, 71, 423–431. [Google Scholar] [CrossRef]
  7. Hujsak, R.S.; Gilbreath, G.C.; Truong, S. GPS Sequential Orbit Determination Using Spare SLR Data. In Proceedings of the 11th International Workshop on Laser Ranging, Deggendorf, Germany, 21–25 September 1998; p. 56. [Google Scholar]
  8. Arnold, D.; Montenbruck, O.; Hackel, S.; Sosnica, K. Satellite laser ranging to low Earth orbiters: Orbit and network validation. J. Geod. 2019, 93, 2315–2334. [Google Scholar] [CrossRef] [Green Version]
  9. Fernández, J.; Fernández, C.; Féménias, P.; Peter, H. The Copernicus Sentinel-3 Mission. In Proceedings of the 20th International Workshop on Laser Ranging, Potsdam, Germany, 9–14 October 2016. [Google Scholar]
  10. Pearlman, M.R.; Noll, C.E.; Pavlis, E.C.; Lemoine, F.G.; Combrink, L.; Degnan, J.J.; Kirchner, G.; Schreiber, U. The ILRS: Approaching 20 years and planning for the future. J. Geod. 2019, 93, 2161–2180. [Google Scholar] [CrossRef]
  11. Wilkinson, M.; Schreiber, U.; Procházka, I.; Moore, C.; Degnan, J.; Kirchner, G.; Zhongping, Z.; Dunn, P.; Shargorodskiy, V.; Sadovnikov, M.; et al. The next generation of satellite laser ranging systems. J. Geod. 2019, 93, 2227–2247. [Google Scholar] [CrossRef]
  12. ESA Space Debris Office. Esa’S Annual Space Environment Report. LOG 2022. Available online: https://www.sdo.esoc.esa.int (accessed on 20 June 2022).
  13. Zhao, S.; Steindorfer, M.; Kirchner, G.; Zheng, Y.; Koidl, F.; Wang, P.; Shang, W.; Zhang, J.; Li, T. Attitude analysis of space debris using SLR and light curve data measured with single-photon detector. Adv. Space Res. 2020, 65, 1518–1527. [Google Scholar] [CrossRef]
  14. Kucharski, D.; Kirchner, G.; Koidl, F. Spin parameters of nanosatellite BLITS determined from Graz 2 kHz SLR data. Adv. Space Res. 2011, 48, 343–348. [Google Scholar] [CrossRef]
  15. Suchodolski, T. Active Control Loop of the BOROWIEC SLR Space Debris Tracking System. Sensors 2022, 22, 2231. [Google Scholar] [CrossRef]
  16. Kirchner, G.; Koidl, F.; Friederich, F.; Buske, I.; Völker, U.; Riede, W. Laser measurements to space debris from Graz SLR station. Adv. Space Res. 2013, 51, 21–24. [Google Scholar] [CrossRef]
  17. Lejba, P.; Suchodolski, T.; Michalek, P.; Bartoszak, J.; Schillak, S.; Zapaśnik, S. First laser measurements to space debris in Poland. Adv. Space Res. 2018, 61, 2609–2616. [Google Scholar] [CrossRef]
  18. Cordelli, E.; Di Mira, A.; Heese, C.; Flohrer, T.; Kloth, A.; Steinborn, J. Demonstration of space debris observation capabilities of the ESA IZN-1 robotic optical ground station. In Proceedings of the 73rd International Astronautical Congress (IAC), Paris, France, 18–22 September 2022. [Google Scholar]
  19. Sybilska, A.; Słonina, M.; Leba, P.; Lech, G.; Karder, J.; Suchodolski, T.; Wagner, S.; Sybilski, P.; Mancas, A. WebPlan–web-based sensor scheduling tool. In Proceedings of the 8th European Conference on Space Debris, Darmstadt, Germany, 20–23 March 2021; ESA Space Debris Office: Darmstadt, Germany, 2021. [Google Scholar]
  20. ZWO ASI Astronomy Cameras. Available online: https://astronomy-imaging-camera.com (accessed on 20 June 2022).
Figure 1. Block diagram of the SLR tracking system.
Figure 2. Elevation angle and distance to the LARETS satellite in relation to the SLR station.
Figure 3. Elevation and azimuth angles to the LARETS satellite in relation to the SLR station.
Figure 4. Elevation angle and distance to the COSMOS 2475 satellite in relation to the SLR station.
Figure 5. Elevation and azimuth angles to the COSMOS 2475 satellite in relation to the SLR station.
Figure 6. Effectiveness of successful observations depending on the season.
Figure 7. Block diagram of the database structure—scheduling chain.
Figure 8. Block diagram of the ephemeris processing data flow.
Figure 9. Block diagram of the day time image processing.
Figure 10. Block diagram of the night time image processing.
Figure 11. Block diagram of the cumulative differential image processing.
Figure 12. Cropped unprocessed image.
Figure 13. Blurred image.
Figure 14. Binary transformation.
Figure 15. Ephemeris graphical projection (green line—cloudless, red line—overcast).
Figure 16. Cropped unprocessed image.
Figure 17. Cumulative differential image.
Figure 18. Cropped unprocessed night image.
Figure 19. Blurred and brightened image.
Figure 20. Dilation transformation.
Figure 21. Ephemeris graphical projection (green line—cloudless, red line—overcast).
Figure 22. Cumulative differential night images (n := 6).
Figure 23. 1st night frame (starting minute).
Figure 24. 6th night frame (+6 min).
Table 1. Evolution of numbers of orbital objects [12].

Year | Payload | Rocket Body | Payload Fragmentation Debris | Other Types 1
2021 | 7849 | 1990 | 7629 | 12,557
2020 | 6263 | 1957 | 6686 | 12,419
2015 | 3864 | 1965 | 6035 | 5789
2010 | 3268 | 1621 | 6191 | 4164
2005 | 2709 | 1391 | 1068 | 3595
2000 | 2493 | 1351 | 945 | 3438
1990 | 1708 | 1029 | 1030 | 2493
1980 | 938 | 584 | 547 | 2018
1970 | 402 | 209 | 196 | 1170
1 Others including: payload debris, rocket fragmentation debris, rocket mission related object and unidentified.
Table 2. SLR Borowiec observation list: 8 of 249 rows (2021/12/15—night tracking).

Object | Start [UTC] | Stop [UTC] | Pass [m:s] | Elevation [°] | Range [km]
GLN-134 | 16:07:50 | 16:37:51 | 30:01 | 50–75 | 19,328–19,518
Stella | 16:07:50 | 16:13:25 | 05:35 | 20–65 | 881–1787
SL16-20625 | 16:10:38 | 16:14:19 | 03:41 | 30–38 | 1272–1473
SL8-11327 | 16:16:21 | 16:17:21 | 01:00 | 30–30 | 1685–1698
SL14-22784 | 16:18:52 | 16:23:29 | 04:37 | 30–42 | 1310–161
SL16-22220 | 16:19:25 | 16:24:52 | 05:27 | 30–58 | 987–1486
Galileo-211 | 16:27:37 | 16:57:38 | 30:01 | 50–60 | 23,921–24,433
CZ2C-28480 | 16:31:22 | 16:33:27 | 02:05 | 30–33 | 1174–1256
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
