Communication

A Flexible Baseline Measuring System Based on Optics for Airborne DPOS

Yanhong Liu, Wen Ye and Bo Wang
1 Research Institute for Frontier Science, Beihang University, Beijing 100191, China
2 Division of Mechanics and Acoustic Metrology, National Institute of Metrology, Beijing 100029, China
* Author to whom correspondence should be addressed.
Sensors 2021, 21(16), 5333; https://doi.org/10.3390/s21165333
Submission received: 30 June 2021 / Revised: 30 July 2021 / Accepted: 4 August 2021 / Published: 7 August 2021

Abstract: Three-dimensional imaging for multi-node interferometric synthetic aperture radar (InSAR) or multi-task imaging sensors has become the prevailing trend in aerial remote sensing, and it requires multi-node motion information to carry out motion compensation. A distributed position and orientation system (DPOS) can provide multi-node motion information for InSAR through transfer alignment technology. However, due to wing deformation, the relative spatial relationship between the nodes changes, which lowers the accuracy of the transfer alignment. As a result, the flexible baseline between the nodes affects the interferometric phase error compensation and further deteriorates the imaging quality. This paper proposes a flexible baseline measuring system based on optics, which achieves non-contact measurement and overcomes the difficulty of building an accurate wing deformation model. An accuracy test was conducted in the laboratory, and the results showed that the measurement error of the baseline under static and dynamic conditions was less than 0.3 mm and 0.67 mm, respectively.

1. Introduction

Airborne synthetic aperture radar (SAR) requires the aircraft to move in a straight line at a constant speed, which is difficult to attain because of external disturbances such as gusts, turbulence, and engine vibration. The position and orientation system (POS) can provide high-precision motion information for SAR to compensate for its motion error and thereby realize high-resolution two-dimensional imaging [1]. With the development of airborne earth observation systems, three-dimensional imaging for multi-node interferometric synthetic aperture radar (InSAR) or multi-task imaging sensors has become the prevailing trend [2,3,4], and it requires multi-node motion information to carry out motion compensation. A single POS cannot measure multi-node motion information, so a distributed position and orientation system (DPOS) needs to be developed. DPOS mainly comprises a main POS, several sub-IMUs, and a distributed POS computer system (DPCS). The main POS integrates a high-precision inertial measurement unit (IMU) and a global navigation satellite system (GNSS) receiver; each sub-IMU consists only of a low-precision IMU [1].
From the three-dimensional imaging principle of InSAR or array SAR, the longer the baselines between multiple nodes, the higher the three-dimensional imaging accuracy. In general, SAR antennas are installed in a pod below the belly, as shown in Figure 1. To increase the baseline length, the aircraft is refitted with a steel plate and the SAR antennas are mounted inside it, as shown in Figure 2. To increase the baseline length further and pursue higher imaging accuracy, installing several radar pods on the wings is under consideration.
In general, the main POS is installed in the belly, and the sub-IMUs are installed on the wings near the phase centers of the SAR antennas. Figure 3 shows the installation layout of DPOS. The high-precision motion information of the main POS serves as a reference for each sub-IMU, and high-accuracy motion information for each sub-IMU is obtained by transfer alignment technology. Owing to wing deformation, the flexible baseline between the main POS and each sub-IMU seriously degrades the performance of transfer alignment.
Airborne InSAR uses the geometric relationship between the radar wavelength, the interferometric phase, the aircraft's height, the baseline length, and the beam direction to measure the three-dimensional position of a target on the ground. The radar wavelength and interferometric phase depend on the InSAR technology; the aircraft's height and beam direction are measured in real time by DPOS. If each SAR antenna were equipped with a high-precision POS, the baseline could be calculated directly. However, the resulting baseline accuracy cannot meet the imaging requirement: a single POS is accurate only to the centimeter level, so a baseline derived from two such systems is also at the centimeter level. From the above analysis, the imaging accuracy of InSAR depends mainly on the InSAR technology, the DPOS, and the measurement accuracy of the baseline. At the same time, the baseline measurement determines the performance of DPOS. Therefore, baseline measurement is the core issue for InSAR.
Accurate baseline measurement depends mainly on an accurate model of wing deformation. Until now, some studies have idealized the wing deformation as a Markov process [5,6,7], with some parameters of the Markov model set empirically. Liu et al. used elastic mechanics to simulate the process of wing deformation [8]. However, the wing deformation model established by this method varies with the aircraft material, which limits its practicality.
Owing to the advantages of non-contact operation, fast speed, and high precision, optical measurement has been widely applied in many fields [9,10,11,12]. For example, in almost all wind tunnel tests, wing deformation is measured with an optical camera [13]. In this paper, a flexible baseline measuring system for airborne DPOS is proposed. The relative position and orientation between the main POS and each sub-IMU are measured by two cameras, from which the flexible baseline measurement can be realized. Given the distance between the main POS and the sub-IMUs, the cameras have non-overlapping fields of view. In this paper, the hand–eye calibration method [14,15,16] is used to solve the external parameters between cameras with non-overlapping fields of view.
The biggest advantage of the flexible baseline measuring system is that it directly measures wing deformation and thus achieves baseline measurement. In addition, the measurement accuracy of the baseline can be gradually enhanced by improving the vision algorithm. Higher baseline measurement accuracy directly improves the imaging performance of the InSAR; it also improves the performance of DPOS through transfer alignment technology, which indirectly improves the imaging performance of the InSAR.

2. System Overview and External Parameter Calibration Method for Two Cameras with Non-Overlapping Fields of View

2.1. System Overview

Figure 4 shows the schematic diagram of the flexible baseline measuring system. Two sub-IMUs are installed on the wing, and two targets are attached to the corresponding sub-IMUs' surfaces. Two cameras installed on a tripod are rigidly linked. In practical applications of airborne DPOS, the distance between the two sub-IMUs is long and the cameras have non-overlapping fields of view; as a result, camera $C_1$ can only "see" target $S_1$, and camera $C_2$ can only "see" target $S_2$.
Variables $T_1$ and $T_2$ are the homogeneous transformation matrices of target $S_1$ relative to camera $C_1$ and of target $S_2$ relative to camera $C_2$, both of which can be calculated by the Perspective-n-Point (PnP) method [17,18]. Variable $T_3$ is the homogeneous transformation matrix between the two cameras; its calculation is presented in detail in Section 2.2.
The homogeneous transformation matrix $T_i$ ($i = 1, 2, 3, 4$) is represented with a rotation matrix $R_i$ and a translation vector $t_i$ as follows:
$$T_i = \begin{pmatrix} R_i & t_i \\ 0_{1 \times 3} & 1 \end{pmatrix} \qquad (1)$$
where $R_i$ is a $3 \times 3$ rotation matrix, which can be parameterized by three Euler angles around the x-axis, y-axis, and z-axis, respectively, and $t_i$ is a $3 \times 1$ translation vector.
Analyzing the flexible baseline measuring system, once $T_1$, $T_2$, and $T_3$ are known, the homogeneous transformation matrix $T_4$ between the two targets can be calculated easily. The flexible baseline between the two targets, which is the final quantity to be solved, can then be recovered.
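To make the matrix bookkeeping concrete, the following is a minimal sketch in Python with NumPy (the helper names are ours, not the paper's) of how a homogeneous transformation is assembled from a rotation and a translation and how it is inverted in closed form; later sketches in this paper reuse these helpers:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transformation from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_T(T):
    """Invert a homogeneous transformation: inv(T) = [R^T, -R^T t; 0 1]."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)
```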

2.2. External Parameter Calibration Method for the Two Cameras with Non-Overlapping Fields of View

The classical stereo calibration algorithm [19] is not suitable for calibrating the external parameters between two cameras with non-overlapping fields of view. In this paper, the hand–eye calibration method, which originated in robotics, is used to cope with this problem. First, the principle of the hand–eye calibration method in robotics is presented; second, this method is extended to solve the external parameters between cameras with non-overlapping fields of view.
A schematic diagram of the hand–eye calibration method in robotics is shown in Figure 5. The camera and robot gripper are rigidly connected and the goal is to determine the relative position and orientation between the camera and the robot gripper. For convenience, some coordinate systems used in this paper are defined as follows [20]:
$G$: the gripper coordinate system, which is fixed on the robot gripper and moves along with it.
$C$: the camera coordinate system, whose origin is at the camera lens.
$B$: the target coordinate system, which is fixed on the target.
$W$: the robot world coordinate system, which is fastened to the robot workstation. As the robot arm moves, the encoder outputs give the position relationship between the gripper and the workstation.
As the gripper moves, the camera remains focused on the target; then the position relationship between camera and gripper can be solved.
If the gripper is replaced with a camera, hand–eye calibration can be used to solve the relative position relationship between two cameras with non-overlapping fields of view.
As shown in Figure 6, the two targets $P_1$ and $P_2$ are rigidly linked, as are the two cameras $C_1$ and $C_2$. Let the two cameras perform $K$ motions ($K \geq 20$). Each camera pose is expressed relative to its first pose (the $0$th pose). $T_{1k}$ denotes the homogeneous transformation of camera $C_1$ from the $0$th pose to the $k$th pose, for $k = 1, 2, \ldots, K$. Similarly, $T_{2k}$ denotes the homogeneous transformation of camera $C_2$ from the $0$th pose to the $k$th pose. $T_3$ is the unknown homogeneous transformation between the two cameras.
Based on the above analysis, the calibration process of external parameters for two cameras with non-overlapping fields of view can be summarized in detail as follows:
Step 1: Solve the external parameters of each camera relative to its corresponding target ($A_0$, $A_k$, $B_0$, $B_k$).
As shown in Figure 6, let $A_k$ represent the external parameters of camera $C_1$ relative to target $P_1$ at the $k$th pose. $A_k$ consists of the rotation matrix $R_k$ and the translation vector $t_k$, as expressed in Equation (2). $A_0$ represents the external parameters of camera $C_1$ relative to target $P_1$ at the $0$th pose. All of these can be calculated with the MATLAB calibration toolbox based on Zhang's calibration method [21,22].
$$A_k = \begin{pmatrix} R_k & t_k \\ 0_{1 \times 3} & 1 \end{pmatrix} \qquad (2)$$
In the same way, let $B_k$ represent the external parameters of camera $C_2$ relative to target $P_2$ at the $k$th pose and $B_0$ the external parameters of camera $C_2$ relative to target $P_2$ at the $0$th pose. Referring to the calculation process of $A_k$ above, $B_0$ and $B_k$ can be calculated by the same method.
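For illustration, the same per-pose extrinsics can also be obtained with OpenCV's solvePnP. The sketch below is ours and rests on our own assumptions (pre-calibrated intrinsics and feature points whose coordinates are known in the target frame), reusing make_T from the sketch above; the paper itself uses the MATLAB toolbox:

```python
import numpy as np
import cv2

def target_to_camera_T(object_pts, image_pts, K, dist):
    """Estimate the pose of a target relative to a camera by PnP.

    object_pts: Nx3 feature-point coordinates in the target frame.
    image_pts:  Nx2 corresponding pixel coordinates.
    K, dist:    camera intrinsic matrix and distortion coefficients.
    """
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
    assert ok, "PnP failed"
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> rotation matrix
    return make_T(R, tvec.ravel())
```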
Step 2: Solve each camera pose relative to its first pose (the $0$th pose).
According to rigid body rotation theory, the camera pose relative to its first pose can be obtained from the homogeneous transformations between camera and target before and after the camera motion. The expressions for $T_{1k}$ and $T_{2k}$ can be written as
$$\begin{cases} T_{1k} = A_0 A_k^{-1} \\ T_{2k} = B_0 B_k^{-1} \end{cases} \qquad (3)$$
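A short sketch of Step 2 (our naming), reusing invert_T from the first sketch:

```python
def relative_poses(A, B):
    """Camera poses relative to the 0th pose: T1k = A0 inv(Ak), T2k = B0 inv(Bk)."""
    T1 = [A[0] @ invert_T(Ak) for Ak in A[1:]]   # camera C1, k = 1..K
    T2 = [B[0] @ invert_T(Bk) for Bk in B[1:]]   # camera C2, k = 1..K
    return T1, T2
```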
Step 3: Solve the homogeneous transformation $T_3$ between the two cameras.
According to the system overview in Section 2.1, the equation for $T_3$ can be derived as
$$T_{2k} T_3 = T_3 T_{1k} \qquad (4)$$
Equation (4) is the hand–eye calibration model of the form $AX = XB$, where $X$ is the unknown matrix to be determined. Further, Equation (4) can be broken down into
$$\begin{cases} R_{2k} R_3 = R_3 R_{1k} \\ R_{2k} t_3 + t_{2k} = R_3 t_{1k} + t_3 \end{cases} \qquad (5)$$
Equation (5) can easily be solved by linear algebra, but it is a linear homogeneous system that in theory has infinitely many solutions. In order to obtain a unique solution, Lie group and Lie algebra theory [23] is used.
Rigid-body motions can be represented by the Euclidean group, whose elements are matrices of the following form [24]:
$$\begin{pmatrix} R & T \\ 0 & 1 \end{pmatrix} \qquad (6)$$
where $R \in SO(3)$ and $T \in \mathbb{R}^3$; $SO(3)$ denotes the group of rotation matrices. The transformation from Lie algebra to Lie group satisfies the exponential mapping relationship: if $[\omega] \in \mathfrak{so}(3)$, then $\exp[\omega] \in SO(3)$, and the exponential mapping satisfies the following formula:
$$\exp[\omega] = I + \frac{\sin\|\omega\|}{\|\omega\|}[\omega] + \frac{1 - \cos\|\omega\|}{\|\omega\|^2}[\omega]^2, \qquad [\omega] \triangleq \begin{pmatrix} 0 & -\omega_3 & \omega_2 \\ \omega_3 & 0 & -\omega_1 \\ -\omega_2 & \omega_1 & 0 \end{pmatrix}, \qquad \|\omega\|^2 = \omega_1^2 + \omega_2^2 + \omega_3^2 \qquad (7)$$
The transformation from Lie group to Lie algebra satisfies the logarithmic mapping relationship. If $\theta \in SO(3)$, then its logarithmic mapping can be expressed as follows:
$$\log\theta = \frac{\phi}{2\sin\phi}\left(\theta - \theta^{T}\right) \qquad (8)$$
where $\phi$ satisfies $1 + 2\cos\phi = \mathrm{tr}(\theta)$ and $\|\log\theta\|^2 = \phi^2$. According to Equation (5), $R_{2k}$ can be expressed as
$$R_{2k} = R_3 R_{1k} R_3^{T} \qquad (9)$$
Let $\log R_{2k} = [\alpha_k]$ and $\log R_{1k} = [\beta_k]$; then $R_{2k} = R_3 R_{1k} R_3^{T}$ can be rewritten as
$$[\alpha_k] = \log\left(R_3 R_{1k} R_3^{T}\right) = R_3 [\beta_k] R_3^{T} = [R_3 \beta_k] \qquad (10)$$
which yields
$$\alpha_k = R_3 \beta_k \qquad (11)$$
Now the optimal value of $R_3$ can be found by minimizing the following cost function:
$$\sum_{k=1}^{K} \left\| R_3 \beta_k - \alpha_k \right\|^2 \qquad (12)$$
whose closed-form solution is
$$R_3 = \left( M^{T} M \right)^{-1/2} M^{T} \qquad (13)$$
where $M = \sum_{k=1}^{K} \beta_k \alpha_k^{T}$.
Then, by combining Equation (5) with Equation (13), $t_3$ can be calculated as
$$t_3 = \left( C^{T} C \right)^{-1} C^{T} d \qquad (14)$$
where
$$C = \begin{pmatrix} R_{21} - I \\ R_{22} - I \\ \vdots \\ R_{2K} - I \end{pmatrix}, \qquad d = \begin{pmatrix} R_3 t_{11} - t_{21} \\ R_3 t_{12} - t_{22} \\ \vdots \\ R_3 t_{1K} - t_{2K} \end{pmatrix} \qquad (15)$$
So far, $R_3$ and $t_3$ have been calculated. Then $T_3$ can be obtained as
$$T_3 = \begin{pmatrix} R_3 & t_3 \\ 0_{1 \times 3} & 1 \end{pmatrix} \qquad (16)$$
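Equations (8)–(15) translate directly into a closed-form solver. The sketch below is our Python rendering of this Park–Martin-style solution (the names and the use of SciPy's matrix square root are our choices, not the paper's); it consumes the relative-pose lists T1 and T2 from Step 2:

```python
import numpy as np
from scipy.linalg import sqrtm

def so3_log(R):
    """Logarithmic map of SO(3), Equation (8); returns the rotation vector omega."""
    phi = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(phi, 0.0):
        return np.zeros(3)
    W = phi / (2.0 * np.sin(phi)) * (R - R.T)    # skew-symmetric matrix [omega]
    return np.array([W[2, 1], W[0, 2], W[1, 0]])

def solve_hand_eye(T1, T2):
    """Solve T2k T3 = T3 T1k (Equation (4)) for T3 from lists of 4x4 relative poses."""
    alphas = [so3_log(T[:3, :3]) for T in T2]
    betas = [so3_log(T[:3, :3]) for T in T1]
    # Rotation, Equation (13): R3 = (M^T M)^(-1/2) M^T with M = sum_k beta_k alpha_k^T.
    M = sum(np.outer(b, a) for b, a in zip(betas, alphas))
    R3 = np.real(np.linalg.inv(sqrtm(M.T @ M)) @ M.T)  # np.real guards sqrtm round-off
    # Translation, Equations (14)-(15): stack (R2k - I) t3 = R3 t1k - t2k, least squares.
    C = np.vstack([T[:3, :3] - np.eye(3) for T in T2])
    d = np.concatenate([R3 @ Ta[:3, 3] - Tb[:3, 3] for Ta, Tb in zip(T1, T2)])
    t3, *_ = np.linalg.lstsq(C, d, rcond=None)
    return make_T(R3, t3)                              # Equation (16)
```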

3. Flexible Baseline Measurement

The flexible baseline measuring system proposed in this paper is used to measure the flexible baseline between multiple nodes. The measurement procedure is divided into three major steps.
Step 1: Solve the homogeneous transformations $T_1$ and $T_2$ between camera and target.
Before solving $T_1$ and $T_2$, the camera intrinsic parameters must be known; they can be obtained with Zhang's calibration method [21,22]. If the coordinates of each feature point in the world coordinate system (the target coordinate system) and the image coordinates of the corresponding feature points are known, the homogeneous transformations $T_1$ and $T_2$ between camera and target can be calculated by the PnP algorithm.
Step 2: Solve the homogeneous transformation $T_4$ between the two targets.
So far, $T_1$, $T_2$, and $T_3$ have been calculated. The homogeneous transformation $T_4$ between the two targets can be obtained by Equation (17):
$$T_4 = T_2 T_3^{-1} T_1^{-1} \qquad (17)$$
Step 3: Solve the flexible baseline.
As shown in Figure 7, point $A$ represents sub-IMU2 and point $B$ represents sub-IMU1. $L$ denotes the baseline length between point $A(x_A, y_A, z_A)$ and point $B(x_B, y_B, z_B)$ in the initial condition with the wing undeformed, shown as a red line in Figure 7. $L'$ denotes the baseline length between point $A'(x_{A'}, y_{A'}, z_{A'})$ and point $B'(x_{B'}, y_{B'}, z_{B'})$ when the wing is deformed by an external force, and it can be calculated by the following formula:
$$L' = \sqrt{(x_{A'} - x_{B'})^2 + (y_{A'} - y_{B'})^2 + (z_{A'} - z_{B'})^2} \qquad (18)$$
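Putting Equations (17) and (18) together, a hedged end-to-end sketch (our composition of the helpers above; the frame conventions of $T_1$, $T_2$, $T_3$ and the target-point coordinates are illustrative assumptions):

```python
def baseline_length(T1, T2, T3, pA, pB):
    """Baseline between point A (coordinates known in the target-2 frame) and point B
    (coordinates known in the target-1 frame), assuming T4 maps target-1 coordinates
    into the target-2 frame under the chosen conventions for T1, T2, T3."""
    T4 = T2 @ invert_T(T3) @ invert_T(T1)        # Equation (17)
    pB_in_2 = (T4 @ np.append(pB, 1.0))[:3]      # map B into the target-2 frame
    return float(np.linalg.norm(pA - pB_in_2))   # Equation (18)
```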

4. Laboratory Tests for Flexible Baseline Measurement

4.1. External Parameter Calibration Method for the Two Cameras ($T_3$)

4.1.1. DPOS Demonstration Platform

As shown in Figure 8, a DPOS demonstration platform was designed according to the shape and characteristics of a real wing. The platform is made of aluminum alloy 7075, and one side of the wing is 3 m long.

4.1.2. Cameras

The cameras used in the experiment are AVT GT2450 cameras, as shown in Figure 9. The camera parameters are listed in Table 1.

4.2. Flexible Baseline Measurement

4.2.1. Static Test

The static test was carried out on the demonstration platform. The two targets were placed on the wing where the two adjacent sub-IMUs were mounted, as shown in Figure 10. In this test, the two cameras were placed in front of the two targets at a distance of 1 m. Loads of 1 kg, 2 kg, 3 kg, 4 kg, 5 kg, 6 kg, 7 kg, and 8 kg were added to the wing sequentially. The three-dimensional coordinate measuring system with bino-theodolites (TCMSBT), which consists of a theodolite TM6100A and a total station TS09, was taken as the benchmark for flexible baseline measurement; its measurement accuracy is up to 0.05 mm, as shown in Figure 11.
The relative deformation between the two targets, the baseline, and the baseline error were calculated; the results are shown in Table 2. From these results, it can be concluded that the baseline measurement accuracy under static conditions is better than 0.3 mm.

4.2.2. Dynamic Test

The dynamic test for flexible baseline measurement is shown in Figure 12. An external force was imposed on the end of the wing and then suddenly removed, after which the wing vibrated up and down freely for about 600 s. The high-precision dynamic measuring system developed by Xintuo 3D Technology (Shenzhen) Limited Company, whose accuracy is up to 0.02 mm, was taken as the benchmark.
The relative deformation between the two targets and the relative deformation error are shown in Figure 13 and Figure 14. It can be seen that the wing underwent six periods of damped oscillation. Therefore, taking the root mean square error (RMSE) as the error criterion, the relative deformation and its error were calculated for each of the six time periods; the results are shown in Table 3. The measurement accuracy of the baseline under dynamic conditions is better than 0.67 mm.
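For reference, the RMSE criterion used in Table 3 can be computed per time period as in this small sketch (our helper; it assumes time-aligned samples from the proposed system and the benchmark):

```python
import numpy as np

def rmse(measured, benchmark):
    """Root mean square error between the vision measurement and the benchmark."""
    measured, benchmark = np.asarray(measured), np.asarray(benchmark)
    return float(np.sqrt(np.mean((measured - benchmark) ** 2)))
```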

5. Conclusions

A flexible baseline measuring system for airborne DPOS has been proposed. Two cameras with non-overlapping fields of view and two targets were utilized to measure the flexible baseline between the nodes. Benchmark tests were conducted in a laboratory, and the results showed that the baseline measurement errors under static and dynamic conditions were less than 0.3 mm and 0.67 mm, respectively.
In the future, the system will be tested in a real flight environment combining DPOS and InSAR, where the imaging sensors (cameras) are located in a pod below the belly and the targets are observed through the pod's windows. General industrial cameras can be used in the proposed system. However, cameras are easily disturbed by weather, temperature, and light, which deteriorates the measurement accuracy of the baseline, so more attention will be paid to algorithms that are robust against adverse working conditions.

Author Contributions

Data curation, Y.L.; Funding acquisition, W.Y.; Methodology, Y.L.; Project administration, W.Y.; Software, Y.L.; Writing—original draft, Y.L.; Writing—review and editing, W.Y. and B.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Natural Science Foundation of China under Grant 61901431 and in part by basic research funding of the National Institute of Metrology under Grant AKYJJ1906.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Zhu, Z.S.; Tan, H.; Jia, Y.; Xu, Q.F. Research on the Gravity Disturbance Compensation Terminal for High-Precision Position and Orientation System. Sensors 2020, 20, 4932.
2. Rigling, B.D.; Moses, R.L. Motion Measurement Errors and Autofocus in Bistatic SAR. IEEE Trans. Image Process. 2006, 15, 1008–1016.
3. Liu, Y.H.; Wang, B.; Ye, W.; Ning, X.L.; Gu, B. Global Estimation Method Based on Spatial–Temporal Kalman Filter for DPOS. IEEE Sens. J. 2021, 21, 3748–3756.
4. Wang, J.; Liang, X.D.; Ding, C.B.; Chen, L.Y.; Wang, Z.Q.; Li, K. A Novel Scheme for Ambiguous Energy Suppression in MIMO-SAR Systems. IEEE Geosci. Remote Sens. Lett. 2015, 12, 344–348.
5. Lu, Z.X.; Li, J.L.; Fang, J.C.; Wang, S.C.; Zhou, S.Y. Adaptive Unscented Two-Filter Smoother Applied to Transfer Alignment for ADPOS. IEEE Sens. J. 2018, 18, 3410–3418.
6. Lu, Z.X.; Fang, J.C.; Liu, H.J.; Gong, X.L.; Wang, S.C. Dual-Filter Transfer Alignment for Airborne Distributed POS Based on PVAM. Aerosp. Sci. Technol. 2017, 71, 136–146.
7. Gong, X.L.; Chen, L.J.; Fang, J.C.; Liu, G. A Transfer Alignment Method for Airborne Distributed POS with Three-Dimensional Aircraft Flexure Angles. Sci. China Inf. Sci. 2018, 61, 190–204.
8. Fang, J.C.; Zang, Z.; Gong, X.L. Model and Simulation of Transfer Alignment for Distributed POS. J. Chin. Inert. Technol. 2012, 20, 379–385.
9. Peng, J.Q.; Xu, W.F.; Liang, B. Pose Measurement and Motion Estimation of Space Non-Cooperative Targets Based on Laser Radar and Stereo-Vision Fusion. IEEE Sens. J. 2019, 19, 3008–3019.
10. Gadwe, A.; Ren, H. Real-Time 6DOF Pose Estimation of Endoscopic Instruments Using Printable Markers. IEEE Sens. J. 2019, 19, 2338–2346.
11. Guo, J.; Zhu, C.A.; Lu, S.L.; Zhang, D.S.; Zhang, C.Y. Vision-Based Measurement for Rotational Speed by Improving Lucas–Kanade Template Tracking Algorithm. Appl. Opt. 2016, 55, 7186–7194.
12. Jiang, F.F.; Zhou, Y.H.; Ling, T.Y.; Zhang, Y.B.; Zhu, Z.Y. Recent Research for Unobtrusive Atrial Fibrillation Detection Methods Based on Cardiac Dynamics Signals: A Survey. Sensors 2021, 21, 3814.
13. Burner, A.W.; Liu, T. Videogrammetric Model Deformation Measurement Technique. J. Aircr. 2001, 38, 745–754.
14. Liu, Z.; Zhang, G.J.; Wei, Z.Z. Global Calibration of Multi-Sensor Vision System Based on Two Planar Targets. J. Mech. Eng. 2009, 45, 228–232.
15. Pavlovcic, U.; Arko, P.; Jezersek, M. Simultaneous Hand–Eye and Intrinsic Calibration of a Laser Profilometer Mounted on a Robot Arm. Sensors 2021, 21, 1037.
16. Wang, G.; Shang, Y.; Guan, B.L. Flexible Calibration of Setting Relation of a Multi-Camera Rig for Non-Overlapping Views. Chin. J. Laser. 2017, 2017, 207–213.
17. Xu, D.; Tan, M.; Li, Y. Visual Measurement and Control for Robots, 2nd ed.; National Defense Industry Press: Beijing, China, 2010; pp. 132–138.
18. Wang, P.; Xu, G.L.; Cheng, Y.H.; Yu, Q.D. A Simple, Robust and Fast Method for the Perspective-n-Point Problem. Pattern Recognit. Lett. 2018, 108, 31–37.
19. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004; pp. 325–340.
20. Tsai, R.Y.; Lenz, R.K. A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration. IEEE Trans. Robot. Autom. 1989, 5, 345–358.
21. Zhang, Z.Y. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
22. Camera Calibration Toolbox for Matlab. Available online: http://www.vision.caltech.edu/bouguetj/calib_doc/ (accessed on 14 October 2015).
23. Ding, J.Y.; Pan, Z.K. The Lie Group Euler Methods of Multibody System Dynamics with Holonomic Constraints. Adv. Mech. Eng. 2018, 10, 1–10.
24. Park, F.C.; Martin, B.J. Robot Sensor Calibration: Solving AX = XB on the Euclidean Group. IEEE Trans. Robot. Autom. 1994, 10, 717–721.
Figure 1. The installation layout of SAR antennas (in the pod below the belly).
Figure 2. The installation layout of SAR antennas (inside the steel plate).
Figure 3. The installation layout of DPOS.
Figure 4. Schematic diagram of the flexible baseline measuring system.
Figure 5. Schematic diagram of hand–eye calibration in robotics.
Figure 6. Schematic of the two cameras' calibration with non-overlapping fields of view.
Figure 7. Baseline measurement diagram.
Figure 8. DPOS demonstration platform.
Figure 9. Camera.
Figure 10. Static test for flexible baseline measurement.
Figure 11. Benchmark system.
Figure 12. Dynamic test for flexible baseline measurement.
Figure 13. Relative deformation.
Figure 14. Relative deformation error.
Table 1. Camera parameters.

| Parameter | Value |
|---|---|
| Image resolution | 2448 × 2050 |
| Frame rate | 15 fps |
| Focal length | 25 mm |
| CCD pixel size | 3.45 μm × 3.45 μm |
| Lens | Computar M2518-MPW2 |
Table 2. Measurement results (mm).

| Load | x (Proposed) | x (Benchmark) | y (Proposed) | y (Benchmark) | z (Proposed) | z (Benchmark) | Baseline (Proposed) | Baseline (Benchmark) | Baseline Error |
|---|---|---|---|---|---|---|---|---|---|
| 1 kg | 556.875 | 556.575 | 6.103 | 7.756 | 10.037 | 10.234 | 556.999 | 556.723 | 0.276 |
| 2 kg | 558.319 | 558.43 | 12.605 | 12.878 | 10.366 | 10.42 | 558.558 | 558.676 | 0.118 |
| 3 kg | 558.916 | 559.112 | 18.043 | 18.007 | 10.832 | 11.231 | 559.312 | 559.515 | 0.203 |
| 4 kg | 559.105 | 559.332 | 25.076 | 25.865 | 11.121 | 11.442 | 559.778 | 560.047 | 0.269 |
| 5 kg | 559.652 | 559.452 | 31.184 | 31.947 | 11.348 | 11.567 | 560.635 | 560.483 | 0.152 |
| 6 kg | 559.863 | 559.763 | 37.219 | 35.743 | 11.602 | 12.012 | 561.219 | 561.032 | 0.187 |
| 7 kg | 560.479 | 560.7 | 43.292 | 43.931 | 11.731 | 12.321 | 562.271 | 562.553 | 0.282 |
| 8 kg | 561.516 | 561.23 | 49.619 | 49.419 | 11.894 | 12.305 | 563.83 | 563.536 | 0.294 |
Table 3. Relative deformation and relative deformation error (RMSE: mm).

| Time Period | x-axis | y-axis | z-axis | Baseline Error |
|---|---|---|---|---|
| 0–100 s | 0.57 | 0.25 | 0.86 | 0.61 |
| 101–200 s | 0.43 | 0.23 | 0.66 | 0.45 |
| 201–300 s | 0.48 | 0.25 | 0.76 | 0.51 |
| 301–400 s | 0.46 | 0.30 | 0.70 | 0.45 |
| 401–500 s | 0.49 | 0.35 | 0.74 | 0.51 |
| 501–586 s | 0.64 | 0.28 | 1.01 | 0.67 |