Article

Vision/INS Integrated Navigation System for Poor Vision Navigation Environments

Youngsun Kim 1 and Dong-Hwan Hwang 2,*
1 Payload Electronics Team, Korea Aerospace Research Institute, Daejeon 34133, Korea
2 Department of Electronics Engineering, Chungnam National University, Daejeon 34134, Korea
* Author to whom correspondence should be addressed.
Sensors 2016, 16(10), 1672; https://doi.org/10.3390/s16101672
Submission received: 8 August 2016 / Revised: 5 October 2016 / Accepted: 7 October 2016 / Published: 12 October 2016
(This article belongs to the Special Issue Vision-Based Sensors in Field Robotics)

Abstract

In order to improve the performance of an inertial navigation system, many aiding sensors can be used. Among these aiding sensors, a vision sensor is of particular note due to its benefits in terms of weight, cost, and power consumption. This paper proposes an inertial and vision integrated navigation method for poor vision navigation environments. The proposed method uses focal plane measurements of landmarks in order to provide position, velocity and attitude outputs even when the number of landmarks on the focal plane is not enough for navigation. In order to verify the proposed method, computer simulations and van tests are carried out. The results show that the proposed method gives accurate and reliable position, velocity and attitude outputs when the number of landmarks is insufficient.

1. Introduction

The inertial navigation system (INS) is a self-contained dead-reckoning navigation system that provides continuous navigation outputs with high bandwidth and short-term stability. Because it is a dead-reckoning system, however, the accuracy of its navigation output degrades as time passes. In order to improve the performance of the INS, a navigation aid can be integrated with it. The GPS/INS integrated navigation system is one of the most widely used integrated navigation systems [1,2]. However, the GPS/INS integrated navigation system may not produce reliable navigation outputs, since the GPS signal is vulnerable to interference such as jamming and spoofing [3,4]. In recent years, many alternatives to GPS, such as vision, radar, laser, ultrasonic sensors, UWB (Ultra-Wide Band) and eLoran (enhanced Long Range Navigation), have been studied in order to provide continuous, reliable navigation outputs [4].
Vision sensors have recently been used for the navigation of vehicles such as cars, small low-cost airborne systems and mobile robots due to their benefits in terms of weight, cost and power consumption [5,6,7]. Navigation using vision sensors can be classified into three methods [4,7]. The first method determines the position of the vehicle by comparing the measured camera image with a stored image or stored map information [8]. The second method, called landmark-based vision navigation, determines position and attitude by calculating directions to landmarks from the measured image of the landmarks [9,10]. The third method, called visual odometry, determines the motion of the vehicle from successive camera images [11]. Among these three methods, the landmark-based approach is known to have the advantages of bounded navigation parameter error and simple computation [7].
In order to integrate an inertial navigation system with a vision navigation system, several methods have been proposed [12,13,14,15,16]. The method in [12] uses gimbal angle and/or bearing information calculated from camera images. In this case, the integrated navigation method may not give an optimal navigation output, since the inputs to the integration filter are quantities processed from the raw vision sensor measurements rather than the raw measurements themselves. When visual odometry is used for the integrated navigation system, as in [13], the error of the navigation output from the vision navigation system increases with time. The integrated navigation methods proposed in [14,15,16] use the position and attitude, velocity or heading information from the vision navigation system. These methods may not give a reliable navigation output when the number of landmarks in the camera image is not enough to compute a vision navigation output.
This paper proposes an inertial and vision integrated navigation method for poor vision environments, in which position and attitude outputs cannot be obtained from a vision navigation system due to the limited number of landmarks. The proposed method uses the focal plane measurements of landmarks in the camera and the INS outputs. Since no navigation output is required from the vision navigation system, the proposed method can give an integrated navigation output even when the number of landmarks in the camera image is not enough for a vision navigation output. In addition, since the integration filter uses the raw measurements directly, the navigation output may have better performance. In Section 2, a brief description of landmark-based vision navigation is given. The proposed integration method is presented in Section 3. Results of computer simulations and vehicle experiments are given in Section 4. Concluding remarks and further studies are mentioned in Section 5.

2. Landmark-Based Vision Navigation

In landmark-based vision navigation, the vision navigation output is computed from the landmarks projected on the focal plane [7,13,14]. Figure 1 shows the projected landmarks on the focal plane when the pinhole camera model is adopted. The $x_c$ axis of the camera frame is aligned with the optical axis of the camera. The $y_c$ and $z_c$ axes are in the horizontal and vertical directions of the focal plane, respectively. The focal plane is placed at a distance of the focal length $f$ along the $x_c$ axis.
As shown in Figure 1, the landmark at position $P_k^c(X_k^c, Y_k^c, Z_k^c)$ is projected into the point $p_k^c(f, u_k, v_k)$ on the focal plane in the camera frame. Equations (1) and (2) represent the relationship between the measurements on the focal plane and the landmark coordinate values.
$$u_k = f\,\frac{Y_k^c}{X_k^c}\tag{1}$$
$$v_k = f\,\frac{Z_k^c}{X_k^c}\tag{2}$$
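For illustration, the projection of Equations (1) and (2) can be written as a few lines of code. The following Python/NumPy sketch is not from the paper; the function and variable names are chosen here for illustration only.

```python
import numpy as np

def project_to_focal_plane(P_c, f):
    """Project a landmark given in the camera frame, P_c = (X, Y, Z),
    onto the focal plane with the pinhole model of Equations (1) and (2)."""
    X, Y, Z = P_c
    u = f * Y / X   # horizontal focal-plane coordinate, Equation (1)
    v = f * Z / X   # vertical focal-plane coordinate, Equation (2)
    return u, v

# Example: a landmark 100 m ahead of the camera and offset 5 m / -2 m from
# the optical axis, seen with a 25 mm focal length (cf. Table 2).
u, v = project_to_focal_plane(np.array([100.0, 5.0, -2.0]), f=0.025)
```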
Equation (3) is the navigation equation used to obtain the navigation output from the landmark measurements on the focal plane of the camera.
$$P_k^n - P_u^n = r_k\, C_b^n C_c^b\, p_k^c\tag{3}$$
where the subscript $k$ denotes the index of the landmarks and $P_k^n$ is a known position vector of the $k$th landmark. $b$, $c$ and $n$ denote the body frame, the camera frame and the navigation frame, respectively. $P_u^n$ is the vehicle's three-dimensional position vector in the navigation frame. $r_k$ is the distance ratio of the projected landmark on the focal plane to the actual landmark in the camera frame. $C_b^n$ and $C_c^b$ are the direction cosine matrix from the body frame to the navigation frame and the direction cosine matrix from the camera frame to the body frame, respectively. Here, $C_c^b$ is a constant matrix since the camera is fixed to the body.
It can be seen from Equation (3) that at least three landmark measurements are required in order to determine the navigation output of six variables, namely the three-dimensional position and attitude [14]. In this paper, a poor vision environment refers to the case in which more than zero but fewer than three landmarks are available.
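One way to see this requirement, since each landmark contributes the two scalar measurements of Equations (1) and (2), is the simple counting argument below.

$$\underbrace{2n}_{\text{measurements } (u_k,\,v_k)} \;\ge\; \underbrace{6}_{\text{position and attitude unknowns}} \quad\Longleftrightarrow\quad n \ge 3.$$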

3. Vision/INS Integrated Navigation System

3.1. Vision/INS Integrated Navigation System

Figure 2 describes the proposed vision/INS integrated navigation method. The inertial navigation system computes the vehicle's position ($P_{INS}$), velocity ($V_{INS}$) and attitude ($\Psi_{INS}$) from the outputs of the IMU (Inertial Measurement Unit) ($\Delta v$, $\Delta\theta$). The vision navigation system gives the projected landmark measurements on the focal plane ($u_{k,VIS}$, $v_{k,VIS}$). The Kalman filter estimates the INS errors ($\delta x_{nav}$, $\nabla$ and $\varepsilon$) and the vision sensor errors ($\delta u$, $\delta v$ and $\delta f$).

3.2. Process Model of the Kalman Filter

The INS error equation can be obtained by perturbing the navigation equation of the INS [17]. The process model based on the INS error equation and the sensor error equation for the proposed method is given in Equation (4).
$$\delta\dot{x}(t) = F(t)\,\delta x(t) + w(t), \qquad w(t) \sim N\left(0,\ Q(t)\right)\tag{4}$$
where $w(t)$ is the process noise vector with covariance $Q(t)$. Equation (4) can be rewritten as Equation (5).
$$\begin{bmatrix} \delta\dot{x}_{nav} \\ \delta\dot{x}_{sen} \end{bmatrix} = \begin{bmatrix} F_{11} & F_{12} \\ 0_{9\times 9} & 0_{9\times 9} \end{bmatrix}\begin{bmatrix} \delta x_{nav} \\ \delta x_{sen} \end{bmatrix} + \begin{bmatrix} w_{nav} \\ w_{sen} \end{bmatrix}\tag{5}$$
where $w_{nav}$ and $w_{sen}$ are the noise of the navigation parameter error vector and the noise of the sensor error vector, respectively. The state vector $\delta x$ is composed of the navigation parameter error vector $\delta x_{nav}$ and the sensor error vector $\delta x_{sen}$. $0_{m\times n}$ denotes an $m$ by $n$ zero matrix. The navigation parameter error vector is composed of the position error, velocity error and attitude error of the INS, as given in Equation (6).
$$\delta x_{nav} = \begin{bmatrix} \delta P_N & \delta P_E & \delta P_D & \delta V_N & \delta V_E & \delta V_D & \varphi_N & \varphi_E & \varphi_D \end{bmatrix}^T\tag{6}$$
where $\delta P$, $\delta V$ and $\varphi$ are the position error, the velocity error and the attitude error expressed as a rotation vector, respectively. The subscripts $N$, $E$ and $D$ denote the north, east and down axes of the navigation frame, respectively. The sensor error vector includes six inertial sensor errors and three vision sensor errors, as in Equation (7).
$$\delta x_{sen} = \begin{bmatrix} \nabla_x & \nabla_y & \nabla_z & \varepsilon_x & \varepsilon_y & \varepsilon_z & \delta u & \delta v & \delta f \end{bmatrix}^T\tag{7}$$
where $\nabla$ and $\varepsilon$ are the accelerometer error and the gyro error, respectively. $\delta u$ and $\delta v$ are the errors of the coordinate values on the focal plane and $\delta f$ is the focal length error. The subscripts $x$, $y$ and $z$ denote the roll, pitch and yaw axes of the body frame, respectively. The submatrix $F_{11}$ in Equation (5) is given in Equation (8).
$$F_{11} = \begin{bmatrix} -\Omega_{en}^n & I_{3\times 3} & O_{3\times 3} \\ O_{3\times 3} & -\left(\Omega_{ie}^n + \Omega_{in}^n\right) & \left(f^n \times\right) \\ O_{3\times 3} & O_{3\times 3} & -\Omega_{in}^n \end{bmatrix}\tag{8}$$
where $\Omega_{en}^n$, $\Omega_{ie}^n$ and $\Omega_{in}^n$ are the skew-symmetric matrices of the vehicle's craft rate in the navigation frame, the Earth rate in the navigation frame and the rotation rate of the navigation frame relative to the inertial frame expressed in the navigation frame, respectively. $\left(f^n \times\right)$ is the skew-symmetric matrix of the vehicle's specific force in the navigation frame. The submatrix $F_{12}$ in Equation (5) is given in Equation (9).
$$F_{12} = \begin{bmatrix} O_{3\times 3} & O_{3\times 3} & O_{3\times 3} \\ C_b^n & O_{3\times 3} & O_{3\times 3} \\ O_{3\times 3} & -C_b^n & O_{3\times 3} \end{bmatrix}\tag{9}$$
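To make the structure of the process model concrete, the following Python/NumPy sketch assembles the 18-state matrix $F$ of Equation (5) from the blocks of Equations (8) and (9). It is an illustrative sketch, not code from the paper: the function and variable names are invented here, and the signs follow the conventional INS error model of [17] on which the equations above are based.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def process_matrix(w_en_n, w_ie_n, w_in_n, f_n, C_bn):
    """Assemble the 18x18 continuous-time process matrix of Equation (5).
    Inputs are 3-vectors and a 3x3 DCM expressed in the navigation (N-E-D) frame."""
    O3, I3 = np.zeros((3, 3)), np.eye(3)
    # F11: position, velocity and attitude error dynamics, Equation (8)
    F11 = np.block([[-skew(w_en_n), I3, O3],
                    [O3, -(skew(w_ie_n) + skew(w_in_n)), skew(f_n)],
                    [O3, O3, -skew(w_in_n)]])
    # F12: coupling of the accelerometer and gyro errors into the navigation
    # errors, Equation (9)
    F12 = np.block([[O3, O3, O3],
                    [C_bn, O3, O3],
                    [O3, -C_bn, O3]])
    # The nine sensor-error states (accelerometer, gyro, vision) are random
    # constants, Equations (10)-(14), so their own rows are zero.
    F = np.zeros((18, 18))
    F[:9, :9] = F11
    F[:9, 9:] = F12
    return F
```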
The accelerometer sensor error and the gyro error are modeled as random constants and are given in Equations (10) and (11), respectively.
$$\dot{\nabla}^b = 0\tag{10}$$
$$\dot{\varepsilon}^b = 0\tag{11}$$
The vision sensor errors are also modeled as random constants and are given in Equations (12)–(14).
$$\delta\dot{u} = 0\tag{12}$$
$$\delta\dot{v} = 0\tag{13}$$
$$\delta\dot{f} = 0\tag{14}$$

3.3. Measurement Model of the Kalman Filter

The measurement equation for the Kalman filter is given in Equation (15).
$$\delta z(t) = H(t)\,\delta x(t) + v(t), \qquad v(t) \sim N\left(0,\ R(t)\right)\tag{15}$$
where $H(t)$ is the observation matrix and $v(t)$ is the measurement noise vector with covariance $R(t)$. The measurement vector $\delta z(t)$ is given in Equation (16).
$$\delta z = \begin{bmatrix} \delta u_1 & \delta u_2 & \cdots & \delta u_n & \delta v_1 & \delta v_2 & \cdots & \delta v_n \end{bmatrix}^T\tag{16}$$
where $\delta u_k$ and $\delta v_k$ denote the differences between the INS-based estimates and the measurements on the focal plane for the $k$-th landmark, and $n$ denotes the number of landmarks on the focal plane of the camera. The INS-based estimate for each element in Equation (16) is calculated from the position and attitude outputs of the INS and the position information of the landmarks. The observation matrix is given in Equation (17).
$$H = \begin{bmatrix} H_1 & H_2 & H_3 \end{bmatrix}\tag{17}$$
Each submatrix of the observation matrix can be obtained by computing the Jacobian. The submatrix $H_1$ is given in Equation (18).
$$H_1 = \begin{bmatrix} \partial u_1/\partial x & \partial u_1/\partial y & \partial u_1/\partial z \\ \vdots & \vdots & \vdots \\ \partial u_n/\partial x & \partial u_n/\partial y & \partial u_n/\partial z \\ \partial v_1/\partial x & \partial v_1/\partial y & \partial v_1/\partial z \\ \vdots & \vdots & \vdots \\ \partial v_n/\partial x & \partial v_n/\partial y & \partial v_n/\partial z \end{bmatrix}\tag{18}$$
where $[x\ y\ z]^T$ is the position vector in the navigation frame. The submatrix $H_2$ is given in Equation (19).
[Equation (19): the submatrix $H_2$, composed of the partial derivatives $\partial u_k/\partial(\alpha, \beta, \gamma)$ and $\partial v_k/\partial(\alpha, \beta, \gamma)$ given in Equations (24) and (26) and the elements $A_{ij}$ defined in Equation (20).]
where $[\alpha\ \beta\ \gamma]^T$ is the attitude vector expressed in Euler angles in the navigation frame. The attitude error in the process model, Equation (6), is represented as a rotation vector, whereas the attitude error in Equation (19) is represented in Euler angles. The relationship between the rotation vector and the Euler angles is expressed in Equation (20) [18]. The $A_{ij}$ ($i = 1, 2, 3$, $j = 1, 2, 3$) in Equation (19) are the same as those in Equation (20).
[Equation (20): the relationship between the rotation vector and the Euler angles, which defines the elements $A_{ij}$, $i, j = 1, 2, 3$.]
The submatrix $H_3$ is given in Equation (21).
$$H_3 = \begin{bmatrix} 1 & 0 & Y_1/X_1 \\ \vdots & \vdots & \vdots \\ 1 & 0 & Y_n/X_n \\ 0 & 1 & Z_1/X_1 \\ \vdots & \vdots & \vdots \\ 0 & 1 & Z_n/X_n \end{bmatrix}\tag{21}$$
where $[X_k\ Y_k\ Z_k]^T$ is the position vector of the $k$th landmark in the camera frame and can be expressed as in Equation (22).
$$\begin{bmatrix} X_k \\ Y_k \\ Z_k \end{bmatrix} = C_n^c \begin{bmatrix} x_k - x \\ y_k - y \\ z_k - z \end{bmatrix} = \begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix}\begin{bmatrix} x_k - x \\ y_k - y \\ z_k - z \end{bmatrix}\tag{22}$$
Then, $\left[\partial u_k/\partial x\ \ \partial u_k/\partial y\ \ \partial u_k/\partial z\right]$ in Equation (18) can be expressed as in Equation (23).
$$\begin{bmatrix} \dfrac{\partial u_k}{\partial x} & \dfrac{\partial u_k}{\partial y} & \dfrac{\partial u_k}{\partial z} \end{bmatrix} = \begin{bmatrix} f\dfrac{Y_k}{X_k^2} & -f\dfrac{1}{X_k} & 0 \end{bmatrix}\begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix}\tag{23}$$
Similarly, $\left[\partial u_k/\partial\alpha\ \ \partial u_k/\partial\beta\ \ \partial u_k/\partial\gamma\right]$ in Equation (19) can be expressed as in Equation (24).
$$\begin{bmatrix} \dfrac{\partial u_k}{\partial \alpha} & \dfrac{\partial u_k}{\partial \beta} & \dfrac{\partial u_k}{\partial \gamma} \end{bmatrix} = \begin{bmatrix} -f\dfrac{Y_k}{X_k^2} & f\dfrac{1}{X_k} & 0 \end{bmatrix}\begin{bmatrix} \dfrac{\partial X_k}{\partial \alpha} & \dfrac{\partial X_k}{\partial \beta} & \dfrac{\partial X_k}{\partial \gamma} \\ \dfrac{\partial Y_k}{\partial \alpha} & \dfrac{\partial Y_k}{\partial \beta} & \dfrac{\partial Y_k}{\partial \gamma} \\ \dfrac{\partial Z_k}{\partial \alpha} & \dfrac{\partial Z_k}{\partial \beta} & \dfrac{\partial Z_k}{\partial \gamma} \end{bmatrix}\tag{24}$$
Also, $\left[\partial v_k/\partial x\ \ \partial v_k/\partial y\ \ \partial v_k/\partial z\right]$ in Equation (18) and $\left[\partial v_k/\partial\alpha\ \ \partial v_k/\partial\beta\ \ \partial v_k/\partial\gamma\right]$ in Equation (19) are expressed as in Equations (25) and (26), respectively.
$$\begin{bmatrix} \dfrac{\partial v_k}{\partial x} & \dfrac{\partial v_k}{\partial y} & \dfrac{\partial v_k}{\partial z} \end{bmatrix} = \begin{bmatrix} f\dfrac{Z_k}{X_k^2} & 0 & -f\dfrac{1}{X_k} \end{bmatrix}\begin{bmatrix} C_{11} & C_{12} & C_{13} \\ C_{21} & C_{22} & C_{23} \\ C_{31} & C_{32} & C_{33} \end{bmatrix}\tag{25}$$
$$\begin{bmatrix} \dfrac{\partial v_k}{\partial \alpha} & \dfrac{\partial v_k}{\partial \beta} & \dfrac{\partial v_k}{\partial \gamma} \end{bmatrix} = \begin{bmatrix} -f\dfrac{Z_k}{X_k^2} & 0 & f\dfrac{1}{X_k} \end{bmatrix}\begin{bmatrix} \dfrac{\partial X_k}{\partial \alpha} & \dfrac{\partial X_k}{\partial \beta} & \dfrac{\partial X_k}{\partial \gamma} \\ \dfrac{\partial Y_k}{\partial \alpha} & \dfrac{\partial Y_k}{\partial \beta} & \dfrac{\partial Y_k}{\partial \gamma} \\ \dfrac{\partial Z_k}{\partial \alpha} & \dfrac{\partial Z_k}{\partial \beta} & \dfrac{\partial Z_k}{\partial \gamma} \end{bmatrix}\tag{26}$$
It can be seen from Equation (16) that the proposed method can provide an integrated navigation output even though the number of landmarks is not sufficient for a vision navigation output.
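As an illustrative sketch (not the authors' implementation), the residual vector of Equation (16) and the corresponding rows of $H_1$ and $H_3$ could be formed as follows, assuming a database of landmark positions in the navigation frame; all function names are hypothetical, and $H_2$ would be built analogously from Equations (24) and (26) together with Equation (20).

```python
import numpy as np

def predict_focal_plane(P_landmark_n, P_ins_n, C_bn_ins, C_cb, f):
    """INS-based estimate of the focal-plane coordinates of one landmark:
    its known navigation-frame position is mapped into the camera frame with
    the INS position/attitude (Equations (1)-(3) and (22))."""
    C_nc = (C_bn_ins @ C_cb).T                 # navigation -> camera frame
    X, Y, Z = C_nc @ (P_landmark_n - P_ins_n)  # landmark in the camera frame
    return f * Y / X, f * Z / X                # Equations (1) and (2)

def residual_and_jacobian_rows(landmarks_n, meas_uv, P_ins_n, C_bn_ins, C_cb, f):
    """Residuals of Equation (16) (estimate minus measurement) and the rows of
    H1 (Equations (18), (23), (25)) and H3 (Equation (21)) for each landmark."""
    C_nc = (C_bn_ins @ C_cb).T
    du, dv, h1_u, h1_v, h3_u, h3_v = [], [], [], [], [], []
    for p, (u_m, v_m) in zip(landmarks_n, meas_uv):
        X, Y, Z = C_nc @ (p - P_ins_n)                            # Equation (22)
        du.append(f * Y / X - u_m)
        dv.append(f * Z / X - v_m)
        h1_u.append(np.array([f * Y / X**2, -f / X, 0.0]) @ C_nc)  # Equation (23)
        h1_v.append(np.array([f * Z / X**2, 0.0, -f / X]) @ C_nc)  # Equation (25)
        h3_u.append(np.array([1.0, 0.0, Y / X]))                   # Equation (21)
        h3_v.append(np.array([0.0, 1.0, Z / X]))
    dz = np.array(du + dv)          # measurement vector, Equation (16)
    H1 = np.vstack(h1_u + h1_v)
    H3 = np.vstack(h3_u + h3_v)
    return dz, H1, H3
```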

4. Computer Simulation and Experimental Result

The proposed method is verified through computer simulations and van tests.

4.1. Computer Simulation

Computer simulations of the proposed integrated navigation method were carried out for a low- to medium-grade inertial sensor and a low-cost commercial camera. Figure 3 shows the scheme of the simulations. The reference trajectory and inertial sensor data were generated using MATLAB and the INS toolbox from GPSoft LLC. True camera measurement data of the landmarks were first generated using the pinhole camera model given in Equations (1) and (2). The camera measurement data of the landmarks on the focal plane were then generated by adding noise to the true camera measurement data. Between zero and ten landmarks to be observed in each image were placed by a random generator. The IMU measurement data were also generated by adding noise to the true IMU measurement data. Table 1 and Table 2 show the specifications of the IMU and the vision sensor for the simulation.
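For illustration, one way the noisy focal-plane measurements of the randomly placed landmarks might be generated is sketched below. The landmark placement range, the zero-mean Gaussian noise model and all names are assumptions made here for the example and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2016)

def simulate_frame(P_true_n, C_nc_true, f, sigma_uv, n_max=10):
    """Place a random number (0..n_max) of landmarks around the true vehicle
    position, project them with the pinhole model of Equations (1) and (2)
    and add zero-mean Gaussian noise on the focal plane."""
    n = int(rng.integers(0, n_max + 1))
    landmarks_n = P_true_n + rng.uniform(-500.0, 500.0, size=(n, 3))
    measurements = []
    for p in landmarks_n:
        X, Y, Z = C_nc_true @ (p - P_true_n)
        if X <= 0.0:                       # behind the camera: not visible
            continue
        u = f * Y / X + rng.normal(0.0, sigma_uv)
        v = f * Z / X + rng.normal(0.0, sigma_uv)
        measurements.append((p, u, v))     # landmark position and its noisy image
    return measurements
```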
Fifty Monte Carlo simulations were performed for a figure-eight flight path with constant height, as shown in Figure 4. Fewer than three landmarks were intentionally placed at random in a specific area in order to create a poor vision navigation environment.
The results of the proposed method were compared with those of another integration method in [14]. In the integration method of [14], the outputs of the vision navigation system are position and attitude, and the state vector is given in Equation (27).
$$\delta x = \begin{bmatrix} \delta P_N & \delta P_E & \delta P_D & \delta V_N & \delta V_E & \delta V_D & \varphi_N & \varphi_E & \varphi_D & \nabla_x & \nabla_y & \nabla_z & \varepsilon_x & \varepsilon_y & \varepsilon_z \end{bmatrix}^T\tag{27}$$
The measurement vector is given in Equation (28).
$$\delta z = \begin{bmatrix} \delta P_N & \delta P_E & \delta P_D & \delta\alpha & \delta\beta & \delta\gamma \end{bmatrix}^T\tag{28}$$
As with the loosely coupled GPS/INS integrated navigation method, the method in [14] has redundancy in the navigation output: the vision navigation system can provide a stand-alone navigation output even when the INS and/or the integrated navigation system cannot. However, as described in Section 2, the vision system cannot give a navigation output when fewer than three landmarks are available on the focal plane. In this case, the performance of the integrated navigation system can deteriorate, since the measurement update cannot be performed in the integration Kalman filter. As shown in Equation (15), in the proposed method the measurement update can be performed even when only one landmark is visible on the focal plane. Only the time update of the Kalman filter is performed when no landmarks are visible at all.
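A minimal sketch of this filtering logic is given below: the time update is always carried out, while the measurement update of Equation (15) is applied only when at least one landmark row is available. The first-order discretization and all names are illustrative choices, not taken from the paper.

```python
import numpy as np

def kalman_step(x, P, F, Q, dt, H=None, dz=None, R=None):
    """One cycle of the integration Kalman filter with a variable number of
    focal-plane measurements (zero measurements => time update only)."""
    # Time update: first-order discretization of Equation (4)
    Phi = np.eye(F.shape[0]) + F * dt
    x = Phi @ x
    P = Phi @ P @ Phi.T + Q * dt
    # Measurement update (Equation (15)), skipped when no landmark is visible
    if H is not None and H.shape[0] > 0:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (dz - H @ x)
        P = (np.eye(P.shape[0]) - K @ H) @ P
    return x, P
```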
Figure 5 shows the vision sensor errors estimated by the proposed method in the simulation. It can be seen from the results that the vision sensor errors are well estimated and that the performance of the vision navigation system is improved accordingly.
In Figure 6, the navigation results of the proposed method are compared with those of the pure INS and the method in [14]. Figure 7 shows the position errors of the proposed method and the method in [14] in the north, east and down directions.
Table 3 shows the RMS errors of the pure INS, the method in [14] and the proposed method. It can be observed that the error of the pure INS grows as the navigation operation continues. It can also be observed that the method in [14] gives relatively large navigation parameter errors in the area where the number of landmarks is not enough for a vision navigation output. In this area, the proposed method gives approximately 50 times better position performance and 10 times better attitude performance than the method in [14].

4.2. Van Test

Figure 8 shows the experimental setup and the reference navigation system. The experimental setup consists of a camera and an IMU and is installed on an optical bench. The reference navigation system, a carrier-phase differential GPS (CDGPS)/INS integrated navigation system, is installed together with it. The outputs of the reference navigation system are regarded as true values in the evaluation of the experimental results. A low-cost commercial camera and a micro-electro-mechanical system (MEMS) IMU, specified in Table 4 and Table 5, were used in the experiment. A database of the landmarks was built in advance with the help of large-scale maps and aerial photographs.
Figure 9 shows the vehicle's reference trajectory in the experiment. The experimental results of the proposed method were compared with those of the pure INS and the method in [14]. Figure 10 and Figure 11 show the navigation results, and Table 6 shows the errors of the experiment. As with the computer simulation results, it can be seen that the proposed method provides reliable solutions, with approximately five times better positioning performance than the method in [14], even in poor vision environments.

5. Concluding Remarks and Further Studies

This paper proposed an inertial and landmark-based vision integrated navigation method that uses the focal plane measurements of landmarks. An integration model was derived so that the raw measurements on the focal plane can be used in the integration Kalman filter. The proposed method was verified through computer simulations and van tests. Its performance was compared with that of another integration method which uses a vision navigation output, i.e., the position and attitude output of a vision navigation system. It was observed from the results that the proposed system gives reliable navigation outputs even when the number of landmarks is not sufficient for vision navigation.
An integration method that uses continuous images to improve navigation performance and an integration model that efficiently detects and recognizes landmarks will be studied. As future work, other filtering methods such as the particle filter, the unscented Kalman filter and artificial neural network-based filtering, as well as the application of the vision/INS integrated navigation system to sea navigation, can be considered.

Author Contributions

Youngsun Kim and Dong-Hwan Hwang proposed the idea of this paper; Youngsun Kim designed and performed the experiments; Youngsun Kim and Dong-Hwan Hwang analyzed the experimental data and wrote the manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Biezad, D.J. Integrated Navigation and Guidance Systems, 1st ed.; American Institute of Aeronautics and Astronautics, Inc.: Reston, VA, USA, 1999; pp. 139–150.
2. Groves, P.D. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems, 1st ed.; Artech House: Boston, MA, USA, 2008; pp. 361–469.
3. Ehsan, S.; McDonald-Maier, K.D. On-Board Vision Processing for Small UAVs: Time to Rethink Strategy. arXiv 2015, arXiv:1504.07021. Available online: https://arxiv.org/abs/1504.07021 (accessed on 10 October 2016).
4. Groves, P.D. The PNT boom: Future trends in integrated navigation. Inside GNSS 2013, 44–49.
5. Bryson, M.; Sukkarieh, S. Building a robust implementation of bearing-only inertial SLAM for a UAV. J. Field Robot. 2007, 24, 113–143.
6. Veth, M.J. Navigation using images, a survey of techniques. J. Inst. Navig. 2011, 58, 127–139.
7. Borenstein, J.; Everett, H.R.; Feng, L. Where Am I? Sensors and Methods for Mobile Robot Positioning, 1st ed.; University of Michigan: Ann Arbor, MI, USA, 1996.
8. Thompson, W.B.; Henderson, T.C.; Colvin, T.L.; Dick, L.B.; Valiquette, C.M. Vision-based localization. In Proceedings of the 1993 Image Understanding Workshop, Maryland, MD, USA, 18–21 April 1993; pp. 491–498.
9. Betke, M.; Gurvits, L. Mobile robot localization using landmarks. IEEE Trans. Robot. Autom. 1997, 13, 251–263.
10. Chatterji, G.B.; Menon, P.K.; Sridhar, B. GPS/machine vision navigation systems for aircraft. IEEE Trans. Aerosp. Electron. Syst. 1997, 33, 1012–1025.
11. Scaramuzza, D. Visual odometry [tutorial]. IEEE Robot. Autom. Mag. 2011, 18, 80–92.
12. George, M.; Sukkarieh, S. Camera aided inertial navigation in poor GPS environments. In Proceedings of the 2007 IEEE Aerospace Conference, Big Sky, MT, USA, 3–10 March 2007; pp. 1–12.
13. Tardif, J.P.; George, M.; Laverne, M. A new approach to vision-aided inertial navigation. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 4161–4168.
14. Kim, Y.S.; Hwang, D.-H. INS/vision navigation system considering error characteristics of landmark-based vision navigation. J. Inst. Control Robot. Syst. 2013, 19, 95–101.
15. Yue, D.X.; Huang, X.S.; Tan, H.L. INS/VNS fusion based on unscented particle filter. In Proceedings of the 2007 International Conference on Wavelet Analysis and Pattern Recognition, Beijing, China, 2–4 November 2007; pp. 151–156.
16. Wang, W.; Wang, D. Land vehicle navigation using odometry/INS/vision integrated system. In Proceedings of the 2008 IEEE International Conference on Cybernetics and Intelligent Systems, Chengdu, China, 21–24 September 2008; pp. 754–759.
17. Meskin, D.G.; Bar-Itzhack, I.Y. A unified approach to inertial navigation system error modeling. J. Guid. Control Dyn. 1992, 15, 648–653.
18. Song, G.W.; Jeon, C.B.; Yu, J. Relation of Euler angle error and rotation vector error. In Proceedings of the 1997 Conference on Control and Instrumentation, Automation, and Robotics, Seoul, Korea, 22–25 July 1997; pp. 217–222.
Figure 1. Landmark measurements in the vision navigation.
Figure 2. Proposed method of the vision/INS integrated navigation system.
Figure 3. Scheme of simulation.
Figure 4. Vehicle trajectory and landmarks in simulation.
Figure 5. Camera sensor error estimation results of the simulation.
Figure 6. Navigation results of the simulation.
Figure 7. Position error of the simulation.
Figure 8. Experimental setup and reference navigation system.
Figure 9. Vehicle’s reference trajectory in the experiment.
Figure 10. Navigation results of the experiment.
Figure 11. Position error of the experiment.
Table 1. IMU specification and INS initial attitude error for simulation.

Specification | Description
Accelerometer bias | 5 mg
Accelerometer random walk | 0.1 m/s/√h
Gyro bias | 100°/h
Gyro random walk | 0.5°/√h
Data rate | 100 Hz
Roll error | 0.1°
Pitch error | 0.1°
Yaw error | 5.0°
Table 2. Vision sensor specification for simulation.

Specification | Description | Specification | Description
Focal length | 25 mm | Avg. of focal length error | 200 μm
No. of horizontal pixels | 4000 | Avg. of horizontal optical axis coordinate error | 200 μm
No. of vertical pixels | 3000 | Avg. of vertical optical axis coordinate error | 200 μm
Field of view | 90° | Focal length error (1σ) | 200 μm
Horizontal pixel pitch | 8 μm | Horizontal optical axis coordinate error (1σ) | 200 μm
Vertical pixel pitch | 8 μm | Vertical optical axis coordinate error (1σ) | 200 μm
Data rate | 10 Hz | |
Table 3. RMS navigation parameter error of the simulation.

Error | | Pure INS | Method in [14] | Proposed Method
Position error (m) | N | 1026.88 | 18.59 | 0.99
 | E | 3350.72 | 15.90 | 1.12
 | D | 1370.55 | 5.34 | 0.45
Velocity error (m/s) | N | 22.04 | 3.46 | 2.73
 | E | 22.09 | 2.88 | 3.07
 | D | 15.15 | 0.97 | 1.14
Attitude error (°) | Roll | 0.69 | 2.02 | 0.06
 | Pitch | 0.65 | 2.44 | 0.37
 | Yaw | 8.42 | 3.14 | 0.63
Table 4. IMU specification and initial attitude error for the experiment.

Specification | Description
Manufacturer | Crossbow Ltd.
Accelerometer bias | 10 mg
Accelerometer random walk | 0.1 m/s/√h
Accelerometer scaling factor error | 10,000 ppm
Gyro bias | 3600°/h
Gyro random walk | 1.0°/√h
Gyro scaling factor error | 1000 ppm
Data rate | 135 Hz
Roll error | 0.61°
Pitch error | 0.01°
Yaw error | 4.30°
Table 5. Vision sensor specification for the experiment.

Specification | | Description
Manufacturer | | Axis Ltd.
Image sensor | Sensor type | CMOS color
 | No. of horizontal pixels | 1280
 | No. of vertical pixels | 800
 | Horizontal pixel pitch | 3 μm
 | Vertical pixel pitch | 3 μm
Lens | Focal length | 1.7 mm
 | Field of view | 99°
Data rate (frame rate) | | Max 30 Hz (1.4 Hz used in the experiment)
Table 6. RMS navigation parameter error of the experiment.

Error | | Pure INS | Method in [14] | Proposed Method
Position error (m) | N | 7110.95 | 9.29 | 6.50
 | E | 1228.32 | 16.44 | 6.47
 | D | 6973.98 | 15.34 | 3.25
Velocity error (m/s) | N | 52.02 | 2.58 | 1.65
 | E | 166.06 | 4.13 | 1.87
 | D | 200.25 | 7.83 | 4.67
Attitude error (°) | Roll | 32.54 | 4.04 | 2.31
 | Pitch | 20.82 | 3.53 | 2.91
 | Yaw | 53.05 | 5.61 | 4.68
