Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera
Abstract
1. Introduction
2. Methodology
2.1. Rigorous Imaging Model for the Single Camera
2.2. Rigorous Imaging Model of the Big Virtual Camera
2.3. Image Mosaic Workflow Based on VSD and BVC
2.4. Error Analysis of the Mosaic Imaging Process
2.4.1. Consistency of the Image Positioning Accuracy
2.4.2. Influence of the Elevation Error
3. Experiments and Discussion
3.1. Experimental Data
3.2. Results and Discussion
3.2.1. Visual Evaluation
3.2.2. Geometric Accuracy Evaluation
RFM Fitting Precision
Mosaic Accuracy
Positioning Accuracy
4. Conclusions and Future Work
- 1) Based on the rigorous imaging model for the single camera, the rigorous imaging model for the big virtual camera was established, which applies exactly to the complete stitched image. In addition, the image mosaic workflow based on the virtual single detector (VSD) and the big virtual camera (BVC) was presented in detail; a sketch of the underlying model form is given after this list.
- 2) The proposed mosaicking approach requires high-accuracy camera parameters; therefore, high-accuracy on-orbit geometric calibration is a precondition for guaranteeing its effectiveness.
- 3) Benefiting from platform stability and the small relative installation error, the mosaic error in the camera overlap region can be kept within one pixel after high-accuracy on-orbit geometric calibration and relative orientation; otherwise, a local correction may be required to achieve seamless mosaicking.
- 4) Cloud coverage and complex terrain in the camera overlap region may influence homonymy point matching. Although we verified that the translation transformation in the relative orientation reduces the dependence on homonymy point quantity and quality, it still cannot be applied in all situations.
- 5) The relative installation instability of the two cameras and high-frequency platform vibration are the key obstacles to a seamless mosaic of the double-camera images; therefore, if the cameras can be mounted more rigidly on future satellite platforms, a simpler workflow without relative orientation could achieve a better mosaic result.
Acknowledgments
Author Contributions
Conflicts of Interest
References
| Information | Multispectral Sensor | Panchromatic Sensor |
|---|---|---|
| Spectral range | B1: 450–520 nm | Pan: 450–900 nm |
| | B2: 520–590 nm | |
| | B3: 630–690 nm | |
| | B4: 770–890 nm | |
| Pixel size | 40 µm | 10 µm |
| TDI-CCD number of each band | 1536 × 5 | 6144 × 5 |
| Overlapping TDI-CCD number | 95 × 4 | 380 × 4 |
| Ground sample distance | 3.24 m | 0.81 m |
| Focal length | 7785 mm | 7785 mm |
| Field angle | 2.1° | 2.1° |
| Quantization bits | 10 | 10 |
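As a quick consistency check on the table, the ground sample distances follow from the pinhole relation GSD ≈ H·p/f, assuming the nominal GaoFen-2 orbit altitude of roughly 631 km (an assumption; the altitude is not stated in the table):

```latex
% Worked check of the tabulated GSD values (assumed altitude H ≈ 631 km):
\mathrm{GSD}_{\mathrm{pan}} \approx \frac{H\,p}{f}
  = \frac{631\times 10^{3}\,\mathrm{m} \times 10\times 10^{-6}\,\mathrm{m}}{7.785\,\mathrm{m}}
  \approx 0.81\,\mathrm{m},
\qquad
\mathrm{GSD}_{\mathrm{ms}}
  = \frac{631\times 10^{3}\,\mathrm{m} \times 40\times 10^{-6}\,\mathrm{m}}{7.785\,\mathrm{m}}
  \approx 3.24\,\mathrm{m}.
```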
| Study Area | Images | Imaging Date | Roll (Degree) | Pitch (Degree) | Yaw (Degree) | * Camera A/B | Image Size (Pixels) |
|---|---|---|---|---|---|---|---|
| Songshan | Scene A | 27 October 2015 | 12.99870 | 0.00111 | 2.99763 | 431/1359 | 29,200 × 27,620 |
| Songshan | Scene B | 27 October 2015 | 12.99870 | 0.00091 | 3.00424 | 431/1359 | 29,200 × 27,620 |
| Anyang | Scene C | 20 October 2015 | −7.00335 | −0.00039 | 3.04978 | 39/98 | 29,200 × 27,620 |
| Dongying | Scene D | 16 December 2016 | −4.00286 | −0.00014 | 3.00613 | 4/23 | 29,200 × 27,620 |
| Scene | Statistic (Pixels) | Oscillating Attitude: Sample | Oscillating Attitude: Line | Smooth Attitude: Sample | Smooth Attitude: Line |
|---|---|---|---|---|---|
| A | Mean | −2.01 × 10⁻⁴ | −3.92 × 10⁻⁵ | −1.05 × 10⁻⁶ | −1.78 × 10⁻⁷ |
| | RMSE | 0.13 | 0.43 | 3.37 × 10⁻⁵ | 3.03 × 10⁻⁵ |
| | Maximum | 0.65 | 0.79 | 8.99 × 10⁻⁵ | 6.78 × 10⁻⁵ |
| | Minimum | −0.54 | −0.88 | −9.01 × 10⁻⁵ | −6.80 × 10⁻⁵ |
| B | Mean | −1.41 × 10⁻⁴ | −4.57 × 10⁻⁵ | −1.82 × 10⁻⁶ | −2.28 × 10⁻⁷ |
| | RMSE | 0.11 | 0.40 | 2.96 × 10⁻⁵ | 3.19 × 10⁻⁵ |
| | Maximum | 0.53 | 0.81 | 7.17 × 10⁻⁵ | 8.90 × 10⁻⁵ |
| | Minimum | −0.28 | −0.84 | −7.12 × 10⁻⁵ | −8.84 × 10⁻⁵ |
| C | Mean | 4.47 × 10⁻⁴ | −3.40 × 10⁻⁶ | −3.44 × 10⁻⁶ | −1.78 × 10⁻⁷ |
| | RMSE | 0.16 | 0.32 | 3.61 × 10⁻⁵ | 2.48 × 10⁻⁵ |
| | Maximum | 0.57 | 0.83 | 9.27 × 10⁻⁵ | 7.78 × 10⁻⁵ |
| | Minimum | −0.42 | −0.93 | −9.22 × 10⁻⁵ | −7.80 × 10⁻⁵ |
| D | Mean | −2.63 × 10⁻⁶ | −4.20 × 10⁻⁶ | −2.22 × 10⁻⁶ | −1.50 × 10⁻⁷ |
| | RMSE | 0.23 | 0.32 | 3.26 × 10⁻⁵ | 5.12 × 10⁻⁶ |
| | Maximum | 0.57 | 0.74 | 8.10 × 10⁻⁵ | 1.00 × 10⁻⁵ |
| | Minimum | −0.57 | −0.88 | −8.00 × 10⁻⁵ | −9.00 × 10⁻⁶ |
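The statistics above are standard residual summaries over a grid of check points. Below is a minimal sketch of how one such table row could be computed, assuming 1-D arrays of per-point sample (or line) residuals between RFM-projected and rigorous-model coordinates; the array contents here are placeholder data, not the paper's results.

```python
import numpy as np

def residual_stats(residuals: np.ndarray) -> dict:
    """Summarize fitting residuals (in pixels) as in the table above.

    residuals: 1-D array of per-check-point residuals along one image
    axis (sample or line), e.g. RFM-projected minus rigorous-model
    coordinates evaluated on a regular image/elevation grid.
    """
    return {
        "mean": float(np.mean(residuals)),
        "rmse": float(np.sqrt(np.mean(residuals ** 2))),
        "max": float(np.max(residuals)),
        "min": float(np.min(residuals)),
    }

# Hypothetical usage: res_sample would hold the per-point sample-axis
# differences between the RFM and the rigorous model (placeholder here).
res_sample = np.random.normal(0.0, 0.13, size=10_000)
print(residual_stats(res_sample))
```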
| Transformation | Error (Pixels) | Scene A: Sample | Scene A: Line | Scene B: Sample | Scene B: Line | Scene C: Sample | Scene C: Line | Scene D: Sample | Scene D: Line |
|---|---|---|---|---|---|---|---|---|---|
| Translation | Maximum | 0.59 | 0.58 | 0.59 | 0.65 | 0.52 | 0.60 | −0.55 | 0.56 |
| | Minimum | −0.60 | −0.66 | −0.58 | −0.69 | −0.61 | −0.69 | −0.55 | −0.55 |
| | RMSE | 0.13 | 0.14 | 0.13 | 0.14 | 0.14 | 0.13 | 0.14 | 0.13 |
| Affine | Maximum | 0.59 | 0.59 | 0.60 | 0.64 | 0.53 | 0.60 | −0.54 | 0.56 |
| | Minimum | −0.61 | −0.63 | −0.59 | −0.69 | −0.60 | −0.68 | −0.55 | −0.55 |
| | RMSE | 0.13 | 0.14 | 0.13 | 0.14 | 0.14 | 0.13 | 0.14 | 0.13 |
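The table compares two relative-orientation models fitted to homonymy points in the camera overlap region. The following is a minimal least-squares sketch of both models, assuming matched point arrays in image coordinates; it illustrates the fitting idea and is not the paper's implementation.

```python
import numpy as np

def fit_translation(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Translation-only relative orientation: one offset per image axis.
    src, dst: (N, 2) arrays of matched (sample, line) homonymy points."""
    return np.mean(dst - src, axis=0)  # least-squares translation

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Six-parameter affine relative orientation via least squares.
    Returns a (2, 3) matrix A such that dst ≈ [src, 1] @ A.T."""
    design = np.hstack([src, np.ones((len(src), 1))])  # (N, 3)
    coeffs, *_ = np.linalg.lstsq(design, dst, rcond=None)
    return coeffs.T

def rmse(residuals: np.ndarray) -> np.ndarray:
    """Per-axis RMSE of (N, 2) residuals, as reported in the table."""
    return np.sqrt(np.mean(residuals ** 2, axis=0))

# Hypothetical usage with placeholder matches from the overlap region:
src = np.random.rand(200, 2) * 1000
dst = src + np.array([0.4, -0.2])
t = fit_translation(src, dst)
a = fit_affine(src, dst)
design = np.hstack([src, np.ones((len(src), 1))])
print("translation RMSE:", rmse(dst - (src + t)))
print("affine RMSE:", rmse(dst - design @ a.T))
```

A translation model needs far fewer (and lower-quality) matches to estimate reliably than an affine model, which is consistent with conclusion 4) above.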
| Scene | Sensor | Absolute Positioning Accuracy: X (Pixel) | Absolute Positioning Accuracy: Y (Pixel) | Relative Positioning Accuracy: X (Pixel) | Relative Positioning Accuracy: Y (Pixel) |
|---|---|---|---|---|---|
| A | Camera A | −14.81 | −26.84 | 0.90 | 0.89 |
| | Camera B | −14.84 | −20.67 | 0.95 | 0.92 |
| | Virtual Camera by TT | −14.80 | −28.26 | 1.07 | 1.12 |
| | Virtual Camera by AT | −13.10 | −21.41 | 1.01 | 1.05 |
| B | Camera A | −13.89 | −29.56 | 0.92 | 0.89 |
| | Camera B | −14.24 | −29.21 | 0.94 | 0.91 |
| | Virtual Camera by TT | −14.15 | −29.23 | 1.05 | 1.08 |
| | Virtual Camera by AT | −13.50 | −20.86 | 1.04 | 1.02 |
| C | Camera A | 9.62 | 12.92 | 0.89 | 0.85 |
| | Camera B | 7.88 | 11.20 | 0.92 | 0.86 |
| | Virtual Camera by TT | 9.13 | 12.33 | 1.12 | 0.98 |
| | Virtual Camera by AT | 9.93 | 11.18 | 1.00 | 0.96 |
| D | Camera A | −19.19 | −23.16 | 0.91 | 0.90 |
| | Camera B | −18.51 | −29.24 | 0.92 | 0.92 |
| | Virtual Camera by TT | −19.23 | −27.27 | 1.02 | 1.08 |
| | Virtual Camera by AT | −18.10 | −28.35 | 0.98 | 0.99 |
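One plausible reading of the absolute/relative split above is that the absolute figures are mean geolocation errors at check points (a systematic offset), while the relative figures describe the dispersion once that offset is removed. The sketch below implements that reading as an explicit assumption, not the paper's definition.

```python
import numpy as np

def positioning_accuracy(errors: np.ndarray):
    """Split check-point geolocation errors (N, 2 array, in pixels)
    into an absolute component (mean systematic offset per axis) and a
    relative component (RMSE about that mean). This decomposition is an
    assumed interpretation of the table's columns.
    """
    absolute = np.mean(errors, axis=0)
    relative = np.sqrt(np.mean((errors - absolute) ** 2, axis=0))
    return absolute, relative

# Hypothetical usage with placeholder check-point errors:
errors = np.array([-14.8, -26.8]) + np.random.normal(0.0, 0.9, (100, 2))
print(positioning_accuracy(errors))
```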