Article

An Image Stitching Method for Airborne Wide-Swath Hyperspectral Imaging System Equipped with Multiple Imagers

1 Key Laboratory of Quantitative Remote Sensing Information Technology, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
2 University of Chinese Academy of Sciences, Beijing 100049, China
3 Jihua Laboratory, Foshan 528200, China
4 China Centre for Resources Satellite Data and Application, Beijing 100094, China
5 Beijing SpaceWill Info. Co., Ltd., Beijing 100089, China
6 Chinese Academy of Surveying & Mapping, Beijing 100036, China
* Author to whom correspondence should be addressed.
Remote Sens. 2021, 13(5), 1001; https://doi.org/10.3390/rs13051001
Submission received: 12 January 2021 / Revised: 28 February 2021 / Accepted: 1 March 2021 / Published: 6 March 2021
(This article belongs to the Special Issue Advances in Optical Remote Sensing Image Processing and Applications)

Abstract

The field of view (FOV) of a pushbroom hyperspectral imager is limited by the compromise between detector scale and spatial resolution requirements. Combining imagers along the sampling direction effectively expands the FOV and improves imaging efficiency. Because the overlapping area between adjacent imagers is small, stitching the images with traditional methods requires a large number of ground control points (GCPs) or additional strips, which reduces the efficiency of both image acquisition and processing. This paper proposes a new method to precisely stitch images acquired from multiple pushbroom imagers. First, a relative orientation model is built from homonymy points to recover the relative relationship between adjacent imagers. Then, a rigorous geometric imaging model is adopted to generate a seamless stitched image. Simulation data were used to verify the accuracy of the method and to quantitatively analyze the effect of different error sources. The results show that the stitching accuracy is better than two pixels. Overall, this method provides a novel solution for stitching airborne multiple pushbroom images into a seamless image with a wide FOV.

Graphical Abstract

1. Introduction

Hyperspectral imaging technology has developed over the past decades and is widely used in agriculture, vegetation, environmental monitoring, and other fields [1,2,3,4]. Airborne systems have played an important role in these fields because of their higher spatial resolution and flexibility compared to similar imagers on spaceborne platforms. The imaging modes of aerial hyperspectral imagers can be mainly divided into whiskbroom, pushbroom, and step-stare; among these, pushbroom imaging is the most widely adopted. It has the advantage of increasing pixel dwell time and improving the signal-to-noise ratio (SNR). Moreover, there is no complex mechanical scanning mechanism, so the weight and volume of the instrument are lower. However, this mode struggles to balance a wide field of view (FOV) against a narrow instantaneous field of view (IFOV) [5]. To improve the swath without reducing the spatial resolution, imager combination technology has been developed. For example, an airborne pushbroom hyperspectral imaging system with a 42° FOV was achieved by combining two imagers [6], and a wide-swath, high-resolution airborne pushbroom hyperspectral imager with a 40° FOV consisted of three imagers [7]. However, to maximize the FOV, the combined imagers usually produce images with very small overlaps. Moreover, the low stability of the aviation flight platform and the pushbroom imaging mode make image stitching more difficult. Thus, high-precision image stitching with small overlaps is of great importance for further applications, and it is necessary to investigate image stitching methods for airborne imaging systems with multiple pushbroom imagers.
Image stitching methods have been studied by many researchers, especially for spaceborne pushbroom imaging systems. The methods can be generally grouped into image-space-oriented and object-space-oriented methods [8]. The image-space-oriented method registers original images based on homonymy points extracted from overlaps. Generally, simple transformation models, such as the affine transformation, are used for original image registration [9,10,11,12,13,14]. Since this method does not provide a rigorous imaging model, the stitched images are difficult to rectify geometrically, which limits further processing and applications [15]. In addition, an airborne platform usually exhibits larger internal image distortion due to its lower stability compared with a space platform; i.e., the attitude changes drastically within a few imaging lines, so that homonymy points acquired at inconsistent imaging times show distortions of different scale and direction in adjacent images. It is difficult to register images with such irregular internal distortions. Overall, this method is not suitable for stitching airborne pushbroom images.
The object-space-oriented method establishes the relationship between the stitched image and each original sub-image by rigorous geometric models. Spaceborne multi-CCD (charge-coupled device) images have been stitched based on a virtual projection plane [15,16,17,18]. These methods assume that the stitched image is observed by a virtual CCD on the focal plane; then, based on the geometric sensor model, the relationship between the original image and the virtual image can be established through the ground surface. Tang et al. [19] presented an inner-FOV stitching method for spaceborne multi-CCD images based on sensor geometry and a projection plane in object space. Cheng et al. [20] proposed a high-accuracy image mosaicking approach for a spaceborne dual-camera system based on a big virtual camera. Jiang et al. [21] proposed a method to stitch images of dual cameras onboard one satellite, which recovers the relative geometric relation of the dual cameras by building a relative calibration model. Researchers have also pointed out that the consistency of the image positioning accuracy between adjacent single images is the foundation of geometric and stitching accuracy [18,19,20]. These accuracies are ensured by on-orbit high-accuracy geometric calibration of each sensor, which may be better than 0.3 pixels; in other words, a dedicated calibration field is required to ensure the calibration accuracy of each camera. For airborne pushbroom sensors, several calibration methods have been proposed based on many ground control points (GCPs) or calibration strips with large overlaps [22,23,24,25,26,27]. The calibration accuracy is about two pixels. However, to perform the calibration, a sufficient number of GCPs should be distributed evenly in the coverage area of each imager, or calibration strips should be flown for each imager, which means multiple complicated data acquisition and calibration processes must be carried out for each camera separately, increasing both monetary and time costs.
For airborne pushbroom image stitching, Zhang [28] proposed a method that performs image registration after geometric rectification. The images obtained by each camera are first georeferenced with the initial parameters. Then, tie points of adjacent georectified images are extracted for registration. However, all groups of images must be georectified separately and registered successively, which limits data processing efficiency.
In this paper, we propose a novel method for stitching images from an airborne imaging system equipped with multiple pushbroom imagers. First, the relative orientation relationship of adjacent imagers is established from homonymy points and a digital elevation model (DEM). Then, a seamless stitched image is produced based on the rigorous geometric model. The advantage of this method is that calibration of the relative orientation parameters with GCPs or calibration strips is no longer needed. The validation of the proposed method and the error analysis were performed in experiments with simulation data.

2. Materials and Methods

2.1. Materials

The Airborne Wide Swath and High-Resolution Hyperspectral Imaging System (AWSHRHIS) considered in this paper, currently under development, is composed of three pushbroom hyperspectral imagers operating in the visible and near-infrared bands. The design parameters of AWSHRHIS are shown in Table 1. The three subsystems cover the left, middle, and right FOV, respectively, and the total FOV is designed to be 30 degrees with a 0.11 mrad IFOV. The hyperspectral imagers are mounted on a three-axis stabilized platform, so the influence of various disturbances on the imaging sensors can be mostly isolated.
Navigation sensors record the navigation parameters of the platform, including position and attitude. The quality of the geometric rectification using the navigation parameters depends on the accuracy of these sensors. The inertial measurement unit (IMU) is mounted together with the imaging system inside the stabilized platform to measure the attitude. The Global Navigation Satellite System (GNSS) antenna is mounted outside the aircraft to measure the position. A synchronization unit drives the three imagers to expose simultaneously. For each image line collected, the GPS timestamp is written into the header file of the raw image.
The imaging principle is shown in Figure 1. The middle imager points vertically downward with a ground field of view $A_CB_C$. The left and right imagers are tilted inwards, with ground fields of view $A_LB_L$ and $A_RB_R$, respectively. The total FOV of AWSHRHIS is $A_RB_L$, in which $A_LB_C$ is the overlap between the middle and right images, and $A_CB_R$ is the overlap between the middle and left images.
AWSHRHIS operates in pushbroom scanning mode. With the movement of the flight platform, three image strips are obtained with small overlaps. Due to installation alignment errors, the FOVs of the subsystems exposed at the same time are misaligned on the ground, as shown in Figure 2. The boresight misalignments induce inconsistent imaging times for the same object in the image overlaps. It is hard to eliminate the influence of platform vibration, especially for an airborne platform with low stability, which usually induces obvious image distortion. The inconsistent imaging times of homonymy points cause distortions of different scale and direction in the images. Therefore, eliminating image distortion based on the geometric imaging model is an important step toward seamless image stitching.

2.2. Rigorous Imaging Model

Each scan line of a pushbroom image is an independent image and has its own position and orientation due to platform motion. Therefore, image georectification has to be performed line by line for each image. The IMU/GNSS measurements are converted into the exterior orientation elements associated with each scan line. After a series of coordinate transformations, the image points of each line can be projected from the image coordinate system to the mapping coordinate system. The coordinate systems involved are: (i) the imaging space coordinate system (c); (ii) the sensor coordinate system (s); (iii) the IMU coordinate system (b); (iv) the navigation coordinate system (g); (v) the Earth-centered fixed coordinate system (e), for which the WGS84 geodetic system is used in this paper; and (vi) the mapping coordinate system (m) [29].
The rigorous geometric imaging model can be represented by the following formula:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \lambda R_{c,i}^{m} \begin{bmatrix} x - x_0 \\ y - y_0 \\ -f \end{bmatrix} + T_{c,i}^{m}, \tag{1}$$
where

$$R_{c,i}^{m} = R_{e}^{m} R_{g}^{e} R_{b}^{g} R_{s}^{b} R_{c}^{s} = \begin{bmatrix} a_{i,1} & a_{i,2} & a_{i,3} \\ b_{i,1} & b_{i,2} & b_{i,3} \\ c_{i,1} & c_{i,2} & c_{i,3} \end{bmatrix}, \tag{2}$$

$$T_{c,i}^{m} = R_{e}^{m} R_{g}^{e} R_{b}^{g} \left( T_{s}^{b} + R_{s}^{b} T_{c}^{s} \right) + T_{\mathrm{GPS}} - T_{m} = \begin{bmatrix} X_{i,S} \\ Y_{i,S} \\ Z_{i,S} \end{bmatrix}, \tag{3}$$
where $[X\ Y\ Z]^T$ are the coordinates of the image point $(x, y)$ imaged at time $i$ in the mapping coordinate system. For pushbroom images, when the $x$ direction of the imaging space coordinate system is defined as the sampling direction, the $y$ coordinate is always equal to 0. $\lambda$ is a scale factor. $R_c^s$ is the rotation matrix from the imaging space coordinate system of each imager to the sensor coordinate system; $R_s^b$ from the sensor coordinate system to the IMU coordinate system; $R_b^g$ from the IMU coordinate system to the navigation coordinate system; $R_g^e$ from the navigation coordinate system to the Earth-centered fixed coordinate system; and $R_e^m$ from the Earth-centered fixed coordinate system to the mapping coordinate system. $T_c^s$ is the lever arm between the imaging space coordinate system and the sensor coordinate system, and $T_s^b$ is the lever arm between the sensor coordinate system and the IMU coordinate system. $T_{\mathrm{GPS}}$ is the platform position measured by GPS at imaging time $i$, and $T_m$ is the origin of the mapping coordinate system.
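As a minimal numerical sketch of how Equations (1)–(3) are evaluated per scan line (our illustration, not the authors' code; all names are assumptions, and the scale factor is assumed to come from a separate ray-terrain intersection):

```python
import numpy as np

def rigorous_model(x, y, x0, y0, f, R_em, R_ge, R_bg, R_sb, R_cs,
                   T_sb, T_cs, T_gps, T_m, lam):
    """Project one image point of one scan line to the mapping frame.

    The rotation chain and lever arms follow Equations (2) and (3);
    lam is the scale factor along the viewing ray, found separately by
    intersecting the ray with the terrain. All 3x3 matrices and 3-vectors
    are numpy arrays; names are illustrative.
    """
    R_cm = R_em @ R_ge @ R_bg @ R_sb @ R_cs                         # Equation (2)
    T_cm = R_em @ R_ge @ R_bg @ (T_sb + R_sb @ T_cs) + T_gps - T_m  # Equation (3)
    ray = np.array([x - x0, y - y0, -f])                            # ray in imaging space
    return lam * (R_cm @ ray) + T_cm                                # Equation (1)
```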

2.3. Relative Orientation Model Based on DEM

After the installation of the imaging system, the relative orientation relationships between adjacent imagers are fixed during the flight. Ideally, the roll angle between adjacent imagers should equal the designed angle, and the pitch and heading axes of the three imagers should be parallel. However, due to assembly errors, the boresights are not strictly aligned as designed.
Therefore, we propose a relative orientation method based on homonymy points to calculate the misalignment angles between adjacent imagers. The baseline between adjacent imagers is very small (below 20 cm in AWSHRHIS). Thus, the base-height ratio is much smaller than that expected in conventional mapping photography. The small parallactic angle reduces the height measuring accuracy of the stereo model; i.e., the coordinates of ground points calculated by forward intersection are unreliable. To solve this problem, DEM data are introduced to improve the accuracy.
The middle imager is taken as the reference imager, and an adjacent imager as the target imager. The sensor coordinate system is thus parallel to the image space coordinate system of the middle imager. The unknowns are the three orientation parameters $(\omega, \varphi, \kappa)$ between the image space coordinate systems of the target and the reference imager. $p_R$ and $p_T$ are a pair of homonymy points (as shown in Figure 3), obtained by the reference imager and the target imager at times $t_i$ and $t_j$, respectively. The corresponding coordinates $a_{R,i}$ and $a_{T,j}$ in the image space coordinate systems are defined by
$$a_{R,i} = \begin{bmatrix} u_{R,i} \\ 0 \\ -f_R \end{bmatrix}, \quad a_{T,j} = \begin{bmatrix} u_{T,j} \\ 0 \\ -f_T \end{bmatrix}, \tag{4}$$
where $u_{R,i}$ and $u_{T,j}$ are the $x$ coordinates in the reference and target image, and $f_R$ and $f_T$ are the focal lengths of the reference and target imager.
The point $p_R$ is transformed from the image space coordinate system to the object space coordinate system by Equation (5):
$$A_{R,i} = \begin{bmatrix} x_{R,i} \\ y_{R,i} \\ z_{R,i} \end{bmatrix} = R_{c,i}^{m} \begin{bmatrix} u_{R,i} \\ 0 \\ -f_R \end{bmatrix} + T_{c,i}^{m}, \tag{5}$$
$M$ in Figure 3 is the ground point corresponding to the image point $p_R$. Its coordinate $X_{R,i}$ in the object space coordinate system is calculated by Equation (6), i.e., the intersection between the look direction and the topographic surface:
$$X_{R,i} = \frac{H}{z_{R,i}} A_{R,i}, \tag{6}$$
where $H$ is the height difference between the perspective center $S_{R,i}$ and the ground point $M$. The elevation of $M$ is calculated by bilinear interpolation of the DEM, and iteration is required to determine the point coordinates.
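The iteration can be sketched as follows (an illustrative reading of Equation (6), assuming the ray is expressed relative to the perspective center and `dem_interp` is a hypothetical bilinear DEM interpolator):

```python
import numpy as np

def intersect_dem(S, ray, dem_interp, h0, tol=0.1, max_iter=20):
    """Iteratively intersect a viewing ray with the terrain.

    S: perspective center in the object frame; ray: viewing direction in
    the same frame, with a negative z component (looking down); h0: initial
    terrain height guess (e.g. the average elevation). Returns the ground
    point once the interpolated DEM height stabilizes within tol meters.
    """
    h = h0
    for _ in range(max_iter):
        H = S[2] - h                    # height above the assumed terrain
        P = S + (H / -ray[2]) * ray     # scale the ray so that P[2] == h
        h_new = dem_interp(P[0], P[1])  # re-read the DEM at the footprint
        if abs(h_new - h) < tol:
            break
        h = h_new
    return P
```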
With the rotation matrix $R_{m,j}^{c}$ and offset $T_{m,j}^{c}$ obtained from the IMU/GNSS data, the coordinates $X_{R,j}$ in the image space coordinate system at time $t_j$ are determined by
$$X_{R,j} = R_{m,j}^{c} X_{R,i} + T_{m,j}^{c}, \tag{7}$$
With the relative orientation matrix $R_c^s$ and offset $T_c^s$ from the target camera to the reference camera, the coordinates $X_{T,j}$ in the image space coordinate system of the target camera are defined by
$$X_{T,j} = R_{c}^{s} X_{R,j} + T_{c}^{s}, \tag{8}$$
where $R_c^s$ is constructed from three sequential rotations: $\omega$ about the X-axis, $\varphi$ about the once-rotated Y-axis, and $\kappa$ about the twice-rotated Z-axis. $T_c^s$ is measured before the flight experiment.
Then, the coordinates $a_{T,j}'(u_T', v_T', -f_T)$ of the image point $p_T'$ corresponding to the ground point $M$ are determined by
$$a_{T}' = \begin{bmatrix} u_T' \\ v_T' \\ -f_T \end{bmatrix} = -\frac{f_T}{Z_{T,j}} X_{T,j}, \tag{9}$$
that is,

$$u_T' = -f_T \frac{X_{T,j}}{Z_{T,j}}, \quad v_T' = -f_T \frac{Y_{T,j}}{Z_{T,j}}, \tag{10}$$
Theoretically, the points $p_T$ and $p_T'$ should coincide, but a deviation usually exists between the calculated and the actual image position due to the projective compensation of systematic errors, which is governed by the relative orientation parameters. A least squares solution can be performed to minimize the distance between $p_T$ and $p_T'$ in Equation (11) and solve for the relative orientation elements:
$$\Delta = \begin{bmatrix} u_{T,j} - u_T' \\ 0 - v_T' \end{bmatrix} = \begin{bmatrix} u_{T,j} + f_T X_{T,j} / Z_{T,j} \\ f_T Y_{T,j} / Z_{T,j} \end{bmatrix}, \tag{11}$$
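One way to implement this is to stack the residuals of Equation (11) over all homonymy point pairs and minimize them over $(\omega, \varphi, \kappa)$. The sketch below is our assumption of such a solver: it uses SciPy's ordinary least squares, whereas Section 2.4.2 describes a weighted variant; all helper names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(angles, pts_M, R_mc, T_mc, u_obs, f_T, T_cs):
    """Stacked residuals of Equation (11) for all homonymy point pairs.

    pts_M: ground points traced from the reference image (Equation (6));
    R_mc, T_mc: object-to-image transforms of the reference camera at the
    target imaging times t_j; u_obs: measured x coordinates in the target
    image; angles: (omega, phi, kappa) in radians.
    """
    R_cs = Rotation.from_euler("XYZ", angles).as_matrix()  # sequential rotations
    res = []
    for M, R, T, u in zip(pts_M, R_mc, T_mc, u_obs):
        X_R = R @ M + T                           # Equation (7)
        X_T = R_cs @ X_R + T_cs                   # Equation (8)
        res.append(u + f_T * X_T[0] / X_T[2])     # Delta u, Equation (11)
        res.append(f_T * X_T[1] / X_T[2])         # Delta v, Equation (11)
    return np.asarray(res)

# sol = least_squares(residuals, np.zeros(3),
#                     args=(pts_M, R_mc, T_mc, u_obs, f_T, T_cs))
```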

2.4. Workflow

As illustrated in Figure 4, the stitching method based on relative orientation and the rigorous geometric model is performed via the following steps. Image radiometric processing and IMU/GNSS data interpolation were performed first. Then, the relative orientation elements between adjacent imagers were calculated. A rigorous imaging model was established with the relative orientation elements, and georectification was performed to reduce the image distortion, which is mainly caused by unstable flight conditions. Finally, image resampling and stitching were performed for each band of the image.

2.4.1. Step 1: Preprocessing

Images and support data were prepared for image stitching in this step. First, the position and attitude data measured by IMU/GNSS were interpolated at the GPS timestamps of the scan lines. Second, the interpolated position and attitude data were converted into the exterior orientation elements of each scan line of the original images. Meanwhile, radiometric processing of the hyperspectral images was performed using pixel-based dark current calibration and relative radiometric correction.
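A minimal sketch of the navigation interpolation, assuming simple linear interpolation between IMU/GNSS records (illustrative names; heading angles that wrap through 360° would need unwrapping first):

```python
import numpy as np

def interp_nav(line_times, nav_times, nav_vals):
    """Interpolate navigation records to per-scanline values.

    line_times: GPS timestamps of each image line (from the raw header);
    nav_times: timestamps of the IMU/GNSS records; nav_vals: one column
    per parameter (e.g. X, Y, Z, roll, pitch, heading).
    """
    return np.column_stack([np.interp(line_times, nav_times, nav_vals[:, k])
                            for k in range(nav_vals.shape[1])])
```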

2.4.2. Step 2: Relative Orientation

The scale-invariant feature transform (SIFT) algorithm was implemented to extract and match homonymy points between the target image and the reference image, and the random sample consensus (RANSAC) algorithm was used to remove outliers. Then, the relative orientation model described in Section 2.3 was applied: taking the minimization of the projection difference between the homonymy point $(u_{T,j}, 0)$ and the projected point $(u_T', v_T')$ as the optimization criterion, the error equation was established, and the relative orientation elements were calculated by the weighted least squares method.
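A sketch of the matching step with OpenCV is given below. The paper names only SIFT and RANSAC, so the ratio test, the fundamental-matrix RANSAC model, and the thresholds are our assumptions:

```python
import cv2
import numpy as np

def match_homonymy_points(img_ref, img_tgt, ratio=0.75, thresh=3.0):
    """Extract SIFT tie points in the overlap and reject outliers with RANSAC."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img_ref, None)
    k2, d2 = sift.detectAndCompute(img_tgt, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]  # Lowe's ratio test
    p1 = np.float32([k1[m.queryIdx].pt for m in good])
    p2 = np.float32([k2[m.trainIdx].pt for m in good])
    _, mask = cv2.findFundamentalMat(p1, p2, cv2.FM_RANSAC, thresh, 0.99)
    inliers = mask.ravel() == 1
    return p1[inliers], p2[inliers]
```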

2.4.3. Step 3: Image Stitching

After relative orientation, the rigorous geometric model was established for each imager. Then image stitching was performed together with the image georectification.
Before image stitching and georectification, the coverage of the stitched image was determined. First, the points $(x_i, y_i)_j$, $i = 1, 2, 3, 4$, where $j$ denotes the imager number, at the two ends of the first and last scan lines were selected from the original image of each imager. Their object space coordinates were then calculated by Equation (1), taking the height $Z$ as the average elevation $\bar{H}$; this yields the edge points $(X_i, Y_i)_j$ of original image $j$. The coverage of the stitched image was then defined as the minimum enclosing rectangle, whose corners $(X_{\max}, Y_{\max})$ and $(X_{\min}, Y_{\min})$ were obtained by comparing all edge points $(X_i, Y_i)_j$.
Then, the relationship between pixels of the stitched image and those of the original images was established. From the original image space coordinates $(x_i, y_i)_j$ and the object space coordinates $(X_i, Y_i)_j$, the affine transformation between each original image and the stitched image was established as

$$\begin{cases} x = A_1 X + B_1 Y + C_1 \\ y = A_2 X + B_2 Y + C_2 \end{cases} \tag{12}$$
For the ground coordinates $(X, Y)$ of each pixel in the stitched image, the image coordinates in each original image were first estimated using Equation (12). By judging whether the point falls inside an image, the image number $n$ was determined, and the initial image coordinates $(x, y)$ in original image $n$ were obtained. The best scan line $y$ of the ground point was then searched iteratively, and the final image coordinates $(x, y)$ were calculated by backward projection from the IMU/GNSS data of the corresponding scan line and the object space coordinates $(X, Y, Z)$, where $Z$ is interpolated from the reference DEM.
For each grid unit in the stitched image, the corresponding pixel $(x, y, n)$ in the original images was thus obtained, and backward projection and resampling were performed to generate the stitched image.
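The per-pixel logic of Step 3 can be summarized by the sketch below; `dem_interp` and `backproject` (the iterative best-scanline search against the rigorous model) are hypothetical hooks, and nearest-neighbour resampling stands in for the actual resampling kernel:

```python
def stitch_pixel(X, Y, affines, imgs, dem_interp, backproject):
    """Fill one grid unit of the stitched image.

    affines: per-imager coefficients (A1, B1, C1, A2, B2, C2) of
    Equation (12); backproject(n, X, Y, Z, y0) refines the scanline for
    imager n starting from the affine initial value y0.
    """
    for n, (A1, B1, C1, A2, B2, C2) in enumerate(affines):
        x0 = A1 * X + B1 * Y + C1            # initial image coordinates
        y0 = A2 * X + B2 * Y + C2
        h, w = imgs[n].shape[:2]
        if 0 <= x0 < w and 0 <= y0 < h:      # the point falls inside imager n
            Z = dem_interp(X, Y)             # elevation from the reference DEM
            x, y = backproject(n, X, Y, Z, y0)
            return imgs[n][int(round(y)), int(round(x))]  # nearest-neighbour DN
    return 0                                 # outside all three imagers
```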

3. Results

It is vital to analyze and evaluate the feasibility and effectiveness of the model. In this paper, experiments based on simulated data were conducted to verify the proposed method and assess the accuracy of the stitched image. The effect of error factors was analyzed to provide guidance for system design and flight tests. In this section, we first introduce the simulation process and the major imaging factors. Then, the stitching results under different imaging conditions are shown, and a quantitative analysis of the sensitivity to different error factors is performed.

3.1. Data Sources

Since the geometric rectification and stitching processes are the same for different bands of hyperspectral data, we selected one band for simulation and analysis. A digital orthophoto map (DOM) and a DEM, which represent the true conditions of the test area, were used to provide the surface reflectance and the three-dimensional coordinate information for the imaging simulation, which was then performed along the flight trajectory. A high-resolution SuperView-1 panchromatic image of Dangchang, Gansu Province was used as the base DOM, as shown in Figure 5a. SuperView-1 is a high-resolution commercial remote sensing satellite launched by China in 2016. The ground sample distance (GSD) of the image is 0.5 m. Figure 5b shows the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEM used in the simulation; its ground resolution is 1″. The average height of the test area is about 1500 m. Real trajectory data obtained in an actual flight test were used to simulate the position of the perspective center in the WGS84 coordinate system and its attitude in the IMU body coordinate system.
The sensor imaging simulation includes two aspects: simulation of the imaging position and attitude, and simulation of the image space coordinate projection. The base trajectory data were interpolated according to the speed-height ratio of AWSHRHIS, and the positions in the trajectory were shifted to cover the range of the base DOM. The flight altitude was set to 10,000 m, so the GSD of the image was about 1 m. Image simulation amounts to single-ray backprojection. Since the three-dimensional information of the terrain is lost when it is projected into a two-dimensional image, a fixed elevation must be provided before the ray can be backprojected into the world. For each pixel of the simulated image, the image ray in the object space system was calculated with the rigorous imaging model from the position and attitude at the corresponding imaging time. Then, the iterative photogrammetric method was used to determine where the ray intersects the terrain surface; once the elevation was found, the position could be extracted. According to the object space coordinates $(X, Y)$, the digital number (DN) value in the DOM was interpolated and assigned to the simulated image. The simulated images were generated by traversing all image pixels. The parameters of each imager are the same as the design parameters shown in Table 1.
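A sketch of the per-line simulation loop is shown below; `model`, `dem_intersect`, and `dom_interp` are hypothetical hooks standing in for the rigorous model on the interpolated trajectory, the iterative ray-terrain intersection, and bilinear sampling of the base DOM:

```python
import numpy as np

def simulate_line(line, ncols, model, dem_intersect, dom_interp):
    """Render one simulated scan line by single-ray backprojection."""
    dn = np.empty(ncols, dtype=np.float32)
    for col in range(ncols):
        S, ray = model(line, col)        # perspective center and ray in object space
        X, Y, _ = dem_intersect(S, ray)  # ground footprint of the pixel
        dn[col] = dom_interp(X, Y)       # interpolated DN from the DOM
    return dn
```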
To validate the accuracy of the stitched images, two types of experiments were designed. Quantitative analyses of the sensitivity to different error factors are presented in Section 3.2, and analyses of the adaptability to different initial offsets between adjacent imagers are presented in Section 3.3.
According to the relative orientation and stitching model, the main factors that may influence the stitching accuracy include: (1) the matching accuracy of the homonymy points; (2) the elevation error of the auxiliary DEM; and (3) the accuracy of the interior and exterior orientation elements. Unlike a spaceborne platform, an airborne platform has poor stability due to vibrations of the platform itself, turbulence, and wind field conditions; it is difficult to isolate high-frequency vibration completely even with a three-axis stabilized platform. In the workflow above, the SIFT and RANSAC algorithms are used to ensure the matching accuracy of the homonymy points. However, because of the different imaging times of adjacent images, the unstable airborne platform introduces characteristic differences between the homonymy points, which decreases the matching accuracy.
It is meaningful to distinguish the errors caused by the model itself from those caused by environmental disturbances, and to know how flexible the model is. Therefore, two flight conditions were considered in this paper. The first is the stable flight condition; to simulate it, we simply set the roll and pitch angles of the attitude in the trajectory to fixed values. The second simulates the actual unstable flight condition, especially with high-frequency vibration; here, a group of actually recorded attitudes was used to simulate the original images.
The relative orientation parameters between adjacent imagers were set randomly as shown in Table 2. With the base DOM and DEM data, five groups of images of 2048 lines each were simulated. Figure 6 shows a group of images from the right, middle, and left imagers in the stable flight condition. The overlap area covers about 140 pixels, i.e., about 7% of the total pixels of the left and right images.
To analyze the robustness of the method under the actual unstable flight condition, a variety of experiments with different initial values of the relative orientation parameters were designed. The base data for the simulation, such as the DOM, DEM, and trajectory data, are the same as in the experiments above. Imager misalignment angles between −0.15° and 0.15° were then added randomly as initial values for image simulation. A comprehensive analysis was performed later to evaluate the adaptability of the proposed method.

3.2. Analysis Corresponding to Different Error Sources

Simulation data under stable and unstable flight conditions were used for quantitative analysis. In each condition, we analyzed the influence of individual error sources, namely the DEM, the interior orientation, and the IMU/GNSS measurements, followed by a comprehensive analysis of all these errors together. In total, 10 groups of experiments were designed for the first type of experiment, as shown in Table 3. Relative orientation was carried out considering the error sources, and the resulting relative orientation parameters were used to stitch the images. The quality of the stitched image was assessed by two indices: (1) the relative orientation residual and (2) the stitching accuracy of the overlapping area.

3.2.1. Error Analysis in Stable Flight Condition

The analysis in the stable flight condition comprises five tests: image processing (1) in the ideal case without data errors, (2) with DEM elevation error, (3) with interior orientation error, (4) with IMU/GNSS measurement error, and (5) with all the errors above.
(1)
Ideal Case
In order to verify the accuracy of the model, relative orientation and image stitching were processed without any error. Figure 7 shows the stitched image generated from the images in Figure 6.
The image projection residual calculated by Equation (11) was used to evaluate the precision of the relative orientation. Figure 8a shows the image projection residuals of the left-middle images after relative orientation. A total of 639 pairs of homonymy points were matched, and the residual error is less than 0.8 pixels. The RMSE (root mean square error) of the residuals across track and along track is 0.103 pixels and 0.114 pixels, respectively. Figure 8b shows the image projection residuals of the right-middle images. A total of 728 pairs of homonymy points were matched. The RMSE of the residuals across track and along track is 0.097 pixels and 0.101 pixels, respectively.
It is difficult to quantify the stitching accuracy from the seamline of the stitched image. In this paper, the mosaic accuracy was evaluated by the consistency of homonymy points in the overlap of adjacent images, calculated as follows. First, adopting the stitched image coverage as the frame, the three original images were each projected to this frame based on the rigorous geometric model, generating three georectified images in the same object coordinate system. Ideally, homonymy points in adjacent georectified images should have the same image coordinates. A number of homonymy point pairs in adjacent georectified images were then extracted automatically using the SIFT algorithm, and the biases of the point pair locations were used to evaluate the stitching accuracy.
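Given the matched point pairs, the consistency measure reduces to a per-axis RMSE of their coordinate differences, e.g. (illustrative helper):

```python
import numpy as np

def stitching_rmse(pts_a, pts_b):
    """RMSE of homonymy point coordinates between two georectified images
    that share the stitched-image frame; ideally the coordinates coincide,
    so this measures the stitching error per axis, in pixels."""
    d = np.asarray(pts_a, float) - np.asarray(pts_b, float)
    return np.sqrt(np.mean(d ** 2, axis=0))  # (RMSE_X, RMSE_Y)
```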
Table 4 shows the stitching errors of the left-middle and right-middle images. The stitching error of the left-middle images is 0.049 pixels and 0.038 pixels in the X and Y directions, and that of the right-middle images is 0.045 pixels and 0.033 pixels; both are less than 0.1 pixels. Within the ideal stable flight model, the RMSE of the relative orientation residuals and the stitching error are both less than 0.1 pixels. Since the rigorous imaging model is used in this process, the small residual error is probably caused by the homonymy point matching error.
(2)
Effect of DEM Elevation Error
To avoid the low accuracy of forward intersection caused by the small parallactic angle, a reference DEM was used in the relative orientation; it was also used for georectification. Therefore, the elevation accuracy of the reference DEM is one of the factors influencing image stitching. In this test, random elevation errors following a normal distribution were added to the DEM used in image stitching, and the relative orientation residual and the stitching error were analyzed, taking the left-middle image as an example. The results are shown in Table 5 and Table 6.
Generally speaking, the effect of the elevation accuracy is insignificant and mainly affects the accuracy along track. Even if the elevation error increases to 100 m, the stitching RMSE is no more than 0.2 pixels.
(3)
Effect of Interior Orientation Error
The interior orientation parameters directly affect the metric accuracy. They include the location of the principal point, the focal length, and the lens distortion. The interior orientation error increases with radial distance; i.e., the distortion at the image edge is more significant. The proposed method relies on the small overlap of adjacent images, which is located at the image edge where the distortion is larger. Therefore, it is necessary to analyze the effect of the interior orientation on image stitching. At present, laboratory geometric calibration methods for linear array cameras are mature, with a precision of about one pixel using precise angle measurement [30]. In this test, 1~3 pixels of distortion were introduced into the image stitching. Table 7 and Table 8 list the relative orientation residuals and stitching errors.
The results show that the interior orientation error mainly affects the accuracy across track. With an interior orientation error of three pixels at the edge, the stitching error is less than 0.2 pixels.
(4)
Effect of IMU/GNSS Measurement Accuracy
The measurement accuracy of the IMU/GNSS system, which measures the position and attitude of the platform, directly affects the accuracy of the exterior orientation elements of pushbroom images. Typical IMU/GNSS systems are the POS AV 510 and 610 made by Applanix. The post-processed position accuracy of the POS AV 510 is 0.3 m, its pitch and roll accuracy is 0.005°, and its yaw accuracy is 0.008°; for the POS AV 610, the corresponding values are 0.3 m, 0.0025°, and 0.005°. Accordingly, normally distributed errors were added to the IMU/GNSS data used in stitching. Taking the left-middle image as an example, the relative orientation residual and the stitching error were analyzed; the results are shown in Table 9 and Table 10.
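The error injection can be sketched as follows, assuming the navigation record is an array with columns [X, Y, Z, roll, pitch, yaw] and using the nominal POS AV 510 accuracies as standard deviations (our illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_nav(nav, sigma_pos=0.3, sigma_rp=0.005, sigma_yaw=0.008):
    """Add zero-mean Gaussian noise at the POS AV 510 level:
    0.3 m position, 0.005 deg roll/pitch, 0.008 deg yaw."""
    noisy = nav.copy()
    noisy[:, 0:3] += rng.normal(0.0, sigma_pos, nav[:, 0:3].shape)
    noisy[:, 3:5] += rng.normal(0.0, sigma_rp, nav[:, 3:5].shape)
    noisy[:, 5] += rng.normal(0.0, sigma_yaw, nav[:, 5].shape)
    return noisy
```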
The results show that the measurement accuracy of the IMU/GNSS has an obvious influence on the stitching results. The stitching RMSE is about one pixel using the POS AV 510 system.
(5)
Comprehensive Analysis
In this experiment, typical errors were considered together in the processing for accuracy analysis. The DEM elevation error was set to 20 m, since the nominal accuracy of free global DEM data, such as SRTM, ASTER, and AW3D, is reported as 20 m [31,32]. The interior orientation error was set to 1 pixel, the current laboratory geometric calibration accuracy. The position and attitude measurement accuracies were set according to the POS AV 510.
Based on the method introduced above, the relative orientation and image stitching were performed. Figure 9 shows the experimental results on the simulation data. Areas 1 and 2 are the left-middle and right-middle overlapping regions, respectively. The stitching effect was checked by visual inspection: observing the continuity of ground features in the stitched regions shows that the offset of the original overlapping areas is eliminated.
Although the offset at the seamline is well corrected, the IMU/GNSS measurement error, which directly affects the georectification accuracy, reduces the final stitching accuracy. The relative orientation residual is 1.397 pixels along track and 1.388 pixels across track. The stitching error is 1.436 pixels in the X direction and 0.838 pixels in the Y direction. Because of the pitch angle between the adjacent imagers, there is an offset of about 14 lines in the flight direction, which can be effectively detected by visual evaluation. Figure 10 shows the overlapping area of adjacent images after georectification and the stitched image at the same location. Different distortions of the same ground features appear in the adjacent images because the directions of the corresponding IMU/GNSS measurement errors are inconsistent. In our test, the georectification error caused by the POS AV 510 is more than one pixel, which directly reduces the stitching accuracy.

3.2.2. Error Analysis in Unstable Flight Condition

In practice, the instability of the platform caused by the vibration of the platform itself and atmospheric turbulence should also be taken into account. A group of actually recorded IMU data was used for the simulation. Experiments EA-unstable 1~5, as listed in Table 3, were performed and analyzed.
(1)
Without Considering Any Error Factor
Based on the proposed method, relative orientation and image stitching were performed on the unstable simulated images without data errors. Figure 11a shows the image projection residuals of the left-middle images after relative orientation. Only 59 pairs of homonymy points were matched, and the residual is less than 2 pixels. The RMSE across track and along track is 0.613 pixels and 0.688 pixels, respectively. Fifty pairs of homonymy points were matched from the right-middle images, as shown in Figure 11b; the RMSE across track and along track is 0.626 pixels and 0.748 pixels, respectively.
Table 11 shows the stitching RMSE of the left-middle and right-middle images. The errors of the left-middle images in the x and y directions are 0.285 pixels and 0.126 pixels, respectively, and those of the right-middle images are 0.135 pixels and 0.102 pixels, respectively.
Compared with the stable flight state, the original images are seriously distorted by the attitude changes of the platform, as depicted in Figure 12. In this case, the matching accuracy of the homonymy points is reduced, and so is the relative orientation accuracy. After georectification, most of the geometric distortion is corrected, and the stitching accuracy is still maintained at a good level.
(2)
Influence of DEM Elevation Error
Using the original images in the unstable flight condition, random elevation errors following a normal distribution were added to the DEM used in image stitching. Table 12 and Table 13 show the relative orientation residuals and stitching errors of the left-middle images.
The results show that the DEM elevation error has little effect on image stitching. Because its effect is far smaller than that of point matching, even an elevation error of 100 m introduces no obvious stitching error.
(3)
Influence of Interior Orientation Error
In this experiment, the distortion of experiment EA-stable-3 was added to the simulated images. The results are shown in Table 14 and Table 15. The effect of the interior orientation error is smaller than that of point matching; compared with the error-free case, the stitching accuracy is only slightly reduced.
(4)
Influence of IMU/GNSS Measurement Accuracy
IMU/GNSS measurement errors were considered in this experiment. The relative orientation residuals and stitching errors are shown in Table 16 and Table 17. The decrease in stitching accuracy is evidently caused by the IMU/GNSS measurement error.
(5)
Comprehensive Analysis
The same typical error values as in EA-stable-5 were applied in the processing of the unstable flight simulation data.
The images before and after stitching are shown in Figure 13. Figure 13a shows the overlapping area of the left and middle original images, and Figure 13b shows the corresponding area of the stitched image; Figure 13c,d show the original and stitched images in the overlapping area between the right and middle images. After georectification and stitching, most of the distortion is rectified, and the offset is eliminated.
Table 18 and Table 19 show the relative orientation residuals and stitching errors in the unstable flight condition considering the elevation error, interior orientation error, and IMU/GNSS measurement error, which is close to the true circumstances. The results show that the stitching error is at the same level as that of the stable platform, i.e., about 1.5 pixels.

3.3. Accuracy Evaluation

With different initial orientation values, a total of 10 groups of simulation data were used in this experiment. The relative orientation residuals and stitching accuracies of the left-middle and right-middle images are listed in Table 20 and Table 21. The RMSE of the relative orientation residuals and the stitching errors are both less than two pixels.

4. Discussion

Assembling multiple pushbroom imagers in the sampling direction is an effective way to expand the FOV and improve imaging efficiency. To solve the problem of high-precision image stitching for a wide FOV under a weak overlap constraint and a short baseline between adjacent pushbroom imagers, a relative orientation method based on a DEM and an object-space stitching method based on the overlapping area were proposed. Based on the homonymy points of adjacent images, the relative relationship between adjacent imagers was retrieved with the assistance of DEM data, enabling geometric rectification and image stitching. According to the experimental results, the proposed method can effectively stitch the sub-images with an accuracy better than 2 pixels, even in poor flight conditions.
We compared our method with the object-space-oriented calibration method and the rectified-image registration method, using the simulation data of EA-unstable-5. Without the support of GCPs, our method achieves a stitching accuracy for the left-middle image of 1.518 pixels in the X direction and 0.761 pixels in the Y direction. (1) Comparison with the object-space-oriented calibration method: this method is commonly used in spaceborne pushbroom image stitching, and we applied its idea to airborne pushbroom image stitching. First, eight GCPs evenly distributed in the FOV of the left and right images were selected. Then, the boresight misalignment of each imager was calibrated with the support of the GCPs. Finally, direct georeferencing (DG) and image stitching were performed, and the stitching accuracy of the overlapping area was used to validate the results. Taking the left-middle image as an example, the stitching accuracy is 1.70 pixels in the X direction and 1.12 pixels in the Y direction. However, this method needs at least four GCPs evenly distributed in the FOV, or more than three strips with large overlaps, for the calibration of each imager; i.e., three times the number of GCPs or calibration strips would have to be prepared for AWSHRHIS data processing. The method proposed in this paper does not need the support of GCPs or additional strips, which saves the corresponding workload and cost and adapts well to areas where GCPs are difficult to distribute evenly. (2) Comparison with the rectified-image registration method: this method is suitable when GCPs cannot be arranged or in-flight calibration cannot be carried out. First, the images obtained by each imager were rectified using the design parameters; then, the rectified images were registered by homonymy points, and the stitching accuracy was used to validate the results. The stitching accuracy of the left-middle image is 1.45 pixels in the X direction and 0.97 pixels in the Y direction. However, in this method, the images of each sub-imager need DG separately, after which image matching and stitching are carried out one by one. In our method, image georectification and stitching can be performed together, without a matching process, after the relative orientation. Overall, our method improves data processing efficiency significantly and reaches a stitching accuracy at the same level as the traditional methods.
Although our method is based on the rigorous imaging model, the quality of image stitching is limited by the elevation accuracy of the auxiliary DEM, the imager interior orientation error, the position and attitude measurement accuracy, and the platform stability.
The difference between the actual elevation and the interpolated elevation introduces a stitching error along the flight direction [16,17,18,19,20,21]. As shown in Figure 14, suppose $A$ is a ground point located in the overlap of camera 1 and camera 2, and $\alpha_1$ and $\alpha_2$ denote the directional angles of the corresponding homonymy image points. The projection distance $D$ of the stitching error caused by the elevation error $\Delta h$ is
$$D = \Delta h \times (\tan \alpha_1 - \tan \alpha_2), \tag{13}$$
The larger the elevation error, the greater its influence. For freely available global DEM data with an elevation accuracy better than 20 m, when the intersection angle of the homonymy points in the flight direction reaches 0.5°, $D$ reaches 0.17 m, which is smaller than one pixel for most airborne spectral imagers.
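A quick check of Equation (13) with these numbers (taking the second viewing direction as vertical, purely for illustration):

```python
import math

dh = 20.0                               # DEM elevation error (m)
a1, a2 = math.radians(0.5), 0.0         # 0.5 deg intersection angle
D = dh * (math.tan(a1) - math.tan(a2))  # Equation (13)
print(f"{D:.2f} m")                     # -> 0.17 m, below a ~1 m GSD
```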
The experimental results show that the interior orientation error has a slight effect on the stitching accuracy: the error caused by an interior orientation error of three pixels at the edge is about 0.2 pixels. However, the most serious distortion usually lies in the overlapping area of the images; distortion in the sample direction biases the relative orientation parameters away from the true values and ultimately affects the georeferencing accuracy of the images. Thanks to the combined-imager design, the FOV of each sub-imager is narrow, which limits the lens distortion. Nevertheless, to obtain better geometric positioning accuracy, laboratory calibration should be performed to reduce the distortion effects.
The accuracy of the position and attitude measurements is the largest error source. The errors of the exterior orientation elements directly influence the relative orientation and georectification of each sub-image. The imaging system analyzed in this paper has a very high resolution: when the DG method is used for geometric rectification, the positioning error caused by the POS AV 510 measurement accuracy exceeds 1 pixel. The geometric distortion that cannot be eliminated completely inevitably affects the stitching accuracy. Therefore, a high-precision IMU/GNSS system is critical to ensure high-precision image stitching and georeferencing for AWSHRHIS.
The stability of the platform is also one of the main factors. Under the influence of platform jitter and the imaging time difference for the same object features, the matching accuracy of the homonymy points is greatly affected, which reduces the relative orientation accuracy and biases the relative orientation results away from the true values. Therefore, a three-axis stabilized platform and a damping strategy should be used to isolate the high-frequency vibration of the platform and reduce the internal geometric distortion of the original images as much as possible.
The method described in this paper also has some shortcomings:
(1)
As mentioned above, the relative orientation results may deviate from the true values in some cases. Although seamless stitching within one strip can be realized, the absolute geometric positioning accuracy at the edge of the stitched image may be reduced. The main reasons are the small overlapping area and the unevenly distributed points, which lower the accuracy of the estimated parameters. Building on the relative orientation model, further research should be conducted to raise the georeferencing accuracy with the assistance of one GCP at the image edge or an adjacent strip.
(2)
Actual flight test data are still lacking for verification. In the near future, flight tests will be performed, and extensive verification and analysis will be carried out.

5. Conclusions

In this paper, a relative orientation model based on narrow image overlaps was proposed to recover the relative relationship between the imagers of AWSHRHIS and to achieve seamless image stitching. Experiments with simulation data were performed to assess the accuracy of the method and to analyze its sensitivity to different error factors. Several conclusions can be drawn: (1) The proposed relative orientation and stitching method can be performed based on homonymy points and DEM data; experiments show that the stitching accuracy reaches 1~2 pixels. (2) The attitude measurement accuracy and the instability of the platform are the largest error factors. To achieve high-accuracy image processing for pushbroom imagers with high spatial resolution, high-precision IMU/GNSS equipment and a three-axis stabilized platform should be integrated to ensure the stitching and georeferencing accuracy of the images. (3) The proposed method has a stitching accuracy similar to existing methods but better applicability. It does not need a large number of evenly distributed GCPs or calibration strips, reducing the operational complexity and cost of flight tests. Meanwhile, it simplifies the process compared with the image registration method: all sub-images are stitched directly, without performing DG and registration for each image.
In the near future, the method should be further developed to improve the georeferencing accuracy. Moreover, once flight tests are carried out for verification and analysis, different image stitching methods can be compared quantitatively on the same dataset.
Overall, this method provides a new solution for image stitching in airborne hyperspectral imaging systems with multiple imagers, such as AWSHRHIS. It reduces the demand for GCPs and calibration strips, so the cost of setting up and measuring GCPs, or the additional cost of calibration flight strips, can be saved. In addition, the method is suitable for operating areas where GCPs are difficult to lay out or measure. The simulation results indicate that it can effectively improve the efficiency of airborne hyperspectral imaging and processing.

Author Contributions

Conceptualization, J.L. and C.L.; Funding acquisition, L.M.; Methodology, J.L.; L.M. and Y.F.; Software, Y.F.; X.Z. and G.S.; Validation, J.L.; N.W. and Q.H.; Writing—original draft, J.L.; N.W. and K.D.; Writing—review and editing, L.M. and L.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported partly by the National Key Research and Development Program of China under grant no. 2016YFB0500400 and the Bureau of International Co-operation, Chinese Academy of Sciences, under grant no. 181811KYSB20160040.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to the limited redistribution policy of the commercial satellite data.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ambrose, A.; Kandpal, L.M.; Kim, M.S.; Lee, W.H.; Cho, B.K. High Speed Measurement of Corn Seed Viability using Hyperspectral Imaging. Infrared Phys. Technol. 2016, 75, 173–179. [Google Scholar] [CrossRef]
  2. Clevers, J.G.P.W.; Kooistra, L. Using Hyperspectral Remote Sensing Data for Retrieving Canopy Chlorophyll and Nitrogen Content. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 574–583. [Google Scholar] [CrossRef]
  3. Moroni, M.; Lupo, E.; Marra, E.; Cenedese, A. Hyperspectral Image Analysis in Environmental Monitoring: Setup of a New Tunable Filter Platform. Procedia Environ. Sci. 2013, 19, 885–894. [Google Scholar] [CrossRef] [Green Version]
  4. Shimoni, M.; Haelterman, R.; Perneel, C. Hyperspectral Imaging for Military and Security Applications: Combining Myriad Processing and Sensing Techniques. IEEE Geosci. Remote Sens. Mag. 2019, 7, 101–117. [Google Scholar] [CrossRef]
  5. Jia, J.X.; Wang, Y.M.; Chen, J.S.; Guo, R.; Shu, R.; Wang, J.Y. Status and Application of Advanced Airborne Hyperspectral Imaging Technology: A review. Infrared Phys. Technol. 2019, 104, 103115. [Google Scholar] [CrossRef]
  6. Hu, P.X.; Lu, Q.M.; Shu, R.; Wang, J.Y. An Airborne Pushbroom Hyperspectral Imager with Wide Field of View. Chin. Opt. Lett. 2005, 3, 689–691. [Google Scholar]
  7. Zhang, D.; Yuan, L.Y.; Wang, S.W.; Yu, H.X.; Zhang, C.X.; He, D.G.; Han, G.C.; Wang, J.Y.; Wang, Y.M. Wide Swath and High Resolution Airborne Hyperspectral Imaging System and Flight Validation. Sensors 2019, 19, 1667. [Google Scholar] [CrossRef] [Green Version]
  8. Hu, F. Research on Inner FOV Stitching Theories and Algorithms for Sub-Images of Three Non-Collinear TDI CCD Chips. Ph.D. Thesis, Wuhan University, Wuhan, China, 2010. [Google Scholar]
  9. Karsten, J. Geometric and Information Potential of IRS-1C PAN-images. In Proceedings of the IEEE 1999 International Geoscience and Remote Sensing Symposium. IGARSS’99 (Cat. No.99CH36293), Hamburg, Germany, 28 June–2 July 1999; pp. 428–430. [Google Scholar]
  10. Brown, M.; Lowe, D.G. Automatic Panoramic Image Stitching using Invariant Features. Int. J. Comput. Vis. 2007, 74, 59–73. [Google Scholar] [CrossRef] [Green Version]
  11. Li, S.W.; Liu, T.J.; Wang, H.Q. Image Mosaic for TDICCD Push-broom Camera Image Based on Image Matching. Remote Sens. Technol. Appl. 2009, 24, 374–378. [Google Scholar]
  12. Lu, J.B.; He, B. Automatic Mosaic Method of Large Field View and Multi-Channel Remote Sensing Images of TDICCD Cameras. Space Sci. 2012, 32, 154–160. [Google Scholar]
  13. Meng, W.C.; Zhu, S.L.; Zhu, B.S.; Bian, S.J. The Research of TDI-CCDs Imagery Stitching Using Information Mending Algorithm. In International Symposium on Photoelectronic Detection and Imaging 2013: Imaging Sensors and Applications; International Society for Optics and Photonics: Washington, DC, USA, 2013; p. 89081C. [Google Scholar]
  14. Adel, E.; Elmogy, M.; Elbakry, H.M. Image Stitching System Based on ORB Feature-based Technique and Compensation Blending. Int. J. Adv. Comput. Sci. Appl. 2015, 6, 55–62. [Google Scholar] [CrossRef] [Green Version]
  15. Zhang, G.; Liu, B.; Jiang, W.S. Inner FOV Stitching Algorithm of Spaceborne Optical Sensor Based on the Virtual CCD Line. J. Image Graph. 2012, 17, 696–700. [Google Scholar]
  16. Tang, X.M.; Zhang, G.; Zhu, X.Y.; Pan, H.B.; Jiang, Y.H.; Zhou, P.; Wang, X.; Guo, L. Triple Linear-array Image Geometry Model of ZiYuan-3 Surveying Satellite and Its Validation. Acta Geod. Et Cartogr. Sin. 2012, 41, 191–198. [Google Scholar] [CrossRef]
  17. Wang, H.; Mo, F.; Li, Q.J.; Wang, E. Inner FOV Stitching of Spaceborne Multispectral Camera Based on Virtual CCD Line under Back Projection in Object-space. Spacecr. Recovery Remote Sens. 2019, 40, 118–125. [Google Scholar]
  18. Pan, J.; Hu, F.; Wang, M. Inner FOV Stitching of ZY-1 02C HR Camera Based on Virtual CCD Line. Geomat. Inf. Sci. Wuhan Univ. 2015, 40, 436–443. [Google Scholar]
  19. Tang, X.M.; Hu, F.; Wang, M.; Pan, J.; Jin, S.Y.; Lu, G. Inner FoV Stitching of Spaceborne TDI CCD Images Based on Sensor Geometry and Projection Plane in Object Space. Remote Sens. 2014, 6, 6386–6406. [Google Scholar] [CrossRef] [Green Version]
  20. Cheng, Y.F.; Jin, S.Y.; Wang, M.; Zhu, Y.; Dong, Z.P. Image Mosaicking Approach for a Double-camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera. Sensors 2017, 17, 1441. [Google Scholar] [CrossRef] [Green Version]
  21. Jiang, Y.H.; Xu, K.; Zhao, R.S.; Zhang, G.; Cheng, K.; Zhou, P. Stitching Images of Dual-cameras Onboard Satellite. Isprs J. Photogramm. Remote Sens. 2017, 128, 274–286. [Google Scholar] [CrossRef]
  22. Lenz, A.; Schilling, H.; Perpeet, D.; Middelmann, W. Automatic In-flight Boresight Calibration Considering Topography for Hyperspectral Pushbroom sensors. In Proceedings of the 2014 IEEE Geoscience and Remote Sensing Symposium, Quebec City, QC, Canada, 13–18 July 2014; pp. 2981–2984. [Google Scholar]
  23. Wang, M.; Hu, J.; Zhou, M.; Li, J.M.; Zhang, Z. Geometric Correction of Airborne Linear Array Image Based on Bias Matrix. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W1, 369–372. [Google Scholar] [CrossRef] [Green Version]
  24. Zhang, A.; Hu, S.; Meng, X.; Yang, L.; Li, H. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors. Remote Sens. 2015, 7, 17297–17311. [Google Scholar] [CrossRef] [Green Version]
  25. Zhao, H.T.; Zhang, B.; Zuo, Z.L.; Chen, Z.C. Boresight Misalignment and Position Offset Calibration of Push-broom Hyperspectral Sensor Integrated POS System. Geomat. Inf. Sci. Wuhan Univ. 2013, 38, 973–977. [Google Scholar]
  26. Yeh, C.K.; Tsai, V.J.D. Self-calibrated Direct Geo-referencing of Airborne Pushbroom Hyperspectral Images. In Proceedings of the 2011 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Vancouver, BC, Canada, 24–29 July 2011; pp. 2881–2883. [Google Scholar]
  27. Zhou, T.; Habib, A.; Masjedi, A.; Zhang, Z.; Flatt, J.E.; Crawford, M. Boresight Calibration of GNSS/INS-Assisted Push-Broom Hyperspectral Scanners on UAV Platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 1734–1749. [Google Scholar]
  28. Zhang, L.Y. Research and Realization on Registration Technology of Narrow Overlapping Imaging Spectrometer Data. Master’s Thesis, Shandong University, Jinan, China, 2013. [Google Scholar]
  29. Liu, J.; Wang, D.H.; Zhang, Y.S. Analysis of and Transformation between HPR and OPK Angles for The GPS /INS System. Sci. Surv. Mapp. 2006, 31, 54–57. [Google Scholar]
  30. Wu, G.D.; Han, B.; He, X. Calibration of Geometric Parameters of Line Array CCD Camera Based on Exact Measuring Angle in Lab. Opt. Precis. Eng. 2007, 15, 1628–1632. [Google Scholar]
  31. Du, X.P.; Guo, H.D.; Fan, X.T.; Zhu, J.J.; Yan, Z.Z.; Zhan, Q. Vertical Accuracy Assessment of SRTM and ASTER GDEM over Typical Regions of China Using ICESat/GLAS. Earth Sci. J. China Univ. Geosci. 2013, 38, 887–897. [Google Scholar]
  32. Takaku, J.; Tadono, T.; Tsutsui, K.; Ichikawa, M. Validation of ‘AW3D’ global DSM Generated from ALOS PRISM. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 3, 25–31. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Imaging principle of AWSHRHIS.
Figure 2. Pushbroom imaging mode of AWSHRHIS.
Figure 3. Relative orientation schematic diagram.
Figure 4. Workflow of image stitching.
Figure 5. Original DOM and DEM used in simulation: (a) SuperView-1 panchromatic image; (b) ASTER DEM.
Figure 6. Simulation images: (a) Left image; (b) Middle image; (c) Right image.
Figure 7. Stitched image in stable flight condition.
Figure 8. Relative orientation residual (without data error): (a) Residual of left-middle images; (b) Residual of right-middle images.
Figure 9. Experimental results in stable flight: (a) original and (b) updated overlapping area 1 of left-middle images; (c) original and (d) updated overlapping area 2 of right-middle images. The red rectangles mark the seam line between two adjacent images before and after stitching.
Figure 10. Locally magnified experimental results in stable flight: (a) left image before merging; (b) middle image before merging; (c) stitched image.
Figure 11. Relative orientation residual (unstable platform): (a) residual of left-middle images; (b) residual of right-middle images.
Figure 12. Overlapping areas of the original images: (a) the same region in the overlap of the left-middle original images; (b) the same region in the overlap of the right-middle original images.
Figure 13. Experimental results in unstable flight condition: (a) original and (b) updated overlapping area between left and middle images; (c) original and (d) updated overlapping area between middle and right images. The red rectangles mark the seam line between two adjacent images before and after stitching.
Figure 14. Influence of the elevation error.
Table 1. Main features of AWSHRHIS.

Parameter             Left Imager   Middle Imager   Right Imager
spectral range (µm)   0.4–0.9       0.4–0.9         0.4–0.9
tilted angle (°)      −9.4          0               9.4
FOV (°)               12.6          8               12.6
samples (pixels)      2000          1250            2000
Table 2. Simulated relative orientation parameters.

               Relative Orientation Offset (m)   Relative Orientation Angles (°)
Parameter      x       y       z                 ω       φ        κ
Right imager   0.2     0       0                 9.4     0.08     −0.13
Left imager    0       −0.2    0                 −9.4    −0.03    0.05
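As an illustration of how the parameters in Table 2 relate the adjacent imagers geometrically, the minimal Python/NumPy sketch below (not code from the paper) builds a rotation matrix and offset vector from the simulated values. The rotation order R = Rx(ω)·Ry(φ)·Rz(κ) is an assumed photogrammetric convention; the paper's exact convention may differ.

```python
import numpy as np

def rotation_matrix(omega_deg: float, phi_deg: float, kappa_deg: float) -> np.ndarray:
    """Rotation matrix from omega-phi-kappa angles (degrees).

    Assumes R = Rx(omega) @ Ry(phi) @ Rz(kappa), a common photogrammetric
    convention; the paper's exact rotation order is an assumption here.
    """
    o, p, k = np.radians([omega_deg, phi_deg, kappa_deg])
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(o), -np.sin(o)],
                   [0.0, np.sin(o), np.cos(o)]])
    ry = np.array([[np.cos(p), 0.0, np.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(p), 0.0, np.cos(p)]])
    rz = np.array([[np.cos(k), -np.sin(k), 0.0],
                   [np.sin(k), np.cos(k), 0.0],
                   [0.0, 0.0, 1.0]])
    return rx @ ry @ rz

# Relative orientation of the right imager w.r.t. the middle imager (Table 2).
R_right = rotation_matrix(9.4, 0.08, -0.13)
t_right = np.array([0.2, 0.0, 0.0])  # relative orientation offset, metres

# A viewing-ray direction in the right imager's frame is expressed in the
# middle imager's frame as R_right @ d_right; t_right shifts the projection centre.
d_middle = R_right @ np.array([0.0, 0.0, 1.0])
```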
Table 3. Experiments for error sources analysis.

Experiments     Flight Condition            Error Sources
EA-stable-1     Stable flight condition     Without any error
EA-stable-2                                 DEM elevation error
EA-stable-3                                 Interior orientation error
EA-stable-4                                 IMU/GNSS measurement error
EA-stable-5                                 Comprehensive analysis
EA-unstable-1   Unstable flight condition   Without any error
EA-unstable-2                               DEM elevation error
EA-unstable-3                               Interior orientation error
EA-unstable-4                               IMU/GNSS measurement error
EA-unstable-5                               Comprehensive analysis
Table 4. Stitching error of EA-stable-1.

               No. of            X (pix)                            Y (pix)
Image          Homonymy Points   Max. Error   Min. Error   RMSE     Max. Error   Min. Error   RMSE
left–middle    143               0.183        −0.256       0.049    0.166        −0.197       0.038
right–middle   290               0.163        −0.381       0.045    0.234        −0.195       0.033
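The statistics reported in Table 4 and in the subsequent tables (maximum error, minimum error, and RMSE of the coordinate differences at homonymy points) can be reproduced with a short routine such as the sketch below; the function name stitching_error_stats and the array layout are illustrative assumptions, not code from the paper.

```python
import numpy as np

def stitching_error_stats(pts_a: np.ndarray, pts_b: np.ndarray) -> dict:
    """Max/min/RMSE of per-point coordinate differences, as in Tables 4-19.

    pts_a, pts_b: (N, 2) arrays of (x, y) pixel coordinates of the same
    homonymy points measured in the two images being compared.
    Hypothetical helper for illustration only.
    """
    diff = np.asarray(pts_a, dtype=float) - np.asarray(pts_b, dtype=float)
    stats = {}
    for axis, name in enumerate(("X", "Y")):
        e = diff[:, axis]
        stats[name] = {
            "max": float(e.max()),
            "min": float(e.min()),
            "rmse": float(np.sqrt(np.mean(e ** 2))),
        }
    return stats

# Example: three homonymy points with sub-pixel disagreement.
a = np.array([[100.2, 50.1], [200.0, 80.4], [300.1, 120.0]])
b = np.array([[100.0, 50.0], [200.2, 80.3], [300.0, 120.2]])
print(stitching_error_stats(a, b))
```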
Table 5. Relative orientation residual of EA-stable-2 (left-middle image).

Elevation   No. of            Along Track (pix)                  Across Track (pix)
STD (m)     Homonymy Points   Max. Error   Min. Error   RMSE    Max. Error   Min. Error   RMSE
5           639               0.590        −0.472       0.114   0.496        −0.553       0.102
20          639               0.545        −0.478       0.116   0.497        −0.554       0.102
50          639               0.503        −0.482       0.134   0.495        −0.555       0.102
100         639               0.603        −0.536       0.186   0.495        −0.556       0.102
Table 6. Stitching error of EA-stable-2 (left-middle image).

Elevation   No. of            X (pix)                            Y (pix)
STD (m)     Homonymy Points   Max. Error   Min. Error   RMSE    Max. Error   Min. Error   RMSE
5           154               0.276        −0.280       0.037   0.209        −0.307       0.021
20          137               0.251        −0.209       0.054   0.172        −0.122       0.038
50          157               0.412        −0.343       0.084   0.354        −0.215       0.066
100         122               0.481        −0.474       0.135   0.201        −0.310       0.091
Table 7. Relative orientation residual of EA-stable-3 (left-middle image).

Distortion   No. of            Along Track (pix)                 Across Track (pix)
(pix)        Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
1            636               0.525        −0.560       0.109  0.452        −0.739       0.179
2            643               0.384        −0.428       0.100  0.635        −0.918       0.325
3            660               0.558        −0.447       0.110  1.039        −1.227       0.490
Table 8. Stitching error of EA-stable-3 (left-middle image).

Distortion   No. of            X (pix)                           Y (pix)
(pix)        Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
1            145               0.273        −0.090       0.057  0.062        −0.193       0.058
2            145               0.480        −0.182       0.117  0.226        −0.344       0.118
3            145               0.407        −0.156       0.142  0.176        −0.455       0.172
Table 9. Relative orientation residual of EA-stable-4 (left-middle image).

IMU/GNSS    No. of            Along Track (pix)                 Across Track (pix)
System      Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
POS AV610   611               1.809        −1.863       0.583  1.952        −1.967       0.836
POS AV510   374               2.243        −2.189       1.375  2.180        −2.256       1.367
Table 10. Stitching error of EA-stable-4 (left-middle image).

IMU/GNSS    No. of            X (pix)                           Y (pix)
System      Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
POS AV610   92                0.899        −1.266       0.433  1.060        −1.191       0.454
POS AV510   56                1.775        −2.393       0.828  0.897        −1.29        0.466
Table 11. Stitching error of EA-unstable-1.

               No. of            X (pix)                           Y (pix)
Image          Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
left–middle    44                0.872        0.003        0.285  0.397        0.003        0.126
right–middle   71                0.533        0.000        0.135  0.336        0.005        0.102
Table 12. Relative orientation residual of EA-unstable-2 (left-middle image).

Elevation   No. of            Along Track (pix)                 Across Track (pix)
STD (m)     Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
5           58                1.563        −1.098       0.581  1.284        −1.606       0.696
20          58                1.564        −1.089       0.581  1.284        −1.607       0.696
50          58                1.563        −1.081       0.585  1.284        −1.607       0.696
100         58                1.565        −1.032       0.585  1.284        −1.607       0.696
Table 13. Stitching error of EA-unstable-2 (left-middle image).

Elevation   No. of            X (pix)                           Y (pix)
STD (m)     Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
5           103               0.579        −0.706       0.222  0.273        −0.616       0.174
20          98                0.711        −0.660       0.262  0.235        −0.634       0.169
50          107               0.567        −0.677       0.242  0.390        −0.617       0.170
100         96                0.479        −0.914       0.263  0.413        −0.558       0.177
Table 14. Relative orientation residual of EA-unstable-3 (left-middle image).

Distortion   No. of            Along Track (pix)                 Across Track (pix)
(pix)        Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
1            61                1.771        −1.758       0.622  1.390        −1.657       0.696
2            50                1.623        −1.467       0.660  1.990        −1.782       0.771
3            61                1.776        −1.935       0.691  1.900        −1.865       0.798
Table 15. Stitching error of EA-unstable-3 (left-middle image).

Distortion   No. of            X (pix)                           Y (pix)
(pix)        Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
1            104               0.540        −0.796       0.251  0.588        −0.548       0.222
2            98                0.687        −0.852       0.310  0.935        −0.608       0.254
3            105               0.365        −0.778       0.289  0.740        −0.599       0.299
Table 16. Relative orientation residual of EA-unstable-4 (left-middle image).

IMU/GNSS    No. of            Along Track (pix)                 Across Track (pix)
System      Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
POS AV610   55                1.963        −1.567       0.878  1.873        −1.772       0.994
POS AV510   33                1.912        −1.866       1.311  1.882        −1.900       1.101
Table 17. Stitching error of EA-unstable-4 (left-middle image).

IMU/GNSS    No. of            X (pix)                           Y (pix)
System      Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
POS AV610   90                1.132        −1.529       0.709  0.978        −0.933       0.469
POS AV510   40                1.842        −2.422       1.518  1.888        −1.491       0.868
Table 18. Relative orientation residual of EA-unstable-5 (left-middle image).

Flight Condition   No. of            Along Track (pix)                 Across Track (pix)
                   Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
stable flight      639               2.819        −2.362       1.397  2.373        −2.230       1.388
unstable flight    39                1.901        −1.915       1.275  1.977        −1.975       1.188
Table 19. Stitching error of EA-unstable-5 (left-middle image).

Flight Condition   No. of            X (pix)                           Y (pix)
                   Homonymy Points   Max. Error   Min. Error   RMSE   Max. Error   Min. Error   RMSE
stable flight      72                2.176        −2.097       1.436  1.368        −1.596       0.838
unstable flight    39                1.955        −2.558       1.518  1.571        −1.455       0.761
Table 20. Relative orientation residual for different initial values.

              RMSE of Left-Middle Image (pix)   RMSE of Right-Middle Image (pix)
Test Number   Along Track   Across Track        Along Track   Across Track
1             1.131         1.205               1.094         1.269
2             1.261         1.186               0.838         0.822
3             1.293         1.373               0.993         1.129
4             1.025         1.158               1.141         0.967
5             1.140         1.008               1.183         1.116
6             1.054         1.217               0.868         0.831
7             1.131         0.958               0.928         0.813
8             1.177         1.293               1.052         1.066
9             0.896         0.845               1.061         1.294
10            0.066         0.235               1.173         1.044
average       1.017         1.048               1.033         1.035
Table 21. Stitching error for different initial values.

              RMSE of Left-Middle Image (pix)   RMSE of Right-Middle Image (pix)
Test Number   X         Y                       X         Y
1             1.565     0.999                   1.802     1.172
2             1.536     0.969                   0.681     0.598
3             1.449     0.967                   1.095     0.747
4             1.118     0.576                   1.005     0.579
5             1.250     0.677                   1.694     0.980
6             1.682     1.003                   0.770     0.495
7             1.465     0.843                   0.767     0.472
8             1.566     0.912                   1.305     0.666
9             0.901     0.408                   1.607     1.054
10            0.133     0.161                   1.408     0.678
average       1.267     0.751                   1.213     0.744