1. Introduction
Hyperspectral imaging technology has been developed over the past decades and is widely used in agriculture, vegetation, environmental monitoring, and other fields [1,2,3,4]. Airborne technology has played an important role in these fields because of its higher spatial resolution and flexibility compared to similar imagers on spaceborne platforms. The imaging modes of aerial hyperspectral imagers can be divided mainly into whiskbroom, pushbroom, and step-stare modes. Among these, pushbroom imaging is the most widely adopted: it increases pixel dwell time and thereby improves the signal-to-noise ratio (SNR). Moreover, it requires no complex mechanical scanning mechanism, so the weight and volume of the instrument are lower. However, it is difficult for this mode to balance a wide field of view (FOV) against a narrow instantaneous field of view (IFOV) [5]. To widen the swath without reducing the spatial resolution, combination technology has been developed. Researchers have built, for example, an airborne pushbroom hyperspectral imaging system with a 42° FOV by combining two imagers [6], and a wide-swath, high-resolution airborne pushbroom hyperspectral imager with a 40° FOV consisting of three imagers [7]. However, to maximize the FOV, the combined imagers usually produce images with very small overlaps. Besides, the low stability of the aviation flight platform and the pushbroom imaging mode make image stitching more difficult. Thus, high-precision stitching of images with small overlaps is of great importance for further applications, and it is necessary to investigate image stitching methods for airborne imaging systems with multiple pushbroom imagers.
Image stitching methods have been studied by many researchers, especially for spaceborne pushbroom imaging systems. The methods can generally be grouped into image-space-oriented and object-space-oriented methods [8]. The image-space-oriented method registers original images based on homonymy points extracted from overlaps. Generally, simple transformation models, such as the affine transformation, are used to register the original images [9,10,11,12,13,14]. Since this method does not provide a strict imaging model, it is difficult to perform geometric rectification on the stitched images, which limits further processing and applications [15]. In addition, an airborne platform usually produces larger internal image distortion than a space platform because of its lower stability; that is, the attitude changes drastically within a few imaging lines, so homonymy points obtained at inconsistent imaging times exhibit different scales and directions of distortion on adjacent images. It is difficult to register images with such irregular internal distortions. Overall, this method is not suitable for stitching airborne pushbroom images.
The object-space-oriented method aims to establish the relationship between the stitched image and each original sub-image through rigorous geometric models. Spaceborne multi-CCD (charge-coupled device) images have been stitched based on a virtual projection plane [15,16,17,18]. These studies assume that the stitched image is observed by a virtual CCD on the focal plane; then, based on the geometric sensor model, the relationship between the original image and the virtual image can be established through the ground surface. Tang et al. [19] presented an inner-FOV stitching method for spaceborne multi-CCD images based on sensor geometry and a projection plane in object space. Cheng et al. [20] proposed a high-accuracy image mosaicking approach for a spaceborne dual-camera system based on a big virtual camera. Jiang et al. [21] proposed a method to stitch the images of dual cameras onboard one satellite; it recovers the relative geometric relation of the dual cameras by building a relative calibration model. Researchers have also pointed out that consistency of the image positioning accuracy between adjacent single images is the foundation of geometric and stitching accuracy [18,19,20]. These accuracies are ensured by on-orbit high-accuracy geometric calibration of each sensor, which can be better than 0.3 pixels. In other words, a dedicated calibration field is required to ensure the calibration accuracy of each camera. For airborne pushbroom sensors, several calibration methods have been proposed based on large numbers of ground control points (GCPs) or calibration strips with large overlaps [22,23,24,25,26,27]. The calibration accuracy is about two pixels. However, to perform the calibration, a sufficient number of GCPs must be distributed evenly in the coverage area of each imager, or calibration strips must be flown for each imager, which means multiple complicated data acquisition and calibration processes need to be carried out for each camera separately. This increases both monetary and time costs.
For airborne pushbroom image stitching, Zhang [28] proposed a method that performs image registration after geometric rectification. The images obtained by each camera are first georeferenced with the initial parameters. Then, tie points of adjacent georectified images are extracted for registration. However, all groups of images must be georectified separately and registered successively, which limits data processing efficiency.
In this paper, we propose a novel method for stitching the images of an airborne imaging system equipped with multiple pushbroom imagers. Firstly, the relative orientation relation of adjacent imagers is established from homonymy points and a digital elevation model (DEM). Then, seamlessly stitched images can be produced based on the rigorous geometric model. The advantage of this method is that calibration of the relative orientation parameters with GCPs or calibration strips is no longer needed. Validation of the proposed method and error analysis were performed through experiments with simulated data.
2. Materials and Methods
2.1. Materials
In this paper, the Airborne Wide Swath and High-Resolution Hyperspectral Imaging System (AWSHRHIS), currently in development, is composed of three pushbroom hyperspectral imagers in the visible and near-infrared band. The designed parameters of AWSHRHIS are shown in Table 1. The three subsystems cover the left, middle, and right FOVs, respectively, and the total FOV is designed to be 30 degrees with a 0.11 mrad IFOV. The hyperspectral imagers are mounted on a three-axis stabilized platform, so the influence of various disturbances on the imaging sensors can be mostly isolated.
Navigation sensors are used to record the navigation parameters of the platform, including the position and attitude. The quality of the geometric rectification using the navigation parameters depends on the accuracy of these sensors. The inertial measurement unit (IMU) is mounted together with the imaging system inside the stabilized platform to measure the attitude. The Global Navigation Satellite System (GNSS) antenna is mounted outside the aircraft platform to measure the position. A synchronization unit drives the three imagers to expose simultaneously. For each image line collected, the GPS timestamp is written in the header file of the raw image.
The principle of imaging is shown in Figure 1. The middle imager points vertically downward, covering the ground segment A_CB_C. The left and right imagers are tilted inwards, with ground FOVs of A_LB_L and A_RB_R, respectively. The total FOV of AWSHRHIS is A_RB_L, in which A_LB_C is the overlap between the middle and right images, and A_CB_R is the overlap between the middle and left images.
AWSHRHIS uses the pushbroom scanning mode. With the movement of the flight platform, three image strips are obtained, with small overlaps between them. Due to installation alignment errors, the FOVs of the subsystems exposed at the same time are misaligned on the ground, as shown in Figure 2. The boresight misalignments induce inconsistent imaging times for the same object in the image overlaps. It is hard to eliminate the influence of platform vibration, especially for an airborne platform with low stability, which usually induces obvious image distortion. The inconsistent imaging times of homonymy points cause different scales and directions of distortion in the images. Therefore, eliminating image distortion based on the geometric imaging model is an important step toward seamless image stitching.
2.2. Rigorous Imaging Model
Each scan line of a pushbroom image is an independent image with its own position and orientation due to platform motion. Therefore, image georectification has to be performed line by line for each image. The IMU/GNSS measurements must be converted to the exterior orientation elements associated with each scan line. After a series of coordinate transformations, the image points of each line can be projected from the image coordinate system to the mapping coordinate system. The coordinate systems involved are: (i) the imaging space coordinate system (c); (ii) the sensor coordinate system (s); (iii) the IMU coordinate system (b); (iv) the navigation coordinate system (g); (v) the Earth-centered fixed coordinate system (e), for which the WGS84 geodetic system is used in this paper; and (vi) the mapping coordinate system (m) [29].
The rigorous geometric imaging model can be represented by the following formula:

$$
\begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix}_m
= \mathbf{R}_e^m \left[ \mathbf{r}_{GPS,i}^{\,e}
+ \mathbf{R}_g^e \mathbf{R}_b^g \left( \mathbf{l}_s + \mathbf{R}_s^b \left( \mathbf{l}_c + \lambda \mathbf{R}_c^s \mathbf{u}_i \right) \right) \right]
- \mathbf{r}_0^m
\tag{1}
$$

where

$$
\mathbf{u}_i = \begin{bmatrix} x & 0 & -f \end{bmatrix}^T
\tag{2}
$$

where $[X_i, Y_i, Z_i]_m^T$ is the geographical coordinates of the image point $\mathbf{u}_i$ at imaging time $i$ in the mapping coordinate system. For pushbroom images, when the X direction of the imaging space coordinate system is defined as the sampling direction, the $y$ coordinate is always equal to 0. $\lambda$ is a scale factor. $\mathbf{R}_c^s$ is the rotation matrix from the imaging space coordinate system of each imager to the sensor coordinate system. $\mathbf{R}_s^b$ is the rotation matrix from the sensor coordinate system to the IMU coordinate system. $\mathbf{R}_b^g$ is the rotation matrix from the IMU coordinate system to the navigation coordinate system. $\mathbf{R}_g^e$ is the rotation matrix from the navigation coordinate system to the Earth-centered fixed coordinate system. $\mathbf{R}_e^m$ is the rotation matrix from the Earth-centered fixed coordinate system to the mapping coordinate system. $\mathbf{l}_c$ is the lever arm between the imager coordinate system and the sensor coordinate system. $\mathbf{l}_s$ is the lever arm between the sensor coordinate system and the IMU coordinate system. $\mathbf{r}_{GPS,i}^{\,e}$ is the coordinate of the platform position measured by GPS at imaging time $i$. $\mathbf{r}_0^m$ is the origin of the mapping coordinate system.
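The chain of transformations in Equation (1) can be sketched in code. The following is a minimal illustration only, not the AWSHRHIS implementation: all rotation matrices, lever arms, and positions are caller-supplied placeholders, and pure-Python 3x3 matrix helpers are used to keep the sketch self-contained.

```python
import math

def rot_x(a):  # rotation about the X-axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(A, B):  # 3x3 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def matvec(A, v):  # 3x3 matrix times 3-vector
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def add(u, v):
    return [a + b for a, b in zip(u, v)]

def georeference_pixel(x, f, lam, R_c_s, R_s_b, R_b_g, R_g_e, R_e_m,
                       l_c, l_s, r_gps_e, r0_m):
    """Chain of Equation (1): image point (x, 0, -f) -> mapping frame."""
    u = [lam * x, 0.0, -lam * f]                 # scaled image-space ray
    v = add(matvec(R_c_s, u), l_c)               # imaging space -> sensor, plus lever arm
    v = add(matvec(R_s_b, v), l_s)               # sensor -> IMU body, plus lever arm
    v = matvec(matmul(R_g_e, R_b_g), v)          # body -> navigation -> Earth-centered fixed
    v = add(v, r_gps_e)                          # add GPS position of the platform
    v = matvec(R_e_m, v)                         # Earth-centered fixed -> mapping frame
    return [a - b for a, b in zip(v, r0_m)]      # shift by the mapping-frame origin
```

With identity rotations, zero lever arms, and a platform 1000 m above the mapping origin, the pixel ray simply inherits the platform position plus the scaled image-space offset, which makes the chain easy to sanity-check.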
2.3. Relative Orientation Model Based on DEM
After installation of the imaging system, the relative orientation relationships between adjacent imagers remain fixed during flight. Ideally, the roll angle between adjacent imagers should equal the designed angle, and the pitch and heading axes of the three imagers should be parallel. However, due to assembly errors, the boresights are not strictly aligned as designed.
Therefore, we propose a relative orientation method based on homonymy points to calculate the misalignment angles between adjacent imagers. The baseline between adjacent imagers is very small (below 20 cm in AWSHRHIS). Thus, the base-height ratio is much smaller than expected in conventional mapping photography. The small parallactic angle reduces the z-dimension measuring accuracy in the stereo model; i.e., the coordinates of ground points calculated by forward intersection are unreliable. To solve this problem, DEM data are introduced to improve the accuracy.
The middle imager is taken as the reference imager, and the adjacent imager as the target imager. The sensor coordinate system is thus parallel to the image space coordinate system of the middle imager. The unknowns are the three orientation parameters (ω, φ, κ) between the image space coordinate systems of the target imager and the reference imager.
$p_r$ and $p_t$ are a pair of homonymy points (as shown in Figure 3), which are obtained by the reference imager and the target imager at times $t_r$ and $t_t$, respectively. The corresponding coordinates $\mathbf{u}_r$ and $\mathbf{u}_t$ in the image space coordinate systems are defined by

$$\mathbf{u}_r = \begin{bmatrix} x_r & 0 & -f_r \end{bmatrix}^T \tag{3}$$

$$\mathbf{u}_t = \begin{bmatrix} x_t & 0 & -f_t \end{bmatrix}^T \tag{4}$$

where $x_r$ and $x_t$ are the x coordinates in the reference image and the target image, and $f_r$ and $f_t$ are the focal lengths of the reference imager and the target imager.
The point $p_r$ is transformed from the image space coordinate system to the object space coordinate system by Equation (5):

$$\bar{\mathbf{u}}_r = \mathbf{R}_b^g(t_r)\,\mathbf{R}_s^b\,\mathbf{R}_c^s\,\mathbf{u}_r \tag{5}$$

$M$ in Figure 3 is the ground point corresponding to the image point $p_r$. Its coordinate $[X_M, Y_M, Z_M]^T$ in object space is calculated by Equation (6), i.e., the intersection between the look direction and the topographic surface:

$$
\begin{bmatrix} X_M \\ Y_M \end{bmatrix}
= \begin{bmatrix} X_S \\ Y_S \end{bmatrix}
+ \left(Z_M - Z_S\right)
\begin{bmatrix} \bar{u}_{r,x} / \bar{u}_{r,z} \\ \bar{u}_{r,y} / \bar{u}_{r,z} \end{bmatrix}
\tag{6}
$$

where $Z_M - Z_S$ is the height difference between the perspective center $S$ and the ground point $M$. The elevation $Z_M$ of the ground point is calculated using bilinear interpolation based on the DEM; iteration is required to calculate the point coordinate.
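The iterative intersection of Equation (6) can be sketched as follows: project the ray to a provisional terrain height, look up the DEM at the resulting footprint, and repeat until the elevation converges. The `dem_height` callback is a hypothetical stand-in for bilinear interpolation on a real DEM grid.

```python
# Minimal sketch of the iterative ray-DEM intersection, under the assumption
# of a Cartesian object space and a downward-pointing unit look vector.

def intersect_ray_with_dem(center, direction, dem_height,
                           z0=1500.0, tol=0.01, max_iter=50):
    """center: perspective-center (X, Y, Z); direction: look vector with a
    negative Z component; dem_height(x, y) -> terrain elevation in meters."""
    z = z0                                          # start from a mean terrain height
    for _ in range(max_iter):
        t = (z - center[2]) / direction[2]          # ray parameter reaching height z
        x = center[0] + t * direction[0]
        y = center[1] + t * direction[1]
        z_new = dem_height(x, y)                    # DEM lookup at the footprint
        if abs(z_new - z) < tol:                    # elevation converged
            return x, y, z_new
        z = z_new                                   # refine height and re-project
    return x, y, z

# Example: flat terrain at 1500 m, nadir ray from 10,000 m altitude.
flat = lambda x, y: 1500.0
print(intersect_ray_with_dem((500.0, 300.0, 10000.0), (0.0, 0.0, -1.0), flat))
# -> (500.0, 300.0, 1500.0)
```

For rugged terrain the loop typically converges in a handful of iterations, since each DEM lookup moves the footprint only slightly along the ray.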
With the rotation matrix $\mathbf{R}_b^g(t_t)$ and offset $\mathbf{r}_{GPS}(t_t)$ acquired from the IMU/GNSS data, the coordinates $\bar{\mathbf{u}}_t$ of $M$ in the image space coordinate system at time $t_t$ are determined such that

$$
\bar{\mathbf{u}}_t = \left(\mathbf{R}_b^g(t_t)\,\mathbf{R}_s^b\right)^T
\left( \begin{bmatrix} X_M \\ Y_M \\ Z_M \end{bmatrix} - \mathbf{r}_{GPS}(t_t) \right)
\tag{7}
$$
With the relative orientation matrix $\mathbf{R}$ and offset $\mathbf{l}$ from the target camera to the reference camera, the coordinates $\mathbf{u}_t''$ in the image space coordinate system of the target camera are defined by

$$\mathbf{u}_t'' = \mathbf{R}^T \left( \bar{\mathbf{u}}_t - \mathbf{l} \right) \tag{8}$$

where $\mathbf{R}$ is constructed by three sequential rotations: $\omega$ about the X-axis, $\varphi$ about the once-rotated Y-axis, and $\kappa$ about the twice-rotated Z-axis. $\mathbf{l}$ is measured before the flight experiment.
Then, the coordinate $(x_t', y_t')$ of the image point $p_t'$ corresponding to the ground point $M$ is determined by

$$x_t' = -f_t \, \frac{u_{t,x}''}{u_{t,z}''}, \qquad y_t' = -f_t \, \frac{u_{t,y}''}{u_{t,z}''} \tag{9, 10}$$

Theoretically, the points $p_t$ and $p_t'$ should coincide, but a deviation usually exists between the calculated image position and the actual position due to the projective compensation of systematic errors, which is determined by the relative orientation parameters. A least squares solution can be performed to minimize the distance between $p_t$ and $p_t'$ by Equation (11) to solve the relative orientation elements:

$$\min_{\omega,\varphi,\kappa} \sum \left[ \left(x_t' - x_t\right)^2 + \left(y_t' - y_t\right)^2 \right] \tag{11}$$
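The least squares solution for the misalignment angles can be sketched with a generic Gauss-Newton iteration using a finite-difference Jacobian. The residual function is assumed to wrap the projection chain of the relative orientation model; in the test below a synthetic linear residual stands in for it so the sketch stays self-contained, and the helper names are illustrative only.

```python
# Gauss-Newton sketch for minimizing the reprojection distances of the
# relative orientation: x holds the unknown angles (omega, phi, kappa).

def solve_dense(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def gauss_newton(residual_fn, x0, iters=20, eps=1e-6):
    """residual_fn(x) -> list of reprojection residuals; x0 -> initial angles."""
    x = list(x0)
    n = len(x)
    for _ in range(iters):
        r = residual_fn(x)
        # Numeric Jacobian by forward differences, stored column-wise.
        J = []
        for j in range(n):
            xp = list(x)
            xp[j] += eps
            rp = residual_fn(xp)
            J.append([(a - b) / eps for a, b in zip(rp, r)])
        # Normal equations: (J^T J) dx = -J^T r.
        A = [[sum(J[i][k] * J[j][k] for k in range(len(r))) for j in range(n)]
             for i in range(n)]
        b = [-sum(J[i][k] * r[k] for k in range(len(r))) for i in range(n)]
        dx = solve_dense(A, b)
        x = [xi + di for xi, di in zip(x, dx)]
        if max(abs(d) for d in dx) < 1e-10:   # converged
            break
    return x
```

Because the misalignment angles are small, the problem is nearly linear and the iteration typically converges in very few steps; a weighted variant (as used in the paper) would scale each residual by its matching confidence.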
2.4. Workflow
As illustrated in Figure 4, the stitching method based on relative orientation and the rigorous geometric model is performed via the following steps. Image radiometric processing and IMU/GNSS data interpolation were performed first. Then, the relative orientation elements between adjacent imagers were calculated. A rigorous imaging model was established with the relative orientation elements, and georectification was performed to reduce the image distortion, which is mainly caused by unstable flight conditions. Finally, image resampling and stitching were performed for each band of the image.
2.4.1. Step 1: Preprocessing
Images and supporting data were prepared for image stitching in this step. Firstly, the position and attitude data measured by IMU/GNSS were interpolated to the GPS timestamps of the scan lines. Secondly, the interpolated data were converted to the exterior orientation elements of each scan line in the original images. Meanwhile, radiometric processing of the hyperspectral images was performed using pixel-based dark current calibration and relative radiometric correction.
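The timestamp interpolation in this step can be sketched as a linear interpolation of the navigation records to each scan line's GPS time. The record layout (time followed by attitude and position values) is an assumption for illustration; proper spherical interpolation of attitude angles is omitted for brevity.

```python
from bisect import bisect_right

# Sketch of Step-1 interpolation: navigation records sampled at the IMU/GNSS
# rate are linearly interpolated to an arbitrary GPS timestamp t.

def interpolate_nav(records, t):
    """records: list of (t, *values) sorted by time; returns values at time t,
    clamped to the first/last record outside the sampled interval."""
    times = [r[0] for r in records]
    i = bisect_right(times, t)
    if i == 0:                      # before the first record
        return records[0][1:]
    if i == len(records):           # after the last record
        return records[-1][1:]
    (t0, *v0), (t1, *v1) = records[i - 1], records[i]
    w = (t - t0) / (t1 - t0)        # linear weight in [0, 1]
    return tuple(a + w * (b - a) for a, b in zip(v0, v1))
```

In practice the navigation rate (e.g., 200 Hz for a tactical-grade IMU) is much higher than the attitude dynamics, so linear interpolation to line times is usually adequate.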
2.4.2. Step 2: Relative Orientation
The scale-invariant feature transform (SIFT) algorithm was implemented to extract and match homonymy points between the target image and the reference image. Additionally, the random sample consensus (RANSAC) algorithm was used to remove outliers. Then, the relative orientation model described in Section 2.3 was applied. Taking the minimum projection difference between the homonymy point $p_t$ and the projected point $p_t'$ as the optimization criterion, the error equation was established, and the relative orientation elements were calculated using the weighted least squares method.
2.4.3. Step 3: Image Stitching
After relative orientation, the rigorous geometric model was established for each imager. Then image stitching was performed together with the image georectification.
Before image stitching and georectification, the coverage of the stitched image was determined. Firstly, the points at the two ends of the first scan line and the last scan line (four corner points, distinguished by imager number) were selected from the original images obtained by each imager. Then, their object space coordinates were calculated by Equation (1) with the height set to the average elevation. Subsequently, the coordinates of the edge points in the original images were determined, and the coverage of the stitched image was defined as the minimum rectangular region enveloping all these edge points.
Then, the relationship between pixels of the stitched image and those of the original images was established. From the original image space coordinates $(x, y)$ and the object space coordinates $(X, Y)$, the affine transformation between the original image and the stitched image was established as

$$x = a_0 + a_1 X + a_2 Y, \qquad y = b_0 + b_1 X + b_2 Y$$
According to the ground coordinates in the stitched image, the image coordinates in each original image were calculated using Equation (10). By judging whether the point is located within an image, the image number was determined, and the initial image coordinates of each pixel in the original image were calculated. The best scan line of the ground point was then searched for iteratively. Finally, the image coordinates were calculated by backward projection using the IMU/GNSS data at the corresponding scan line and the object space coordinates, where the elevation can be interpolated from the reference DEM.
For each grid unit in the stitched image, the corresponding pixel in the original image was calculated according to the previous step. Thus, the backward projection and resampling processes were performed to generate the stitched image.
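The best-scanline search can be sketched as a Newton-style iteration on the along-track image coordinate: the chosen line is the one whose view plane contains the ground point, i.e., where the back-projected along-track coordinate is near zero. The projection model below is a deliberately simplified nadir pushbroom (constant velocity, level attitude), standing in for the full rigorous model with per-line exterior orientation.

```python
# Hypothetical simplified geometry: the platform flies along Y at a constant
# ground speed per line; `along_track_y` is the along-track image coordinate
# of a ground point seen from a given scan line.

def along_track_y(line, ground, line0_pos, velocity, focal, height):
    sensor_y = line0_pos + line * velocity          # platform Y at this line
    return focal * (ground[1] - sensor_y) / height  # perspective projection

def best_scanline(ground, line0_pos, velocity, focal, height, n_lines, tol=1e-6):
    """Iterate until the along-track residual of `ground` vanishes."""
    y = 0.0                                         # initial line guess
    for _ in range(50):
        dy = along_track_y(y, ground, line0_pos, velocity, focal, height)
        step = dy * height / (focal * velocity)     # Newton step for this model
        y = min(max(y + step, 0.0), n_lines - 1)    # clamp to the image extent
        if abs(step) < tol:
            break
    return y
```

With the real rigorous model, the update direction comes from the per-line exterior orientation rather than a constant velocity, but the structure of the search (evaluate residual, step, re-evaluate) is the same.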
3. Results
It is vital to analyze and evaluate the feasibility and effectiveness of the model. In this paper, experiments were conducted to verify the proposed method and assess the accuracy of the stitched image based on simulated data. The effect of error factors was analyzed so as to provide guidance for system design and flight tests. In this section, we first introduce the process and the major imaging factors of the simulation. Then, the stitching results under different imaging conditions are shown, and a quantitative analysis of the sensitivity to different error factors is performed.
3.1. Data Sources
Since the geometric rectification and stitching processes are the same for the different bands of hyperspectral data, we selected one band for simulation and analysis. A digital orthophoto map (DOM) and a DEM, which represent the true conditions of the test area, were used to provide the reflectance and three-dimensional coordinate information of the scene during imaging. The simulation was then performed along the flight trajectory. A high-resolution SuperView-1 panchromatic image of Dangchang, Gansu Province was used as the base DOM, as shown in Figure 5a. SuperView-1 is a high-resolution commercial remote sensing satellite launched by China in 2016. The ground sample distance (GSD) of the image is 0.5 m. Figure 5b shows the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEM used in the simulation; its ground resolution is 1″. The average height of the test area is about 1500 m. Real trajectory data obtained in an actual flight test were used to simulate the position of the perspective center in the WGS84 coordinate system and its attitude in the IMU body coordinate system.
Sensor imaging simulation includes two aspects: simulation of the imaging position and attitude, and simulation of the image space coordinate projection. The base trajectory data were interpolated according to the speed-height ratio of AWSHRHIS, and the positions in the trajectory were then shifted to cover the range of the base DOM. The flight altitude was set to 10,000 m, so the GSD of the image was about 1 m. Image simulation is equivalent to single-ray backprojection. Since the three-dimensional information of the terrain is lost when it is projected into a two-dimensional image, an elevation must be fixed before the ray can be backprojected into the world. For each pixel of the simulated image, the image ray in the object space system was calculated based on the rigorous imaging model, according to the position and attitude at the corresponding imaging time. Then, the iterative photogrammetric method was used to determine where the ray intersects the terrain surface. Once the elevation was found, the ground position could be extracted. According to the object space coordinates, the digital number (DN) value in the DOM was interpolated and assigned to the simulated image. The simulated images were generated by traversing all the image pixels. The parameters of each imager are the same as the designed parameters shown in Table 1.
To validate the accuracy of the stitched images, two types of experiments were designed. Quantitative analysis of the sensitivity to different error factors is presented in Section 3.2, and an analysis of the adaptability to different initial offsets between adjacent imagers is presented in Section 3.3.
According to the relative orientation and stitching model, the main factors that may influence the stitching accuracy include: (1) the matching accuracy of the homonymy points; (2) the elevation error of the auxiliary DEM; and (3) the accuracy of the interior and exterior orientation elements. Unlike a spaceborne platform, an airborne platform has poor stability due to vibrations of the platform itself, turbulence, and wind field conditions. It is difficult to isolate high-frequency vibration completely even when a three-axis stabilized platform is used. In the workflow described above, the SIFT and RANSAC algorithms were used to ensure the matching accuracy of the homonymy points. However, because of the different imaging times of adjacent images, the unstable airborne platform introduces differences in appearance between homonymy points, which decreases the matching accuracy.
It is meaningful to distinguish the errors caused by the model itself from those caused by environmental disturbance, and to know how robust the model is. Therefore, two flight conditions were considered in this paper. The first is the stable flight condition: to simulate it, we simply set the roll and pitch angles of the attitude in the trajectory to fixed values. The second simulates the actual unstable flight condition, especially with high-frequency vibration; in this condition, a group of actually recorded attitudes was used to simulate the original images.
The relative orientation parameters between adjacent imagers were set randomly, as shown in Table 2. With the base DOM and DEM data, five groups of 2048-line framing images were simulated. Figure 6 shows a group of images from the right, middle, and left imagers in the stable flight condition. The overlap area covers about 140 pixels, i.e., 7% of the total pixels in the left and right images.
To analyze the robustness of the method under the actual unstable flight condition, a set of experiments with different initial values of the relative orientation parameters was designed. The basis data for the simulation, such as the DOM, DEM, and trajectory data, are the same as in the experiments above. Imager misalignment angles between −0.15° and 0.15° were then added randomly as initial values for the image simulation. A comprehensive analysis was performed later to evaluate the adaptability of the proposed method.
3.2. Analysis Corresponding to Different Error Sources
Simulation data under stable and unstable flight conditions were used for quantitative analysis. In each condition, we analyzed the influence of the individual error sources (DEM, interior orientation, and IMU/GNSS) and then performed a comprehensive analysis of all of them together. In total, 10 groups of experiments were designed in the first type of experiment, as shown in Table 3. Relative orientation was carried out considering each error source, and the resulting relative orientation parameters were used to stitch the images. To further validate the accuracy of the proposed method, the quality of the stitched image was assessed by two indices: (1) the relative orientation residual, and (2) the stitching accuracy of the overlapping area.
3.2.1. Error Analysis in Stable Flight Condition
The analysis in the stable flight condition comprised five tests: image processing (1) in the ideal case without data error, (2) with DEM elevation error, (3) with interior orientation error, (4) with IMU/GNSS measurement error, and (5) with all the errors above.
- (1)
Ideal Case
In order to verify the accuracy of the model, relative orientation and image stitching were processed without error. Figure 7 shows the stitched image generated from the images in Figure 6.
The image projection residual calculated by Equation (9) was used to evaluate the precision of the relative orientation. Figure 8a shows the image projection residuals of the left-middle images after relative orientation. A total of 639 pairs of homonymy points were matched. The residual error is less than 0.8 pixels. The RMSE (root mean square error) of the residuals across track and along track are 0.103 pixels and 0.114 pixels, respectively. Figure 8b shows the image projection residuals of the right-middle images. A total of 728 pairs of homonymy points were matched. The RMSE of the residuals across track and along track are 0.097 pixels and 0.101 pixels, respectively.
It is difficult to quantify the stitching accuracy from the seamline of the stitched image alone. In this paper, the mosaic accuracy was evaluated by the consistency of homonymy points in the overlap of adjacent images. This consistency is calculated as follows. First, adopting the stitched image coverage as the frame, all three original images were projected onto this frame based on the rigorous geometric model, generating three georectified images in the same object coordinate system. Ideally, homonymy points in adjacent georectified images should have identical image coordinates. Then, a number of homonymy point pairs in adjacent georectified images were extracted automatically using the SIFT algorithm, and the biases of the locations of the point pairs were used to evaluate the stitching accuracy.
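The consistency metric described above reduces to the RMSE of the coordinate differences of the matched point pairs; a minimal sketch, assuming the pairs are already matched between the two georectified images:

```python
import math

# Stitching-accuracy metric: homonymy point pairs from two georectified images
# should coincide, so the RMSE of their coordinate differences in X and Y
# quantifies the stitching error (as reported per direction in the tables).

def stitching_rmse(pairs):
    """pairs: list of ((x1, y1), (x2, y2)) matched point coordinates."""
    dx = [a[0] - b[0] for a, b in pairs]
    dy = [a[1] - b[1] for a, b in pairs]
    rmse = lambda d: math.sqrt(sum(v * v for v in d) / len(d))
    return rmse(dx), rmse(dy)
```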
Table 4 shows the stitching errors of the left-middle and right-middle images. The stitching error of the left-middle images is 0.049 pixels and 0.038 pixels in the X and Y directions, and that of the right-middle images is 0.045 pixels and 0.033 pixels; both are less than 0.1 pixels. With the ideal stable flight model, the RMSE of the relative orientation residual and the stitching error are both less than 0.1 pixels. Since the rigorous imaging model is used in this process, the small residual error is probably caused by homonymy point matching error.
- (2)
Effect of DEM Elevation Error
In order to avoid the low accuracy of forward intersection caused by the small parallactic angle, a reference DEM was used in the relative orientation; it was also used for georectification. Therefore, the elevation accuracy of the reference DEM is one of the factors influencing image stitching. In this test, random elevation errors following a normal distribution were added to the DEM used in image stitching, and the relative orientation residual and stitching error were analyzed, taking the left-middle images as an example. The results are shown in Table 5 and Table 6.
Generally speaking, the effect of elevation accuracy is insignificant and mainly affects the accuracy along track. Even if the elevation error increases to 100 m, the stitching RMSE is no more than 0.2 pixels.
- (3)
Effect of Interior Orientation Error
Interior orientation parameters directly affect the metric accuracy. These include the location of the principal point, the focal length, and the lens distortion. The interior orientation error increases with the radial distance; i.e., the distortion at the image edge is more significant. The proposed method uses the small overlap of adjacent images, which is located at the image edge where the distortion is larger. Therefore, it is necessary to analyze the effect of interior orientation on image stitching. At present, laboratory geometric calibration methods for linear array cameras are mature; the precision is about one pixel using precise angle measurement [30]. In this test, 1~3 pixels of distortion were introduced into the image stitching. Table 7 and Table 8 list the relative orientation residuals and stitching errors.
The results show that interior orientation error mainly affects the accuracy across track. With an interior orientation error of three pixels at the edge, the stitching error is less than 0.2 pixels.
- (4)
Effect of IMU/GNSS Measurement Accuracy
The measurement accuracy of the IMU/GNSS system, which measures the position and attitude of the platform, directly affects the accuracy of the exterior orientation elements of pushbroom images. Typical IMU/GNSS systems are the POS AV 510 and 610 from Applanix. The post-processed position accuracy of the POS AV 510 is 0.3 m, its pitch and roll accuracy is 0.005°, and its yaw accuracy is 0.008°; for the POS AV 610, the corresponding values are 0.3 m, 0.0025°, and 0.005°. Accordingly, normally distributed errors were added to the IMU/GNSS data used in stitching. Taking the left-middle images as an example, the relative orientation residual and the stitching error were analyzed; the results are shown in Table 9 and Table 10.
The results show that the measurement accuracy of the IMU/GNSS has an obvious influence on the stitching results. The stitching RMSE is about one pixel when using the POS AV 510 system.
- (5)
Comprehensive Analysis
In this experiment, typical errors were considered together in the processing for accuracy analysis. The DEM elevation error was set to 20 m, since the nominal accuracy of free global DEM data, such as SRTM, ASTER, and AW3D, is reported as 20 m [31,32]. The interior orientation error was set to 1 pixel, which is the current laboratory geometric calibration accuracy. The position and attitude measurement accuracies were set according to the POS AV 510.
Based on the method introduced above, the relative orientation and image stitching were performed. Figure 9 shows the experimental results for the simulated data. Areas 1 and 2 are the left-middle and right-middle overlapping regions, respectively. The stitching effect was checked by visual inspection: observing the continuity of ground features in the stitched regions shows that the offset of the original overlapping areas has been eliminated.
Although the offset at the seamline has been well corrected, the IMU/GNSS measurement error, which directly affects the georectification accuracy, reduces the final stitching accuracy. The relative orientation residual is 1.397 pixels along track and 1.388 pixels across track. The stitching error is 1.436 pixels in the X direction and 0.838 pixels in the Y direction. Because of the pitch angle between the adjacent imagers, there is an offset of about 14 lines in the flight direction, which can be effectively detected by visual evaluation. Figure 10 shows the overlapping area of adjacent images after georectification and the stitched image at the same location. Different distortions of the same ground features are evident in the adjacent images, because the directions of the corresponding IMU/GNSS measurement errors are inconsistent. In our test, the georectification error caused by the POS AV 510 is more than one pixel, which directly reduces the stitching accuracy.
3.2.2. Error Analysis in Unstable Flight Condition
In practice, the instability of the platform caused by the vibration of the platform itself and atmospheric turbulence should also be taken into account. A group of actually recorded IMU data was used for the simulation. Experiments EA-unstable 1~5, as shown in Table 3, were performed and analyzed.
- (1)
Without Considering Any Error Factor
Using the proposed method, the relative orientation and image stitching were performed on the unstable simulated images without data error.
Figure 11a shows the image projection residuals of the left-middle images after relative orientation. Only 59 pairs of homonymy points are matched, and the residuals are less than 2 pixels. The RMSEs across track and along track are 0.613 pixels and 0.688 pixels, respectively. Fifty pairs of homonymy points are matched from the right-middle images, as shown in
Figure 11b. The RMSEs across track and along track are 0.626 pixels and 0.748 pixels, respectively.
Table 11 shows the stitching RMSEs of the left-middle and right-middle images. The errors of the left-middle images in the x- and y-directions are 0.285 pixels and 0.126 pixels, respectively, and those of the right-middle images are 0.135 pixels and 0.102 pixels, respectively.
Compared with the stable flight state, the original images are seriously distorted due to the attitude changes of the platform, as depicted in
Figure 12. In this case, the matching accuracy of the homonymy points is reduced, which in turn reduces the relative orientation accuracy. After georectification, most of the geometric distortion is corrected, and the stitching accuracy is still maintained at a good level.
- (2)
Influence of DEM Elevation Error
With the original images in the unstable flight condition, random elevation errors following a normal distribution were added to the DEM used in image stitching.
Table 12 and
Table 13 show the relative orientation residuals and stitching errors of the left-middle images.
The results show that the DEM elevation error has little effect on the image stitching. Because the effect of DEM error is far smaller than that of point matching, even an elevation error of up to 100 m introduces no obvious stitching error.
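The perturbation used in this experiment can be sketched as adding zero-mean Gaussian noise to the DEM grid; the grid size, seed, and variable names below are illustrative assumptions, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
dem = np.zeros((100, 100))    # placeholder flat DEM grid (metres)
sigma = 20.0                  # elevation error standard deviation (m)

# Add normally distributed elevation error, as in the experiment setup
dem_noisy = dem + rng.normal(0.0, sigma, size=dem.shape)
print(dem_noisy.mean(), dem_noisy.std())
```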
- (3)
Influence of Interior Orientation Error
In this experiment, the distortion of experiment EA-stable-3 was added to the simulated images. The results are shown in
Table 14 and
Table 15. The effect of interior orientation error is smaller than that of point matching. Compared with the error-free case, the stitching accuracy is slightly reduced.
- (4)
Influence of IMU/GNSS Measurement Accuracy
IMU/GNSS measurement errors were considered in this experiment. The relative orientation residuals and stitching errors are shown in
Table 16 and
Table 17. The decrease in stitching accuracy is evidently caused by IMU/GNSS measurement error.
- (5)
Comprehensive Analysis
The same typical error values as in EA-stable-5 were applied when processing the unstable flight simulation data.
The images before and after stitching are shown in
Figure 13.
Figure 13a is the overlapping area of the left and middle original images, and
Figure 13b shows the corresponding area of the stitched image.
Figure 13c,d show the original and stitched images at the overlapping area between the right and middle images. After georectification and stitching, most of the distortion is rectified, and the offset is eliminated.
Table 18 and
Table 19 show the relative orientation residuals and stitching errors in unstable flight conditions considering the elevation error, interior orientation error, and IMU/GNSS measurement error, which is close to real circumstances. The results show that the stitching error is at the same level as that on a stable platform, about 1.5 pixels.
3.3. Accuracy Evaluation
With different initial orientation values, a total of 10 groups of simulation data were used in this experiment. The relative orientation residuals and stitching accuracies of the left-middle and right-middle images are listed in
Table 20 and
Table 21. It can be seen that the RMSEs of the relative orientation residuals and the stitching errors are less than two pixels.
4. Discussion
Assembling multiple pushbroom imagers in the sampling direction is an effective way to expand the FOV and improve imaging efficiency. To solve the problem of high-precision image stitching for a wide FOV with weak overlap constraints and short baselines between adjacent pushbroom imagers, a DEM-assisted relative orientation method and an object-space stitching method based on the overlapping area were proposed. Based on the homonymy points of adjacent images, the relative relationship between adjacent imagers was retrieved with the assistance of DEM data, so as to realize geometric rectification and image stitching. According to the analysis of the experimental results, the proposed method can effectively stitch the sub-images with an accuracy better than 2 pixels, even in poor flight conditions.
We compared our method with the object-space-oriented calibration method and the rectified-images registration method. The experiment was performed using the simulation data of EA-unstable-5. Without the support of GCPs, our method achieves a left-middle stitching accuracy of 1.518 pixels in the X direction and 0.761 pixels in the Y direction.
(1) Comparison with the object-space-oriented calibration method: this method is commonly used in spaceborne pushbroom image stitching, and we applied its idea to airborne pushbroom image stitching. Firstly, eight GCPs evenly distributed in the FOV of the left and right images were selected. Then, the boresight misalignment of each imager was calibrated with the support of the GCPs. Finally, direct georeferencing (DG) and image stitching were performed, and the stitching accuracy of the overlapping area was used to validate the results. Taking the left-middle image as an example, the stitching accuracy is 1.70 pixels in the X direction and 1.12 pixels in the Y direction. However, this method needs at least four GCPs evenly distributed in the FOV, or more than three strips with large overlaps, for each imager calibration. That means three times the number of GCPs or calibration strips would be needed for AMHIS data processing. The method proposed in this paper does not need GCPs or additional strips, which saves the corresponding workload and cost, and it adapts well to areas where GCPs are difficult to arrange evenly.
(2) Comparison with the rectified-images registration method: this method is suitable when GCPs cannot be arranged or in-flight calibration cannot be carried out. Firstly, the images obtained by each imager were rectified using the designed parameters; then, the rectified images were registered by homonymy points, and the stitching accuracy was used to validate the results. The stitching accuracy of the left-middle image is 1.45 pixels in the X direction and 0.97 pixels in the Y direction. However, in this method, the images of each sub-imager need DG separately, and then image matching and stitching are carried out one by one. In our method, image georectification and stitching can be performed together, without a matching process, after the relative orientation. Overall, our method improves data processing efficiency significantly while reaching a stitching accuracy at the same level as the traditional methods.
Although our method is based on the rigorous imaging model, the quality of image stitching is limited by the elevation accuracy of the auxiliary DEM, the interior orientation error of the imagers, the position and attitude measurement accuracy, and the platform stability.
The elevation difference between the actual elevation and the interpolated elevation would introduce a stitching error along the flight direction [16,17,18,19,20,21]. As shown in
Figure 14, suppose A is a ground point located in the overlap of camera 1 and camera 2, and let α1 and α2 denote the directional angles of the corresponding homonymy image points in the flight direction. The projection distance D of the stitching error caused by the elevation error Δh can be expressed as

D = Δh · |tan α1 − tan α2|
The larger the elevation error is, the greater its influence will be. For freely available global DEM data with an elevation accuracy better than 20 m, when the intersection angle of the homonymy points in the flight direction reaches 0.5°, D reaches 0.17 m, which is smaller than the ground pixel size of most airborne spectral imagers.
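The 0.17 m figure can be checked numerically: with a 20 m elevation error and a 0.5° intersection angle between the homonymy-point directions in the flight direction, the projected error is the elevation error scaled by the difference of the tangents of the directional angles. The helper function below is an illustrative sketch, not from the paper:

```python
import math

def stitch_error_from_elevation(dh_m, alpha1_deg, alpha2_deg):
    """Projected stitching error D (metres) caused by an elevation error
    dh_m, for homonymy-point directional angles alpha1 and alpha2
    (degrees) in the flight direction. Illustrative helper."""
    return abs(dh_m * (math.tan(math.radians(alpha1_deg))
                       - math.tan(math.radians(alpha2_deg))))

# 20 m DEM error, 0.5 deg intersection angle (alpha1 - alpha2)
print(round(stitch_error_from_elevation(20.0, 0.5, 0.0), 2))  # 0.17 m, matching the text
```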
The experimental results show that the interior orientation error has a slight effect on the stitching accuracy: an interior orientation error of three pixels at the image edge causes a stitching error of only about 0.2 pixels. However, the most serious distortion usually exists in the overlapping area of the images. Distortion in the sample direction causes the relative orientation parameters to deviate from the true values, which finally affects the georeferencing accuracy of the images. Because the system combines multiple imagers, the FOV of each sub-imager is narrow, which limits the lens distortion. Nevertheless, to obtain better geometric positioning accuracy, laboratory calibration should be performed to reduce the distortion effects.
The accuracy of position and attitude measurement is the largest source of error. The error of external orientation elements directly influences the relative orientation and georectification effects for each sub-image. The imaging system analyzed in this paper has a very high resolution. When using DG method for geometric rectification, the positioning error caused by POS AV510 measurement accuracy is more than 1 pixel. The geometric distortion which cannot be eliminated completely will inevitably affect the accuracy of image stitching. Therefore, a high precision IMU/GNSS system is critical to ensure the high precision image stitching and georeferencing of AWSHRHIS.
The stability of the platform is also one of the main error factors. Under the influence of platform jitter and the imaging time difference for the same object features, the matching accuracy of homonymy points is greatly affected, which reduces the relative orientation accuracy and causes the relative orientation results to deviate from the true values. Therefore, a three-axis stabilized platform and a damping strategy should be used to isolate high-frequency vibration of the platform and reduce the internal geometric distortion of the original images as much as possible.
The method described in this paper also has some shortcomings.
- (1)
As mentioned before, the relative orientation results will deviate from the true values in some cases. Although seamless stitching within one strip can be realized, the absolute geometric positioning accuracy at the edge of the stitched image may be reduced. The main reasons are the small overlapping area and the unevenly distributed points, which lower the accuracy of the estimated parameters. On the basis of the relative orientation model, further research should investigate raising the georeferencing accuracy with the assistance of one GCP at the image edge or an adjacent strip.
- (2)
Actual flight test data are still lacking for verification. In the near future, flight tests will be performed, and extensive verification and analysis will be carried out.
5. Conclusions
In this paper, a relative orientation model based on narrow image overlaps was proposed to recover the relative relationship between the imagers of AWSHRHIS and achieve seamless image stitching. Experiments with simulation data were performed to assess the accuracy of the method and to analyze its sensitivity to different error factors. Several conclusions can be drawn as follows: (1) The proposed relative orientation and stitching method can be performed based on homonymy points and DEM data. Experiments show that the stitching accuracy reaches 1~2 pixels. (2) Attitude measurement accuracy and platform instability are the largest error factors. To achieve high-accuracy image processing for pushbroom imagers with high spatial resolution, high-precision IMU/GNSS equipment and a three-axis stabilized platform should be integrated to ensure the stitching and georeferencing accuracy of the images. (3) The proposed method has a stitching accuracy similar to existing methods, but better applicability. It does not need a large number of evenly distributed GCPs or calibration strips, thus reducing the operational complexity and cost of flight tests. Meanwhile, our method simplifies the process compared with the image registration method: all sub-images are stitched directly, without performing DG and registration for each image.
In the near future, the developed method should be further studied to improve the georeferencing accuracy. Besides, once flight tests are carried out for verification and analysis, different image stitching methods can be compared quantitatively based on the same set of data.
Overall, this method provides a new solution for image stitching of airborne hyperspectral imaging systems with multiple imagers, such as AWSHRHIS. It reduces the demand for GCPs and calibration strips; consequently, the cost of setting up and measuring GCPs, or the additional cost of calibration flight strips, can be saved. In addition, this method is suitable for operating areas where GCPs are difficult to lay out or measure. The results of the simulation experiments indicate that it can effectively improve the efficiency of airborne hyperspectral imaging and processing.