Article

Wide Swath and High Resolution Airborne HyperSpectral Imaging System and Flight Validation

Key Laboratory of Space Active Opto-Electronics Technology, Shanghai Institute of Technical Physics of CAS, Shanghai 200083, China
*
Author to whom correspondence should be addressed.
Sensors 2019, 19(7), 1667; https://doi.org/10.3390/s19071667
Submission received: 11 March 2019 / Revised: 2 April 2019 / Accepted: 4 April 2019 / Published: 8 April 2019
(This article belongs to the Section Remote Sensors)

Abstract:
The Wide Swath and High Resolution Airborne Pushbroom Hyperspectral Imager (WiSHiRaPHI) is China's new-generation airborne hyperspectral imaging instrument, aimed at acquiring accurate spectral curves of ground targets with both high spatial resolution and high spectral resolution. The spectral sampling interval of WiSHiRaPHI is 2.4 nm and the spectral resolution is 3.5 nm (FWHM), integrating 256 channels covering 400 nm to 1000 nm. The instrument has a 40-degree field of view (FOV) and a 0.125 mrad instantaneous field of view (IFOV), and can operate in high-spectral-resolution, high-spatial-resolution and high-sensitivity modes for different applications, adapting to Velocity-to-Height Ratios (VHR) below 0.04. System integration has been completed, and several airborne flight validation experiments have been conducted. The results demonstrate the system's excellent performance and high efficiency.

1. Introduction

Spectral imagers acquire not only a 2D spatial image, but also the spectral information of targets on Earth [1]. With this spectral information, many targets can be identified and analyzed that would be difficult to distinguish with a traditional camera because of their similarity in the RGB channels. Since JPL (Jet Propulsion Laboratory, Pasadena, CA, USA) developed the first spectral imager, AIS, in 1983 [2], airborne hyperspectral imagers have played an important role in agricultural yield estimation, atmospheric component analysis, environmental monitoring, etc. In 1989, JPL developed a new airborne imaging spectrometer: the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) [3,4,5,6,7]. It covers the range from 0.4 μm to 2.4 μm with 224 channels and a spectral sampling interval of less than 10 nm. The swath width from the U2 platform is 10.5 km, with an instantaneous field of view (IFOV) of about 1 mrad. The AVIRIS system has been improved several times since it first became operational to provide more accurate data [4]. Meanwhile, many other notable airborne hyperspectral imaging systems have been developed, such as LEISA [8], AisaFENIX [9,10], OMIS [11], PHI, HyMap [12] and CASI/SASI [13]. Their main features are shown in Table 1.
Although these airborne hyperspectral imagers have been widely used in many fields [14,15,16,17], their low spectral or spatial resolution, or small field of view (FOV), restricts working efficiency and cannot meet the requirements of many modern applications. Areas such as environmental pollution monitoring and precision agriculture require imagers with higher spectral and spatial resolution, larger dynamic range and wider FOV, so that pollution or crops can be surveyed rapidly and efficiently [18].
Under these circumstances, the new airborne hyperspectral imager project, the Wide Swath and High Resolution Airborne Pushbroom Hyperspectral Imager (WiSHiRaPHI), was set up in 2013 [19,20]. Because the characteristic spectra commonly used to study surface materials lie mainly in the visible and near-infrared band (Table 2), the wavelength range of the new hyperspectral imager was designed to cover 400–1000 nm. To ensure high working efficiency, the FOV of WiSHiRaPHI was designed to be 40 degrees, with a 0.125 mrad IFOV and a 2.4 nm spectral sampling interval.

2. WiSHiRaPHI System Introduction

The WiSHiRaPHI system consists of three subsystems to achieve its wide FOV. Each subsystem consists of fore optics, a spectrometer with a planar blazed grating, electronics, a three-axis platform, an IMU (inertial measurement unit), etc. The three subsystems cover the left, middle and right portions of the FOV, respectively, and are aligned on an arched frame to form the 40-degree FOV (Figure 1). The main features of the WiSHiRaPHI system are shown in Table 3.

2.1. Optical System

The optical system consists of fore optics and a spectrometer subsystem. The fore optics adopts a compact three-mirror off-axis (TMA) design, which facilitates system integration and image registration (Figure 2). The focal length of the fore optics is 128 mm, and the F-number is 3.8.
The spectrometer subsystem consists of a collimator, a prism-grating dispersion element and an imaging lens. Through the fore optics, objects are imaged on the slit, dispersed by the prism-grating element and then focused on the photosensitive surface of the detector. The system thus has low distortion and an excellent Modulation Transfer Function (MTF), shown in Figure 3.

2.2. Photoelectric System

The photoelectric system is the core of the hyperspectral imager and includes a CCD detector and the camera electronics. The CCD detector performs the photoelectric conversion, and the camera electronics control the operation of the CCD detector.
The camera electronics system, shown in Figure 4, consists of a power module, a CCD interface module, a driver module, an information processing and control module, a data transmission module and an RS422 communication module. Together, these modules provide different imaging modes with different spectral and spatial resolutions.
Taking detector scale, pixel size and frame frequency into account, a customized four-phase, back-illuminated, thinned frame-transfer CCD with a total of 2048 × 256 pixels, a full-well charge of about 200 ke and a pixel size of 16 μm × 16 μm was selected. Its main performance parameters are shown in Table 4.
The CCD sensor architecture is shown in Figure 5. The CCD detector consists of three parts: the imaging region, the memory region and the horizontal read register. The resolution of the imaging region is 2048 (H) × 256 (V) pixels. On both sides of this region, along the V direction, extra isolation rows are aligned, which may or may not be light-sensitive. CIx and CSx are the control signals for the imaging area and the memory area, and CRx is the horizontal pixel-readout control signal. The sensor operates by integrating photo-charges in the imaging region, which are then transferred via rapid clocks into the storage region. While the storage region reads out its photo-charge into the horizontal CCD, the imaging region integrates the next frame of photo-charge. The sensor has a total of 34 taps: 32 active taps and two dummy taps (one on each side of the sensor; the dummy taps are not shown in the architecture).
By controlling the timing of CS1, CS2, CS3, CS4 and RST, the photo-charges of adjacent pixels can be merged, enabling on-line programming of the spatial and spectral dimensions. With 256 spectral channels, the spectral resolution is 2.34 nm; with 64 channels, it is 9.32 nm. While satisfying the requirements of different applications, pixel merging in the spectral or spatial dimension also increases the effective pixel size, the sensitivity of the detector and the working frame frequency.
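The effect of this on-chip pixel merging can be sketched in software. The snippet below is an illustrative NumPy model, not the actual CCD clocking: summing groups of adjacent spectral rows emulates how charge binning trades spectral sampling for sensitivity and frame rate.

```python
import numpy as np

def bin_spectral(frame: np.ndarray, factor: int) -> np.ndarray:
    """Sum groups of `factor` adjacent spectral rows of a (spectral, spatial)
    frame. This emulates the effect of on-chip charge merging; the real
    merging is done inside the CCD by the CS/RST clock timing."""
    n_spec, n_spat = frame.shape
    assert n_spec % factor == 0, "spectral dimension must divide evenly"
    return frame.reshape(n_spec // factor, factor, n_spat).sum(axis=1)

# Full-resolution frame: 256 spectral channels x 2048 spatial pixels.
frame = np.ones((256, 2048))
binned = bin_spectral(frame, 4)          # 4x spectral binning -> 64 channels
print(binned.shape)                      # (64, 2048)
print(binned.sum() == frame.sum())       # charge is conserved: True
```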

2.3. Block System

The block system consists of three parts: the control electronics, the camera section (containing three cameras) and a data composite board, as shown in Figure 6. The control electronics provide power supply, communication and data storage, acting as the brain of the block system. The camera section, the core of the block system, handles detector driving, information acquisition and image-data transmission. The data composite board mainly performs data-format conversion and data forwarding.

3. Laboratory Calibration

Accurate calibration of hyperspectral sensors is indispensable for the successful use of their data. Laboratory calibration is conducted after the system is assembled to verify that its performance meets the design specifications. The calibration mainly covers spectral calibration, radiometric calibration, MTF and the Signal-to-Noise Ratio (SNR).

3.1. Spectral Calibration

Spectral calibration is performed using monochromatic collimated light from a monochromator as the light source [21,22]. To calibrate the Spectral Response Function (SRF) over the whole FOV, a full-FOV-covered spectral calibration facility was designed to perform the spectral calibration measurements, as shown in Figure 7.
By changing the wavelength of the monochromatic collimated light step by step, the spectral response curve of each channel is acquired (Figure 8). The SRF can be well approximated by a Gaussian function with appropriate parameters, as in Equation (1) [23].
$$ SRF(\lambda) = A_0 + A_1 \exp\!\left[-\frac{(\lambda - \lambda_0)^2}{2\sigma^2}\right] \tag{1} $$
where $\lambda_0$ is the center wavelength and the full width at half maximum (FWHM) is calculated as $\mathrm{FWHM} = 2\sqrt{2\ln 2}\,\sigma$.
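The fit can be sketched as follows. This is a minimal illustration using SciPy's `curve_fit` on a synthetic monochromator scan (the wavelength grid, noise level and channel parameters are invented for the example, not measured WiSHiRaPHI values): a Gaussian of the form of Equation (1) is fitted and the FWHM derived from $\sigma$.

```python
import numpy as np
from scipy.optimize import curve_fit

def srf(lam, a0, a1, lam0, sigma):
    # Equation (1): Gaussian spectral response with a constant offset a0.
    return a0 + a1 * np.exp(-(lam - lam0) ** 2 / (2 * sigma ** 2))

# Synthetic monochromator scan of one channel (all numbers invented for
# the example; real inputs would be the measured responses of Figure 8).
lam = np.arange(540.0, 552.0, 0.2)
rng = np.random.default_rng(0)
meas = srf(lam, 0.01, 1.0, 546.0, 1.5) + rng.normal(0.0, 0.005, lam.size)

popt, _ = curve_fit(srf, lam, meas, p0=[0.0, 1.0, 546.0, 1.0])
lam0, sigma = popt[2], abs(popt[3])
fwhm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma   # FWHM = 2*sqrt(2 ln 2)*sigma
print(round(lam0, 1), round(fwhm, 1))
```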
To ensure that the wavelength of the light output by the monochromator is accurate, the 546 nm standard spectral line of a mercury lamp was used to correct the monochromator before scanning. By scanning the range around 546 nm at a step of 0.1 nm and calculating the barycenter of the response curve, the accuracy of the monochromator was verified to be below 0.1 nm (Figure 9).
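The barycenter computation used to verify the monochromator can be sketched as follows (the response values are a synthetic Gaussian, for illustration only; real data would be the scan of Figure 9):

```python
import numpy as np

def barycenter(lam: np.ndarray, response: np.ndarray) -> float:
    """Response-weighted mean wavelength (spectral barycenter) of a scan."""
    return float(np.sum(lam * response) / np.sum(response))

# Scan around the 546 nm mercury line at a 0.1 nm step.
lam = np.arange(545.0, 547.05, 0.1)
resp = np.exp(-(lam - 546.0) ** 2 / (2 * 0.3 ** 2))
print(round(barycenter(lam, resp), 3))   # 546.0
```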

3.2. Radiometric Calibration

Radiometric calibration comprises relative radiometric calibration and absolute radiometric calibration. Relative radiometric calibration corrects the response differences between pixels in the same spectral channel. Absolute radiometric calibration establishes the relationship between the pupil radiance and the response of the detector. Radiometric calibration is conducted using an integrating sphere and a standardized (reference) hyperspectral sensor.
Relative radiometric calibration accuracy indicates the residual error between each pixel's response after relative radiometric calibration and the mean response of the same spectral channel. It is measured by setting the integrating sphere to different radiance levels and recording the response of the sensor under calibration. Relative radiometric calibration accuracy is calculated as Equation (2).
$$ RR(\lambda) = \mathop{\mathrm{mean}}_{n=1\ldots N}\left( \frac{ \sqrt{ \dfrac{1}{I-1} \sum_{i=1}^{I} \left( DN_{i,n}(\lambda) - \overline{DN_n(\lambda)} \right)^2 } }{ \overline{DN_n(\lambda)} } \right) \tag{2} $$
where $DN_{i,n}(\lambda)$ is the response of the $i$-th pixel in channel $\lambda$ at the $n$-th radiance level after relative radiometric calibration, and $\overline{DN_n(\lambda)}$ is the mean response of all pixels in channel $\lambda$ at the $n$-th radiance level.
The relative radiometric calibration accuracy of the hyperspectral imager is determined by the worst $RR(\lambda)$ over all $\lambda$; on this basis, the relative radiometric calibration accuracy is 1.9%. Figure 10 shows the relative radiometric calibration accuracy of the three subsystems at eight different bands.
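Equation (2) can be sketched in code. The example below applies the formula to synthetic post-correction responses; the radiance levels, pixel count and roughly 1% residual pixel-to-pixel spread are illustrative assumptions, not measured data.

```python
import numpy as np

def relative_cal_accuracy(dn: np.ndarray) -> float:
    """Equation (2): dn has shape (N radiance levels, I pixels) and holds the
    responses of one spectral channel after relative radiometric correction."""
    mean_per_level = dn.mean(axis=1)                        # mean DN_n(lambda)
    dev = dn - mean_per_level[:, None]
    rms = np.sqrt((dev ** 2).sum(axis=1) / (dn.shape[1] - 1))
    return float((rms / mean_per_level).mean())             # mean over n

rng = np.random.default_rng(1)
levels = np.array([500.0, 1000.0, 2000.0, 3000.0, 4000.0])[:, None]
dn = levels * (1.0 + rng.normal(0.0, 0.01, (5, 2048)))      # ~1 % spread
print(f"{relative_cal_accuracy(dn) * 100:.2f} %")
```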
Absolute radiometric calibration accuracy indicates the error between the pixel's response after relative radiometric calibration and the pupil radiance. It mainly comprises three components: the unsteady error $R_1$, the nonlinearity error $R_2$ and the standardized hyperspectral sensor's error $R_3$. It is measured using the response of the sensor under calibration after relative radiometric correction, and can be expressed as Equation (3):
$$ R_A = \sqrt{R_1^2 + R_2^2 + R_3^2} \tag{3} $$
The unsteady error indicates the stability of the sensor's response to the same radiance. It is calculated as Equation (4):
$$ R_1 = \mathop{\mathrm{mean}}_{i,\lambda}\left( \max_{n=1\ldots N}\left( \frac{ \sqrt{ \dfrac{1}{500} \sum_{j=1}^{500} \left( DN_{i,n}^{j}(\lambda) - \overline{DN_{i,n}(\lambda)} \right)^2 } }{ \overline{DN_{i,n}(\lambda)} } \times 100\% \right) \right) \tag{4} $$
where $DN_{i,n}^{j}(\lambda)$ is the $j$-th response of the $i$-th pixel in channel $\lambda$ at the $n$-th radiance level, and $\overline{DN_{i,n}(\lambda)} = \frac{1}{500}\sum_{j=1}^{500} DN_{i,n}^{j}(\lambda)$.
The nonlinearity error is calculated as Equation (5):
$$ R_2 = \mathop{\mathrm{mean}}_{i,\lambda}\left( \frac{ RMSE_i(\lambda) }{ \frac{1}{N} \sum_{n=1}^{N} DN_{i,n}(\lambda) } \times 100\% \right) \tag{5} $$
where $RMSE_i(\lambda)$ is the standard deviation of the linear-fitting residual error.
The standardized hyperspectral sensor's error $R_3$ is determined by the reference sensor itself and is usually a constant. The absolute calibration accuracy of the three WiSHiRaPHI subsystems is listed in Table 5.
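Combining the three error terms per Equation (3) is a simple root-sum-square; the numbers below form an illustrative error budget, not the measured WiSHiRaPHI values of Table 5.

```python
import math

def absolute_cal_accuracy(r1: float, r2: float, r3: float) -> float:
    """Equation (3): root-sum-square of the unsteady error R1, the
    nonlinearity error R2 and the reference sensor's error R3 (in %)."""
    return math.sqrt(r1 ** 2 + r2 ** 2 + r3 ** 2)

# Illustrative error budget in percent.
print(round(absolute_cal_accuracy(0.5, 0.8, 2.0), 2))   # 2.21
```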

3.3. MTF Determination

MTF is one of the most important indicators for evaluating the image quality of a hyperspectral imager. It describes the modulation transfer characteristic of the imager for targets of different sizes, which directly affects the spatial sharpness of the hyperspectral image. MTF calibration is conducted using a streak target at the focal plane of a collimator (Figure 11). The width of one streak is chosen, given the focal length of the collimator, so that its image on the detector covers exactly one pixel. From the responses with the streak blocking the light or not, the MTF at the Nyquist frequency is calibrated as Equation (6) [24]. The measured MTF in high-spectral mode and high-spatial mode is shown in Figure 12 and Figure 13.
$$ MTF_{\mathrm{Nyquist}} = \frac{DN_{\max} - DN_{\min}}{DN_{\max} + DN_{\min}} \times \frac{\pi}{4} \tag{6} $$
where $DN_{\min}$ is the signal where the streak blocks the light (the dark signal between two bright signals) and $DN_{\max}$ is the signal where the light passes.
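Equation (6) can be sketched directly; the bright/dark responses below are illustrative values, not measured data.

```python
import math

def mtf_nyquist(dn_max: float, dn_min: float) -> float:
    """Equation (6): square-wave modulation converted to sine-wave MTF via
    the pi/4 factor (first harmonic of the square wave)."""
    return (dn_max - dn_min) / (dn_max + dn_min) * math.pi / 4

# Illustrative bright/dark responses behind the streak target.
print(round(mtf_nyquist(3000.0, 1000.0), 3))   # 0.393
```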

3.4. SNR Determination

SNR is the ratio of the equivalent charge of the output signal to the equivalent charge of the noise, and it is an important indicator of the detection sensitivity of the system. As with absolute radiometric calibration, SNR calibration is conducted using the integrating sphere and the standardized hyperspectral sensor. Because the integrating sphere uses tungsten-halogen lamps as its light source, its spectral radiance curve differs considerably from the solar spectrum across the bands (Figure 14). Several radiance levels are therefore used so that, at a suitable level, the spectral radiance of the integrating sphere approximates that of the sun in several bands. At each radiance level, 100 images are taken to calibrate the SNR, expressed as Equation (7).
$$ SNR_i(\lambda) = \frac{ \frac{1}{100} \sum_{j=1}^{100} DN_i^{j}(\lambda) }{ \sqrt{ \frac{1}{99} \sum_{j=1}^{100} \left( DN_i^{j}(\lambda) - \overline{DN_i^{dark}(\lambda)} \right)^2 } } \tag{7} $$
where $DN_i^{j}(\lambda)$ is the $j$-th response of the $i$-th pixel in channel $\lambda$, and $\overline{DN_i^{dark}(\lambda)}$ is the mean background of the $i$-th pixel in channel $\lambda$.
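A temporal SNR estimate in the spirit of Equation (7) can be sketched as follows. Note that this sketch uses the common mean-over-standard-deviation form, with the dark background subtracted from the signal; it may differ in detail from the exact formula above, and all numbers are synthetic.

```python
import numpy as np

def temporal_snr(frames: np.ndarray, dark_mean: float) -> float:
    """Mean dark-subtracted signal over the temporal standard deviation of
    the signal, for one pixel; a common SNR estimate over repeated frames."""
    return float((frames.mean() - dark_mean) / frames.std(ddof=1))

# 100 synthetic frames of one pixel: ~2000 DN net signal, 10 DN noise,
# 100 DN dark background (all values invented for the example).
rng = np.random.default_rng(2)
frames = 2100.0 + rng.normal(0.0, 10.0, 100)
print(round(temporal_snr(frames, 100.0), 1))
```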
An extrapolation method is used to convert the SNR measured in the laboratory to the conditions of actual use. The specific formula for estimating the in-flight SNR from the measured SNR is Equation (8):
$$ SNR_{sun}(\lambda) = SNR_{measure}(\lambda)\,\frac{L_{sun}(\lambda)}{L_{measure}(\lambda)} \tag{8} $$
where $L_{sun}(\lambda)$ is the spectral radiance of the sun at wavelength $\lambda$, and $L_{measure}(\lambda)$ is the radiance measured using the integrating sphere.
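The scaling of Equation (8) is a one-line calculation; the radiance values below are illustrative, not measured.

```python
def snr_extrapolate(snr_measured: float, l_sun: float, l_measured: float) -> float:
    """Equation (8): linear scaling of the SNR measured on the integrating
    sphere to the expected radiance under solar illumination."""
    return snr_measured * l_sun / l_measured

# Illustrative radiances in W/(m^2*nm*sr).
print(snr_extrapolate(200.0, 0.5, 0.25))   # 400.0
```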
Figure 15 shows the scene of the SNR test and the test result at central FOV of the system. The SNR curve indicates that WiSHiRaPHI has high sensitivity and large dynamic range.

4. Airborne Flight Validation Experiments

Several aerial flight experiments have been conducted to verify the status of WiSHiRaPHI, in Dongfang City of Hainan Province, Zhenjiang Port of Jiangsu Province, Xiong'an of Hebei Province and Sansha City of Hainan Province, and a large number of high-quality images were obtained [25].
Figure 16 shows the RGB synthesis image of a coast in Dongfang. A boat at sea was easily distinguished from the oceanic background using characteristic bands of the hyperspectral image, because the spectral curve of the boat's materials differs from that of the oceanic background.
Figure 17 shows the hyperspectral image of an island with an atoll. Figure 17a is an RGB image with R = 670 nm, G = 560 nm, B = 470 nm, and Figure 17b with R = 600 nm, G = 480 nm, B = 430 nm. Figure 17c shows the reflectance curves of some typical targets. By choosing the proper RGB channels, the targets of interest were easily enhanced in the image.
A major application of the hyperspectral imager is to classify and measure targets using the spectral radiance of the hyperspectral image. Because of the difference between the spectral radiance curve of the integrating sphere and that of the sun, the laboratory radiometric calibration was not precise enough. To increase the radiometric calibration accuracy, an air route for in-flight radiometric calibration was planned [26]. Five standard diffuse reflectance targets with different reflectances were laid out along the track to provide different radiance levels. An ASD hyperspectral sensor was used to measure the radiance as WiSHiRaPHI flew over the diffuse reflectance targets (Figure 18). The image of each target should cover at least 4 × 4 pixels to reduce random error. Thus, the height and speed of the air route depend on the size of the diffuse reflectance targets and the working frame frequency (Equation (9)).
$$ N_{cross} = \frac{W_{cross}}{h \cdot IFOV}, \qquad N_{along} = \frac{W_{along} \cdot f}{v} \tag{9} $$
where $N_{cross}$ and $N_{along}$ are the numbers of pixels the target covers across/along the track, $W_{cross}$ and $W_{along}$ are the corresponding lengths of the target, $v$ is the flight speed, $h$ is the flight height and $f$ is the working frame frequency.
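Equation (9) can be checked with a short calculation; all parameter values below are illustrative, not those of the actual calibration flights.

```python
def target_pixels(w_cross: float, w_along: float, h: float,
                  ifov: float, v: float, f: float) -> tuple:
    """Equation (9): pixels a ground target covers across and along track."""
    n_cross = w_cross / (h * ifov)   # across track: ground sample = h * IFOV
    n_along = w_along * f / v        # along track: ground step = v / f
    return n_cross, n_along

# Illustrative: 5 m targets, 1000 m height, 0.25 mrad IFOV, 60 m/s, 100 Hz.
n_cross, n_along = target_pixels(5.0, 5.0, 1000.0, 0.25e-3, 60.0, 100.0)
print(n_cross, round(n_along, 1))   # 20.0 8.3
```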
To validate the radiometric calibration accuracy, several other targets around the diffuse reflectance targets were also measured with the ASD. The spectral radiance curves of these targets, measured with both the ASD and WiSHiRaPHI, are shown in Figure 19. The maximum residual error between the ASD and WiSHiRaPHI was $0.02\ \mathrm{W/(m^2 \cdot nm \cdot sr)}$.
Figure 20 shows RGB synthesis images of Feicheng, Shandong in high-spatial-resolution mode and high-spectral mode. The spatial resolution was 12.5 cm at 1000 m in high-spatial mode and 25 cm at 1000 m in high-spectral mode. In the high-spatial-resolution image, many details, such as the texture of the rail and the track of the car, can be seen more clearly than in high-spectral mode.
In September 2018, WiSHiRaPHI had the opportunity to fly over Xiong'an. Thanks to the large FOV and dynamic range, the system covered an area of about 28 km × 48 km in two days, using only twenty air routes at a height of 2100 m, with the spatial resolution reaching 0.5 m (Figure 21). Figure 22 shows the land-use classification of a plot in Xiong'an obtained from the hyperspectral image data. With the spectral curves from the hyperspectral image, different land-cover elements were classified and identified, demonstrating the use of the hyperspectral imager in extracting crop-growth information, monitoring crop growth and quality, and protecting agricultural resources and environmental quality.

5. Conclusions

Hyperspectral imaging technology shows huge potential in many fields because of its capacity to fuse traditional images with spectral information. An airborne hyperspectral imager offers high spatial resolution with high timeliness, which makes airborne hyperspectral technology very convenient for environmental monitoring, agricultural resource investigation, mineral research, etc. The WiSHiRaPHI system has been designed and integrated with a spectral range covering 400–1000 nm and a spectral resolution better than 5 nm. The total FOV exceeding 40° and the 0.25 mrad IFOV endow WiSHiRaPHI with high working efficiency and high spatial resolution. A weight below 20 kg and a power consumption of about 60 W mean the system can be installed on many platforms, such as the ARJ-21, Y-12, etc. Several flight validation experiments were conducted, and a large amount of high-quality image data was obtained. These data have been used in actual operations and meet the needs well. More flight missions have been arranged for the coming year.

Author Contributions

Conceptualization, J.W. and Y.W.; Methodology, L.W. and H.Y.; Software, C.Z., D.Z.; Validation, Y.W. and D.Z.; Formal Analysis, S.W. and G.H.; Investigation, S.W. and G.H.; Resources, D.H.; Data Curation, D.H., S.W. and G.H.; Writing-Original Draft Preparation, D.Z.; Writing-Review & Editing, D.Z.; Visualization, H.Y. and Y.W.; Supervision, L.W. and Y.W.; Project Administration, Y.W.

Funding

This research received no external funding.

Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their thoughtful comments and suggestions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Greer, J.; Flake, J.C.; Busuioceanu, M.; Messinger, D.W. Compressive spectral imaging for accurate remote sensing. SPIE Newsroom 2013. [Google Scholar] [CrossRef]
  2. Vane, G.; Goetz, A.F.H.; Wellman, J.B. Airborne imaging spectrometer: A new tool for remote sensing. IEEE Trans. Geosci. Remote Sens. 1984, GE-22, 546–549. [Google Scholar] [CrossRef]
  3. Green, R.O.; Eastwood, M.L.; Sarture, C.M.; Chrien, T.G.; Aronsson, M.; Chippendale, B.J. Imaging spectroscopy and the airborne visible/infrared imaging spectrometer (AVIRIS). Remote Sens. Environ. 1998, 65, 227–248. [Google Scholar] [CrossRef]
  4. Chrien, T.G.; Green, R.O.; Chovit, C.; Eastwood, M.; Faust, J.; Hajek, P. New calibration techniques for the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). Available online: https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19950027368.pdf (accessed on 8 March 2019).
  5. Porter, W.M.; Enmark, H.T. A system overview of the airborne visible/infrared imaging spectrometer (AVIRIS). In Proceedings of the Imaging Spectroscopy II, International Society for Optics and Photonics, San Diego, CA, USA, 20–21 August 1987; Volume 834, pp. 22–32. [Google Scholar]
  6. Eastwood, M.L.; Sarture, C.M.; Chrien, T.G.; Green, R.O.; Porter, W.M. Current instrument status of the airborne visible/infrared imaging spectrometer (AVIRIS). In Proceedings of the Infrared Technology XVII, International Society for Optics and Photonics, San Diego, CA, USA, 22–26 July 1991; Volume 1540, pp. 164–176. [Google Scholar]
  7. Macenka, S.A.; Chrisp, M.P. Airborne visible/infrared imaging spectrometer (AVIRIS) spectrometer design and performance. In Proceedings of the Imaging Spectroscopy II, International Society for Optics and Photonics, San Diego, CA, USA, 20–21 August 1987; Volume 834, pp. 32–44. [Google Scholar]
  8. McCabe, G.H.; Reuter, D.C.; Jennings, D.E.; Shu, P.K.; Tsay, S.C.; Coronado, P.L.; Ross, J.L. Observations using the airborne Linear Etalon Imaging Spectral Array (LEISA): 1-to 25-micron hyperspectral imager for remote sensing applications. In Proceedings of the Sensors, Systems, and Next-Generation Satellites III, International Society for Optics and Photonics, Florence, Italy, 20–24 September 1999; Volume 3870, pp. 140–150. [Google Scholar]
  9. AisaFENIX: AisaFENIX Hyperspectral Sensor. Available online: http://www.specim.fi/products/aisafenix-hyperspectral-sensor (accessed on 8 March 2019).
  10. Cushnahan, T.; Yule, I.J.; Grafton, M.C.E.; Pullanagari, R.; White, M. The Classification of Hill Country Vegetation from Hyperspectral Imagery. Available online: https://mro.massey.ac.nz/handle/10179/10890 (accessed on 8 March 2019).
  11. Zhao, Y.; Meng, Z.; Wang, L.; Miyazaki, S.; Geng, X.; Zhou, G.; Li, X. A new cross-track Radiometric Correction method (VRadCor) for airborne hyperspectral image of Operational Modular Imaging Spectrometer (OMIS). In Proceedings of the 2005 IEEE International Geoscience and Remote Sensing Symposium, IGARSS’05, Seoul, Korea, 29 July 2005; Volume 5, pp. 3553–3556. [Google Scholar]
  12. Kruse, F.A.; Boardman, J.W.; Lefkoff, A.B.; Young, J.M.; Kierein-Young, K.S.; Cocks, T.D.; Cocks, P.A. HyMap: An Australian hyperspectral sensor solving global problems-results from USA HyMap data acquisitions. In Proceedings of the 10th Australasian Remote Sensing and Photogrammetry Conference, Adelaide, Australia, 21–25 August 2000. [Google Scholar]
  13. Babey, S.K.; Anger, C.D. Compact airborne spectrographic imager (CASI): A progress review. In Proceedings of the Imaging Spectrometry of the Terrestrial Environment, International Society for Optics and Photonics, Orlando, FL, USA, 11–16 April 1993; Volume 1937, pp. 152–164. [Google Scholar]
  14. Pan, W.; Yang, X.; Chen, X.; Feng, P. Application of Hymap image in the environmental survey in Shenzhen, China. In Proceedings of the Remote Sensing Technologies and Applications in Urban Environments II, International Society for Optics and Photonics, Warsaw, Poland, 11–14 September 2017; Volume 10431, p. 104310. [Google Scholar]
  15. Bachmann, C.M.; Donato, T.F.; Fusina, R.A.; Lathrop, R.; Geib, J.; Russ, A.L.; Rhea, W.J. Coastal land-cover mapping: A comparison of PHILLS, HyMAP, and PROBE2 airborne hyperspectral imagery. In Proceedings of the Imaging Spectrometry IX, International Society for Optics and Photonics, San Diego, CA, USA, 3–8 August 2003; Volume 5159, pp. 180–188. [Google Scholar]
  16. Gonzalez-Sampedro, M.C.; Zomer, R.J.; Alonso-Chorda, L.; Moreno, J.F.; Ustin, S.L. Direct gradient analysis as a new tool for interpretation of hyperspectral remote sensing data: Application to HYMAP/DAISEX-99 data. In Proceedings of the Remote Sensing for Agriculture, Ecosystems, and Hydrology II, International Society for Optics and Photonics, Barcelona, Spain, 25–29 September 2000; Volume 4171, pp. 229–239. [Google Scholar]
  17. Bishop, C.A.; Liu, J.G.; Mason, P.J. Hyperspectral remote sensing for mineral exploration in Pulang, Yunnan Province, China. Int. J. Remote Sens. 2011, 32, 2409–2426. [Google Scholar] [CrossRef]
  18. Puschell, J.J. Hyperspectral imagers for current and future missions. In Proceedings of the Visual Information Processing IX, International Society for Optics and Photonics, Orlando, FL, USA, 24–28 April 2000; Volume 4041. [Google Scholar]
  19. Wang, Y.M.; Hu, W.D.; Shu, R.; Li, C.L.; Yuan, L.Y.; Wang, J.Y. Recent progress of push-broom infrared hyper-spectral imager in SITP. In Proceedings of the Society of Photo-Optical Instrumentation Engineers, Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, Anaheim, CA, USA, 9–13 April 2007. [Google Scholar]
  20. Yu, Y.N.; Wang, Y.M.; Yuan, L.Y.; Wang, S.W.; Zhao, D.; Wang, J.Y. Wide field of view visible and near infrared pushbroom airborne hyperspectral imager. In Proceedings of the Infrared Technology and Applications XLIV, International Society for Optics and Photonics, Orlando, FL, USA, 15–19 April 2018; Volume 10624, p. 106240. [Google Scholar]
  21. Vane, G.; Chrien, T.G.; Reimer, J.H.; Green, R.O.; Conel, J.E. Comparison of laboratory calibrations of the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) at the beginning and end of the first flight season. In Proceedings of the Recent Advances in Sensors, Radiometry, and Data Processing for Remote sensing, International Society for Optics and Photonics, Orlando, FL, USA, 4–8 April 1988; Volume 924, pp. 168–179. [Google Scholar]
  22. Gege, P.; Fries, J.; Haschberger, P.; Schötz, P.; Schwarzer, H.; Strobl, P.; Vreeling, W.J. Calibration facility for airborne imaging spectrometers. ISPRS J. Photogramm. Remote Sens. 2009, 64, 387–397. [Google Scholar] [CrossRef]
  23. Yu, X.; Sun, Y.; Fang, A.; Qi, W.; Liu, C. Laboratory Spectral Calibration and Radiometric Calibration of Hyper-Spectral Imaging Spectrometer. In Proceedings of the International Conference on Systems & Informatics, Shanghai, China, 15–17 November 2014. [Google Scholar]
  24. Boreman, G.D. Modulation Transfer Function in Optical and Electro-Optical Systems; SPIE Press: Bellingham, WA, USA, 2001. [Google Scholar]
  25. Wang, Y.M. Flight experiment of wide-FOV airborne hyperspectral imager. Asia-Pac. Remote Sens. Tech. Program. 2018, 10780, 35. [Google Scholar]
  26. Hörig, B.; Kühn, F.; Oschütz, F.; Lehmann, F. HyMap hyperspectral remote sensing to detect hydrocarbons. Int. J. Remote Sens. 2001, 22, 1413–1422. [Google Scholar] [CrossRef]
Figure 1. Sketch map of the Wide Swath and High Resolution Airborne Pushbroom Hyperspectral Imager (WiSHiRaPHI) system.
Figure 2. Optical system of WiSHiRaPHI.
Figure 3. Modulation Transfer Function (MTF) curve and spot map of the optical system at 0.4 μm, 0.7 μm and 1 μm (left are spot maps, and right are MTF curve).
Figure 4. Electronic system block diagram.
Figure 5. CCD structure diagram.
Figure 6. System block diagram.
Figure 7. Full-field of view (FOV)-covered spectral calibration facility.
Figure 8. Spectral response curve of the left subsystem at Band 51–55 (mean full width at half maximum (FWHM) is 3.48 nm).
Figure 9. Monochromator’s response for 546 nm standard spectral line of mercury lamp (error of monochromator is 0.08 nm).
Figure 10. Relative radiometric calibration accuracy of three subsystems at Band 50, Band 75, Band 100, Band 125, Band 150, Band 175, Band 200, and Band 225.
Figure 11. Sketch map of the streak target and MTF calibration system. (a) Shows the streak target and (b) shows the calibration system.
Figure 12. MTF calibration of three bands at the central FOV of high spectral mode (Band 120: MTF = 0.51, Band 180: MTF = 0.54, Band 240: MTF = 0.47).
Figure 13. MTF calibration of three bands at the central FOV of high spatial mode (Band 30: MTF = 0.27, Band 45: MTF = 0.29, Band 62: MTF = 0.24).
Figure 14. Spectral radiance curves of integrating sphere and solar.
Figure 15. Scene of Signal-Noise Ratio (SNR) test (a) and the curve of SNR at central FOV (b).
Figure 16. Hyperspectral image of a port ((a) shows the RGB image of the coast, (b) and (c) show the images at two different spectral channels, (d) distinguishes the boat from the background using characteristic bands, and (e) shows the spectral curves of white water, boat and dark water).
Figure 16. Hypers-pectral image of a port ((a) shows the RGB image of the coast, (b) and (c) show the images at two different spectral channels, (d) distinguishes the boat from the background using characteristic bands, and (e) shows the spectral curves of white water, boat and dark water).
Sensors 19 01667 g016
Figure 17. Hyperspectral image of an island with an atoll ((a) and (b) show RGB images at different spectral channels, and (c) shows the reflectance curves of some typical targets).
Figure 18. Hyperspectral image of diffuse reflectance targets ((b) shows a photograph of the diffuse reflectance targets, and (c) shows the radiance curves of these targets).
Figure 19. Hyperspectral image of diffuse reflectance targets.
Figure 20. RGB composite images in the high spatial and high spectral resolution modes. (a) shows the high spatial resolution mode (instantaneous field of view (IFOV) = 0.125 mrad) and (b) the high spectral resolution mode (IFOV = 0.25 mrad).
Figure 21. Air route plan of Xiong'an City.
Figure 22. Land classification map.
Table 1. Main features of some typical airborne hyperspectral imaging systems.

| System | Spectral Range (nm) | Spectral Sampling Interval | Number of Channels | FOV (deg) | IFOV (mrad) |
|---|---|---|---|---|---|
| AVIRIS | 400–2500 | 10 nm | 224 | 30 | 1 |
| LEISA | 1000–2500 | 4–10 nm | 432 | 19 | 2 |
| AisaFENIX | 380–2500 | 3–5 nm@380–970 nm, 12 nm@970–2500 nm | 448 | 32.3 | 1.4 |
| OMIS | 400–12,500 | 10 nm@0.4–1.1 μm, 30 nm@1.1–1.7 μm, 15 nm@2–2.5 μm, 2 μm@3–5 μm, 600 nm@8–12.5 μm | 128 | 73 | 1.5/3 |
| PHI | 400–850 | 1.8 nm | 244 | 21 | 1.5 |
| HyMap | 400–2500 | 10–20 nm | 128 | 61.3 | 2 × 2.5 |
| CASI/SASI | 400–2500 | 2.4 nm/7.5 nm | 96/200 | 40 | 0.49/0.698 |
Table 2. Characteristic spectral lines of typical materials in the visible and infrared bands.

| Band (μm) | Characteristic Spectral Line | Band (μm) | Characteristic Spectral Line |
|---|---|---|---|
| 0.46–0.48 | Absorption of renieratene (high) | 0.66–0.68 | Reflectance valley for most plants |
| 0.50–0.52 | Reflectance of chlorophyll (high) | 0.70–0.72 | Red edge of plants |
| 0.54–0.56 | Absorption of Fe2+, Fe3+ | 0.88–0.90 | Reflectance peak for plants, absorption of Fe3+ |
| 0.56–0.62 | Absorption of phycoerythrin | 0.92–0.94 | Absorption of Fe2+ |
Table 3. Main features of the WiSHiRaPHI system.

| Parameter | Index | Parameter | Index |
|---|---|---|---|
| Spectral Range (μm) | 0.4–1.0 | MTF | ≥0.5 |
| FOV | 40° | VHR | 0.02–0.04 |
| Spectral Resolution (nm) | 3.5/9.2, adjustable | Weight | ≤20 kg |
| Number of Channels | 256/64, adjustable | Power Consumption | 60 W |
| IFOV (mrad) | 0.25/0.125, adjustable | Platform | ARJ-21, Y-12, etc. |
| SNR | ≥500 (ρ = 0.3) | | |
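The swath width and ground resolution implied by the FOV and IFOV in Table 3 follow from simple viewing geometry. The altitude below is an assumed example for illustration, not a value taken from the paper:

```python
import math

# Assumed flight altitude; FOV/IFOV are taken from Table 3.
H = 3000.0            # m, illustrative altitude above ground
fov_deg = 40.0        # full field of view
ifov = 0.125e-3       # rad, high spatial resolution mode

swath = 2 * H * math.tan(math.radians(fov_deg / 2))  # ground swath width
gsd = H * ifov                                       # ground sampling distance at nadir
print(f"swath ~ {swath:.0f} m, GSD ~ {gsd:.3f} m")
```

At 3000 m this gives a swath of roughly 2.2 km with a nadir ground sampling distance of 0.375 m, which is the sense in which the system combines a wide swath with high spatial resolution.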
Table 4. Parameters of the CCD detector.

| Item | Parameter |
|---|---|
| Detector format | 2048 × 256, frame transfer |
| Pixel size | 16 μm × 16 μm |
| Number of output channels | 32 |
| Maximum pixel rate | 25 MHz |
| Full-well charge | ≥200,000 e⁻ |
| Charge conversion efficiency (CCE) | 8 μV/e⁻ |
| QE | ≥61% @ 248 nm |
| Noise | ≤55 e⁻ |
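The output-channel count and pixel rate in Table 4 bound the achievable frame (line) rate of the detector. The calculation below ignores readout overheads such as the frame-transfer interval, so it is an upper bound rather than the operating line rate:

```python
# Upper-bound frame rate implied by Table 4: all 32 outputs read the
# 2048 x 256 frame in parallel at the maximum pixel rate.
outputs = 32
pixel_rate = 25e6            # Hz per output channel
pixels_per_frame = 2048 * 256

max_frame_rate = outputs * pixel_rate / pixels_per_frame
print(round(max_frame_rate))  # ~ 1526 frames per second
```

This margin is what allows the pushbroom line rate to keep up with velocity-to-height ratios up to 0.04 at the 0.125 mrad IFOV.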
Table 5. Relative radiometric calibration accuracy of the three subsystems.

| Subsystem | R1 | R2 | R3 | RA |
|---|---|---|---|---|
| Left | 1.89% | 0.29% | 5.01% | 5.36% |
| Middle | 2.37% | 0.28% | 5.01% | 5.55% |
| Right | 1.45% | 0.31% | 5.01% | 5.22% |
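The RA column in Table 5 is consistent with combining R1–R3 as independent error terms in root-sum-square fashion, which the following check confirms:

```python
import math

# Root-sum-square combination of the three error terms from Table 5.
rows = {
    "Left":   (1.89, 0.29, 5.01),
    "Middle": (2.37, 0.28, 5.01),
    "Right":  (1.45, 0.31, 5.01),
}
for name, (r1, r2, r3) in rows.items():
    ra = math.sqrt(r1**2 + r2**2 + r3**2)
    print(f"{name}: RA = {ra:.2f}%")  # 5.36%, 5.55%, 5.22% as tabulated
```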

Zhang, D.; Yuan, L.; Wang, S.; Yu, H.; Zhang, C.; He, D.; Han, G.; Wang, J.; Wang, Y. Wide Swath and High Resolution Airborne HyperSpectral Imaging System and Flight Validation. Sensors 2019, 19, 1667. https://doi.org/10.3390/s19071667
