Article

Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring

1
USDA-Agricultural Research Service, Hydrology and Remote Sensing Laboratory, Building 007 Room 104 BARC-West, 10300 Baltimore Avenue, Beltsville, MD 20705, USA
2
IntelliTech Microsystems, Inc., 2138 Priest Bridge Court, Suite 3, Crofton, MD 21114, USA
3
DSL Consulting, Inc., 7611 Kingfisher Court, Dexter, MI 48130, USA
*
Author to whom correspondence should be addressed.
Remote Sens. 2010, 2(1), 290-305; https://doi.org/10.3390/rs2010290
Submission received: 8 November 2009 / Revised: 1 December 2009 / Accepted: 6 January 2010 / Published: 11 January 2010

Abstract

Payload size and weight are critical factors for small Unmanned Aerial Vehicles (UAVs). Digital color-infrared photographs were acquired from a single 12-megapixel camera that did not have an internal hot-mirror filter and had a red-light-blocking filter in front of the lens, resulting in near-infrared (NIR), green and blue images. We tested the UAV-camera system over two variably-fertilized fields of winter wheat and found a good correlation between leaf area index and the green normalized difference vegetation index (GNDVI). The low cost and very-high spatial resolution associated with the camera-UAV system may provide important information for site-specific agriculture.

Graphical Abstract

1. Introduction

For more than 50 years, aerial color and color-infrared photography have been used to monitor crop growth [1]. Currently, these methods are being re-assessed for analyzing within-field spatial variability for agricultural precision management, because aerial imagery can be acquired quickly during critical periods of rapid crop growth [2,3,4,5,6,7]. Imagery acquired from manned aircraft is typically collected with large-format cameras and is expensive compared to other types of imagery [8]. Given their payload capacity, manned aircraft can carry a wide variety of sensors, power supplies and data-storage options. Unmanned aerial vehicles (UAVs) and other small unmanned aircraft are potentially lower in cost but have a much smaller payload capacity, so light-weight, compact sensors are required. Furthermore, UAVs can be flown at lower altitudes with greater safety than manned aircraft, thereby increasing spatial resolution [8,9,10,11,12,13,14,15,16,17]. Low-cost, light-weight sensors are therefore critical for developing UAVs into a cost-effective platform for image acquisition.
Digital photography uses either silicon-based charge-coupled detectors or complementary metal-oxide-semiconductors, both of which have a spectral sensitivity from about 350 nm to about 1,100 nm wavelength [18,19]. Most digital cameras use a Bayer pattern array of filters to obtain red, green and blue bands for a digital image [18,19]; however, the chemical basis for making these filters is proprietary and there is variation in filter spectral transmittances among various digital cameras. Typically, Bayer-pattern filters transmit at least some near-infrared (NIR) light through either the blue, green or red channels, so almost all commercially-available digital cameras have an internal hot-mirror filter blocking NIR light. This filter can be removed allowing detection of reflected NIR radiation from vegetation [18,19].
Certain digital cameras (for example, Kodak DCS cameras [20,21]) have Bayer-pattern filters in which the red, green and blue filters all transmit significant amounts of NIR light [21,22]. Ziglado et al. [22] found that when the internal hot-mirror filter is removed and a blue-blocking filter is placed in front of the lens, the blue channel records the NIR light reflected from vegetation. With calibration, the contributions of NIR light to the digital numbers of the green and red channels can be subtracted based on the value from the blue channel [22]. Therefore, with extensive post-processing, the raw digital camera image can be converted into a red, green and NIR false-color image. Currently, the few commercially available color-infrared digital cameras are based on the method of Ziglado et al. [22].
For some commercial digital cameras, only the red channel is sensitive to NIR light. Furthermore, post-processing each raw image to obtain a false-color image is a significant extra workload that becomes a burden when large numbers of images are processed. We present a new method that obtains color-infrared digital photographs from cameras in which only the red channel is sensitive to NIR light, and that requires no post-processing. The tradeoff is that a NIR, green and blue digital image is obtained instead of a NIR, red and green image. Whereas indices such as the normalized difference vegetation index (NDVI) cannot be determined without a red band, alternative vegetation indices such as the Green NDVI (GNDVI) [23] have similar information content and value. We demonstrate that a NIR-green-blue digital camera, mounted in a UAV, provides a method for measuring crop leaf area index (LAI) at very high spatial resolution.

2. Methodology

2.1. Camera System

The Fuji Photofilm Co., Ltd (Tokyo, Japan) FinePix S3 Pro UVIR camera (12 megapixels) [19] and a few newer models are sold without an internal hot-mirror filter blocking NIR light. This camera with a standard lens weighs about 1.24 kg. To block red light, we first tested two commercially-available cyan dichroic filters, and found that they did not have high transmittance in the NIR. A custom interference filter (Omega Optical, Inc., Brattleboro, VT, USA) was designed to block red light from 610 to 725 nm (to avoid the edge of the chlorophyll absorption feature from 700 to 720 nm, called the red edge) and transmit blue, green and NIR light, and was mounted in front of the lens. The digital camera was spectrally calibrated with and without the custom filter by taking photographs of monochromatic light from a SPEX 1,680 monochromator (Jobin-Yvon, Edison, NJ, USA) projected onto a Spectralon white panel (Labsphere, Inc., North Sutton, NH, USA). The average digital numbers for each channel were determined for the center of the projected light beam using image processing software (ENVI version 4.3, Research Systems, Inc., Boulder, CO, USA). Without the custom filter, the spectral response of the digital camera showed that the red channel had a much larger response in the NIR than the blue and green channels (Figure 1a); with the custom filter, the camera produced NIR-green-blue images (Figure 1b). The lower response in the NIR (Figure 1b) indicated that the white-light balance of the images should be adjusted manually.
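The calibration procedure described above (average digital numbers per channel at each monochromator wavelength, normalized into a relative spectral response) can be sketched as follows; the wavelength grid and digital numbers are synthetic illustrations, not the measured responses.

```python
import numpy as np

# Illustrative monochromator scan: mean digital number per channel at each
# wavelength (nm), averaged over the center of the projected beam.
# These Gaussian shapes are synthetic stand-ins for measured data.
wavelengths = np.arange(400, 1101, 50)
dn = {
    "blue":  np.exp(-0.5 * ((wavelengths - 450) / 40.0) ** 2),
    "green": np.exp(-0.5 * ((wavelengths - 550) / 40.0) ** 2),
    # Red channel with a broad NIR shoulder, mimicking Figure 1a.
    "red":   np.exp(-0.5 * ((wavelengths - 620) / 40.0) ** 2)
             + 0.8 * np.exp(-0.5 * ((wavelengths - 800) / 80.0) ** 2),
}

# Relative spectral response: scale every channel by the camera-wide maximum,
# so the most responsive channel peaks at 1.0 and channels stay comparable.
peak = max(band.max() for band in dn.values())
rsr = {name: band / peak for name, band in dn.items()}
```

Normalizing by a single camera-wide maximum (rather than per channel) preserves the relative sensitivity between channels, which is what Figure 1 compares.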
Figure 1. (a) Relative spectral response of the Fuji FinePix S3 Pro UVIR camera for the blue, green, and red/NIR channels. (b) Relative spectral response after a custom red-light-blocking filter (610–720 nm) was placed in front of the lens.

2.2. Field Site

In the autumn of 2006, field experiments were established on two adjoining fields on the south-eastern shore of the Chester River in Queen Anne’s County, Maryland, USA (39.03°N latitude and 76.18°W longitude; Figure 2). The regional average annual temperature is 15.4 °C, with an average annual precipitation of 88.4 cm. Soil types included a mixture of silt loams, including Othello (poorly drained Fine-silty, mixed, active, mesic Typic Endoaquults), Pineyneck (moderately well-drained Coarse-loamy, mixed, active, mesic Aquic Hapludults) and Unicorn (well-drained Coarse-loamy, mixed, semiactive, mesic Typic Hapludults).
Figure 2. Location of field sites in Queen Anne’s County, MD, USA.
The two fields were approximately 10 ha (Figure 3). The eastern field was planted to winter wheat (Triticum aestivum L.) on 26 September 2006 following corn (Zea mays L.). The western field was planted to winter wheat on 31 October 2006 following soybean (Glycine max L.). The spacing between rows of winter wheat was 19 cm. Within each field, six 27-m wide strips running the length of the field were established, and were managed with one of four treatments (Figure 3). Broadcast pelletized fertilizer (34 kg N ha−1 plus 68 kg K ha−1) was applied to alternating strips in both fields on 7 November 2006, with the remaining strips receiving no autumn fertilizer. On 6 March 2007, four edge strips from each field were fertilized with 34 kg N ha−1 leaving the middle two strips of each field without fertilizer. The result was four nitrogen fertilizer treatments (N-autumn N-spring, N-autumn o-spring, o-autumn N-spring, and o-autumn o-spring).
Figure 3. Layout of winter wheat planted with four nitrogen fertilizer treatments (NN, oN, No, oo). The first N/o represents either fertilization (N) or no fertilization (o) in the autumn and the second N/o represents either fertilization or no fertilization in the spring. The black dots indicate planned sample locations for each strip. The blue symbols indicate plots sampled ad hoc; dots and crosses show the locations of the low biomass and planter skip plots, respectively. The gray-scale image is a red channel (650 nm, 2.5 m pixel) from an AISA sensor image acquired on 10 April 2007 by SpecTIR, Inc. (Easton, MD, USA).
There is a genetically set maximum number of leaves on a single stem of wheat; fertilization during early growth generally increases the number of stems (tillers) on a single plant. To increase the protein concentration in the grain, and thereby obtain a higher selling price, the fields were uniformly sprayed with 68 kg N ha−1 on 11 April 2007. Consequently, a two-way analysis of variance showed no significant differences in chlorophyll content among the treatments and no significant interactions (data not shown).
At the beginning of the experiment, three sampling locations were established and geolocated with sub-meter accuracy at the southern end, center, and northern end of each strip (Figure 3). Field data were collected on 2 May 2007, when the last leaf was fully expanded and before the flowers emerged. Sampling occurred ten days after a 5-cm rainfall event, with no interim precipitation. Soil moisture contents at the time of sampling were not measured but appeared intermediate (neither saturated nor overly dry). The LAI of each plot was measured using an LAI-2000 Plant Canopy Analyzer (LI-COR, Inc., Lincoln, NE, USA), and three adjacent 0.5-m rows of wheat were cut at ground level, dried and weighed to determine biomass. While collecting data, five other locations with large differences in plant density were found, so these plots were sampled on an ad hoc basis (Figure 3).

2.3. Unmanned Aircraft Flights

The camera system was mounted in the Vector-P UAV (IntelliTech Microsystems, Inc., Bowie, MD, USA) and was controlled by an autopilot computer program to take photographs at user-selected waypoints to ensure complete coverage of the field (Figure 4). Test flights were conducted on 30 April and 1 May 2007. On 4 May 2007, the UAV was flown and collected digital imagery at two altitudes, 105 m (350 feet) and 210 m (700 feet) above ground level. Tarpaulins of various colors (red, green, black, gray, and beige) were used to test the spectral and radiometric calibration of the modified camera (Figure 5). Spectral reflectances of the tarpaulins were measured using a FieldSpec Pro FR spectroradiometer (Analytical Spectral Devices, Inc., Boulder, CO, USA) [13].
Figure 4. (a) The Vector-P unmanned aerial vehicle from IntelliTech Microsystems, Inc (http://www.vectorp.com). (b) Autopilot flight path showing waypoints for image collection.
The 12-megapixel digital images were saved in Tagged Image File Format (TIFF) and imported into the Environment for Visualizing Images (ENVI v4.3, RSI, Inc., Boulder, CO, USA) for image processing. Because the exposure settings of the camera were fixed, the raw digital numbers were affected by changes in solar irradiance; the same ground location may therefore have different digital numbers in sequential images because of different angles of incidence and changes in atmospheric transmittance [24]. The images over the field were analyzed one by one and were not combined into a single image; research on image mosaicking and orthorectification is ongoing in other groups [25,26].
Vegetation indices are a method for reducing the variation among images and enhancing the contrast between vegetation and the ground. From Gitelson et al. [23], the Green Normalized Difference Vegetation Index (GNDVI) is defined:
GNDVI = (NIR − green)/(NIR + green)  (1)
where NIR and green are the digital numbers from the NIR and green bands, respectively. For calibration, the GNDVI of the colored tarpaulins calculated from digital numbers was compared to the GNDVI of the tarpaulins measured in the laboratory with the spectroradiometer. Because differences in irradiance are factored out in Equation 1, tarpaulin GNDVI in the images was linearly related to laboratory-measured GNDVI with an R2 of 0.99. The regression equation was therefore applied to the images to support comparisons across dates and locations.
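The index and the tarpaulin-based calibration take only a few lines of numpy; the slope and intercept below are placeholders for illustration, not the fitted coefficients from the tarpaulin regression.

```python
import numpy as np

def gndvi(nir, green):
    """Green NDVI (Equation 1) computed from raw digital numbers.

    Differences in irradiance largely cancel in the ratio, which is why a
    single linear regression against reference targets suffices for
    calibration.
    """
    nir = nir.astype(np.float64)
    green = green.astype(np.float64)
    return (nir - green) / (nir + green)

# Hypothetical coefficients from regressing image GNDVI of the tarpaulins
# against spectroradiometer GNDVI; placeholder values, not the fitted ones.
SLOPE, INTERCEPT = 1.05, -0.02

def calibrated_gndvi(nir, green):
    """Apply the linear tarpaulin calibration to image GNDVI."""
    return SLOPE * gndvi(nir, green) + INTERCEPT
```

In practice the two bands would be read from the TIFF image as 2-D arrays, and the same regression would be reused across images from one flight.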
There were an insufficient number of ground control points in the wheat field for accurate image-to-image registration or orthorectification, particularly for the imagery acquired at 105 m above ground level. Furthermore, most of the sample plots were not in the center of the images acquired at 105 m altitude (i.e., at the nadir view). Therefore, we chose the imagery acquired at 210 m above ground level (with a 4.9-cm pixel resolution) for determining plot GNDVI, using the bare ground left after biomass sampling as a point of reference for locating each plot. For a specific plot, we chose the image in which the plot was closest to the nadir direction. Mean GNDVI was determined for a 2-m-by-2-m area surrounding each harvested plot, masking out the harvested plot itself.
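The per-plot extraction (a 2-m-by-2-m windowed mean that excludes the harvested area) might be implemented as follows; the function name, array layout, and pixel coordinates are illustrative assumptions, not the authors' code.

```python
import numpy as np

def plot_mean_gndvi(gndvi_img, row, col, gsd_cm=4.9, window_m=2.0, mask=None):
    """Mean GNDVI in a window_m x window_m box centered on pixel (row, col).

    gsd_cm is the ground sample distance (4.9 cm for the 210-m imagery).
    mask, if given, is a boolean array marking pixels to exclude
    (e.g., the bare ground of the harvested plot itself).
    """
    # Half-width of the window in pixels (2 m at 4.9 cm/pixel ~ 20 pixels).
    half = int(round(window_m * 100.0 / gsd_cm / 2))
    r0, r1 = max(row - half, 0), min(row + half + 1, gndvi_img.shape[0])
    c0, c1 = max(col - half, 0), min(col + half + 1, gndvi_img.shape[1])
    box = gndvi_img[r0:r1, c0:c1]
    if mask is not None:
        box = box[~mask[r0:r1, c0:c1]]  # keep only unmasked pixels
    return float(np.mean(box))
```

Clipping the window at the image edges keeps the function safe for plots that fall near the border of a frame.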

3. Results and Discussion

3.1. UAV-Camera System

The red/NIR channel of the Fuji UVIR camera had spectral sensitivity in the NIR (Figure 1). The blue and green channels had comparatively small responses in the NIR from 725 to 800 nm wavelength (Figure 1). At wavelengths greater than 800 nm, the responses of the three channels were about equal and at the noise level of the digital camera (data not shown). Because the blue and green channels had little spectral sensitivity to NIR light (Figure 1), the method of Ziglado et al. [20] could not be applied to the digital camera used in this study. In the absence of a spectroradiometer, a simple test with black-dyed paper will show whether a camera's blue and green bands are sensitive to NIR light, because dyed paper is usually very reflective in the NIR. If photographs show the paper as red, then a red-blocking filter can be used to obtain NIR-green-blue digital images. If photographs of black-dyed paper show gray or white, then all three bands are sensitive to NIR light, and the method of Ziglado et al. [20] could be used for NIR-red-green digital imagery.
From the focal length, the size of the detector elements, and the number of detector elements, the spatial resolution and the area covered by each photograph can be determined as a function of altitude above ground level (Table 1). For flights at 230 m above ground level, a wide-angle lens (24 mm) produces a pixel size of 5.1 cm and covers an area of 3.2 ha.
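The geometry behind Table 1 follows from the pinhole-camera relation: ground pixel size equals altitude times detector pitch divided by focal length. The sensor dimensions below (about 23.0 mm by 15.5 mm and 4256 by 2848 pixels for the Fuji FinePix S3 Pro) are an assumption, but they reproduce the tabulated values closely.

```python
# Assumed sensor geometry for the Fuji FinePix S3 Pro (not stated in the
# text): ~23.0 x 15.5 mm sensor with 4256 x 2848 detector elements.
SENSOR_W_M = 0.0230
PIX_X, PIX_Y = 4256, 2848

def gsd_and_area(altitude_m, focal_length_m):
    """Return (ground pixel size in cm, footprint in ha) at a given altitude.

    Assumes square detector elements with pitch derived from sensor width.
    """
    pitch = SENSOR_W_M / PIX_X                  # detector element size (m)
    gsd = altitude_m * pitch / focal_length_m   # ground sample distance (m)
    area_ha = (gsd * PIX_X) * (gsd * PIX_Y) / 10_000.0
    return gsd * 100.0, area_ha
```

For example, 230 m altitude with the 24-mm lens gives roughly a 5.2-cm pixel and a 3.3-ha footprint, in line with the 5.1 cm and 3.2 ha shown in Table 1.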
Table 1. Pixel size and coverage area for the 12-megapixel Fuji FinePix S3 Pro UVIR digital camera at various altitudes (above ground level) for two focal lengths of the camera lens (equivalent to lenses of standard single lens reflex film cameras).
Altitude (m)  Altitude (feet)  Pixel size (cm), 24 mm  Area (ha), 24 mm  Pixel size (cm), 55 mm  Area (ha), 55 mm
76            250              1.7                     0.36              0.75                    0.068
120           400              2.7                     0.91              1.2                     0.17
150           500              3.4                     1.4               1.5                     0.27
230           750              5.1                     3.2               2.2                     0.61
310           1,000            6.9                     5.7               3.0                     1.1
460           1,500            10.3                    12.8              4.5                     2.4
610           2,000            13.7                    22.8              6.0                     4.3
760           2,500            17.1                    35.6              7.5                     6.8
1,070         3,500            24.0                    69.9              10.5                    13.3
1,520         5,000            34.3                    143               15.0                    27.2
Flying at 30 m s−1 and assuming about 3 seconds are required to save each photograph, successive pictures taken at 230 m above ground level will have about 50% overlap, which allows registration to obtain a complete image of a field and possibly stereo pairs of photographs. However, for flights at 120 m above ground level, the pixel size is about 2.7 cm and the area covered in one photograph is 0.91 ha (Table 1), so there will be very little overlap, if any, between sequential pictures. Because wind and turbulence strongly affect the roll, pitch and yaw of UAVs and other light aircraft, photographs from a digital camera system are easier to register than data from scanning-type sensors.
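The overlap estimate works out as follows; the 180-m along-track footprint is an assumed illustrative value for flights near 230 m above ground level, chosen here to show the 50% case.

```python
def forward_overlap(speed_ms, interval_s, footprint_along_track_m):
    """Fraction of along-track overlap between successive photographs.

    speed_ms: ground speed (m/s); interval_s: time between exposures (s);
    footprint_along_track_m: along-track ground coverage of one frame (m).
    """
    base = speed_ms * interval_s  # ground distance between exposure stations
    return max(0.0, 1.0 - base / footprint_along_track_m)
```

At 30 m s−1 with a 3-s interval the aircraft travels 90 m between exposures, so a 180-m footprint yields 50% overlap, whereas the smaller footprint at 120 m altitude leaves essentially none.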
The NIR-green-blue digital images acquired from the UAV (Figure 5) are very similar to the NIR, red and green images acquired using color-infrared film. At this high spatial resolution, features in Figure 5 can be identified, including harvested sample plots, trampled wheat indicating foot paths to the sample plots, and a 5-gallon orange bucket that was used as a ground control point. GNDVI showed strong differences in the amount of vegetation over the variably-fertilized winter wheat fields (Figure 6). The values of GNDVI at the north and south edges of the photograph are higher than values in the center for two possible reasons: (a) problems with the interference filter at wide view angles; and (b) wide view angles creating an oblique view through the wheat canopy. The oblique view increases the amount of vegetation seen compared to the nadir view, thereby increasing the relative amount of NIR light reflected and decreasing the relative amount of green light reflected. These effects can be reduced by using a lens with a longer focal length, which has a smaller range of view angles, or simply by not using the pixels around the periphery of the image.
Figure 5. A NIR-green-blue digital photograph acquired at 105 m above ground level with a pixel size of 2.5 cm. The top of the images points north and the UAV was flying south. Points of interests are: (A) calibration tarpaulins (2.92 m by 2.92 m); (B) a seed-planter skip, where three ad hoc plots were placed; (C) a harvested plot (strip 12, south); (D) two footpaths to the plot; and (E) an orange, 5-gallon bucket used for a ground control point.
Figure 6. Green normalized difference vegetation index (GNDVI) for the photograph in Figure 5. Ranges of GNDVI were color coded: <0.5 (black), 0.50–0.57 (purple), 0.57–0.64 (blue), 0.64–0.71 (cyan), 0.71–0.78 (dark green), 0.78–0.85 (light green), and 0.85–1.00 (orange).

3.2. Field Data Analysis

The experimental treatments resulted in significantly different aboveground wheat biomass, with the zero-N treatments averaging 411 g m−2 and 66 g m−2 in the early- and late-planted fields, respectively (Table 2). Fertilization increased biomass, with the springtime application showing greater effects than the autumn application; the autumn-plus-spring N treatment exhibited the highest average observed biomass (832 g m−2 and 227 g m−2 in the early- and late-planted fields, respectively), approximately double the unfertilized biomass. A similar pattern was seen in grain yield (Table 2). A moderate correlation was observed between biomass and LAI (R2 = 0.79), and the fertility treatments increased LAI in a similar manner.
Table 2. Dry biomass, leaf area index (LAI) and yield data for winter wheat experiment. Biomass and LAI data were measured May 2, 2007, whereas the yield data were measured on June 27, 2007 for the whole strip. Nitrogen fertilization treatment symbols are the same as Figure 3; strip numbering starts from the east. For the 12 treatment strips, the data are mean ± standard deviation for 3 plots. Two low biomass plots (LB) were selected to increase the range of biomass and LAI and were not included in the strip means. A planter skip (labeled B in Figure 5) had three plant densities with differences in biomass and LAI.
Strip/Plot  Planting  N-Trmt  Biomass (g m−2)  LAI (m2 m−2)  Yield (g m−2)
1           Early     N-N     493.4 ± 89.4     2.70 ± 0.13   417.8
2           Early     o-N     598.4 ± 121.6    2.62 ± 0.14   432.9
3           Early     N-o     502.0 ± 94.7     2.30 ± 0.27   391.7
4           Early     o-o     411.0 ± 71.4     2.02 ± 0.10   291.5
5           Early     N-N     832.8 ± 128.3    3.00 ± 0.35   397.5
6           Early     o-N     579.7 ± 89.6     2.43 ± 0.25   399.4
7           Late      N-N     203.2 ± 57.6     1.60 ± 0.12   276.9
8           Late      o-N     64.1 ± 17.5      0.68 ± 0.13   218.9
9           Late      N-o     95.3 ± 32.7      0.79 ± 0.17   197.2
10          Late      o-o     66.4 ± 15.7      0.77 ± 0.07   213.5
11          Late      N-N     226.7 ± 43.5     1.61 ± 0.10   268.8
12          Late      o-N     151.5 ± 133.4    1.38 ± 0.12   289.1
LB1         Late      N-N     25.9             0.37
LB2         Late      o-N     30.2             0.35
Skip 1      Late      N-N     401.0            2.15
Skip 2      Late      N-N     295.6            1.93
Skip 3      Late      N-N     83.3             0.82
GNDVI was linearly related to LAI for the 35 of 41 plots that had an LAI of less than 2.7 (Figure 7). Above an LAI of 2.7 (6 plots), GNDVI was not responsive to changes in LAI, so these plots were excluded from the regression. Most vegetation indices saturate at some level of LAI [23], so the saturating response of GNDVI to LAI was expected. GNDVI was originally developed for determining plant chlorophyll status [27], which is strongly related to nitrogen status in wheat [28] along with other stress factors. Also, Shanahan et al. [29] showed that among various remote-sensing indices, GNDVI had the highest correlation with corn grain yield. UAVs equipped with the camera system presented in this study may be useful for within-season crop management in site-specific agriculture or precision farming. However, more research is needed to separate differences in GNDVI caused by variation in leaf chlorophyll concentration from differences caused by variation in leaf area index [23].
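Inverting the regression reported in Figure 7 gives a simple LAI estimator; the saturation cutoff at LAI 2.7 follows the exclusion described above, and the function is a sketch rather than a validated model.

```python
def lai_from_gndvi(gndvi, intercept=0.50, slope=0.16, lai_max=2.7):
    """Estimate LAI by inverting GNDVI = intercept + slope * LAI
    (the regression fit reported in Figure 7, R2 = 0.85).

    Returns None above the saturation point, where GNDVI no longer
    responds to LAI and the inversion is not reliable.
    """
    lai = (gndvi - intercept) / slope
    return lai if lai <= lai_max else None
```

A GNDVI of 0.82 corresponds to an LAI of 2.0, while values implying an LAI above 2.7 are flagged as saturated rather than reported.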
Whereas the UAV-camera system presented here was developed for crop management, the NIR-green-blue images can be used for vegetation monitoring in a variety of applications. For example, NIR-green-blue digital images can be acquired using cameras mounted on poles or stands for ground-based determination of plant cover [30]. Digital color photography from low-flying manned aircraft or UAVs is being used for remote sensing in rangeland ecosystems [25,31,32,33,34], because small pixel sizes help determine the ground cover of plants and bare soil. However, variation in soil types and the presence of shadows create problems in separating cover classes. Since living vegetation strongly reflects NIR light, a NIR-green-blue camera system would be a good alternative for monitoring natural resources.
Figure 7. Relationship of GNDVI with leaf area index (LAI) for winter wheat. At an altitude of 210 m above ground level, each plot occurred in a number of images, so the plot GNDVI was determined using the image that had the plot location closest to nadir. For LAI from 0 to 2.7 (35 of 41 plots), the regression equation is GNDVI = 0.50 + 0.16 LAI, with an R2 of 0.85.
UAVs and other unmanned aircraft may be particularly important for crop monitoring during the early part of the growing season, when cloud cover may prevent satellite data acquisition. Currently, successful determination of crop nutrient requirements is made later in the growing season, when the full canopy increases the number of pure crop pixels [2,3,4,5,6,7]. In mixed pixels of crops and soil, accurate determination of nutrient requirements is difficult, because the spectral reflectance of vegetation is averaged with that of bare soil [5]. With smaller pixel sizes, there will be more pure crop pixels, more pure bare-soil pixels, and fewer mixed pixels. Crop nutrient requirements could then be estimated earlier in the growing season using only the pure crop pixels, for use in precision farming or site-specific agriculture [5].
The niche that a camera-small-UAV system may fill is the need for very high spatial resolution at low cost, because small UAVs can fly at low altitudes above ground level. Small manned aircraft can fly close to the ground only when equipped with rocket-propelled ejection seats and other safety devices for the pilot [31,32]. For airborne remote sensing from large manned aircraft, initial, operating and maintenance costs are generally higher than for UAVs. A large number of sensors are available for remote sensing from manned aircraft; however, because large manned aircraft cannot fly near the ground safely, very high spatial resolution must be obtained through the sensor optics, which greatly increases sensor cost. Currently, the US Federal Aviation Administration has strict regulations on civil UAV operations; changes are being considered that may allow small UAVs to fly for commercial purposes [35].

4. Conclusions

The UAV-mounted digital color-infrared camera system developed in this project yielded a good correlation between leaf area index and the green normalized difference vegetation index (GNDVI) in imagery collected 210 m above two variably-fertilized fields of winter wheat. This indicates that the lightweight camera system can provide important information for site-specific agricultural decision making.
The first advantage of the NIR-green-blue camera is the low cost, low weight and compact size of the system, making it an ideal sensor for small UAVs. The resulting images can be used to obtain quantitative information about vegetation growth and health, similar to NIR-red-green color-infrared cameras. Custom cameras could be constructed using off-the-shelf components; however, such a system must integrate a power supply, data storage, global positioning system receivers, and computer processing, which adds to the size, weight and cost. Small UAV-camera systems, such as the one presented in this study, can provide high spatial resolution imagery at lower cost.
The second advantage of the NIR-green-blue digital camera for small UAVs is that the three bands are already registered by the camera's software. Sensor systems have been designed with separate cameras for blue, green, red, and NIR imagery, but the separate images from the multiple cameras have to be registered. Registration introduces errors, decreasing the spatial and radiometric accuracy of the combined imagery.
The third advantage of the NIR-green-blue digital camera and small-UAV system is that, in contrast to color-infrared photographs obtained using the method of Ziglado et al. [20], no post-processing is required, so the images can be visually inspected upon landing. The lower maximum response in the NIR of this camera system indicates that some image enhancement and processing may be necessary to aid quantitative assessments using the imagery.

Acknowledgements

This project was funded by a Cooperative Research and Development Agreement (CRADA) with IntelliTech Microsystems, Inc. (Bowie, MD, USA). We thank Charlie Ng, Michael Tranchitella, Don Tatum, Lanny Herron, and David Yoel from IntelliTech Microsystems, Inc. and FalconScan, LLC. Also, we thank Ben Tilghman and Temple Rhodes for use of their field, and we thank Guy Serbin, Paul Biddle, and Antonio Periera, for assistance in the field. Finally, we thank the reviewers for suggestions which improved the manuscript.

References and Notes

  1. Colewell, R.N. Determining the prevalence of certain cereal crop diseases by means of aerial photography. Hilgardia 1956, 26, 223–286. [Google Scholar] [CrossRef]
  2. Blackmer, T.M.; Schepers, J.S.; Varvel, G.E.; Meyer, G.E. Analysis of aerial photography for nitrogen stress within corn fields. Agron. J. 1996, 88, 729–733. [Google Scholar] [CrossRef]
  3. GopalaPillai, S.; Tian, L. In-field variability detection and spatial yield modeling for corn using digital aerial imaging. Trans. ASAE 1999, 42, 1911–1920. [Google Scholar] [CrossRef]
  4. Yang, C.; Everett, J.H.; Bradford, J.M.; Escobar, D.E. Mapping grain sorghum growth and yield variations using airborne multispectral digital imagery. Trans. ASAE 2000, 43, 1927–1938. [Google Scholar] [CrossRef]
  5. Scharf, P.C.; Lory, J.A. Calibrating corn color from aerial photographs to predict sidedress nitrogen need. Agron. J. 2002, 94, 397–404. [Google Scholar] [CrossRef]
  6. Flowers, M.; Weisz, R.; Heiniger, R. Quantitative approaches for using color infrared photography for assessing in-season nitrogen status in winter wheat. Agron. J. 2003, 95, 1189–1200. [Google Scholar] [CrossRef]
  7. Sripada, R.P.; Farrer, D.C.; Weisz, R.; Heiniger, R.W.; White, J.G. Aerial color infrared photography to optimize in-season nitrogen fertilizer recommendations in winter wheat. Agron. J. 2007, 99, 1424–1435. [Google Scholar] [CrossRef]
  8. Hunt, E.R., Jr.; Daughtry, C.S.T.; Walthall, C.L.; McMurtrey, J.E., III; Dulaney, W.P. Agricultural remote sensing using radio-controlled model aircraft. In Digital Imaging and Spectral Techniques: Applications to Precision Agriculture and Crop Physiology, ASA Special Publication 66; VanToai, T., Major, D., McDonald, M., Schepers, J., Tarpley, L., Eds.; ASA, CSSA, and SSSA: Madison, WI, USA, 2003; pp. 191–199. [Google Scholar]
  9. Fouché, P.S. Detecting nitrogen deficiency on irrigated cash crops using remote sensing methods. S. Afric. J. Plant Soil 1999, 16, 59–63. [Google Scholar] [CrossRef]
  10. Fouché, P.S.; Booysen, N.W. Assessment of crop stress conditions using low altitude aerial color-infrared photography and computer image processing. Geocarto Int. 1994, 2, 25–31. [Google Scholar] [CrossRef]
  11. Quilter, M.C.; Anderson, V.J. A proposed method for determining shrub utilization using (LA/LS) imagery. J. Range Manage. 2001, 54, 378–381. [Google Scholar] [CrossRef]
  12. Herwitz, S.R.; Johnson, L.F.; Dunagan, S.E.; Higgins, R.G.; Sullivan, D.V.; Zheng, J.; Lobitz, B.M.; Leung, J.G.; Gallmeyer, B.A.; Aoyagi, M.; Slye, R.E.; Brass, J.A. Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support. Comp. Elect. Agric. 2004, 44, 49–61. [Google Scholar] [CrossRef]
  13. Hunt, E.R., Jr.; Cavigelli, M.; Daughtry, C.S.T.; McMurtrey, J.E., III; Walthall, C.L. Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Prec. Agric. 2005, 6, 359–378. [Google Scholar] [CrossRef]
  14. Jensen, T.; Apan, A.; Young, F.; Zeller, L. Detecting the attributes of a wheat crop using digital imagery acquired from a low-altitude platform. Comp. Elect. Agric. 2007, 59, 66–77. [Google Scholar] [CrossRef] [Green Version]
  15. Swain, K.C.; Jayasuriya, H.P.W.; Salokhe, V.M. Suitability of low-altitude remote sensing for estimating nitrogen treatment variations in rice cropping for precision agriculture adoption. J. Appl. Remote Sens. 2007, 1, 013547. [Google Scholar] [CrossRef]
  16. Lelong, C.C.D.; Burger, P.; Jubelin, G.; Roux, B.; Labbé, S.; Baret, F. Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots. Sensors 2008, 8, 3557–3585. [Google Scholar] [CrossRef]
  17. Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
  18. Milton, E.J. Low-cost ground-based digital infrared photography. Int. J. Remote Sens. 2002, 23, 1001–1007. [Google Scholar] [CrossRef]
  19. Lebourgeois, V.; Bégué, A.; Labbé, S.; Mallavan, B.; Prévot, L.; Roux, B. Can commercial digital cameras be used as multispectral sensors? A crop monitoring test. Sensors 2008, 8, 7300–7322. [Google Scholar] [CrossRef] [Green Version]
  20. Mention of a trademark, proprietary product, or company by USDA personnel is intended for description only and does not imply USDA approval to the exclusion of other products that may also be suitable.
  21. Bobbe, T.; McKean, J.; Zigadlo, J.P. An evaluation of natural color and color infrared digital cameras as a remote sensing tool for natural resource management. In Airborne Reconnaissance XIX; Fishell, W.G., Andraitis, A.A., Henkel, P.A., Eds.; SPIE: Bellingham, WA, USA, 1995; Volume 2555, pp. 151–157. [Google Scholar]
  22. Zigadlo, J.P.; Holden, C.L.; Schrader, M.E.; Vogel, R.M. Electronic color infrared camera. US Patent No. 6292212 B1, 2001. [Google Scholar]
  23. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  24. Honkavaara, E.; Arbiol, R.; Markelin, L.; Martinez, L.; Cramer, M.; Bovet, S.; Chandelier, L.; Ilves, R.; Klonus, S.; Marshall, P.; Schläpfer, D.; Tabor, M.; Thom, C.; Veje, N. Digital airborne photogrammetry—a new tool for quantitative remote sensing?—A state-of-the-art review on radiometric aspects of digital photogrammetric images. Remote Sens. 2009, 1, 577–605. [Google Scholar] [CrossRef]
  25. Laliberte, A.S.; Herrick, J.E.; Rango, A.; Winters, C. Acquisition, orthorectification, and object-based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring. Photogramm. Eng. Remote Sens. 2009, in press. [Google Scholar]
  26. Zhou, G. Near real-time orthorectification and mosaic of small UAV video flow for time-critical event response. IEEE Trans. Geosci. Remote Sens. 2009, 47, 739–747. [Google Scholar] [CrossRef]
  27. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; Brown de Colstoun, E.; McMurtrey, J.E., III. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
  28. Hinzman, L.D.; Bauer, M.E.; Daughtry, C.S.T. Effects of nitrogen fertilization on growth and reflectance characteristics of winter wheat. Remote Sens. Environ. 1986, 19, 47–61. [Google Scholar] [CrossRef]
  29. Shanahan, J.F.; Schepers, J.S.; Francis, D.D.; Varvel, G.E.; Wilhelm, W.W.; Tringe, J.M.; Schlemmer, M.R.; Major, D.J. Use of remote-sensing imagery to estimate corn grain yield. Agron. J. 2001, 93, 583–589. [Google Scholar] [CrossRef]
  30. Booth, D.T.; Cox, S.E.; Louhaichi, M.; Johnson, D.E. Lightweight camera stand for close-to-earth remote sensing. J. Range Manage. 2004, 57, 675–678. [Google Scholar] [CrossRef]
  31. Booth, D.T.; Cox, S.E. Very large scale aerial photography for rangeland monitoring. Geocarto Int. 2006, 21, 27–34. [Google Scholar] [CrossRef]
  32. Booth, D.T.; Cox, S.E. Image-based monitoring to measure ecological change in rangeland. Front. Ecol. Environ. 2008, 6, 185–190. [Google Scholar] [CrossRef]
  33. Rango, A.; Laliberte, A.; Steele, C.; Herrick, J.E.; Bestilmeyer, B.; Schmugge, T.; Roanhorse, A.; Jenkins, V. Using unmanned aerial vehicles for rangelands: current applications and future potentials. Environ. Pract. 2006, 8, 159–168. [Google Scholar] [CrossRef]
  34. Rango, A.; Laliberte, A.; Herrick, J.E.; Winters, C.; Havstad, K.; Steele, C.; Browning, D. Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and management. J. Appl. Remote Sens. 2009, 3, 033542. [Google Scholar]
  35. Tarbert, B. FAA Unmanned Aircraft Program Office: Gaining Access to the National Airspace System. In Proceedings of International Symposium on Unmanned Aerial Vehicles 2008, Orlando, FL, USA, 2008.

Share and Cite

MDPI and ACS Style

Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring. Remote Sens. 2010, 2, 290-305. https://doi.org/10.3390/rs2010290

