### **1. Introduction**

Numerous studies are being conducted on cotton crop growth monitoring for precision agriculture. Cotton is an important crop in the state of Texas, which produces more than 50% of the cotton grown in the entire country, with a spatial coverage of around six million acres [1]. Recent advances in genetic engineering and genomics have significantly accelerated the breeding process of cotton [2]. There is a growing need for phenotyping to match this high-paced breeding process. Consequently, plant breeders and agricultural scientists have recognized the need for a high-throughput phenotyping (HTP) system that can efficiently measure phenotypic traits such as crop height, volume, canopy cover, and vegetation indices (VIs) with reasonable accuracy [3]. An accurate phenotyping process is critical for the reliable quantification of phenotypic traits to select the genotypes of interest. HTP has been extensively discussed; however, until recently, its implementation has been rather fragmentary [4]. The change in this situation is mainly attributed to recent developments in unmanned aircraft systems (UAS). Lightweight platforms combined with consumer-grade imaging sensors provide an affordable system for the remote sensing activities needed in precision agriculture, especially with low-altitude flights that yield data of high temporal and spatial resolution [5–8].

In this paper, canopy cover (CC), commonly expressed as the percentage of the total ground area covered by the vertical projection of the plant canopy, is studied. Plant canopy cover is strongly related to crop growth, development, water use, and photosynthesis, which makes it an important trait to observe throughout the growing season [9]. In addition, CC is an important ancillary variable in the estimation of the leaf area index (LAI) [10]. Various remote sensing techniques have been employed in the literature to compute CC, including satellite imagery with varying degrees of resolution [11–15], airborne imagery [16], and light detection and ranging (LiDAR) data [17,18]. Satellite imagery has the advantage of providing large spatial coverage. However, its coarser spatial resolution limits its application in computing CC over small breeding fields where genotype screening is the objective. Moreover, the temporal resolution of satellite imagery is also insufficient for phenotypic applications. Furthermore, satellite imagery is highly affected by cloud cover and other atmospheric conditions [19]. On the other hand, aerial imagery usually has a higher spatial resolution, but fewer spectral bands than satellite imagery [20]. CC estimation using LiDAR data can be slightly biased in visual interpretation; however, in general, it is particularly useful in the estimation of vertical canopy cover and angular canopy closure, which are otherwise difficult to compute [21]. Terrestrial and airborne LiDAR data have been successfully used to compute CC in the literature [22,23]. However, data collection frequency has remained a significant issue, as LiDAR sensors and airborne imaging sensors are relatively expensive compared to UAS. Recently, UAS have emerged as an alternative to satellite, airborne imaging, or LiDAR sensors for estimating CC, an approach that is more affordable and can provide higher temporal and spatial resolution [24–28].
UAS-based CC measurements have been used efficiently to estimate LAI [29,30] and have served as one of the comparison parameters to quantify the differences between various crop management practices throughout the growing season [31]. Moreover, a recent study conducted over a maize field indicated that UAS-based CC is significantly correlated with grain yield [32].
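Once each pixel of an orthomosaic has been labeled as canopy or background, CC as defined above reduces to a simple pixel ratio. A minimal sketch in Python/NumPy (the function name and the toy mask are illustrative; producing the binary mask itself is the subject of the thresholding and classification methods discussed later):

```python
import numpy as np

def canopy_cover(mask) -> float:
    """Percent canopy cover: share of canopy pixels in a binary mask.

    mask: 2-D array where 1/True marks canopy pixels and
    0/False marks soil or other background pixels.
    """
    mask = np.asarray(mask, dtype=bool)
    return 100.0 * mask.sum() / mask.size

# Toy 3 x 3 "classified image": 3 of the 9 pixels flagged as canopy.
demo = np.array([[1, 0, 0],
                 [0, 1, 0],
                 [0, 0, 1]])
cc = canopy_cover(demo)   # 3/9 of the pixels, i.e., ~33.3% cover
```

In practice, the mask would come from a classified UAS orthomosaic clipped to a plot boundary, and CC would be reported per plot rather than per image.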

CC computation using multispectral (MS) sensors has gained more popularity than RGB (red, green, and blue)-based CC, primarily because the MS sensor is more stable over time and remains relatively unaffected by changes in environmental conditions (e.g., sunlight angle and cloud cover) throughout the crop growing season due to its irradiance sensor [3,7,33,34]. However, MS sensors are more sensitive and expensive than RGB sensors. RGB-based CC estimation methods can be divided into two categories: thresholding methods and pixel classification methods. Thresholding methods require the specification of color thresholds or ratios to identify canopy pixels. Pixel classification methods use a supervised or unsupervised pixel-wise classification method to identify canopy pixels. Though pixel classification methods are highly accurate, they are time consuming and computationally intensive. Supervised classification methods require training samples to be collected, which is expensive and prone to human error. However, pixel classification methods are particularly useful for calibrating thresholding methods [35]. There is an ample amount of work in the literature that has used RGB sensors to compute CC. Early work in this direction includes the quantification of turfgrass cover using digital image analysis by Richardson et al. (2001) [36]. Lee and Lee (2011) estimated canopy cover over a rice field using an RGB sensor [37]. Patrignani and Ochsner (2015) developed the Canopeo algorithm to extract fractional green canopy cover [38]. Despite a significant amount of previous literature exploring RGB-based CC estimation, there is a scarcity of work comparing different CC estimations throughout the crop growing season. Torres-Sánchez et al. (2014) [39] developed a multitemporal CC framework for a wheat field using UAS-based RGB images. However, it was limited to early-season CC estimation only.
Moreover, the highest accuracy that they achieved in mapping CC was less than 92%. Fang et al. (2016) [40] presented a case study of CC estimation using UAS-based MS sensor data over an oilseed rape field. However, their study aimed to provide CC estimation and flower fraction for crop species that have conspicuous non-green flowers or fruits. Moreover, they primarily used an MS sensor-based CC estimation methodology, with only one RGB-based approach, which worked efficiently only during the vegetative period. Marcial-Pablo et al. (2019) [41] compared CC estimation using RGB and MS sensor-based vegetation indices over a maize field. Their results suggested that RGB-based CC estimation can be useful in the early-season growth stage of the crop, while later in the season, CC estimation using MS sensor-based indices was more accurate. Moreover, the accuracy of the CC estimation also depended on automatic thresholding using the Otsu method. Lima-Cueto et al. (2019) [42] used 11 VIs to quantify vegetation cover in olive groves and suggested that MS sensor-based CC had better accuracy than RGB-based CC. A consistent observation in the aforementioned case studies was that RGB-based CC estimation was not efficient late in the season. Therefore, the objective of this study was not only to compare various RGB-based CC estimation methods with MS sensor-based CC estimation but also to improve RGB-based CC estimation to provide a more affordable option to breeders and agricultural scientists, particularly late in the season.
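To make the thresholding family of RGB-based methods concrete, the sketch below computes the excess-green index (ExG = 2g − r − b on sum-normalized chromatic coordinates) and separates canopy from background with an automatically selected Otsu threshold. This is an illustrative NumPy implementation under our own naming, not the specific pipeline of any of the cited studies:

```python
import numpy as np

def excess_green(rgb):
    """ExG = 2g - r - b on chromatic (sum-normalized) coordinates."""
    rgb = np.asarray(rgb, dtype=float)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0               # guard against all-zero pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2.0 * g - r - b

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the cut that maximizes between-class variance."""
    hist, edges = np.histogram(values.ravel(), bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = hist.cumsum()                    # pixel count at or below each cut
    m0 = (hist * centers).cumsum()        # intensity mass at or below each cut
    w1 = w0[-1] - w0
    mu0 = m0 / np.maximum(w0, 1)          # class means (guarded divisions)
    mu1 = (m0[-1] - m0) / np.maximum(w1, 1)
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(var_between)]

def rgb_canopy_cover(rgb):
    """Percent CC: ExG image thresholded automatically with Otsu."""
    exg = excess_green(rgb)
    return 100.0 * (exg > otsu_threshold(exg)).mean()

# Synthetic scene: left half pure-green "canopy", right half gray "soil".
scene = np.zeros((10, 10, 3))
scene[:, :5, 1] = 255.0
scene[:, 5:, :] = 100.0
cc = rgb_canopy_cover(scene)   # half the pixels are canopy -> 50.0
```

Because ExG operates on chromatic coordinates, it is partly normalized against overall brightness, but as the studies above observe, such green-based thresholds still degrade late in the season when canopies senesce or self-shade.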

### **2. Materials and Methods**
