**1. Introduction**

Evapotranspiration (ET) is an important component of the water cycle and the surface energy balance: as water changes phase from liquid to vapor, energy is consumed as latent heat [1]. Reliable long-term estimates of ET are needed to improve agricultural water use efficiency (WUE), defined as the amount of carbon assimilated as biomass or grain produced per unit of water consumed by a crop. Paulson [2] reported that climate elements and wind conditions affect regional and seasonal ET estimates. Knowledge of ET variability is important for irrigation scheduling decisions and proper water resources management. This is especially true in arid regions, where crop water requirements exceed rainfall and irrigation is essential for crop production.
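
The WUE definition above is a simple ratio of production to water consumed. The sketch below (Python; the function name, units, and values are illustrative assumptions, not from the study) makes that explicit:

```python
def water_use_efficiency(biomass_kg_per_ha: float, water_used_mm: float) -> float:
    """Return WUE as kg of biomass (or grain) per hectare per mm of water consumed."""
    if water_used_mm <= 0:
        raise ValueError("water consumed must be positive")
    return biomass_kg_per_ha / water_used_mm

# Illustrative example: 8000 kg/ha of grain produced with 500 mm of seasonal ET.
wue = water_use_efficiency(8000.0, 500.0)  # 16.0 kg/ha per mm
```

Under this definition, more accurate ET (the denominator) directly improves how well WUE can be quantified and managed.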

To sustain agricultural production in regions that depend on irrigation, blue water is extracted from streams (surface water) and aquifers (groundwater) to meet crop water demand. Over time, farmers are required to dig deeper wells and extract water from deeper aquifers to ensure an adequate supply for irrigation [3]. Unfortunately, the recharge rate of some aquifers is not sufficient to meet this demand, and historical declines in the water table have occurred due to extensive withdrawals [4]. In some cases, this has resulted in land subsidence and localized infrastructure damage [3]. ET is the main driver in irrigation water management and planning: the more accurate the ET estimates, the more likely the associated management strategies are to achieve potential crop yields. ET is also an important component in estimating soil water content, which facilitates improved water use efficiency.

Remote sensing-based ET models are effective for estimating crop water requirements at field and regional scales [5]. Several ET algorithms have been developed to utilize airborne and satellite data for irrigation scheduling and management. ET can be measured at the field scale using Bowen ratio (BR), eddy covariance (EC), and lysimeter systems. However, these methods cannot capture spatial variability, since each represents a very small measurement footprint (typically less than 150 m in an agricultural setting) and provides a single, averaged value that may not adequately represent conditions across a region. Satellite-based ET models, in contrast, produce regional-scale estimates of crop water use [6]. Many remote sensing-based ET algorithms have been developed, assessed, and widely used for estimating regional ET [6–9].
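
For reference, the Bowen ratio method mentioned above partitions the available energy (net radiation minus soil heat flux) into sensible and latent heat using near-surface gradients. The following minimal sketch illustrates that idea; the psychrometric constant and flux values are illustrative assumptions, not values from the cited studies:

```python
GAMMA = 0.066  # psychrometric constant, kPa/degC (approximate, near sea level)

def bowen_ratio(delta_t_degc: float, delta_e_kpa: float) -> float:
    """Bowen ratio (sensible/latent heat, H/LE) from temperature and vapor
    pressure differences measured between two heights above the canopy."""
    return GAMMA * delta_t_degc / delta_e_kpa

def latent_heat_flux_br(rn: float, g: float, beta: float) -> float:
    """Latent heat flux (W/m^2) from available energy and the Bowen ratio:
    LE = (Rn - G) / (1 + beta)."""
    return (rn - g) / (1.0 + beta)

# Illustrative midday values: Rn = 500 W/m^2, G = 50 W/m^2,
# gradients of 1.0 degC and 0.2 kPa between the two sensor heights.
beta = bowen_ratio(1.0, 0.2)
le = latent_heat_flux_br(500.0, 50.0, beta)
```

The resulting LE applies only to the small upwind footprint of the sensors, which is the spatial limitation noted above.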

Park et al. [10], Jackson [11], and Choudhury et al. [12] reported that spatially distributed ET at regional and watershed scales can be better represented using remote sensing than with traditional ET estimation methods. Two approaches are typically used to produce remote sensing ET estimates: (1) the land surface energy balance (EB) approach, and (2) the reflectance-based crop coefficient (Kc) and reference ET approach. The first approach treats ET as a phase change of water that consumes available energy in the environment for vaporization [13].
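
The second approach scales a weather-based reference ET by a crop coefficient derived from surface reflectance. A minimal sketch follows; the linear NDVI-to-Kc mapping and its parameters are hypothetical values for illustration, not a calibrated relationship from the literature:

```python
def kc_from_ndvi(ndvi: float, ndvi_min: float = 0.15, ndvi_max: float = 0.85,
                 kc_max: float = 1.15) -> float:
    """Hypothetical linear reflectance-based crop coefficient: Kc scales with
    vegetation cover inferred from NDVI, clamped to [0, kc_max]."""
    frac = (ndvi - ndvi_min) / (ndvi_max - ndvi_min)
    frac = max(0.0, min(1.0, frac))
    return kc_max * frac

def crop_et(kc: float, et_ref_mm_day: float) -> float:
    """Crop ET (mm/day) as crop coefficient times reference ET."""
    return kc * et_ref_mm_day

# Illustrative example: dense canopy (NDVI = 0.80) on a day with 7 mm reference ET.
etc = crop_et(kc_from_ndvi(0.80), 7.0)
```

Because reflectance tracks canopy development rather than surface temperature, this approach needs only visible/near-infrared imagery, in contrast to the thermally driven EB approach.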

Multiple satellite platforms are available for use with energy balance ET models, such as Landsat, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), the Geostationary Operational Environmental Satellite (GOES), the Moderate Resolution Imaging Spectroradiometer (MODIS), and others [14]. However, no single satellite provides both high spatial and high temporal resolution; daily, field-scale ET estimates therefore depend on data assimilation, model applicability, and accuracy [15]. Remote sensing-based ET calibration has been conducted for single-source methods such as the Surface Energy Balance Algorithm for Land (SEBAL) and the Mapping EvapoTranspiration at high Resolution with Internalized Calibration model (METRIC) [16,17], and for two-source methods such as the Two-Source Energy Balance model (TSEB) [18,19]. Energy balance models based on thermal remote sensing include TSEB [18,20], SEBAL [8], METRIC [6,17], and several others [7,21].
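
Single-source EB models such as SEBAL and METRIC obtain the latent heat flux as the residual of the surface energy balance and convert it to an ET rate. A minimal sketch of that residual step, with illustrative flux values (not values from any cited model run):

```python
LAMBDA_V = 2.45e6  # latent heat of vaporization, J/kg (approximate, ~20 degC)

def latent_heat_flux(rn: float, g: float, h: float) -> float:
    """Latent heat flux (W/m^2) as the energy balance residual: LE = Rn - G - H."""
    return rn - g - h

def et_rate_mm_per_hour(le: float) -> float:
    """Convert LE (W/m^2 = J/s/m^2) to an ET depth rate (mm/hour): dividing by
    the latent heat of vaporization gives kg/s/m^2, i.e., mm of water per second."""
    return le / LAMBDA_V * 3600.0

# Illustrative satellite-overpass fluxes: Rn = 600, G = 80, H = 120 W/m^2.
le = latent_heat_flux(600.0, 80.0, 120.0)   # 400 W/m^2
et_inst = et_rate_mm_per_hour(le)
```

Because Rn, G, and H are retrieved at overpass time, the result is an instantaneous rate, which is why the temporal upscaling methods discussed below are needed.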

A detailed assessment of existing EB models [7,22] found that daily ET estimates differed by 3 to 35% from Bowen ratio and EC ET measurements. Error sources included (a) modeling uncertainties, and (b) measurement errors and discrepancies between model and measurement scales. Other studies assessing multiple airborne high-resolution sensor platforms indicated good agreement with measured ET [23].

The BEAREX08 experiment [15] was a robust remote sensing experiment that involved mass balance measurements of ET using four weighing lysimeters [24,25]. A wide array of supporting instrumentation was installed, including neutron probe (NP) access tubes for soil water measurements, multi-level canopy temperature measurements, and above- and below-canopy irradiance measurements. The purpose of these measurements was to provide as much measured data as possible, reducing reliance on estimated quantities and providing more points of comparison. These data were used to address the lack of studies in which EB model estimates of ET are compared with mass balance measurements, thereby reducing uncertainty sources in EB models. Gowda et al. [7] reported that few remotely sensed ET estimates are used in irrigation scheduling and in-field management due to the absence of daily data at field-scale resolution. Many methods have been proposed to overcome this limitation, such as combining infrequent, high spatial resolution data (e.g., Landsat, with 60 m spatial resolution and 16-day temporal resolution) with more frequent, lower spatial resolution data (e.g., MODIS, with 1000 m spatial resolution and 1-day temporal resolution). The Disaggregated Atmosphere-Land Exchange Inverse (DisALEXI) model demonstrated this concept [14,26,27], but no evaluation of this approach for management has been successful thus far.

Spatial and temporal resolution problems could be resolved using aircraft; however, cost, data processing requirements, and a lack of experienced users have prevented widespread use of airborne platforms as imagery sources for water management. The EB method typically provides an instantaneous value of ET, which is interpolated to daily ET using the evaporative fraction method [28] or the reference ET approach [6,29]. Each ET estimation approach has its uncertainties; however, quality assessment and quality control of measurement data reduce such modeling uncertainties [30,31]. The accuracy of remotely sensed ET estimates at multiple time scales is essential for water management and irrigation planning. Extensive studies have assessed METRIC model performance on irrigated fields [32–35]; however, none of these studies evaluated model performance under dryland conditions for extended periods with various crops and temporal resolutions [36]. This study assessed the accuracy of daily ET estimates obtained by linear interpolation, using dryland and irrigated lysimeters over a ten-year period with various crops and at multiple time scales. Such an assessment is crucial for water management policy makers, researchers, hydrologists, and irrigators when using these technologies for real-time irrigation decisions.
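
The linear interpolation evaluated here can be illustrated as follows: a reference ET fraction (ETrF) retrieved on two overpass dates is interpolated linearly to the days in between and multiplied by that day's ground-based reference ET. Function names and values below are illustrative assumptions, not the study's implementation:

```python
def interp_etrf(doy: int, doy_a: int, etrf_a: float,
                doy_b: int, etrf_b: float) -> float:
    """Linearly interpolate the reference ET fraction (ETrF) between two
    satellite overpass days of year (doy_a < doy <= doy_b)."""
    w = (doy - doy_a) / (doy_b - doy_a)
    return etrf_a + w * (etrf_b - etrf_a)

def daily_et_mm(doy: int, doy_a: int, etrf_a: float,
                doy_b: int, etrf_b: float, et_ref_mm: float) -> float:
    """Daily ET (mm) = interpolated ETrF x daily reference ET for that day."""
    return interp_etrf(doy, doy_a, etrf_a, doy_b, etrf_b) * et_ref_mm

# Illustrative example: overpasses on days 180 (ETrF = 0.8) and 190 (ETrF = 1.0),
# with 7 mm of reference ET on day 185.
et_day185 = daily_et_mm(185, 180, 0.8, 190, 1.0, 7.0)
```

The assumption is that ETrF varies smoothly between overpasses; rapid changes (e.g., an irrigation or rain event between image dates) are a known uncertainty source for this method.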

The objective of this research was to assess the accuracy of daily, seven-day running average, monthly, and seasonal satellite-based ET estimates against large weighing lysimeter ET measurements in the Texas High Plains under irrigated and dryland conditions during the growing and non-growing seasons.
