Accurate knowledge of ET is an essential component in efforts to monitor the global water cycle, climate variability, agricultural productivity, floods, and droughts. Model-based estimates of ET from global landmasses range from 58 × 10³ to 85 × 10³ km³ yr⁻¹, although the exact magnitude and spatial distribution are still in question (Dirmeyer et al. 2006). Thermal infrared (TIR) RS has proved to be an invaluable asset in modeling spatially distributed evaporative fluxes (Kalma et al. 2008). Most prognostic LSMs determine ET through a water-balance approach, relying on spatially distributed estimates of Prcp interpolated from coarse-resolution gauge networks or mapped using satellite techniques, neither of which currently provides adequate accuracy at scales useful for drought monitoring. Nonetheless, a number of diagnostic, RS-based methods to estimate ET have been developed in the past few decades, mainly estimating ET as a residual of the surface energy balance (Kalma et al. 2008):
λE = Rn − G − H,

where Rn (net radiation) and H (sensible heat flux) are already defined, G is the soil heat flux, and λE is the latent heat flux associated with ET.
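The energy-balance residual approach can be sketched in a few lines. This is a minimal illustration, not any specific model from the text; the flux values and the conversion constant are illustrative assumptions.

```python
# Sketch: ET as the residual of the surface energy balance.
# All flux values below are illustrative, not from any model in the text.

LAMBDA_V = 2.45e6  # latent heat of vaporization (J kg-1), ~20 degC


def latent_heat_flux(rn, g, h):
    """Latent heat flux LE = Rn - G - H (all fluxes in W m-2)."""
    return rn - g - h


def et_rate_mm_per_day(le):
    """Convert a latent heat flux (W m-2) to an ET depth (mm day-1)."""
    # 1 W m-2 = 1 J s-1 m-2; dividing by lambda gives kg m-2 s-1,
    # which equals mm s-1 of water depth; scale to a day (86400 s).
    return le / LAMBDA_V * 86400.0


le = latent_heat_flux(rn=500.0, g=50.0, h=150.0)  # 300 W m-2 remains for ET
print(round(et_rate_mm_per_day(le), 2))  # -> 10.58
```

In practice Rn, G, and H are themselves derived from RS inputs (LST, albedo, vegetation indices), which is where the diagnostic methods listed below differ.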
Diagnostic ET methods based on TIR RS require no information regarding antecedent Prcp or SM storage capacity—the current surface moisture status is deduced directly from the RS-derived temperature signal. In general, dry soil or stressed vegetation heats up faster than wet soil or well-watered vegetation. TIR RS data sources provide multiscale information that can be used to bridge between the observation scale (~100 m) and global model pixel scale (10-100 km), facilitating direct model accuracy assessment. Examples of diagnostic ET methods include the Surface Energy Balance Algorithm for Land (SEBAL; Bastiaanssen et al. 1998) model, the Mapping EvapoTranspiration at high Resolution with Internalized Calibration (METRIC; Allen et al. 2007) model, the Surface Energy Balance System (SEBS; Su 2002) model, the Atmosphere Land Exchange Inverse (ALEXI;
Anderson et al. 1997, 2011b) model, and the Operational Simplified Surface Energy Balance (SSEBop; Senay et al. 2011, 2013) model. Some of these methods (SEBAL, METRIC, SEBS, and SSEBop) generally rely on a single RS observation of TIR LST and apply a scaling between a "hot" pixel (where ET = 0) and a "cold" pixel (where ET = E0); this yields reliable ET estimates when the "hot" and "cold" pixels can be accurately identified. Few of these methods have been employed in drought monitoring, as much of their focus has been placed on high-resolution, field-scale estimation of consumptive water use (i.e., actual ET). However, methods such as ALEXI use a time-integrated measure of LST during the mid-morning hours, a time when LST and SM have been shown to be strongly correlated (Anderson et al. 1997). The ability of TIR LST to assess the current SM state and the effects of vegetation stress on ET provides a unique opportunity to augment current drought-monitoring methods, providing an estimate of actual water availability that can be used with estimates of E0 to estimate anomalies in water use and/or vegetation stress.
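The hot/cold-pixel scaling used by methods such as SSEBop can be sketched as a linear interpolation of an ET fraction between the two temperature extremes. This is a simplified illustration under assumed endmember temperatures; operational models determine the hot and cold references far more carefully.

```python
# Sketch of hot/cold pixel scaling: a pixel's ET fraction is
# interpolated between a hot pixel (ET = 0) and a cold pixel
# (ET = E0, the reference ET). Endmember values are illustrative.


def et_fraction(t_pixel, t_hot, t_cold):
    """Linear ET fraction from LST: 1 at the cold pixel, 0 at the hot."""
    f = (t_hot - t_pixel) / (t_hot - t_cold)
    return min(max(f, 0.0), 1.0)  # clamp to the physical range [0, 1]


def actual_et(t_pixel, t_hot, t_cold, e0):
    """Actual ET as the ET fraction times reference ET E0 (mm day-1)."""
    return et_fraction(t_pixel, t_hot, t_cold) * e0


# A pixel midway between the hot (320 K) and cold (300 K) endmembers
# evaporates at half the reference rate:
print(actual_et(t_pixel=310.0, t_hot=320.0, t_cold=300.0, e0=6.0))  # -> 3.0
```

The quality of such estimates hinges entirely on how well the hot and cold endmembers represent true zero-ET and potential-ET conditions in the scene, which is the calibration step that distinguishes the individual methods.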