Land surface temperature (LST) derived from thermal infrared (TIR) satellite data has been reliably used as a remote indicator of evapotranspiration (ET) and surface moisture status. However, to retrieve ET with an accuracy approaching 10%, LST must be retrieved to within 1 °C or better, disregarding other sources of uncertainty. The removal of atmospheric effects is key to a precise estimation of LST, and it requires detailed information on water vapor. The Thermal Infrared Sensor (TIRS) onboard Landsat 8 captures data in two long-wave thermal bands at 100-meter resolution. However, the US Geological Survey has reported a calibration problem in the TIRS bands caused by stray light, producing a larger bias in band 11 (4%) than in band 10 (2%). Split-window algorithms for the estimation of LST might therefore not be reliable. Our work focuses on the impact of using different atmospheric profiles (e.g., weather prediction models, satellite soundings) on LST estimated via MODTRAN using a single TIRS band (band 10). Sites with in-situ measurements of LST are used as evaluation sources. Comparisons between measured LST and LST derived from different atmospheric profile inputs to MODTRAN are carried out for two Landsat-overpass days (DOY 153 and 160, 2015). Preliminary results show a mean absolute error of around 3 °C between in-situ and estimated LST over two different crops (alfalfa and carrot) and bare soil.
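The single-channel retrieval described above can be sketched as follows: MODTRAN, driven by an atmospheric profile, supplies the band-effective transmittance and the up- and downwelling path radiances, and the surface temperature is recovered by inverting the radiative transfer equation and the Planck function for band 10. This is a minimal illustration, not the authors' exact processing chain; the K1/K2 values are the published Landsat 8 band 10 thermal constants, and the function name and sample inputs are hypothetical.

```python
import math

# Landsat 8 TIRS band 10 thermal conversion constants (published values
# from the Landsat 8 metadata specification).
K1 = 774.8853   # W / (m^2 sr um)
K2 = 1321.0789  # K

def lst_single_channel(L_sensor, tau, L_up, L_down, emissivity):
    """Single-channel LST retrieval for band 10.

    tau, L_up, L_down are the band-effective atmospheric transmittance
    and up/downwelling path radiances, e.g. simulated with MODTRAN from
    an atmospheric profile. Returns LST in degrees Celsius.
    """
    # Remove the atmospheric path radiance and transmittance loss
    L_surface = (L_sensor - L_up) / tau
    # Remove reflected downwelling sky radiance; normalize by emissivity
    L_bb = (L_surface - (1.0 - emissivity) * L_down) / emissivity
    # Invert the band-effective Planck function (Kelvin), convert to Celsius
    T_kelvin = K2 / math.log(K1 / L_bb + 1.0)
    return T_kelvin - 273.15
```

For example, with illustrative (not measured) inputs such as a sensor radiance of 9.0 W/(m^2 sr um), transmittance 0.8, and emissivity 0.98, the retrieval yields a physically plausible surface temperature in the 20-30 °C range; swapping the atmospheric parameters for those derived from a different profile shifts the result, which is the sensitivity the study evaluates.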
Original language: English (US)
Title of host publication: 21st International Congress on Modelling and Simulation
State: Published - Nov 29 2015