In this paper, data from the first Enhanced Observing Period (EOP-1) of the Coordinated Enhanced Observing Period (CEOP) are used to assess the Surface Energy Balance System (SEBS) model. The purpose of this study is to evaluate the adaptability of SEBS to different climatic zones and land cover classifications at two different scales. The SEBS model was examined at the field (tower) scale based primarily on in-situ observations from CEOP sites. To examine a broader-scale application, remotely sensed land surface temperature (LST) from the MODIS sensor and surface meteorology from the Global Land Data Assimilation System (GLDAS) were used as the required forcing datasets. Comparisons at the tower scale show that the model predictions of the energy fluxes agree reasonably well with the observations. The root mean square error (RMSE) of the evapotranspiration (ET) prediction based on MODIS LST plus CEOP meteorological observations is about 61 W m-2 at a grassland site (Cabauw) and a needleleaf forest site (BERMS). The RMSE of the ET prediction at a corn site (Bondville) is 96 W m-2, corresponding to a percentage error of 28.9%. When GLDAS forcing was used instead of the CEOP tower observations, the RMSEs of the ET predictions at Cabauw, BERMS and Bondville increased to 82, 84 and 140 W m-2, respectively. A negative bias in the GLDAS surface downward radiative forcing contributed substantially to the larger deviation of the ET predictions from the tower-based values. The innovative aspects of this study are: a) to our knowledge, no comparable evaluation of a remote-sensing-based ET model under such diverse climate and land cover conditions has been carried out before; b) ET modeling was assessed at scales ranging from the site scale to the GLDAS grid cell; c) a framework for estimating the spatial distribution of ET by combining satellite data with available ground meteorology was tested.
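The RMSE and percentage-error metrics used above can be computed as sketched below. This is a minimal illustration with made-up flux values, not the paper's actual tower or model data:

```python
import numpy as np

# Hypothetical ET flux series in W m-2 (illustrative only).
observed = np.array([120.0, 250.0, 310.0, 180.0, 90.0])   # tower-observed ET
predicted = np.array([140.0, 230.0, 350.0, 160.0, 110.0])  # SEBS-predicted ET

# Root mean square error between predicted and observed fluxes.
rmse = np.sqrt(np.mean((predicted - observed) ** 2))

# Percentage error, here taken relative to the mean observed flux.
pct_error = 100.0 * rmse / np.mean(observed)

print(rmse, pct_error)
```

Note that the percentage error depends on the normalization chosen (mean observed flux here); other normalizations are possible.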