Publications

Zhang, Fang; Zhu, Xiaolin; Liu, Desheng (2014). Blending MODIS and Landsat images for urban flood mapping. International Journal of Remote Sensing, 35(9), 3237-3253.

Abstract
Satellite images provide important data sources for monitoring flood disasters. However, the trade-off between the spatial and temporal resolutions of current satellite sensors limits their use in urban flooding studies. This study applied and compared two data fusion models, the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) and the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM), to generate synthetic flooding images with improved temporal and spatial resolution for flood mapping. The synthetic images were produced in two scenarios: (1) real-time prediction, based on Landsat and MODIS images acquired before the investigated flooding; and (2) post-disaster prediction, based on images acquired after the flooding. The flooding of New Orleans caused by Hurricane Katrina in 2005 was selected as a case study. The results show that the generated Landsat-like images can be successfully applied to flood mapping. In particular, ESTARFM outperforms STARFM in predicting surface reflectance for both real-time and post-flooding predictions. However, the flood maps derived from the Landsat-like images produced by STARFM and ESTARFM are similar, with overall accuracies around 0.9. Only for the real-time flood maps does ESTARFM achieve a slightly higher overall accuracy than STARFM, indicating that the lower quality of the Landsat-like image generated by STARFM may not affect flood mapping accuracy, owing to the marked contrast between land and water. This study suggests that both STARFM and ESTARFM hold great potential for urban flooding research. Blending multi-source images could also support other disaster studies that require remotely sensed data with both high spatial and high temporal resolution.
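To illustrate the kind of fusion the abstract describes, the sketch below shows the core STARFM-style idea in simplified form: the fine-resolution (Landsat) image at a base date is updated with the coarse-resolution (MODIS) change observed between the base and prediction dates, with neighbouring pixels weighted by spectral, temporal, and spatial similarity. This is not the authors' code or the published STARFM algorithm in full; the function name, window size, and the simple inverse-product weighting are illustrative assumptions, and the inputs are assumed to be co-registered arrays on the same grid.

import numpy as np

def starfm_like_predict(fine_t0, coarse_t0, coarse_tp, win=7):
    """Sketch of a STARFM-like prediction of fine-resolution reflectance.

    fine_t0   : fine-resolution image at the base date
    coarse_t0 : coarse-resolution image at the base date (resampled to the fine grid)
    coarse_tp : coarse-resolution image at the prediction date (resampled to the fine grid)
    """
    pad = win // 2
    f0 = np.pad(fine_t0, pad, mode="reflect")
    c0 = np.pad(coarse_t0, pad, mode="reflect")
    cp = np.pad(coarse_tp, pad, mode="reflect")
    out = np.empty_like(fine_t0, dtype=float)

    # Spatial distance of each window cell from the window centre.
    yy, xx = np.mgrid[-pad:pad + 1, -pad:pad + 1]
    dist = 1.0 + np.hypot(yy, xx) / pad

    eps = 1e-6
    rows, cols = fine_t0.shape
    for i in range(rows):
        for j in range(cols):
            wf0 = f0[i:i + win, j:j + win]
            wc0 = c0[i:i + win, j:j + win]
            wcp = cp[i:i + win, j:j + win]
            # Spectral (fine vs. coarse) and temporal (coarse change) differences
            # drive the weights; closer, more similar neighbours count more.
            spec = np.abs(wf0 - wc0) + eps
            temp = np.abs(wcp - wc0) + eps
            weight = 1.0 / (spec * temp * dist)
            weight /= weight.sum()
            # Add the weighted coarse-resolution change to the base fine image.
            out[i, j] = np.sum(weight * (wf0 + wcp - wc0))
    return out

The resulting synthetic Landsat-like image could then be classified into water and non-water classes for flood mapping, which is the step where the abstract reports overall accuracies around 0.9 for both fusion models.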

DOI:
10.1080/01431161.2014.903351

ISSN:
0143-1161; 1366-5901