Publications

Ticehurst, C.; Karim, F. (2023). Towards developing comparable optical and SAR remote sensing inundation mapping with hydrodynamic modelling. International Journal of Remote Sensing, 44(9), 2912–2935.

Abstract
Inundation mapping is an essential part of environmental monitoring, flood disaster management and risk mitigation. Many Earth observation sensors can provide spatial information about inundation extent at different times; however, these extents need to be comparable to provide an accurate and consistent estimate of a flood's progression. Monitoring inundation extent around the peak of a flood is important because it captures the most hazardous period of a large event, and it identifies connections between off-stream waterholes and flood waters for environmental monitoring. This paper presents results from a study comparing near-coincident flood inundation maps derived from optical (Landsat, MODIS, Sentinel-2 and VIIRS) and Synthetic Aperture Radar (SAR) (Sentinel-1 and NovaSAR-1) remote sensing imagery. We also compare all inundation maps derived from remote sensing data with near-coincident hydrodynamic (HD) modelling. The study was conducted for the Fitzroy floodplain in Western Australia, a large, complex and remotely located river basin. The results show that the optical remote sensing data have average F1 scores ranging from 0.57 to 0.65 when compared to the HD model results, with Landsat and MODIS performing best. Sentinel-1 and NovaSAR-1 SAR show poor agreement with the HD model results (average F1 scores of 0.31 and 0.35, respectively), particularly within scattered vegetation adjacent to the river channel, with better results in open water on the floodplain and in the river. If comparisons are made only during the peak flood stage, the F1 scores for the optical data improve (from 0.61 up to 0.80). Comparisons of the remote sensing inundation maps show that the optical data are suitable for interchangeable mapping during large flood events.
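The F1 scores quoted above measure per-pixel agreement between a binary wet/dry mask from remote sensing and one from the HD model. The following Python sketch illustrates that metric only; it is not the paper's code, and it assumes both masks are co-registered rasters on the same grid with 1 = wet and 0 = dry:

    import numpy as np

    def f1_agreement(pred_mask, ref_mask):
        # F1 = 2*TP / (2*TP + FP + FN), with 1 = wet, 0 = dry
        pred = np.asarray(pred_mask, dtype=bool)
        ref = np.asarray(ref_mask, dtype=bool)
        tp = np.sum(pred & ref)    # wet in both maps
        fp = np.sum(pred & ~ref)   # wet only in the remote sensing map
        fn = np.sum(~pred & ref)   # wet only in the HD model map
        denom = 2 * tp + fp + fn
        return 2 * tp / denom if denom else 0.0

    # Hypothetical 4x4 masks, purely for illustration
    rs_mask = np.array([[1, 1, 0, 0],
                        [1, 1, 0, 0],
                        [0, 1, 1, 0],
                        [0, 0, 0, 0]])
    hd_mask = np.array([[1, 1, 1, 0],
                        [1, 1, 0, 0],
                        [0, 0, 1, 0],
                        [0, 0, 0, 0]])
    print(f1_agreement(rs_mask, hd_mask))  # 0.833...

An F1 of 1.0 would indicate identical inundation extents, while scores near the SAR results reported above (roughly 0.3) mean most wet pixels appear in only one of the two maps.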

DOI:
10.1080/01431161.2023.2211714

ISSN:
1366-5901