Publications

Seydi, ST; Akhoondzadeh, M; Amani, M; Mahdavi, S (2021). Wildfire Damage Assessment over Australia Using Sentinel-2 Imagery and MODIS Land Cover Product within the Google Earth Engine Cloud Platform. REMOTE SENSING, 13(2), 220.

Abstract
Wildfires are major natural disasters that negatively affect human safety, natural ecosystems, and wildlife. Timely and accurate estimation of wildfire burn areas is particularly important for post-fire management and decision making. In this regard, Remote Sensing (RS) images are great resources due to their wide coverage, high spatial and temporal resolution, and low cost. In this study, Australian areas affected by wildfire were estimated using Sentinel-2 imagery and Moderate Resolution Imaging Spectroradiometer (MODIS) products within the Google Earth Engine (GEE) cloud computing platform. To this end, a framework based on change analysis was implemented in two main phases: (1) producing a binary map of burned areas (i.e., burned vs. unburned); and (2) estimating the burned area of each Land Use/Land Cover (LULC) type. The first phase was implemented in five main steps: (i) preprocessing; (ii) spectral and spatial feature extraction for the pre-fire and post-fire analyses; (iii) prediction of burned areas based on change detection by differencing the pre-fire and post-fire datasets; (iv) feature selection; and (v) binary mapping of burned areas by the classifiers based on the selected features. The second phase involved identifying the LULC types within the burned areas using the global MODIS land cover product (MCD12Q1). Based on the test datasets, the proposed framework showed high potential in detecting burned areas, with an overall accuracy (OA) of 91.02% and a kappa coefficient (KC) of 0.82. It was also observed that the largest burned area among the LULC classes belonged to evergreen needleleaf forests, with a burning rate of over 25%. Finally, the results of this study were in good agreement with Landsat-derived burned area products.
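
The following is a minimal sketch, not the authors' code, of the kind of GEE workflow the abstract describes: differencing pre-fire and post-fire Sentinel-2 composites via the Normalized Burn Ratio (NBR), thresholding the difference to a binary burned/unburned map, and overlaying the MCD12Q1 land cover product to attribute burned area by LULC class. The region of interest, date windows, cloud filter, and the dNBR threshold are illustrative assumptions; the paper itself uses extracted spectral and spatial features, feature selection, and trained classifiers rather than a fixed threshold.

```python
import ee

ee.Initialize()

# Assumed area of interest in southeastern Australia (illustrative only).
region = ee.Geometry.Rectangle([149.0, -37.5, 151.0, -35.5])

def nbr_composite(start, end):
    """Median Sentinel-2 SR composite reduced to NBR = (B8 - B12) / (B8 + B12)."""
    return (ee.ImageCollection('COPERNICUS/S2_SR')
            .filterBounds(region)
            .filterDate(start, end)
            .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
            .median()
            .normalizedDifference(['B8', 'B12'])
            .rename('NBR'))

pre_fire = nbr_composite('2019-10-01', '2019-11-30')   # assumed pre-fire window
post_fire = nbr_composite('2020-01-01', '2020-02-29')  # assumed post-fire window

# Change detection by differencing: dNBR = NBR_pre - NBR_post.
dnbr = pre_fire.subtract(post_fire).rename('dNBR')

# Binary burned map; 0.27 is an illustrative dNBR threshold, standing in for
# the paper's classifier-based binary mapping step.
burned = dnbr.gt(0.27).selfMask().rename('burned')

# Overlay MCD12Q1 (IGBP scheme, band LC_Type1) to label burned pixels by LULC class.
lulc = (ee.ImageCollection('MODIS/006/MCD12Q1')
        .filterDate('2019-01-01', '2019-12-31')
        .first()
        .select('LC_Type1'))
burned_by_lulc = lulc.updateMask(burned)

# Burned area per LULC class in hectares, grouped by the land cover band.
areas = ee.Image.pixelArea().divide(1e4).addBands(burned_by_lulc).reduceRegion(
    reducer=ee.Reducer.sum().group(groupField=1, groupName='lulc_class'),
    geometry=region, scale=500, maxPixels=1e13)
print(areas.getInfo())
```

Grouping the area sum by the land cover band mirrors the paper's second phase, reporting burned area per LULC class rather than a single total.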

DOI:
10.3390/rs13020220

ISSN:
2072-4292