Publications

Cho, AY; Park, SE; Kim, DJ; Kim, J; Li, CL; Song, J (2023). Burned Area Mapping Using Unitemporal PlanetScope Imagery With a Deep Learning Based Approach. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 16, 242-253.

Abstract
The risk and damage of wildfires have been increasing for various reasons, including climate change, and the Republic of Korea is no exception. Burned area mapping is crucial not only to prevent further damage but also to manage burned areas. Burned area mapping using satellite data, however, has been limited by the spatial and temporal resolution of the data and by classification accuracy. This article presents a new burned area mapping method in which damaged areas are mapped using semantic segmentation. PlanetScope imagery, which provides high-resolution images with a very short revisit time, was used, and the proposed method is based on U-Net and requires only a unitemporal PlanetScope image. The network was trained on 17 satellite images covering 12 forest fires and corresponding label images that were obtained semiautomatically by setting threshold values. Band combination tests were conducted to produce an optimal burned area mapping model. The results demonstrated that the optimal and most stable band combination is the red, green, blue, and near-infrared bands of PlanetScope. To improve classification accuracy, the Normalized Difference Vegetation Index (NDVI), dissimilarity extracted from the Gray-Level Co-Occurrence Matrix (GLCM), and Land Cover Maps were used as additional datasets. In addition, topographic normalization was conducted to reduce shadow effects and thereby improve model performance and classification accuracy. The F1 scores and overall accuracies of the final image segmentation models range from 0.883 to 0.939 and from 0.990 to 0.997, respectively. These results highlight the potential of detecting burned areas using a deep learning-based approach.
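
Illustrative sketch (Python). The following is not the authors' code; it is a minimal sketch of the kind of input stack the abstract describes, combining PlanetScope red/green/blue/NIR bands with NDVI and a GLCM dissimilarity texture layer before feeding a U-Net-style segmentation network. The window size, number of gray levels, and all function names here are illustrative assumptions, not values reported in the paper.

import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-6)  # small epsilon avoids division by zero

def glcm_dissimilarity(gray: np.ndarray, window: int = 7, levels: int = 32) -> np.ndarray:
    """Per-pixel GLCM dissimilarity from a sliding window (assumed parameters)."""
    # Quantize reflectance into a small number of gray levels for the co-occurrence matrix.
    q = np.digitize(gray, np.linspace(gray.min(), gray.max(), levels)) - 1
    q = q.clip(0, levels - 1).astype(np.uint8)
    pad = window // 2
    padded = np.pad(q, pad, mode="reflect")
    out = np.zeros(gray.shape, dtype=np.float32)
    for i in range(gray.shape[0]):
        for j in range(gray.shape[1]):
            patch = padded[i:i + window, j:j + window]
            glcm = graycomatrix(patch, distances=[1], angles=[0], levels=levels,
                                symmetric=True, normed=True)
            out[i, j] = graycoprops(glcm, "dissimilarity")[0, 0]
    return out

def build_input_stack(red, green, blue, nir):
    """Stack spectral bands, NDVI, and texture into an (H, W, C) array for segmentation."""
    layers = [red, green, blue, nir, ndvi(nir, red), glcm_dissimilarity(nir)]
    return np.stack(layers, axis=-1).astype(np.float32)

if __name__ == "__main__":
    # Random reflectance values stand in for a unitemporal PlanetScope scene.
    rng = np.random.default_rng(0)
    r, g, b, n = (rng.random((64, 64), dtype=np.float32) for _ in range(4))
    x = build_input_stack(r, g, b, n)
    print(x.shape)  # (64, 64, 6): channels fed to a U-Net-style segmentation network

The design point the abstract makes is that all of these layers come from a single acquisition date, so the approach does not require a pre-fire reference image.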

DOI:
10.1109/JSTARS.2022.3225070

ISSN:
2151-1535