Zhang, LZ; Zhang, Q; Yang, QQ; Yue, LW; He, J; Jin, XY; Yuan, QQ (2025). Near-real-time wildfire detection approach with Himawari-8/9 geostationary satellite data integrating multi-scale spatial-temporal feature. INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 137, 104416.
Abstract
Wildfires pose a serious threat to ecological environments and human safety, so rapid and accurate wildfire detection is of great importance. However, existing wildfire detection methods neglect the full integration of spatial-temporal relationships across scales and therefore suffer from low robustness and accuracy in varying wildfire scenes. To address this, we propose a deep learning model for near-real-time wildfire detection whose core idea is to integrate multi-scale spatial-temporal features (MSSTF) to efficiently capture the dynamics of wildfires. Specifically, we design a multi-kernel attention-based convolution (MKAC) module to extract spatial features that represent the differences between fire and non-fire pixels within multi-scale receptive fields. In addition, a long short-term Transformer (LSTT) module captures temporal differences from image sequences with different window lengths. The two modules are combined into multiple streams to integrate the multi-scale spatial-temporal features, and the multi-stream features are then fused to generate the fire classification map. Extensive experiments on various fire scenes show that the proposed method outperforms JAXA wildfire products and representative deep learning models, achieving the best accuracy scores (average fire accuracy (FA): 88.25%; average false alarm rate (FAR): 20.82%). The results also show that the method is sensitive to early-stage fire events and can be applied to near-real-time wildfire detection with 10-minute Himawari-8/9 satellite data. The data and code used in this study are available at: https://github.com/eagle-void/MSSTF.
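The abstract describes a multi-stream design: an MKAC module extracts spatial features under multi-scale receptive fields, LSTT modules compare temporal windows of different lengths, and the stream outputs are fused into a per-pixel fire map. The following minimal PyTorch sketch only illustrates that overall structure; it is not the authors' implementation (which is linked above), and the band count, window lengths, attention form, and layer sizes are illustrative assumptions.

import torch
import torch.nn as nn

class MKACBlock(nn.Module):
    """Hypothetical multi-kernel convolution with channel attention (MKAC-style)."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes]
        )
        fused = out_ch * len(kernel_sizes)
        # Simple squeeze-and-excitation style channel attention over the fused branches
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(fused, fused // 4, 1), nn.ReLU(),
            nn.Conv2d(fused // 4, fused, 1), nn.Sigmoid(),
        )
        self.proj = nn.Conv2d(fused, out_ch, 1)

    def forward(self, x):                        # x: (B, C, H, W)
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        return self.proj(feats * self.attn(feats))

class LSTTBlock(nn.Module):
    """Hypothetical LSTT-style Transformer over a per-pixel temporal window."""
    def __init__(self, dim, n_heads=4, window=6):
        super().__init__()
        self.window = window
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                        # x: (B, T, C, H, W)
        x = x[:, -self.window:]                  # keep only the last `window` time steps
        b, t, c, h, w = x.shape
        tokens = x.permute(0, 3, 4, 1, 2).reshape(b * h * w, t, c)
        tokens = self.encoder(tokens)
        # Return the feature of the latest time step as a spatial map
        return tokens[:, -1].reshape(b, h, w, c).permute(0, 3, 1, 2)

class MSSTFNet(nn.Module):
    """Multi-stream spatial-temporal fusion producing a per-pixel fire probability map."""
    def __init__(self, in_ch=7, dim=32, windows=(3, 6)):
        super().__init__()
        self.spatial = MKACBlock(in_ch, dim)
        self.temporal = nn.ModuleList([LSTTBlock(dim, window=w) for w in windows])
        self.head = nn.Conv2d(dim * len(windows), 1, 1)

    def forward(self, seq):                      # seq: (B, T, C, H, W) image sequence
        b, t, c, h, w = seq.shape
        feats = self.spatial(seq.reshape(b * t, c, h, w)).reshape(b, t, -1, h, w)
        streams = [branch(feats) for branch in self.temporal]   # one stream per window length
        return torch.sigmoid(self.head(torch.cat(streams, dim=1)))

# Example: a 6-step sequence of 7-band 64x64 patches -> (2, 1, 64, 64) fire probabilities
probs = MSSTFNet()(torch.randn(2, 6, 7, 64, 64))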
DOI: 10.1016/j.jag.2025.104416
ISSN: 1872-826X