An Unsupervised Machine Learning-Based Approach for Combining Sentinel 1 and 2 to Assess the Severity of Fires over Large Areas Using a Google Earth Engine
Riccardi C. G.; Abate N.; Lasaponara R.
2026
Abstract
Highlights

What are the main findings?

The integration of Sentinel-1 (SAR) and Sentinel-2 (optical) data within a Google Earth Engine framework achieved a highly accurate burned area estimation, showing a minimal discrepancy of only 1.3% compared to official EFFIS records. Statistical analysis confirms strong correlations between SAR-based indices (RVI and DPSVI) and traditional optical metrics, validating the use of radar to assess fire severity in areas with frequent cloud or smoke obstruction.

What are the implications of the main findings?

The proposed multi-sensor methodology provides a scalable and automated solution for large-scale wildfire monitoring, ensuring temporal continuity when optical sensors are limited by atmospheric conditions.
Leveraging cloud computing and unsupervised machine learning (K-means) reduces reliance on manual interpretation, offering a robust tool for rapid post-fire damage assessment and global wildfire mitigation strategies.

Wildfires represent a significant global environmental challenge, necessitating advanced monitoring and assessment techniques. This study explores the integration of Sentinel-1 Synthetic Aperture Radar (SAR) and Sentinel-2 optical data within a Google Earth Engine (GEE) framework to enhance wildfire detection, burned area estimation, and severity assessment. By leveraging SAR's capability to penetrate atmospheric obstructions and optical data's spectral sensitivity to vegetation changes, the proposed methodology addresses the limitations of single-sensor approaches. The results demonstrate strong correlations between SAR-based indices, such as the Radar Vegetation Index (RVI) and the Dual-Polarized SAR Vegetation Index (DPSVI), and traditional optical indices, including the Normalized Burn Ratio (NBR) and the differenced NBR (Delta NBR). Despite challenges related to terrain influence, differences in sensor resolution, and computational demands, the integration of multi-sensor data in a cloud-based environment offers a scalable and efficient solution for wildfire monitoring. During the peak of the fire events, significant atmospheric obstruction was verified using Sentinel-2 metadata and the QA60 cloud mask band, which confirmed persistent cloud cover and thick smoke plumes over the study areas. This interference limited the reliability of purely optical monitoring, further justifying the integration of SAR data. Future research should focus on refining data fusion techniques, incorporating additional datasets such as thermal infrared imagery and meteorological variables, and enhancing automation through artificial intelligence (AI).
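The indices named in the abstract have widely used formulations. The following NumPy sketch illustrates them under stated assumptions: the input arrays are illustrative patches, not the authors' GEE pipeline, and DPSVI is omitted because several formulations circulate in the literature.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR).
    For Sentinel-2, NIR is band B8 and SWIR is band B12."""
    return (nir - swir) / (nir + swir)

def delta_nbr(nbr_pre, nbr_post):
    """Differenced NBR (Delta NBR): higher values indicate more severe burning."""
    return nbr_pre - nbr_post

def rvi_dual_pol(vv, vh):
    """Dual-polarized Radar Vegetation Index: 4*VH / (VV + VH),
    computed on linear-scale backscatter (not dB)."""
    return 4.0 * vh / (vv + vh)

# Hypothetical 2x2 pre- and post-fire reflectance patches (illustrative values).
nir_pre = np.array([[0.40, 0.42], [0.38, 0.41]])
swir_pre = np.array([[0.10, 0.12], [0.11, 0.10]])
nir_post = np.array([[0.20, 0.18], [0.22, 0.19]])
swir_post = np.array([[0.30, 0.32], [0.28, 0.30]])

dnbr = delta_nbr(nbr(nir_pre, swir_pre), nbr(nir_post, swir_post))
print(dnbr)  # positive values indicate vegetation loss
```

Per-pixel Delta NBR values such as these are what a severity classification (thresholded or clustered) would operate on.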
This study underscores the potential of remote sensing advancements in improving fire management strategies and global wildfire mitigation efforts.
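The QA60 verification mentioned in the abstract relies on Sentinel-2's 60 m quality band, in which bit 10 flags opaque clouds and bit 11 flags cirrus (per the Sentinel-2 Level-1C documentation). A minimal sketch of decoding it outside GEE, with hypothetical sample values:

```python
import numpy as np

CLOUD_BIT = 1 << 10   # QA60 bit 10: opaque clouds
CIRRUS_BIT = 1 << 11  # QA60 bit 11: cirrus clouds

def clear_sky_mask(qa60):
    """True where neither the cloud bit nor the cirrus bit is set."""
    return (qa60 & (CLOUD_BIT | CIRRUS_BIT)) == 0

# Hypothetical QA60 pixels: clear, cloudy, cirrus, cloud + cirrus.
qa = np.array([0, CLOUD_BIT, CIRRUS_BIT, CLOUD_BIT | CIRRUS_BIT], dtype=np.uint16)
print(clear_sky_mask(qa))  # only the first pixel is clear
```

Masking every pixel where this function returns False is the standard way to restrict optical index computation to cloud-free observations.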

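The unsupervised step in the title can be illustrated with a compact 1-D K-means over per-pixel Delta NBR values. This is a generic sketch (the value ranges and class count of four are assumptions), not the authors' implementation; within GEE itself a built-in clusterer such as ee.Clusterer.wekaKMeans would typically be used instead.

```python
import numpy as np

def kmeans_1d(values, k=4, iters=50, seed=0):
    """Minimal 1-D K-means: clusters index values (e.g. per-pixel Delta NBR)
    into k classes. Returns centroids sorted ascending and per-value labels
    remapped so class 0 = lowest centroid (least severe)."""
    rng = np.random.default_rng(seed)
    x = values.ravel().astype(float)
    centroids = rng.choice(x, size=k, replace=False)  # init from data points
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):  # keep empty clusters at old position
                centroids[j] = x[labels == j].mean()
    order = np.argsort(centroids)
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)  # relabel by centroid rank
    return centroids[order], remap[labels].reshape(values.shape)

# Hypothetical Delta NBR values from unburned (~0) to high severity (~0.9).
dnbr_vals = np.array([0.02, 0.05, 0.30, 0.33, 0.60, 0.62, 0.88, 0.90])
centroids, classes = kmeans_1d(dnbr_vals, k=4)
print(centroids)
print(classes)
```

Because labels are remapped by centroid rank, larger Delta NBR values always receive equal or higher class indices, which makes the clusters directly interpretable as ordered severity classes.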

