Multimodal Wildland Fire Smoke Detection (Papers Track)

Mai Nguyen (University of California San Diego); Shreyas Anantha Ramaprasad (University of California San Diego); Jaspreet Kaur Bhamra (University of California San Diego); Siddhant Baldota (University of California San Diego); Garrison Cottrell (UC San Diego)

NeurIPS 2022 Poster
Topics: Disaster Management and Relief; Computer Vision & Remote Sensing


Research has shown that climate change creates warmer temperatures and drier conditions, leading to longer wildfire seasons and increased wildfire risks in the United States. These factors have in turn led to increases in the frequency, extent, and severity of wildfires in recent years. Given the danger posed by wildland fires to people, property, wildlife, and the environment, there is an urgency to provide tools for effective wildfire management. Early detection of wildfires is essential to minimizing potentially catastrophic destruction. In this paper, we present our work on integrating multiple data sources in SmokeyNet, a deep learning model using spatio-temporal information to detect smoke from wildland fires. Camera image data is integrated with weather sensor measurements and processed by SmokeyNet to create a multimodal wildland fire smoke detection system. Our results show that incorporating multimodal data in SmokeyNet improves performance in terms of both F1 and time-to-detection over the baseline with a single data source. With a time-to-detection of only a few minutes, SmokeyNet can serve as an automated early notification system, providing a useful tool in the fight against destructive wildfires.
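The abstract describes fusing camera imagery with weather sensor measurements for smoke classification. A minimal sketch of one common way to do this, late fusion by feature concatenation, is shown below. The feature dimensions, variable names, and the untrained linear classifier are illustrative assumptions for exposition only, not the authors' actual SmokeyNet architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse_and_classify(image_feat, weather_feat, w, b):
    """Concatenate image and weather features (late fusion) and
    apply a linear layer with a sigmoid to get a smoke probability.
    Shapes and weights here are placeholders, not the real model."""
    fused = np.concatenate([image_feat, weather_feat])
    logit = fused @ w + b
    return 1.0 / (1.0 + np.exp(-logit))

# Hypothetical inputs: a 512-d image embedding (e.g. from a CNN or
# transformer backbone) and a 4-d weather vector (temperature,
# relative humidity, wind speed, wind direction) -- made-up values.
image_feat = rng.standard_normal(512)
weather_feat = np.array([28.0, 0.15, 5.2, 180.0])

w = rng.standard_normal(516) * 0.01  # untrained weights, shapes only
b = 0.0
p_smoke = fuse_and_classify(image_feat, weather_feat, w, b)
print(f"predicted smoke probability: {p_smoke:.3f}")
```

In practice the fused vector would feed further trained layers, and the weather features would be normalized before concatenation; this sketch only illustrates how the two modalities can be combined into a single input.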
