NeurIPS 2019 Workshop: Tackling Climate Change with Machine Learning



About the Workshop

Many in the ML community wish to take action on climate change, yet feel their skills are inapplicable. This workshop aims to show that in fact the opposite is true: while no silver bullet, ML can be an invaluable tool both in reducing greenhouse gas emissions and in helping society adapt to the effects of climate change. Climate change is a complex problem, for which action takes many forms, from designing smart electrical grids to tracking deforestation in satellite imagery. Many of these actions represent high-impact opportunities for real-world change, as well as interesting problems for ML research.

Invited Speakers

Jeff Dean (Google AI)
Carla Gomes (Cornell)
Felix Creutzig (MCC Berlin, TU Berlin)
Lester Mackey (Microsoft Research, Stanford)

Organizers

David Rolnick (UPenn)
Priya Donti (CMU)
Lynn Kaack (ETH Zürich)
Alexandre Lacoste (Element AI)
Tegan Maharaj (Mila)
Andrew Ng (Stanford)
John Platt (Google AI)
Jennifer Chayes (Microsoft Research)
Yoshua Bengio (Mila)

Accepted Works

Works were submitted to one of two tracks: Papers or Proposals.


Title Authors
(1) Warm-Starting AC Optimal Power Flow with Graph Neural Networks Frederik Diehl (fortiss)
(2) Quantifying Urban Canopy Cover with Deep Convolutional Neural Networks Bill Cai (Massachusetts Institute of Technology); Xiaojiang Li (Temple University); Carlo Ratti (Massachusetts Institute of Technology )
(3) Using LSTMs for climate change assessment studies on droughts and floods Frederik Kratzert (LIT AI Lab, Institute for Machine Learning, Johannes Kepler University Linz, Austria); Daniel Klotz (LIT AI Lab, Institute for Machine Learning, Johannes Kepler University Linz, Austria); Johannes Brandstetter (LIT AI Lab, Institute for Machine Learning, Johannes Kepler University Linz, Austria); Pieter-Jan Hoedt (Johannes Kepler University Linz); Grey Nearing (Department of Geological Sciences, University of Alabama, Tuscaloosa, AL United States); Sepp Hochreiter (LIT AI Lab, Institute for Machine Learning, Johannes Kepler University Linz, Austria)
(4) Learning to Focus and Track Hurricanes Sookyung Kim (Lawrence Livermore National Laboratory); Sunghyun Park (Korea University); Sunghyo Chung (Kakao Corp.); Joonseok Lee (Google Research); Jaegul Choo (Korea University); Mr Prabhat (Lawrence Berkeley National Laboratory); Yunsung Lee (Korea University)
(5) DeepWind: Weakly Supervised Localization of Wind Turbines in Satellite Imagery Sharon Zhou (Stanford University); Jeremy Irvin (Stanford); Zhecheng Wang (Stanford University); Ram Rajagopal (Stanford University); Andrew Ng (Stanford U.); Eva Zhang (Stanford University); Will Deaderick (Stanford University); Jabs Aljubran (Stanford University)
(6) Streamflow Prediction with Limited Spatially-Distributed Input Data Martin Gauch (University of Waterloo); Juliane Mai (University of Waterloo); Shervan Gharari (University of Saskatchewan); Jimmy Lin (University of Waterloo)
(7) Establishing an Evaluation Metric to Quantify Climate Change Image Realism Sharon Zhou (Stanford University); Sasha Luccioni (Mila); Gautier Cosne (Mila); Michael Bernstein (Stanford University); Yoshua Bengio (Mila)
(8) Energy Usage Reports: Environmental awareness as part of algorithmic accountability Kadan Lottick (Haverford College); Silvia Susai (Haverford College); Sorelle Friedler (Haverford College); Jonathan Wilson (Haverford College)
(9) Natural Language Generation for Operations and Maintenance in Wind Turbines Joyjit Chatterjee (University of Hull); Nina Dethlefs (University of Hull)
(10) Make Thunderbolts Less Frightening — Predicting Extreme Weather Using Deep Learning Christian Schön (Saarland Informatics Campus); Jens Dittrich (Saarland University)
(11) Cumulo: A Dataset for Learning Cloud Classes (Best Paper Award) Valentina Zantedeschi (Jean Monnet University); Fabrizio Falasca (Georgia Institute of Technology); Alyson Douglas (University of Wisconsin Madison); Richard Strange (University of Oxford); Matt Kusner (University College London); Duncan Watson-Parris (University of Oxford)
(12) Targeting Buildings for Energy Retrofit Using Recurrent Neural Networks with Multivariate Time Series Gaby Baasch (University of Victoria)
(13) Coupling Oceanic Observation Systems to Study Mesoscale Ocean Dynamics Gautier Cosne (Mila); Pierre Tandeo (IMT-Atlantique); Guillaume Maze (Ifremer)
(14) Background noise trends and the detection of calving events in a glacial fjord Dara Farrell (Graduate of University of Washington)
(15) Reducing Inefficiency in Carbon Auctions with Imperfect Competition Kira Goldner (Columbia University); Nicole Immorlica (Microsoft Research); Brendan Lucier (Microsoft Research New England)
(16) Reduction of the Optimal Power Flow Problem through Meta-Optimization Letif Mones (Invenia Labs); Alex Robson (Invenia Labs); Mahdi Jamei (Invenia Labs); Cozmin Ududec (Invenia Labs)
(17) Human-Machine Collaboration for Fast Land Cover Mapping Caleb Robinson (Georgia Institute of Technology); Anthony Ortiz (University of Texas at El Paso); Nikolay Malkin (Yale University); Blake Elias (Microsoft); Andi Peng (Microsoft); Dan Morris (Microsoft); Bistra Dilkina (University of Southern California); Nebojsa Jojic (Microsoft Research)
(18) A User Study of Perceived Carbon Footprint Victor Kristof (EPFL); Valentin Quelquejay-Leclere (EPFL); Robin Zbinden (EPFL); Lucas Maystre (Spotify); Matthias Grossglauser (École Polytechnique Fédérale de Lausanne (EPFL)); Patrick Thiran (EPFL)
(19) Design, Benchmarking and Graphical Lasso based Explainability Analysis of an Energy Game-Theoretic Framework Hari Prasanna Das (UC Berkeley ); Ioannis C. Konstantakopoulos (UC Berkeley); Aummul Baneen Manasawala (UC Berkeley); Tanya Veeravalli (UC Berkeley); Huihan Liu (UC Berkeley ); Costas J. Spanos (University of California at Berkeley)
(20) Predicting ice flow using machine learning Yimeng Min (Mila); Surya Karthik Mukkavilli (Mila); Yoshua Bengio (Mila)
(21) DeepClimGAN: A High-Resolution Climate Data Generator Alexandra Puchko (Western Washington University); Brian Hutchinson (Western Washington University); Robert Link (Joint Global Change Research Institute)
(22) Quantifying the Carbon Emissions of Machine Learning Sasha Luccioni (Mila); Victor Schmidt (Mila); Alexandre Lacoste (Element AI); Thomas Dandres (Polytechnique Montreal)
(23) Measuring Impact of Climate Change on Tree Species: analysis of JSDM on FIA data (Honorable Mention) Hyun Choi (University of Florida); Sergio Marconi (University of Florida); Ali Sadeghian (University of Florida); Ethan White (University of Florida); Daisy Zhe Wang (University of Florida)
(24) A Global Census of Solar Facilities Using Deep Learning and Remote Sensing (Honorable Mention) Lucas Kruitwagen (University of Oxford); Kyle Story (Descartes Labs); Johannes Friedrich (World Resource Institute); Sam Skillman (Descartes Labs); Cameron Hepburn (University of Oxford)
(25) Machine Learning for Precipitation Nowcasting from Radar Images Shreya Agrawal (Google); Luke Barrington (Google); Carla Bromberg (Google); John Burge (Google); Cenk Gazen (Google); Jason Hickey (Google)
(26) Enhancing Stratospheric Weather Analyses and Forecasts by Deploying Sensors from a Weather Balloon Kiwan Maeng (Carnegie Mellon University); Iskender Kushan (Microsoft); Brandon Lucia (Carnegie Mellon University); Ashish Kapoor (Microsoft)
(27) Automatic data cleaning via tensor factorization for large urban environmental sensor networks Yue Hu (Vanderbilt University); Yanbing Wang (Vanderbilt University); Canwen Jiao (Vanderbilt University); Rajesh Sankaran (Argonne National Lab); Charles Catlett (Argonne National Lab); Daniel Work (Vanderbilt University)
(28) Identify Solar Panels in Low Resolution Satellite Imagery with Siamese Architecture and Cross-Correlation Zhengcheng Wang (Tsinghua University); Zhecheng Wang (Stanford University); Arun Majumdar (Stanford University); Ram Rajagopal (Stanford University)
(29) VideoGasNet: Deep Learning for Natural Gas Methane Leak Classification Using An Infrared Camera Jingfan Wang (Stanford University)
(30) Detecting Avalanche Deposits using Variational Autoencoder on Sentinel-1 Satellite Imagery Saumya Sinha (University of Colorado, Boulder); Sophie Giffard-Roisin (University of Colorado Boulder); Fatima Karbou (Meteo France); Michael Deschatres (Irstea); Nicolas Eckert (Irstea); Anna Karas (Meteo France); Cécile Coléou (Meteo France); Claire Monteleoni (University of Colorado Boulder)
(31) Fine-Grained Distribution Grid Mapping Using Street View Imagery Qinghu Tang (Tsinghua University); Zhecheng Wang (Stanford University); Arun Majumdar (Stanford University); Ram Rajagopal (Stanford University)
(32) Bayesian optimization with theory-based constraints accelerates search for stable photovoltaic perovskite materials Armi Tiihonen (Massachusetts Institute of Technology)
(33) Increasing performance of electric vehicles in ride-hailing services using deep reinforcement learning Jon Donadee (LLNL); Jacob Pettit (LLNL); Ruben Glatt (LLNL); Brenden Petersen (Lawrence Livermore National Laboratory)
(34) Stripping off the implementation complexity of physics-based model predictive control for buildings via deep learning Jan Drgona (Pacific Northwest National Laboratory); Lieve Helsen (KU Leuven); Draguna Vrabie (PNNL)
(35) Machine learning identifies the most valuable synthesis conditions for next-generation photovoltaics (Best Paper Award) Felipe Oviedo (MIT); Zekun Ren (MIT)
(36) Helping Reduce Environmental Impact of Aviation with Machine Learning (Best Paper Award) Ashish Kapoor (Microsoft)
(37) Machine Learning for Generalizable Prediction of Flood Susceptibility Dylan Fitzpatrick (Carnegie Mellon University); Chelsea Sidrane (Stanford University); Andrew Annex (Johns Hopkins University); Diane O'Donoghue (kx); Piotr Bilinski (University of Warsaw)
(38) A Deep Learning-based Framework for the Detection of Schools of Herring in Echograms Alireza Rezvanifar (University of Victoria); Tunai Porto Marques (University of Victoria ); Melissa Cote (University of Victoria); Alexandra Branzan Albu (University of Victoria); Alex Slonimer (ASL Environmental Sciences); Thomas Tolhurst (ASL Environmental Sciences ); Kaan Ersahin (ASL Environmental Sciences ); Todd Mudge (ASL Environmental Sciences ); Stephane Gauthier (Fisheries and Oceans Canada)
(39) Emulating Numeric Hydroclimate Models with Physics-Informed cGANs (Honorable Mention) Ashray Manepalli (terrafuse); Adrian Albert (terrafuse, inc.); Alan Rhoades (Lawrence Berkeley National Lab); Daniel Feldman (Lawrence Berkeley National Lab)
(40) Forecasting El Niño with Convolutional and Recurrent Neural Networks Ankur Mahesh (ClimateAi); Maximilian Evans (ClimateAi); Garima Jain (ClimateAi); Mattias Castillo (ClimateAi); Aranildo Lima (ClimateAi); Brent Lunghino (ClimateAi); Himanshu Gupta (ClimateAi); Carlos Gaitan (ClimateAi); Jarrett Hunt (ClimateAi); Omeed Tavasoli (ClimateAi); Patrick Brown (ClimateAi, San Jose State University); V. Balaji (Geophysical Fluid Dynamics Laboratory)


Title Authors
(41) Deep learning predictions of sand dune migration Kelly Kochanski (University of Colorado Boulder); Divya Mohan (University of California Berkeley); Jenna Horrall (James Madison University); Ghaleb Abdulla (Lawrence Livermore National Laboratory)
(42) Predictive Inference of a Wildfire Risk Pipeline in the United States Shamindra Shrotriya (Carnegie Mellon University); Niccolo Dalmasso (Carnegie Mellon University); Alex Reinhart (Carnegie Mellon University)
(43) FutureArctic - beyond Computational Ecology Steven Latre (UAntwerpen); Dimitri Papadimitriou (UAntwerpen); Ivan Janssens (UAntwerpen); Eric Struyf (UAntwerpen); Erik Verbruggen (UAntwerpen); Ivika Ostonen (UT); Josep Penuelas (UAB); Boris Rewald (RootEcology); Andreas Richter (University of Vienna); Michael Bahn (University of Innsbruck)
(44) Machine Learning-based Estimation of Forest Carbon Stocks to increase Transparency of Forest Preservation Efforts Björn Lütjens (MIT); Lucas Liebenwein (Massachusetts Institute of Technology); Katharina Kramer (Massachusetts Institute of Technology)
(45) DeepRI: End-to-end Prediction of Tropical Cyclone Rapid Intensification from Climate Data Renzhi Jing (Princeton University); Ning Lin (Princeton University); Yinda Zhang (Google LLC)
(46) Autonomous Sensing and Scientific Machine Learning for Monitoring Greenhouse Gas Emissions Genevieve Flaspohler (MIT); Victoria Preston (MIT); Nicholas Roy (MIT); John Fisher (MIT); Adam Soule (Woods Hole Oceanographic Institution); Anna Michel (Woods Hole Oceanographic Institution)
(47) Optimizing trees for carbon sequestration Jeremy Freeman
(48) Toward Resilient Cities: Using Deep Learning to Downscale Climate Model Projections Muge Komurcu (MIT); Zikri Bayraktar (IEEE)
(49) Towards self-adaptive building energy control in smart grids Juan Gómez-Romero (Universidad de Granada); Miguel Molina-Solana (Imperial College London)
(50) Predicting Arctic Methane Seeps via Satellite Imagery Olya (Olga) Irzak (Frost Methane Labs); Amber Leigh Thomas (Stanford); Stephanie Schneider (Stanford); Catalin Voss (Stanford University)
(51) GeoLabels: Towards Efficient Ecosystem Monitoring using Data Programming on Geospatial Information David Dao (ETH); Johannes Rausch (ETH Zurich); Ce Zhang (ETH)
(52) A deep learning approach for classifying black carbon aerosol morphology Kara Lamb (Cooperative Institute for Research in the Environmental Sciences)

Program Committee

Andrew Ross (Harvard)
Aneesh Rangnekar (RIT)
Ashesh Chattopadhyay (Rice)
Ashley Pilipiszyn (Stanford)
Bolong Cheng (SigOpt)
Christian Schroeder (Oxford)
Clement Duhart (MIT)
Dali Wang (Oak Ridge National Lab)
David Dao (ETH)
Di Wu (McGill)
Dimitrios Giannakis (Courant Institute, NYU)
Duncan Watson-Parris (Oxford)
Evan Sherwin (Stanford)
Femke van Geffen (FU Berlin)
Gege Wen (Stanford)
George Chen (CMU)
Greg Schivley (Carbon Impact Consulting)
Han Zou (UC Berkeley)
Hari Prasanna Das (UC Berkeley)
Hillary Scannell (University of Washington)
Joanna Slawinska (University of Wisconsin-Milwaukee)
Johan Mathe (Frog Labs)
Jonathan Binas (Mila, Montreal)
Jussi Gillberg (Aalto University)
Kalai Ramea (PARC)
Karthik Kashinath (Lawrence Berkeley National Lab)
Kate Duffy (Northeastern)
Kelly Kochanski (CU Boulder)
Kevin McCloskey (Google)
Kris Sankaran (Mila)
Lea Boche (EPRI)
Loubna Benabbou (Mohammadia School of Engineering, Mohammed V University)
Mahdi Jamei (Invenia Labs)
Max Callaghan (MCC Berlin)
Mayur Mudigonda (UC Berkeley)
Melrose Roderick (CMU)
Mohammad Mahdi Kamani (Penn State)
Natasha Jaques (MIT)
Neel Guha (CMU)
Niccolo Dalmasso (CMU)
Nikola Milojevic-Dupont (MCC Berlin)
Pedram Hassanzadeh (Rice)
Robin Dunn (CMU)
Sajad Haghanifar (University of Pittsburgh)
Sanam Mirzazad (EPRI)
Sandeep Manjanna (McGill)
Sasha Luccioni (Mila)
Sharon Zhou (Stanford)
Shubhankar Deshpande (CMU)
Sookyung Kim (Lawrence Livermore National Lab)
Soukayna Mouatadid (University of Toronto)
Surya Karthik Mukkavilli (Mila)
Telmo Felgueira (IST)
Thomas Hornigold (Oxford)
Tianle Yuan (NASA)
Tom Beucler (Columbia & UCI)
Vikram Voleti (Mila, Montreal)
Volodymyr Kuleshov (Stanford)
Yang Song (Oak Ridge National Lab)
Ydo Wexler (Amperon)
Zhecheng Wang (Stanford)
Zhuangfang Yi (Development Seed)

About NeurIPS

NeurIPS (formerly written “NIPS”) is one of the premier conferences on machine learning, and includes a wide audience of researchers and practitioners in academia, industry, and related fields. It is possible to attend the workshop without either presenting at or attending the main NeurIPS conference.

Call for Submissions

We invite submissions of short papers using machine learning to address problems in climate mitigation, adaptation, or modeling.

All machine learning techniques are welcome, from kernel methods to deep learning. Each submission should make clear why the application has (or could have) positive impacts regarding climate change. We highly encourage submissions which make their data publicly available. Accepted submissions will be invited to give poster presentations, of which some will be selected for spotlight talks.

The workshop will not publish proceedings, and submissions are non-archival. Submission to this workshop does not preclude future publication. Previously published work may be submitted under certain circumstances (see the FAQ).

All submissions must be through the submission website. Submissions will be reviewed double-blind; do your best to anonymize your submission, and do not include identifying information for authors in the PDF. We encourage, but do not require, use of the NeurIPS style template (please do not use the “Accepted” format as it will deanonymize your submission).

We will be awarding $30K in cloud computing credits, sponsored by Microsoft AI for Earth, as prizes for top submissions. Winners will be announced at the workshop.

Please see the Tips for Submissions and FAQ, and contact the organizers with questions.

Submission tracks

There are two tracks for submissions. Submissions are limited to 3 pages for the Papers track, and 2 pages for the Proposals track, in PDF format (see examples here). References do not count towards this total. Supplementary appendices are allowed but will be read at the discretion of the reviewers. All submissions must explain why the proposed work has (or could have) positive impacts regarding climate change.

PAPERS track

Work that is in progress, published, and/or deployed

Submissions for the Papers track should describe projects relevant to climate change that involve machine learning. These may include (but are not limited to) academic research; deployed results from startups, industry, public institutions, etc.; and climate-relevant datasets.

Submissions should provide experimental or theoretical validation of the method presented, as well as specifying what gap the method fills. Algorithms need not be novel from a machine learning perspective if they are applied in a novel setting. Details of methodology need not be revealed if they are proprietary, though transparency is highly encouraged.

Submissions creating novel datasets are welcomed. Datasets should be designed to permit machine learning research (e.g. formatted with clear benchmarks for evaluation). In this case, baseline experimental results on the dataset are preferred, but not required.


PROPOSALS track

Detailed descriptions of ideas for future work

Submissions for the Proposals track should describe detailed ideas for how machine learning can be used to solve climate-relevant problems. While less constrained than the Papers track, Proposals will be subject to a very high standard of review. No results need to be demonstrated, but ideas should be justified as extensively as possible, including motivation for why the problem being solved is important in tackling climate change, discussion of why current methods are inadequate, and explanation of the proposed method.

Tips for submissions

Travel Grants

We are excited to announce limited travel grants, sponsored by Microsoft Research. Travel grant applications are due October 3.

We also encourage workshop participants to apply for NeurIPS 2019 travel grants and other grants (e.g. Google Conference and Travel Scholarships) for which they may be eligible. If you are aware of additional scholarships that may be relevant to workshop attendees, please contact the workshop organizers so we can make this information available.

Frequently Asked Questions

Q: How can I keep up to date on this kind of stuff? A: Sign up for our mailing list!

Q: I’m not in machine learning. Can I still submit? A: Yes, absolutely! We welcome submissions from many fields. Do bear in mind, however, that the majority of attendees of the workshop will have a machine learning background; therefore, other fields should be introduced sufficiently to provide context for the work.

Q: What if my submission is accepted but I can’t attend the workshop? A: You may ask someone else to present your work in your stead.

Q: Do I need to use LaTeX or the NeurIPS style files? A: No, although we encourage it.

Q: It’s hard for me to fit my submission on 2 or 3 pages. What should I do? A: Feel free to include appendices with additional material (these should be part of the same PDF file as the main submission). Do not, however, put essential material in an appendix, as it will be read at the discretion of the reviewers.

Q: What do I do if I need an earlier decision for visa reasons? A: Contact us, explain your situation and the date by which you require a decision, and we will do our best to be accommodating.

Q: Can I send submissions directly by email? A: No, please use the CMT website to make submissions.

Q: The submission website is asking for my name. Is this a problem for anonymization? A: You should fill out your name and other info when asked on the submission website; CMT will keep your submission anonymous to reviewers.

Q: Do submissions for the Proposals track need to have experimental validation? A: No, although some initial experiments or citation of published results would strengthen your submission.

Q: The submission website never sent me a confirmation email. Is this a problem? A: No, the CMT system does not send automatic confirmation emails after a submission, though the submission should show up on the CMT page once submitted. If in any doubt regarding the submission process, please contact the organizers. Also please avoid making multiple submissions of the same article to CMT.

Q: Can I submit previously published work to this workshop? A: If it was previously published in a non-ML venue, YES! If it was previously published in an ML venue, NO! If you are unsure, contact the organizers. This policy is as per the NeurIPS workshop guidelines: “Workshops are not a venue for work that has been previously published in other conferences on machine learning or related fields. Work that is presented at the main NeurIPS conference should not appear in a workshop, including as part of an invited talk… (Presenting work that has been published in other fields is, however, encouraged!)”

Q: Can I submit work to this workshop if I am also submitting to another NeurIPS 2019 workshop? A: Yes. We cannot, however, guarantee that you will not be expected to present the material at a time that conflicts with the other workshop.