ICML 2019 Workshop
Climate Change: How Can AI Help?

Many in the ML community wish to take action on climate change, yet feel their skills are inapplicable. This workshop aims to show that in fact the opposite is true: while no silver bullet, ML can be an invaluable tool both in reducing greenhouse gas emissions and in helping society adapt to the effects of climate change. Climate change is a complex problem, for which action takes many forms - from designing smart electrical grids to tracking deforestation in satellite imagery. Many of these actions represent high-impact opportunities for real-world change, as well as being interesting problems for ML research.

Recordings of the workshop are linked in the schedule.

About the workshop

Speakers

Schedule

8:30 - 8:45 - Welcome and Opening Remarks
8:45 - 9:20 - John Platt (Google AI): AI for Climate Change: the Context (Keynote talk)
9:20 - 9:45 - Jack Kelly (Open Climate Fix): Why it’s hard to mitigate climate change, and how to do better (Invited talk)
9:45 - 10:10 - Andrew Ng (Stanford): Tackling climate change challenges with AI through collaboration (Invited talk)
10:10 - 10:20 - Volodymyr Kuleshov: Towards a Sustainable Food Supply Chain Powered by Artificial Intelligence (Spotlight talk)
10:20 - 10:30 - Clement Duhart: Deep Learning for Wildlife Conservation and Restoration Efforts (Spotlight talk)
10:30 - 11:00 - Coffee break + Poster Session
11:00 - 12:00 - Chad Frischmann (Project Drawdown): Achieving Drawdown (Keynote talk)
12:00 - 1:30 - Networking lunch (provided) + Poster Session
1:30 - 1:55 - Yoshua Bengio (Mila): Personalized Visualization of the Impact of Climate Change (Invited talk)
1:55 - 2:30 - Claire Monteleoni (CU Boulder): Advances in Climate Informatics: Machine Learning for the Study of Climate Change (Invited talk)
2:30 - 2:40 - Duncan Watson-Parris: Detecting anthropogenic cloud perturbations with deep learning (Spotlight talk)
2:40 - 2:50 - Chaopeng Shen: Evaluating aleatoric and epistemic uncertainties of time series deep learning models for soil moisture predictions (Spotlight talk)
2:50 - 3:00 - Mohammad Mahdi Kamani: Targeted Meta-Learning for Critical Incident Detection in Weather Data (Spotlight talk)
3:00 - 3:30 - Coffee break + Poster Session
3:30 - 3:45 - Karthik Mukkavilli (Mila): Geoscience data and models for the Climate Change AI community (Invited talk)
3:45 - 4:20 - Sims Witherspoon (DeepMind): ML vs. Climate Change, Applications in Energy at DeepMind (Invited talk)
4:20 - 4:30 - Lynn Kaack: Truck Traffic Monitoring with Satellite Images (Spotlight talk)
4:30 - 4:40 - Neel Guha: Machine Learning for AC Optimal Power Flow (Spotlight talk)
4:40 - 4:50 - Christian Clough, Gopal Erinjippurath: Planetary Scale Monitoring of Urban Growth in High Flood Risk Areas (Spotlight talk)
4:50 - 5:15 - “Ideas” mini-spotlights
5:15 - 6:00 - Yoshua Bengio, Andrew Ng, Sims Witherspoon, John Platt, Claire Monteleoni: Panel discussion

Organizers

David Rolnick (UPenn)
Alexandre Lacoste (ElementAI)
Tegan Maharaj (MILA)
Jennifer Chayes (Microsoft)
Yoshua Bengio (MILA)

Karthik Mukkavilli (MILA)
Narmada Balasooriya (ConscientAI)
Di Wu (MILA)
Priya Donti (CMU)
Lynn Kaack (ETH Zürich)
Manvitha Ponnapati (MIT)

Sponsor: Element AI


Works are submitted to one of three tracks: Research, Deployed, or Ideas.

Research Track

(1) Policy Search with Non-uniform State Representations for Environmental Sampling pdf

Sandeep Manjanna (McGill University); Herke van Hoof (University of Amsterdam); Gregory Dudek (McGill University)

Abstract: (click to expand) Surveying fragile ecosystems like coral reefs is important to monitor the effects of climate change. We present an adaptive sampling technique that generates efficient trajectories covering hotspots in the region of interest at a high rate. A key feature of our sampling algorithm is the ability to generate action plans for any new hotspot distribution using the parameters learned on other similar looking distributions.

(2) Modelling GxE with historical weather information improves genomic prediction in new environments

Jussi Gillberg (Aalto University); Pekka Marttinen (Aalto University); Hiroshi Mamitsuka (Kyoto University); Samuel Kaski (Aalto University)

Abstract: (click to expand) Interaction between the genotype and the environment (G×E) has a strong impact on the yield of major crop plants. Recently G×E has been predicted from environmental and genomic covariates, but existing works have not considered generalization to new environments and years without access to in-season data. We study in silico the viability of G×E prediction under realistic constraints. We show that the environmental response of a new generation of untested barley cultivars can be predicted in new locations and years using genomic data, machine learning and historical weather observations. Our results highlight the need for models of G×E: non-linear effects clearly dominate linear ones, and the interaction between soil type and daily rain is identified as the main driver of G×E. Our study implies that genomic selection can be used to capture the yield potential in G×E effects for future growing seasons, providing a possible means to achieve yield improvements. G×E models are also needed to select for varieties that react favourably to changing climate conditions. For this purpose, the historical weather observations could be replaced by climate simulations to study the yield potential under various climate scenarios. This abstract summarizes the findings of a recently published article.

(3) Machine Learning empowered Occupancy Sensing for Smart Buildings pdf

Han Zou (UC Berkeley); Hari Prasanna Das (UC Berkeley ); Jianfei Yang (Nanyang Technological University); Yuxun Zhou (UC Berkeley); Costas Spanos (UC Berkeley)

Abstract: (click to expand) Over half of the global electricity consumption is attributed to buildings, which are often operated poorly from an energy perspective. Significant improvements in energy efficiency can be achieved via intelligent building control techniques. To realize such advanced control schemes, accurate and robust occupancy information is highly valuable. In this work, we present a cutting-edge WiFi sensing platform and state-of-the-art machine learning methods to address longstanding occupancy sensing challenges in smart buildings. Our systematic solution provides comprehensive fine-grained occupancy information in a non-intrusive and privacy-preserving manner, which facilitates eco-friendly and sustainable buildings.

(4) Focus and track: pixel-wise spatio-temporal hurricane tracking pdf

Sookyung Kim (Lawrence Livermore National Laboratory); Sunghyun Park (Korea University); Sunghyo Chung (Korea University); Yunsung Lee (Korea University); Hyojin Kim (LLNL); Joonseok Lee (Google Research); Jaegul Choo (Korea University); Mr Prabhat (Lawrence Berkeley National Laboratory)

Abstract: (click to expand) We tackle the extreme climate event tracking problem, which poses unique challenges compared to other visual object tracking problems, including a wider range of spatio-temporal dynamics, blurred target boundaries, and a shortage of labeled data. In this paper, we propose a simple but robust end-to-end model based on multi-layered ConvLSTM, suitable for the climate event tracking problem. It first learns to imprint the location and appearance of the target at the first frame in a one-shot auto-encoding fashion, and the learned feature is then consumed by the tracking module to track the target in subsequent time frames. To tackle the data shortage problem, we propose data augmentation based on Social GAN. Extensive experiments show that the proposed framework significantly improves tracking performance on the hurricane tracking task over several state-of-the-art methods.

(5) Recovering the parameters underlying the Lorenz-96 chaotic dynamics pdf

Soukayna Mouatadid (University of Toronto); Pierre Gentine (Columbia University); Wei Yu (University of Toronto); Steve Easterbrook (University of Toronto)

Abstract: (click to expand) Climate projections suffer from uncertain equilibrium climate sensitivity. The reason behind this uncertainty is the resolution of global climate models, which is too coarse to resolve key processes such as clouds and convection. These processes are approximated using heuristics in a process called parameterization. The selection of these parameters can be subjective, leading to significant uncertainties in the way clouds are represented in global climate models. Here, we explore three deep network algorithms to infer these parameters in an objective and data-driven way. We compare the performance of a fully-connected network, a one-dimensional convolutional network, and a two-dimensional convolutional network in recovering the underlying parameters of the Lorenz-96 model, a non-linear dynamical system that behaves similarly to the climate system.
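
As a rough, illustrative sketch of this kind of parameter-recovery setup (not the authors' code), the Python snippet below simulates the standard K-variable Lorenz-96 system for different forcing values F and trains a small fully-connected network to regress F from windows of the simulated state; the integrator, window length, and network sizes are arbitrary assumptions.

# Hypothetical sketch: recover the Lorenz-96 forcing parameter F from simulated
# trajectories with a small fully-connected regressor (not the paper's code).
import numpy as np
import torch
import torch.nn as nn

def lorenz96_step(x, F, dt=0.01):
    # One explicit Euler step of the standard K-variable Lorenz-96 system.
    d = (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    return x + dt * d

def simulate(F, K=40, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    x = F + 0.01 * rng.standard_normal(K)   # small perturbation around F
    traj = []
    for _ in range(steps):
        x = lorenz96_step(x, F)
        traj.append(x.copy())
    return np.stack(traj)                    # (steps, K)

# Dataset of (trajectory window, forcing) pairs.
Fs = np.random.uniform(4.0, 12.0, size=256)
X = np.stack([simulate(F, seed=i)[-50:].ravel() for i, F in enumerate(Fs)])
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(Fs, dtype=torch.float32).unsqueeze(1)

model = nn.Sequential(nn.Linear(X.shape[1], 128), nn.ReLU(), nn.Linear(128, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
print("train MSE on F:", loss.item())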

(6) Using Bayesian Optimization to Improve Solar Panel Performance by Developing Antireflective, Superomniphobic Glass

Sajad Haghanifar (University of Pittsburgh); Bolong Cheng (SigOpt); Mike Mccourt (SigOpt); Paul Leu (University of Pittsburgh)

Abstract: (click to expand) Photovoltaic solar panel efficiency is dependent on photons transmitting through the glass sheet covering and into the crystalline silicon solar cells within. However, complications such as soiling and light reflection degrade performance. Our goal is to identify a fabrication process to produce glass which promotes photon transmission and is superomniphobic (repels fluids), for easier cleaning. In this paper, we propose adapting Bayesian optimization to efficiently search the space of possible glass fabrication strategies; in this search we balance three competing objectives (transmittance, haze and oil contact angle). We present the glass generated from this Bayesian optimization strategy and detail its properties relevant to photovoltaic solar power.
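
For readers unfamiliar with the approach, the sketch below illustrates the general pattern of Bayesian optimization over fabrication parameters with a scalarized objective combining transmittance, haze, and contact angle. The parameter names, the stand-in "experiment", the weights, and the use of a scikit-learn Gaussian process with an upper-confidence-bound acquisition are all illustrative assumptions, not the SigOpt setup used in the paper.

# Illustrative sketch (not the authors' setup): Bayesian optimization of two
# hypothetical fabrication parameters against a scalarized objective.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def fabricate_and_measure(params):
    # Stand-in for a real fabrication + measurement experiment.
    etch_time, pitch = params
    transmittance = 0.95 - 0.1 * (pitch - 0.5) ** 2
    haze = 0.05 + 0.2 * etch_time ** 2
    contact_angle = 150.0 * etch_time / (1.0 + etch_time)
    # Scalarize: reward transmittance and contact angle, penalize haze.
    return transmittance - haze + 0.002 * contact_angle

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(5, 2))            # initial random designs
y = np.array([fabricate_and_measure(x) for x in X])

for it in range(20):
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(500, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    ucb = mu + 1.5 * sigma                        # upper-confidence-bound acquisition
    x_next = cand[np.argmax(ucb)]
    X = np.vstack([X, x_next])
    y = np.append(y, fabricate_and_measure(x_next))

print("best scalarized score:", y.max(), "at params", X[np.argmax(y)])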

(7) A quantum mechanical approach for data assimilation in climate dynamics pdf

Dimitrios Giannakis (Courant Institute of Mathematical Sciences, New York University); Joanna Slawinska (University of Wisconsin-Milwaukee); Abbas Ourmazd (University of Wisconsin-Milwaukee)

Abstract: (click to expand) A framework for data assimilation in climate dynamics is presented, combining aspects of quantum mechanics, Koopman operator theory, and kernel methods for machine learning. This approach adapts the Dirac-von Neumann formalism of quantum dynamics and measurement to perform data assimilation (filtering) of climate dynamics, using the Koopman operator governing the evolution of observables as an analog of the Heisenberg operator in quantum mechanics, and a quantum mechanical density operator to represent the data assimilation state. The framework is implemented in a fully empirical, data-driven manner, using kernel methods for machine learning to represent the evolution and measurement operators via matrices in a basis learned from time-ordered observations. Applications to data assimilation of the Nino 3.4 index for the El Nino Southern Oscillation (ENSO) in a comprehensive climate model show promising results.

(8) Data-driven Chance Constrained Programming based Electric Vehicle Penetration Analysis pdf

Di Wu (McGill); Tracy Cui (Google NYC); Doina Precup (McGill University); Benoit Boulet (McGill)

Abstract: (click to expand) Transportation electrification has been growing rapidly in recent years. The adoption of electric vehicles (EVs) could help to reduce the dependency on oil and cut greenhouse gas emissions. However, increasing EV adoption will also impose a high demand on the power grid and may jeopardize grid network infrastructure. In certain high-EV-penetration areas, EV charging demand may lead to transformer overloading at peak hours, which makes maximal EV penetration analysis an urgent problem to solve. This paper proposes a data-driven chance constrained programming based framework for maximal EV penetration analysis. Simulation results are presented for a real-world neighborhood-level network. The proposed framework could serve as guidance for utility companies to schedule infrastructure upgrades.
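
To make the chance-constrained idea concrete, a toy reading is: find the largest EV penetration p such that the probability of transformer overload stays below a risk level epsilon. The Monte Carlo bisection below uses made-up load distributions and is only a sketch of that constraint, not the paper's data-driven formulation.

# Toy illustration of a chance constraint on transformer loading:
# max p such that P(base_load + p * EV_demand > capacity) <= epsilon.
import numpy as np

rng = np.random.default_rng(1)
capacity = 500.0            # kVA, assumed transformer rating
epsilon = 0.05              # allowed overload probability
n_scenarios = 20000

base_load = rng.normal(300.0, 40.0, n_scenarios)         # non-EV peak load (kW)
ev_demand_per_unit = rng.gamma(2.0, 30.0, n_scenarios)   # EV demand per unit penetration

def overload_prob(p):
    return np.mean(base_load + p * ev_demand_per_unit > capacity)

# Bisection on the penetration level (overload probability is monotone in p).
lo, hi = 0.0, 5.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    if overload_prob(mid) <= epsilon:
        lo = mid
    else:
        hi = mid
print(f"max penetration ~ {lo:.3f}, overload prob {overload_prob(lo):.3f}")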

(9) (Spotlight: 4:30PM) Machine Learning for AC Optimal Power Flow pdf Honorable Mention

Neel Guha (Carnegie Mellon University); Zhecheng Wang (Stanford University); Arun Majumdar (Stanford University)

Abstract: (click to expand) We explore machine learning methods for AC Optimal Power Flow (ACOPF) - the task of optimizing power generation in a transmission network while respecting physical and engineering constraints. We present two formulations of ACOPF as a machine learning problem: 1) an end-to-end prediction task, where we directly predict the optimal generator settings, and 2) a constraint prediction task, where we predict the set of active constraints in the optimal solution. We validate these approaches on two benchmark grids.
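
The constraint-prediction formulation can be read as multi-label classification: given a load profile, predict which inequality constraints are active at the optimum, so that a conventional solver only has to polish a reduced problem. The sketch below, with invented dimensions and synthetic labels, illustrates that framing; it is not the authors' implementation.

# Hedged sketch of the constraint-prediction formulation: a multi-label
# classifier maps a load vector to the set of active constraints (synthetic data).
import torch
import torch.nn as nn

n_buses, n_constraints, n_samples = 30, 100, 1024
loads = torch.rand(n_samples, n_buses)
# Placeholder labels; in practice these come from solved ACOPF instances.
active = (loads @ torch.rand(n_buses, n_constraints) > 9.0).float()

model = nn.Sequential(
    nn.Linear(n_buses, 256), nn.ReLU(),
    nn.Linear(256, n_constraints),        # one logit per constraint
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(100):
    opt.zero_grad()
    loss = loss_fn(model(loads), active)
    loss.backward()
    opt.step()

# At test time, the predicted active set defines a reduced problem that a
# conventional solver can polish into a feasible, optimal solution.
pred_active = torch.sigmoid(model(loads[:1])) > 0.5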

(10) (Spotlight: 2:50PM) Targeted Meta-Learning for Critical Incident Detection in Weather Data pdf

Mohammad Mahdi Kamani (The Pennsylvania State University); Sadegh Farhang (Pennsylvania State University); Mehrdad Mahdavi (Penn State); James Z Wang (The Pennsylvania State University)

Abstract: (click to expand) Due to the imbalanced or heavy-tailed nature of weather- and climate-related datasets, the performance of standard deep learning models deviates significantly from their expected behavior on test data. Classical methods to address these issues are mostly data- or application-dependent, and hence burdensome to tune. Meta-learning approaches, on the other hand, aim to learn hyperparameters in the learning process using different objective functions on training and validation data. However, these methods suffer from high computational complexity and are not scalable to large datasets. In this paper, we apply a novel framework named targeted meta-learning to rectify this issue, and show its efficacy in dealing with the aforementioned biases in datasets. This framework employs a small, well-crafted target dataset that resembles the desired nature of test data in order to guide the learning process in a coupled manner. We empirically show that this framework can overcome the bias issue, common to weather-related datasets, in a bow echo detection case study.

(11) ClimateNet: Bringing the power of Deep Learning to weather and climate sciences via open datasets and architectures

Karthik Kashinath (Lawrence Berkeley National Laboratory); Mayur Mudigonda (UC Berkeley); Kevin Yang (UC Berkeley); Jiayi Chen (UC Berkeley); Annette Greiner (Lawrence Berkeley National Laboratory); Mr Prabhat (Lawrence Berkeley National Laboratory)

Abstract: (click to expand) Pattern recognition tasks such as classification, object detection and segmentation have remained challenging problems in the weather and climate sciences. While there exist many empirical heuristics for detecting weather patterns and extreme events, the disparities between the output of these different methods even for a single event are large and often difficult to reconcile. Given the success of Deep Learning in tackling similar problems in computer vision, we advocate a DL-based approach. However, DL works best in the context of supervised learning, when labeled datasets are readily available. Reliable, labeled training data is scarce in climate science. `ClimateNet' is an effort to solve this problem by creating open, community-sourced expert-labeled datasets that capture information pertaining to class or pattern labels, bounding boxes and segmentation masks. In this paper we present the motivation, design and status of the ClimateNet dataset and associated model architecture.

(12) Improving Subseasonal Forecasting in the Western U.S. with Machine Learning

Paulo Orenstein (Stanford); Jessica Hwang (Stanford); Judah Cohen (AER); Karl Pfeiffer (AER); Lester Mackey (Microsoft Research New England)

Abstract: (click to expand) Water managers in the western United States (U.S.) rely on long-term forecasts of temperature and precipitation to prepare for droughts and other wet weather extremes. To improve the accuracy of these long-term forecasts, the Bureau of Reclamation and the National Oceanic and Atmospheric Administration (NOAA) launched the Subseasonal Climate Forecast Rodeo, a year-long real-time forecasting challenge in which participants aimed to skillfully predict temperature and precipitation in the western U.S. two to four weeks and four to six weeks in advance. We present and evaluate our machine learning approach to the Rodeo and release our SubseasonalRodeo dataset, collected to train and evaluate our forecasting system. Our predictive system is an ensemble of two regression models, and its skill exceeds that of the top Rodeo competitor as well as the government baselines for each target variable and forecast horizon.

(13) Unsupervised Temporal Clustering to Monitor the Performance of Alternative Fueling Infrastructure pdf

Kalai Ramea (PARC)

Abstract: (click to expand) Zero Emission Vehicles (ZEV) play an important role in the decarbonization of the transportation sector. For wider adoption of ZEVs, providing a reliable infrastructure is critical. We present a machine learning approach that uses an unsupervised temporal clustering algorithm along with survey analysis to determine the infrastructure performance and reliability of alternative fuels. We illustrate this approach for hydrogen fueling stations in California, but it can be generalized to other regions and fuels.

(14) A Flexible Pipeline for Prediction of Tropical Cyclone Paths pdf

Niccolo Dalmasso (Carnegie Mellon University); Robin Dunn (Carnegie Mellon University); Benjamin LeRoy (Carnegie Mellon University); Chad Schafer (Carnegie Mellon University)

Abstract: (click to expand) Hurricanes and, more generally, tropical cyclones (TCs) are rare, complex natural phenomena of both scientific and public interest. The importance of understanding TCs in a changing climate has increased as recent TCs have had devastating impacts on human lives and communities. Moreover, good prediction of and insight into the complex nature of TCs can mitigate some of these human and property losses. Though TCs have been studied from many different angles, more work is needed on statistical approaches that provide prediction regions. The current state-of-the-art in TC prediction bands comes from the National Hurricane Center at NOAA, whose proprietary model provides "cones of uncertainty" for TCs through an analysis of historical forecast errors. The contribution of this paper is twofold. We introduce a new pipeline that encourages transparent and adaptable prediction band development by streamlining cyclone track simulation and prediction band generation. We also provide updates to existing models and novel statistical methodologies in both areas of the pipeline.

(15) Mapping land use and land cover changes faster and at scale with deep learning on the cloud pdf

Zhuangfang Yi (Development Seed); Drew Bollinger (Development Seed); Devis Peressutti (Sinergise)

Abstract: (click to expand) Policymakers rely on Land Use and Land Cover (LULC) maps for evaluation and planning. They use these maps to plan climate-smart agriculture policy, improve housing resilience (to earthquakes or other natural disasters), and understand how to grow commerce in small communities. A number of institutions have created global land use maps from historic satellite imagery. However, these maps can be outdated and are often inaccurate, particularly in their representation of developing countries. We worked with the European Space Agency (ESA) to develop a LULC deep learning workflow on the cloud that can ingest Sentinel-2 optical imagery for large-scale LULC change detection. It is an end-to-end workflow that sits on top of two comprehensive tools, SentinelHub and eo-learn, which seamlessly link earth observation data with machine learning libraries. The workflow takes in labeled LULC and associated AOI shapefiles, and sets up a task to fetch cloud-free, time-series imagery stacks within a user-defined time interval. It pairs each satellite imagery tile with its labeled LULC mask for supervised deep learning model training on the cloud. Once a well-performing model is trained, it can be exported as a Tensorflow/Pytorch serving docker image to work with our cloud-based model inference pipeline. The inference pipeline can automatically scale with the number of images to be processed. Changes in land use are heavily influenced by human activities (e.g. agriculture, deforestation, human settlement expansion) and have been a great source of greenhouse gas emissions. Sustainable forest and land management practices vary from region to region, which means having flexible, scalable tools will be critical. With these tools, we can empower analysts, engineers, and decision-makers to see where contributions to climate-smart agricultural, forestry and urban resilience programs can be made.

(16) Achieving Conservation of Energy in Neural Network Emulators for Climate Modeling pdf

Tom G Beucler (Columbia University & UCI); Stephan Rasp (Ludwig-Maximilian University of Munich); Michael Pritchard (UCI); Pierre Gentine (Columbia University)

Abstract: (click to expand) Artificial neural-networks have the potential to emulate cloud processes with higher accuracy than the semi-empirical emulators currently used in climate models. However, neural-network models do not intrinsically conserve energy and mass, which is an obstacle to using them for long-term climate predictions. Here, we propose two methods to enforce linear conservation laws in neural-network emulators of physical models: Constraining (1) the loss function or (2) the architecture of the network itself. Applied to the emulation of explicitly-resolved cloud processes in a prototype multi-scale climate model, we show that architecture constraints can enforce conservation laws to satisfactory numerical precision, while all constraints help the neural-network better generalize to conditions outside of its training set, such as global warming.
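
As a toy version of the first approach (constraining the loss), suppose the conservation laws are linear in the emulator outputs, C y = b. One can penalize the violation of C y = b alongside the regression loss; the architectural approach instead builds the constraint into the final layer. The sketch below shows the penalty variant with an invented constraint matrix and placeholder data, not the authors' emulator.

# Minimal sketch: soft-enforcing a linear conservation law C @ y = b in a
# neural-network emulator via a loss penalty (constraint matrix is made up).
import torch
import torch.nn as nn

n_in, n_out = 64, 32
C = torch.randn(4, n_out)            # 4 hypothetical linear conservation relations
b = torch.zeros(4)

model = nn.Sequential(nn.Linear(n_in, 128), nn.ReLU(), nn.Linear(128, n_out))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(512, n_in)           # placeholder inputs (e.g. coarse-grid state)
y = torch.randn(512, n_out)          # placeholder targets (e.g. subgrid tendencies)

lam = 10.0                           # weight on the conservation penalty
for epoch in range(200):
    opt.zero_grad()
    pred = model(x)
    mse = nn.functional.mse_loss(pred, y)
    violation = ((pred @ C.T - b) ** 2).mean()
    (mse + lam * violation).backward()
    opt.step()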

(17) The Impact of Feature Causality on Normal Behaviour Models for SCADA-based Wind Turbine Fault Detection pdf

Telmo Felgueira (IST)

Abstract: (click to expand) The cost of wind energy can be reduced by using SCADA data to detect faults in wind turbine components. Normal behavior models are one of the main fault detection approaches, but there is a lack of work in how different input features affect the results. In this work, a new taxonomy based on the causal relations between the input features and the target is presented. Based on this taxonomy, the impact of different input feature configurations on the modelling and fault detection performance is evaluated. To this end, a framework that formulates the detection of faults as a classification problem is also presented.

(18) Predicting CO2 Plume Migration using Deep Neural Networks pdf

Gege Wen (Stanford University)

Abstract: (click to expand) Carbon capture and sequestration (CCS) is an essential climate change mitigation technology for achieving the 2°C target. Numerical simulation of CO2 plume migration in the subsurface is a prerequisite to effective CCS projects. However, stochastic high spatial resolution simulations are currently limited by computational resources. We propose a deep neural network approach to predict CO2 plume migration in high dimensional systems with complex geology. Upon training, the network is able to give accurate predictions that are 6 orders of magnitude faster than traditional numerical simulators. This approach can be easily adapted to history-matching and uncertainty analysis problems to support the scale-up of CCS deployment.

(19) (Spotlight: 4:20PM) Truck Traffic Monitoring with Satellite Images

Lynn Kaack (ETH Zurich); George H Chen (Carnegie Mellon University); Granger Morgan (Carnegie Mellon University)

Abstract: (click to expand) The road freight sector is responsible for a large and growing share of greenhouse gas emissions, but reliable data on the amount of freight that is moved on roads in many parts of the world are scarce. Many low- and middle-income countries have limited ground-based traffic monitoring and freight surveying activities. In this proof of concept, we show that we can use an object detection network to count trucks in satellite images and predict average annual daily truck traffic from those counts. We describe a complete model, test the uncertainty of the estimation, and discuss the transfer to developing countries.

(20) (Spotlight: 2:40PM) Evaluating aleatoric and epistemic uncertainties of time series deep learning models for soil moisture predictions pdf

Chaopeng Shen (Pennsylvania State University)

Abstract: (click to expand) Soil moisture is an important variable that determines floods, vegetation health, agricultural productivity, and land surface feedbacks to the atmosphere. The recently available satellite-based observations give us a unique opportunity to directly build data-driven models to predict soil moisture instead of using land surface models, but previously there was no uncertainty estimate. We tested Monte Carlo dropout with an aleatoric term (MCD+A) for our long short-term memory models for this problem, and asked whether the uncertainty terms behave as they were argued to. We show that MCD+A indeed gave a good estimate of our predictive error, provided we tune a hyperparameter and use a representative training dataset. The aleatoric term responded strongly to observational noise and the epistemic term clearly acted as a detector for physiographic dissimilarity from the training data. However, when the training and test data are characteristically different, the aleatoric term could be misled, undermining its reliability. We also discuss some of the major challenges for which we anticipate the geoscientific communities will need help from computer scientists in applying AI to climate or hydrologic modeling.
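
A rough sketch of the MCD+A recipe (not the authors' code): the network outputs a mean and a log-variance (the aleatoric term) and is trained with a heteroscedastic Gaussian likelihood; at test time dropout stays active and predictions are sampled repeatedly, with the spread across samples serving as the epistemic term. Shapes, data, and hyperparameters below are placeholders.

# Hedged sketch of Monte Carlo dropout with an aleatoric head (MCD+A) for an
# LSTM regressor; data, sizes, and hyperparameters are placeholders.
import torch
import torch.nn as nn

class MCDLSTM(nn.Module):
    def __init__(self, n_features, hidden=64, p=0.2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.drop = nn.Dropout(p)
        self.head = nn.Linear(hidden, 2)     # predicts [mean, log_variance]

    def forward(self, x):
        out, _ = self.lstm(x)
        h = self.drop(out[:, -1])            # last time step; dropout stays on at test time
        mu, log_var = self.head(h).chunk(2, dim=-1)
        return mu, log_var

def gaussian_nll(mu, log_var, y):
    # Heteroscedastic loss: the predicted log-variance is the aleatoric term.
    return (0.5 * torch.exp(-log_var) * (y - mu) ** 2 + 0.5 * log_var).mean()

model = MCDLSTM(n_features=8)
x = torch.randn(128, 30, 8)                  # (batch, time, forcing variables)
y = torch.randn(128, 1)                      # observed soil moisture
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    mu, log_var = model(x)
    gaussian_nll(mu, log_var, y).backward()
    opt.step()

# Keep dropout active and sample: the variance of mu across passes approximates
# the epistemic term, while exp(log_var) approximates the aleatoric term.
model.train()
with torch.no_grad():
    samples = torch.stack([model(x)[0] for _ in range(50)])
epistemic = samples.var(dim=0)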

(21) (Spotlight: 2:30PM) Detecting anthropogenic cloud perturbations with deep learning pdf Best Paper Award

Duncan Watson-Parris (University of Oxford); Sam Sutherland (University of Oxford); Matthew Christensen (University of Oxford); Anthony Caterini (University of Oxford); Dino Sejdinovic (University of Oxford); Philip Stier (University of Oxford)

Abstract: (click to expand) One of the most pressing questions in climate science is that of the effect of anthropogenic aerosol on the Earth's energy balance. Aerosols provide the `seeds' on which cloud droplets form, and changes in the amount of aerosol available to a cloud can change its brightness and other physical properties such as optical thickness and spatial extent. Clouds play a critical role in moderating global temperatures and small perturbations can lead to significant amounts of cooling or warming. Uncertainty in this effect is so large it is not currently known if it is negligible, or provides a large enough cooling to largely negate present-day warming by CO2. This work uses deep convolutional neural networks to look for two particular perturbations in clouds due to anthropogenic aerosol and assess their properties and prevalence, providing valuable insights into their climatic effects.

(22) Data-driven surrogate models for climate modeling: application of echo state networks, RNN-LSTM and ANN to the multi-scale Lorenz system as a test case pdf

Ashesh K Chattopadhyay (Rice University); Pedram Hassanzadeh (Rice University); Devika Subramanian (Rice University); Krishna Palem (Rice University); Charles Jiang (Rice University); Adam Subel (Rice University)

Abstract: (click to expand) Understanding the effects of climate change relies on physics-driven, computationally expensive climate models which are still imperfect owing to ineffective subgrid-scale parametrization. An effective way to treat the ineffective parametrization of these largely uncertain subgrid-scale processes is to use data-driven surrogate models built with machine learning techniques. These surrogate models are trained on observational data either to capture the embeddings of the subgrid-scale processes' underlying dynamics on the large-scale processes or to simulate the subgrid processes accurately enough to be fed into the large-scale processes. In this paper an extended version of the Lorenz 96 system is studied, which consists of three equations for a set of slow, intermediate, and fast variables, providing a fitting prototype for multi-scale, spatio-temporal chaos, and in particular, the complex dynamics of the climate system. In this work, we have built a data-driven model based on echo state networks (ESN) aimed specifically at climate modeling. This model can predict the spatio-temporal chaotic evolution of the Lorenz system for several Lyapunov timescales. We show that the ESN model outperforms, in terms of the prediction horizon, a deep learning technique based on a recurrent neural network (RNN) with long short-term memory (LSTM) and an artificial neural network by factors between 3 and 10. The results suggest that ESN has the potential to be a powerful method for surrogate modeling and data-driven prediction for problems of interest to the climate community.
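
For readers unfamiliar with echo state networks: the recurrent reservoir is fixed and random, and only a linear readout is trained, typically by ridge regression. The numpy sketch below uses assumed sizes, a spectral radius of 0.9, and placeholder data standing in for the multi-scale Lorenz trajectories; it is not the authors' implementation.

# Minimal echo state network sketch: fixed random reservoir, ridge-regression
# readout, used here for one-step-ahead prediction on placeholder data.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 8, 400                           # assumed input and reservoir sizes
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius to 0.9

def run_reservoir(U):
    # U: (T, n_in) input sequence; returns reservoir states (T, n_res).
    states = np.zeros((len(U), n_res))
    x = np.zeros(n_res)
    for t, u in enumerate(U):
        x = np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Placeholder data standing in for the multi-scale Lorenz trajectories:
# the target is simply the next input vector.
U = rng.standard_normal((2000, n_in))
Y = np.roll(U, -1, axis=0)[:-1]
S = run_reservoir(U)[:-1]

ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y).T
pred = S @ W_out.T
print("train MSE:", np.mean((pred - Y) ** 2))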

(23) Learning Radiative Transfer Models for Climate Change Applications in Imaging Spectroscopy pdf

Shubhankar V Deshpande (Carnegie Mellon University), Brian D Bue (NASA JPL/Caltech), David R Thompson (NASA JPL/Caltech), Vijay Natraj (NASA JPL/Caltech), Mario Parente (UMass Amherst)

Abstract: (click to expand) According to a recent investigation, an estimated 33-50% of the world's coral reefs have undergone degradation, believed to be a result of climate change. A strong driver of climate change and the subsequent environmental impact is greenhouse gases such as methane. However, the exact relationship between climate change and environmental impact cannot be easily established. Remote sensing methods are increasingly being used to quantify and draw connections between rapidly changing climatic conditions and environmental impact. A crucial part of this analysis is processing spectroscopy data using radiative transfer models (RTMs), which is a computationally expensive process and limits their use with high volume imaging spectrometers. This work presents an algorithm that can efficiently emulate RTMs using neural networks, leading to a multifold speedup in processing time and yielding multiple downstream benefits.

(24) (Spotlight: 4:40PM) Planetary Scale Monitoring of Urban Growth in High Flood Risk Areas pdf

Christian F Clough (Planet); Ramesh Nair (Planet); Gopal Erinjippurath (Planet); Matt George (Planet); Jesus Martinez Manso (Planet)

Abstract: (click to expand) Climate change is increasing the incidence of flooding. Many areas in the developing world are experiencing strong population growth but lack adequate urban planning. This represents a significant humanitarian risk. We explore the use of high-cadence satellite imagery provided by Planet, whose flock of over one hundred 'Dove' satellites images the entire earth's landmass every day at 3-5m resolution. We use a deep learning-based computer vision approach to measure flood-related humanitarian risk in 5 cities in Africa.

(25) Efficient Multi-temporal and In-season Crop Mapping with Landsat Analysis Ready Data via Long Short-term Memory Networks pdf

Jinfan Xu (Zhejiang University); Renhai Zhong (Zhejiang University); Jialu Xu (Zhejiang University); Haifeng Li (Central South University); Jingfeng Huang (Zhejiang University); Tao Lin (Zhejiang University)

Abstract: (click to expand) Global crop analysis from plentiful satellite images yields state-of-the-art results for estimating climate change impacts on agriculture with modern machine learning technology. Generating accurate and timely crop maps across years remains a scientific challenge, since existing non-temporal classifiers are hardly capable of capturing complicated temporal links from multi-temporal remote sensing data and adapting to interannual variability. We developed an LSTM-based model, trained on previous years, to distinguish corn and soybean for the current year. The results showed that LSTM outperformed a random forest baseline in both in-season and end-of-the-season crop type classification. The improved performance is a result of the cumulative effect of remote sensing information that has been learned by the LSTM model structure. The work provides a valuable opportunity for estimating the impact of climate change on crop yield and for early warning of extreme weather events in the future.

Deployed Track

(26) Autopilot of Cement Plants for Reduction of Fuel Consumption and Emissions pdf

Prabal Acharyya (Petuum Inc); Sean D Rosario (Petuum Inc); Roey Flor (Petuum Inc); Ritvik Joshi (Petuum Inc); Dian Li (Petuum Inc); Roberto Linares (Petuum Inc); Hongbao Zhang (Petuum Inc)

Abstract: (click to expand) The cement manufacturing industry is an essential component of the global economy and infrastructure. However, cement plants inevitably produce hazardous air pollutants, including greenhouse gases and heavy metal emissions, as byproducts of the process. Byproducts from cement manufacturing alone account for approximately 5% of global carbon dioxide (CO2) emissions. We have developed "Autopilot" - a machine learning based Software as a Service (SaaS) that learns manufacturing process dynamics and optimizes the operation of cement plants - in order to reduce the overall fuel consumption and emissions of cement production. Autopilot is able to increase the ratio of alternative fuels (including biowaste and tires) to petroleum coke, while optimizing operation of pyro, the core process of cement production that includes the preheater, kiln and cooler. Emissions of gases such as NOx and SOx, and of heavy metals such as mercury and lead, which are generated through burning petroleum coke, can be reduced through the use of Autopilot. Our system has been proven to work in real-world deployments, and an analysis of cement plant performance with Autopilot enabled shows energy consumption savings and a decrease of up to 28,000 metric tons of CO2 produced per year.

(27) (Spotlight: 10:10AM) Towards a Sustainable Food Supply Chain Powered by Artificial Intelligence Honorable Mention

Volodymyr Kuleshov (Stanford University)

Abstract: (click to expand) About 30-40% of food produced worldwide is wasted. This puts a severe strain on the environment and represents a $165B loss to the US economy. This paper explores how artificial intelligence can be used to automate decisions across the food supply chain in order to reduce waste and increase the quality and affordability of food. We focus our attention on supermarkets — combined with downstream consumer waste, these contribute to 40% of total US food losses — and we describe an intelligent decision support system for supermarket operators that optimizes purchasing decisions and minimizes losses. The core of our system is a model-based reinforcement learning engine for perishable inventory management; in a real-world pilot with a US supermarket chain, our system reduced waste by up to 50%. We hope that this paper will bring the food waste problem to the attention of the broader machine learning research community.

(28) PVNet: A LRCN Architecture for Spatio-Temporal Photovoltaic Power Forecasting from Numerical Weather Prediction pdf

Johan Mathe (Frog Labs)

Abstract: (click to expand) Photovoltaic (PV) power generation has emerged as one of the leading renewable energy sources. Yet, its production is characterized by high uncertainty, being dependent on weather conditions like solar irradiance and temperature. Predicting PV production, even in the 24-hour forecast, remains a challenge and leads energy providers to keep idling - often carbon-emitting - plants. In this paper, we introduce a Long-Term Recurrent Convolutional Network using Numerical Weather Predictions (NWP) to predict, in turn, PV production in the 24-hour and 48-hour forecast horizons. This network architecture fully leverages both temporal and spatial weather data, sampled over the whole geographical area of interest. We train our model on a prediction dataset from the National Oceanic and Atmospheric Administration (NOAA) to predict spatially aggregated PV production in Germany. We compare its performance to the persistence model and to state-of-the-art methods.

Tianle Yuan (NASA)

Abstract: (click to expand) Ship-tracks appear as long winding linear features in satellite images and are produced by aerosols from ship exhausts changing low cloud properties. They are one of the best examples of aerosol-cloud interaction experiments, which is currently the largest source of uncertainty in our understanding of climate forcing. Manually finding ship-tracks from satellite data on a large-scale is prohibitively costly while a large number of samples are required to better understand aerosol-cloud interactions. Here we train a deep neural network to automate finding ship-tracks. The neural network model generalizes well as it not only finds ship-tracks labeled by human experts, but also detects those that are occasionally missed by humans. It increases our sampling capability of ship-tracks by orders of magnitude and produces a first global map of ship-track distributions using satellite data. Major shipping routes that are mapped by the algorithm correspond well with available commercial data. There are also situations where commercial data are missing shipping routes that are detected by our algorithm. Our technique will enable studying aerosol effects on low clouds using ship-tracks on a large-scale, which will potentially narrow the uncertainty of the aerosol-cloud interactions. The product is also useful for applications such as coastal air pollution and trade.

(30) Using Smart Meter Data to Forecast Grid Scale Electricity Demand

Abraham Stanway (Amperon Holdings, Inc); Ydo Wexler (Amperon)

Abstract: (click to expand) Highly accurate electricity demand forecasts represent a major opportunity to create grid stability in light of the concurrent deployment of distributed renewables and energy storage, as well as the increasing occurrence of extreme weather events caused by climate change. We present an overview of a deployed machine learning system that accomplishes this task by using smart meter data (AMI) within the region governed by the Electric Reliability Council of Texas (ERCOT).

(31) (Spotlight: 10:20AM) Deep Learning for Wildlife Conservation and Restoration Efforts pdf

Clement Duhart (MIT Media Lab)

Abstract: (click to expand) Climate change and environmental degradation are causing species extinction worldwide. Automatic wildlife sensing is an urgent requirement to track biodiversity losses on Earth. Recent improvements in machine learning can accelerate the development of large-scale monitoring systems that would help track conservation outcomes and target efforts. In this paper, we present one such system we developed. 'Tidzam' is a Deep Learning framework for wildlife detection, identification, and geolocalization, designed for the Tidmarsh Wildlife Sanctuary, the site of the largest freshwater wetland restoration in Massachusetts.

Ideas Track

(32) (Spotlight: 4:50PM) Reinforcement Learning for Sustainable Agriculture pdf

Jonathan Binas (Mila, Montreal); Leonie Luginbuehl (Department of Plant Sciences, University of Cambridge); Yoshua Bengio (Mila)

Abstract: (click to expand) The growing population and the changing climate will push modern agriculture to its limits in an increasing number of regions on earth. Establishing next-generation sustainable food supply systems will mean producing more food on less arable land, while keeping the environmental impact to a minimum. Modern machine learning methods have achieved super-human performance on a variety of tasks, simply learning from the outcomes of their actions. We propose a path towards more sustainable agriculture, considering plant development an optimization problem with respect to certain parameters, such as yield and environmental impact, which can be optimized in an automated way. Specifically, we propose to use reinforcement learning to autonomously explore and learn ways of influencing the development of certain types of plants, controlling environmental parameters, such as irrigation or nutrient supply, and receiving sensory feedback, such as camera images, humidity, and moisture measurements. The trained system will thus be able to provide instructions for optimal treatment of a local population of plants, based on non-invasive measurements, such as imaging.

(33) (Spotlight: 4:55PM) Stratospheric Aerosol Injection as a Deep Reinforcement Learning Problem pdf Honorable Mention

Christian A Schroeder (University of Oxford); Thomas Hornigold (University of Oxford)

Abstract: (click to expand) As global greenhouse gas emissions continue to rise, the use of geoengineering in order to artificially mitigate climate change effects is increasingly considered. Stratospheric aerosol injection (SAI), which reduces solar radiative forcing and thus can be used to offset excess radiative forcing due to the greenhouse effect, is both technically and economically feasible. However, naive deployment of SAI has been shown in simulation to produce highly adversarial regional climatic effects in regions such as India and West Africa. Wealthy countries would most likely be able to trigger SAI unilaterally: China, Russia or the US could decide to fix their own climates and, as collateral damage, dry India out by disrupting the monsoon or induce termination effects with rapid warming. Understanding both how SAI can be optimised and how to best react to rogue injections is therefore of crucial geostrategic interest. In this paper, we argue that optimal SAI control can be characterised as a high-dimensional Markov Decision Process. This motivates the use of deep reinforcement learning in order to automatically discover non-trivial, and potentially time-varying, optimal injection policies or identify catastrophic ones. To overcome the inherent sample inefficiency of deep reinforcement learning, we propose to emulate a Global Circulation Model using deep learning techniques. To our knowledge, this is the first proposed application of deep reinforcement learning to the climate sciences.

(34) (Spotlight: 5:00PM) Using Natural Language Processing to Analyze Financial Climate Disclosures pdf

Sasha Luccioni (Mila); Hector Palacios (Element AI)

Abstract: (click to expand) According to U.S. financial legislation, companies traded on the stock market are obliged to regularly disclose risks and uncertainties that are likely to affect their operations or financial position. Since 2010, these disclosures must also include climate-related risk projections. These disclosures therefore present a large quantity of textual information on which we can apply NLP techniques in order to pinpoint the companies that divulge their climate risks and those that do not, the types of vulnerabilities that are disclosed, and to follow the evolution of these risks over time.

(35) Machine Learning-based Maintenance for Renewable Energy: The Case of Power Plants in Morocco pdf

Kris Sankaran (Montreal Institute for Learning Algorithms); Zouheir Malki (Polytechnique Montréal); Loubna Benabou (UQAR); Hicham Bouzekri (MASEN)

Abstract: (click to expand) In this project, the focus will be on the reduction of the overall electricity cost through the reduction of operating expenditures, including maintenance costs. We propose a predictive maintenance (PdM) framework for multi-component systems in renewable power plants based on machine learning (ML) and optimization approaches. This project would benefit from a real database acquired from the Moroccan Agency for Sustainable Energy (MASEN), which owns and operates several wind, solar and hydro power plants spread over Moroccan territory. Morocco has pursued an ambitious energy strategy since 2009 that aims to ensure the energy security of the country, diversify its sources of energy and preserve the environment. Ultimately, Morocco has set the target of 52% renewables by 2030, with a large capital investment of USD 30 billion. To this end, Morocco will install 10 GW allocated as follows: 45% solar, 42% wind and 13% hydro. Through the commitment of many actors, in particular in research and development, Morocco intends to become a regional leader and a model to follow in its climate change efforts. MASEN is investing in several strategies to reduce the cost of renewables, including the cost of operations and maintenance. Our project will provide an ML predictive maintenance framework to support these efforts.

(36) GainForest: Scaling Climate Finance for Forest Conservation using Interpretable Machine Learning on Satellite Imagery pdf

David Dao (ETH); Ce Zhang (ETH); Nick Beglinger (Cleantech21); Catherine Cang (UC Berkeley); Reuven Gonzales (OasisLabs); Ming-Da Liu Zhang (ETHZ); Nick Pawlowski (Imperial College London); Clement Fung (University of British Columbia)

Abstract: (click to expand) Designing effective REDD+ policies, assessing their GHG impact, and linking them with the corresponding payments, is a resource intensive and complex task. GainForest leverages video prediction with remote sensing to monitor and forecast forest change at high resolution. Furthermore, by viewing payment allocation as a feature selection problem, GainForest can efficiently design payment schemes based on the Shapley value.
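
On the payment-allocation point: the Shapley value of a participant is its average marginal contribution over all orderings of the coalition. The brute-force toy below (practical only for a handful of participants, with a made-up value function) illustrates the computation; GainForest's actual feature-selection formulation is more involved.

# Brute-force Shapley values for a toy payment-allocation problem: each region's
# share is its average marginal contribution to total avoided deforestation.
from itertools import permutations

regions = ["A", "B", "C", "D"]

def avoided_deforestation(coalition):
    # Made-up value function (hectares preserved) with a mild overlap penalty.
    base = {"A": 10.0, "B": 6.0, "C": 4.0, "D": 2.0}
    total = sum(base[r] for r in coalition)
    return total - 0.5 * max(0, len(coalition) - 1)

shapley = {r: 0.0 for r in regions}
orders = list(permutations(regions))
for order in orders:
    seen = []
    for r in order:
        before = avoided_deforestation(seen)
        seen.append(r)
        shapley[r] += avoided_deforestation(seen) - before
shapley = {r: v / len(orders) for r, v in shapley.items()}
# The Shapley values sum to the grand-coalition value and can serve as payment shares.
print(shapley)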

(37) Machine Intelligence for Floods and the Built Environment Under Climate Change pdf

Kate Duffy (Northeastern University); Auroop Ganguly (Northeastern University)

Abstract: (click to expand) While intensification of precipitation extremes has been attributed to anthropogenic climate change using statistical analysis and physics-based numerical models, understanding floods in a climate context remains a grand challenge. Meanwhile, an increasing volume of Earth science data from climate simulations, remote sensing, and Geographic Information System (GIS) tools offers opportunity for data-driven insight and action plans. Defining Machine Intelligence (MI) broadly to include machine learning and network science, here we develop a vision and use preliminary results to showcase how scientific understanding of floods can be improved in a climate context and translated to impacts with a focus on Critical Lifeline Infrastructure Networks (CLIN).

(38) Predicting Marine Heatwaves using Global Climate Models with Cluster Based Long Short-Term Memory

Hillary S Scannell (University of Washington); Chris Fraley (Tableau Software); Nathan Mannheimer (Tableau Software); Sarah Battersby (Tableau Software); LuAnne Thompson (University of Washington)

Abstract: (click to expand) Marine heatwaves make human and natural systems vulnerable to disaster risk through the disruption of ecological services and biological function. These extreme warming events in sea surface temperature are expected to become more frequent and longer lasting as a result of climate change. Large ensembles of global climate models now provide petabytes of climate-relevant data and an opportunity to probe machine learning to glean new insights about the climate conditions that cause marine heatwaves. Here we propose a k-means cluster based learning objective to map the geography of marine heatwave drivers globally to build a forecast for extreme sea surface temperatures using Long Short-Term Memory. We describe our machine learning approach to predict when and where future marine heatwaves will occur while leveraging the massive output of data from global climate models where traditional forecasting approaches fall short. The impacts of this work could warn coastal communities by providing a forecast for marine heatwaves, which would mitigate the negative effects on fishery productivity, ecosystem health, and tourism.
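
A rough sketch of the proposed pipeline on synthetic data: cluster grid cells by their sea surface temperature time series with k-means, then fit a separate LSTM forecaster per cluster. The sizes, number of clusters, and training details below are illustrative assumptions, not the authors' configuration.

# Hedged sketch: k-means clustering of SST time series, then one LSTM forecaster
# per cluster (synthetic data; sizes and k are illustrative).
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

n_cells, T = 500, 120                        # grid cells, months of SST anomalies
sst = np.random.randn(n_cells, T).astype("float32")

k = 4
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(sst)

class Forecaster(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)
    def forward(self, x):                    # x: (batch, time, 1)
        h, _ = self.lstm(x)
        return self.out(h[:, -1])            # next-month SST anomaly

for c in range(k):
    series = torch.tensor(sst[labels == c]).unsqueeze(-1)
    x, y = series[:, :-1], series[:, -1]     # history -> last month
    model = Forecaster()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(50):
        opt.zero_grad()
        nn.functional.mse_loss(model(x), y).backward()
        opt.step()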

(39) (Spotlight: 5:05PM) ML-driven search for zero-emissions ammonia production materials pdf

Kevin McCloskey (Google)

Abstract: (click to expand) Ammonia (NH3) production is an industrial process that consumes between 1-2% of global energy annually and is responsible for 2-3% of greenhouse gas emissions (Van der Ham et al., 2014). Ammonia is primarily used for agricultural fertilizers, but it also conforms to the US-DOE targets for hydrogen storage materials (Lan et al., 2012). Modern industrial facilities use the century-old Haber-Bosch process, whose energy usage and carbon emissions are strongly dominated by the use of methane as the combined energy source and hydrogen feedstock, not by the energy used to maintain elevated temperatures and pressures (Pfromm, 2017). Generating the hydrogen feedstock with renewable electricity through water electrolysis is an option that would allow retrofitting the billions of dollars of invested capital in Haber-Bosch production capacity. Economic viability is however strongly dependent on the relative regional prices of methane and renewable energy; renewables have been trending lower in cost but forecasting methane prices is difficult (Stehly et al., 2018; IRENA, 2017; Wainberg et al., 2017). Electrochemical ammonia production, which can use aqueous or steam H2O as its hydrogen source (first demonstrated ~20 years ago), is a promising means of emissions-free ammonia production. Its viability is also linked to the relative price of renewable energy versus methane, but in principle it can be significantly more cost-effective than Haber-Bosch (Giddey et al., 2013) and also downscale to developing areas lacking ammonia transport infrastructure (Shipman & Symes, 2017). However, to date it has only been demonstrated at laboratory scales with yields and Faradaic efficiencies insufficient to be economically competitive. Promising machine-learning approaches to fix this are discussed.

(40) (Spotlight: 5:10PM) Low-carbon urban planning with machine learning pdf

Nikola Milojevic-Dupont (Mercator Research Institute on Global Commons and Climate Change (MCC)); Felix Creutzig (Mercator Research Institute on Global Commons and Climate Change (MCC))

Abstract: (click to expand) Widespread climate action is urgently needed, but current solutions do not account enough for local differences. Here, we take the example of cities to point to the potential of machine learning (ML) for generating at scale high-resolution information on energy use and greenhouse gas (GHG) emissions, and make this information actionable for concrete solutions. We map the existing relevant ML literature and articulate ML methods that can make sense of spatial data for climate solutions in cities. Machine learning has the potential to find solutions that are tailored for each settlement, and transfer solutions across the world.

(41) The Grid Resilience & Intelligence Platform (GRIP) pdf

Ashley Pilipiszyn (Stanford University)

Abstract: (click to expand) Extreme weather events pose an enormous and increasing threat to the nation's electric power systems and the associated socio-economic systems that depend on reliable delivery of electric power. The US Department of Energy reported that in 2015, almost a quarter of unplanned grid outages were caused by extreme weather events and variability in the environment. Because climate change increases the frequency and severity of extreme weather events, communities everywhere will need to take steps to better prepare for, and if possible prevent, major outages. While utilities have software tools available to help plan their daily and future operations, these tools do not include capabilities to help them plan for and recover from extreme events. Software for resilient design and recovery is not available commercially and research efforts in this area are preliminary. In this project, we are developing and deploying a suite of novel software tools to anticipate, absorb and recover from extreme events. The innovations in the project include the application of artificial intelligence and machine learning for distribution grid resilience, specifically by using predictive analytics, image recognition and classification, and increased learning and problem-solving capabilities for the anticipation of grid events.

(42) Meta-Optimization of Optimal Power Flow pdf

Mahdi Jamei (Invenia Labs); Letif Mones (Invenia Labs); Alex Robson (Invenia Labs); Lyndon White (Invenia Labs); James Requeima (Invenia Labs); Cozmin Ududec (Invenia Labs)

Abstract: (click to expand) The planning and operation of electricity grids is carried out by solving various forms of constrained optimization problems. With the increasing variability of system conditions due to the integration of renewable and other distributed energy resources, such optimization problems are growing in complexity and need to be repeated daily, often limited to a 5 minute solve-time. To address this, we propose a meta-optimizer that is used to initialize interior-point solvers. This can significantly reduce the number of iterations to converge to optimality.
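
One way to read the meta-optimizer idea: learn a map from the problem's varying inputs (e.g. loads) to previously computed optimal solutions, and hand the prediction to the interior-point solver as its initial point so fewer iterations are needed. The sketch below shows only that warm-start regression on synthetic data; the coupling to an actual solver is omitted and all dimensions are assumptions.

# Hedged sketch of a warm-start "meta-optimizer": learn a map from load
# profiles to previously solved optimal variables, then use the prediction
# as the interior-point solver's initial point (synthetic data only).
import torch
import torch.nn as nn

n_loads, n_vars = 50, 200
loads = torch.rand(4096, n_loads)               # historical operating conditions
solutions = torch.rand(4096, n_vars)            # corresponding solved OPF variables

warm_start = nn.Sequential(nn.Linear(n_loads, 256), nn.ReLU(), nn.Linear(256, n_vars))
opt = torch.optim.Adam(warm_start.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    nn.functional.mse_loss(warm_start(loads), solutions).backward()
    opt.step()

x0 = warm_start(loads[:1]).detach()             # pass x0 to the solver as its initial guess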

(43) Learning representations to predict landslide occurrences and detect illegal mining across multiple domains pdf

Aneesh Rangnekar (Rochester Institute of Technology); Matthew J Hoffman (Rochester Institute of Technology)

Abstract: (click to expand) Modelling landslide occurrences is challenging due to lack of valuable prior information on the trigger. Satellites can provide crucial insights for identifying landslide activity and characterizing patterns spatially and temporally. We propose to analyze remote sensing data from affected regions using deep learning methods, find correlation in the changes over time, and predict future landslide occurrences and their potential causes. The learned networks can then be applied to generate task-specific imagery, including but not limited to, illegal mining detection and disaster relief modelling.

(44) Harness the Power of Artificial intelligence and -Omics to Identify Soil Microbial Functions in Climate Change Projection

Yang Song (Oak Ridge National Lab); Dali Wang (Oak Ridge National Lab); Melanie Mayes (Oak Ridge National Lab)

Abstract: (click to expand) Contemporary Earth system models (ESMs) omit one of the significant drivers of the terrestrial carbon cycle: soil microbial communities. Soil microbial communities not only directly emit greenhouse gases into the atmosphere through respiration, but also release diverse enzymes that catalyze the decomposition of soil organic matter and determine nutrient availability for aboveground vegetation. Soil microbial communities therefore control terrestrial carbon dynamics and their feedbacks to climate. Currently, inadequate representation of soil microbial communities in ESMs introduces significant uncertainty into terrestrial carbon-climate feedbacks. Mitigating this uncertainty requires identifying the functions, diversity, and environmental adaptation of soil microbial communities under global climate change. The revolution in -omics technology allows high-throughput quantification of diverse soil enzymes, enabling large-scale studies of microbial functions in climate change. Such studies may lead to revolutionary solutions for predicting microbially mediated climate-carbon feedbacks at the global scale based on gene-level environmental adaptation strategies of the microbial community. A key initial step in this direction is to identify the biogeography and environmental adaptation of soil enzyme functions based on the massive amount of data generated by -omics technologies. Here we propose to take this step. Artificial intelligence is a powerful, ideal tool for this leap forward. Our project integrates artificial intelligence technologies and global -omics data to represent climate controls on microbial enzyme functions and to map the biogeography of soil enzyme functional groups at the global scale. The outcome of this study will allow us to improve the representation of microbial functions in Earth system modeling and mitigate uncertainty in current climate projections.



About ICML

ICML is one of the premier conferences on machine learning, and includes a wide audience of researchers and practitioners in academia, industry, and related fields. It is possible to attend the workshop without either presenting or attending the main ICML conference. Those interested should register for the Workshops component of ICML at https://icml.cc/ while tickets last (a number of spots will be reserved for accepted submissions).

Call For Submissions

We invite submission of extended abstracts on machine learning applied to problems in climate mitigation, adaptation, or modeling, including but not limited to the following topics:

Accepted submissions will be invited to give poster presentations at the workshop, of which some will be selected for spotlight talks. Please contact climatechangeai.icml2019@gmail.com with questions, or if visa considerations make earlier notification important.

Dual-submissions are allowed, and the workshop does not record proceedings. All submissions must be through the website. Submissions will be reviewed double-blind; do your best to anonymize your submission, and do not include identifying information for authors in the PDF. We encourage, but do not require, use of the ICML style template (please do not use the “Accepted” format as it will deanonymize your submission).

Submission tracks

Extended abstracts are limited to 3 pages for the Deployed and Research tracks, and 2 pages for the Ideas track, in PDF format. An additional page may be used for references. All machine learning techniques are welcome, from kernel methods to deep learning. Each submission should make clear why the application has (or could have) positive impacts regarding climate change. There are three tracks for submissions:

DEPLOYED track

Work that is already having an impact

Submissions for the Deployed track are intended for machine learning approaches which are impacting climate-relevant problems through consumers or partner institutions. This could include implementations of academic research that have moved beyond the testing phase, as well as results from startups/industry. Details of methodology need not be revealed if they are proprietary, though transparency is encouraged.

RESEARCH track

Work that will have an impact when deployed

Submissions for the Research track are intended for machine learning research applied to climate-relevant problems. Submissions should provide experimental or theoretical validation of the method proposed, as well as specifying what gap the method fills. Algorithms need not be novel from a machine learning perspective if they are applied in a novel setting.

Datasets may be submitted to this track that are designed to permit machine learning research (e.g. formatted with clear benchmarks for evaluation). In this case, baseline experimental results on the dataset are preferred but not required.

IDEAS track

Future work that could have an impact

Submissions for the Ideas track are intended for proposed applications of machine learning to solve climate-relevant problems. While the least constrained, this track will be subject to a very high standard of review. No results need be demonstrated, but ideas should be justified as extensively as possible, including motivation for the problem being solved, an explanation of why current tools or methods are inadequate, and details of how tools from machine learning are proposed to fill the gap (i.e. it is important to justify the use of machine learning in your approach).

Frequently Asked Questions

Q: How can I keep up to date on this kind of stuff?
A: Sign up for our mailing list! https://www.climatechange.ai/mailing_list.html

Q: What is the date of the workshop / when will we know?
A: Friday, June 14, 2019 was recently confirmed as the date.

Q: I’m not in machine learning. Can I still submit?
A: Yes, absolutely! We welcome submissions from many fields. Do bear in mind, however, that the majority of attendees of the workshop will have a machine learning background; therefore, other fields should be introduced sufficiently to provide context for the work.

Q: What if my submission is accepted but I can’t attend the workshop?
A: You may ask someone else to present your work in your stead, or we can also print a poster for you and put it up during the poster session.

Q: Do I need to use LaTeX or the ICML style files?
A: No, although we encourage it.

Q: What do I do if I need an earlier decision for visa reasons?
A: Contact us at climatechangeai.icml2019@gmail.com and explain your situation and the date by which you require a decision, and we will do our best to be accommodating.

Q: Can I send submissions directly by email?
A: No, please use the CMT website to make submissions.

Q: The submission website is asking for my name. Is this a problem for anonymization?
A: You should fill out your name and other info when asked on the submission website; CMT will keep your submission anonymous to reviewers.

Q: I don’t know whether to submit my work in the Deployed or Research track. What’s the difference?
A: Deployed means it’s “really being used” in a real-world setting (i.e. not just that you verify your method on real-world data). If you are still unsure, just pick whichever track you would prefer your work to be evaluated under.

Q: Do submissions for the Ideas track need to have experimental validation?
A: No, although some initial experiments or citation of published results would strengthen your submission.

Q: The submission website never sent me a confirmation email. Is this a problem?
A: No, the CMT system does not send automatic confirmation emails after a submission, though the submission should show up on the CMT page once submitted. If in any doubt regarding the submission process, please contact the organizers. Also please avoid making multiple submissions of the same article to CMT.