Generating physically-consistent high-resolution climate data with hard-constrained neural networks (Papers Track)
Paula Harder (Mila); Qidong Yang (New York University); Venkatesh Ramesh (Mila); Prasanna Sattigeri (IBM Research); Alex Hernandez-Garcia (Mila - Quebec AI Institute); Campbell D Watson (IBM Research); Daniela Szwarcman (IBM Research); David Rolnick (McGill University, Mila)
The availability of reliable, high-resolution climate and weather data is important to inform long-term decisions on climate adaptation and mitigation and to guide rapid responses to extreme events. Forecasting models are limited by computational costs and can therefore often make only coarse-resolution predictions. Statistical downscaling can provide an efficient method of upsampling low-resolution data. In this field, deep learning has been applied successfully, often using image super-resolution methods from computer vision. Despite achieving visually compelling results in some cases, such models often violate conservation laws when predicting physical variables. In order to conserve important physical quantities, we developed a deep downscaling method that guarantees physical constraints are satisfied by adding a renormalization layer at the end of the neural network. Furthermore, the constrained model also improves performance according to standard metrics. We show the applicability of our method across different popular architectures and upsampling factors using ERA5 reanalysis data.
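To make the idea of a constraining renormalization layer concrete, the following is a minimal sketch, not the authors' implementation: it assumes the physical constraint is that the mean of each k×k patch of the super-resolved field must equal the corresponding low-resolution pixel (a mass/energy-conservation-style constraint), and enforces it exactly with a multiplicative correction. The function name `renormalize` and the use of NumPy rather than a deep-learning framework are illustrative choices.

```python
import numpy as np

def renormalize(hr, lr, k):
    """Rescale each k x k patch of the super-resolved field `hr` so that
    its mean equals the matching low-resolution pixel in `lr`.

    Assumed constraint (illustrative): mean over every k x k patch of the
    high-resolution output must reproduce the low-resolution input value.
    """
    H, W = hr.shape
    # Mean of each k x k patch of the network prediction.
    patch_mean = hr.reshape(H // k, k, W // k, k).mean(axis=(1, 3))
    # Multiplicative correction factor, broadcast back to high resolution.
    factor = np.kron(lr / patch_mean, np.ones((k, k)))
    return hr * factor

# Usage: 2x upsampling of a toy 2x2 low-resolution field.
rng = np.random.default_rng(0)
lr = rng.random((2, 2)) + 1.0          # strictly positive low-res field
hr = np.repeat(np.repeat(lr, 2, 0), 2, 1) + 0.1 * rng.random((4, 4))
out = renormalize(hr, lr, 2)
# Each 2x2 block of `out` now averages exactly to the matching `lr` pixel.
```

Because the correction is applied as a final, differentiable layer, the constraint holds exactly at inference time regardless of what the upstream network predicts, which is the property the abstract refers to as a hard constraint.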