Explainable Offline-Online Training of Neural Networks for Multi-Scale Climate Modeling (Papers Track)

Hamid Alizadeh Pahlavan (Rice University); Pedram Hassanzadeh (Rice University); M. Joan Alexander (NorthWest Research Associates)

NeurIPS 2023 Poster
Hybrid Physical Models · Unsupervised & Semi-Supervised Learning

Abstract

In global climate models, small-scale physical processes are represented using subgrid-scale (SGS) models known as parameterizations, and these parameterizations contribute substantially to uncertainties in climate projections. Recently, machine learning techniques, particularly deep neural networks (NNs), have emerged as novel tools for developing SGS parameterizations. Different strategies exist for training these NN-based SGS models. Here, we use a 1D model of the quasi-biennial oscillation (QBO) and atmospheric gravity wave (GW) parameterizations as testbeds to explore various learning strategies and the challenges arising from the scarcity of high-fidelity training data. We show that a 12-layer convolutional NN that predicts GW forcings for given wind profiles, when trained offline in a big-data regime (100 years), produces realistic QBOs once coupled to the 1D model. In contrast, offline training of this NN in a small-data regime (18 months) yields unrealistic QBOs. However, online re-training of just two layers of this NN using ensemble Kalman inversion and only time-averaged QBO statistics leads to parameterizations that yield realistic QBOs. Fourier analysis of these three NNs’ kernels suggests how and why re-training works, and reveals that these NNs primarily learn low-pass, high-pass, and a combination of band-pass Gabor filters, consistent with the importance of both local and non-local dynamics in GW propagation and dissipation. These strategies and findings apply generally to data-driven parameterizations of other climate processes.
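
To make the setup concrete, below is a minimal sketch of the kind of 12-layer 1D convolutional network the abstract describes: it maps a zonal-wind profile u(z) to a GW forcing profile on the same vertical grid. The channel width, kernel size, activation, and number of vertical levels are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

N_LEVELS = 128  # assumed number of vertical grid points

class GWForcingCNN(nn.Module):
    def __init__(self, channels: int = 64, kernel_size: int = 5, depth: int = 12):
        super().__init__()
        layers = []
        in_ch = 1
        for _ in range(depth - 1):
            # 'same' padding keeps the vertical resolution fixed through the network
            layers += [nn.Conv1d(in_ch, channels, kernel_size, padding="same"), nn.ReLU()]
            in_ch = channels
        # final layer maps back to a single forcing profile, no activation
        layers.append(nn.Conv1d(in_ch, 1, kernel_size, padding="same"))
        self.net = nn.Sequential(*layers)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, levels) wind profile -> (batch, levels) GW forcing
        return self.net(u.unsqueeze(1)).squeeze(1)

model = GWForcingCNN()
forcing = model(torch.randn(8, N_LEVELS))  # offline training regresses this on data
```

Offline training would fit this network to paired (wind, forcing) data; the coupled "online" behavior only appears once the trained network replaces the physics-based GW parameterization inside the 1D QBO model.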
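The online re-training step uses ensemble Kalman inversion (EKI), a derivative-free method that updates parameters so the coupled model matches target statistics. The numpy sketch below shows one EKI update under stated assumptions: the forward map `qbo_stats` is a stand-in for running the 1D QBO model coupled to the NN with candidate weights for the two re-trained layers and returning time-averaged statistics (e.g., QBO amplitude and period); dimensions, targets, and noise levels are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def qbo_stats(theta: np.ndarray) -> np.ndarray:
    """Placeholder forward map: theta -> time-averaged statistics (dim 2)."""
    return np.array([np.sin(theta).sum(), np.cos(theta).sum()])

def eki_step(thetas: np.ndarray, y: np.ndarray, gamma: np.ndarray) -> np.ndarray:
    """One EKI update for an ensemble `thetas` of shape (J, n_params)."""
    G = np.stack([qbo_stats(t) for t in thetas])      # (J, n_obs) forward evaluations
    t_mean, g_mean = thetas.mean(0), G.mean(0)
    dt, dg = thetas - t_mean, G - g_mean
    C_tg = dt.T @ dg / len(thetas)                    # cross-covariance (n_params, n_obs)
    C_gg = dg.T @ dg / len(thetas)                    # output covariance (n_obs, n_obs)
    K = C_tg @ np.linalg.inv(C_gg + gamma)            # Kalman-type gain
    # perturbed observations keep ensemble spread consistent with obs noise
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), gamma, size=len(thetas))
    return thetas + (y_pert - G) @ K.T

n_params, J = 10, 50
ensemble = rng.normal(size=(J, n_params))   # e.g., flattened weights of two NN layers
y_obs = np.array([1.0, 2.0])                # target time-averaged QBO statistics
noise = 0.01 * np.eye(2)
for _ in range(20):
    ensemble = eki_step(ensemble, y_obs, noise)
```

Because EKI needs only forward evaluations, it can calibrate a handful of layers against aggregate statistics even when instantaneous high-fidelity training data are scarce, which is the small-data scenario the abstract targets.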
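Finally, the explainability analysis amounts to inspecting the frequency response of the learned convolution kernels. The sketch below takes the FFT of each 1D kernel and crudely labels it low-pass, high-pass, or band-pass (Gabor-like) from where its magnitude response peaks; the layer, thresholds, and classification rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
import torch

def kernel_spectra(conv: torch.nn.Conv1d, n_freq: int = 64) -> np.ndarray:
    """Magnitude response |FFT| of every kernel in a Conv1d layer."""
    w = conv.weight.detach().numpy()        # (out_ch, in_ch, kernel_size)
    kernels = w.reshape(-1, w.shape[-1])    # flatten to (n_kernels, kernel_size)
    return np.abs(np.fft.rfft(kernels, n=n_freq, axis=-1))

def classify(spectrum: np.ndarray) -> str:
    """Crude low/high/band-pass label from the location of the response peak."""
    peak = spectrum.argmax()
    if peak == 0:
        return "low-pass"
    if peak == len(spectrum) - 1:
        return "high-pass"
    return "band-pass (Gabor-like)"

layer = torch.nn.Conv1d(1, 64, kernel_size=5, padding="same")  # stand-in layer
for i, s in enumerate(kernel_spectra(layer)[:5]):
    print(f"kernel {i}: {classify(s)}")
```

In this reading, low- and high-pass kernels capture smooth background and sharp local shear structures in the wind profile, while band-pass (Gabor-like) kernels respond to intermediate vertical scales, consistent with the abstract's point that both local and non-local dynamics matter for GW propagation and dissipation.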