Pure and Physics-encoded Spatiotemporal Deep Learning for Climate-Vegetation Dynamics (Papers Track)
Qianqiu Longyang (Kansas Geological Survey, The University of Kansas); Ruijie Zeng (School of Sustainable Engineering and the Built Environment, Arizona State University)
Abstract
Vegetation is a central hub of water, energy, and carbon exchanges, making its accurate spatiotemporal modeling essential for projecting climate impacts. Existing models of vegetation dynamics often rely on simplified physical representations or treat spatiotemporal interactions only partially. We present a spatially distributed framework for daily-scale climate-vegetation dynamics, comparing: (1) a Long Short-Term Memory (LSTM) baseline with flattened spatial inputs; (2) a pure Convolutional LSTM (ConvLSTM) that captures spatial heterogeneity and temporal dependencies while implicitly representing ecohydrological states; and (3) a physics-encoded ConvLSTM that serves as a bias-corrector of physically simulated Leaf Area Index (LAI). Both ConvLSTM architectures outperform the LSTM baseline, and the physics-encoded variant underscores the promise of combining physical knowledge with data-driven models. This framework supports more reliable vegetation projections and highlights the potential of closer collaboration between AI researchers and Earth system scientists to develop trustworthy tools for climate adaptation and mitigation.
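To make the modeling choices above concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a single-channel ConvLSTM cell and of the physics-encoded bias-correction idea: the network's hidden state is produced by convolutional gates over gridded forcings, and the final LAI prediction is the physically simulated LAI plus a learned residual. All names (`ConvLSTMCell`, `step`, the readout kernel) are hypothetical, and the numpy-only convolution is written for clarity rather than speed.

```python
import numpy as np

def conv2d(x, k):
    # 'same'-padded 2D cross-correlation of a single-channel field
    H, W = x.shape
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ConvLSTMCell:
    """Single-channel ConvLSTM cell: LSTM gates computed by convolutions,
    so the cell state is a 2D field that can carry spatially distributed
    (e.g. ecohydrological) memory. Illustrative only."""

    def __init__(self, ksize=3, seed=0):
        rng = np.random.default_rng(seed)
        # one input kernel and one hidden-state kernel per gate (i, f, o, g)
        self.Wx = rng.normal(0.0, 0.1, (4, ksize, ksize))
        self.Wh = rng.normal(0.0, 0.1, (4, ksize, ksize))
        self.b = np.zeros(4)

    def step(self, x, h, c):
        # gate pre-activations: conv(input) + conv(hidden) + bias
        z = [conv2d(x, self.Wx[n]) + conv2d(h, self.Wh[n]) + self.b[n]
             for n in range(4)]
        i, f, o = sigmoid(z[0]), sigmoid(z[1]), sigmoid(z[2])
        g = np.tanh(z[3])
        c_new = f * c + i * g          # cell state: persistent spatial memory
        h_new = o * np.tanh(c_new)     # hidden state: spatial feature map
        return h_new, c_new

# --- hypothetical usage over T daily forcing fields on an H x W grid ---
H, W, T = 8, 8, 5
rng = np.random.default_rng(1)
forcings = rng.normal(size=(T, H, W))      # e.g. daily climate forcing
lai_physics = np.abs(rng.normal(2.0, 0.5, size=(H, W)))  # simulated LAI

cell = ConvLSTMCell()
h = np.zeros((H, W))
c = np.zeros((H, W))
for t in range(T):
    h, c = cell.step(forcings[t], h, c)

# physics-encoded variant: learned residual corrects the physical simulation
readout = rng.normal(0.0, 0.1, (3, 3))     # hypothetical 1-layer readout
lai_pred = lai_physics + conv2d(h, readout)
```

The pure ConvLSTM variant would instead map `h` directly to LAI, while the physics-encoded variant, as sketched in the last line, only has to learn the (typically smaller and smoother) bias of the process model.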