Transformer Neural Networks for Building Load Forecasting (Papers Track)

Matthias Hertel, Simon Ott, Oliver Neumann, Benjamin Schäfer, Ralf Mikut, and Veit Hagenmeyer (Karlsruhe Institute of Technology (KIT))

NeurIPS 2022 Poster


Accurate electrical load forecasts of buildings are needed to optimize local energy storage and to make use of demand-side flexibility. We study the use of Transformer neural networks for short-term electrical load forecasting of 296 buildings from a public dataset. Transformer neural networks trained on many buildings give the best forecasts on 115 of the buildings, while multi-layer perceptrons trained on a single building are better on 161 buildings. In addition, we evaluate the models on buildings that were not used for training and find that Transformer neural networks generalize better than multi-layer perceptrons and our statistical baselines. This good generalization to unseen buildings suggests that Transformer neural networks could reduce the training resources needed for building load forecasting, and that they could be useful in cold-start scenarios.
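To illustrate the mechanism behind the abstract, the sketch below shows the core scaled dot-product self-attention step that a Transformer applies to a window of past load readings before producing a forecast. This is a minimal, illustrative implementation, not the authors' model: the window length, embedding size, and random weight matrices are assumptions for demonstration only.

```python
import numpy as np

# Minimal sketch of scaled dot-product self-attention (single head),
# the building block of Transformer load-forecasting models.
# All dimensions here are illustrative assumptions.

rng = np.random.default_rng(0)

seq_len, d_model = 24, 8  # e.g. 24 hourly load readings, 8-dim embedding
x = rng.normal(size=(seq_len, d_model))  # embedded input window

# Random matrices stand in for the learned query/key/value projections.
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Attention scores: every time step attends to every other step, which
# lets the model relate, e.g., the same hour across preceding days.
scores = Q @ K.T / np.sqrt(d_model)

# Row-wise softmax turns scores into attention weights.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

out = weights @ V  # context-aware representation of the load window
print(out.shape)   # (24, 8)
```

In the full model, several such attention layers (with feed-forward blocks and positional encodings) are stacked, and a final projection maps the representation to the forecast horizon; because the attention weights are computed from the data rather than tied to one building, the same trained model can be applied to windows from buildings it has never seen.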
