XAI for transparent wind turbine power curve models (Papers Track)

Simon Letzgus (Technische Universität Berlin)

Power & Energy · Interpretable ML

Abstract

Accurate wind turbine power curve models, which translate ambient conditions into turbine power output, are crucial for wind energy to scale and fulfill its proposed role in the global energy transition. While machine learning (ML) methods have shown significant advantages over parametric, physics-informed approaches, they are often criticized for being opaque "black boxes", which hinders their application in practice. We apply Shapley values, a popular explainable artificial intelligence (XAI) method, and the latest findings from XAI for regression models, to uncover the strategies ML models have learned from operational wind turbine data. Our findings reveal that the trend towards ever larger model architectures, driven by a focus on test set performance, can result in physically implausible model strategies. Therefore, we call for a more prominent role of XAI methods in model selection. Moreover, we propose a practical approach to utilize explanations for root cause analysis in the context of wind turbine performance monitoring. This can help to reduce downtime and increase the utilization of turbines in the field.
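To make the approach concrete, the sketch below shows how Shapley-value attributions might be obtained for a data-driven power curve model. This is not the paper's code: the use of the `shap` library, the gradient-boosted model, the synthetic feature set (wind speed, air density, turbulence intensity), and the toy power signal are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of Shapley-value attributions
# for a data-driven power curve model. Data, features, and model are illustrative.
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical operational data: ambient conditions -> turbine power output
n = 2000
X = np.column_stack([
    rng.uniform(3, 25, n),     # wind speed [m/s]
    rng.uniform(0.9, 1.3, n),  # air density [kg/m^3]
    rng.uniform(0, 30, n),     # turbulence intensity [%]
])
feature_names = ["wind_speed", "air_density", "turbulence_intensity"]

# Toy power signal: cubic in wind speed, scaled by density, capped at rated power
power = np.minimum(0.5 * X[:, 1] * X[:, 0] ** 3, 2000.0)

# Fit a non-parametric power curve model
model = GradientBoostingRegressor().fit(X, power)

# TreeExplainer provides exact Shapley values for tree ensembles
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:100])

# Each row attributes the deviation of a prediction from the expected power
# output to the individual ambient conditions; inspecting these attributions
# reveals the strategy the model has learned (e.g., over-reliance on one input).
print(explainer.expected_value, shap_values[0])
```

Inspecting such attributions across the operating range is one way to check whether a model's learned strategy is physically plausible, in the spirit of the model-selection and performance-monitoring use cases described in the abstract.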