Tutorial: Quantus x Climate - Applying explainable AI evaluation in climate science (Tutorials Track) Spotlight
Philine L Bommer (TU Berlin); Anna Hedström (Technische Universität Berlin); Marlene Kretschmer (University of Reading); Marina M.-C. Höhne (TU Berlin)
Explainable artificial intelligence (XAI) methods shed light on the predictions of deep neural networks (DNNs). In the climate context, XAI has been applied to improve and validate deep learning (DL) methods while providing researchers with new insight into physical processes. However, the evaluation, validation, and selection of XAI methods are challenging because ground-truth explanations are often unavailable. In this tutorial, we introduce the XAI evaluation package Quantus to the climate community. We start by providing participants with pre-processed input and output data alongside a convolutional neural network (CNN) trained to assign yearly temperature maps to classes according to their decade. We explain the network's prediction for an example temperature map using five different explanation techniques: Gradient, GradientShap, IntegratedGradients, LRP-z, and Occlusion. By visually analyzing each explanation around the North Atlantic (NA) cooling patch (10-80W, 20-60N), we provide a motivating example showing that different methods can disagree in the evidence they highlight, which can in turn lead to different scientific interpretations and, potentially, misleading conclusions. We then introduce Quantus, including the explanation properties it can evaluate, such as robustness, faithfulness, complexity, localization, and randomization. We guide participants toward a practical understanding of XAI evaluation by demonstrating how the metrics differ in their scoring and interpretation. Moreover, we teach participants to compare and select an appropriate XAI method by performing a comprehensive XAI evaluation. Lastly, we return to the motivating example, highlighting how Quantus can facilitate well-founded XAI research in climate science.
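To make two of the evaluated property categories concrete, the following is a minimal NumPy sketch of what a robustness-style metric (how much an explanation changes under small input perturbations) and a faithfulness-style metric (how well attributions track the prediction drop when features are occluded) measure. This is an illustrative toy, not the Quantus API: the linear "model", the `explain` helper, and all parameter choices are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a linear scorer on a flattened 4x4 "temperature map".
W = rng.normal(size=16)

def model(x):
    # Scalar prediction for one input.
    return float(W @ x)

def explain(x):
    # Gradient explanation; for a linear model this is just W.
    return W.copy()

x = rng.normal(size=16)

# Robustness (max-sensitivity style): largest explanation change
# observed under small random input perturbations.
sens = max(
    np.linalg.norm(explain(x + rng.normal(scale=0.01, size=16)) - explain(x))
    for _ in range(10)
)

# Faithfulness (occlusion style): correlation between the attribution
# (gradient * input) and the prediction drop when each feature is zeroed.
a = explain(x) * x
drops = np.array([
    model(x) - model(np.where(np.arange(16) == i, 0.0, x))
    for i in range(16)
])
faith = np.corrcoef(a, drops)[0, 1]

print(sens)   # 0.0: a linear model's gradient never changes
print(faith)  # ~1.0: attributions exactly match the occlusion drops
```

Real Quantus metrics follow the same pattern, scoring batches of explanations against a trained model so that different XAI methods can be compared on a common footing.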