Explaining deep multi-class time series classifiers
Author(s)
Doddaiah, Ramesh; Parvatharaju, Prathyush S.; Rundensteiner, Elke; Hartvigsen, Thomas
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
Explainability helps users trust deep learning solutions for time series classification. However, existing explainability methods for multi-class time series classifiers focus on one class at a time, ignoring relationships between the classes. Instead, when a classifier is choosing between many classes, an effective explanation must show what sets the chosen class apart from the rest. We formalize this notion, studying the open problem of class-specific explainability for deep time series classifiers, a challenging and impactful problem setting. We design a novel explainability method, DEMUX, which learns saliency maps for explaining deep multi-class time series classifiers by adaptively ensuring that its explanation spotlights the regions of an input time series that the model uses specifically for its predicted class. DEMUX adopts a gradient-based approach composed of three interdependent modules that combine to generate consistent, class-specific saliency maps that remain faithful to the classifier’s behavior yet are easily understood by end users. We demonstrate that DEMUX outperforms nine state-of-the-art alternatives on seven popular datasets when explaining two types of deep time series classifiers. We analyze runtime performance, show the impact of hyperparameter selection, and present a detailed study of perturbation methods for time series. Further, through a case study, we demonstrate that DEMUX’s explanations indeed highlight what separates the predicted class from the others in the eyes of the classifier.
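The class-specific goal described in the abstract (highlighting what pushes an input toward the predicted class rather than explaining one class in isolation) can be illustrated with a minimal gradient-style sketch. This is not DEMUX itself: the toy linear classifier, the contrast objective, and the name `class_contrast_saliency` are illustrative assumptions, and DEMUX's actual three-module design is described in the paper.

```python
import numpy as np

# Hypothetical toy multi-class "classifier": logits are linear in the input
# series x (length T) via per-class weight rows W (shape [C, T]). W and the
# function name below are illustrative, not part of DEMUX's API.
rng = np.random.default_rng(0)
T, C = 50, 3
W = rng.normal(size=(C, T))

def logits(x):
    return W @ x  # one logit per class, shape [C]

def class_contrast_saliency(x, eps=1e-4):
    """Finite-difference saliency for the *contrast* between the predicted
    class logit and the mean of the other classes' logits. Time steps with
    large values are those that push the input toward the predicted class
    specifically, echoing the class-specific idea in the abstract."""
    z = logits(x)
    k = int(np.argmax(z))                     # predicted class
    others = [c for c in range(C) if c != k]  # all competing classes

    def contrast(v):
        zv = logits(v)
        return zv[k] - zv[others].mean()

    sal = np.zeros(T)
    for t in range(T):  # central finite difference per time step
        xp = x.copy(); xp[t] += eps
        xm = x.copy(); xm[t] -= eps
        sal[t] = (contrast(xp) - contrast(xm)) / (2 * eps)
    return np.abs(sal)

x = rng.normal(size=T)
sal = class_contrast_saliency(x)
```

For this linear toy model the saliency reduces to |W[k] - mean of the other rows|, which makes the class-contrast idea easy to verify; a real deep classifier would require backpropagated or perturbation-based gradients instead of finite differences.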
Date issued
2024-03-04
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
Knowledge and Information Systems
Publisher
Springer London
Citation
Doddaiah, R., Parvatharaju, P.S., Rundensteiner, E. et al. Explaining deep multi-class time series classifiers. Knowl Inf Syst 66, 3497–3521 (2024).
Version: Author's final manuscript