DoLFIn: Distributions over Latent Features for Interpretability

Open Access
Publication date 2020
Host editors
  • D. Scott
  • N. Bel
  • C. Zong
Book title The 28th International Conference on Computational Linguistics
Book subtitle COLING 2020 : Proceedings of the Conference : December 8-13, 2020, Barcelona, Spain (Online)
ISBN (electronic)
  • 9781952148279
Event COLING 2020
Pages (from-to) 1468-1474
Number of pages 7
Publisher International Committee on Computational Linguistics
Organisations
  • Interfacultary Research - Institute for Logic, Language and Computation (ILLC)
Abstract

Interpreting the inner workings of neural models is a key step in ensuring the robustness and trustworthiness of the models, but work on neural network interpretability typically faces a trade-off: either the models are too constrained to be very useful, or the solutions found by the models are too complex to interpret. We propose a novel strategy for achieving interpretability that – in our experiments – avoids this trade-off. Our approach builds on the success of using probability as the central quantity, as in, for instance, the attention mechanism. In our architecture, DoLFIn (Distributions over Latent Features for Interpretability), we do not determine beforehand what each feature represents; instead, the features go together into an unordered set. Each feature has an associated probability ranging from 0 to 1, weighing its importance for further processing. We show that, unlike attention and saliency-map approaches, this set-up makes it straightforward to compute the probability with which an input component supports the decision the neural model makes. To demonstrate the usefulness of the approach, we apply DoLFIn to text classification, and show that DoLFIn not only provides interpretable solutions, but even slightly outperforms the classical CNN and BiLSTM text classifiers on the SST2 and AG-news datasets.
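To make the central idea concrete, the following is a minimal sketch (not the paper's actual architecture) of probability-weighted latent features: each input component is mapped to a latent feature, each feature receives an importance probability in [0, 1], and the classifier operates on the probability-weighted aggregate. All names and shapes here (`dolfin_forward`, `W_feat`, `w_gate`, `W_cls`) are hypothetical illustrations, not identifiers from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def dolfin_forward(x, W_feat, w_gate, W_cls):
    """Hypothetical sketch of the probability-weighted feature idea.

    x:      (n, d_in) input components (e.g. word vectors)
    Returns class probabilities and the per-feature importance
    probabilities, which can be read off for interpretability.
    """
    feats = np.tanh(x @ W_feat)               # (n, d_f): one latent feature per component
    p = sigmoid(feats @ w_gate)               # (n,): importance probability in [0, 1]
    pooled = (p[:, None] * feats).sum(axis=0) # probability-weighted aggregation of the set
    logits = pooled @ W_cls                   # (n_classes,)
    probs = np.exp(logits - logits.max())     # numerically stable softmax
    probs /= probs.sum()
    return probs, p

# Toy run with random weights, purely for illustration.
rng = np.random.default_rng(0)
n, d_in, d_f, n_cls = 5, 8, 4, 2
x = rng.normal(size=(n, d_in))
probs, p = dolfin_forward(x,
                          rng.normal(size=(d_in, d_f)),
                          rng.normal(size=d_f),
                          rng.normal(size=(d_f, n_cls)))
```

In this sketch, `p[i]` plays the role of the probability with which input component `i` contributes to the decision; in attention-based models, by contrast, the weights are normalized across components and do not directly admit that reading.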

Document type Conference contribution
Language English
Published at https://doi.org/10.18653/v1/2020.coling-main.127
Other links https://www.scopus.com/pages/publications/85149644424