arXiv Analytics

arXiv:2405.08297 [cs.LG]

Distance-Restricted Explanations: Theoretical Underpinnings & Efficient Implementation

Yacine Izza, Xuanxiang Huang, Antonio Morgado, Jordi Planes, Alexey Ignatiev, Joao Marques-Silva

Published 2024-05-14, Version 1

The uses of machine learning (ML) have snowballed in recent years. In many cases, ML models are highly complex, and their operation is beyond the understanding of human decision-makers. Nevertheless, some uses of ML models involve high-stakes and safety-critical applications. Explainable artificial intelligence (XAI) aims to help human decision-makers understand the operation of such complex ML models, thus eliciting trust in their operation. Unfortunately, the majority of past XAI work is based on informal approaches that offer no guarantees of rigor. Unsurprisingly, there exists comprehensive experimental and theoretical evidence confirming that informal methods of XAI can provide human decision-makers with erroneous information. Logic-based XAI represents a rigorous approach to explainability; it is model-based and offers the strongest guarantees of rigor for computed explanations. However, a well-known drawback of logic-based XAI is the complexity of logic reasoning, especially for highly complex ML models. Recent work proposed distance-restricted explanations, i.e. explanations that are rigorous provided the distance to a given input is small enough. Distance-restricted explainability is tightly related to adversarial robustness, and it has been shown to scale for moderately complex ML models, but the number of inputs still represents a key limiting factor. This paper investigates novel algorithms for scaling up the performance of logic-based explainers when computing and enumerating ML model explanations for models with a large number of inputs.
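To make the connection between distance-restricted explanations and adversarial robustness concrete, the following is a minimal sketch of the standard deletion-based (linear search) extraction of a subset-minimal explanation, under illustrative assumptions: a toy Boolean classifier `kappa`, Hamming distance, and a brute-force robustness check standing in for the formal reasoning oracle (e.g. a SAT/MILP-based robustness tool) that logic-based explainers actually use. It is not the paper's algorithm, only an illustration of the underlying technique.

```python
# Sketch: distance-restricted abductive explanation (AXp) extraction,
# assuming a toy Boolean classifier and a brute-force robustness check
# in place of a formal robustness/reasoning oracle.
from itertools import combinations


def kappa(x):
    """Toy classifier (hypothetical): predicts 1 iff at least two of the
    first three Boolean features are set."""
    return int(x[0] + x[1] + x[2] >= 2)


def robust_within_distance(fixed, v, eps):
    """Brute-force check: does kappa keep the prediction kappa(v) for every
    point that agrees with v on the `fixed` features and lies within Hamming
    distance eps of v? In practice this query is delegated to an
    adversarial-robustness tool."""
    n, target = len(v), kappa(v)
    free = [i for i in range(n) if i not in fixed]
    for k in range(1, min(eps, len(free)) + 1):
        for idxs in combinations(free, k):
            x = list(v)
            for i in idxs:
                x[i] = 1 - x[i]          # flip a free Boolean feature
            if kappa(x) != target:
                return False             # counterexample inside the eps-ball
    return True


def distance_restricted_axp(v, eps):
    """Deletion-based (linear search) extraction of a subset-minimal set of
    features that, once fixed to their values in v, guarantees the prediction
    for all points within Hamming distance eps of v."""
    fixed = set(range(len(v)))           # start from the trivial explanation
    for i in range(len(v)):
        fixed.discard(i)                 # tentatively free feature i
        if not robust_within_distance(fixed, v, eps):
            fixed.add(i)                 # feature i is needed; put it back
    return sorted(fixed)


if __name__ == "__main__":
    v = (1, 1, 0, 0, 1)
    print(distance_restricted_axp(v, eps=2))   # -> [0, 1] for this toy model
```

Each iteration of the loop issues one robustness query, which is why the cost of such queries, and their dependence on the number of inputs, is the bottleneck the paper targets.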

Related articles:
arXiv:2104.01303 [cs.LG] (Published 2021-04-03)
Tight Compression: Compressing CNN Through Fine-Grained Pruning and Weight Permutation for Efficient Implementation
arXiv:2105.01196 [cs.LG] (Published 2021-05-03)
EBIC.JL -- an Efficient Implementation of Evolutionary Biclustering Algorithm in Julia
arXiv:1705.02583 [cs.LG] (Published 2017-05-07)
A Design Methodology for Efficient Implementation of Deconvolutional Neural Networks on an FPGA