Finding counterfactual explanations through constraint relaxations

Date
2022-02
Authors
Dev Gupta, Sharmi
Genç, Begüm
O'Sullivan, Barry
Publisher
AAAI
Abstract
Interactive constraint systems often suffer from infeasibility (no solution) due to conflicting user constraints. A common approach to recovering feasibility is to eliminate the constraints that cause the conflicts. This allows the system to provide an explanation of the form: "if the user is willing to drop some of their constraints, a solution exists". However, this form of explanation can be criticised as not very informative. A counterfactual explanation instead gives the user a basis for recovering feasibility by helping them understand which changes can be applied to their existing constraints rather than removing them. Counterfactual explanations have been studied extensively in machine learning, but require a more thorough investigation in the context of constraint satisfaction. We propose an iterative method based on conflict detection and maximal relaxations in over-constrained constraint satisfaction problems to help compute a counterfactual explanation.
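
The following is a minimal Python sketch, not the authors' implementation, of the iterative idea described in the abstract: when the user's constraints are infeasible, detect a conflict among them and relax bounds within it instead of deleting constraints, repeating until a solution exists. The toy variables (x, y), the three bounded constraints, and the one-unit relaxation step are hypothetical, and the conflict extraction shown is a simple deletion-based filter rather than the paper's algorithm.

```python
"""Illustrative sketch: relax conflicting constraints instead of removing them."""
from itertools import product

DOMAIN = range(0, 11)  # hypothetical finite domain for both variables

# Each user constraint: a bound, a predicate parameterised by that bound,
# and a relaxation step that loosens the bound by one unit.
constraints = {
    "budget": {"bound": 5, "pred": lambda b: lambda x, y: x + y <= b,
               "relax": lambda b: b + 1},
    "min_x":  {"bound": 4, "pred": lambda b: lambda x, y: x >= b,
               "relax": lambda b: b - 1},
    "min_y":  {"bound": 3, "pred": lambda b: lambda x, y: y >= b,
               "relax": lambda b: b - 1},
}

def feasible(names):
    """Brute-force check whether the named constraints admit a solution."""
    preds = [constraints[n]["pred"](constraints[n]["bound"]) for n in names]
    return any(all(p(x, y) for p in preds) for x, y in product(DOMAIN, DOMAIN))

def minimal_conflict(names):
    """Deletion-based extraction of one minimal infeasible subset."""
    core = list(names)
    for n in list(core):
        if not feasible([c for c in core if c != n]):
            core.remove(n)  # conflict persists without n, so n is not needed
    return core

def counterfactual_explanation():
    """Iteratively relax one constraint per detected conflict until feasible."""
    changes = []
    while not feasible(constraints):
        conflict = minimal_conflict(constraints)
        name = conflict[0]                       # relax rather than remove
        old = constraints[name]["bound"]
        constraints[name]["bound"] = constraints[name]["relax"](old)
        changes.append((name, old, constraints[name]["bound"]))
    return changes

if __name__ == "__main__":
    for name, old, new in counterfactual_explanation():
        print(f"change '{name}' from {old} to {new}")
```

On this toy instance the output reports the bound changes (here, relaxing the hypothetical "budget" constraint from 5 to 7), which is the counterfactual-style message: the user keeps all of their constraints but is told how to adjust them to restore feasibility.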
Keywords
Counterfactual explanation, Maximal relaxation, Constraint programming
Citation
Dev Gupta, S., Genc, B. and O'Sullivan, B. (2022) 'Finding counterfactual explanations through constraint relaxations', AAAI-22 - Thirty-Sixth AAAI Conference on Artificial Intelligence: Explainable Agency in Artificial Intelligence Workshop (EAAI'22), Online, 22 Feb - 01 Mar.