=Paper=
{{Paper
|id=Vol-3733/invited4
|storemode=property
|title=Interactive Explanations for Contestable AI
|pdfUrl=https://ceur-ws.org/Vol-3733/invited4.pdf
|volume=Vol-3733
}}
==Interactive Explanations for Contestable AI==
Interactive Explanations for Contestable AI⋆
Francesca Toni
Imperial College London, UK
Abstract
AI has become pervasive in recent years, but state-of-the-art approaches mostly neglect the need for AI systems
to be contestable. Contestability is advocated by AI guidelines (e.g. by the OECD) and by regulation of automated
decision-making (e.g. the GDPR). Yet little attention has been paid in AI to how contestability requirements
can be met computationally. Contestability requires dynamic (human-machine or machine-machine) decision-making
processes, whereas much of the current AI landscape is tailored to static AI systems; accommodating contestability
will therefore require a radical rethinking. In this talk I will argue that computational forms of contestable AI
require forms of explainability whereby machines and humans can interact, and that computational argumentation can
support the interactive explainability needed for contestability.
CILC 2024: 39th Italian Conference on Computational Logic, June 26-28, 2024, Rome, Italy
⋆ This talk is sponsored by the Association for Logic Programming.
ft@ic.ac.uk (F. Toni)
ORCID: 0000-0001-8194-1459 (F. Toni)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (ceur-ws.org), ISSN 1613-0073