iSee: Intelligent Sharing of Explanation Experiences
Kyle Martin1,∗ , Anjana Wijekoon1 , Nirmalie Wiratunga1 , Chamath Palihawadana1 ,
Ikechukwu Nkisi-Orji1 , David Corsar1 , Belén Díaz-Agudo2 , Juan A. Recio-García2 ,
Marta Caro-Martínez2 , Derek Bridge3 , Preeja Pradeep3 , Anne Liret4 and
Bruno Fleisch4
1 Robert Gordon University, Aberdeen, Scotland
2 Universidad Complutense de Madrid, Madrid, Spain
3 University College Cork, Cork, Ireland
4 BT France, Paris, France


Abstract
The right to an explanation of the decision reached by a machine learning (ML) model is now an EU regulation. However, different system stakeholders may have different background knowledge, competencies and goals, and thus require different kinds of explanations. There is a growing armoury of XAI methods for interpreting ML models and explaining their predictions, recommendations and diagnoses. We refer to these collectively as "explanation strategies". As these explanation strategies mature, practitioners gain experience in understanding which strategies to deploy in different circumstances. What is lacking, and what the iSee project will address, is the science and technology for capturing, sharing and re-using explanation strategies based on similar user experiences, along with a much-needed route to explainable AI (XAI) compliance. Our vision is to improve every user's experience of AI by harnessing best practice in XAI and providing an interactive environment where personalised explanation experiences are accessible to everyone.
Video Link: https://youtu.be/81O6-q_yx0s

                                         Keywords
                                         Explainability, Case-Based Reasoning, Project Showcase




1. iSee System Overview
The iSee platform employs a Case-Based Reasoning (CBR) methodology to capture, store and recommend explanation experiences. When queried, the system draws on a case base of historical explanation experiences to suggest an appropriate explanation strategy (i.e. a combination of explainer algorithms designed to holistically satisfy a set of personas and their corresponding intents). Cases are formed from knowledge of the AI model and its user group (the problem component), the recommended explanation strategy (the solution component), and feedback from the user group describing whether the provisioned explanations were satisfactory (the outcome component). In this manner, cases represent a comprehensive record of an explanation experience.
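The three-part case composition described above can be sketched as a simple data structure. This is a minimal illustration only; the field names (`ai_model`, `user_group`, `strategy`, etc.) are hypothetical and do not reflect the project's actual schema, which is defined by the iSeeOnto ontology discussed below.

```python
from dataclasses import dataclass


@dataclass
class ExplanationExperience:
    """One iSee-style case with problem, solution and outcome components.

    Field names are illustrative, not the project's actual schema.
    """
    # Problem component: the AI model and its user group
    ai_model: str
    user_group: str
    intent: str
    # Solution component: the recommended explanation strategy,
    # an ordered combination of explainer algorithms
    strategy: list
    # Outcome component: feedback from the user group
    satisfactory: bool = False
    feedback: str = ""


# Example case: a clinician seeking to verify an image classifier's diagnosis
case = ExplanationExperience(
    ai_model="image classifier",
    user_group="clinician",
    intent="verify a diagnosis",
    strategy=["GradCAM", "nearest-neighbour example"],
)
```

The outcome component starts empty and is only filled in once the user group has evaluated the provisioned explanations, closing the loop between recommendation and feedback.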


ICCBR CBR Demos'22: Workshop on CBR Demos and Showcases. ICCBR-2022, September, 2022, Nancy, France
∗ Corresponding author.
Email: k.martin3@rgu.ac.uk (K. Martin)
ORCID: 0000-0003-0941-3111 (K. Martin)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073, http://ceur-ws.org





[Figure 1 depicts the iSee system architecture: a casebase of explanation experiences and a CBR engine (WP1, incorporating user models, interaction knowledge, evaluation criteria and the ontology); an interactive retrieval and adaptation stage built on the eXplanation Methods API and Interaction Engine (WP3); an evaluation cockpit built on the Evaluation Methods API, with revision and feedback loops toward XAI certification (WP4 & WP5); and a reasoning platform covering ML model execution, data storage, analytics, user HCI and reproducibility (WP2), connected to the use cases (WP5).]

Figure 1: iSee System Diagram


   We have developed novel knowledge bases and algorithms to fill the CBR knowledge containers. The case base knowledge container is filled with explanation experiences, where the above case composition has been applied to novel applications of XAI from the literature and from real-life use cases. We have created iSeeOnto, an ontology for the description of XAI systems, which fills the vocabulary container and describes AI models, how they are explained and how those explanations are evaluated. The similarity knowledge container is currently under development: there we are building novel similarity methods based on Many-Are-Called, Few-Are-Chosen (MAC/FAC) retrieval and on edit distance. Finally, the adaptation container will empower users to adapt their personalised explanation strategy, in the first instance through constructive adaptation during reuse, and later by helping them evaluate their explanation strategy with end users and modify it to meet their needs by revisiting earlier retrieval stages of the process.
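The two-stage MAC/FAC retrieval mentioned above can be illustrated with a small sketch: a cheap surface-feature filter (here, Jaccard overlap of case attributes) shortlists candidates, and a more expensive fine-grained measure (here, Levenshtein edit distance over strategy sequences) ranks them. The attribute and strategy representations are assumptions for illustration, not the similarity methods iSee is actually developing.

```python
def jaccard(a, b):
    """Cheap surface similarity for the MAC (filter) stage."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0


def edit_distance(s, t):
    """Levenshtein distance for the FAC (fine-grained) stage.

    Works on any indexable sequences, e.g. lists of explainer names.
    """
    m, n = len(s), len(t)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution
        prev = curr
    return prev[n]


def mac_fac_retrieve(query, casebase, k=5):
    """Two-stage retrieval: MAC shortlist, then FAC ranking."""
    # MAC: shortlist the k cases with the highest attribute overlap
    shortlist = sorted(casebase,
                       key=lambda c: -jaccard(query["attrs"], c["attrs"]))[:k]
    # FAC: return the shortlisted case whose strategy is closest by edit distance
    return min(shortlist,
               key=lambda c: edit_distance(query["strategy"], c["strategy"]))
```

The MAC stage keeps retrieval tractable as the casebase grows, while the FAC stage compares explanation strategies structurally rather than by surface features alone.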


2. Community Support
The iSee project aims to become a platform which facilitates the capture of explanation experiences and empowers the reuse of explanation strategies. To achieve this, iSee would benefit from community support in two areas:
   1. Provision of explainer algorithms to increase the available explanation strategies.
   2. Description of known use cases to expand explanation experience coverage.
In return, by providing these artefacts the XAI research community can expect wider impact and more convenient reusability of their contributions.


Funding and Acknowledgements
This research is funded by the iSee project (https://isee4xai.com), which received funding from EPSRC under grant number EP/V061755/1. The iSee project is supported by its use-case partners: BT, Bosch, Jiva, Total Energies and Automix. Special thanks to Malavika Suresh and Craig Pirie for their roles as Dr Armstrong and Dr McOlon respectively.


