On The Role of Knowledge Graphs in Explainable AI

Freddy Lecue
CortAIx (Centre of Research & Technology in Artificial Intelligence eXpertise)
Montréal, Canada
freddy.lecue@inria.fr

Abstract. The current hype around Artificial Intelligence (AI) mostly refers to the success of machine learning and its sub-domain of deep learning. However, AI also encompasses other areas, such as Knowledge Representation and Reasoning, or Distributed AI, i.e., areas that need to be combined to reach the level of intelligence initially envisioned in the 1950s. Explainable AI (XAI) is now regarded as the core enabler for industry to apply AI in products at scale, particularly for industries operating critical systems. XAI can be approached not only from a Machine Learning perspective, but also from other AI research areas, such as AI Planning or Constraint Satisfaction and Search. We expose the XAI challenges of these AI fields, their existing approaches and limitations, and the great opportunities for Semantic Web Technologies and Knowledge Graphs to push the boundaries of XAI further.

Copyright © 2019 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).