Exploring Accessible Explainable AI: Promising Avenues

Abstract

The concept of Accessible Explainable Artificial Intelligence (AXAI) addresses the need for inclusive XAI design, focusing on accessibility for individuals with disabilities. This paper explores potential avenues for promoting AXAI by examining audio description (AD) systems and sound-based XAI technologies, with a focus on their benefits for blind and partially sighted users. The study highlights promising research areas in human-centered computing and accessibility tools, using the NarrationBot+InfoBot system as a primary example. Additionally, it identifies sound-based XAI technologies that can make XAI accessible to persons with sight loss, discussing methods such as Cough-Local Interpretable Model-Agnostic Explanations (Cough-LIME) and audioLIME. The paper’s findings have the potential to inspire research directions for enhancing XAI accessibility and to motivate developments that could positively impact people with a wide range of disabilities.

Publication
Journal on Technology and Persons with Disabilities, 13, 350-366, California State University, Northridge