The largest knowledge base of Dutch universities of applied sciences (HBO)

Inspiration in your field

Freely accessible

Identifying XAI User Needs

Gaps between Literature and Use Cases in the Financial Sector

Abstract

One aspect of the responsible application of Artificial Intelligence (AI) is ensuring that the operation and outputs of an AI system are understandable for non-technical users, who need to consider its recommendations in their decision making. The importance of explainable AI (XAI) is widely acknowledged; however, its practical implementation is not straightforward. In particular, it is still unclear what non-technical users require from explanations, i.e. what makes an explanation meaningful. In this paper, we synthesize insights on meaningful explanations from a literature study and two use cases in the financial sector. We identified 30 components of meaningfulness in the XAI literature. In addition, we report three themes associated with explanation needs that were central to the users in our use cases but are not prominently described in the literature: actionability, coherent narratives, and context. Our results highlight the importance of narrowing the gap between theoretical and applied responsible AI.

Organisation: Hogeschool Utrecht
Department: Kenniscentrum Digital Business & Media
Research group: Artificial Intelligence
Published in: Proceedings of the Workshops at the Third International Conference on Hybrid Human-Artificial Intelligence, RWTH Aachen University, Malmö, Sweden, Vol. 3, Issue: 3825, Pages: 221-227
Date: 2024-06-10
Type: Conference contribution
Language: English
