Prof. Dr. Christian Meske, together with Niklas Kühl from the Karlsruhe Institute of Technology (KIT) and Jodie Lobana from McMaster University (Hamilton, Canada), won the Paper-a-Thon at the International Conference on Information Systems (ICIS) 2019 (VHB rank: A) this week. The winning paper was titled "Do you comply with AI? - The impact of personalized explanations of learning algorithms on compliance behavior".
The paper treats machine learning algorithms as "black boxes" and addresses the fact that users, depending on their backgrounds (experience, education, etc.), hold different mental models of reality and thus of the algorithms they work with. This calls for personalizing explanations to the user's mental model ("Explainable AI"), which in turn influences trust, compliance behavior, and ultimately task performance. Initial results from a qualitative study support these assumptions, which will be tested quantitatively in a follow-up project.