Explaining Decision-Making Algorithms through UI - Strategies to Help Non-Expert Stakeholders
Association for Computing Machinery (ACM) via YouTube
Overview
Explore strategies for explaining decision-making algorithms through user interfaces to non-expert stakeholders in this 20-minute conference talk from the ACM CHI Conference on Human Factors in Computing Systems. Delve into research findings on design principles for explanation interfaces that effectively communicate algorithmic decision-making processes. Discover how interactive and "white-box" explanations can improve users' comprehension of algorithms, and examine the trade-offs between effectiveness and time investment. Gain insights into the surprising relationship between algorithm comprehension and user trust. Learn about the study's methodology, design process, and user demographics, and consider the implications for future research and development in the field of algorithmic transparency and explainability.
Syllabus
Introduction
Machine Learning Algorithms
Challenges
Explanation Matters
Our Goal
Explanation Strategy 1
Explanation Strategy 2
Summary
Methods
Design Process
User Profile
User Demographics
Results
Research Question 1
Trust
Why
Future
Thank you
Questions
Confidence
Taught by
ACM SIGCHI