

Interactive Explainable AI - Enhancing Trust and Decision-Making

Open Data Science via YouTube

Overview

Explore the world of Interactive Explainable AI in this 37-minute talk by Dr. Meg Kurdziolek. Dive into the importance of understanding AI decision-making processes, discover cutting-edge Explainable AI (XAI) techniques, and learn how human factors research influences trust in AI systems. Gain insights into practical interaction design strategies for enhancing XAI services, perfect for AI enthusiasts, data scientists, and machine learning professionals. The presentation covers topics such as the need for XAI, human factors in AI and XAI, historical context of explaining complex concepts, the user experience of XAI, and real-world examples of interactive XAI implementations. Conclude with valuable parting thoughts on the future of explainable and trustworthy AI technologies.

Syllabus

- Intro
- What do we need XAI for?
- Human factors of AI and XAI
- We’ve actually been explaining complex things for a long time
- The UX of XAI
- Examples of interactive XAI
- Parting Thoughts

Taught by

Open Data Science
