Overview
Explore the critical role of machine learning observability in making models work effectively in real-world scenarios. Learn about common challenges in moving ML to production and discover essential techniques for monitoring and improving model performance. Dive into drift analysis, performance evaluation, data quality checks, and explainability methods. Gain insight into how to quickly visualize, detect, and diagnose problems with models in production, enabling faster issue resolution. Understand the four pillars of ML observability and their implementation, including performance tracing, data quality monitoring, and model explainability. Discover practical metrics such as KL Divergence and Earth Mover's Distance for measuring drift. By the end of this webinar, you will have the knowledge to ramp up an ML observability practice and enhance your modern ML infrastructure stack.
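To make the two drift metrics mentioned above concrete, here is a minimal Python sketch, assuming hypothetical feature samples and using only standard SciPy calls (scipy.stats.entropy for KL Divergence, scipy.stats.wasserstein_distance for Earth Mover's Distance); the data and bin count are illustrative and not part of the webinar's materials.

    import numpy as np
    from scipy.stats import entropy, wasserstein_distance

    # Hypothetical samples of one feature: training (reference) vs. production
    rng = np.random.default_rng(0)
    reference = rng.normal(loc=0.0, scale=1.0, size=10_000)
    production = rng.normal(loc=0.5, scale=1.2, size=10_000)  # shifted: drift

    # Earth Mover's Distance operates directly on the raw samples
    emd = wasserstein_distance(reference, production)

    # KL Divergence compares binned distributions; shared bin edges keep the
    # two histograms comparable, and a small epsilon avoids empty-bin zeros
    edges = np.histogram_bin_edges(np.concatenate([reference, production]), bins=50)
    p, _ = np.histogram(reference, bins=edges, density=True)
    q, _ = np.histogram(production, bins=edges, density=True)
    kl = entropy(p + 1e-10, q + 1e-10)  # entropy(p, q) computes KL(p || q)

    print(f"Earth Mover's Distance: {emd:.4f}")
    print(f"KL Divergence: {kl:.4f}")

In a monitoring setup, values like these would typically be computed per feature over a rolling window of production data and alerted on when they exceed a chosen threshold.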
Syllabus
Introduction
Pain Points
ML Monitoring
Four Pillars
Performance
Fast Actuals
Models without fast actuals
What is drift?
Why do we monitor for drift?
Metrics to measure drift
KL Divergence
Earth Mover's Distance
Monitors
Data Quality
Explainability
How to implement model explainability
SHAP values
Example
Questions
Arize Platform
Performance Tracing
Integrations
Model Drift
Performance Trace
Drift Tab
Dashboards
Monitoring
Taught by
Open Data Science