Quantifying the Uncertainty in Model Predictions Using Conformal Prediction
Toronto Machine Learning Series (TMLS) via YouTube
Overview
Explore the concept of conformal prediction in this 33-minute conference talk from the Toronto Machine Learning Series. Learn how to quantify uncertainty in neural network predictions and generate alternative outputs when models are unsure. Discover the versatility, statistical rigor, and simplicity of conformal prediction as a method applicable to both classification and regression tasks. Gain insights into its three-step implementation process and understand how it can be applied to real-world use cases. Presented by Jesse Cresswell, Senior Machine Learning Scientist at Layer 6 AI, this talk provides valuable knowledge for addressing the challenge of overconfident wrong predictions in neural networks.
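To make the three-step process mentioned above concrete, here is a minimal sketch of split conformal prediction for classification. It is not taken from the talk itself; it assumes the standard split-conformal recipe (calibration scores, a corrected quantile, then prediction sets) and uses softmax probabilities as the nonconformity score.

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction for classification (illustrative sketch).

    Step 1: compute nonconformity scores on a held-out calibration set
            (here: 1 - softmax probability of the true class).
    Step 2: take the (1 - alpha) quantile of those scores, with the
            standard finite-sample correction.
    Step 3: for each test point, include every class whose score falls
            below the threshold, giving sets with ~(1 - alpha) coverage.
    """
    n = len(cal_labels)

    # Step 1: nonconformity score = 1 - probability assigned to the true label
    cal_scores = 1.0 - cal_probs[np.arange(n), cal_labels]

    # Step 2: finite-sample-corrected quantile of the calibration scores
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    q_hat = np.quantile(cal_scores, q_level, method="higher")

    # Step 3: prediction set = all classes whose probability clears the threshold
    return test_probs >= 1.0 - q_hat  # boolean mask of shape (n_test, n_classes)
```

When the model is confident and well calibrated, the returned sets contain a single class; when it is unsure, they grow to include alternative labels, which is the behavior the talk describes for handling overconfident wrong predictions.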
Syllabus
Quantifying the Uncertainty in Model Predictions
Taught by
Toronto Machine Learning Series (TMLS)