Apply LIME to Explain, Trust, and Validate Your Predictions for Any ML Model

Prodramp via YouTube

YouTube videos curated by Class Central.

Classroom Contents

  1. 1 - Tutorial Introduction
  2. 2 - Why is LIME needed?
  3. 3 - Need for a surrogate model
  4. 4 - LIME Properties
  5. 5 - LIME is not Feature Importance
  6. 6 - Explaining image classification
  7. 7 - Another LIME based explanation
  8. 8 - Tabular data classification explanation
  9. 9 - Two types of explanations
  10. 10 - What is in the notebook exercises?
  11. 11 - 1st Original LIME explanation
  12. 12 - Loading Inception V3 model
  13. 13 - LIME library Installation
  14. 14 - LIME Explainer Module
  15. 15 - LIME Explanation Model Creation
  16. 16 - Creating superpixel Image
  17. 17 - Showing Pros and Cons in image
  18. 18 - Showing Pros and Cons with weights higher than 0.1 in image
  19. 19 - Analyzing 2nd Prediction
  20. 20 - LIME Custom Implementation
  21. 21 - Loading EfficientNet Model
  22. 22 - Loading LIME class from custom Implementation
  23. 23 - LIME Explanation Results
  24. 24 - Loading ResNet50 Model
  25. 25 - LIME Explanations
  26. 26 - Step by Step Custom Explanations
  27. 27 - Explanations Comparisons
  28. 28 - Saving Notebooks to GitHub
  29. 29 - Recap
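
Chapters 12-18 walk through the standard lime-library workflow on an image classifier. As a rough guide to what those chapters cover, here is a minimal sketch, assuming TensorFlow/Keras with bundled ImageNet weights, `pip install lime` (chapter 13), scikit-image, matplotlib, and a placeholder input file `cat.jpg`; the tutorial's own notebook will differ in its details:

```python
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.keras.applications.inception_v3 import (
    InceptionV3, preprocess_input, decode_predictions)
from lime import lime_image
from skimage.segmentation import mark_boundaries

model = InceptionV3(weights="imagenet")                      # chapter 12

# Inception V3 expects 299x299 inputs; "cat.jpg" is a stand-in filename.
img = tf.keras.preprocessing.image.load_img("cat.jpg", target_size=(299, 299))
img = np.array(img)

def predict_fn(images):
    """Batch of raw RGB images -> ImageNet class probabilities."""
    return model.predict(preprocess_input(images.astype(np.float64)))

print(decode_predictions(predict_fn(img[np.newaxis]), top=3))

explainer = lime_image.LimeImageExplainer()                  # chapter 14
explanation = explainer.explain_instance(                    # chapter 15
    img, predict_fn, top_labels=3, hide_color=0, num_samples=1000)

# Overlay the superpixels for (green) and against (red) the top label;
# min_weight=0.1 is the weight cutoff used in chapter 18.
temp, mask = explanation.get_image_and_mask(                 # chapters 16-18
    explanation.top_labels[0], positive_only=False,
    num_features=10, hide_rest=False, min_weight=0.1)
plt.imshow(mark_boundaries(temp / 255.0, mask))
plt.axis("off")
plt.show()
```

Chapter 19's look at the second prediction is the same `get_image_and_mask` call with `explanation.top_labels[1]`.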
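
Chapters 20-27 then rebuild LIME by hand and compare it with the library output across several models. The core of any such custom implementation is a short loop: segment the image into superpixels, randomly hide subsets of them, query the black-box model on each perturbed copy, and fit a locally weighted linear surrogate whose coefficients score the superpixels. A minimal sketch of that loop, assuming scikit-image and scikit-learn (the tutorial's own LIME class may be organized differently):

```python
import numpy as np
from skimage.segmentation import quickshift
from sklearn.linear_model import Ridge
from sklearn.metrics import pairwise_distances

def custom_lime(image, predict_fn, label, num_samples=1000, kernel_width=0.25):
    """Return (one weight per superpixel, segment map); positive weights
    support `label`, negative weights count against it."""
    segments = quickshift(image.astype(np.double), kernel_size=4,
                          max_dist=200, ratio=0.2)
    n_segments = np.unique(segments).size

    # Binary masks: each row randomly keeps (1) or blacks out (0) superpixels.
    masks = np.random.randint(0, 2, size=(num_samples, n_segments))
    masks[0, :] = 1                             # sample 0 = original image

    # Build the perturbed images and get the model's probability for `label`.
    batch = []
    for mask in masks:
        img = image.copy()
        img[np.isin(segments, np.where(mask == 0)[0])] = 0
        batch.append(img)
    probs = predict_fn(np.array(batch))[:, label]

    # Weight each sample by its closeness to the unperturbed image.
    distances = pairwise_distances(masks, masks[:1], metric="cosine").ravel()
    weights = np.sqrt(np.exp(-(distances ** 2) / kernel_width ** 2))

    # The interpretable surrogate: weighted ridge regression on the masks.
    surrogate = Ridge(alpha=1.0)
    surrogate.fit(masks, probs, sample_weight=weights)
    return surrogate.coef_, segments
```

Thresholding the returned coefficients (for example keeping only superpixels with |weight| > 0.1, as in chapter 18) reproduces the pros-and-cons overlays, and swapping in an EfficientNet or ResNet50 `predict_fn` gives the model comparisons of chapters 21-27.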
