
YouTube

Poisoned Pickles - Security Risks and Protections for Serialized ML Models

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore the security risks and protective measures associated with pickle serialization in machine learning during this 27-minute conference talk. Delve into the widespread use of the pickle module for serializing and distributing ML models, and understand the vulnerabilities that make it easy for attackers to inject arbitrary code into ML pipelines. Learn about the challenges in detecting poisoned pickles and discover emerging tools and techniques inspired by DevOps practices to generate safer, higher-quality pickles. Gain practical insights on how to protect your models from attacks and implement trust-or-discard processes to enhance the security of your ML workflows.
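The core risk behind the talk is that unpickling is code execution. As a minimal illustration (not taken from the talk itself), an object's __reduce__ hook lets an attacker embed an arbitrary command that runs the moment the pickle is loaded:

```python
import os
import pickle


class PoisonedPayload:
    """Illustrative malicious object; __reduce__ tells pickle how to rebuild it."""

    def __reduce__(self):
        # On load, pickle calls os.system("echo pwned") instead of
        # reconstructing a harmless object, so arbitrary commands run.
        return (os.system, ("echo pwned",))


malicious_bytes = pickle.dumps(PoisonedPayload())

# Merely loading the "model" executes the attacker's command.
pickle.loads(malicious_bytes)
```

The "trust-or-discard" idea mentioned in the overview can be sketched as: verify a digest obtained through a trusted channel before unpickling, and refuse to load anything that does not match. The helper below is an assumption for illustration, not the speaker's tooling:

```python
import hashlib
import pickle


def load_model_if_trusted(path: str, trusted_sha256: str):
    """Unpickle a model file only if its SHA-256 digest matches a trusted value."""
    with open(path, "rb") as f:
        data = f.read()
    if hashlib.sha256(data).hexdigest() != trusted_sha256:
        raise ValueError("Untrusted pickle: digest mismatch, discarding")
    return pickle.loads(data)
```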

Syllabus

Poisoned Pickles Make You Ill - Adrian Gonzalez-Martin, Seldon

Taught by

CNCF [Cloud Native Computing Foundation]

