Overview
Explore efficient storage, versioning, and distribution of AI/ML models using OCI Artifacts in this 35-minute DevConf.US 2024 conference talk. Learn how to overcome storage challenges associated with frequently updated and versioned AI/ML models, especially when distributing them to edge devices for inference. Discover techniques for breaking down AI models into atomic units, storing them in OCI registries, and reassembling them on target devices. Gain insights into leveraging OCI standards for data versioning, deduplication, and easy transfer, drawing from over a decade of development experience. Understand how to optimize model updates by distributing only the differences, reducing network traffic and storage requirements.
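The core idea behind distributing only the differences is content addressing: a model is split into fixed-size chunks, each chunk is identified by its digest (much like OCI blobs are addressed by SHA-256 digest), and an update only transfers chunks whose digests changed. The Python below is a minimal, illustrative sketch of that idea, not the talk's actual tooling; the 4 MiB chunk size, file names, and helper functions are assumptions. In practice, tools such as ORAS can push the resulting chunks to an OCI registry as blobs referenced from an artifact manifest.

```python
import hashlib
from pathlib import Path

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB "atomic units"; size chosen for illustration


def chunk_digests(path: Path, chunk_size: int = CHUNK_SIZE) -> list[str]:
    """Split a model file into fixed-size chunks and return their SHA-256 digests,
    mirroring how OCI registries address blobs by content digest."""
    digests = []
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digests.append("sha256:" + hashlib.sha256(chunk).hexdigest())
    return digests


def changed_chunks(old: list[str], new: list[str]) -> list[int]:
    """Indices of chunks whose digests differ between two model versions;
    only these chunks would need to travel over the network on an update."""
    differing = [i for i, (a, b) in enumerate(zip(old, new)) if a != b]
    appended = list(range(len(old), len(new)))  # chunks added in the new version
    return differing + appended


if __name__ == "__main__":
    # Hypothetical local model files representing two consecutive versions.
    v1 = chunk_digests(Path("model-v1.bin"))
    v2 = chunk_digests(Path("model-v2.bin"))
    delta = changed_chunks(v1, v2)
    print(f"{len(delta)} of {len(v2)} chunks need transfer for the update")
```

Because unchanged chunks keep the same digest, a registry or edge device that already holds them can skip the download entirely, which is where the deduplication and reduced network traffic mentioned above come from.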
Syllabus
Store AI/ML models efficiently with OCI Artifacts - DevConf.US 2024
Taught by
DevConf