Overview
Explore a 44-minute conference talk from SNIA SDC 2024 examining how flash storage technology is revolutionizing data ingestion for AI workloads. Dive into the challenges traditional object stores such as Amazon S3, Google Cloud Storage, and Azure Blob Storage face as AI deployments scale to production levels, using Meta's Tectonic-Shift platform as a case study. Learn about the growing demands of Deep Learning Recommendation Model (DLRM) training and how flash storage addresses its bandwidth and power requirements. Examine key findings from MLPerf DLRM preprocessing and training storage trace analysis, and understand the need for standardized benchmarks that measure data ingestion performance and power efficiency. Gain insights from Micron Technology experts on the evolving landscape of AI deployment, object store requirements, and the strategic role of flash storage in meeting these emerging challenges.
Syllabus
SNIA SDC 2024 - The Role of Flash in Data Ingestion within the AI Pipeline
Taught by
SNIAVideo