Advancements in AI Inference Technology - Cerebras Systems' Wafer-Scale Chips

Weights & Biases via YouTube

Overview

Dive into a 53-minute podcast episode featuring Andrew Feldman, CEO of Cerebras Systems, as he discusses the latest advancements in AI inference technology with host Lukas Biewald. Explore innovations in wafer-scale chips that set new benchmarks in speed, accuracy, and cost efficiency for AI workloads. Gain insight into architectural innovations, real-world applications, and the balance between speed and accuracy in AI inference. Examine the challenges of overcoming latency, the future of AI in production environments, and competition with industry giants. Delve into discussions of open source versus closed source in AI development and the impact of AI on chip manufacturing. This comprehensive look at cutting-edge AI hardware and its implications for the future of machine learning is available on various podcast platforms.

Syllabus

- Introduction
- Cerebras Systems' Latest Product Announcement
- The Challenges of AI Inference
- Architectural Innovations in Wafer-Scale Chips
- Real-World Applications of AI Inference
- Speed vs. Accuracy: Striking the Balance
- Overcoming Latency Issues
- The Future of AI in Production Environments
- Competing with Industry Giants
- Open Source vs. Closed Source in AI Development
- The Impact of AI on Chip Manufacturing
- Final Thoughts and Takeaways

Taught by

Weights & Biases
