
Addressing the Hidden Infrastructure and Sustainability Costs of AI in Enterprise

SNIAVideo via YouTube

Overview

Join a detailed webinar featuring experts from NVIDIA, Intel, and Dell who explore the often-overlooked technical and infrastructure costs of implementing generative AI technologies. Delve into crucial enterprise considerations, including scalability challenges, the computational demands of large language model (LLM) inference, fabric requirements, and the sustainability impact of increased power consumption and cooling needs. Learn practical strategies for cost optimization by comparing on-premises and cloud deployments, and discover how to leverage pre-trained models for specific market domains. Through comprehensive discussions of AI infrastructure trends, silicon diversity, training methodologies, and both endpoint and edge inference, gain valuable insights into managing and reducing the environmental and financial impact of AI implementations.
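
To give a feel for the on-premises versus cloud comparison the panel discusses, here is a minimal back-of-envelope sketch. It is not taken from the webinar: the GPU rental rate, server price, power draw, PUE, and electricity price below are all illustrative assumptions you would replace with your own figures.

```python
# Hypothetical back-of-envelope comparison of LLM inference hosting costs.
# Every number here is an illustrative assumption, not data from the webinar.

def cloud_monthly_cost(gpu_hourly_rate: float, gpus: int,
                       hours_per_month: float = 730) -> float:
    """Cost of renting GPU instances by the hour, running continuously."""
    return gpu_hourly_rate * gpus * hours_per_month

def onprem_monthly_cost(server_capex: float, amortization_months: int,
                        power_kw: float, pue: float, electricity_per_kwh: float,
                        hours_per_month: float = 730) -> float:
    """Amortized hardware cost plus power and cooling (PUE-adjusted energy)."""
    capex = server_capex / amortization_months
    energy = power_kw * pue * hours_per_month * electricity_per_kwh
    return capex + energy

if __name__ == "__main__":
    cloud = cloud_monthly_cost(gpu_hourly_rate=2.50, gpus=8)
    onprem = onprem_monthly_cost(server_capex=250_000, amortization_months=36,
                                 power_kw=10.2, pue=1.4,
                                 electricity_per_kwh=0.12)
    print(f"Cloud (8 rented GPUs):   ${cloud:,.0f}/month")
    print(f"On-prem (8-GPU server):  ${onprem:,.0f}/month")
```

The sketch omits factors the webinar also raises, such as utilization, networking fabric, and staffing, but it shows why power and cooling (the PUE term) matter to the sustainability side of the comparison.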

Syllabus

Introduction
AI's Rapid Evolution
AI Infrastructure
Trends
Power Usage
Silicon Diversity
Training
Fine Tuning
RAG
David McIntyre
Rob for Fabric
AI-Optimized Ethernet Example
Wire it differently
Cost per bit
Summary
Q&A
Endpoint Inference
Edge Inference
Question of the Day
Conclusion

Taught by

SNIAVideo
