Overview
Explore a 20-minute technical presentation that delves into VMware's internal AI services and their practical implementation within the organization. Learn how VMware leverages large language models (LLMs) for coding assistance, document search using Retrieval-Augmented Generation (RAG), and internal API development. Discover the company's Cloud Smart approach, which utilizes open-source LLMs trained on public cloud infrastructure to minimize environmental impact.

Gain insights into the VMware Automated Question Answering System (Wacqua), an advanced information retrieval system that streamlines document searching and question answering. Understand how VMware has scaled its GPU capacity to support increased developer demand, and learn about the company's comprehensive AI platform, which provides pooled GPU resources, development environments, and LLM APIs.

Examine critical aspects of data management, platform standardization, and the importance of collaboration between AI and infrastructure teams. Gain valuable insights on starting small with open-source models, identifying key performance indicators, and implementing AI solutions strategically to solve business problems effectively. Recorded live in Santa Clara, CA during AI Field Day 4, this presentation offers practical knowledge for organizations looking to implement private AI solutions.
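The RAG pattern mentioned above can be illustrated with a minimal sketch: retrieve the most relevant internal documents for a question, then prepend them as context in the prompt sent to an LLM. Everything below is a hypothetical toy example, not VMware's Wacqua implementation; the word-overlap scoring stands in for the vector-embedding search a production system would use.

```python
# Toy illustration of Retrieval-Augmented Generation (RAG):
# retrieve relevant documents, then build an augmented LLM prompt.
# Scoring is simple word overlap, standing in for embedding similarity.

def score(query: str, doc: str) -> int:
    """Count query words that also appear in the document (toy relevance)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the context-augmented prompt that would go to the LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical internal documents for demonstration only.
docs = [
    "vSphere supports GPU passthrough for AI workloads.",
    "Expense reports are filed through the HR portal.",
    "Open-source LLMs can be fine-tuned on internal data.",
]
print(build_prompt("Which LLMs can be fine-tuned?", docs))
```

In a real deployment, `retrieve` would query a vector database of embedded document chunks, and the assembled prompt would be sent to an LLM API; the grounding step is what lets the model answer from private documents rather than from its training data alone.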
Syllabus
Introduction
Use Cases
Cloud Smart
Improved Documentation Search
Automated Question Answering
Data Scientists and Software Developers
Conclusion
Taught by
Tech Field Day