Overview
Explore the potential of on-device Large Language Models (LLMs) in this conference talk from Conf42 Prompt Engineering 2024. The talk begins with a foundational understanding of LLMs and their current limitations before turning to the advantages of on-device implementations. It examines live demonstrations and comparisons of state-of-the-art models, with particular attention to practical applications and Apple's integration of on-device LLM features. Learn about the development path for building minimum viable products with on-device LLMs, and gain insight into future applications that could reshape how AI is deployed locally. Through concrete examples and technical demonstrations, discover how on-device LLMs address privacy concerns, reduce latency, and enable offline functionality while retaining powerful language processing capabilities.
Syllabus
Introduction and Speaker Background
Why On-Device LLMs Matter
Understanding Large Language Models
Problems with Large Language Models
Benefits of On-Device LLMs
State of the Art LLMs: Demos and Comparisons
Real World Use Cases of On-Device LLMs
Apple Intelligence: On-Device LLM Features
Path to MVP: Future of On-Device LLMs
Conclusion and Future Applications
Taught by
Conf42