Run Local LLMs on Hardware from $50 to $50,000 - Testing and Comparison

Dave's Garage via YouTube

Overview

Explore the performance of local large language models across a diverse range of hardware in this comprehensive video comparison. Watch firsthand testing of Llama 3.1 and Llama 3.2 using Ollama on devices ranging from a budget-friendly Raspberry Pi 4 to a high-end Dell AI workstation. Learn how to install Ollama on Windows and observe the capabilities of various systems, including a Herk Orion Mini PC, a gaming rig with a Threadripper 3970X, an M2 Mac Pro, and a powerful workstation featuring a Threadripper 7995WX with NVIDIA RTX 6000 Ada GPUs. Discover which hardware configurations excel at handling different model sizes, up to the massive 405-billion-parameter model. Gain insights into the practical aspects of running LLMs locally and understand the trade-offs between cost and performance across this wide spectrum of computing devices.
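For readers who want to try this themselves, the workflow the video walks through can be sketched with Ollama's standard CLI. This is a minimal sketch, assuming Ollama is installed from ollama.com; the model tag used here is an example from Ollama's public library and may differ from the exact builds tested in the video.

```shell
#!/bin/sh
# Sketch of running a local LLM with Ollama (assumes Ollama is installed).
# The model tag is an example; smaller tags suit low-end hardware like a Pi.
MODEL="llama3.2:1b"

if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"                          # download the model weights
  ollama run "$MODEL" "Why is the sky blue?"    # one-shot prompt from the CLI
  STATUS="ran $MODEL"
else
  STATUS="ollama not installed"                 # fall back gracefully
fi
echo "$STATUS"
```

Larger models such as the 405-billion-parameter Llama 3.1 use the same commands but demand workstation-class memory and GPUs, which is exactly the cost/performance spread the video measures.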

Syllabus

Run Local LLMs on Hardware from $50 to $50,000 - We Test and Compare!

Taught by

Dave's Garage

