

Using Ollama to Run Local LLMs on the Steam Deck - Performance Comparison with Raspberry Pi 5

Ian Wootten via YouTube

Overview

Explore the process of installing and running Ollama, a tool for running large language models locally, on the Steam Deck gaming device. Learn how to set up Ollama, run several models, and compare the results against the Raspberry Pi 5. Gain insight into the Steam Deck's capabilities as a fully fledged PC beyond gaming. Follow along with the installation steps, watch the model run demonstrations, and see what portable gaming hardware can do when running AI models locally.
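For context, Ollama serves models through a simple CLI and local HTTP API, and the official Python client mirrors that API. Below is a minimal sketch of the kind of timed model run the video compares, assuming Ollama is already installed and serving locally and that a model has been pulled; the model name and timing approach are illustrative, not taken from the video.

```python
# Minimal sketch: send a prompt to a locally served Ollama model and time the
# response, roughly the kind of run compared between the Steam Deck and the
# Raspberry Pi 5. Assumes `pip install ollama`, a running Ollama server, and a
# model pulled beforehand (e.g. `ollama pull llama3.2`).
import time

import ollama

MODEL = "llama3.2"  # illustrative model name, not necessarily the one used in the video
PROMPT = "Explain in one sentence what the Steam Deck is."

start = time.perf_counter()
response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": PROMPT}],
)
elapsed = time.perf_counter() - start

print(response["message"]["content"])
print(f"Response time: {elapsed:.2f}s")
```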

Syllabus

Intro
Installation
Model Runs
Conclusion

Taught by

Ian Wootten

