Using Ollama to Run Local LLMs on the Steam Deck - Performance Comparison with Raspberry Pi 5

Ian Wootten via YouTube

Classroom Contents


  1. Intro
  2. Installation
  3. Model Runs
  4. Conclusion
