Ollama on Linux - Installing and Running LLMs on Your Server

Ian Wootten via YouTube

Currently playing: Running Llama2 on a Server (2 of 4)

Classroom Contents

  1. Installation on DigitalOcean
  2. Running Llama2 on a Server
  3. Calling a Model Remotely (see the sketch below)
  4. Conclusion
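
The third chapter covers calling a model on the server from another machine, which goes through Ollama's HTTP API (it listens on port 11434 by default). As a rough illustration of that step, the Python sketch below assumes an Ollama server is already installed and reachable at a placeholder address ("your-droplet-ip" stands in for the actual DigitalOcean droplet), and that the llama2 model has already been pulled on the server.

    # Minimal sketch: querying a remote Ollama server over its HTTP API.
    # Assumes Ollama is running on the server and reachable on port 11434,
    # and that "ollama pull llama2" has already been run there.
    # "your-droplet-ip" is a placeholder for the server's real address.
    import json
    import urllib.request

    OLLAMA_URL = "http://your-droplet-ip:11434/api/generate"

    payload = json.dumps({
        "model": "llama2",
        "prompt": "Explain what Ollama is in one sentence.",
        "stream": False,  # return a single JSON object instead of a token stream
    }).encode("utf-8")

    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(request) as response:
        result = json.loads(response.read())

    print(result["response"])  # the model's completion text

Note that a default Ollama install only listens on 127.0.0.1, so reaching it from another machine generally means setting OLLAMA_HOST=0.0.0.0 on the server (or putting Ollama behind a reverse proxy); that server-side setup belongs to the installation and server-running chapters above.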
