Ollama on Linux - Installing and Running LLMs on Your Server

Ian Wootten via YouTube

Classroom Contents


  1. Installation on DigitalOcean
  2. Running Llama2 on a Server
  3. Calling a Model Remotely
  4. Conclusion
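The "Calling a Model Remotely" step works against Ollama's HTTP API, which listens on port 11434 by default. As a minimal sketch (the server address is a placeholder — substitute your own server's IP, and note the server must be configured to accept non-local connections):

```python
import json
import urllib.request

# Placeholder address -- replace with your server's public IP or hostname.
OLLAMA_URL = "http://203.0.113.10:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # request one complete JSON reply instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# Sending the request requires a reachable Ollama server:
# with urllib.request.urlopen(build_request("llama2", "Why is the sky blue?")) as resp:
#     print(json.loads(resp.read())["response"])
```

The same request can be made with `curl` by POSTing the JSON body to the `/api/generate` URL.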
