

Working with Gemma and Open LLMs on Google Kubernetes Engine

Linux Foundation via YouTube

Overview

In this hands-on workshop, explore the Gemma family of open models and learn how to fine-tune them on custom datasets for tasks such as text generation, translation, and summarization. Discover how running Gemma on Kubernetes combines open-model innovation with the scalability, reliability, and ease of management of a managed cluster. Through guided exercises, gain practical experience working with Gemma and fine-tuning it on a Kubernetes cluster. Finally, investigate options for serving Gemma on Kubernetes using hardware accelerators and open source serving tools, building your skills in deploying and managing large language models at scale.
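
To give a flavor of the kind of workflow the workshop covers, here is a minimal, illustrative Python sketch (not taken from the course materials) that loads a Gemma checkpoint with the Hugging Face Transformers library and runs a single generation; the model ID google/gemma-2b-it and the prompt are assumptions for illustration only.

```python
# Illustrative sketch only; not from the workshop.
# Assumes the Hugging Face Transformers library is installed and that you have
# access to the (assumed) "google/gemma-2b-it" checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2b-it"  # assumed instruction-tuned Gemma checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Simple text-generation call, e.g. for a summarization-style prompt.
prompt = "Summarize: Kubernetes schedules and scales containerized workloads."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In the workshop setting, a model loaded this way would typically be packaged into a container and run as a pod on Google Kubernetes Engine with a GPU or TPU accelerator attached, which is where Kubernetes' scheduling and scaling come into play.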

Syllabus

Workshop: Working with Gemma and Open LLMs on Google Kubernetes Engine - Abdel Sghiouar & Victor Dantas

Taught by

Linux Foundation

