

Unlocking Local LLMs with Quantization

Linux Foundation via YouTube

Overview

Learn about quantization's evolution and its impact on local Large Language Models in this 40-minute conference talk from Hugging Face's Marc Sun. Explore the journey of quantization through influential papers like QLoRA and GPTQ, and discover its practical applications across different stages of model development. Gain insights into pre-training a 1.58-bit model, implementing fine-tuning techniques with PEFT + QLoRA, and optimizing inference performance using torch.compile or custom kernels. Understand how the open-source community is making quantized models more accessible through the transformers library and GGUF models from llama.cpp, enabling broader adoption of locally run LLMs.
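As a concrete illustration of the fine-tuning approach the talk names, the sketch below shows a typical QLoRA-style setup with transformers and PEFT: the base model is loaded in 4-bit NF4 precision via bitsandbytes and trainable LoRA adapters are attached on top. The checkpoint name, LoRA hyperparameters, and target modules are illustrative assumptions, not details taken from the talk.

```python
# Minimal QLoRA-style setup (sketch; model name and hyperparameters are assumed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Llama-2-7b-hf"  # assumed example checkpoint

# 4-bit NF4 quantization with double quantization, as in the QLoRA paper.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Freeze the quantized base weights and attach small trainable LoRA adapters.
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # assumed attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

Only the adapter weights are updated during training while the 4-bit base stays frozen, which is what keeps memory low enough for local fine-tuning. For the GGUF route mentioned in the overview, a quantized file exported by llama.cpp can be run locally with llama-cpp-python; the file path below is an assumption, and any quantized GGUF checkpoint would work.

```python
from llama_cpp import Llama

# Load a locally downloaded GGUF file (path is an assumed example).
llm = Llama(model_path="models/llama-2-7b.Q4_K_M.gguf", n_ctx=2048)
result = llm("Explain quantization in one sentence:", max_tokens=64)
print(result["choices"][0]["text"])
```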

Syllabus

Unlocking Local LLMs with Quantization - Marc Sun, Hugging Face

Taught by

Linux Foundation

