
P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance

GAIA via YouTube

Overview

Explore parameter-efficient tuning techniques for boosting Large Language Model (LLM) performance in this 25-minute conference talk from the 2023 GAIA Conference. Delve into the adaptation of p-tuning, a prompt-learning method, for low-resource language settings, with a focus on Swedish. Learn about an improved version of p-tuning implemented in NVIDIA NeMo that enables continuous multitask learning of virtual prompts. Gain insights from Zenodia Charpy, a senior deep learning data scientist at NVIDIA, as she shares her expertise in training and deploying very large language models for non-English and low-resource languages. Discover how these techniques can help solve real-world natural language tasks, improve performance on downstream NLP tasks, and strengthen the factual grounding of LLM responses.
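The core idea behind p-tuning, as the talk describes it, is to keep the pretrained model frozen and train only a small set of continuous "virtual prompt" embeddings that are prepended to the input. The sketch below illustrates that idea with a toy NumPy stand-in for the frozen model; all names, the toy task, and the hand-derived gradient are illustrative assumptions, not NeMo's actual API or training loop:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_virtual, n_tokens = 8, 4, 6

# Frozen "base model": an embedding table and a linear head standing in
# for a pretrained LLM. In p-tuning these weights are never updated.
embed = rng.normal(size=(100, d_model))
head = rng.normal(size=(d_model,))

# Trainable virtual prompt embeddings -- the only parameters p-tuning learns.
prompt = rng.normal(size=(n_virtual, d_model)) * 0.1

def forward(token_ids, prompt):
    # Prepend the virtual prompt vectors to the real token embeddings.
    x = np.concatenate([prompt, embed[token_ids]], axis=0)
    # Toy "model": mean-pool then linear head (a real setup would run a
    # frozen transformer over this sequence instead).
    return x.mean(axis=0) @ head

token_ids = np.array([1, 2, 3, 4, 5, 6])
target = 1.0  # toy regression target for illustration
lr = 0.5
n_total = n_virtual + n_tokens

for _ in range(300):
    err = forward(token_ids, prompt) - target
    # Gradient of the squared loss w.r.t. each prompt row:
    # d pred / d prompt[i] = head / n_total (mean-pool spreads it evenly).
    grad = np.tile(err * head / n_total, (n_virtual, 1))
    prompt -= lr * grad  # `embed` and `head` stay frozen throughout
```

After training, `prompt` encodes the task; swapping in a different set of virtual prompt rows retargets the same frozen model to a different task, which is what makes multitask prompt learning cheap.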

Syllabus

P-Tuning: A Parameter Efficient Tuning to Boost LLM Performance by Zenodia Charpy

Taught by

GAIA

