
From Zero to SOTA: Fine-Tuning BERT Using Hugging Face for World-Class Training Performance

Data Science Festival via YouTube

Overview

Explore a comprehensive conference talk on fine-tuning BERT using Hugging Face for state-of-the-art training performance. Learn how Graphcore accelerates AI development through its Intelligence Processing Unit (IPU) hardware and easy-to-integrate examples. Discover the implementation and optimization of BERT-Large for IPU systems, showcasing industry-leading performance results. Follow a step-by-step demonstration of accessing IPUs via Spell's cloud MLOps platform, work through a BERT fine-tuning notebook tutorial using the SQuAD v1 dataset, and run an inference question-answering task with the Hugging Face inference API. Gain insights into training transformer models faster with the Hugging Face Optimum toolkit, and understand the significance of BERT in industries undergoing AI transformation, such as legal, banking and finance, and healthcare.
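
To give a sense of the final inference step described above, the following is a minimal local sketch using the transformers question-answering pipeline. The checkpoint name, question, and context are illustrative placeholders rather than the exact ones used in the talk, and the sketch assumes the transformers library is installed.

```python
# Minimal sketch of a question-answering inference step.
# Assumes the `transformers` library is installed; the checkpoint,
# question, and context are illustrative, not taken from the talk.
from transformers import pipeline

# A publicly available BERT-Large checkpoint already fine-tuned on SQuAD.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "Graphcore's Intelligence Processing Unit (IPU) is a processor designed "
    "for machine intelligence workloads such as training and inference of "
    "transformer models like BERT."
)

result = qa(question="What is the IPU designed for?", context=context)
print(result["answer"], round(result["score"], 3))
```

In the fine-tuning portion of the talk, the same Hugging Face workflow is driven through the Optimum toolkit rather than the stock Trainer, which is what lets the notebook target IPU hardware.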

Syllabus

From Zero to SOTA: How to fine-tune BERT using Hugging Face for world-class training performance

Taught by

Data Science Festival

