YouTube

From Paper to Product - How We Implemented BERT

MLCon | Machine Learning Conference via YouTube

Overview

Explore the journey of implementing BERT, a cutting-edge natural language processing model, in a real-world product development scenario. Dive into the challenges, successes, and lessons learned as a team transforms theoretical concepts into a functional natural language generation application. Learn about the decision-making process behind choosing BERT, alternative approaches considered, and the intricacies of training a custom version of the network. Gain valuable insights into common pitfalls to avoid and unexpected discoveries made during the implementation process. This conference talk provides a comprehensive look at bridging the gap between academic research and practical application in the field of NLP, offering both technical details and strategic considerations for professionals working with advanced language models.

Syllabus

Introduction
About the talk
How did it start
What happens when the customer is waiting
Good old AI
Visionary
English
German
Product
Customer Feedback
Part of Speech Tagging
Architecture
Stanford NLP
Reinventing the wheel
What is BERT
What is supervised training
Unsupervised learning
Unsupervised training
Semisupervised training
Input structure
Transformer block
What is good
What is bad
How did we do it
BERT parameter tuning
Preprocessing
Word pieces
Morphemes vs morphs
Dictionary size
How does it look
What did we save
Training
Training Results
Future Plans
Backend
ZukaText
What's next
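To illustrate the "Word pieces" and "Dictionary size" items above: WordPiece tokenization, as used by BERT, splits each word into the longest subword units found in a fixed vocabulary, which keeps the dictionary small while still covering rare words. The sketch below is a minimal greedy longest-match implementation with a hypothetical toy vocabulary; it is not the speakers' actual code or dictionary.

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Split a word into the longest matching vocabulary pieces, left to right.
    Non-initial pieces carry the '##' continuation prefix, as in BERT."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while end > start:
            candidate = word[start:end]
            if start > 0:  # continuation pieces are prefixed with '##'
                candidate = "##" + candidate
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return [unk]  # no vocabulary piece matches: whole word is unknown
        pieces.append(piece)
        start = end
    return pieces

# Toy vocabulary for demonstration only
vocab = {"un", "##aff", "##able", "play", "##ing"}
print(wordpiece_tokenize("unaffable", vocab))  # ['un', '##aff', '##able']
print(wordpiece_tokenize("playing", vocab))    # ['play', '##ing']
```

Because out-of-vocabulary words decompose into known pieces rather than mapping to a single unknown token, a vocabulary of a few tens of thousands of pieces can cover an open-ended lexicon, which is particularly relevant for morphologically rich languages like German (the "Morphemes vs morphs" item above).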

Taught by

MLCon | Machine Learning Conference
