Pre-Training BERT from Scratch - Building, Fine-Tuning, and Running Inference with KerasNLP
Discover AI via YouTube
Overview
Learn to build, pre-train, fine-tune, and deploy a BERT transformer model from scratch using TF2 and KerasNLP in this 30-minute tutorial video. The video walks through the complete workflow of implementing BERT for domain-specific datasets, with practical code at each stage: building the transformer architecture (0:00), pre-training the BERT model (16:30), fine-tuning it for specific tasks (20:41), and running inference on plain text (26:19). Along the way, you gain hands-on experience with the KerasNLP toolbox while developing a custom BERT model tailored to company- or domain-specific applications.
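The video's own code is not reproduced on this page. As a rough orientation, the "from scratch" stage amounts to stacking token-plus-position embeddings and Transformer encoder blocks; the sketch below is an illustrative minimal BERT-style encoder in plain TF2 Keras (layer sizes, names, and the `MiniBert` class are assumptions for illustration, not the video's code, which uses KerasNLP utilities).

```python
import tensorflow as tf


class MiniBert(tf.keras.Model):
    """A minimal BERT-style encoder: token + learned position embeddings,
    then a stack of self-attention + feed-forward blocks with residuals
    and layer normalization (post-norm, as in the original BERT)."""

    def __init__(self, vocab_size=30522, max_len=128, hidden=256,
                 heads=4, num_layers=2):
        super().__init__()
        self.tok_emb = tf.keras.layers.Embedding(vocab_size, hidden)
        self.pos_emb = tf.keras.layers.Embedding(max_len, hidden)
        self.attn = [tf.keras.layers.MultiHeadAttention(heads, hidden // heads)
                     for _ in range(num_layers)]
        self.ffn = [tf.keras.Sequential([
            tf.keras.layers.Dense(hidden * 4, activation="gelu"),
            tf.keras.layers.Dense(hidden)]) for _ in range(num_layers)]
        self.norm1 = [tf.keras.layers.LayerNormalization() for _ in range(num_layers)]
        self.norm2 = [tf.keras.layers.LayerNormalization() for _ in range(num_layers)]

    def call(self, token_ids):
        seq_len = tf.shape(token_ids)[1]
        # Sum token and position embeddings, as in BERT.
        x = self.tok_emb(token_ids) + self.pos_emb(tf.range(seq_len))
        for attn, ffn, n1, n2 in zip(self.attn, self.ffn, self.norm1, self.norm2):
            # Self-attention sub-layer: residual connection, then layer norm.
            x = n1(x + attn(x, x))
            # Position-wise feed-forward sub-layer, same pattern.
            x = n2(x + ffn(x))
        return x  # contextual embeddings, shape (batch, seq_len, hidden)


model = MiniBert()
out = model(tf.zeros((2, 16), dtype=tf.int32))  # shape (2, 16, 256)
```

In the tutorial, the equivalent pieces come from the KerasNLP toolbox rather than being hand-rolled layer by layer.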
Syllabus
Build a Transformer BERT from scratch
Code to pre-train a Transformer BERT model
Code to fine-tune a unique BERT model
Code BERT model inference on plain text
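The pre-training step in the syllabus uses BERT's masked-language-modeling objective: hide a fraction of input tokens and train the model to predict them. A minimal sketch of that objective in plain TF2 follows (the `mask_tokens`/`mlm_loss` helpers, the 15% rate, and the -100 "ignore" label are illustrative conventions, not the video's exact code):

```python
import tensorflow as tf

MASK_ID = 103  # [MASK] token id in the standard BERT WordPiece vocabulary


def mask_tokens(token_ids, mask_rate=0.15):
    """Randomly replace ~mask_rate of tokens with [MASK].
    Labels keep the original id at masked positions and -100
    (an 'ignore' marker) everywhere else."""
    mask = tf.random.uniform(tf.shape(token_ids)) < mask_rate
    inputs = tf.where(mask, tf.fill(tf.shape(token_ids), MASK_ID), token_ids)
    labels = tf.where(mask, token_ids, tf.fill(tf.shape(token_ids), -100))
    return inputs, labels


def mlm_loss(labels, logits):
    """Sparse cross-entropy averaged over masked positions only."""
    active = tf.cast(tf.not_equal(labels, -100), tf.float32)
    safe_labels = tf.maximum(labels, 0)  # keep -100 from indexing the vocab
    per_token = tf.keras.losses.sparse_categorical_crossentropy(
        safe_labels, logits, from_logits=True)
    return tf.reduce_sum(per_token * active) / tf.maximum(tf.reduce_sum(active), 1.0)
```

Fine-tuning then swaps this masked-LM head for a task head (e.g. a classifier over the pooled output), and inference runs the tokenizer plus the fine-tuned model over plain text; in the video both steps lean on KerasNLP's built-in preprocessing and task models.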
Taught by
Discover AI