Transformers: Text Classification for NLP Using BERT

via LinkedIn Learning

Overview

Learn about transformers, the go-to architecture for NLP and computer vision tasks, and how to apply BERT to text classification.

Syllabus

Introduction
  • Natural language processing with transformers
  • How to use the exercise files
1. NLP and Transformers
  • How transformers are used in NLP
  • Transformers in production
  • Transformers history
  • Challenge: BERT model sizes
  • Solution: BERT model sizes
2. BERT and Transfer Learning
  • Bias in BERT
  • How was BERT trained?
  • Transfer learning
3. Transformer Architecture and BERT
  • Transformer: Architecture overview
  • BERT model and tokenization
  • Positional encodings and segment embeddings
  • Tokenizers
  • Self-attention
  • Multi-head attention and feedforward network
4. Text Classification
  • BERT and text classification
  • The Datasets library
  • Overview of IMDb dataset
  • Using a tokenizer
  • Tiny IMDb
  • A training run
Conclusion
  • Additional training runs
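
The course's "BERT model and tokenization" and "Tokenizers" lessons cover how BERT splits words into subword pieces. As a rough illustration of the idea, here is a minimal sketch of WordPiece-style greedy longest-match tokenization (a toy vocabulary and simplified logic, not the actual Hugging Face implementation used in the course):

```python
# Toy WordPiece-style tokenizer: greedy longest-match-first, with "##"
# marking subword continuations, as BERT's tokenizer does.
def wordpiece_tokenize(word, vocab, unk_token="[UNK]"):
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        # Shrink the candidate substring until it matches a vocab entry.
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # continuation piece
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            # No piece matched: the whole word becomes the unknown token.
            return [unk_token]
        tokens.append(piece)
        start = end
    return tokens

# Tiny illustrative vocabulary (real BERT-base uses roughly 30,000 entries).
vocab = {"play", "##ing", "##ed", "transform", "##er", "##s"}

print(wordpiece_tokenize("playing", vocab))       # ['play', '##ing']
print(wordpiece_tokenize("transformers", vocab))  # ['transform', '##er', '##s']
```

In practice the course works with the Hugging Face tokenizer for BERT, which adds special tokens such as `[CLS]` and `[SEP]` and maps pieces to input IDs; this sketch only shows the subword-splitting step.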

Taught by

Jonathan Fernandes

Reviews

4.7 rating at LinkedIn Learning based on 456 ratings
