

Transformer Models and BERT Model

via Pluralsight

Overview

This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how they are used to build the BERT model. You also learn about the tasks that BERT can be used for, such as text classification, question answering, and natural language inference.
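To give a flavor of the self-attention mechanism the course covers, here is a minimal sketch of scaled dot-product self-attention in NumPy. This is an illustrative toy, not the course's own material; the function name and the toy inputs are assumptions for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise token-to-token similarity
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted mix of value vectors

# Toy example: 3 tokens, each a 4-dimensional embedding.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
# In *self*-attention, queries, keys, and values all derive from the same input.
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # one contextualized vector per token
```

In a full Transformer, Q, K, and V would be separate learned linear projections of the input, and multiple attention heads would run in parallel; this sketch omits those details.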

This course is estimated to take approximately 45 minutes to complete.

Syllabus

  • Introduction 23 mins

Taught by

Pluralsight

