Overview
Dive into the fundamentals of Hugging Face Transformers in this 30-minute video tutorial, the first episode of a practical coding guide series. Explore the basics of the Hugging Face Transformers library, including its purpose, functionality, and applications. Walk through high-level concepts, learn to navigate the Transformers documentation, and implement out-of-the-box functionality: install the library, use pre-defined pipelines, and run a model directly through PyTorch. Gain insight into tokenizers, token IDs, and attention masks, and learn to interpret model outputs. Perfect for those interested in Natural Language Processing (NLP) models like BERT and RoBERTa, this casual guide focuses on implementation rather than theory, providing a solid foundation for future episodes on retraining models for multi-label classification tasks.
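The out-of-the-box portion of the episode boils down to installing the library and calling a pre-defined pipeline. A minimal sketch of that workflow might look like the following; the example sentence and printed output are illustrative, not taken from the video:

```python
# Assumes the library is installed, e.g.: pip install transformers torch
from transformers import pipeline

# Load a pre-defined sentiment-analysis pipeline; with no model specified,
# Transformers falls back to a default checkpoint for the task.
classifier = pipeline("sentiment-analysis")

# One call handles tokenization, the forward pass, and post-processing.
result = classifier("Hugging Face Transformers makes NLP easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.9998}]
```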
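For the lower-level route through PyTorch, here is a hedged sketch of the tokenizer-to-output flow described above; the checkpoint name is an assumption for illustration, not necessarily the one used in the video:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative checkpoint; any BERT/RoBERTa-style sequence-classification
# model from the Hub works the same way.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# The tokenizer turns raw text into token IDs plus an attention mask;
# the mask marks real tokens (1) versus padding (0) in a batched input.
batch = tokenizer(
    ["I love this tutorial!", "This part is confusing."],
    padding=True,
    truncation=True,
    return_tensors="pt",  # return PyTorch tensors
)
print(batch["input_ids"])       # token IDs
print(batch["attention_mask"])  # attention masks

# The model's output object carries raw logits; softmax converts them
# into per-class probabilities.
with torch.no_grad():
    outputs = model(**batch)
print(torch.softmax(outputs.logits, dim=-1))
```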
Syllabus
Intro
What is Hugging Face's Transformers library?
Hugging Face models
Navigating the Transformers documentation
Coding with Transformers - installation
Using pre-defined pipelines
Implementing a model through PyTorch
Tokenizers, token IDs and attention masks
Output from the model
Outro
Taught by
rupert ai