In this 3-day developer class on working with Large Language Models, we will show you how to use Transformers in Natural Language Processing and leverage the capabilities available on Hugging Face. You'll learn how transformer models work and their limitations, as well as how to fine-tune a pre-trained model using the Trainer API or Keras. We'll cover sharing models and tokenizers on the Hugging Face Hub and how to create your own dataset and perform a semantic search with FAISS using the Datasets library. Join us for an interactive 3-day journey into the world of Large Language Models with Hugging Face, and take your Natural Language Processing projects to the next level.

Prerequisites: To take this 3-day course, you should have taken our AI Workbench class or have basic knowledge of programming concepts and syntax in a language such as Python or JavaScript. General familiarity with APIs is also recommended.

COURSE OUTLINE

TRANSFORMER MODELS:
Natural Language Processing
Transformers, what can they do?
How do Transformers work?
Encoder models
Decoder models
Sequence-to-sequence models
Bias and limitations

USING TRANSFORMERS:
Behind the pipeline
Models
Tokenizers
Handling multiple sequences
Putting it all together

FINE-TUNING A PRE-TRAINED MODEL:
Processing the data
Fine-tuning a model with the Trainer API or Keras
A full training
Fine-tuning, Check!

SHARING MODELS AND TOKENIZERS:
The Hugging Face Hub
Using pre-trained models
Sharing pre-trained models
Building a model card
Part 1 completed!
End-of-chapter quiz

THE DATASETS LIBRARY:
What if my dataset isn't on the Hub?
Time to slice and dice
Big data? Datasets to the rescue!
Creating your own dataset
Semantic search with FAISS
Datasets, Check!
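To give a taste of the "Semantic search with FAISS" topic in the outline above, here is a minimal sketch of the core idea: embed documents as vectors, then rank them by cosine similarity against a query vector. This toy uses NumPy in place of a real FAISS index, and the hard-coded 2-D vectors are stand-ins for embeddings that a model would produce in the course.

```python
import numpy as np

def build_index(embeddings):
    # Normalize each row so dot products equal cosine similarity
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / norms

def search(index, query_vec, k=1):
    # Normalize the query, score every document, return the top-k ids
    q = query_vec / np.linalg.norm(query_vec)
    scores = index @ q
    top = np.argsort(-scores)[:k]
    return top, scores[top]

# Toy 2-D "embeddings" for three documents (hypothetical values;
# a sentence-embedding model would generate these in practice)
docs = np.array([
    [0.9, 0.1],   # doc 0: mostly about topic A
    [0.1, 0.9],   # doc 1: mostly about topic B
    [0.7, 0.3],   # doc 2: also about topic A
], dtype=np.float32)

index = build_index(docs)
ids, scores = search(index, np.array([1.0, 0.0], dtype=np.float32), k=2)
print(ids)  # the two documents closest to a "topic A" query
```

FAISS implements the same nearest-neighbor search at scale, with data structures that stay fast over millions of vectors; the course covers wiring it up through the Datasets library's `add_faiss_index` support.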
Taught by
ONLC Training Centers