Explore the final video in the Neural Machine Translation (NMT) series, which covers training and testing a model that recreates Google Translate. Learn about model parameters, the training process, and testing methodology. The video puts concepts from earlier installments into practice, including NLP models for sequential data, attention and self-attention mechanisms, the mT5 model, and the Hugging Face Transformers library. Watch the demo being recreated and observe real-time translation tests. A GitHub repository, Colab code, and additional resources are provided to deepen your understanding of transformer models and their applications in multilingual machine translation.
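The testing stage described above typically compares the model's translations against reference sentences with an automatic overlap metric. As a minimal sketch of that idea (the metric, function name, and sample sentence pairs below are illustrative, not taken from the video, which uses mT5 via Hugging Face):

```python
from collections import Counter

def token_f1(candidate: str, reference: str) -> float:
    """Token-overlap F1 between a model translation and a reference.

    A simplified stand-in for corpus metrics like BLEU: count how many
    tokens the candidate and reference share, then combine precision
    and recall into a single score in [0, 1].
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # shared tokens, with multiplicity
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical (candidate, reference) pairs for illustration only.
pairs = [
    ("the cat sat on the mat", "the cat sat on the mat"),
    ("a dog ran fast", "the dog ran quickly"),
]
for hyp, ref in pairs:
    print(f"{token_f1(hyp, ref):.2f}")
```

In practice one would score the model's generated translations over a held-out test set and report an aggregate metric such as BLEU rather than this toy per-sentence overlap.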
Overview
Syllabus
Intro
Parameters
Training
Testing
Results
Our App
Taught by
Edan Meyer