Overview
Learn how to implement Next Sentence Prediction (NSP), a crucial component of BERT model training, in this 37-minute tutorial video. Explore the process of using unstructured text to pre-train BERT models for improved language understanding in specific use cases, and discover how to apply NSP alongside Masked Language Modeling (MLM) to enhance BERT's performance. Follow along with the provided Jupyter Notebook and dataset to gain hands-on experience implementing NSP for BERT pre-training. Access additional resources, including a detailed Medium article and a discounted NLP course, to further expand your knowledge of BERT and transformer models in natural language processing.
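The core of NSP data preparation is turning a corpus into labeled sentence pairs: roughly half the pairs keep the true next sentence ("IsNext") and half substitute a random sentence ("NotNext"). The sketch below is not taken from the video's notebook; it is a minimal, self-contained illustration of that pair-construction step, using label 0 for IsNext and 1 for NotNext (the convention used by BERT's pre-training objective). The function name `make_nsp_pairs` is a hypothetical helper, not an API from any library.

```python
import random

def make_nsp_pairs(sentences, seed=0):
    """Build (sentence_a, sentence_b, label) triples for NSP.

    Hypothetical helper for illustration: label 0 means sentence_b
    truly follows sentence_a ("IsNext"); label 1 means sentence_b
    was drawn at random from the corpus ("NotNext").
    """
    rng = random.Random(seed)
    pairs = []
    for i in range(len(sentences) - 1):
        if rng.random() < 0.5:
            # ~50% of the time: keep the genuine next sentence
            pairs.append((sentences[i], sentences[i + 1], 0))
        else:
            # ~50% of the time: swap in a random non-consecutive sentence
            j = rng.randrange(len(sentences))
            while j == i + 1:
                j = rng.randrange(len(sentences))
            pairs.append((sentences[i], sentences[j], 1))
    return pairs
```

Each resulting pair would then be tokenized as `[CLS] sentence_a [SEP] sentence_b [SEP]` and fed to the model, with the NSP head classifying from the `[CLS]` position; MLM masking is applied to the same batch independently.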
Syllabus
Training BERT #4 - Train With Next Sentence Prediction (NSP)
Taught by
James Briggs