Natural Language Processing with Attention Models

DeepLearning.AI via Coursera

Overview

In Course 4 of the Natural Language Processing Specialization, you will:

  • Translate complete English sentences into Portuguese using an encoder-decoder attention model
  • Build a Transformer model to summarize text
  • Use T5 and BERT models to perform question-answering

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text.

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you have completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Syllabus

  • Neural Machine Translation
    • Discover some of the shortcomings of a traditional seq2seq model and how an attention mechanism addresses them, then build a Neural Machine Translation model with attention that translates English sentences into German (a minimal sketch of the attention computation follows this syllabus).
  • Text Summarization
    • Compare RNNs and other sequential models to the more modern Transformer architecture, then create a tool that generates text summaries.
  • Question Answering
    • Explore transfer learning with state-of-the-art models like T5 and BERT, then build a model that can answer questions.
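
For readers who want a concrete picture of the attention mechanism the syllabus refers to, here is a minimal NumPy sketch of scaled dot-product attention. It is illustrative only and not taken from the course materials; the function name, toy shapes, and random inputs are assumptions chosen for demonstration.

  # Minimal sketch of scaled dot-product attention (illustrative, not course code).
  import numpy as np

  def softmax(x, axis=-1):
      x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
      e = np.exp(x)
      return e / e.sum(axis=axis, keepdims=True)

  def scaled_dot_product_attention(Q, K, V):
      """Weight each value by how well its key matches the query."""
      d_k = Q.shape[-1]
      scores = Q @ K.T / np.sqrt(d_k)     # similarity between queries and keys
      weights = softmax(scores, axis=-1)  # attention distribution over positions
      return weights @ V, weights

  # Toy example: 2 decoder queries attending over 4 encoder positions.
  rng = np.random.default_rng(0)
  Q = rng.normal(size=(2, 8))
  K = rng.normal(size=(4, 8))
  V = rng.normal(size=(4, 8))
  context, attn = scaled_dot_product_attention(Q, K, V)
  print(context.shape, attn.shape)  # (2, 8) (2, 4)

Each row of the returned context is a weighted mix of the values, which is how an encoder-decoder model with attention lets the decoder look back at every source position rather than a single fixed summary vector.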

Taught by

Younes Bensouda Mourri, Łukasz Kaiser and Eddy Shyu

Reviews

4.4 rating at Coursera based on 1024 ratings
