Overview
Natural Language Processing (NLP) is a subfield of linguistics, computer science, and artificial intelligence that uses algorithms to interpret and manipulate human language.
This technology is one of the most broadly applied areas of machine learning and is critical in effectively analyzing massive quantities of unstructured, text-heavy data. As AI continues to expand, so will the demand for professionals skilled at building models that analyze speech and language, uncover contextual patterns, and produce insights from text and audio.
By the end of this Specialization, you will be ready to design NLP applications that perform question answering and sentiment analysis, and to create tools that translate languages and summarize text. These and other NLP applications will be at the forefront of the coming transformation to an AI-powered future.
This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
Syllabus
Course 1: Natural Language Processing with Classification and Vector Spaces
- Offered by DeepLearning.AI. In Course 1 of the Natural Language Processing Specialization, you will perform sentiment analysis of tweets, use vector space models and PCA to explore relationships between words, and build a simple English-to-French translation algorithm with word embeddings and locality-sensitive hashing.
Course 2: Natural Language Processing with Probabilistic Models
- Offered by DeepLearning.AI. In Course 2 of the Natural Language Processing Specialization, you will create a simple auto-correct algorithm, apply the Viterbi algorithm for part-of-speech tagging, write an N-gram auto-complete model, and build your own Word2Vec model.
Course 3: Natural Language Processing with Sequence Models
- Offered by DeepLearning.AI. In Course 3 of the Natural Language Processing Specialization, you will train a neural network with word embeddings for sentiment analysis, generate text with a GRU language model, perform named entity recognition with LSTMs, and compare questions with Siamese LSTM models.
Course 4: Natural Language Processing with Attention Models
- Offered by DeepLearning.AI. In Course 4 of the Natural Language Processing Specialization, you will translate complete English sentences into Portuguese with an encoder-decoder attention model, summarize text with a Transformer, and use T5 and BERT models for question answering.
Courses
- In Course 1 of the Natural Language Processing Specialization, you will:
  a) Perform sentiment analysis of tweets using logistic regression and then naïve Bayes,
  b) Use vector space models to discover relationships between words, and use PCA to reduce the dimensionality of the vector space and visualize those relationships, and
  c) Write a simple English-to-French translation algorithm using pre-computed word embeddings and locality-sensitive hashing to relate words via approximate k-nearest neighbor search.
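To make the Course 1 objectives concrete, here is a minimal sketch of a bag-of-words sentiment classifier trained with logistic regression. It uses scikit-learn and a tiny invented tweet set purely for illustration; the course itself has you compute the word-frequency features and train the classifier from scratch.

```python
# Minimal sketch: tweet sentiment analysis with logistic regression.
# scikit-learn and the toy tweets below are illustrative choices, not the course's own code.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

tweets = ["I love this movie", "what a great day", "I hate waiting in line", "this is terrible"]
labels = [1, 1, 0, 0]  # 1 = positive sentiment, 0 = negative sentiment

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(tweets)       # bag-of-words feature matrix
clf = LogisticRegression().fit(X, labels)  # binary sentiment classifier

test = vectorizer.transform(["such a great movie"])
print(clf.predict(test))                   # likely [1] on this toy data
```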
- In Course 2 of the Natural Language Processing Specialization, you will:
  a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming,
  b) Apply the Viterbi algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics,
  c) Write a better auto-complete algorithm using an N-gram language model, and
  d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model.
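As an illustration of the auto-correct building block from Course 2, the sketch below computes minimum edit distance with dynamic programming. The cost values (insert 1, delete 1, substitute 2) are one common convention and not necessarily the exact ones used in the course.

```python
# Minimal sketch: minimum edit distance (Levenshtein) via dynamic programming.
def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1, sub_cost: int = 2) -> int:
    m, n = len(source), len(target)
    # D[i][j] = minimum cost of converting source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if source[i - 1] == target[j - 1] else sub_cost
            D[i][j] = min(D[i - 1][j] + del_cost,    # delete
                          D[i][j - 1] + ins_cost,    # insert
                          D[i - 1][j - 1] + cost)    # substitute or match
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4 (two substitutions at cost 2 each)
```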
- In Course 3 of the Natural Language Processing Specialization, you will:
  a) Train a neural network with word embeddings to perform sentiment analysis of tweets,
  b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model,
  c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and
  d) Use so-called 'Siamese' LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning.
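Here is a minimal sketch of the kind of embedding-plus-recurrent-layer sentiment model described in Course 3, written with TensorFlow/Keras for brevity; the course implements its models in the Trax library, and the vocabulary, sequence length, and layer sizes below are arbitrary placeholders.

```python
# Minimal sketch (TensorFlow/Keras assumed): word embeddings + a GRU for sentiment.
import tensorflow as tf

vocab_size, embed_dim, max_len = 10_000, 64, 40   # placeholder sizes, not the course's values

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,), dtype="int32"),   # padded sequences of token ids
    tf.keras.layers.Embedding(vocab_size, embed_dim),  # learnable word embeddings
    tf.keras.layers.GRU(64),                           # recurrent layer over the token sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),    # probability of positive sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```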
- In Course 4 of the Natural Language Processing Specialization, you will:
  a) Translate complete English sentences into Portuguese using an encoder-decoder attention model,
  b) Build a Transformer model to summarize text, and
  c) Use T5 and BERT models to perform question-answering.
  Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you have completed Course 3, Natural Language Processing with Sequence Models, before starting this course.
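To illustrate the Transformer-based question answering covered in Course 4, the sketch below runs a pretrained extractive-QA model through the Hugging Face transformers pipeline; the course itself builds and fine-tunes T5 and BERT-style models in Trax rather than using this API.

```python
# Minimal sketch: extractive question answering with a pretrained Transformer
# via the Hugging Face `transformers` pipeline (not the course's own Trax code).
from transformers import pipeline

qa = pipeline("question-answering")  # downloads a default pretrained QA model on first use

result = qa(
    question="Who co-authored the Transformer paper?",
    context="Lukasz Kaiser is a Staff Research Scientist at Google Brain and a "
            "co-author of the Tensor2Tensor and Trax libraries and the Transformer paper.",
)
print(result["answer"])  # expected: "Lukasz Kaiser"
```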
Taught by
Eddy Shyu, Younes Bensouda Mourri and Łukasz Kaiser