

Deep Learning in NLP and Beyond - 2015

Center for Language & Speech Processing (CLSP), JHU via YouTube

Overview

Explore the frontiers of deep learning in natural language processing and beyond in this 57-minute lecture by Tomas Mikolov, a research scientist at Facebook AI Research. Gain insights into the success stories of advanced machine learning techniques in NLP, with a focus on recurrent neural networks. Discover the motivations driving researchers toward deep learning approaches and learn about novel ideas for future research aimed at building machines that can understand natural language and communicate with humans. Delve into neural network fundamentals, including neurons, activation functions, and hidden layers. Examine the application of recurrent neural networks to language modeling and extensions such as Long Short-Term Memory networks. Compare language-model performance on the Penn Treebank dataset and consider future directions for deep learning research in NLP. This talk, presented at the Center for Language & Speech Processing (CLSP) at Johns Hopkins University in 2015, offers valuable perspective on the evolving landscape of artificial intelligence and language understanding.
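
The central example in the lecture is the recurrent neural network language model. As a rough illustration of the recurrence it covers (a hidden state updated with a non-linear activation, followed by a softmax over the vocabulary), here is a minimal NumPy sketch; the sizes, initialization, and function names are illustrative assumptions, not taken from the talk itself.

```python
import numpy as np

# Minimal sketch of one step of an Elman-style RNN language model.
# All sizes and weights below are illustrative assumptions.
V, H = 10_000, 100          # vocabulary size, hidden-layer size
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(H, V))   # input (one-hot word) -> hidden
W_hh = rng.normal(scale=0.1, size=(H, H))   # previous hidden state -> hidden (the recurrence)
W_hy = rng.normal(scale=0.1, size=(V, H))   # hidden state -> scores over the vocabulary

def step(word_id, h_prev):
    """Read one word, update the hidden state, and predict the next word."""
    x = np.zeros(V)
    x[word_id] = 1.0                              # one-hot encoding of the current word
    h = np.tanh(W_xh @ x + W_hh @ h_prev)         # hidden layer with non-linear activation
    scores = W_hy @ h
    p_next = np.exp(scores - scores.max())
    p_next /= p_next.sum()                        # softmax: distribution over the next word
    return h, p_next

h = np.zeros(H)                                   # initial hidden state
h, p = step(42, h)                                # p is a probability distribution over the vocabulary
```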

Syllabus

Intro
Deep Learning in NLP and Beyond: Overview
Neural networks: motivation
Neuron (perceptron)
Activation function
Non-linearity: example
Hidden layer
Deep Learning in NLP: RNN language model
RNN Extensions: Long Short-Term Memory
Comparison on Penn Treebank
Future of Deep Learning Research for NLP
Conclusion

Taught by

Center for Language & Speech Processing (CLSP), JHU
