Explore transition-based parsing for NLP, covering shift-reduce methods, Stack LSTM, phrase structure models, and linearized trees. Gain insights into advanced parsing techniques.
Explore latent random variables in neural networks for NLP, covering variational autoencoders, discrete latent variables, and applications in language processing and text generation.
Explore model interpretation in neural networks for NLP, covering themes, analysis techniques, and evaluation methods for understanding and explaining neural models.
Explore structured prediction in NLP using local independence assumptions and conditional random fields. Learn about sequence labeling, globally normalized models, and advanced training techniques.
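Sequence-labeling models of the kind this lecture covers are typically decoded with the Viterbi algorithm. A minimal pure-Python sketch (the scores below are toy numbers for illustration, not course material):

```python
# Viterbi decoding for sequence labeling: find the highest-scoring tag
# sequence given per-token emission scores and tag-to-tag transition
# scores (log space, so scores add).

def viterbi(emissions, transitions):
    """emissions[t][j]: score of tag j at position t.
    transitions[i][j]: score of moving from tag i to tag j."""
    n_tags = len(emissions[0])
    scores = list(emissions[0])
    backptrs = []
    for emit in emissions[1:]:
        new_scores, ptrs = [], []
        for j in range(n_tags):
            best_i = max(range(n_tags),
                         key=lambda i: scores[i] + transitions[i][j])
            new_scores.append(scores[best_i] + transitions[best_i][j] + emit[j])
            ptrs.append(best_i)
        scores = new_scores
        backptrs.append(ptrs)
    # Trace the best path backwards from the highest-scoring final tag.
    best = max(range(n_tags), key=lambda j: scores[j])
    path = [best]
    for ptrs in reversed(backptrs):
        best = ptrs[best]
        path.append(best)
    return path[::-1]

# Two positions, two tags; emissions favor tag 0 then tag 1.
best_path = viterbi([[2, 0], [0, 2]], [[0, 0], [0, 0]])
```

Because transitions enter the score additively, a strongly negative transition can override the emissions, which is exactly the output-interaction modeling the lecture describes.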
Explore reinforcement learning in NLP, covering policy gradient, REINFORCE, value-based methods, and applications in dialogue and information retrieval.
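The REINFORCE update covered in this lecture can be illustrated on a toy two-armed bandit; the reward values, learning rate, and step count here are arbitrary assumptions chosen for the demo, not settings from the course:

```python
import math
import random

# REINFORCE on a two-armed bandit with a softmax policy over two logits.

def reinforce_bandit(rewards=(0.2, 0.8), steps=2000, lr=0.1, seed=0):
    rng = random.Random(seed)
    logits = [0.0, 0.0]
    for _ in range(steps):
        # Softmax policy probabilities.
        m = max(logits)
        exps = [math.exp(l - m) for l in logits]
        probs = [e / sum(exps) for e in exps]
        # Sample an action and observe its (deterministic) reward.
        action = 0 if rng.random() < probs[0] else 1
        reward = rewards[action]
        # Policy gradient: d log pi(action) / d logit_i = 1{i == action} - probs[i].
        for i in range(2):
            grad_log_pi = (1.0 if i == action else 0.0) - probs[i]
            logits[i] += lr * reward * grad_log_pi
    return logits

logits = reinforce_bandit()  # the higher-reward arm ends up preferred
```

The same sample-and-reweight structure carries over to dialogue and retrieval settings, where the "action" is a generated token or a retrieved document.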
Explore structured prediction in NLP, covering perceptron algorithms, max-margin objectives, and exposure bias remedies. Learn techniques for modeling output interactions and training globally normalized models.
Practical strategies for identifying and resolving issues in neural networks for NLP, covering training and test-time problems, model architecture, and data analysis.
Explore sentence representations and contextual word embeddings for NLP tasks. Learn about multi-task learning, transfer learning, and pre-training techniques to improve model performance.
Explore attention mechanisms in neural networks for NLP, covering basic concepts, improvements, specialized varieties, and a case study on the Transformer model.
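The basic attention computation underlying this lecture can be sketched in a few lines of pure Python: score each key against the query, normalize with softmax, and return the weighted sum of values. The vectors below are made-up toy numbers, purely for illustration:

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention for a single query vector."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# The query aligns with the first key, so the output is pulled toward
# the first value vector.
out = dot_product_attention([5, 0], [[1, 0], [0, 1]], [[10, 0], [0, 10]])
```

The Transformer case study in the lecture applies this same operation in parallel over many queries and multiple heads.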
Explore recurrent networks, LSTMs, and their applications in sentence modeling and NLP. Learn about vanishing gradients, pre-training, and the strengths and weaknesses of recurrent models.
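The vanishing-gradient problem mentioned here reduces to a one-line observation: backpropagating through T steps of a linear recurrence multiplies the gradient by the recurrent weight T times, so it shrinks or blows up exponentially in T. A scalar toy model (not an actual RNN implementation):

```python
# Gradient magnitude after backpropagating through `steps` identical
# linear recurrence steps with recurrent weight `weight`.

def repeated_backprop_grad(weight, steps):
    grad = 1.0
    for _ in range(steps):
        grad *= weight
    return grad

shrinking = repeated_backprop_grad(0.5, 10)   # decays toward 0
exploding = repeated_backprop_grad(1.5, 10)   # grows without bound
```

LSTM gating, covered in the lecture, is one remedy: the cell state carries gradients along an additive path instead of a purely multiplicative one.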
Explore convolutional neural networks for NLP, covering bag-of-words, context windows, sentence modeling, stacked and dilated convolutions, and structured convolution for sentence pairs.
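A single 1-D convolution filter over a sentence, as introduced in this lecture, concatenates the word vectors in each window, takes a dot product with the filter, and max-pools over positions. The token vectors and filter weights below are assumed toy values:

```python
def conv1d_max_pool(token_vectors, filt, width):
    """One convolution filter over a token sequence, followed by max-pooling."""
    responses = []
    for i in range(len(token_vectors) - width + 1):
        # Flatten the window of `width` consecutive token vectors.
        window = [x for vec in token_vectors[i:i + width] for x in vec]
        responses.append(sum(w * x for w, x in zip(filt, window)))
    return max(responses)

# Three 2-d token vectors, one width-2 filter (4 weights).
feature = conv1d_max_pool([[1, 0], [0, 1], [1, 1]], [1, 0, 0, 1], 2)
```

A real model applies many such filters and feeds the pooled features to a classifier; stacking and dilating the convolutions, as the lecture discusses, widens the receptive field.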
Explore word embeddings, their training methods, evaluation techniques, and advanced concepts in natural language processing. Learn to represent words numerically for machine learning applications.
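The core idea of this lecture, representing words as dense vectors so that similar words have similar vectors, can be shown with cosine similarity. The embedding values below are invented toy numbers, not trained ones:

```python
import math

# Hypothetical 3-d word vectors, chosen so related words point the same way.
embeddings = {
    "king":  [0.8, 0.65, 0.1],
    "queen": [0.75, 0.7, 0.15],
    "apple": [0.1, 0.05, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
```

Training methods such as skip-gram, covered in the lecture, learn vectors with this property automatically from co-occurrence statistics.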
Explore neural network techniques for knowledge graph construction, relation extraction, and embedding refinement in natural language processing.
Explore document-level NLP models, covering language modeling, topic modeling, coreference resolution, and discourse parsing. Learn advanced techniques for analyzing and processing longer texts.
Explore models of dialogue in neural networks, covering generation-based approaches, discourse-level coherence, diversity promotion, evaluation methods, and personality infusion in conversational AI.