Applied Natural Language Processing

NPTEL-NOC IITM via YouTube

mod10lec81-Neural machine translation by jointly learning to align and translate

Classroom Contents

  1. Operations on a Corpus
  2. Probability and NLP
  3. Machine Translation
  4. Statistical Properties of Words - Part 01
  5. Statistical Properties of Words - Part 02
  6. Statistical Properties of Words - Part 03
  7. Vector Space Models for NLP
  8. Document Similarity - Demo, Inverted index, Exercise
  9. Contextual understanding of text
  10. Collocations, Dense word Vectors
  11. Query Processing
  12. Topic Modeling
  13. Introduction
  14. Sequence Learning
  15. Vector Representation of words
  16. Co-occurrence matrix, n-grams
  17. SVD, Dimensionality reduction, Demo
  18. Vector Space models
  19. Preprocessing
  20. Introduction to Probability in the context of NLP
  21. Joint and conditional probabilities, independence with examples
  22. The definition of a probabilistic language model
  23. Chain rule and Markov assumption
  24. Out-of-vocabulary words and curse of dimensionality
  25. Exercise
  26. Examples for word prediction
  27. Generative Models
  28. Bigram and Trigram Language models - peeking inside the model building
  29. Naive Bayes, classification
  30. Machine learning, perceptron, linearly separable
  31. Linear Models for Classification
  32. Biological Neural Network
  33. Perceptron
  34. Perceptron Learning
  35. Logical XOR
  36. Activation Functions
  37. Gradient Descent
  38. Feedforward and Backpropagation Neural Network
  39. Why Word2Vec?
  40. What are CBOW and Skip-Gram Models?
  41. One word learning architecture
  42. Forward pass for Word2Vec
  43. Matrix Operations Explained
  44. CBOW and Skip-Gram Models
  45. Binary tree, Hierarchical softmax
  46. Updating the weights using hierarchical softmax
  47. Sequence Learning and its applications
  48. ANN as an LM and its limitations
  49. Discussion on the results obtained from word2vec
  50. Recap and Introduction
  51. Mapping the output layer to Softmax
  52. Reduction of complexity - sub-sampling, negative sampling
  53. Building a Skip-gram model using Python
  54. GRU
  55. Truncated BPTT
  56. LSTM
  57. BPTT - Exploding and vanishing gradient
  58. BPTT - Derivatives for W, V and U
  59. BPTT - Forward Pass
  60. RNN-Based Language Model
  61. Unrolled RNN
  62. Introduction to Recurrent Neural Network
  63. IBM Model 2
  64. IBM Model 1
  65. Alignments again!
  66. Translation Model, Alignment Variables
  67. Noisy Channel Model, Bayes Rule, Language Model
  68. What is SMT?
  69. Introduction and Historical Approaches to Machine Translation
  70. BLEU Demo using NLTK and other metrics
  71. BLEU - "A Short Discussion of the Seminal Paper"
  72. Introduction to evaluation of Machine Translation
  73. Extraction of Phrases
  74. Introduction to Phrase-based translation
  75. Symmetrization of alignments
  76. Learning/estimating the phrase probabilities using another Symmetrization example
  77. mod10lec79-Recap and Connecting Bloom Taxonomy with Machine Learning
  78. mod10lec80-Introduction to Attention-based Translation
  79. mod10lec81-Neural machine translation by jointly learning to align and translate
  80. mod10lec82-Typical NMT architecture and models for multi-language translation
  81. mod10lec77-Encoder-Decoder model for Neural Machine Translation
  82. mod10lec78-RNN-Based Machine Translation
  83. mod10lec83-Beam Search
  84. mod10lec84-Variants of Gradient Descent
  85. mod11lec85-Introduction to Conversation Modeling
  86. mod11lec86-A few examples in Conversation Modeling
  87. mod11lec87-Some ideas to implement IR-based Conversation Modeling
  88. mod11lec88-Discussion of some ideas in Question Answering
  89. mod12lec89-Hyperspace Analogue to Language - HAL
  90. mod12lec90-Correlated Occurrence Analogue to Lexical Semantics - COALS
  91. mod12lec91-Global Vectors - GloVe
  92. mod12lec92-Evaluation of Word vectors
