Overview
Explore structured prediction with energy-based models in this comprehensive lecture by Yann LeCun. Delve into energy-based factor graphs, efficient inference techniques, and sequence labeling. Examine simple energy-based factor graphs with "shallow" factors and the Graph Transformer Net. Compare various loss functions and learn how the Viterbi and forward algorithms apply to Graph Transformer Networks. Investigate the Lagrangian formulation of backpropagation, neural ODEs, and variational inference for energy-based models. Gain insight into language models as graphs and discover how these concepts apply to real-world machine learning problems.
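The Viterbi decoding mentioned above can be sketched in a few lines. This is a minimal illustration, not the lecture's code: `unary[t][y]` is an assumed per-step energy for label `y`, `trans[yp][y]` an assumed transition energy, and inference returns the label sequence with the lowest total energy.

```python
# Minimal Viterbi sketch for sequence labeling with an energy-based
# factor graph. All tables and numbers below are illustrative.

def viterbi(unary, trans):
    T = len(unary)          # sequence length
    K = len(unary[0])       # number of labels
    # best[t][y]: lowest energy of any path ending in label y at step t
    best = [[0.0] * K for _ in range(T)]
    back = [[0] * K for _ in range(T)]   # backpointers
    best[0] = list(unary[0])
    for t in range(1, T):
        for y in range(K):
            # pick the predecessor label minimizing accumulated energy
            costs = [best[t - 1][yp] + trans[yp][y] for yp in range(K)]
            back[t][y] = min(range(K), key=costs.__getitem__)
            best[t][y] = costs[back[t][y]] + unary[t][y]
    # trace the minimum-energy path backward through the pointers
    y = min(range(K), key=best[-1].__getitem__)
    path = [y]
    for t in range(T - 1, 0, -1):
        y = back[t][y]
        path.append(y)
    return path[::-1], min(best[-1])
```

With a strong transition penalty between differing labels, the decoder prefers a consistent label sequence even when one unary term disagrees.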
Syllabus
– Week 14 – Lecture
– Structured Prediction, Energy-Based Factor Graphs, Sequence Labeling
– Efficient Inference for Energy-Based Factor Graphs and Some Simple Energy-Based Factor Graphs
– Graph Transformer Net
– Comparing Losses and the start of language models as graphs
– Forward algorithm in Graph Transformer Networks
– Lagrangian formulation of backprop and neural ODEs
– Variational Inference in terms of Energy
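The forward algorithm listed in the syllabus can be sketched in the same illustrative setting as assumed Viterbi-style `unary`/`trans` energy tables: instead of minimizing over paths, it log-sum-exp-combines the negated energies, yielding the free energy used in Graph Transformer Networks.

```python
import math

# Forward-algorithm sketch for an energy-based factor graph.
# Names and inputs are illustrative, not from the lecture.

def logsumexp(xs):
    # numerically stable log(sum(exp(x))) via max-shifting
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_free_energy(unary, trans, beta=1.0):
    K = len(unary[0])
    # alpha[y]: -beta * free energy over prefixes ending in label y
    alpha = [-beta * e for e in unary[0]]
    for u in unary[1:]:
        alpha = [logsumexp([alpha[yp] - beta * trans[yp][y]
                            for yp in range(K)]) - beta * u[y]
                 for y in range(K)]
    # free energy of the whole sequence: -(1/beta) log Z
    return -logsumexp(alpha) / beta
```

As the inverse temperature `beta` grows, the free energy approaches the minimum path energy that Viterbi decoding would return; at finite `beta` it is strictly below it.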
Taught by
Alfredo Canziani