Overview
Explore the intricacies of loss functions in convolutional neural networks through this 42-minute video lecture. Delve into mean square error, classification problems, and cross-entropy. Understand the concepts of softmax, negative log likelihood, and log softmax. Examine a range of loss functions and how they are applied when training against identities. Learn about practical implementations in Torch, with references to the relevant criterion documentation. Gain valuable insights into CNN optimization techniques from instructor Alfredo Canziani's tutorial, part of the larger torch-Video-Tutorials project available on GitHub.
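For quick reference, the quantities named above have the following standard textbook definitions; this is a generic sketch, and the lecture's own notation may differ. For predictions \(\hat{y}\), targets \(y\), a score vector \(z\), and a target class \(c\):

\[
\mathrm{MSE}(\hat{y}, y) = \frac{1}{n} \sum_{i=1}^{n} (\hat{y}_i - y_i)^2
\]

\[
\operatorname{softmax}(z)_k = \frac{e^{z_k}}{\sum_j e^{z_j}},
\qquad
\log \operatorname{softmax}(z)_k = z_k - \log \sum_j e^{z_j}
\]

\[
\mathrm{NLL}(p, c) = -\log p_c,
\qquad
\mathrm{CE}(z, c) = -\log \operatorname{softmax}(z)_c
\]

so cross-entropy is negative log likelihood applied to the log-softmax of the raw scores.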
Syllabus
Introduction
Mean square error
Classification problems
Example
Cross-entropy
Softmax
Negative log likelihood
Log softmax
Loss functions
Training against identities
Implementation
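
The Implementation segment refers to Torch (Lua) criterion documentation. As a hedged illustration for readers following along in PyTorch rather than Lua Torch, the short sketch below exercises the analogous loss modules; the specific PyTorch names (nn.MSELoss, nn.LogSoftmax, nn.NLLLoss, nn.CrossEntropyLoss) are this note's assumption, not code taken from the lecture.

# Sketch only: the lecture uses Lua Torch criteria; these PyTorch modules
# are assumed modern analogues of the same loss functions.
import torch
import torch.nn as nn

# Regression: mean square error between predictions and targets.
mse = nn.MSELoss()
pred = torch.randn(4, 10)            # batch of 4 predictions, 10 values each
target = torch.randn(4, 10)
print("MSE:", mse(pred, target).item())

# Classification: raw scores -> log-probabilities -> negative log likelihood.
scores = torch.randn(4, 10)          # unnormalised class scores for 10 classes
labels = torch.randint(0, 10, (4,))  # one integer label per example
log_probs = nn.LogSoftmax(dim=1)(scores)
nll = nn.NLLLoss()
print("NLL:", nll(log_probs, labels).item())

# Cross-entropy fuses log-softmax and negative log likelihood in one criterion,
# so it takes the raw scores directly rather than log-probabilities.
ce = nn.CrossEntropyLoss()
print("CE :", ce(scores, labels).item())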
Taught by
Alfredo Canziani