
Gaussian Pre-Activations in Neural Networks: Myth or Reality?

Finnish Center for Artificial Intelligence FCAI via YouTube

Overview

Explore the intricacies of Gaussian pre-activations in neural networks in this 45-minute conference talk by Pierre Wolinski at the Finnish Center for Artificial Intelligence. Delve into the construction of activation functions and initialization distributions that keep pre-activations Gaussian at every depth, even in narrow neural networks. Examine a critical review of Edge of Chaos claims and a unified view of pre-activation propagation. Gain insights into information propagation in deep, narrow neural networks, comparing ReLU and tanh activation functions under Kaiming and Xavier initializations. Learn about the speaker's background in neural network pruning and Bayesian neural networks, and his current research on information propagation during initialization and training.
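For context, the Edge of Chaos analysis the talk critiques is built around a variance recurrence for the pre-activations. As a hedged reference point (this is the standard mean-field form from that literature, not transcribed from the talk's slides), with activation phi, weight variance sigma_w^2, and bias variance sigma_b^2, the pre-activation variance q^l at layer l evolves as

    q^{\ell} = \sigma_w^2 \,\mathbb{E}_{z \sim \mathcal{N}(0,\, q^{\ell-1})}\!\left[\phi(z)^2\right] + \sigma_b^2,

which presumes Gaussian pre-activations at every layer, an assumption that holds in the infinite-width limit but which the talk questions for narrow networks.

To see the question empirically, here is a minimal sketch (illustrative only, not the speaker's code): it propagates Gaussian inputs through a randomly initialized narrow fully connected network and tracks the excess kurtosis of one pre-activation unit per layer, which would be 0 if the pre-activations stayed exactly Gaussian. The width, depth, sample count, and function name are arbitrary choices of this sketch.

```python
# Minimal sketch: measure how Gaussian the pre-activations stay with depth
# in a narrow fully connected network at initialization (illustrative only).
import numpy as np
from scipy import stats

def preactivation_kurtosis(width, depth, activation, gain,
                           n_samples=10_000, seed=0):
    """Return the excess kurtosis (0 for an exact Gaussian) of one
    pre-activation unit at each layer of a random network."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, width))
    out = []
    for _ in range(depth):
        # Weights ~ N(0, gain / fan_in): gain=2 matches Kaiming (ReLU),
        # gain=1 matches Xavier when fan_in == fan_out (tanh).
        W = rng.standard_normal((width, width)) * np.sqrt(gain / width)
        pre = x @ W                      # pre-activations of this layer
        out.append(stats.kurtosis(pre[:, 0]))
        x = activation(pre)
    return out

relu = lambda z: np.maximum(z, 0.0)
print("ReLU + Kaiming:", preactivation_kurtosis(8, 20, relu, gain=2.0)[-3:])
print("tanh + Xavier: ", preactivation_kurtosis(8, 20, np.tanh, gain=1.0)[-3:])
```

With a narrow width (8 here), the kurtosis typically drifts away from 0 as depth grows, which is the kind of deviation from Gaussianity the talk investigates; widening the network pushes it back toward 0.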

Syllabus

Introduction
Scaling
Framework
Naive heuristic
Outline
Edge of Chaos
Recurrence Equation
Gaussian Pre-Activations
The Edge of Chaos
Experiments
Gaussian regulations
Assumption of Edge of Chaos
Preservation of variance
Solution
Summary
Constraints
Density
Activation functions
Numerical approximations
Training experiments
Training losses
Conclusion
Future work
Questions
Data patterns
Impossibility results
Cons
Training Loss

Taught by

Finnish Center for Artificial Intelligence FCAI

