Overview
Syllabus
Intro
Agenda
AI, Machine Learning, and Deep Learning
What is Deep Learning?
Implementing Deep Learning using Neural Networks
Inputs and Outputs in a Neural Network
Hidden Layer(s)
Weights and Biases
Calculating the Result of a Node (Forward Propagation)
Feeding the Result of a Node to an Activation Function
Categories of Activation Functions
Binary Step Function
Analogy
Use of Sigmoid Activation
Non-Linear Activation
Evaluating Performance
Cross Entropy
In Summary: Activation Function and Loss Function
Using an Optimizer
Back Propagation
A Walkthrough
Initializing the Weights
Significance of the Partial Differentials
Updating the Weights using Stochastic Gradient Descent
In Summary: Activation Function, Optimizer, and Loss Function
TensorFlow and Keras
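The pipeline the syllabus walks through (forward propagation through a node, a sigmoid activation, cross-entropy loss, backpropagation, and a stochastic gradient descent update) can be sketched for a single node in plain Python. This is a minimal illustration, not the talk's code; the network shape, input values, and learning rate are made up for the example.

```python
import math

def sigmoid(z):
    # Activation function: squashes any real value into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def node_output(inputs, weights, bias):
    # Forward propagation for one node: weighted sum of inputs plus bias,
    # with the result fed to the activation function.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Hypothetical tiny example: 2 inputs feeding 1 output node.
inputs = [0.5, 0.8]
weights = [0.2, -0.4]   # initial weights (normally initialized randomly)
bias = 0.1
target = 1.0            # desired output for this training example

# Forward pass
y = node_output(inputs, weights, bias)

# Binary cross-entropy loss: how far the prediction is from the target.
loss = -(target * math.log(y) + (1 - target) * math.log(1 - y))

# Backpropagation: for a sigmoid output with cross-entropy loss, the
# derivative of the loss w.r.t. the pre-activation z simplifies to (y - target).
dz = y - target
grad_w = [x * dz for x in inputs]   # partial derivatives w.r.t. each weight
grad_b = dz

# One stochastic gradient descent step: move each weight against its gradient.
lr = 0.5
weights = [w - lr * g for w, g in zip(weights, grad_w)]
bias = bias - lr * grad_b

# After the update, the same example should produce a lower loss.
y_after = node_output(inputs, weights, bias)
loss_after = -(target * math.log(y_after) + (1 - target) * math.log(1 - y_after))
```

In a real setting these steps are handled by a framework such as TensorFlow/Keras (the topic of the final syllabus item), which computes the gradients and applies the optimizer automatically.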
Taught by
NDC Conferences