⌨️ Question 39: What are ways to solve Exploding Gradients?
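One standard answer to this question is gradient clipping (alongside lower learning rates, careful weight initialization, and normalization layers). Below is a minimal PyTorch sketch of norm-based clipping; the model, dummy batch, and max_norm value are illustrative assumptions, not taken from the video.

```python
import torch
import torch.nn as nn

# Illustrative model, data, and hyperparameters (assumptions, not from the video).
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()
x, y = torch.randn(32, 10), torch.randn(32, 1)  # dummy batch

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()

# Rescale gradients so their global L2 norm is at most max_norm before the
# optimizer step; this bounds the update size even when gradients explode.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
optimizer.step()
```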
Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Deep Learning Interview Prep Course
- 1 ⌨️ Introduction
- 2 ⌨️ Question 1: What is Deep Learning?
- 3 ⌨️ Question 2: How does Deep Learning differ from traditional Machine Learning?
- 4 ⌨️ Question 3: What is a Neural Network?
- 5 ⌨️ Question 4: Explain the concept of a neuron in Deep Learning
- 6 ⌨️ Question 5: Explain the architecture of Neural Networks in a simple way
- 7 ⌨️ Question 6: What is an activation function in a Neural Network?
- 8 ⌨️ Question 7: Name a few popular activation functions and describe them
- 9 ⌨️ Question 8: What happens if you do not use any activation functions in a neural network?
- 10 ⌨️ Question 9: Describe how training of basic Neural Networks works
- 11 ⌨️ Question 10: What is Gradient Descent?
- 12 ⌨️ Question 11: What is the function of an optimizer in Deep Learning?
- 13 ⌨️ Question 12: What is backpropagation, and why is it important in Deep Learning?
- 14 ⌨️ Question 13: How is backpropagation different from gradient descent?
- 15 ⌨️ Question 14: Describe what the Vanishing Gradient Problem is and its impact on NNs
- 16 ⌨️ Question 15: Describe what the Exploding Gradients Problem is and its impact on NNs
- 17 ⌨️ Question 16: There is a neuron in the hidden layer that always results in an error. What could be the reason?
- 18 ⌨️ Question 17: What do you understand by a computational graph?
- 19 ⌨️ Question 18: What is a Loss Function and what are the various Loss Functions used in Deep Learning?
- 20 ⌨️ Question 19: What is the Cross-Entropy loss function and what is it called in industry?
- 21 ⌨️ Question 20: Why is Cross-entropy preferred as the cost function for multi-class classification problems?
- 22 ⌨️ Question 21: What is SGD and why is it used in training Neural Networks?
- 23 ⌨️ Question 22: Why does stochastic gradient descent oscillate towards local minima?
- 24 ⌨️ Question 23: How is GD different from SGD?
- 25 ⌨️ Question 24: How can optimization methods like gradient descent be improved? What is the role of the momentum term?
- 26 ⌨️ Question 25: Compare batch gradient descent, minibatch gradient descent, and stochastic gradient descent.
- 27 ⌨️ Question 26: How do you decide batch size in deep learning, considering both too-small and too-large sizes?
- 28 ⌨️ Question 27: Batch Size vs Model Performance: How does the batch size impact the performance of a deep learning model?
- 29 ⌨️ Question 28: What is the Hessian, and how can it be used for faster training? What are its disadvantages?
- 30 ⌨️ Question 29: What is RMSProp and how does it work?
- 31 ⌨️ Question 30: Discuss the concept of an adaptive learning rate and describe adaptive learning-rate methods
- 32 ⌨️ Question 31: What is Adam and why is it used most of the time in NNs?
- 33 ⌨️ Question 32: What is AdamW and why is it preferred over Adam?
- 34 ⌨️ Question 33: What is Batch Normalization and why is it used in NNs?
- 35 ⌨️ Question 34: What is Layer Normalization, and why is it used in NNs?
- 36 ⌨️ Question 35: What are Residual Connections and what is their function in NNs?
- 37 ⌨️ Question 36: What is Gradient Clipping and what is its impact on NNs?
- 38 ⌨️ Question 37: What is Xavier Initialization and why is it used in NNs?
- 39 ⌨️ Question 38: What are different ways to solve Vanishing Gradients?
- 40 ⌨️ Question 39: What are ways to solve Exploding Gradients?
- 41 ⌨️ Question 40: How does Overfitting in a Neural Network relate to large weights?
- 42 ⌨️ Question 41: What is Dropout and how does it work?
- 43 ⌨️ Question 42: How does Dropout prevent overfitting in NNs?
- 44 ⌨️ Question 43: Is Dropout like Random Forest?
- 45 ⌨️ Question 44: What is the impact of Dropout on training vs. testing?
- 46 ⌨️ Question 45: What are L2/L1 Regularizations and how do they prevent overfitting in NNs?
- 47 ⌨️ Question 46: What is the difference between L1 and L2 regularization in NNs?
- 48 ⌨️ Question 47: How do L1 vs. L2 Regularization impact the Weights in an NN?
- 49 ⌨️ Question 48: What is the curse of dimensionality in ML or AI?
- 50 ⌨️ Question 49: How do deep learning models tackle the curse of dimensionality?
- 51 ⌨️ Question 50: What are Generative Models? Give examples