Embark on a journey through the intricate world of deep learning and neural networks. This course starts with a foundation in the history and basic concepts of neural networks, including perceptrons and multi-layer structures. As you progress, you'll explore the mechanics of training neural networks, covering activation functions and the backpropagation algorithm.
The course then advances to artificial neural networks and their real-world applications, drawing inspiration from the human brain's architecture. You'll gain practical insights into input and output layers, the sigmoid function, and key datasets like MNIST. Specialized topics such as feed-forward networks, backpropagation, and regularization techniques, including dropout strategies and batch normalization, are thoroughly covered.
You'll also be introduced to powerful frameworks like TensorFlow and Keras. The course concludes with an in-depth study of convolutional neural networks (CNNs), focusing on their applications and principles for image and video analysis.
This course is ideal for tech professionals and students with a basic understanding of programming and mathematics, particularly linear algebra, calculus, and basic probability.
Syllabus
- Course Introduction
- In this module, we will introduce the basic concepts of deep learning and neural networks. We will explore the history, fundamental structures like perceptrons, and the process of training neural networks. Additionally, we'll cover important concepts such as activation functions and representations.
- Artificial Neural Networks - Introduction
- In this module, we will delve into the intricacies of artificial neural networks. We'll explore how the human brain inspires these networks, the detailed workings of perceptrons, and the layers that constitute neural networks. Additionally, we'll cover the sigmoid function and how to work with the MNIST dataset.
- ANN - Feed-Forward Network
- In this module, we will focus on feed-forward networks, their operation modes, and the dimensions involved. We'll break down the pseudocode required for batch processing and introduce vectorized methods to optimize neural network training.
- Backpropagation
- In this module, we will dive deep into backpropagation, a crucial method for training neural networks. We'll introduce the loss function, break down the backpropagation process into multiple parts, and cover associated concepts such as the sigmoid function and stochastic gradient descent (SGD).
- Regularization
- In this module, we will cover regularization techniques to enhance neural network performance. We'll explore dropout methods and batch normalization, and introduce tools like TensorFlow and Keras that facilitate these techniques.
- Convolutional Neural Networks
- In this module, we will explore Convolutional Neural Networks (CNNs) and their applications. We'll discuss the ideas behind CNNs, analyze how they process image and video data, and implement essential operations like convolution, stride, padding, and pooling. We'll also cover combining networks for complex tasks.
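To give a flavor of the vectorized feed-forward pass covered in the feed-forward module, here is a minimal NumPy sketch (not the course's own code; the layer sizes, seed, and batch size are arbitrary illustrations). A whole batch is pushed through each layer with a single matrix multiply:

```python
import numpy as np

def sigmoid(z):
    # Squashes each input into (0, 1); used as the activation for every layer here.
    return 1.0 / (1.0 + np.exp(-z))

def feed_forward(X, weights, biases):
    """Propagate a batch X of shape (n_samples, n_features) through all layers."""
    a = X
    for W, b in zip(weights, biases):
        a = sigmoid(a @ W + b)  # one matrix multiply per layer, whole batch at once
    return a

# Tiny illustrative network: 4 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((3, 1))]
biases = [np.zeros(3), np.zeros(1)]
batch = rng.standard_normal((5, 4))  # batch of 5 samples
out = feed_forward(batch, weights, biases)
print(out.shape)  # (5, 1)
```

The key point is that no Python loop runs over individual samples; the batch dimension rides along in the matrix product.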
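The backpropagation module's core idea, applying the chain rule layer by layer and updating weights by gradient descent, can be sketched on a toy problem. This is a hand-rolled illustration with arbitrary choices (XOR data, a 2-4-1 network, mean squared error, full-batch updates rather than true mini-batch SGD), not the course's implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: learn XOR with a 2-4-1 network.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((2, 4)), np.zeros(4)
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)
lr = 1.0

losses = []
for _ in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # Backward pass: chain rule, using sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent update (full-batch here for simplicity).
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(losses[0], losses[-1])  # loss should shrink over training
```

Stochastic gradient descent, as covered in the module, differs only in computing these same gradients on small random mini-batches instead of the full dataset.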
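For the regularization module, here is a sketch of "inverted" dropout, the variant most frameworks use so that no rescaling is needed at test time. The function name, rate, and shapes are illustrative, not from the course:

```python
import numpy as np

def dropout(activations, rate, rng, training=True):
    """Inverted dropout: zero a fraction `rate` of units during training and
    rescale survivors by 1/(1 - rate) so the expected activation is unchanged."""
    if not training or rate == 0.0:
        return activations
    keep = 1.0 - rate
    mask = rng.random(activations.shape) < keep  # True = unit survives
    return activations * mask / keep

rng = np.random.default_rng(0)
a = np.ones((2, 10))
dropped = dropout(a, rate=0.5, rng=rng)
print(dropped)  # entries are either 0.0 (dropped) or 2.0 (kept and rescaled)
```

Batch normalization, also covered in this module, is a separate technique: it standardizes each layer's pre-activations across the mini-batch before applying a learned scale and shift.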
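The essential CNN operations named above can be sketched directly in NumPy. This is a deliberately naive illustration (frameworks implement "convolution" as cross-correlation, as done here, and use far faster kernels); the kernel and image are arbitrary examples:

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """2-D cross-correlation with stride and zero padding (the 'convolution'
    of deep learning frameworks)."""
    if padding:
        image = np.pad(image, padding)  # zero-pad all borders
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # slide kernel over the image
    return out

def max_pool(image, size=2):
    """Non-overlapping max pooling: keep the largest value in each block."""
    h, w = image.shape
    trimmed = image[:h - h % size, :w - w % size]
    blocks = trimmed.reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])          # 1x2 horizontal-difference kernel
feat = conv2d(img, edge)
print(feat.shape)                       # (4, 3)
print(max_pool(img))                    # [[ 5.  7.] [13. 15.]]
```

The output size formula `(input - kernel + 2*padding) / stride + 1` appears implicitly in `oh`/`ow`; stride shrinks the output, padding preserves it.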
Taught by
Packt - Course Instructors