Mathematics for Machine Learning
Indian Institute of Technology, Kharagpur and NPTEL via Swayam
Overview
ABOUT THE COURSE: This course will discuss the rich mathematical theory needed for developing efficient, accurate and robust machine learning algorithms. It will focus on selected advanced topics from linear algebra, calculus, optimization, probability theory and statistics, with strong linkage to machine learning. Applications of these topics will be introduced in ML with the help of some real-life examples.
INTENDED AUDIENCE: Undergraduate
PREREQUISITES: Basic mathematics
INDUSTRY SUPPORT: Any industry that practices AI
Syllabus
Week 1: Introduction to Theory of Learning: meaning of learning, overfitting, etc.
Week 2: Convex functions and sets, convex optimization, optimization problem formulations
Week 3: Gradient and sub-gradient descent for non-smooth functions
Week 4: Regularization, Lasso and Ridge, applications with medical data
Week 5: Accelerating gradient descent, stochastic gradient descent and its applications (NN)
Week 6: Support vector regression, logistic regression for dichotomous variables
Week 7: Maximum likelihood estimation (MLE) in Binomial, Multinomial and Gaussian models in the exponential family
Week 8: Maximum likelihood estimation (MLE) in Binomial, Multinomial and Gaussian models in the exponential family (contd.)
Week 9: Dimensionality reduction techniques
Week 10: Dynamical systems and control, Fourier transform and its applications
Week 11: Expectation Maximization (EM) based learning in mixture models, Hidden Markov Models, Dirichlet processes (clustering)
Week 12: Bayesian machine learning, estimating decisions using posterior distributions, model selection: variational inference
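As a taste of the Weeks 3–5 material (gradient descent with Ridge regularization), here is a minimal sketch, not taken from the course itself. The objective is f(w) = ||Xw − y||² / (2n) + λ||w||² / 2; all names (X, y, lam, lr) and the synthetic data are illustrative assumptions.

```python
import numpy as np

def ridge_gradient_descent(X, y, lam=0.01, lr=0.1, steps=500):
    """Plain gradient descent on the ridge-regularized least-squares loss.

    Gradient of ||Xw - y||^2 / (2n) + lam * ||w||^2 / 2 is
    X^T (Xw - y) / n + lam * w.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n + lam * w
        w -= lr * grad
    return w

# Illustrative synthetic problem: recover a known weight vector from noisy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)
w = ridge_gradient_descent(X, y)
```

Because the regularizer shrinks the solution toward zero, the recovered `w` lands close to, but not exactly at, `true_w`; making `lam` larger increases that shrinkage, which is the bias–variance trade-off covered in Week 4.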
Taught by
Prof. Debjani Chakraborty, Prof. Debashree Guha Adhya