
Applied Linear Algebra in AI and ML

Indian Institute of Technology, Kharagpur and NPTEL via Swayam

Overview

ABOUT THE COURSE: Linear algebra, optimization techniques and statistical methods together form the essential toolkit behind most algorithms in artificial intelligence and machine learning. This course builds a background in these mathematical foundations and prepares students for advanced study or research in AI and ML. Its objective is to familiarize students with the important concepts and computational techniques of linear algebra that are useful in AI and ML applications. What distinguishes it from existing courses on similar topics is the emphasis on illustrating how these concepts apply to concrete problems in AI and ML. Key topics include: least squares solutions; parameter estimation problems; the concept of a cost function and its relation to parameter estimation; constrained least squares; multi-objective least squares; applications to portfolio optimization; sparse solutions of underdetermined systems of linear equations; applications to dictionary learning; eigenvalue-eigenvector decomposition of square matrices; the spectral theorem for symmetric matrices; SVD; the multicollinearity problem and applications to principal component analysis (PCA) and dimensionality reduction; the power method and its application to Google's PageRank algorithm; the inverse eigenvalue problem; construction of Markov chains from a given stationary distribution; low rank approximation and the structured low rank approximation (SLRA) problem; autoregressive model order selection using Hankel SLRA; approximate GCD computation and its application to image de-blurring; tensors and CP tensor decomposition; tensor-decomposition-based sparse learning in deep networks; matrix completion problems; and applications to collaborative filtering.

INTENDED AUDIENCE: Senior undergraduate and postgraduate students from CSE, EE, ECE, AI and Maths.

PREREQUISITES: A first course in Engineering Mathematics with some exposure to linear algebra.
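
The computational flavour of the course can be previewed with a short least squares fit in NumPy. This is a minimal sketch for illustration only, not course material; the synthetic data, model and variable names below are assumptions.

    import numpy as np

    # Fit y ~ a*x + b by minimizing the least squares cost ||A theta - y||^2.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 50)
    y = 2.5 * x + 1.0 + rng.normal(scale=0.5, size=x.shape)   # noisy synthetic observations

    A = np.column_stack([x, np.ones_like(x)])                 # design matrix with an intercept column
    theta, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
    print("estimated slope and intercept:", theta)            # should be close to (2.5, 1.0)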

Syllabus

Week 1: Vectors, operations on vectors, vector spaces and subspaces, inner product and vector norm, linear dependence and independence, matrices, linear transformations, orthogonal matrices
Week 2: Systems of linear equations, existence and uniqueness, left and right inverses, pseudo-inverse, triangular systems
Week 3: LU decomposition and computational complexity, rotators and reflectors, QR decomposition, Gram-Schmidt orthogonalization
Week 4: Condition number of a square matrix, geometric interpretation, matrix norms, sensitivity analysis results for systems of linear equations
Week 5: Linear least squares, existence and uniqueness, geometrical interpretation, data fitting with least squares, feature engineering, application to vector autoregressive models, fitting with continuous and discontinuous piecewise linear functions
Week 6: Application of least squares to classification, two-class and multi-class least squares classifiers, polynomial classifiers, application to the MNIST data set
Week 7: Multi-objective least squares, applications to estimation and regularized inversion, regularized data fitting and application to image de-blurring, constrained least squares, application to portfolio optimization
Week 8: Eigenvalue-eigenvector decomposition of square matrices, spectral theorem for symmetric matrices
Week 9: SVD, relation to condition number, sensitivity analysis of least squares problems, variation in parameter estimates in regression
Week 10: Multicollinearity problem and applications to principal component analysis (PCA) and dimensionality reduction, power method, application to Google's PageRank algorithm (see the sketch after the syllabus)
Week 11: Underdetermined systems of linear equations, least norm solutions, sparse solutions, applications in dictionary learning and sparse code recovery, inverse eigenvalue problem, application in construction of Markov chains from a given stationary distribution
Week 12: Low rank approximation (LRA) and the structured low rank approximation (SLRA) problem, application to model order selection in time series, alternating projections for computing LRA and SLRA
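
The power method mentioned in Week 10 can be sketched in a few lines of NumPy. The 4-page link matrix and damping factor below are hypothetical and only illustrate the idea behind PageRank; they are not the course's own example.

    import numpy as np

    # Hypothetical 4-page web: column j holds the out-link probabilities of page j.
    P = np.array([
        [0.0, 0.5, 0.0, 0.0],
        [1.0, 0.0, 0.5, 0.0],
        [0.0, 0.5, 0.0, 1.0],
        [0.0, 0.0, 0.5, 0.0],
    ])
    d = 0.85                                    # damping factor in the PageRank model
    n = P.shape[0]
    G = d * P + (1 - d) / n * np.ones((n, n))   # "Google matrix", column-stochastic by construction

    r = np.ones(n) / n                          # start from the uniform distribution
    for _ in range(100):                        # power iteration: repeatedly apply G
        r_new = G @ r
        if np.linalg.norm(r_new - r, 1) < 1e-10:
            r = r_new
            break
        r = r_new
    print("PageRank vector:", r)

The iteration converges to the dominant eigenvector of G, which is the PageRank vector; the damping term keeps the remaining eigenvalues well separated from the dominant one, so convergence is fast.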

Taught by

Prof. Swanand Khare

