Linear Algebra: Matrix Algebra, Determinants, & Eigenvectors
Johns Hopkins University via Coursera
Overview
This course is the second course in the Linear Algebra Specialization. In this course, we continue to develop the techniques and theory needed to study matrices as special linear transformations (functions) on vectors. In particular, we develop techniques to manipulate matrices algebraically, which will allow us to better analyze and solve systems of linear equations. Furthermore, the definitions and theorems presented in the course allow us to identify the properties of an invertible matrix and to identify the relevant subspaces of R^n.
We then focus on the geometry of matrix transformations by studying the eigenvalues and eigenvectors of matrices. These quantities are useful in both pure and applied mathematics, including data science, machine learning, artificial intelligence, and dynamical systems. We will see an application to Markov chains and the Google PageRank algorithm at the end of the course.
Syllabus
- Matrix Algebra
- In this module, we look at which arithmetic operations we can perform on n×m matrices and how these operations correspond to operations on functions. In particular, we will view matrix multiplication AB as the composition of functions A(B(x)). Viewed this way, algebraic properties like non-commutativity become more apparent. (A short sketch of multiplication as composition appears below the syllabus.) We will also look for those matrices that are invertible. Since we no longer have the Horizontal Line Test, new tests for invertibility will be needed. This leads to the study of a very important matrix invariant, the determinant.
- Subspaces
- In this module we investigate the structure of R^n by formally defining the notion of a subspace. These special sets look like smaller versions of R^n that pass through the origin. Each subspace has an invariant called its dimension, which captures a notion of size. The linear algebra definition of dimension, which uses the notion of linearly independent vectors, matches our intuition in low dimensions, where lines have dimension one and planes have dimension two. These sets, and their sizes, turn out to be another tool for studying matrices as functions, since both the null space and the image of a matrix are subspaces of R^n. (A short rank-nullity sketch appears below the syllabus.)
- Determinants
- The determinant is a real number calculated from a square matrix, and its value characterizes the invertibility of that matrix. The determinant also has a geometric meaning: its absolute value is the factor by which the matrix transformation scales volumes. In this module, we will show how to calculate the determinant of n×n matrices and study its properties. (A short determinant sketch appears below the syllabus.)
- Eigenvectors and Eigenvalues
- In this module we study special vectors, called eigenvectors, of a linear transformation defined by a square matrix A. These are vectors whose image is easily visualized, as they are simply scaled by a real number called the eigenvalue. While eigenvalues can be complex numbers, we do not consider that case in this course. Eigenvalues and eigenvectors are central to the theory of discrete dynamical systems, differential equations, and Markov chains, and the eigentheory presented here also appears in more advanced pure mathematics courses. (A short eigenvector sketch appears below the syllabus.)
- Diagonalization and Linear Transformations
- In this module we continue our study of eigenvalues and eigenvectors, in particular how they relate to diagonalizable matrices. Eigenvectors are important because they make linear transformations easy to understand: they are the "axes" (directions) along which a linear transformation acts simply by stretching/compressing and/or flipping, and the eigenvalues give the factors by which this stretching or compression occurs. The more directions along which you understand the behavior of a linear transformation, the easier the transformation is to understand, so you want as many linearly independent eigenvectors as possible associated with a single linear transformation. (A short diagonalization sketch appears below the syllabus.)
- Final Assessment
- Congratulations on reaching the final assessment! Review all vocabulary and theorems before attempting the final quiz below. Think about what each theorem says both algebraically and geometrically. Provide examples (with pictures in R^2 and R^3) along with counterexamples for each theorem and vocabulary term. Lastly, be sure to work through some computational examples, looking for shortcuts in the calculations when possible. In addition, there is an optional project that applies the theory of this course: you will see how eigenvalues and eigenvectors are applied to Markov chains and the Google PageRank algorithm. (A short Markov chain/PageRank sketch appears below the syllabus.) I strongly recommend you attempt this project. Good luck!
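The syllabus above mentions several concrete computations. The short sketches below illustrate a few of them in Python with NumPy (and SciPy in one case); the course itself does not use code, so these are informal illustrations rather than course material. First, matrix multiplication as composition of functions, and its non-commutativity:

```python
import numpy as np

# Minimal sketch (assuming NumPy): matrix multiplication as composition,
# i.e. (AB)x = A(B(x)), and the fact that AB and BA generally differ.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # a shear
B = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # rotation by 90 degrees
x = np.array([3.0, 4.0])

# Applying B first and then A equals applying the single matrix AB.
print(np.allclose(A @ (B @ x), (A @ B) @ x))   # True

# Matrix multiplication is not commutative in general.
print(np.allclose(A @ B, B @ A))               # False for these A and B
```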
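Next, a sketch of the two subspaces attached to a matrix, its null space and its image (column space), and how their dimensions add up to the number of columns (rank-nullity). The use of SciPy's null_space helper here is an illustrative choice, not something the course prescribes:

```python
import numpy as np
from scipy.linalg import null_space

# A rank-1 matrix mapping R^3 to R^2.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

rank = np.linalg.matrix_rank(A)   # dimension of the image (column space)
N = null_space(A)                 # orthonormal basis for the null space
nullity = N.shape[1]              # dimension of the null space

print(rank, nullity, rank + nullity)   # 1 2 3  (3 = number of columns)
```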
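A sketch of the determinant as an invertibility test and as the factor by which a matrix transformation scales areas/volumes:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.linalg.det(A))          # 5.0: nonzero, so A is invertible

# |det A| is the area of the image of the unit square under A: the
# parallelogram spanned by the columns of A has area |det A| = 5.

singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])
print(np.linalg.det(singular))   # 0.0 (up to rounding): not invertible
```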
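A sketch of eigenvalues and eigenvectors: nonzero vectors v with Av = λv, i.e. directions that the matrix only scales. The symmetric matrix is chosen so the eigenvalues are real, matching the course's restriction to the real case:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric, so its eigenvalues are real

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)               # 3 and 1 (order may vary)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each eigenvector is only stretched by its eigenvalue.
    print(np.allclose(A @ v, lam * v))   # True
```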
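A sketch of diagonalization: when A has a full set of linearly independent eigenvectors, A = P D P^{-1} with D diagonal, which makes powers of A cheap to compute:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])        # eigenvalues 5 and 2, so A is diagonalizable

eigenvalues, P = np.linalg.eig(A)  # columns of P are eigenvectors
D = np.diag(eigenvalues)
P_inv = np.linalg.inv(P)

print(np.allclose(A, P @ D @ P_inv))   # True: A = P D P^{-1}

# Powers of A reduce to powers of the diagonal matrix D.
print(np.allclose(np.linalg.matrix_power(A, 5),
                  P @ np.linalg.matrix_power(D, 5) @ P_inv))   # True
```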
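Finally, a sketch of the theme of the optional project: the steady state of a Markov chain found by repeatedly applying the transition matrix (power iteration), which is the core idea behind Google PageRank. The 3-page link matrix below is made up for illustration and is not the course's project data:

```python
import numpy as np

# Column-stochastic "link" matrix: column j holds the probabilities of
# moving from page j to each page i. This matrix is hypothetical.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

rank = np.array([1.0, 0.0, 0.0])   # start with all weight on page 0
for _ in range(100):
    rank = P @ rank                # power iteration: follow the chain

print(rank)                         # converges to about [1/3, 1/3, 1/3]
print(np.allclose(P @ rank, rank))  # True: the limit satisfies P r = 1 * r
```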
Taught by
Joseph W. Cutrone, PhD