Linear Algebra Archive

Singular Value Decomposition Explained

In this post, we build an understanding of the singular value decomposition (SVD), a way to decompose a matrix into constituent parts. It is a more general form of the eigendecomposition. While the eigendecomposition is…
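As a quick sketch of the idea in the excerpt, here is a minimal NumPy example (an illustration, not code from the post itself) that decomposes a matrix and reconstructs it from its three factors:

```python
import numpy as np

# Decompose A into U (left singular vectors), S (singular values),
# and Vt (right singular vectors, transposed): A = U @ diag(S) @ Vt.
A = np.array([[3.0, 1.0], [1.0, 3.0]])
U, S, Vt = np.linalg.svd(A)

# Multiplying the three factors back together recovers the original matrix.
A_reconstructed = U @ np.diag(S) @ Vt
print(np.allclose(A, A_reconstructed))  # True
```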

Linear Algebra for Machine Learning and Data Science

This series of blog posts aims to introduce and explain the most important mathematical concepts from linear algebra for machine learning. If you understand the contents of this series, you have all the linear algebra you’ll need to understand deep neural networks and statistical machine learning algorithms on a technical level. Most of my…

Eigenvalue Decomposition Explained

In this post, we learn how to decompose a matrix into its eigenvalues and eigenvectors, and we discuss the uses of the eigendecomposition. The eigenvalue decomposition, or eigendecomposition, is the process of decomposing a matrix into its eigenvectors and eigenvalues. We can also transform a matrix into an eigenbasis (the basis matrix where every…
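A minimal NumPy sketch of the decomposition the excerpt describes (an illustration, not code from the post): for a diagonalizable matrix, A factors as V diag(w) V⁻¹, where the columns of V are eigenvectors and w holds the eigenvalues.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
# Eigenvalues w and eigenvectors (columns of V) satisfy A @ V = V @ diag(w).
w, V = np.linalg.eig(A)

# For a diagonalizable matrix, A = V @ diag(w) @ V^{-1}.
A_reconstructed = V @ np.diag(w) @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```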

Understanding Eigenvectors in 10 Minutes

In this post, we explain the concept of eigenvectors and eigenvalues by going through an example. What are Eigenvectors and Eigenvalues? An eigenvector of a matrix A is a vector v that may change its length but not its direction when the matrix transformation is applied. In other words, applying a matrix transformation to…
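The defining property in the excerpt, A v = λ v, can be checked directly in a small NumPy example (an illustration, not from the post):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
v = np.array([1.0, 0.0])  # an eigenvector of A with eigenvalue 2

# Applying A scales v by its eigenvalue without changing its direction.
print(A @ v)                       # [2. 0.]
print(np.allclose(A @ v, 2 * v))   # True
```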

Gram-Schmidt Process: A Brief Explanation

The Gram-Schmidt process transforms a set of linearly independent vectors into a set of orthonormal vectors forming an orthonormal basis. It also allows us to check whether the vectors in a set are linearly independent. In this post, we understand how the Gram-Schmidt process works and learn how to use it…
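The process the excerpt summarizes can be sketched in a few lines of NumPy (a minimal illustration, not the post's own implementation): subtract from each vector its projections onto the basis vectors found so far, then normalize.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        # Remove the components of v that lie along the basis found so far.
        w = v - sum(np.dot(v, b) * b for b in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
print(np.allclose(Q @ Q.T, np.eye(2)))  # True: the rows are orthonormal
```

If the input vectors were linearly dependent, the residual w would vanish at some step, which is how the process doubles as an independence check.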

Understanding the Change of Basis Matrix

In this post, we learn how to construct a transformation matrix and apply it to transform vectors into another vector space. This process is also referred to as performing a change of basis. As discussed in the previous article on vector projections, a vector can be represented in a different basis than the basic…
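A small NumPy sketch of a change of basis (an illustration, not code from the post): if the columns of B are the new basis vectors expressed in the standard basis, then B⁻¹ converts standard coordinates into coordinates with respect to B.

```python
import numpy as np

# Columns of B are the new basis vectors, written in the standard basis.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 2.0])         # coordinates in the standard basis
v_new = np.linalg.inv(B) @ v     # coordinates in the basis B
print(v_new)                     # [1. 2.]
print(np.allclose(B @ v_new, v)) # True: B maps the new coordinates back
```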

Orthogonal Matrix: Definition and Example

In this post, we introduce orthonormal bases and orthogonal matrices and discuss their properties. An orthogonal matrix is a square matrix whose rows and columns are vectors that are orthogonal to each other and of unit length. We can also say that they form an orthonormal basis. Orthonormal Basis: A set of vectors V =…
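As a quick NumPy illustration of the properties in the excerpt (not code from the post): a rotation matrix is orthogonal, its transpose is its inverse, and it preserves vector lengths.

```python
import numpy as np

# A 2D rotation matrix is orthogonal: its rows and columns are orthonormal.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Q^T Q = I, so the transpose acts as the inverse.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Orthogonal transformations preserve lengths.
v = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))  # True
```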

What is a Transpose Matrix

In this short post, we learn how to obtain the transpose of a matrix and how to perform operations with a matrix transpose. The transpose of a matrix is obtained by flipping the original matrix over its diagonal. In other words, the rows of a matrix become the columns of…
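The flip-over-the-diagonal operation the excerpt describes, shown in a minimal NumPy example (an illustration, not from the post):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Rows become columns: a 2x3 matrix becomes 3x2.
print(A.T)
# [[1 4]
#  [2 5]
#  [3 6]]

# A useful identity: (A @ B).T == B.T @ A.T
B = np.array([[1, 0], [0, 1], [1, 1]])
print(np.array_equal((A @ B).T, B.T @ A.T))  # True
```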

Identity Matrix and Inverse Matrix

We introduce the inverse matrix and the identity matrix. In addition, we learn how to solve systems of linear equations using the inverse matrix. The identity matrix is a matrix in which the diagonal entries are 1 and all other entries are 0. It is a special case of the diagonal matrix. It…
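The system-solving use the excerpt mentions can be sketched with NumPy (an illustration, not the post's code): to solve A x = b, multiply both sides by A⁻¹, and note that A A⁻¹ gives the identity matrix.

```python
import numpy as np

# Solve A x = b via the inverse: x = A^{-1} b.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.inv(A) @ b
print(np.allclose(A @ x, b))  # True

# A times its inverse yields the identity matrix.
print(np.allclose(A @ np.linalg.inv(A), np.eye(2)))  # True
```

In practice `np.linalg.solve(A, b)` is preferred over forming the inverse explicitly, but the inverse makes the algebra visible.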

How to Perform Matrix Multiplication

In this post, we learn how to perform matrix multiplication and why we need to be mindful of matrix dimensions. Furthermore, we look at the properties of matrix multiplication. Matrix multiplication is an operation that consists of the element-wise multiplication of all entries in a row of the first matrix with all entries in…
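A short NumPy example of the dimension rule the excerpt points at (an illustration, not from the post): the inner dimensions must match, and the product's shape comes from the outer dimensions.

```python
import numpy as np

# (2x3) @ (3x2) -> (2x2): inner dimensions must agree.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[7, 8],
              [9, 10],
              [11, 12]])

# Each entry C[i, j] is the dot product of row i of A with column j of B.
C = A @ B
print(C)
# [[ 58  64]
#  [139 154]]
```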