Latest Posts

See what's new

Linear Algebra for Machine Learning and Data Science

This series of blog posts aims to introduce and explain the most important mathematical concepts from linear algebra for machine learning. If you understand the contents of this series, you have all the linear algebra you’ll need to understand deep neural networks and statistical machine learning algorithms on a technical level. Most of my

Eigenvalue Decomposition Explained

In this post, we learn how to decompose a matrix into its eigenvalues and eigenvectors. We also discuss the uses of the eigendecomposition. The eigenvalue decomposition, or eigendecomposition, is the process of decomposing a matrix into its eigenvectors and eigenvalues. We can also transform a matrix into an eigenbasis (the basis matrix where every
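A quick NumPy sketch of the idea (the matrix A here is an illustrative example, not one taken from the post): decompose A, then reconstruct it from its eigenvectors and eigenvalues.

```python
import numpy as np

# A symmetric example matrix, so its eigendecomposition is real.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix V whose
# columns are the corresponding eigenvectors.
eigenvalues, V = np.linalg.eig(A)
Lambda = np.diag(eigenvalues)

# Reconstruct A from its eigendecomposition: A = V * Lambda * V^-1
A_reconstructed = V @ Lambda @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```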

Understanding Eigenvectors in 10 Minutes

In this post, we explain the concept of eigenvectors and eigenvalues by going through an example. What are Eigenvectors and Eigenvalues? An eigenvector of a matrix A is a vector v that may change its length but not its direction when a matrix transformation is applied. In other words, applying a matrix transformation to
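The defining property can be checked directly in NumPy (a minimal sketch with an example matrix of my own choosing): applying A to an eigenvector only rescales it.

```python
import numpy as np

# A diagonal example matrix: its eigenvectors are the coordinate axes.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([1.0, 0.0])  # eigenvector of A with eigenvalue 2

# Applying A scales v by its eigenvalue without changing its direction.
Av = A @ v
print(Av)  # [2. 0.]
print(np.allclose(Av, 2 * v))  # True: A v = lambda v with lambda = 2
```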

Gram-Schmidt Process: A Brief Explanation

The Gram-Schmidt process is used to transform a set of linearly independent vectors into a set of orthonormal vectors that form an orthonormal basis. It also allows us to check whether the vectors in a set are linearly independent. In this post, we understand how the Gram-Schmidt process works and learn how to use it
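The process itself fits in a few lines. This is a minimal sketch (the helper name `gram_schmidt` and the input vectors are my own; it assumes the inputs really are linearly independent, since a dependent vector would produce a zero norm):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors, one at a time."""
    basis = []
    for v in vectors:
        # Subtract v's projection onto every vector already in the basis...
        w = v - sum(np.dot(v, b) * b for b in basis)
        # ...then normalize what remains to unit length.
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
# The resulting rows are orthonormal: Q Q^T is the identity.
print(np.allclose(Q @ Q.T, np.eye(2)))  # True
```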

Understanding the Change of Basis Matrix

In this post, we learn how to construct a transformation matrix and apply it to express vectors in terms of another basis. This process is also referred to as performing a change of basis. As discussed in the previous article on vector projections, a vector can be represented in a different basis than the basic
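As a small illustration (the basis B and the coordinates below are example values, not taken from the post), the change-of-basis matrix has the new basis vectors as its columns:

```python
import numpy as np

# Change-of-basis matrix: columns are the new basis vectors b1, b2.
B = np.array([[2.0, 1.0],
              [0.0, 1.0]])

# Coordinates of a vector expressed in the basis B.
coords_in_B = np.array([1.0, 2.0])

# Multiplying by B converts B-coordinates to standard coordinates.
v_standard = B @ coords_in_B
print(v_standard)  # [4. 2.]

# B^-1 converts standard coordinates back into B-coordinates.
print(np.linalg.inv(B) @ v_standard)  # [1. 2.]
```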

Orthogonal Matrix: Definition and Example

In this post, we introduce orthonormal bases and orthogonal matrices and discuss their properties. An orthogonal matrix is a square matrix whose rows and columns are vectors that are orthogonal to each other and of unit length. We can also say that they form an orthonormal basis. Orthonormal Basis: A set of vectors V =
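A rotation matrix is a convenient concrete example of an orthogonal matrix (the angle below is an arbitrary choice of mine). Its defining property, Q^T Q = I, means the transpose is also the inverse:

```python
import numpy as np

theta = np.pi / 4
# A 2D rotation matrix is a classic orthogonal matrix.
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Its columns are unit length and mutually orthogonal, so Q^T Q = I...
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
# ...which means the transpose equals the inverse.
print(np.allclose(Q.T, np.linalg.inv(Q)))  # True
```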

What is a Transpose Matrix

In this short post, we learn how to obtain the transpose of a matrix and how to perform operations with a matrix transpose. The transpose of a matrix is a matrix that is obtained by flipping the original matrix over its diagonal. In other words, the rows of a matrix become the columns of
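In NumPy the transpose is simply the `.T` attribute. A brief sketch with example matrices of my own, also checking the familiar rule that transposing a product reverses the order of the factors:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

# Transposing flips the matrix over its diagonal: rows become columns.
print(A.T)
# [[1 4]
#  [2 5]
#  [3 6]]

# A useful identity: (A B)^T = B^T A^T.
B = np.array([[1, 0],
              [0, 2],
              [1, 1]])
print(np.allclose((A @ B).T, B.T @ A.T))  # True
```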

Identity Matrix and Inverse Matrix

We introduce the inverse matrix and the identity matrix. In addition, we learn how to solve systems of linear equations using the inverse matrix. The identity matrix is a matrix in which the diagonal entries are 1, and all other entries are zero. It is a more restrictive form of the diagonal matrix. It
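Both ideas can be sketched in a few lines of NumPy (the system below is an example of my own, and it assumes A is invertible): the inverse undoes A, and multiplying a right-hand side by the inverse solves the system.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

A_inv = np.linalg.inv(A)
# Multiplying a matrix by its inverse yields the identity matrix.
print(np.allclose(A_inv @ A, np.eye(2)))  # True

# Solve the system A x = b by multiplying both sides by A^-1.
x = A_inv @ b
print(x)  # [0.8 1.4]
print(np.allclose(A @ x, b))  # True
```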

What is a Diagonal Matrix

We introduce and discuss the applications and properties of the diagonal matrix, the upper triangular matrix, and the lower triangular matrix. A diagonal matrix is a square matrix in which all entries are zero, except for those on the leading diagonal. It is also called the scaling matrix because multiplication with the diagonal matrix
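The "scaling" behavior is easy to see in code (example values are my own): each diagonal entry scales one coordinate axis, and operations like inversion act entrywise on the diagonal.

```python
import numpy as np

# A diagonal matrix scales each coordinate axis independently.
D = np.diag([2.0, 3.0])
v = np.array([1.0, 1.0])
print(D @ v)  # [2. 3.]  (x scaled by 2, y scaled by 3)

# Inverting a diagonal matrix just inverts each diagonal entry.
print(np.allclose(np.linalg.inv(D), np.diag([0.5, 1 / 3])))  # True
```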

Gaussian Elimination and Gauss-Jordan Elimination: An Introduction

We introduce Gaussian elimination and Gauss-Jordan elimination, more commonly known as the elimination method, and learn to use these methods to solve linear equations with several unknown variables. We also introduce the row echelon form of a matrix and discuss the difference between Gaussian elimination and Gauss-Jordan elimination. Gaussian Elimination Method: Gaussian
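A minimal sketch of the elimination method in NumPy (the function name and the 3x3 system are my own; for simplicity it assumes every pivot is nonzero, so there is no row swapping): forward elimination reduces A to upper-triangular form, then back substitution recovers the solution.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination and back substitution.
    Assumes nonzero pivots (no partial pivoting)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: zero out the entries below each pivot.
    for i in range(n):
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]
    # Back substitution on the upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gaussian_elimination(A, b))  # [ 2.  3. -1.]
```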