Author Archive

From the Computer Bus to Point-to-Point Interconnect: An Overview of Bus Interconnection Structures

A bus in computer architecture is a medium for transmitting information between devices. In a typical computer, you have a central processing unit, system memory, and some I/O devices that need to communicate with each other. Buses connect all of these components as a shared transmission medium. This means a signal sent by

What is a Random Variable in Statistics: An Introduction to Probability

In this post, we look at the basic rules of probability and the notation used to describe probabilities and probability functions. We also introduce random variables. Probability is the branch of mathematics that deals with quantifying uncertainty. To model a real-world process that does not have a deterministic outcome, we can construct a probability

Calculus For Machine Learning and Data Science

This series of blog posts introduces multivariate calculus for machine learning. While the first few posts should be accessible to anyone with a high-school math background, the articles covering vector calculus require a basic understanding of linear algebra. If you need a refresher, I’d suggest you first check out my series on linear algebra.

The Fundamental Theorem of Calculus and Integration

In this post, we introduce and develop an intuitive understanding of integral calculus. We learn how the fundamental theorem of calculus relates integral calculus to differential calculus. Integration is the reverse operation of differentiation. Together they form a powerful toolset to describe non-linear functions. While differentiation enables us to describe the gradient or rate
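To sketch the relationship the excerpt describes (a standard statement, not quoted from the post): if F is an antiderivative of f on [a, b], the fundamental theorem of calculus says

```latex
% Fundamental theorem of calculus: if F'(x) = f(x) on [a, b], then
\int_a^b f(x)\,dx = F(b) - F(a)
```

so evaluating an integral reduces to finding a function whose derivative is the integrand.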

Lagrange Multipliers: An Introduction to Constrained Optimization

In this post, we explain constrained optimization using Lagrange multipliers and illustrate it with a simple example. Lagrange multipliers enable us to maximize or minimize a multivariable function subject to equality constraints. This is useful if we want to find the maximum along a line described by another function. The Lagrange Multiplier Method Let’s say
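The condition behind the method can be sketched in standard notation: to optimize f(x, y) subject to g(x, y) = c, we look for points where the gradients of f and g are parallel,

```latex
% Stationarity condition for constrained optimization
\nabla f(x, y) = \lambda \, \nabla g(x, y), \qquad g(x, y) = c
```

where the scalar λ is the Lagrange multiplier.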

Understanding The Gradient Descent Algorithm

In this post, we introduce the intuition as well as the math behind gradient descent, one of the foundational algorithms in modern artificial intelligence. Motivation for Gradient Descent In many engineering applications, you want to find the optimum of a complex system. For example, in a production system, you want to find the ideal
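As an illustrative sketch (not code from the post), gradient descent repeatedly steps against the gradient until it settles near a minimum; the example function, learning rate, and step count below are arbitrary choices:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Follow the negative gradient to approach a local minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Example: f(x) = (x - 3)^2 has gradient 2 * (x - 3) and its minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With this convex example the iterates converge to the true minimum; for non-convex functions the algorithm only finds a local optimum that depends on the starting point.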

Linearization of Differential Equations for Approximation

In this post, we learn how to build linear approximations to non-linear functions and how to measure the error between our approximation and the desired function. Given a well-behaved higher-order function, we can find an approximation using a Taylor series. But how do we know when our approximation is good enough so that we can

Power Series: Understand the Taylor and Maclaurin Series

In this post, we introduce power series as a method to approximate unknown functions. We derive the Maclaurin series and the Taylor series in simple and intuitive terms. Differential calculus is an amazing tool to describe changes in complex systems with multiple inputs. But to unleash the power of calculus, we need to describe
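For reference, the two series the excerpt names, in their standard form (expanding about 0 gives the Maclaurin series; expanding about a point a gives the Taylor series):

```latex
% Maclaurin series (about 0) and Taylor series (about a)
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}\, x^n,
\qquad
f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\, (x - a)^n
```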

The Multivariable Chain Rule

In this post, we learn how to apply the chain rule to vector-valued functions of multiple variables. We’ve seen how to apply the chain rule to real-valued functions of a single variable. Now we can extend this concept into higher dimensions. How does the Multivariable Chain Rule Work? Remember that the chain rule helps us differentiate nested
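In standard notation, for a function f(x(t), y(t)) whose inputs both depend on t, the multivariable chain rule reads:

```latex
\frac{d}{dt} f\bigl(x(t), y(t)\bigr)
  = \frac{\partial f}{\partial x}\frac{dx}{dt}
  + \frac{\partial f}{\partial y}\frac{dy}{dt}
```

which is the dot product of the gradient of f with the derivative of the input path.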

The Hessian Matrix: Finding Minima and Maxima

In this post, we learn how to construct the Hessian matrix of a function and find out how the Hessian helps us determine minima and maxima. What is a Hessian Matrix? The Jacobian matrix helps us find the local gradient of a non-linear function. In many applications, we are interested in optimizing a function.
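For a function of two variables, the Hessian collects the second partial derivatives; the second-derivative test noted in the comment is the standard result, not quoted from the post:

```latex
H(f) =
\begin{pmatrix}
  \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \, \partial y} \\[4pt]
  \frac{\partial^2 f}{\partial y \, \partial x} & \frac{\partial^2 f}{\partial y^2}
\end{pmatrix}
% At a critical point: det H > 0 with f_xx > 0 gives a local minimum,
% det H > 0 with f_xx < 0 a local maximum, and det H < 0 a saddle point.
```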