Probability and Statistics for Machine Learning and Data Science

This series of blog posts introduces probability and mathematical statistics.

Calculus For Machine Learning and Data Science

This series of blog posts introduces multivariate calculus for machine learning.

Linear Algebra for Machine Learning and Data Science

This series of blog posts aims to introduce and explain the most important …

Weight Decay in Neural Networks

What is Weight Decay? Weight decay is a regularization technique in deep learning.

Feature Scaling and Data Normalization for Deep Learning

Before training a neural network, there are several things we should do to …

An Introduction to Neural Network Loss Functions

This post introduces the most common loss functions used in deep learning.

Understanding Basic Neural Network Layers and Architecture

This post will introduce the basic architecture of a neural network and explain …

Understanding Backpropagation With Gradient Descent

In this post, we develop a thorough understanding of the backpropagation algorithm and …

How do Neural Networks Learn

In this post, we develop an understanding of how neural networks learn new …

Understanding Hinge Loss and the SVM Cost Function

In this post, we develop an understanding of the hinge loss and how …

What is a Support Vector?

In this post, we will develop an understanding of support vectors, discuss why we need them, how to construct them, and how they fit into the optimization objective of support vector machines.

What is a Kernel in Machine Learning?

In this post, we are going to develop an understanding of kernels in machine learning. We frame the problem that kernels attempt to solve, followed by a detailed explanation of …

Principal Components Analysis Explained for Dummies

In this post, we will have an in-depth look at principal components analysis. We start with a simple explanation to build an intuitive understanding of PCA. In the second part, …