Latest Posts
See what's new
An Introduction to Autoencoders and Variational Autoencoders
On February 13, 2022 In Computer Vision, Deep Learning, Machine Learning
What is an Autoencoder? An autoencoder is a neural network trained to compress its input and recreate the original input from the compressed data. This procedure is useful in applications such as dimensionality reduction or file compression where we want to store a version of our data that is more memory efficient or reconstruct
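As a rough sketch of the idea (illustrative only, not code from the post; the layer sizes and dimensions are arbitrary assumptions), a fully connected autoencoder in PyTorch might look like this:

```python
# Minimal fully connected autoencoder sketch (assumed architecture, not from the post).
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        # Encoder compresses the input to a low-dimensional code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder reconstructs the input from the code.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code)

model = Autoencoder()
x = torch.rand(16, 784)            # a batch of flattened inputs
loss = nn.MSELoss()(model(x), x)   # reconstruction error to minimize
```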
Principal Components Analysis Explained for Dummies
On January 17, 2022 In Classical Machine Learning, Machine Learning
In this post, we will have an in-depth look at principal components analysis, or PCA. We start with a simple explanation to build an intuitive understanding of PCA. In the second part, we will look at a more mathematical definition of principal components analysis. Lastly, we learn how to perform PCA in Python. What
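For a quick taste of the Python part (an illustrative sketch, not the post's code; the random data and the choice of two components are assumptions), PCA with scikit-learn looks roughly like this:

```python
# PCA sketch with scikit-learn (illustrative data and settings).
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 5)            # 100 samples, 5 features
pca = PCA(n_components=2)             # keep the 2 strongest components
X_reduced = pca.fit_transform(X)      # project onto the principal axes

print(X_reduced.shape)                 # (100, 2)
print(pca.explained_variance_ratio_)   # variance captured per component
```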
An Introduction to Neural Network Loss Functions
On September 28, 2021 In Deep Learning, Machine Learning
This post introduces the most common loss functions used in deep learning. The loss function in a neural network quantifies the difference between the expected outcome and the outcome produced by the machine learning model. From the loss function, we can derive the gradients which are used to update the weights. The average over
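As a minimal illustration (not taken from the post), the mean squared error loss averages the squared differences between targets and predictions over the samples:

```python
# Mean squared error, averaged over a small batch (illustrative values).
import numpy as np

y_true = np.array([1.0, 0.0, 2.0])
y_pred = np.array([0.9, 0.2, 1.7])

mse = np.mean((y_true - y_pred) ** 2)   # average squared difference
print(mse)                              # ~0.0467
```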
Understanding Backpropagation With Gradient Descent
On September 13, 2021 In Deep Learning, Machine Learning
In this post, we develop a thorough understanding of the backpropagation algorithm and how it helps a neural network learn new information. After a conceptual overview of what backpropagation aims to achieve, we go through a brief recap of the relevant concepts from calculus. Next, we perform a step-by-step walkthrough of backpropagation using an
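As a toy illustration (not from the post), a single backpropagation and gradient descent step for one linear neuron with a squared error loss might look like this:

```python
# One backprop / gradient descent step for a single linear neuron (toy example).
w, b = 0.5, 0.0          # parameters
x, y = 2.0, 3.0          # one training example
lr = 0.1                 # learning rate

y_hat = w * x + b                 # forward pass
loss = (y_hat - y) ** 2           # squared error

# Backward pass: chain rule gives the gradients of the loss.
dloss_dyhat = 2 * (y_hat - y)
dw = dloss_dyhat * x
db = dloss_dyhat * 1

# Gradient descent update moves the parameters downhill.
w -= lr * dw
b -= lr * db
print(w, b)   # parameters nudged to reduce the loss
```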
Protobuf Vs JSON: Which Data Serialization Format Is Best for Your Project?
On June 6, 2023 In Data & ML Engineering
When it comes to data serialization, two formats have been dominant in the software development scene – Protobuf and JSON. Protobuf, short for Protocol Buffers, is a binary format open-sourced by Google in 2008. JSON, or JavaScript Object Notation, on the other hand, is a text-based format that has gained popularity due to its
Avro Vs Protobuf: Which Data Serialization Format Is Best For Your Use Case?
On June 4, 2023 In Data & ML Engineering
The world of data management is constantly evolving, and one of the key challenges for developers is choosing the right data serialization format. Data serialization refers to the process of converting complex data structures into a format that can be easily transmitted over a network or saved to disk. The ideal data serialization format
What is Serialization in Programming
On June 3, 2023 In Data & ML Engineering
Have you ever wondered how data is stored on your computer or transferred between applications? The answer lies in serialization, a process that converts complex data structures into a format that can be easily stored and transported. In programming, serialization plays a crucial role in handling data storage and transfer. Serialization refers to the
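For a rough illustration (not from the post), Python's pickle module serializes an object to bytes and restores it again:

```python
# Serializing a Python object to bytes and back (illustrative record).
import pickle

record = {"id": 7, "tags": ["ml", "engineering"], "score": 0.93}

data = pickle.dumps(record)       # object -> bytes (serialization)
restored = pickle.loads(data)     # bytes -> object (deserialization)

assert restored == record
```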
Protobuf Vs Thrift: What’s the Difference?
On May 27, 2023 In Data & ML Engineering
In this blog post, we’ll explore protobuf vs Apache Thrift in depth by comparing their structures, performance, and use cases. Get ready to dive deep into the fascinating world of data serialization formats. Overview of Protocol Buffers and Apache Thrift: Protocol Buffers (protobuf) and Apache Thrift are two popular data serialization formats used for
Protobuf Vs. Messagepack: What’s The Difference?
On May 21, 2023 In Data & ML Engineering
When considering data serialization formats, two popular choices are Protocol Buffers (protobuf) and MessagePack. In this article, we’ll analyze protobuf vs messagepack in terms of structure, performance, usage scenarios, and security considerations to help you decide which format is most suitable for your project requirements. So let’s dive into the details of these two
What is JSON Serialization
On May 20, 2023 In Data & ML Engineering
JSON serialization is a crucial part of modern web development. It is the process of converting data objects from one format to another, where the resulting format is JSON (JavaScript Object Notation). This enables easy sharing and transfer of data between different systems, programming languages, and platforms. In this article, we will discuss what
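As a quick sketch (not from the post; the sample object is made up), Python's built-in json module handles both directions:

```python
# JSON serialization and deserialization with the standard library.
import json

user = {"name": "Ada", "age": 36, "languages": ["Python", "C"]}

text = json.dumps(user)           # object -> JSON string
parsed = json.loads(text)         # JSON string -> object

print(text)    # {"name": "Ada", "age": 36, "languages": ["Python", "C"]}
assert parsed == user
```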
Head to Head: Protobuf vs. Flatbuffers – Which Serialization Technology Reigns Supreme?
On May 16, 2023 In Data & ML Engineering
Two popular serialization technologies used in the industry are Protobuf and Flatbuffers. They are both efficient, lightweight, and widely used in various applications. In this post, we will compare protobuf vs flatbuffers in terms of structure, performance, and use cases. We’ll take a deep dive into both formats to explore their unique features as
XML Serialization: What it is and How it Works
On May 15, 2023 In Data & ML Engineering
XML Serialization is the process of converting an object into a format that can be stored or transmitted in XML. It is an essential step in modern programming, enabling applications to exchange data with other systems, store configuration settings, and access web services using standard formats. Serialization is a means of converting objects into
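As a small sketch (not from the post; the config fields are invented for illustration), Python's xml.etree.ElementTree can serialize a simple object to XML:

```python
# Serializing a simple mapping to an XML document (illustrative fields).
import xml.etree.ElementTree as ET

config = {"host": "localhost", "port": "8080"}

root = ET.Element("config")
for key, value in config.items():
    child = ET.SubElement(root, key)
    child.text = value

xml_bytes = ET.tostring(root)     # serialized XML document
print(xml_bytes)  # b'<config><host>localhost</host><port>8080</port></config>'
```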
What is Data Serialization: An Introduction
On May 14, 2023 In Data & ML Engineering
In simple terms, data serialization is the process of converting complex data structures into a format that can be easily transferred and reconstructed on another platform. It is a fundamental concept in computer programming that allows for efficient data storage and transmission between different systems. This article aims to explain data serialization, its importance
Understanding Binary Serialization for Programmers
On May 13, 2023 In Data & ML Engineering
Binary serialization is the process of converting an object’s state into a binary format, which can be stored in files or transmitted over networks. This process includes writing the object’s state to memory as a sequence of bytes so that it may later be retrieved and deserialized back into its original form. The serialized
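As an illustrative sketch (not from the post; the field layout is an arbitrary assumption), Python's struct module packs an object's fields into a fixed sequence of bytes and reads them back:

```python
# Packing fields into a fixed binary layout and unpacking them again.
import struct

point = {"x": 3, "y": 4, "label": b"A"}

# '<iic' = little-endian: two 4-byte ints followed by one byte.
data = struct.pack("<iic", point["x"], point["y"], point["label"])
print(len(data))                  # 9 bytes on the wire

x, y, label = struct.unpack("<iic", data)
assert (x, y, label) == (3, 4, b"A")
```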