Note that not every matrix has an inverse.
Solving Linear Systems with the Inverse Matrix
We’ve established that a linear system of equations can be represented as a matrix of coefficients A multiplied by a vector b of unknown variables, giving a vector of results c.
Ab = c
Using Gaussian elimination we can determine the values of b. But this only gives you a solution for one specific value of c. If c changes, you have to repeat the whole process of Gaussian elimination from scratch.
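To make the process concrete, here is a minimal sketch of Gaussian elimination in Python with NumPy. The function name and structure are illustrative, not from the original text; it uses forward elimination with partial pivoting followed by back substitution.

```python
import numpy as np

def gaussian_elimination(A, c):
    """Solve A b = c by forward elimination and back substitution."""
    A = A.astype(float).copy()
    c = c.astype(float).copy()
    n = len(c)
    # Forward elimination: zero out the entries below each pivot.
    for i in range(n):
        # Partial pivoting: bring the row with the largest pivot to the top.
        p = i + np.argmax(np.abs(A[i:, i]))
        A[[i, p]] = A[[p, i]]
        c[[i, p]] = c[[p, i]]
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]
            A[j, i:] -= factor * A[i, i:]
            c[j] -= factor * c[i]
    # Back substitution: solve for b from the last row upward.
    b = np.zeros(n)
    for i in range(n - 1, -1, -1):
        b[i] = (c[i] - A[i, i + 1:] @ b[i + 1:]) / A[i, i]
    return b
```

Note that the whole routine consumes a specific c: changing the targets means running every elimination step again, which is exactly the cost the inverse-matrix approach avoids.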
Imagine you have a production plant for trucks and cars, and your production target is 10 trucks and 8 cars. You need 1 unit of materialA and 3 units of materialB to produce trucks, and 1 unit of materialA and 2 units of materialB to produce cars.
Assume also that the production lines are interdependent: changes in car production, for example, could affect the materials required for truck production, because trucks make use of preprocessed materials from the car production line.
You could model this with our Ab = c format and figure out the solution using Gaussian elimination:

1·materialA + 3·materialB = 10 (trucks)
1·materialA + 2·materialB = 8 (cars)

With Gaussian elimination the solution is materialA = 4 and materialB = 2.
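You can check this result with NumPy's built-in solver (the matrix layout below follows the material requirements stated above; `np.linalg.solve` performs an LU-based elimination internally):

```python
import numpy as np

A = np.array([[1, 3],   # 1 materialA + 3 materialB per truck target
              [1, 2]])  # 1 materialA + 2 materialB per car target
c = np.array([10, 8])   # production targets: 10 trucks, 8 cars

b = np.linalg.solve(A, c)
print(b)  # [4. 2.]  ->  materialA = 4, materialB = 2
```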
Your production targets, however, are likely to change. For this small example that is not a big deal, but in the real world you rarely deal with such small systems: in data science and machine learning, your vectors and matrices often have millions of dimensions.
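This is where the inverse pays off: compute it once, then reuse it for every new target vector c with a cheap matrix-vector product. A short sketch (the alternative targets are made up for illustration):

```python
import numpy as np

A = np.array([[1, 3],
              [1, 2]])
A_inv = np.linalg.inv(A)  # do the expensive work once

# Reuse the same inverse for any new production target c.
for c in ([10, 8], [20, 15], [7, 6]):
    b = A_inv @ np.array(c)
    print(c, "->", b)
```

In production numerical code, a factorization (e.g. `scipy.linalg.lu_factor` / `lu_solve`) is usually preferred over forming the explicit inverse, for accuracy and speed, but the reuse principle is the same.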