I wrote an article titled “Matrix Inverse from Scratch Using SVD Decomposition with C#” in the February 2024 edition of Microsoft Visual Studio Magazine. See https://visualstudiomagazine.com/Articles/2024/02/01/matrix-inverse-ml-tutorial.aspx.
Computing the inverse of a matrix is one of the most important operations in machine learning. If some matrix A has shape n-by-n, then its inverse matrix Ai is n-by-n and the matrix product of Ai * A is the n-by-n Identity matrix (1s on the upper-left to lower-right main diagonal, 0s off the diagonal).
An analogy in ordinary arithmetic is that for a number n, its inverse ni is a number such that n * ni = 1. For example, the inverse of 4 is 0.25 because 4 * 0.25 = 1.
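The defining property above is easy to check numerically. Here's a small NumPy sketch (not the article's C# code) using a made-up 3-by-3 matrix; the library inverse is used just to verify that Ai * A gives the identity:

```python
import numpy as np

# Hypothetical 3x3 matrix, invented for illustration
A = np.array([[4.0, 7.0, 1.0],
              [2.0, 6.0, 3.0],
              [5.0, 1.0, 9.0]])

Ai = np.linalg.inv(A)   # library inverse, for reference
I = Ai @ A              # should be (numerically) the 3x3 identity

print(np.allclose(I, np.eye(3)))  # True
```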
There are several algorithms to compute a matrix inverse, and each algorithm has several variations. The article presents a from-scratch C# language implementation of matrix inverse using the Jacobi version of the SVD algorithm.
The article's MatInverseSVD() function is based on SVD decomposition. If you have an n-by-n matrix A and apply SVD decomposition, the result has three components: an n-by-n matrix U, an n-dim vector s, and an n-by-n matrix Vh, such that U * S * Vh = A, where S is an n-by-n matrix that has the values of the s vector on its diagonal. Then inv(A) = inv(U * S * Vh) = inv(Vh) * inv(S) * inv(U) = trans(Vh) * inv(S) * trans(U), because U and Vh are orthogonal matrices, and the inverse of an orthogonal matrix is just its transpose.
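The identity inv(A) = trans(Vh) * inv(S) * trans(U) can be demonstrated in a few lines of NumPy (the article's implementation is from-scratch C#; this is just a sketch to show the math, using a made-up matrix):

```python
import numpy as np

A = np.array([[4.0, 7.0, 1.0],
              [2.0, 6.0, 3.0],
              [5.0, 1.0, 9.0]])

U, s, Vh = np.linalg.svd(A)  # A = U * diag(s) * Vh

# inv(A) = trans(Vh) * inv(diag(s)) * trans(U)
# inverting the diagonal S is just taking reciprocals of s
Ai = Vh.T @ np.diag(1.0 / s) @ U.T

print(np.allclose(Ai, np.linalg.inv(A)))  # True
```

Note that this works only when no singular value in s is zero (or numerically close to zero); otherwise A is singular and has no inverse.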
It's not feasible to list all the machine learning algorithms that use matrix inverses — there are just too many. That said, some examples of ML algorithms that use a matrix inverse are linear regression, logistic regression, Gaussian mixture model clustering, Gaussian process regression, and kernel ridge regression.
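As a concrete example of the first item in that list: closed-form linear regression via the normal equations computes the weights as w = inv(Xt * X) * Xt * y. A minimal NumPy sketch, with invented data (roughly y = 1 + 2x):

```python
import numpy as np

# Design matrix: a column of 1s for the intercept, then the predictor
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([3.1, 4.9, 7.2, 8.8])  # made-up targets, roughly y = 1 + 2x

# Normal equations: w = inv(X^T X) * X^T * y
w = np.linalg.inv(X.T @ X) @ X.T @ y

print(w)  # [1.15, 1.94] -- intercept and slope
```

In practice the inverse of Xt * X is often computed using a decomposition technique such as SVD, which is where functions like the article's MatInverseSVD() come in.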

Color gel photography is different from, but sort of similar to, inverse color photography. Here are three nice examples (to my eye anyway) of color gel photography. The image in the center is AI-generated, and the other two are real photography. A matrix inverse is fairly easy to evaluate for correctness but art is basically impossible for me to evaluate.


Thank you for these great articles. Would you be able to explain the difference between each variation of the SVD algorithms for matrix decomposition? Also, what do you consider to be the most efficient and the most accurate decomposition methods?
One of the reasons I’ve been looking at SVD is that the available information is very confusing. So I don’t yet fully understand the difference between different SVD algorithms. Some variations work with any kind of matrix, and some work only for square matrices, and so on. I’ll post an explanation when I think I understand the differences, but it won’t be anytime soon because I can only look at these things before work in my free time.
. . . and also some apply to definite matrices only, or positive definite matrices only, or positive semi-definite matrices only, etc., etc. Sheesh! Very complicated!