I wrote an article titled “Quadratic Regression with Pseudo-Inverse Training Using C#” in the May 2026 edition of Microsoft Visual Studio Magazine. See https://visualstudiomagazine.com/articles/2026/05/01/quadratic-regression-with-pseudo-inverse-training-using-csharp.aspx.
The goal of a machine learning regression problem is to predict a single numeric value. For example, you might want to predict a person’s credit rating based on age, annual salary, bank account balance, and so on. There are approximately a dozen common regression techniques. The most basic technique is called linear regression, or sometimes multiple linear regression, where the “multiple” indicates two or more predictor variables.
The form of a basic linear regression prediction model is y' = (w0 * x0) + (w1 * x1) + . . . + (w(n-1) * x(n-1)) + b, where y' is the predicted value, the xi are the n predictor values, the wi are the corresponding model weights, and b is the bias (also called the intercept).
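The linear prediction equation can be sketched in a few lines of Python (a language-agnostic illustration; the article's demo is implemented in C#). The weight and bias values here are arbitrary made-up numbers for demonstration:

```python
# Linear regression prediction: y' = w0*x0 + w1*x1 + ... + b
def predict_linear(x, w, b):
    # sum of weight-times-predictor products, plus the bias
    return sum(wi * xi for wi, xi in zip(w, x)) + b

x = [0.5, -0.2, 0.8]    # three predictor values (made-up demo data)
w = [0.1, 0.3, -0.2]    # model weights (made-up demo values)
b = 0.4                 # bias / intercept
print(predict_linear(x, w, b))  # (0.05 - 0.06 - 0.16) + 0.4 = approximately 0.23
```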
Quadratic regression extends linear regression. The form of a quadratic regression model is y' = (w0 * x0) + . . . + (w(n-1) * x(n-1)) + (wj * x0 * x0) + . . . + (wk * x0 * x1) + . . . + b. In addition to the original predictors, there are derived predictors that are the square of each original predictor, and interaction terms that are the product of each possible pair of original predictors.
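The feature expansion can be sketched in Python: an input vector is expanded into base terms, squared terms, and pairwise interaction terms. This is an illustration of the idea, not the article's C# code:

```python
from itertools import combinations

def expand_quadratic(x):
    """Expand predictors into base, squared, and pairwise interaction terms."""
    base = list(x)                                   # x0 .. x(n-1)
    squared = [xi * xi for xi in x]                  # x0*x0 .. x(n-1)*x(n-1)
    inter = [xi * xj for xi, xj in combinations(x, 2)]  # all pairs xi*xj with i < j
    return base + squared + inter

# With n = 3 predictors: 3 base + 3 squared + 3 interaction = 9 derived predictors
print(expand_quadratic([2.0, 3.0, 4.0]))
# [2.0, 3.0, 4.0, 4.0, 9.0, 16.0, 6.0, 8.0, 12.0]
```

For the demo's n = 5 predictors this gives 5 base terms, 5 squared terms, and 10 interaction terms, which matches the 5 + 5 + 10 weights shown in the demo output.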
Compared to basic linear regression, quadratic regression can handle more complex data. Compared to the most powerful regression techniques such as gradient boosting regression, quadratic regression often has slightly worse prediction accuracy, but has much better model interpretability.
The output of the demo program presented in the article is:
Begin C# quadratic regression with pseudo-inverse (QR-Householder) training

Loading synthetic train (200) and test (40) data
Done

First three train X:
 -0.1660  0.4406 -0.9998 -0.3953 -0.7065
  0.0776 -0.1616  0.3704 -0.5911  0.7562
 -0.9452  0.3409 -0.1654  0.1174 -0.7192
First three train y:
  0.4840  0.1568  0.8054

Creating quadratic regression model
Starting pseudo-inverse training
Done

Model base weights:
 -0.2630  0.0354 -0.0421  0.0341 -0.1124
Model quadratic weights:
  0.0655  0.0194  0.0051  0.0047  0.0243
Model interaction weights:
  0.0043  0.0249  0.0071  0.1081 -0.0012 -0.0093  0.0362  0.0085 -0.0568  0.0016
Model bias/intercept: 0.3220

Evaluating model
Accuracy train (within 0.10) = 0.8850
Accuracy test (within 0.10) = 0.9250
MSE train = 0.0003
MSE test = 0.0005

Predicting for x = -0.1660 0.4406 -0.9998 -0.3953 -0.7065
Predicted y = 0.4843

End demo
The model accuracy is quite good compared to many other regression techniques applied to the synthetic dataset. The model achieves 88.50% accuracy on the training data (177 out of 200 correct) and 92.50% accuracy on the test data (37 out of 40 correct). A prediction is scored as correct if it’s within 10% of the true target value.
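The accuracy metric can be sketched as a simple count of predictions within 10% of the true target value. This is an illustration of the scoring rule described above, with made-up prediction and target values, not the article's C# evaluation code:

```python
def accuracy(preds, targets, pct=0.10):
    """Fraction of predictions within pct (e.g., 10%) of the true target value."""
    correct = sum(1 for p, t in zip(preds, targets) if abs(p - t) < abs(t) * pct)
    return correct / len(targets)

preds   = [0.48, 0.20, 0.81, 0.55]   # made-up predicted values
targets = [0.50, 0.16, 0.80, 0.90]   # made-up true values
print(accuracy(preds, targets))      # 2 of 4 within 10 percent = 0.5
```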
Behind the scenes, the demo program trains the quadratic regression model using a technique called relaxed Moore-Penrose pseudo-inverse, where the required matrix inverse is computed using QR decomposition via the Householder algorithm. This is just one of many possible training algorithms. Different algorithms have different pros and cons, and no single algorithm works best in all scenarios.
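The pseudo-inverse training idea can be illustrated with NumPy. Note that np.linalg.pinv computes the pseudo-inverse via SVD rather than the QR-Householder approach the article uses, so this is a sketch of the closed-form idea, not the article's algorithm. The data here is synthetic and made up for the illustration:

```python
import numpy as np

# Build a synthetic linear problem so the recovered weights can be checked
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 4))     # made-up predictor data
true_w = np.array([0.5, -0.3, 0.2, 0.1])  # made-up "true" weights
y = X @ true_w + 0.25                     # targets with bias 0.25

# Design matrix: predictors plus a trailing column of 1s for the bias term
design = np.hstack([X, np.ones((X.shape[0], 1))])

# Closed-form training: weights-and-bias = pinv(design) @ y
wb = np.linalg.pinv(design) @ y
print(wb)  # recovers approximately [0.5, -0.3, 0.2, 0.1, 0.25]
```

The appeal of pseudo-inverse training is that there is no iterative loop and no learning rate to tune; the weights and bias come out in a single linear-algebra computation.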
Quadratic regression has a nice balance of prediction power and interpretability. The model weights/coefficients are easy to interpret. If the predictor values have been normalized to the same scale, a larger weight magnitude means a larger effect on the prediction, and the sign of each weight indicates the direction of the effect.
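One simple interpretability check is to rank trained weights by absolute magnitude. A sketch, using the demo's base weights from the output above (the predictor names x0 through x4 are hypothetical labels, since the synthetic data has no named features):

```python
# Demo base weights from the output above, with hypothetical predictor names
weights = {"x0": -0.2630, "x1": 0.0354, "x2": -0.0421, "x3": 0.0341, "x4": -0.1124}

# Larger magnitude = larger effect (assuming normalized predictors);
# the sign gives the direction of the effect
ranked = sorted(weights.items(), key=lambda kv: abs(kv[1]), reverse=True)
for name, w in ranked:
    direction = "increases" if w > 0 else "decreases"
    print(f"{name}: {w:+.4f} ({direction} the prediction)")
```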
Quadratic regression is not always effective; if it were, it would be used far more often than it is. But compared to basic linear regression, quadratic regression can sometimes provide a big improvement in model prediction accuracy for a relatively small investment in effort, and so it's usually worth exploring.

Quadratic regression was developed in the 1950s and 1960s, about the same time as early space exploration. Galaxy Science Fiction was published from 1950 to 1980. It was the leading science fiction magazine of its time. I really enjoy looking at older covers to get a sense of the optimism and excitement in those days. Here are three nice 1952 covers by artist Jack Coggins (1911-2006).

