I almost always write some code every morning before work. I enjoy writing code, and coding is a skill that's easy to lose without practice, which is one of the common ways that technical managers become somewhat useless.
One morning I decided to implement linear regression using C# for the scenario where some of the predictor variables are categorical rather than numeric. My synthetic raw source data looks like:
F, 24, michigan, 29500.00, liberal
M, 39, oklahoma, 51200.00, moderate
F, 63, nebraska, 75800.00, conservative
M, 36, michigan, 44500.00, moderate
. . .
The fields are sex, age, State (just three States), salary, and political leaning (just three leanings). The goal is to predict salary from sex, age, State, and politics. I encoded and normalized the data to:
1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1
0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0
1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0
0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0
. . .
Sex is encoded as Male = 0, Female = 1. Ages are divided by 100. State is one-hot encoded. Salaries are divided by 100,000. Politics is one-hot encoded. There are 200 training items and 40 test items.
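The encoding scheme can be sketched in Python. This is just an illustration of the mapping described above, not part of the demo program; the helper names are hypothetical, and the category orderings are taken from the data file comments:

```python
# Sketch of the encoding/normalization described above.
# Category orders follow the data file comments:
# state: michigan, nebraska, oklahoma; politics: conservative, moderate, liberal.
STATES = ["michigan", "nebraska", "oklahoma"]
POLITICS = ["conservative", "moderate", "liberal"]

def one_hot(value, categories):
    # e.g. one_hot("michigan", STATES) -> [1, 0, 0]
    return [1 if c == value else 0 for c in categories]

def encode(sex, age, state, salary, politics):
    row = [1 if sex == "F" else 0]      # Male = 0, Female = 1
    row.append(age / 100.0)             # ages divided by 100
    row.extend(one_hot(state, STATES))  # State one-hot encoded
    row.append(salary / 100_000.0)      # salaries divided by 100,000
    row.extend(one_hot(politics, POLITICS))
    return row

print(encode("F", 24, "michigan", 29500.00, "liberal"))
```

The first raw line above, (F, 24, michigan, 29500.00, liberal), maps to the first encoded line, (1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1).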
The output of my demo is:
Begin C# linear regression
Predict salary from sex, age, State, politics

Loading people train (200) and test (40) data
Done

First three train X:
  1.0000  0.2400  1.0000  0.0000  0.0000  0.0000  0.0000  1.0000
  0.0000  0.3900  0.0000  0.0000  1.0000  0.0000  1.0000  0.0000
  1.0000  0.6300  0.0000  1.0000  0.0000  1.0000  0.0000  0.0000

First three train y:
 0.29500
 0.51200
 0.75800

Setting lrnRate = 0.0010
Setting maxEpochs = 3000

Creating and training Linear Regression model
epoch =     0  MSE = 0.125474
epoch =   600  MSE = 0.000890
epoch =  1200  MSE = 0.000674
epoch =  1800  MSE = 0.000671
epoch =  2400  MSE = 0.000671
Done

Coefficients/weights:
-0.0505 0.9991 0.0306 0.0251 0.0076 0.0529 0.0245 -0.0066
Bias/constant: 0.0722

Evaluating model

Accuracy train (within 0.10) = 0.9150
Accuracy test (within 0.10) = 0.9500

MSE train = 0.000671
MSE test = 0.000687

R2 train = 0.9677
R2 test = 0.9706

Predicting for x =
   1.0000   0.2400   1.0000   0.0000   0.0000   0.0000   0.0000   1.0000

Predicted y = 0.28535

Saving model weights to file
Loading saved weights to new model
Predicted y from new model = 0.28535

End demo
Because there are 8 predictors, there are 8 model weights and 1 bias. The prediction for an input x is just the sum of the weights times the x values, plus the bias. For example, for x = (1, 0.24, 1, 0, 0, 0, 0, 1) = (Female, age 24, Michigan, liberal), the predicted salary is:
y' = (1 * -0.0505) +
     (0.24 * 0.9991) +
     (1 * 0.0306) + (0 * 0.0251) + (0 * 0.0076) +
     (0 * 0.0529) + (0 * 0.0245) + (1 * -0.0066) +
     0.0722  (bias)
   = 0.28535

(The weights shown are rounded to four decimals; the 0.28535 result uses the full-precision weights.)
which is $28,535.
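The dot-product-plus-bias computation can be reproduced in a few lines of Python. Because the weights here are the four-decimal rounded values from the output, the result matches the demo's 0.28535 only to within rounding:

```python
# Reproduce the worked prediction using the rounded weights from the demo output.
wts = [-0.0505, 0.9991, 0.0306, 0.0251, 0.0076, 0.0529, 0.0245, -0.0066]
bias = 0.0722
x = [1, 0.24, 1, 0, 0, 0, 0, 1]  # Female, age 24, Michigan, liberal

y = sum(w * xi for w, xi in zip(wts, x)) + bias
print(round(y, 5))  # close to the demo's 0.28535 (weights are rounded)
```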
Training is the process of finding values for the weights and the bias so that predicted y values are close to the target y values in the training data. I used basic stochastic gradient descent (SGD) without weight decay (also known as L2 regularization).
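That per-item SGD update can be sketched in a few lines of Python. This is a toy illustration on made-up data, not the demo's C# implementation; for each training item, each weight moves by -lrnRate * (predicted - actual) * x, and the bias by -lrnRate * (predicted - actual):

```python
import random

# Minimal per-item SGD for linear regression:
# w[j] -= lr * (pred - y) * x[j]  and  b -= lr * (pred - y)
def train_sgd(X, y, lr, epochs, seed=0):
    rnd = random.Random(seed)
    dim = len(X[0])
    w = [rnd.uniform(-0.01, 0.01) for _ in range(dim)]  # small random init
    b = rnd.uniform(-0.01, 0.01)
    idx = list(range(len(X)))
    for _ in range(epochs):
        rnd.shuffle(idx)  # visit training items in scrambled order
        for i in idx:
            pred = sum(wj * xj for wj, xj in zip(w, X[i])) + b
            err = pred - y[i]
            w = [wj - lr * err * xj for wj, xj in zip(w, X[i])]
            b -= lr * err
    return w, b

# Made-up noise-free data generated from y = 2*x0 + 3*x1 + 1
X = [[0, 0], [1, 0], [0, 1], [1, 1], [2, 1], [1, 2]]
y = [1, 3, 4, 6, 8, 9]
w, b = train_sgd(X, y, lr=0.05, epochs=2000)
```

Because the toy data is noise-free, the recovered weights converge to roughly (2, 3) with bias roughly 1.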
The model accuracy is surprisingly good: 91.5% on the training data (183 out of 200 correct) and 95.0% on the test data (38 out of 40 correct), where a prediction is scored correct if it's within 10% of the true salary. The mean squared error and R-squared values are quite good too.
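The three evaluation metrics follow standard definitions. A sketch in Python (hypothetical helper functions, matching the definitions used by the C# Accuracy(), MSE(), and RSquared() methods in the full listing below):

```python
def accuracy(preds, actuals, pct_close=0.10):
    # a prediction is correct if it's within pct_close of the true value
    n_correct = sum(1 for p, a in zip(preds, actuals)
                    if abs(p - a) < abs(pct_close * a))
    return n_correct / len(actuals)

def mse(preds, actuals):
    # mean of squared differences
    return sum((a - p) ** 2 for p, a in zip(preds, actuals)) / len(actuals)

def r_squared(preds, actuals):
    # coefficient of determination: 1 - SS_residual / SS_total
    mean_y = sum(actuals) / len(actuals)
    ss_res = sum((a - p) ** 2 for p, a in zip(preds, actuals))
    ss_tot = sum((a - mean_y) ** 2 for a in actuals)
    return 1.0 - ss_res / ss_tot
```

For example, accuracy([0.30, 0.50], [0.295, 0.60]) is 0.5 because only the first prediction is within 10% of its target.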
I validated my C# implementation using a closed-form solution with Python numpy functions. The key statements are:
# get train_X and train_y from file using np.loadtxt()
new_col = np.ones(200)
design_X = np.column_stack((new_col, train_X))
design_X_pinv = np.linalg.pinv(design_X)
bias_wts = np.dot(design_X_pinv, train_y)
print("\nbias and weights using np.pinv: ")
print(bias_wts)
# 0.0701
# -0.0506 0.9991 0.0328 0.0274 0.0099 0.0526 0.0243 -0.0068
The closed-form solution relies on the Moore-Penrose pseudo-inverse pinv() function, which is extremely complicated under the hood. The results (weights, bias, MSE, accuracy, R2) were nearly identical to those from my from-scratch SGD C# version.
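A self-contained version of that validation idea, with a small made-up data matrix in place of the file load (the people data file isn't read here, so the recovered values are for the toy data, not the demo model):

```python
import numpy as np

# Closed-form ordinary least squares via the Moore-Penrose pseudo-inverse,
# on made-up data generated from y = 2*x0 + 3*x1 + 1 so the answer is known.
train_X = np.array([[0.0, 0], [1, 0], [0, 1], [1, 1], [2, 1], [1, 2]])
train_y = 2 * train_X[:, 0] + 3 * train_X[:, 1] + 1

new_col = np.ones(len(train_X))                  # column of 1s for the bias
design_X = np.column_stack((new_col, train_X))   # design matrix
bias_wts = np.dot(np.linalg.pinv(design_X), train_y)
print(bias_wts)  # first value is the bias, the rest are the weights
```

Prepending the column of 1s is what makes the bias fall out as the first element of the solution vector, just as in the snippet above.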
Good fun!

I have implemented linear regression dozens of times, but code remakes are always interesting because I invariably learn something new.
I’m a big fan of science fiction movies, especially old ones. In my opinion, remakes of science fiction movies are almost always worse than the originals. For example, the remakes of the original “Godzilla” (1954/56), the original “Total Recall” (1990), and the original “Invaders from Mars” (1953) are all worse than their source films.
But here are three good remakes.
Left: “Dune” (2021) is clearly better than the 1984 version. I give the remake an A- grade and the original a solid B grade.
Center: “Alien” (1979) is a remake of “It! The Terror from Beyond Space” (1958). I give “Alien” an A grade and “Terror” a B grade.
Right: “12 Monkeys” (1995) is a sort-of remake of “La Jetee” (1962). The two are difficult to compare because “12 Monkeys” is a big-budget film with well-known stars, but “La Jetee” is an experimental black and white film with only still photos and narration. I give “12 Monkeys” a B+ grade and “La Jetee” a B grade.
I was eating dinner with my good friend Ken L a few nights ago. We both grew up reading “A Princess of Mars” by E.R. Burroughs and waited for decades for the movie based on it — and what we got was the disastrous “John Carter” (2012). This is a movie we’d both love to see remade.
Demo program. Replace “lt” (less than), “gt” (greater than), “lte”, “gte” with Boolean operator symbols (my blog editor chokes on symbols).
using System;
using System.IO;
using System.Collections.Generic;
namespace LinearRegressionCategorical
{
internal class Program
{
static void Main(string[] args)
{
Console.WriteLine("\nBegin C# linear regression ");
Console.WriteLine("Predict salary from sex, age," +
" State, politics ");
// 1. load data
Console.WriteLine("\nLoading people train" +
" (200) and test (40) data");
string trainFile =
"C:\\VSM\\LinearRegressionCategorical\\" +
"Data\\people_train.txt";
int[] colsX = new int[] { 0, 1, 2, 3, 4, 6, 7, 8 };
double[][] trainX =
MatLoad(trainFile, colsX, ',', "#");
double[] trainY =
MatToVec(MatLoad(trainFile,
new int[] { 5 }, ',', "#"));
string testFile =
"C:\\VSM\\LinearRegressionCategorical\\" +
"Data\\people_test.txt";
double[][] testX =
MatLoad(testFile, colsX, ',', "#");
double[] testY =
MatToVec(MatLoad(testFile,
new int[] { 5 }, ',', "#"));
Console.WriteLine("Done ");
Console.WriteLine("\nFirst three train X: ");
for (int i = 0; i "lt" 3; ++i)
VecShow(trainX[i], 4, 8);
Console.WriteLine("\nFirst three train y: ");
for (int i = 0; i "lt" 3; ++i)
Console.WriteLine(trainY[i].ToString("F5").
PadLeft(8));
// 2. create and train model
double lrnRate = 0.001;
int maxEpochs = 3000;
int seed = 0;
Console.WriteLine("\nSetting lrnRate = " +
lrnRate.ToString("F4"));
Console.WriteLine("Setting maxEpochs = " +
maxEpochs);
Console.WriteLine("\nCreating and training" +
" Linear Regression model ");
LinearRegressor model =
new LinearRegressor(seed);
model.Train(trainX, trainY, lrnRate, maxEpochs);
Console.WriteLine("Done ");
// 2b. show model parameters
Console.WriteLine("\nCoefficients/weights: ");
for (int i = 0; i "lt" model.weights.Length; ++i)
Console.Write(model.weights[i].ToString("F4") + " ");
Console.WriteLine("\nBias/constant: " +
model.bias.ToString("F4"));
// 3. evaluate model
Console.WriteLine("\nEvaluating model ");
double accTrain = model.Accuracy(trainX, trainY, 0.10);
Console.WriteLine("\nAccuracy train (within 0.10) = " +
accTrain.ToString("F4"));
double accTest = model.Accuracy(testX, testY, 0.10);
Console.WriteLine("Accuracy test (within 0.10) = " +
accTest.ToString("F4"));
double mseTrain = model.MSE(trainX, trainY);
Console.WriteLine("\nMSE train = " +
mseTrain.ToString("F6"));
double mseTest = model.MSE(testX, testY);
Console.WriteLine("MSE test = " +
mseTest.ToString("F6"));
double r2Train = model.RSquared(trainX, trainY);
Console.WriteLine("\nR2 train = " +
r2Train.ToString("F4"));
double r2Test = model.RSquared(testX, testY);
Console.WriteLine("R2 test = " +
r2Test.ToString("F4"));
// 4. use model to predict first training item
double[] x = trainX[0];
Console.WriteLine("\nPredicting for x = ");
VecShow(x, 4, 9);
double predY = model.Predict(x);
Console.WriteLine("\nPredicted y = " +
predY.ToString("F5"));
// 5. save model weights and verify
Console.WriteLine("\nSaving model weights to file ");
string fn = "C:\\VSM\\LinearRegressionCategorical\\" +
"Models\\weights.txt";
model.SaveWeights(fn);
Console.WriteLine("Loading saved weights to new model ");
LinearRegressor savedModel = new LinearRegressor();
savedModel.LoadWeights(fn);
double y = savedModel.Predict(x);
Console.WriteLine("Predicted y from new model = " +
y.ToString("F5"));
Console.WriteLine("\nEnd demo ");
Console.ReadLine();
} // Main()
// ------------------------------------------------------
// helpers for Main()
// ------------------------------------------------------
static double[][] MatLoad(string fn, int[] usecols,
char sep, string comment)
{
List"lt"double[]"gt" result =
new List"lt"double[]"gt"();
string line = "";
FileStream ifs = new FileStream(fn, FileMode.Open);
StreamReader sr = new StreamReader(ifs);
while ((line = sr.ReadLine()) != null)
{
if (line.StartsWith(comment) == true)
continue;
string[] tokens = line.Split(sep);
List"lt"double"gt" lst = new List"lt"double"gt"();
for (int j = 0; j "lt" usecols.Length; ++j)
lst.Add(double.Parse(tokens[usecols[j]]));
double[] row = lst.ToArray();
result.Add(row);
}
sr.Close(); ifs.Close();
return result.ToArray();
}
static double[] MatToVec(double[][] mat)
{
int nRows = mat.Length;
int nCols = mat[0].Length;
double[] result = new double[nRows * nCols];
int k = 0;
for (int i = 0; i "lt" nRows; ++i)
for (int j = 0; j "lt" nCols; ++j)
result[k++] = mat[i][j];
return result;
}
static void VecShow(double[] vec, int dec, int wid)
{
for (int i = 0; i "lt" vec.Length; ++i)
Console.Write(vec[i].ToString("F" + dec).
PadLeft(wid));
Console.WriteLine("");
}
} // class Program
public class LinearRegressor
{
public double[] weights;
public double bias;
private Random rnd;
public LinearRegressor(int seed = 0)
{
this.weights = new double[0];
this.bias = 0;
this.rnd = new Random(seed);
}
public void Train(double[][] trainX, double[] trainY,
double lrnRate, int maxEpochs)
{
int n = trainX.Length;
int dim = trainX[0].Length;
this.weights = new double[dim];
double low = -0.01; double hi = 0.01;
for (int i = 0; i "lt" dim; ++i)
this.weights[i] = (hi - low) *
this.rnd.NextDouble() + low;
this.bias = (hi - low) *
this.rnd.NextDouble() + low;
int[] indices = new int[n];
for (int i = 0; i "lt" n; ++i)
indices[i] = i;
for (int epoch = 0; epoch "lt" maxEpochs; ++epoch)
{
Shuffle(indices, this.rnd);
for (int i = 0; i "lt" n; ++i)
{
int idx = indices[i];
double[] x = trainX[idx];
double predY = this.Predict(x);
double actualY = trainY[idx];
for (int j = 0; j "lt" dim; ++j)
this.weights[j] -= lrnRate *
(predY - actualY) * x[j];
this.bias -= lrnRate * (predY - actualY);
}
if (epoch % (int)(maxEpochs / 5) == 0)
{
double mse = this.MSE(trainX, trainY);
string s1 = "epoch = " +
epoch.ToString().PadLeft(5);
string s2 = " MSE = " +
mse.ToString("F6").PadLeft(8);
Console.WriteLine(s1 + s2);
}
}
}
public double Predict(double[] x)
{
double result = 0.0;
for (int j = 0; j "lt" x.Length; ++j)
result += x[j] * this.weights[j];
result += this.bias;
return result;
}
public double Accuracy(double[][] dataX, double[] dataY,
double pctClose)
{
int numCorrect = 0; int numWrong = 0;
for (int i = 0; i "lt" dataX.Length; ++i)
{
double actualY = dataY[i];
double predY = this.Predict(dataX[i]);
if (Math.Abs(predY - actualY) "lt"
Math.Abs(pctClose * actualY))
++numCorrect;
else
++numWrong;
}
return (numCorrect * 1.0) / (numWrong + numCorrect);
}
public double MSE(double[][] dataX, double[] dataY)
{
int n = dataX.Length;
double sum = 0.0;
for (int i = 0; i "lt" n; ++i)
{
double actualY = dataY[i];
double predY = this.Predict(dataX[i]);
sum += (actualY - predY) * (actualY - predY);
}
return sum / n;
}
public double RSquared(double[][] dataX, double[] dataY)
{
// coefficient of determination
// R2 = 1 - [sum of (y - y')^2 / sum of (y - y_mean)^2]
// 1. compute mean actual y
int n = dataX.Length;
double sum = 0.0;
for (int i = 0; i "lt" n; ++i) // compute mean actual y
sum += dataY[i];
double meanY = sum / n;
double sumTop = 0.0;
double sumBot = 0.0;
for (int i = 0; i "lt" n; ++i)
{
double predY = this.Predict(dataX[i]);
sumTop +=
(dataY[i] - predY) * (dataY[i] - predY);
sumBot +=
(dataY[i] - meanY) * (dataY[i] - meanY);
}
return 1.0 - (sumTop / sumBot);
}
public void SaveWeights(string filename)
{
FileStream ofs =
new FileStream(filename, FileMode.Create);
StreamWriter sw = new StreamWriter(ofs);
for (int i = 0; i "lt" this.weights.Length; ++i)
sw.WriteLine(this.weights[i].ToString("F6"));
sw.WriteLine(this.bias.ToString("F6"));
sw.Close(); ofs.Close();
}
public void LoadWeights(string filename)
{
List"lt"double"gt" wtsAndBiasList =
new List"lt"double"gt"();
FileStream ifs =
new FileStream(filename, FileMode.Open);
StreamReader sr = new StreamReader(ifs);
while (!sr.EndOfStream)
{
wtsAndBiasList.Add(double.Parse(sr.ReadLine()));
}
sr.Close(); ifs.Close();
this.weights = new double[wtsAndBiasList.Count - 1];
for (int i = 0; i "lt" this.weights.Length; ++i)
this.weights[i] = wtsAndBiasList[i];
this.bias = wtsAndBiasList[wtsAndBiasList.Count - 1];
}
private static void Shuffle(int[] indices, Random rnd)
{
// Fisher-Yates
int n = indices.Length;
for (int i = 0; i "lt" n; ++i)
{
int ri = rnd.Next(i, n);
int tmp = indices[i];
indices[i] = indices[ri];
indices[ri] = tmp;
}
}
} // class LinearRegressor
} // ns
Training data:
# people_train.txt
# sex (0 = male, 1 = female)
# age, state (michigan, nebraska, oklahoma), income,
# politics type (conservative, moderate, liberal)
#
1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1
0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0
1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0
0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0
1, 0.27, 0, 1, 0, 0.2860, 0, 0, 1
1, 0.50, 0, 1, 0, 0.5650, 0, 1, 0
1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0
0, 0.19, 0, 0, 1, 0.3270, 1, 0, 0
1, 0.22, 0, 1, 0, 0.2770, 0, 1, 0
0, 0.39, 0, 0, 1, 0.4710, 0, 0, 1
1, 0.34, 1, 0, 0, 0.3940, 0, 1, 0
0, 0.22, 1, 0, 0, 0.3350, 1, 0, 0
1, 0.35, 0, 0, 1, 0.3520, 0, 0, 1
0, 0.33, 0, 1, 0, 0.4640, 0, 1, 0
1, 0.45, 0, 1, 0, 0.5410, 0, 1, 0
1, 0.42, 0, 1, 0, 0.5070, 0, 1, 0
0, 0.33, 0, 1, 0, 0.4680, 0, 1, 0
1, 0.25, 0, 0, 1, 0.3000, 0, 1, 0
0, 0.31, 0, 1, 0, 0.4640, 1, 0, 0
1, 0.27, 1, 0, 0, 0.3250, 0, 0, 1
1, 0.48, 1, 0, 0, 0.5400, 0, 1, 0
0, 0.64, 0, 1, 0, 0.7130, 0, 0, 1
1, 0.61, 0, 1, 0, 0.7240, 1, 0, 0
1, 0.54, 0, 0, 1, 0.6100, 1, 0, 0
1, 0.29, 1, 0, 0, 0.3630, 1, 0, 0
1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0
1, 0.55, 0, 0, 1, 0.6250, 1, 0, 0
1, 0.40, 1, 0, 0, 0.5240, 1, 0, 0
1, 0.22, 1, 0, 0, 0.2360, 0, 0, 1
1, 0.68, 0, 1, 0, 0.7840, 1, 0, 0
0, 0.60, 1, 0, 0, 0.7170, 0, 0, 1
0, 0.34, 0, 0, 1, 0.4650, 0, 1, 0
0, 0.25, 0, 0, 1, 0.3710, 1, 0, 0
0, 0.31, 0, 1, 0, 0.4890, 0, 1, 0
1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0
1, 0.58, 0, 1, 0, 0.6540, 0, 0, 1
0, 0.55, 0, 1, 0, 0.6070, 0, 0, 1
0, 0.43, 0, 1, 0, 0.5110, 0, 1, 0
0, 0.43, 0, 0, 1, 0.5320, 0, 1, 0
0, 0.21, 1, 0, 0, 0.3720, 1, 0, 0
1, 0.55, 0, 0, 1, 0.6460, 1, 0, 0
1, 0.64, 0, 1, 0, 0.7480, 1, 0, 0
0, 0.41, 1, 0, 0, 0.5880, 0, 1, 0
1, 0.64, 0, 0, 1, 0.7270, 1, 0, 0
0, 0.56, 0, 0, 1, 0.6660, 0, 0, 1
1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0
0, 0.65, 0, 0, 1, 0.7010, 0, 0, 1
1, 0.55, 0, 0, 1, 0.6430, 1, 0, 0
0, 0.25, 1, 0, 0, 0.4030, 1, 0, 0
1, 0.46, 0, 0, 1, 0.5100, 0, 1, 0
0, 0.36, 1, 0, 0, 0.5350, 1, 0, 0
1, 0.52, 0, 1, 0, 0.5810, 0, 1, 0
1, 0.61, 0, 0, 1, 0.6790, 1, 0, 0
1, 0.57, 0, 0, 1, 0.6570, 1, 0, 0
0, 0.46, 0, 1, 0, 0.5260, 0, 1, 0
0, 0.62, 1, 0, 0, 0.6680, 0, 0, 1
1, 0.55, 0, 0, 1, 0.6270, 1, 0, 0
0, 0.22, 0, 0, 1, 0.2770, 0, 1, 0
0, 0.50, 1, 0, 0, 0.6290, 1, 0, 0
0, 0.32, 0, 1, 0, 0.4180, 0, 1, 0
0, 0.21, 0, 0, 1, 0.3560, 1, 0, 0
1, 0.44, 0, 1, 0, 0.5200, 0, 1, 0
1, 0.46, 0, 1, 0, 0.5170, 0, 1, 0
1, 0.62, 0, 1, 0, 0.6970, 1, 0, 0
1, 0.57, 0, 1, 0, 0.6640, 1, 0, 0
0, 0.67, 0, 0, 1, 0.7580, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3430, 0, 0, 1
1, 0.53, 1, 0, 0, 0.6010, 1, 0, 0
0, 0.44, 1, 0, 0, 0.5480, 0, 1, 0
1, 0.46, 0, 1, 0, 0.5230, 0, 1, 0
0, 0.20, 0, 1, 0, 0.3010, 0, 1, 0
0, 0.38, 1, 0, 0, 0.5350, 0, 1, 0
1, 0.50, 0, 1, 0, 0.5860, 0, 1, 0
1, 0.33, 0, 1, 0, 0.4250, 0, 1, 0
0, 0.33, 0, 1, 0, 0.3930, 0, 1, 0
1, 0.26, 0, 1, 0, 0.4040, 1, 0, 0
1, 0.58, 1, 0, 0, 0.7070, 1, 0, 0
1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0
0, 0.46, 1, 0, 0, 0.6440, 1, 0, 0
1, 0.60, 1, 0, 0, 0.7170, 1, 0, 0
0, 0.42, 1, 0, 0, 0.4890, 0, 1, 0
0, 0.56, 0, 0, 1, 0.5640, 0, 0, 1
0, 0.62, 0, 1, 0, 0.6630, 0, 0, 1
0, 0.50, 1, 0, 0, 0.6480, 0, 1, 0
1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0
0, 0.67, 0, 1, 0, 0.8040, 0, 0, 1
0, 0.40, 0, 0, 1, 0.5040, 0, 1, 0
1, 0.42, 0, 1, 0, 0.4840, 0, 1, 0
1, 0.64, 1, 0, 0, 0.7200, 1, 0, 0
0, 0.47, 1, 0, 0, 0.5870, 0, 0, 1
1, 0.45, 0, 1, 0, 0.5280, 0, 1, 0
0, 0.25, 0, 0, 1, 0.4090, 1, 0, 0
1, 0.38, 1, 0, 0, 0.4840, 1, 0, 0
1, 0.55, 0, 0, 1, 0.6000, 0, 1, 0
0, 0.44, 1, 0, 0, 0.6060, 0, 1, 0
1, 0.33, 1, 0, 0, 0.4100, 0, 1, 0
1, 0.34, 0, 0, 1, 0.3900, 0, 1, 0
1, 0.27, 0, 1, 0, 0.3370, 0, 0, 1
1, 0.32, 0, 1, 0, 0.4070, 0, 1, 0
1, 0.42, 0, 0, 1, 0.4700, 0, 1, 0
0, 0.24, 0, 0, 1, 0.4030, 1, 0, 0
1, 0.42, 0, 1, 0, 0.5030, 0, 1, 0
1, 0.25, 0, 0, 1, 0.2800, 0, 0, 1
1, 0.51, 0, 1, 0, 0.5800, 0, 1, 0
0, 0.55, 0, 1, 0, 0.6350, 0, 0, 1
1, 0.44, 1, 0, 0, 0.4780, 0, 0, 1
0, 0.18, 1, 0, 0, 0.3980, 1, 0, 0
0, 0.67, 0, 1, 0, 0.7160, 0, 0, 1
1, 0.45, 0, 0, 1, 0.5000, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5580, 0, 1, 0
0, 0.25, 0, 1, 0, 0.3900, 0, 1, 0
0, 0.67, 1, 0, 0, 0.7830, 0, 1, 0
1, 0.37, 0, 0, 1, 0.4200, 0, 1, 0
0, 0.32, 1, 0, 0, 0.4270, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5700, 0, 1, 0
0, 0.66, 0, 0, 1, 0.7500, 0, 0, 1
1, 0.61, 1, 0, 0, 0.7000, 1, 0, 0
0, 0.58, 0, 0, 1, 0.6890, 0, 1, 0
1, 0.19, 1, 0, 0, 0.2400, 0, 0, 1
1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0
0, 0.27, 1, 0, 0, 0.3640, 0, 1, 0
1, 0.42, 1, 0, 0, 0.4800, 0, 1, 0
1, 0.60, 1, 0, 0, 0.7130, 1, 0, 0
0, 0.27, 0, 0, 1, 0.3480, 1, 0, 0
1, 0.29, 0, 1, 0, 0.3710, 1, 0, 0
0, 0.43, 1, 0, 0, 0.5670, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5670, 0, 1, 0
1, 0.27, 0, 0, 1, 0.2940, 0, 0, 1
0, 0.44, 1, 0, 0, 0.5520, 1, 0, 0
1, 0.23, 0, 1, 0, 0.2630, 0, 0, 1
0, 0.36, 0, 1, 0, 0.5300, 0, 0, 1
1, 0.64, 0, 0, 1, 0.7250, 1, 0, 0
1, 0.29, 0, 0, 1, 0.3000, 0, 0, 1
0, 0.33, 1, 0, 0, 0.4930, 0, 1, 0
0, 0.66, 0, 1, 0, 0.7500, 0, 0, 1
0, 0.21, 0, 0, 1, 0.3430, 1, 0, 0
1, 0.27, 1, 0, 0, 0.3270, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3180, 0, 0, 1
0, 0.31, 1, 0, 0, 0.4860, 0, 1, 0
1, 0.36, 0, 0, 1, 0.4100, 0, 1, 0
1, 0.49, 0, 1, 0, 0.5570, 0, 1, 0
0, 0.28, 1, 0, 0, 0.3840, 1, 0, 0
0, 0.43, 0, 0, 1, 0.5660, 0, 1, 0
0, 0.46, 0, 1, 0, 0.5880, 0, 1, 0
1, 0.57, 1, 0, 0, 0.6980, 1, 0, 0
0, 0.52, 0, 0, 1, 0.5940, 0, 1, 0
0, 0.31, 0, 0, 1, 0.4350, 0, 1, 0
0, 0.55, 1, 0, 0, 0.6200, 0, 0, 1
1, 0.50, 1, 0, 0, 0.5640, 0, 1, 0
1, 0.48, 0, 1, 0, 0.5590, 0, 1, 0
0, 0.22, 0, 0, 1, 0.3450, 1, 0, 0
1, 0.59, 0, 0, 1, 0.6670, 1, 0, 0
1, 0.34, 1, 0, 0, 0.4280, 0, 0, 1
0, 0.64, 1, 0, 0, 0.7720, 0, 0, 1
1, 0.29, 0, 0, 1, 0.3350, 0, 0, 1
0, 0.34, 0, 1, 0, 0.4320, 0, 1, 0
0, 0.61, 1, 0, 0, 0.7500, 0, 0, 1
1, 0.64, 0, 0, 1, 0.7110, 1, 0, 0
0, 0.29, 1, 0, 0, 0.4130, 1, 0, 0
1, 0.63, 0, 1, 0, 0.7060, 1, 0, 0
0, 0.29, 0, 1, 0, 0.4000, 1, 0, 0
0, 0.51, 1, 0, 0, 0.6270, 0, 1, 0
0, 0.24, 0, 0, 1, 0.3770, 1, 0, 0
1, 0.48, 0, 1, 0, 0.5750, 0, 1, 0
1, 0.18, 1, 0, 0, 0.2740, 1, 0, 0
1, 0.18, 1, 0, 0, 0.2030, 0, 0, 1
1, 0.33, 0, 1, 0, 0.3820, 0, 0, 1
0, 0.20, 0, 0, 1, 0.3480, 1, 0, 0
1, 0.29, 0, 0, 1, 0.3300, 0, 0, 1
0, 0.44, 0, 0, 1, 0.6300, 1, 0, 0
0, 0.65, 0, 0, 1, 0.8180, 1, 0, 0
0, 0.56, 1, 0, 0, 0.6370, 0, 0, 1
0, 0.52, 0, 0, 1, 0.5840, 0, 1, 0
0, 0.29, 0, 1, 0, 0.4860, 1, 0, 0
0, 0.47, 0, 1, 0, 0.5890, 0, 1, 0
1, 0.68, 1, 0, 0, 0.7260, 0, 0, 1
1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0
1, 0.61, 0, 1, 0, 0.6250, 0, 0, 1
1, 0.19, 0, 1, 0, 0.2150, 0, 0, 1
1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0
0, 0.26, 1, 0, 0, 0.4230, 1, 0, 0
1, 0.61, 0, 1, 0, 0.6740, 1, 0, 0
1, 0.40, 1, 0, 0, 0.4650, 0, 1, 0
0, 0.49, 1, 0, 0, 0.6520, 0, 1, 0
1, 0.56, 1, 0, 0, 0.6750, 1, 0, 0
0, 0.48, 0, 1, 0, 0.6600, 0, 1, 0
1, 0.52, 1, 0, 0, 0.5630, 0, 0, 1
0, 0.18, 1, 0, 0, 0.2980, 1, 0, 0
0, 0.56, 0, 0, 1, 0.5930, 0, 0, 1
0, 0.52, 0, 1, 0, 0.6440, 0, 1, 0
0, 0.18, 0, 1, 0, 0.2860, 0, 1, 0
0, 0.58, 1, 0, 0, 0.6620, 0, 0, 1
0, 0.39, 0, 1, 0, 0.5510, 0, 1, 0
0, 0.46, 1, 0, 0, 0.6290, 0, 1, 0
0, 0.40, 0, 1, 0, 0.4620, 0, 1, 0
0, 0.60, 1, 0, 0, 0.7270, 0, 0, 1
1, 0.36, 0, 1, 0, 0.4070, 0, 0, 1
1, 0.44, 1, 0, 0, 0.5230, 0, 1, 0
1, 0.28, 1, 0, 0, 0.3130, 0, 0, 1
1, 0.54, 0, 0, 1, 0.6260, 1, 0, 0
Test data:
# people_test.txt
#
0, 0.51, 1, 0, 0, 0.6120, 0, 1, 0
0, 0.32, 0, 1, 0, 0.4610, 0, 1, 0
1, 0.55, 1, 0, 0, 0.6270, 1, 0, 0
1, 0.25, 0, 0, 1, 0.2620, 0, 0, 1
1, 0.33, 0, 0, 1, 0.3730, 0, 0, 1
0, 0.29, 0, 1, 0, 0.4620, 1, 0, 0
1, 0.65, 1, 0, 0, 0.7270, 1, 0, 0
0, 0.43, 0, 1, 0, 0.5140, 0, 1, 0
0, 0.54, 0, 1, 0, 0.6480, 0, 0, 1
1, 0.61, 0, 1, 0, 0.7270, 1, 0, 0
1, 0.52, 0, 1, 0, 0.6360, 1, 0, 0
1, 0.30, 0, 1, 0, 0.3350, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3140, 0, 0, 1
0, 0.47, 0, 0, 1, 0.5940, 0, 1, 0
1, 0.39, 0, 1, 0, 0.4780, 0, 1, 0
1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0
0, 0.49, 1, 0, 0, 0.5860, 0, 1, 0
0, 0.63, 0, 0, 1, 0.6740, 0, 0, 1
0, 0.30, 1, 0, 0, 0.3920, 1, 0, 0
0, 0.61, 0, 0, 1, 0.6960, 0, 0, 1
0, 0.47, 0, 0, 1, 0.5870, 0, 1, 0
1, 0.30, 0, 0, 1, 0.3450, 0, 0, 1
0, 0.51, 0, 0, 1, 0.5800, 0, 1, 0
0, 0.24, 1, 0, 0, 0.3880, 0, 1, 0
0, 0.49, 1, 0, 0, 0.6450, 0, 1, 0
1, 0.66, 0, 0, 1, 0.7450, 1, 0, 0
0, 0.65, 1, 0, 0, 0.7690, 1, 0, 0
0, 0.46, 0, 1, 0, 0.5800, 1, 0, 0
0, 0.45, 0, 0, 1, 0.5180, 0, 1, 0
0, 0.47, 1, 0, 0, 0.6360, 1, 0, 0
0, 0.29, 1, 0, 0, 0.4480, 1, 0, 0
0, 0.57, 0, 0, 1, 0.6930, 0, 0, 1
0, 0.20, 1, 0, 0, 0.2870, 0, 0, 1
0, 0.35, 1, 0, 0, 0.4340, 0, 1, 0
0, 0.61, 0, 0, 1, 0.6700, 0, 0, 1
0, 0.31, 0, 0, 1, 0.3730, 0, 1, 0
1, 0.18, 1, 0, 0, 0.2080, 0, 0, 1
1, 0.26, 0, 0, 1, 0.2920, 0, 0, 1
0, 0.28, 1, 0, 0, 0.3640, 0, 0, 1
0, 0.59, 0, 0, 1, 0.6940, 0, 0, 1
