One morning before work, I was waiting for Rika, the dog groomer lady, to come to my house to clean up my two mutts (Riley and Kevin). I figured I’d make use of the wait time by implementing a logistic regression demo using JavaScript. So I did.
For the demo, I used one of my standard synthetic datasets. The goal is to predict the sex of a person (male = 0, female = 1) from age, state of residence (Michigan, Nebraska, Oklahoma), income, and political leaning (conservative, moderate, liberal).
The raw data looks like:
F  24  michigan  29500.00  liberal
M  39  oklahoma  51200.00  moderate
F  63  nebraska  75800.00  conservative
M  36  michigan  44500.00  moderate
F  27  nebraska  28600.00  liberal
. . .
The encoded and normalized comma-delimited data looks like:
1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1
0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0
1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0
0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0
1, 0.27, 0, 1, 0, 0.2860, 0, 0, 1
. . .
Each data item is a person. The fields are sex (male = 0, female = 1), age (divided by 100), state (Michigan = 100, Nebraska = 010, Oklahoma = 001), income (divided by $100,000), and political leaning (conservative = 100, moderate = 010, liberal = 001). There are 200 training items and 40 test items.
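The encoding scheme can be sketched in JavaScript. The helper below is hypothetical (it is not part of the demo program) and just shows how one raw record maps to the normalized, one-hot encoded form:

```javascript
// encode one raw record to normalized / one-hot form
// (hypothetical helper, not part of the demo program)
function encodePerson(sex, age, state, income, politics) {
  let states = { "michigan": [1, 0, 0], "nebraska": [0, 1, 0],
    "oklahoma": [0, 0, 1] };
  let pols = { "conservative": [1, 0, 0], "moderate": [0, 1, 0],
    "liberal": [0, 0, 1] };
  let y = (sex == "F") ? 1 : 0;   // male = 0, female = 1
  let x = [age / 100.0];          // age divided by 100
  x = x.concat(states[state]);    // one-hot state
  x.push(income / 100000.0);      // income divided by $100,000
  x = x.concat(pols[politics]);   // one-hot political leaning
  return [y, x];
}

let [y, x] = encodePerson("F", 24, "michigan", 29500.00, "liberal");
// y = 1, x = [0.24, 1, 0, 0, 0.295, 0, 0, 1]
```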
There are many ways to train a logistic regression model: gradient descent, dual coordinate ascent, Newton-Raphson, L-BFGS, Nelder–Mead, evolutionary optimization, and others. Each technique has pros and cons and each has many variations. I used batch gradient descent with weight decay. With a batch size of 10, the training algorithm looks like:
loop max_epochs = 1000 times
  shuffle order of training data items
  loop 200 / 10 = 20 batches times
    loop batch_size = 10 items times
      x = training input item
      y = actual target (0 or 1)
      p = computed pseudo-prob using x and curr weights
      accumulate gradient = x * (p - y)
    end-batch
    loop each weight, j
      wt[j] += -1 * lrn_rate * gradient[j]
    end-loop
    zero-out all gradients
  end-loop 20 batches
  decay all wts slightly
end-loop max_epochs
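The per-batch gradient accumulation and weight update in the pseudocode can be sketched in isolation. The snippet below is a minimal sketch using made-up toy values (two items, two predictor variables), not the demo program itself:

```javascript
// one batch-gradient update, sketched on toy data
let wts = [0.0, 0.0];  // current weights (toy values)
let bias = 0.0;
let lrnRate = 0.01;
let batchX = [[0.24, 0.295], [0.39, 0.512]];  // toy inputs
let batchY = [1, 0];                          // toy targets

let grads = [0.0, 0.0];
let biasGrad = 0.0;
for (let i = 0; i < batchX.length; ++i) {
  let z = bias;
  for (let j = 0; j < wts.length; ++j)
    z += wts[j] * batchX[i][j];
  let p = 1.0 / (1.0 + Math.exp(-z));  // pseudo-probability
  for (let j = 0; j < wts.length; ++j)
    grads[j] += batchX[i][j] * (p - batchY[i]);  // x * (p - y)
  biasGrad += (p - batchY[i]);
}
// one update, stepping against the accumulated gradient
for (let j = 0; j < wts.length; ++j)
  wts[j] += -1 * lrnRate * grads[j];
bias += -1 * lrnRate * biasGrad;
```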
The output of the demo program is:
Begin JavaScript logistic regression
Predict sex from age, State, income, politics

Loading data into memory

First four train x:
0.2400 1.00 0.00 0.00 0.2950 0.00 0.00 1.00
0.3900 0.00 0.00 1.00 0.5120 0.00 1.00 0.00
0.6300 0.00 1.00 0.00 0.7580 1.00 0.00 0.00
0.3600 1.00 0.00 0.00 0.4450 0.00 1.00 0.00

First four train y: 1 0 1 0

Creating logistic regression model
Done

Setting:
lrnRate = 0.0100
maxEpochs = 1000
batSize = 10
decay = 0.000100

Starting training
epoch =      0 |  loss = 0.2488
epoch =    200 |  loss = 0.2329
epoch =    400 |  loss = 0.2208
epoch =    600 |  loss = 0.2101
epoch =    800 |  loss = 0.2009
Done

Evaluating trained model
Accuracy on train data: 0.8150
Accuracy on test data: 0.7500

Confusion matrix test data:
-----------------------
actual 0:    16   10
actual 1:     0   14
-----------------------
predicted: 0 1

Model wts:
10.22 0.12 0.38 0.12 -10.22 0.61 0.12 -0.10
Model bias: 0.66

Predicting sex for [33, Nebraska, $50,000, conservative]:
p-val = 0.4773
class 0 (male)

End demo
Compared to non-batch ("online") SGD training, batch SGD training often produces a slightly better prediction model.
Weight decay is a surprisingly tricky subject. The main idea is to limit the magnitudes of the weights. Large weights can produce a model that is overfitted on the training data, and that therefore predicts poorly on new, previously unseen data. Weight decay is one of three common so-called regularization techniques; the other two are L1 regularization (penalize the sum of the absolute values of the weights) and L2 regularization (penalize the sum of the squared weight values). In a very surprising math result, when using plain SGD, L2 regularization and weight decay are mathematically equivalent. But L2 and decay are not equivalent for other training algorithms.
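The equivalence for plain SGD can be checked numerically. In the sketch below (illustrative toy values only), one SGD step where the L2 penalty term lambda * w is folded into the gradient gives the same result as one weight decay step with decay factor lrnRate * lambda:

```javascript
// one SGD step: L2 regularization vs. weight decay (toy values)
let w = 0.80;        // current weight
let grad = 0.30;     // gradient of the data loss only
let lr = 0.10;       // learning rate
let lambda = 0.05;   // L2 penalty strength

// L2: the penalty (lambda/2) * w^2 adds lambda * w to the gradient
let wL2 = w - lr * (grad + lambda * w);

// weight decay: shrink w by factor (1 - lr * lambda), then step
let wDecay = w * (1.0 - lr * lambda) - lr * grad;

console.log(wL2, wDecay);  // both approximately 0.766
```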
A nice way to spend some time waiting for Rika the dog grooming lady.

Machine learning techniques, including the humble logistic regression, are the engines of data science prediction. I grew up in the 1950s and 1960s, during the early years of space travel. It was as exciting as the current rapid development of AI. Five nice illustrations by artist Douglas Castleman, from left to right: The Redstone rocket put the first U.S. astronaut, Alan Shepard, into space (but not into orbit) in a Mercury capsule in 1961. The Atlas rocket put the first U.S. astronaut into orbit, John Glenn, in a Mercury capsule in 1962. The Titan rocket was used for the two-astronaut Gemini missions in 1965 and 1966. The Saturn 1B was used for early three-man Apollo missions in 1967-1969. The mammoth Saturn V rocket put the first men on the moon in Apollo 11 on July 20, 1969, one of the greatest achievements of human civilization.
Demo program:
// logistic_regression.js
// predict sex (0=M, 1=F) from age, State, income, politics
// gradient descent, batch training, weight decay
// node.js v16.14.1
let FS = require('fs'); // file load
class Logistic
{
constructor(seed)
{
this.seed = seed + 0.5; // virtual RNG
this.wts = null; // see train()
this.bias = 0.0;
}
predict(x)
{
// x is a vector
let z = 0.0;
for (let i = 0; i < this.wts.length; ++i)
z += this.wts[i] * x[i];
z += this.bias;
let p = 1.0 / (1.0 + Math.exp(-z));
return p;
}
// --------------------------------------------------------
train(trainX, trainY, lrnRate, maxEpochs, batSize, decay)
{
let dim = trainX[0].length;
this.wts = this.vecMake(dim, 0.0);
let grads = this.vecMake(dim, 0.0); // one per wt
let biasGrad = 0.0;
// init wts and bias
let lo = -0.01; let hi = 0.01;
for (let i = 0; i < dim; ++i)
this.wts[i] = (hi - lo) *
this.nextDouble() + lo;
// research is not clear whether to init the bias or not
// this.bias = (hi - lo) * this.nextDouble() + lo;
let n = trainX.length; // like 200
let batchesPerEpoch = Math.trunc(n / batSize); // like 20
let freq = Math.trunc(maxEpochs / 5); // to show progress
let indices = [];
for (let i = 0; i < n; ++i)
indices[i] = i;
for (let epoch = 0; epoch < maxEpochs; ++epoch) {
this.shuffle(indices); // new order each epoch
let ptr = 0; // points into indices
for (let bix = 0; bix < batchesPerEpoch; ++bix) {
for (let i = 0; i < batSize; ++i) // 0 . . 9
{
let ii = indices[ptr++]; // compute output
let x = trainX[ii];
let y = trainY[ii];
let p = this.predict(x);
// accumulate gradients
for (let j = 0; j < dim; ++j)
grads[j] += x[j] * (p - y);
biasGrad += 1.0 * (p - y);
} // end of curr batch
// time to update weights
for (let j = 0; j < dim; ++j)
this.wts[j] += -1 * lrnRate * grads[j];
this.bias += -1 * lrnRate * biasGrad;
// zero-out grads for next batch
for (let j = 0; j < dim; ++j)
grads[j] = 0.0;
biasGrad = 0.0;
} // all batches
// apply weight decay once per epoch
for (let j = 0; j < dim; ++j)
this.wts[j] *= (1.0 - decay);
if (epoch % freq == 0) { // show progress
let loss = this.mseLoss(trainX, trainY);
let s1 = "epoch = " +
epoch.toString().padStart(6) + " | ";
let s2 = " loss = " + loss.toFixed(4).toString();
console.log(s1 + s2);
}
} // epoch
} // train()
// --------------------------------------------------------
shuffle(arr)
{
// Fisher-Yates algorithm
let n = arr.length;
for (let i = 0; i < n; ++i) {
let ri = this.nextInt(i, n); // random index
let tmp = arr[ri];
arr[ri] = arr[i];
arr[i] = tmp;
}
}
// --------------------------------------------------------
mseLoss(dataX, dataY)
{
let sum = 0.0;
for (let i = 0; i < dataX.length; ++i) {
let x = dataX[i];
let y = dataY[i];
let p = this.predict(x);
sum += (y - p) * (y - p);
}
let mse = sum / dataX.length;
return mse;
}
// --------------------------------------------------------
accuracy(dataX, dataY)
{
let nCorrect = 0; let nWrong = 0;
for (let i = 0; i < dataX.length; ++i) {
let x = dataX[i];
let y = dataY[i];
let p = this.predict(x);
if ((y == 0 && p < 0.5) ||
(y == 1 && p >= 0.5)) {
++nCorrect;
}
else {
++nWrong;
}
}
let acc = (nCorrect * 1.0) / (nCorrect + nWrong);
return acc;
}
// --------------------------------------------------------
confusionMatrix(dataX, dataY)
{
let result = []; // 2x2
for (let i = 0; i < 2; ++i) {
result[i] = [];
for (let j = 0; j < 2; ++j) {
result[i][j] = 0;
}
}
let n = dataX.length;
for (let i = 0; i < n; ++i) {
let predY = this.predict(dataX[i]); // prob
let actualY = dataY[i]; // 0 or 1
if (actualY == 0 && predY < 0.5)
++result[0][0];
else if (actualY == 0 && predY >= 0.5)
++result[0][1];
else if (actualY == 1 && predY < 0.5)
++result[1][0];
else if (actualY == 1 && predY >= 0.5)
++result[1][1];
}
return result;
}
// --------------------------------------------------------
showConfusion(cm)
{
console.log("-----------------------");
process.stdout.write("actual 0: ");
process.stdout.write(cm[0][0].toString().padStart(5));
process.stdout.write(cm[0][1].toString().padStart(5));
console.log("");
process.stdout.write("actual 1: ");
process.stdout.write(cm[1][0].toString().padStart(5));
process.stdout.write(cm[1][1].toString().padStart(5));
console.log("");
console.log("-----------------------");
console.log("predicted: 0 1");
}
// --------------------------------------------------------
// --------------------------------------------------------
// helpers for class
vecMake(n, val)
{
let result = [];
for (let i = 0; i < n; ++i) {
result[i] = val;
}
return result;
}
// --------------------------------------------------------
nextDouble() // next double
{
// semi-sort-of random
let x = Math.sin(this.seed) * 1000;
let result = x - Math.floor(x); // [0.0,1.0)
this.seed = result; // for next call
return result;
}
// --------------------------------------------------------
nextInt(lo, hi) // [lo, hi)
{
let x = this.nextDouble();
return Math.trunc((hi - lo) * x + lo);
}
} // class Logistic
// ==========================================================
function main()
{
console.log("\nBegin JavaScript logistic regression ");
console.log("\nPredict sex from age, State," +
" income, politics ");
// 1. load data, looks like:
// 0 0.29 1 0 0 0.65400 0 0 1
// 1 0.36 0 0 1 0.58300 1 0 0
console.log("\nLoading data into memory ");
let trainX = loadTxt(".\\Data\\people_train.txt", ",",
[1,2,3,4,5,6,7,8], "#");
let trainY = loadTxt(".\\Data\\people_train.txt", ",",
[0], "#");
let testX = loadTxt(".\\Data\\people_test.txt", ",",
[1,2,3,4,5,6,7,8], "#");
let testY = loadTxt(".\\Data\\people_test.txt", ",",
[0], "#");
console.log("\nFirst four train x: ");
for (let i = 0; i < 4; ++i)
vecShow(trainX[i], 4, 8);
console.log("\nFirst four train y: ");
for (let i = 0; i < 4; ++i)
process.stdout.write(trainY[i].toString() + " ");
// 2. create model
console.log("\nCreating logistic regression model ");
let seed = 0;
let model = new Logistic(seed);
console.log("Done ");
// 3. train model
let lrnRate = 0.01;
let maxEpochs = 1000;
let batSize = 10;
let decay = 0.0001;
console.log("\nSetting: ");
console.log("lrnRate = " +
lrnRate.toFixed(4).toString());
console.log("maxEpochs = " + maxEpochs.toString());
console.log("batSize = " + batSize.toString());
console.log("decay = " + decay.toFixed(6).toString());
console.log("\nStarting training ");
model.train(trainX, trainY, lrnRate, maxEpochs,
batSize, decay);
console.log("Done");
// 4. evaluate model
console.log("\nEvaluating trained model ");
let accTrain = model.accuracy(trainX, trainY);
console.log("Accuracy on train data: " +
accTrain.toFixed(4).toString());
let accTest = model.accuracy(testX, testY);
console.log("Accuracy on test data: " +
accTest.toFixed(4).toString());
console.log("\nConfusion matrix test data: ");
let cm = model.confusionMatrix(testX, testY);
model.showConfusion(cm);
// 5. examine model
console.log("\nModel wts: ");
vecShow(model.wts, 2, 8); // 2 decimals, 8 per row
console.log("Model bias: " +
model.bias.toFixed(2).toString());
// 6. use model
console.log("\nPredicting sex for" +
" [33, Nebraska, $50,000, conservative]: ");
let x = [ 0.33, 0,1,0, 0.50000, 1,0,0 ];
let pVal = model.predict(x);
console.log("p-val = " + pVal.toFixed(4).toString());
if (pVal < 0.5)
console.log("class 0 (male) ");
else
console.log("class 1 (female) ");
// TODO: save model weights and bias values to text file
// for later use
console.log("\nEnd demo");
}
main();
// ==========================================================
// helpers for main()
function loadTxt(fn, delimit, usecols, comment) {
// efficient but mildly complicated
let all = FS.readFileSync(fn, "utf8"); // giant string
all = all.trim(); // strip final crlf in file
let lines = all.split("\n"); // array of lines
// count number non-comment lines
let nRows = 0;
for (let i = 0; i < lines.length; ++i) {
if (!lines[i].startsWith(comment))
++nRows;
}
let nCols = usecols.length;
//let result = matMake(nRows, nCols, 0.0);
let result = [];
for (let i = 0; i < nRows; ++i) {
result[i] = [];
for (let j = 0; j < nCols; ++j) {
result[i][j] = 0.0;
}
}
let r = 0; // into lines
let i = 0; // into result[][]
while (r < lines.length) {
if (lines[r].startsWith(comment)) {
++r; // next row
}
else {
let tokens = lines[r].split(delimit);
for (let j = 0; j < nCols; ++j) {
result[i][j] = parseFloat(tokens[usecols[j]]);
}
++r;
++i;
}
}
if (usecols.length > 1)
return result;
// if length of usecols is just 1, convert matrix to vector
if (usecols.length == 1) {
let vec = [];
let k = 0;
for (let i = 0; i < nRows; ++i)
for (let j = 0; j < nCols; ++j)
vec[k++] = result[i][j];
return vec;
}
//return result;
}
function vecShow(v, dec, len)
{
for (let i = 0; i < v.length; ++i) {
if (i != 0 && i % len == 0) {
process.stdout.write("\n");
}
if (v[i] >= 0.0) {
process.stdout.write(" "); // + or - space
}
process.stdout.write(v[i].toFixed(dec));
process.stdout.write(" ");
}
process.stdout.write("\n");
}
function matShow(m, dec)
{
let rows = m.length;
let cols = m[0].length;
for (let i = 0; i < rows; ++i) {
for (let j = 0; j < cols; ++j) {
if (m[i][j] >= 0.0) {
process.stdout.write(" "); // + or - space
}
process.stdout.write(m[i][j].toFixed(dec));
process.stdout.write(" ");
}
process.stdout.write("\n");
}
}
Training data:
# people_train.txt # sex (0 = male, 1 = female) - dependent variable # age, state (michigan, nebraska, oklahoma), income, # politics type (conservative, moderate, liberal) # 1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1 0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0 1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0 0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0 1, 0.27, 0, 1, 0, 0.2860, 0, 0, 1 1, 0.50, 0, 1, 0, 0.5650, 0, 1, 0 1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0 0, 0.19, 0, 0, 1, 0.3270, 1, 0, 0 1, 0.22, 0, 1, 0, 0.2770, 0, 1, 0 0, 0.39, 0, 0, 1, 0.4710, 0, 0, 1 1, 0.34, 1, 0, 0, 0.3940, 0, 1, 0 0, 0.22, 1, 0, 0, 0.3350, 1, 0, 0 1, 0.35, 0, 0, 1, 0.3520, 0, 0, 1 0, 0.33, 0, 1, 0, 0.4640, 0, 1, 0 1, 0.45, 0, 1, 0, 0.5410, 0, 1, 0 1, 0.42, 0, 1, 0, 0.5070, 0, 1, 0 0, 0.33, 0, 1, 0, 0.4680, 0, 1, 0 1, 0.25, 0, 0, 1, 0.3000, 0, 1, 0 0, 0.31, 0, 1, 0, 0.4640, 1, 0, 0 1, 0.27, 1, 0, 0, 0.3250, 0, 0, 1 1, 0.48, 1, 0, 0, 0.5400, 0, 1, 0 0, 0.64, 0, 1, 0, 0.7130, 0, 0, 1 1, 0.61, 0, 1, 0, 0.7240, 1, 0, 0 1, 0.54, 0, 0, 1, 0.6100, 1, 0, 0 1, 0.29, 1, 0, 0, 0.3630, 1, 0, 0 1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0 1, 0.55, 0, 0, 1, 0.6250, 1, 0, 0 1, 0.40, 1, 0, 0, 0.5240, 1, 0, 0 1, 0.22, 1, 0, 0, 0.2360, 0, 0, 1 1, 0.68, 0, 1, 0, 0.7840, 1, 0, 0 0, 0.60, 1, 0, 0, 0.7170, 0, 0, 1 0, 0.34, 0, 0, 1, 0.4650, 0, 1, 0 0, 0.25, 0, 0, 1, 0.3710, 1, 0, 0 0, 0.31, 0, 1, 0, 0.4890, 0, 1, 0 1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0 1, 0.58, 0, 1, 0, 0.6540, 0, 0, 1 0, 0.55, 0, 1, 0, 0.6070, 0, 0, 1 0, 0.43, 0, 1, 0, 0.5110, 0, 1, 0 0, 0.43, 0, 0, 1, 0.5320, 0, 1, 0 0, 0.21, 1, 0, 0, 0.3720, 1, 0, 0 1, 0.55, 0, 0, 1, 0.6460, 1, 0, 0 1, 0.64, 0, 1, 0, 0.7480, 1, 0, 0 0, 0.41, 1, 0, 0, 0.5880, 0, 1, 0 1, 0.64, 0, 0, 1, 0.7270, 1, 0, 0 0, 0.56, 0, 0, 1, 0.6660, 0, 0, 1 1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0 0, 0.65, 0, 0, 1, 0.7010, 0, 0, 1 1, 0.55, 0, 0, 1, 0.6430, 1, 0, 0 0, 0.25, 1, 0, 0, 0.4030, 1, 0, 0 1, 0.46, 0, 0, 1, 0.5100, 0, 1, 0 0, 0.36, 1, 0, 0, 0.5350, 1, 0, 0 1, 0.52, 0, 1, 0, 0.5810, 0, 1, 0 1, 0.61, 0, 0, 1, 0.6790, 1, 0, 0 1, 0.57, 0, 0, 1, 
0.6570, 1, 0, 0 0, 0.46, 0, 1, 0, 0.5260, 0, 1, 0 0, 0.62, 1, 0, 0, 0.6680, 0, 0, 1 1, 0.55, 0, 0, 1, 0.6270, 1, 0, 0 0, 0.22, 0, 0, 1, 0.2770, 0, 1, 0 0, 0.50, 1, 0, 0, 0.6290, 1, 0, 0 0, 0.32, 0, 1, 0, 0.4180, 0, 1, 0 0, 0.21, 0, 0, 1, 0.3560, 1, 0, 0 1, 0.44, 0, 1, 0, 0.5200, 0, 1, 0 1, 0.46, 0, 1, 0, 0.5170, 0, 1, 0 1, 0.62, 0, 1, 0, 0.6970, 1, 0, 0 1, 0.57, 0, 1, 0, 0.6640, 1, 0, 0 0, 0.67, 0, 0, 1, 0.7580, 0, 0, 1 1, 0.29, 1, 0, 0, 0.3430, 0, 0, 1 1, 0.53, 1, 0, 0, 0.6010, 1, 0, 0 0, 0.44, 1, 0, 0, 0.5480, 0, 1, 0 1, 0.46, 0, 1, 0, 0.5230, 0, 1, 0 0, 0.20, 0, 1, 0, 0.3010, 0, 1, 0 0, 0.38, 1, 0, 0, 0.5350, 0, 1, 0 1, 0.50, 0, 1, 0, 0.5860, 0, 1, 0 1, 0.33, 0, 1, 0, 0.4250, 0, 1, 0 0, 0.33, 0, 1, 0, 0.3930, 0, 1, 0 1, 0.26, 0, 1, 0, 0.4040, 1, 0, 0 1, 0.58, 1, 0, 0, 0.7070, 1, 0, 0 1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0 0, 0.46, 1, 0, 0, 0.6440, 1, 0, 0 1, 0.60, 1, 0, 0, 0.7170, 1, 0, 0 0, 0.42, 1, 0, 0, 0.4890, 0, 1, 0 0, 0.56, 0, 0, 1, 0.5640, 0, 0, 1 0, 0.62, 0, 1, 0, 0.6630, 0, 0, 1 0, 0.50, 1, 0, 0, 0.6480, 0, 1, 0 1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0 0, 0.67, 0, 1, 0, 0.8040, 0, 0, 1 0, 0.40, 0, 0, 1, 0.5040, 0, 1, 0 1, 0.42, 0, 1, 0, 0.4840, 0, 1, 0 1, 0.64, 1, 0, 0, 0.7200, 1, 0, 0 0, 0.47, 1, 0, 0, 0.5870, 0, 0, 1 1, 0.45, 0, 1, 0, 0.5280, 0, 1, 0 0, 0.25, 0, 0, 1, 0.4090, 1, 0, 0 1, 0.38, 1, 0, 0, 0.4840, 1, 0, 0 1, 0.55, 0, 0, 1, 0.6000, 0, 1, 0 0, 0.44, 1, 0, 0, 0.6060, 0, 1, 0 1, 0.33, 1, 0, 0, 0.4100, 0, 1, 0 1, 0.34, 0, 0, 1, 0.3900, 0, 1, 0 1, 0.27, 0, 1, 0, 0.3370, 0, 0, 1 1, 0.32, 0, 1, 0, 0.4070, 0, 1, 0 1, 0.42, 0, 0, 1, 0.4700, 0, 1, 0 0, 0.24, 0, 0, 1, 0.4030, 1, 0, 0 1, 0.42, 0, 1, 0, 0.5030, 0, 1, 0 1, 0.25, 0, 0, 1, 0.2800, 0, 0, 1 1, 0.51, 0, 1, 0, 0.5800, 0, 1, 0 0, 0.55, 0, 1, 0, 0.6350, 0, 0, 1 1, 0.44, 1, 0, 0, 0.4780, 0, 0, 1 0, 0.18, 1, 0, 0, 0.3980, 1, 0, 0 0, 0.67, 0, 1, 0, 0.7160, 0, 0, 1 1, 0.45, 0, 0, 1, 0.5000, 0, 1, 0 1, 0.48, 1, 0, 0, 0.5580, 0, 1, 0 0, 0.25, 0, 1, 0, 0.3900, 0, 1, 0 0, 0.67, 1, 0, 0, 0.7830, 0, 1, 0 1, 0.37, 0, 
0, 1, 0.4200, 0, 1, 0 0, 0.32, 1, 0, 0, 0.4270, 0, 1, 0 1, 0.48, 1, 0, 0, 0.5700, 0, 1, 0 0, 0.66, 0, 0, 1, 0.7500, 0, 0, 1 1, 0.61, 1, 0, 0, 0.7000, 1, 0, 0 0, 0.58, 0, 0, 1, 0.6890, 0, 1, 0 1, 0.19, 1, 0, 0, 0.2400, 0, 0, 1 1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0 0, 0.27, 1, 0, 0, 0.3640, 0, 1, 0 1, 0.42, 1, 0, 0, 0.4800, 0, 1, 0 1, 0.60, 1, 0, 0, 0.7130, 1, 0, 0 0, 0.27, 0, 0, 1, 0.3480, 1, 0, 0 1, 0.29, 0, 1, 0, 0.3710, 1, 0, 0 0, 0.43, 1, 0, 0, 0.5670, 0, 1, 0 1, 0.48, 1, 0, 0, 0.5670, 0, 1, 0 1, 0.27, 0, 0, 1, 0.2940, 0, 0, 1 0, 0.44, 1, 0, 0, 0.5520, 1, 0, 0 1, 0.23, 0, 1, 0, 0.2630, 0, 0, 1 0, 0.36, 0, 1, 0, 0.5300, 0, 0, 1 1, 0.64, 0, 0, 1, 0.7250, 1, 0, 0 1, 0.29, 0, 0, 1, 0.3000, 0, 0, 1 0, 0.33, 1, 0, 0, 0.4930, 0, 1, 0 0, 0.66, 0, 1, 0, 0.7500, 0, 0, 1 0, 0.21, 0, 0, 1, 0.3430, 1, 0, 0 1, 0.27, 1, 0, 0, 0.3270, 0, 0, 1 1, 0.29, 1, 0, 0, 0.3180, 0, 0, 1 0, 0.31, 1, 0, 0, 0.4860, 0, 1, 0 1, 0.36, 0, 0, 1, 0.4100, 0, 1, 0 1, 0.49, 0, 1, 0, 0.5570, 0, 1, 0 0, 0.28, 1, 0, 0, 0.3840, 1, 0, 0 0, 0.43, 0, 0, 1, 0.5660, 0, 1, 0 0, 0.46, 0, 1, 0, 0.5880, 0, 1, 0 1, 0.57, 1, 0, 0, 0.6980, 1, 0, 0 0, 0.52, 0, 0, 1, 0.5940, 0, 1, 0 0, 0.31, 0, 0, 1, 0.4350, 0, 1, 0 0, 0.55, 1, 0, 0, 0.6200, 0, 0, 1 1, 0.50, 1, 0, 0, 0.5640, 0, 1, 0 1, 0.48, 0, 1, 0, 0.5590, 0, 1, 0 0, 0.22, 0, 0, 1, 0.3450, 1, 0, 0 1, 0.59, 0, 0, 1, 0.6670, 1, 0, 0 1, 0.34, 1, 0, 0, 0.4280, 0, 0, 1 0, 0.64, 1, 0, 0, 0.7720, 0, 0, 1 1, 0.29, 0, 0, 1, 0.3350, 0, 0, 1 0, 0.34, 0, 1, 0, 0.4320, 0, 1, 0 0, 0.61, 1, 0, 0, 0.7500, 0, 0, 1 1, 0.64, 0, 0, 1, 0.7110, 1, 0, 0 0, 0.29, 1, 0, 0, 0.4130, 1, 0, 0 1, 0.63, 0, 1, 0, 0.7060, 1, 0, 0 0, 0.29, 0, 1, 0, 0.4000, 1, 0, 0 0, 0.51, 1, 0, 0, 0.6270, 0, 1, 0 0, 0.24, 0, 0, 1, 0.3770, 1, 0, 0 1, 0.48, 0, 1, 0, 0.5750, 0, 1, 0 1, 0.18, 1, 0, 0, 0.2740, 1, 0, 0 1, 0.18, 1, 0, 0, 0.2030, 0, 0, 1 1, 0.33, 0, 1, 0, 0.3820, 0, 0, 1 0, 0.20, 0, 0, 1, 0.3480, 1, 0, 0 1, 0.29, 0, 0, 1, 0.3300, 0, 0, 1 0, 0.44, 0, 0, 1, 0.6300, 1, 0, 0 0, 0.65, 0, 0, 1, 0.8180, 1, 0, 0 0, 
0.56, 1, 0, 0, 0.6370, 0, 0, 1 0, 0.52, 0, 0, 1, 0.5840, 0, 1, 0 0, 0.29, 0, 1, 0, 0.4860, 1, 0, 0 0, 0.47, 0, 1, 0, 0.5890, 0, 1, 0 1, 0.68, 1, 0, 0, 0.7260, 0, 0, 1 1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0 1, 0.61, 0, 1, 0, 0.6250, 0, 0, 1 1, 0.19, 0, 1, 0, 0.2150, 0, 0, 1 1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0 0, 0.26, 1, 0, 0, 0.4230, 1, 0, 0 1, 0.61, 0, 1, 0, 0.6740, 1, 0, 0 1, 0.40, 1, 0, 0, 0.4650, 0, 1, 0 0, 0.49, 1, 0, 0, 0.6520, 0, 1, 0 1, 0.56, 1, 0, 0, 0.6750, 1, 0, 0 0, 0.48, 0, 1, 0, 0.6600, 0, 1, 0 1, 0.52, 1, 0, 0, 0.5630, 0, 0, 1 0, 0.18, 1, 0, 0, 0.2980, 1, 0, 0 0, 0.56, 0, 0, 1, 0.5930, 0, 0, 1 0, 0.52, 0, 1, 0, 0.6440, 0, 1, 0 0, 0.18, 0, 1, 0, 0.2860, 0, 1, 0 0, 0.58, 1, 0, 0, 0.6620, 0, 0, 1 0, 0.39, 0, 1, 0, 0.5510, 0, 1, 0 0, 0.46, 1, 0, 0, 0.6290, 0, 1, 0 0, 0.40, 0, 1, 0, 0.4620, 0, 1, 0 0, 0.60, 1, 0, 0, 0.7270, 0, 0, 1 1, 0.36, 0, 1, 0, 0.4070, 0, 0, 1 1, 0.44, 1, 0, 0, 0.5230, 0, 1, 0 1, 0.28, 1, 0, 0, 0.3130, 0, 0, 1 1, 0.54, 0, 0, 1, 0.6260, 1, 0, 0
Test data:
# people_test.txt # 0, 0.51, 1, 0, 0, 0.6120, 0, 1, 0 0, 0.32, 0, 1, 0, 0.4610, 0, 1, 0 1, 0.55, 1, 0, 0, 0.6270, 1, 0, 0 1, 0.25, 0, 0, 1, 0.2620, 0, 0, 1 1, 0.33, 0, 0, 1, 0.3730, 0, 0, 1 0, 0.29, 0, 1, 0, 0.4620, 1, 0, 0 1, 0.65, 1, 0, 0, 0.7270, 1, 0, 0 0, 0.43, 0, 1, 0, 0.5140, 0, 1, 0 0, 0.54, 0, 1, 0, 0.6480, 0, 0, 1 1, 0.61, 0, 1, 0, 0.7270, 1, 0, 0 1, 0.52, 0, 1, 0, 0.6360, 1, 0, 0 1, 0.30, 0, 1, 0, 0.3350, 0, 0, 1 1, 0.29, 1, 0, 0, 0.3140, 0, 0, 1 0, 0.47, 0, 0, 1, 0.5940, 0, 1, 0 1, 0.39, 0, 1, 0, 0.4780, 0, 1, 0 1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0 0, 0.49, 1, 0, 0, 0.5860, 0, 1, 0 0, 0.63, 0, 0, 1, 0.6740, 0, 0, 1 0, 0.30, 1, 0, 0, 0.3920, 1, 0, 0 0, 0.61, 0, 0, 1, 0.6960, 0, 0, 1 0, 0.47, 0, 0, 1, 0.5870, 0, 1, 0 1, 0.30, 0, 0, 1, 0.3450, 0, 0, 1 0, 0.51, 0, 0, 1, 0.5800, 0, 1, 0 0, 0.24, 1, 0, 0, 0.3880, 0, 1, 0 0, 0.49, 1, 0, 0, 0.6450, 0, 1, 0 1, 0.66, 0, 0, 1, 0.7450, 1, 0, 0 0, 0.65, 1, 0, 0, 0.7690, 1, 0, 0 0, 0.46, 0, 1, 0, 0.5800, 1, 0, 0 0, 0.45, 0, 0, 1, 0.5180, 0, 1, 0 0, 0.47, 1, 0, 0, 0.6360, 1, 0, 0 0, 0.29, 1, 0, 0, 0.4480, 1, 0, 0 0, 0.57, 0, 0, 1, 0.6930, 0, 0, 1 0, 0.20, 1, 0, 0, 0.2870, 0, 0, 1 0, 0.35, 1, 0, 0, 0.4340, 0, 1, 0 0, 0.61, 0, 0, 1, 0.6700, 0, 0, 1 0, 0.31, 0, 0, 1, 0.3730, 0, 1, 0 1, 0.18, 1, 0, 0, 0.2080, 0, 0, 1 1, 0.26, 0, 0, 1, 0.2920, 0, 0, 1 0, 0.28, 1, 0, 0, 0.3640, 0, 0, 1 0, 0.59, 0, 0, 1, 0.6940, 0, 0, 1
