I write code almost every day. Like many skills, writing code must be practiced, plus I just enjoy it. And in the emerging world of AI-generated software, it's still important to write code by hand so that good prompts can be written and the resulting code can be evaluated.
One morning before work, I figured I'd run the well-known Diabetes Dataset through a random forest regression model. Based on previous experiments with other models, including linear regression, quadratic regression, neural network regression, and a scikit-learn DecisionTreeRegressor model, I was pretty sure the C# random forest model would give poor prediction accuracy. And that's exactly what happened.
The Diabetes Dataset looks like:
59, 2, 32.1, 101.00, 157, 93.2, 38, 4.00, 4.8598, 87, 151
48, 1, 21.6, 87.00, 183, 103.2, 70, 3.00, 3.8918, 69, 75
72, 2, 30.5, 93.00, 156, 93.6, 41, 4.00, 4.6728, 85, 141
. . .
Each line represents a patient. The first 10 values on each line are predictors. The last value on each line is the target value (a diabetes metric) to predict. The predictors are: age, sex, body mass index, blood pressure, serum cholesterol, low-density lipoproteins, high-density lipoproteins, total cholesterol, triglycerides, blood sugar. There are 442 data items.
The sex encoding isn't explained anywhere, but I suspect male = 1, female = 2, because there are 235 1-values and 207 2-values.
Note that this is the diabetes dataset that is included in the scikit-learn demo datasets, not the similar Pima Diabetes dataset.
I converted the sex encoding from 1,2 to 0,1. Then I applied divide-by-constant normalization, dividing the 10 predictor columns by (100, 1, 100, 1000, 1000, 1000, 100, 10, 10, 1000) and the target y values by 1000. The resulting encoded and normalized data looks like:
0.5900, 1.0000, 0.3210, . . . 0.1510
0.4800, 0.0000, 0.2160, . . . 0.0750
0.7200, 1.0000, 0.3050, . . . 0.1410
. . .
Normalization isn't necessary for decision tree models, but it doesn't hurt. I split the 442 items into a 342-item training set and a 100-item test set.
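The encode-and-normalize step is simple to do programmatically. Here is a minimal sketch of the idea, assuming the raw data has already been loaded into a double[][] array. The EncodeAndNormalize() helper name is mine and isn't part of the demo program below.

// minimal sketch: recode sex from 1,2 to 0,1 then divide each column by a constant
static void EncodeAndNormalize(double[][] data)
{
  // divisors for the 10 predictor columns plus the target column
  double[] divisors = new double[] { 100, 1, 100, 1000,
    1000, 1000, 100, 10, 10, 1000, 1000 };
  for (int i = 0; i < data.Length; ++i)
  {
    data[i][1] -= 1.0; // sex: 1,2 -> 0,1
    for (int j = 0; j < 11; ++j)
      data[i][j] /= divisors[j];
  }
}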
I implemented a random forest tree regression model, from scratch, using the C# language. The idea of a random forest model is to create a collection of basic decision trees, each of which is trained on a different subset of the training data. This reduces model overfitting which is the main problem with decision trees.
I used 100 trees for the random forest. I set the depth of each tree to 5. I specified that a random 300 out of the 342 rows of training data would be used to train each tree. I specified that a random 9 out of the 10 columns would be used during each tree split operation during training — this is a secondary regularization technique.
The output of the demo is below. A prediction is counted as correct if it is within 10 percent of the true target value:
Begin C# Random Forest regression demo

Loading diabetes train (342) and test (100) data
Done

First three train X:
0.5900  1.0000  0.3210 . . .  0.0870
0.4800  0.0000  0.2160 . . .  0.0690
0.7200  1.0000  0.3050 . . .  0.0850

First three train y:
0.1510
0.0750
0.1410

Setting nTrees = 100
Setting maxDepth = 5
Setting minSamples = 2
Setting minLeaf = 1
Setting nCols = 9
Setting nRows = 300

Creating and training random forest regression model
Done

Accuracy train (within 0.10) = 0.2456
Accuracy test (within 0.10) = 0.2400

MSE train = 0.0016
MSE test = 0.0032

Predicting for trainX[0] = 0.5900 1.0000 0.3210 . . . 0.0870
Predicted y = 0.1923

End demo
These results were essentially the same as those I got using other regression techniques such as quadratic regression and neural network regression.
I have done a lot of experiments with the Diabetes Dataset and I've concluded that the default target value in the last column (a patient diabetes score) simply cannot be predicted well. But the variables in columns [4], [5], [6], [7], and [8] can be meaningfully predicted from the other columns.
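Retargeting the demo to predict one of those columns is just a matter of changing the column indexes fed to the MatLoad() helper in the demo program below. For example, a sketch that sets up column [4] (serum cholesterol) as the target, using the other 10 columns as predictors:

// hypothetical retarget: predict column [4] instead of column [10]
int[] colsX = new int[] { 0, 1, 2, 3, 5, 6, 7, 8, 9, 10 };
int colY = 4;
double[][] trainX = MatLoad(trainFile, colsX, ',', "#");
double[] trainY = MatToVec(MatLoad(trainFile,
  new int[] { colY }, ',', "#"));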
Good fun on a rainy morning in the Pacific Northwest.

I see beauty in machine learning code. Physical beauty in people is far more subjective. Actresses considered beautiful in one era might be thought only average at best in another.
Here are three actresses from the 1920s who meet modern standards of beauty, at least according to my non-expert opinion.
Left: Myrna Loy (1905-1993) had a very long career, acting from 1925 until 1981. I like several of the films in which she appeared, especially “The Mask of Fu Manchu” (1932), “The Thin Man” (1934), and “Cheaper by the Dozen” (1950).
Center: Bessie Love (born Juanita Horton, 1898–1986) is not well-remembered today. She had a very long career, acting from 1916 (silent film) to 1980. The three movies of hers that I know well and like a lot are “The Lost World” (1925), “Battle Beneath the Earth” (1967), and “On Her Majesty’s Secret Service” (1969).
Right: Louise Brooks (1906-1985) acted in 24 films from 1925-1938. She was well-known for popularizing a bob hair style in the 1920s. The one movie that she appeared in that I am most familiar with is “Overland Stage Raiders” (1938), with actor John Wayne. Like many actresses of the 1920s through 1950s, she ended her days as a serial mistress passed around from one movie executive to another.
Demo program.
using System;
using System.IO;
using System.Collections.Generic;
namespace RandomForestRegression
{
internal class RandomForestRegressionProgram
{
static void Main(string[] args)
{
Console.WriteLine("\nBegin C# Random Forest" +
" regression demo ");
// 1. load data
Console.WriteLine("\nLoading diabetes train (342)" +
" and test (100) data");
string trainFile =
"..\\..\\..\\Data\\diabetes_norm_train_342.txt";
int[] colsX = new int[] { 0, 1, 2, 3, 4,
5, 6, 7, 8, 9};
int colY = 10;
double[][] trainX =
MatLoad(trainFile, colsX, ',', "#");
double[] trainY =
MatToVec(MatLoad(trainFile,
new int[] { colY }, ',', "#"));
string testFile =
"..\\..\\..\\Data\\diabetes_norm_test_100.txt";
double[][] testX =
MatLoad(testFile, colsX, ',', "#");
double[] testY =
MatToVec(MatLoad(testFile,
new int[] { colY }, ',', "#"));
Console.WriteLine("Done ");
Console.WriteLine("\nFirst three train X: ");
for (int i = 0; i "lt" 3; ++i)
VecShow(trainX[i], 4, 8);
Console.WriteLine("\nFirst three train y: ");
for (int i = 0; i "lt" 3; ++i)
Console.WriteLine(trainY[i].ToString("F4").
PadLeft(8));
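// 2. create and train model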
int nTrees = 100;
int maxDepth = 5;
int minSamples = 2;
int minLeaf = 1;
int nCols = 9; // using all 10 would give a plain bagging tree
int nRows = 300; // rows (of the 342 train items) for each tree
Console.WriteLine("\nSetting nTrees = " + nTrees);
Console.WriteLine("Setting maxDepth = " + maxDepth);
Console.WriteLine("Setting minSamples = " + minSamples);
Console.WriteLine("Setting minLeaf = " + minLeaf);
Console.WriteLine("Setting nCols = " + nCols);
Console.WriteLine("Setting nRows = " + nRows);
Console.WriteLine("\nCreating and training" +
" random forest regression model ");
RandomForestRegressor rfr =
new RandomForestRegressor(nTrees, maxDepth,
minSamples, minLeaf, nCols, nRows, seed: 0);
rfr.Train(trainX, trainY);
Console.WriteLine("Done ");
// 3. evaluate model
double accTrain = rfr.Accuracy(trainX, trainY, 0.10);
Console.WriteLine("\nAccuracy train (within 0.10) = " +
accTrain.ToString("F4"));
double accTest = rfr.Accuracy(testX, testY, 0.10);
Console.WriteLine("Accuracy test (within 0.10) = " +
accTest.ToString("F4"));
double mseTrain = rfr.MSE(trainX, trainY);
Console.WriteLine("\nMSE train = " +
mseTrain.ToString("F4"));
double mseTest = rfr.MSE(testX, testY);
Console.WriteLine("MSE test = " +
mseTest.ToString("F4"));
// 4. use model
double[] x = trainX[0];
Console.WriteLine("\nPredicting for x = ");
VecShow(x, 4, 9);
double predY = rfr.Predict(x);
Console.WriteLine("y = " + predY.ToString("F4"));
Console.WriteLine("\nEnd demo ");
Console.ReadLine();
} // Main()
// ------------------------------------------------------
// helpers for Main()
// ------------------------------------------------------
static double[][] MatLoad(string fn, int[] usecols,
char sep, string comment)
{
List"lt"double[]"gt" result =
new List"lt"double[]"gt"();
string line = "";
FileStream ifs = new FileStream(fn, FileMode.Open);
StreamReader sr = new StreamReader(ifs);
while ((line = sr.ReadLine()) != null)
{
if (line.StartsWith(comment) == true)
continue;
string[] tokens = line.Split(sep);
List"lt"double"gt" lst = new List"lt"double"gt"();
for (int j = 0; j "lt" usecols.Length; ++j)
lst.Add(double.Parse(tokens[usecols[j]]));
double[] row = lst.ToArray();
result.Add(row);
}
sr.Close(); ifs.Close();
return result.ToArray();
}
static double[] MatToVec(double[][] M)
{
int nRows = M.Length;
int nCols = M[0].Length;
double[] result = new double[nRows * nCols];
int k = 0;
for (int i = 0; i "lt" nRows; ++i)
for (int j = 0; j "lt" nCols; ++j)
result[k++] = M[i][j];
return result;
}
static void VecShow(double[] vec, int dec, int wid)
{
for (int i = 0; i "lt" vec.Length; ++i)
Console.Write(vec[i].ToString("F" + dec).
PadLeft(wid));
Console.WriteLine("");
}
} // class Program
// ========================================================
public class RandomForestRegressor
{
public int nTrees;
public int maxDepth;
public int minSamples;
public int minLeaf;
public int nSplitCols; // number cols to use each split
public int nRows;
public List"lt"DecisionTreeRegressor"gt" trees;
public Random rnd;
public RandomForestRegressor(int nTrees, int maxDepth,
int minSamples, int minLeaf, int nCols, int nRows,
int seed = 0)
{
this.nTrees = nTrees;
this.maxDepth = maxDepth;
this.minSamples = minSamples;
this.minLeaf = minLeaf;
this.nRows = nRows;
this.nSplitCols = nCols; // pass to DecisionTree
this.trees = new List"lt"DecisionTreeRegressor"gt"();
this.rnd = new Random(seed);
}
// ------------------------------------------------------
public void Train(double[][] trainX, double[] trainY)
{
int nr = trainX.Length;
// int nc = trainX[0].Length;
for (int i = 0; i "lt" this.nTrees; ++i) // each tree
{
// construct data rows-subset
int[] rndRows =
GetRandomRowIdxs(nr, this.nRows, this.rnd);
double[][] subsetX =
MakeRowsSubsetX(trainX, rndRows);
double[] subsetY =
MakeSubsetY(trainY, rndRows);
// make and train curr tree
DecisionTreeRegressor dtr =
new DecisionTreeRegressor(this.maxDepth,
this.minSamples, this.minLeaf, this.nSplitCols);
// random cols will be used during splits
dtr.Train(subsetX, subsetY);
this.trees.Add(dtr);
} // i
}
// ------------------------------------------------------
public double Predict(double[] x)
{
double sum = 0.0;
for (int t = 0; t "lt" this.nTrees; ++t)
sum += this.trees[t].Predict(x);
return sum / this.nTrees;
}
// ------------------------------------------------------
public double Accuracy(double[][] dataX, double[] dataY,
double pctClose)
{
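// a prediction is correct if it is within pctClose (e.g., 0.10 = 10%) of the true value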
int numCorrect = 0; int numWrong = 0;
for (int i = 0; i "lt" dataX.Length; ++i)
{
double actualY = dataY[i];
double predY = this.Predict(dataX[i]);
if (Math.Abs(predY - actualY) <
Math.Abs(pctClose * actualY))
++numCorrect;
else
++numWrong;
}
return (numCorrect * 1.0) / (numWrong + numCorrect);
}
// ------------------------------------------------------
public double MSE(double[][] dataX, double[] dataY)
{
int n = dataX.Length;
double sum = 0.0;
for (int i = 0; i "lt" n; ++i)
{
double actualY = dataY[i];
double predY = this.Predict(dataX[i]);
sum += (actualY - predY) * (actualY - predY);
}
return sum / n;
}
// ------------------------------------------------------
private static int[] GetRandomRowIdxs(int N, int n,
Random rnd)
{
// pick n rows from N with replacement
int[] result = new int[n];
for (int i = 0; i "lt" n; ++i)
result[i] = rnd.Next(0, N);
Array.Sort(result);
return result;
}
private static double[][] MakeRowsSubsetX(double[][] trainX,
int[] rndRows)
{
int nCols = trainX[0].Length; // use all cols
double[][] result = new double[rndRows.Length][];
for (int i = 0; i "lt" rndRows.Length; ++i)
result[i] = new double[nCols];
for (int i = 0; i "lt" rndRows.Length; ++i)
{
for (int j = 0; j "lt" nCols; ++j)
{
int r = rndRows[i]; // random row in trainX
result[i][j] = trainX[r][j];
}
}
return result;
}
private static double[] MakeSubsetY(double[] trainY,
int[] rndRows)
{
double[] result = new double[rndRows.Length];
for (int i = 0; i "lt" rndRows.Length; ++i)
result[i] = trainY[rndRows[i]];
return result;
}
} // class RandomForestRegressor
// ========================================================
public class DecisionTreeRegressor
{
public int maxDepth;
public int minSamples; // aka min_samples_split
public int minLeaf; // min number of values in a leaf
public int numSplitCols; // mostly for random forest
public List"lt"Node"gt" tree = new List"lt"Node"gt"();
public Random rnd; // order in which cols are searched
public double[][] trainX; // store data by ref
public double[] trainY;
// ------------------------------------------------------
public class Node
{
public int id;
public int colIdx; // aka featureIdx
public double thresh;
public int left; // index into List
public int right;
public double value;
public bool isLeaf;
public List"lt"int"gt" rows; // assoc rows in train data
public Node()
{
this.id = -1;
this.colIdx = -1;
this.thresh = 0.0; // aka split value
this.left = -1;
this.right = -1;
this.value = 0.0; // aka pred y
this.isLeaf = false;
this.rows = null;
}
} // class Node
// --------------------------------------------
public DecisionTreeRegressor(int maxDepth = 2,
int minSamples = 2, int minLeaf = 1,
int numSplitCols = -1, int seed = 0)
{
// if maxDepth = 0, tree has just a root node
// if maxDepth = 1, at most 3 nodes (root, l, r)
// if maxDepth = n, at most 2^(n+1) - 1 nodes
this.maxDepth = maxDepth;
this.minSamples = minSamples;
this.minLeaf = minLeaf;
this.numSplitCols = numSplitCols; // for ran. forest
// create full tree List with null nodes
int numNodes = (int)Math.Pow(2, (maxDepth + 1)) - 1;
for (int i = 0; i "lt" numNodes; ++i)
{
this.tree.Add(null); // empty nodes
}
this.rnd = new Random(seed);
}
// ------------------------------------------------------
// public: Train(), Predict()
// helpers: MakeTree(), BestSplit(), TreeTargetMean(),
// TreeTargetVariance().
// ------------------------------------------------------
public void Train(double[][] trainX, double[] trainY)
{
this.trainX = trainX; // store by reference
this.trainY = trainY;
this.MakeTree();
// delete rows in each Node to save space
// when tree is part of an ensemble
for (int i = 0; i "lt" this.tree.Count; ++i)
if (this.tree[i] != null)
this.tree[i].rows = null;
}
// ------------------------------------------------------
public double Predict(double[] x)
{
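// walk from the root at index 0 down to a leaf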
int p = 0;
Node currNode = this.tree[p];
while (currNode != null &&
currNode.isLeaf == false &&
p "lt" this.tree.Count)
{
if (x[currNode.colIdx] "lte" currNode.thresh)
p = currNode.left;
else
p = currNode.right;
currNode = this.tree[p];
}
return this.tree[p].value;
}
// ------------------------------------------------------
// helpers: MakeTree(), BestSplit(),
// TreeTargetMean(), TreeTargetVariance()
private void MakeTree()
{
// no recursion, no pointers, List storage, no stack
if (this.numSplitCols == -1) // use all cols
this.numSplitCols = this.trainX[0].Length;
// prepare root node
List"lt"int"gt" allRows = new List"lt"int"gt"();
for (int i = 0; i "lt" this.trainX.Length; ++i)
allRows.Add(i);
double grandMean = this.TreeTargetMean(allRows);
// wait to supply colIdx and thresh in loop
Node root = new Node();
root.id = 0;
root.left = 1;
root.right = 2;
root.value = grandMean;
root.isLeaf = false; // already set
root.rows = allRows;
this.tree[0] = root;
for (int i = 0; i "lt" this.tree.Count; ++i)
{
Node currNode = this.tree[i];
// curr node has values for everything
// except colIdx and thresh
// if curr node is too deep to have children, OR
// curr node has too few rows to split, then
// leave both children as null
if (currNode == null ||
currNode.rows.Count == 0) { continue; }
// if parent cannot be split, make parent a leaf
if (currNode.id "gte" (int)Math.Pow(2,
(this.maxDepth)) - 1 ||
currNode.rows.Count "lt" this.minSamples)
{
currNode.isLeaf = true;
continue;
}
// parent has enough rows to try to split
double[] splitInfo = this.BestSplit(currNode.rows);
int colIdx = (int)splitInfo[0];
double splitVal = splitInfo[1]; //split value
if (colIdx == -1) // parent is a leaf
{
currNode.isLeaf = true;
continue;
}
// complete the fields for curr node
currNode.colIdx = colIdx;
currNode.thresh = splitVal;
// construct the children, except colIdx and thresh
Node leftNode = new Node();
Node rightNode = new Node();
// construct the children rows using split info
// all info except colIdx and thresh
List"lt"int"gt" leftIdxs = new List"lt"int"gt"();
List"lt"int"gt" rightIdxs = new List"lt"int"gt"();
for (int k = 0; k "lt" currNode.rows.Count; ++k)
{
int r = currNode.rows[k];
if (this.trainX[r][colIdx] "lte" splitVal)
leftIdxs.Add(r);
else
rightIdxs.Add(r);
}
// assign -1 to children if out of range
leftNode.id = currNode.id * 2 + 1;
if (leftNode.id "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) leftNode.id = -1;
leftNode.left = leftNode.id * 2 + 1;
if (leftNode.left "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) leftNode.left = -1;
leftNode.right = leftNode.id * 2 + 2;
if (leftNode.right "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) leftNode.right = -1;
leftNode.rows = leftIdxs;
leftNode.value =
this.TreeTargetMean(leftNode.rows);
this.tree[leftNode.id] = leftNode;
rightNode.id = currNode.id * 2 + 2;
if (rightNode.id "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) rightNode.id = -1;
rightNode.left = rightNode.id * 2 + 1;
if (rightNode.left "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) rightNode.left = -1;
rightNode.right = rightNode.id * 2 + 2;
if (rightNode.right "gt" (int)Math.Pow(2,
(maxDepth + 1)) - 2) rightNode.right = -1;
rightNode.rows = rightIdxs;
rightNode.value =
this.TreeTargetMean(rightNode.rows);
this.tree[rightNode.id] = rightNode;
} // i
return;
}
// ------------------------------------------------------
private double[] BestSplit(List<int> rows)
{
// implicit params: numSplitCols, minLeaf
// result[0] = best col idx (as double)
// result[1] = best split value
rows.Sort();
int bestColIdx = -1; // indicates bad split
double bestThresh = 0.0;
double bestVar = double.MaxValue; // smaller better
int nRows = rows.Count; // or dataY.Length
int nCols = this.trainX[0].Length;
if (nRows == 0)
{
throw new Exception("empty data in BestSplit()");
}
// process cols in scrambled order
int[] colIndices = new int[nCols];
for (int k = 0; k "lt" nCols; ++k)
colIndices[k] = k;
// shuffle, inline Fisher-Yates
int n = colIndices.Length;
for (int i = 0; i "lt" n; ++i)
{
int ri = rnd.Next(i, n); // be careful
int tmp = colIndices[i];
colIndices[i] = colIndices[ri];
colIndices[ri] = tmp;
}
// numSplitCols is usually all columns (-1)
for (int j = 0; j "lt" this.numSplitCols; ++j)
{
int colIdx = colIndices[j];
HashSet"lt"double"gt" examineds =
new HashSet"lt"double"gt"();
for (int i = 0; i "lt" nRows; ++i) // each row
{
// if curr thresh has already been seen, skip it
double thresh = this.trainX[rows[i]][colIdx];
if (examineds.Contains(thresh)) continue;
examineds.Add(thresh);
// get row idxs where x is <= or > thresh
List<int> leftIdxs = new List<int>();
List<int> rightIdxs = new List<int>();
for (int k = 0; k < nRows; ++k)
{
if (this.trainX[rows[k]][colIdx] <= thresh)
leftIdxs.Add(rows[k]);
else
rightIdxs.Add(rows[k]);
}
// Check if proposed split has too few values
if (leftIdxs.Count "lt" this.minLeaf ||
rightIdxs.Count "lt" this.minLeaf)
continue; // to next row
double leftVar =
this.TreeTargetVariance(leftIdxs);
double rightVar =
this.TreeTargetVariance(rightIdxs);
double weightedVar = (leftIdxs.Count * leftVar +
rightIdxs.Count * rightVar) / nRows;
if (weightedVar "lt" bestVar)
{
// if this never happens, bestColIdx remains -1
// which means a bad split. used in MakeTree()
bestColIdx = colIdx;
bestThresh = thresh;
bestVar = weightedVar;
}
} // each row
} // j each col
double[] result = new double[2]; // out params ugly
result[0] = 1.0 * bestColIdx;
result[1] = bestThresh;
return result;
} // BestSplit()
// ------------------------------------------------------
private double TreeTargetMean(List<int> rows)
{
// mean of rows items in trainY: for node prediction
double sum = 0.0;
for (int i = 0; i "lt" rows.Count; ++i)
{
int r = rows[i];
sum += this.trainY[r];
}
return sum / rows.Count;
}
// ------------------------------------------------------
private double TreeTargetVariance(List<int> rows)
{
double mean = this.TreeTargetMean(rows);
double sum = 0.0;
for (int i = 0; i "lt" rows.Count; ++i)
{
int r = rows[i];
sum += (this.trainY[r] - mean) *
(this.trainY[r] - mean);
}
return sum / rows.Count;
}
// ------------------------------------------------------
} // class DecisionTreeRegressor
// ========================================================
} // ns
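The demo expects the data below to be saved as diabetes_norm_train_342.txt and diabetes_norm_test_100.txt. The relative path ..\..\..\Data\ used in Main() typically resolves to a Data directory in the project root when the program is run from Visual Studio's default output directory.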
Training data:
# diabetes_norm_train_342.txt
# cols [0] to [9] predictors. col [10] target
# norm division constants:
# 100, -1, 100, 1000, 1000, 1000, 100, 10, 10, 1000, 1000
#
0.5900, 1.0000, 0.3210, 0.1010, 0.1570, 0.0932, 0.3800, 0.4000, 0.4860, 0.0870, 0.1510
0.4800, 0.0000, 0.2160, 0.0870, 0.1830, 0.1032, 0.7000, 0.3000, 0.3892, 0.0690, 0.0750
0.7200, 1.0000, 0.3050, 0.0930, 0.1560, 0.0936, 0.4100, 0.4000, 0.4673, 0.0850, 0.1410
0.2400, 0.0000, 0.2530, 0.0840, 0.1980, 0.1314, 0.4000, 0.5000, 0.4890, 0.0890, 0.2060
0.5000, 0.0000, 0.2300, 0.1010, 0.1920, 0.1254, 0.5200, 0.4000, 0.4291, 0.0800, 0.1350
0.2300, 0.0000, 0.2260, 0.0890, 0.1390, 0.0648, 0.6100, 0.2000, 0.4190, 0.0680, 0.0970
0.3600, 1.0000, 0.2200, 0.0900, 0.1600, 0.0996, 0.5000, 0.3000, 0.3951, 0.0820, 0.1380
0.6600, 1.0000, 0.2620, 0.1140, 0.2550, 0.1850, 0.5600, 0.4550, 0.4249, 0.0920, 0.0630
0.6000, 1.0000, 0.3210, 0.0830, 0.1790, 0.1194, 0.4200, 0.4000, 0.4477, 0.0940, 0.1100
0.2900, 0.0000, 0.3000, 0.0850, 0.1800, 0.0934, 0.4300, 0.4000, 0.5385, 0.0880, 0.3100
0.2200, 0.0000, 0.1860, 0.0970, 0.1140, 0.0576, 0.4600, 0.2000, 0.3951, 0.0830, 0.1010
0.5600, 1.0000, 0.2800, 0.0850, 0.1840, 0.1448, 0.3200, 0.6000, 0.3584, 0.0770, 0.0690
0.5300, 0.0000, 0.2370, 0.0920, 0.1860, 0.1092, 0.6200, 0.3000, 0.4304, 0.0810, 0.1790
0.5000, 1.0000, 0.2620, 0.0970, 0.1860, 0.1054, 0.4900, 0.4000, 0.5063, 0.0880, 0.1850
0.6100, 0.0000, 0.2400, 0.0910, 0.2020, 0.1154, 0.7200, 0.3000, 0.4291, 0.0730, 0.1180
0.3400, 1.0000, 0.2470, 0.1180, 0.2540, 0.1842, 0.3900, 0.7000, 0.5037, 0.0810, 0.1710
0.4700, 0.0000, 0.3030, 0.1090, 0.2070, 0.1002, 0.7000, 0.3000, 0.5215, 0.0980, 0.1660
0.6800, 1.0000, 0.2750, 0.1110, 0.2140, 0.1470, 0.3900, 0.5000, 0.4942, 0.0910, 0.1440
0.3800, 0.0000, 0.2540, 0.0840, 0.1620, 0.1030, 0.4200, 0.4000, 0.4443, 0.0870, 0.0970
0.4100, 0.0000, 0.2470, 0.0830, 0.1870, 0.1082, 0.6000, 0.3000, 0.4543, 0.0780, 0.1680
0.3500, 0.0000, 0.2110, 0.0820, 0.1560, 0.0878, 0.5000, 0.3000, 0.4511, 0.0950, 0.0680
0.2500, 1.0000, 0.2430, 0.0950, 0.1620, 0.0986, 0.5400, 0.3000, 0.3850, 0.0870, 0.0490
0.2500, 0.0000, 0.2600, 0.0920, 0.1870, 0.1204, 0.5600, 0.3000, 0.3970, 0.0880, 0.0680
0.6100, 1.0000, 0.3200, 0.1037, 0.2100, 0.0852, 0.3500, 0.6000, 0.6107, 0.1240, 0.2450
0.3100, 0.0000, 0.2970, 0.0880, 0.1670, 0.1034, 0.4800, 0.4000, 0.4357, 0.0780, 0.1840
0.3000, 1.0000, 0.2520, 0.0830, 0.1780, 0.1184, 0.3400, 0.5000, 0.4852, 0.0830, 0.2020
0.1900, 0.0000, 0.1920, 0.0870, 0.1240, 0.0540, 0.5700, 0.2000, 0.4174, 0.0900, 0.1370
0.4200, 0.0000, 0.3190, 0.0830, 0.1580, 0.0876, 0.5300, 0.3000, 0.4466, 0.1010, 0.0850
0.6300, 0.0000, 0.2440, 0.0730, 0.1600, 0.0914, 0.4800, 0.3000, 0.4635, 0.0780, 0.1310
0.6700, 1.0000, 0.2580, 0.1130, 0.1580, 0.0542, 0.6400, 0.2000, 0.5293, 0.1040, 0.2830
0.3200, 0.0000, 0.3050, 0.0890, 0.1820, 0.1106, 0.5600, 0.3000, 0.4344, 0.0890, 0.1290
0.4200, 0.0000, 0.2030, 0.0710, 0.1610, 0.0812, 0.6600, 0.2000, 0.4234, 0.0810, 0.0590
0.5800, 1.0000, 0.3800, 0.1030, 0.1500, 0.1072, 0.2200, 0.7000, 0.4644, 0.0980, 0.3410
0.5700, 0.0000, 0.2170, 0.0940, 0.1570, 0.0580, 0.8200, 0.2000, 0.4443, 0.0920, 0.0870
0.5300, 0.0000, 0.2050, 0.0780, 0.1470, 0.0842, 0.5200, 0.3000, 0.3989, 0.0750, 0.0650
0.6200, 1.0000, 0.2350, 0.0803, 0.2250, 0.1128, 0.8600, 0.2620, 0.4875, 0.0960, 0.1020
0.5200, 0.0000, 0.2850, 0.1100, 0.1950, 0.0972, 0.6000, 0.3000, 0.5242, 0.0850, 0.2650
0.4600, 0.0000, 0.2740, 0.0780, 0.1710, 0.0880, 0.5800, 0.3000, 0.4828, 0.0900, 0.2760
0.4800, 1.0000, 0.3300, 0.1230, 0.2530, 0.1636, 0.4400, 0.6000, 0.5425, 0.0970, 0.2520
0.4800, 1.0000, 0.2770, 0.0730, 0.1910, 0.1194, 0.4600, 0.4000, 0.4852, 0.0920, 0.0900
0.5000, 1.0000, 0.2560, 0.1010, 0.2290, 0.1622, 0.4300, 0.5000, 0.4779, 0.1140, 0.1000
0.2100, 0.0000, 0.2010, 0.0630, 0.1350, 0.0690, 0.5400, 0.3000, 0.4094, 0.0890, 0.0550
0.3200, 1.0000, 0.2540, 0.0903, 0.1530, 0.1004, 0.3400, 0.4500, 0.4533, 0.0830, 0.0610
0.5400, 0.0000, 0.2420, 0.0740, 0.2040, 0.1090, 0.8200, 0.2000, 0.4174, 0.1090, 0.0920
0.6100, 1.0000, 0.3270, 0.0970, 0.1770, 0.1184, 0.2900, 0.6000, 0.4997, 0.0870, 0.2590
0.5600, 1.0000, 0.2310, 0.1040, 0.1810, 0.1164, 0.4700, 0.4000, 0.4477, 0.0790, 0.0530
0.3300, 0.0000, 0.2530, 0.0850, 0.1550, 0.0850, 0.5100, 0.3000, 0.4554, 0.0700, 0.1900
0.2700, 0.0000, 0.1960, 0.0780, 0.1280, 0.0680, 0.4300, 0.3000, 0.4443, 0.0710, 0.1420
0.6700, 1.0000, 0.2250, 0.0980, 0.1910, 0.1192, 0.6100, 0.3000, 0.3989, 0.0860, 0.0750
0.3700, 1.0000, 0.2770, 0.0930, 0.1800, 0.1194, 0.3000, 0.6000, 0.5030, 0.0880, 0.1420
0.5800, 0.0000, 0.2570, 0.0990, 0.1570, 0.0916, 0.4900, 0.3000, 0.4407, 0.0930, 0.1550
0.6500, 1.0000, 0.2790, 0.1030, 0.1590, 0.0968, 0.4200, 0.4000, 0.4615, 0.0860, 0.2250
0.3400, 0.0000, 0.2550, 0.0930, 0.2180, 0.1440, 0.5700, 0.4000, 0.4443, 0.0880, 0.0590
0.4600, 0.0000, 0.2490, 0.1150, 0.1980, 0.1296, 0.5400, 0.4000, 0.4277, 0.1030, 0.1040
0.3500, 0.0000, 0.2870, 0.0970, 0.2040, 0.1268, 0.6400, 0.3000, 0.4190, 0.0930, 0.1820
0.3700, 0.0000, 0.2180, 0.0840, 0.1840, 0.1010, 0.7300, 0.3000, 0.3912, 0.0930, 0.1280
0.3700, 0.0000, 0.3020, 0.0870, 0.1660, 0.0960, 0.4000, 0.4150, 0.5011, 0.0870, 0.0520
0.4100, 0.0000, 0.2050, 0.0800, 0.1240, 0.0488, 0.6400, 0.2000, 0.4025, 0.0750, 0.0370
0.6000, 0.0000, 0.2040, 0.1050, 0.1980, 0.0784, 0.9900, 0.2000, 0.4635, 0.0790, 0.1700
0.6600, 1.0000, 0.2400, 0.0980, 0.2360, 0.1464, 0.5800, 0.4000, 0.5063, 0.0960, 0.1700
0.2900, 0.0000, 0.2600, 0.0830, 0.1410, 0.0652, 0.6400, 0.2000, 0.4078, 0.0830, 0.0610
0.3700, 1.0000, 0.2680, 0.0790, 0.1570, 0.0980, 0.2800, 0.6000, 0.5043, 0.0960, 0.1440
0.4100, 1.0000, 0.2570, 0.0830, 0.1810, 0.1066, 0.6600, 0.3000, 0.3738, 0.0850, 0.0520
0.3900, 0.0000, 0.2290, 0.0770, 0.2040, 0.1432, 0.4600, 0.4000, 0.4304, 0.0740, 0.1280
0.6700, 1.0000, 0.2400, 0.0830, 0.1430, 0.0772, 0.4900, 0.3000, 0.4431, 0.0940, 0.0710
0.3600, 1.0000, 0.2410, 0.1120, 0.1930, 0.1250, 0.3500, 0.6000, 0.5106, 0.0950, 0.1630
0.4600, 1.0000, 0.2470, 0.0850, 0.1740, 0.1232, 0.3000, 0.6000, 0.4644, 0.0960, 0.1500
0.6000, 1.0000, 0.2500, 0.0897, 0.1850, 0.1208, 0.4600, 0.4020, 0.4511, 0.0920, 0.0970
0.5900, 1.0000, 0.2360, 0.0830, 0.1650, 0.1000, 0.4700, 0.4000, 0.4500, 0.0920, 0.1600
0.5300, 0.0000, 0.2210, 0.0930, 0.1340, 0.0762, 0.4600, 0.3000, 0.4078, 0.0960, 0.1780
0.4800, 0.0000, 0.1990, 0.0910, 0.1890, 0.1096, 0.6900, 0.3000, 0.3951, 0.1010, 0.0480
0.4800, 0.0000, 0.2950, 0.1310, 0.2070, 0.1322, 0.4700, 0.4000, 0.4935, 0.1060, 0.2700
0.6600, 1.0000, 0.2600, 0.0910, 0.2640, 0.1466, 0.6500, 0.4000, 0.5568, 0.0870, 0.2020
0.5200, 1.0000, 0.2450, 0.0940, 0.2170, 0.1494, 0.4800, 0.5000, 0.4585, 0.0890, 0.1110
0.5200, 1.0000, 0.2660, 0.1110, 0.2090, 0.1264, 0.6100, 0.3000, 0.4682, 0.1090, 0.0850
0.4600, 1.0000, 0.2350, 0.0870, 0.1810, 0.1148, 0.4400, 0.4000, 0.4710, 0.0980, 0.0420
0.4000, 1.0000, 0.2900, 0.1150, 0.0970, 0.0472, 0.3500, 0.2770, 0.4304, 0.0950, 0.1700
0.2200, 0.0000, 0.2300, 0.0730, 0.1610, 0.0978, 0.5400, 0.3000, 0.3829, 0.0910, 0.2000
0.5000, 0.0000, 0.2100, 0.0880, 0.1400, 0.0718, 0.3500, 0.4000, 0.5112, 0.0710, 0.2520
0.2000, 0.0000, 0.2290, 0.0870, 0.1910, 0.1282, 0.5300, 0.4000, 0.3892, 0.0850, 0.1130
0.6800, 0.0000, 0.2750, 0.1070, 0.2410, 0.1496, 0.6400, 0.4000, 0.4920, 0.0900, 0.1430
0.5200, 1.0000, 0.2430, 0.0860, 0.1970, 0.1336, 0.4400, 0.5000, 0.4575, 0.0910, 0.0510
0.4400, 0.0000, 0.2310, 0.0870, 0.2130, 0.1264, 0.7700, 0.3000, 0.3871, 0.0720, 0.0520
0.3800, 0.0000, 0.2730, 0.0810, 0.1460, 0.0816, 0.4700, 0.3000, 0.4466, 0.0810, 0.2100
0.4900, 0.0000, 0.2270, 0.0653, 0.1680, 0.0962, 0.6200, 0.2710, 0.3892, 0.0600, 0.0650
0.6100, 0.0000, 0.3300, 0.0950, 0.1820, 0.1148, 0.5400, 0.3000, 0.4190, 0.0740, 0.1410
0.2900, 1.0000, 0.1940, 0.0830, 0.1520, 0.1058, 0.3900, 0.4000, 0.3584, 0.0830, 0.0550
0.6100, 0.0000, 0.2580, 0.0980, 0.2350, 0.1258, 0.7600, 0.3000, 0.5112, 0.0820, 0.1340
0.3400, 1.0000, 0.2260, 0.0750, 0.1660, 0.0918, 0.6000, 0.3000, 0.4263, 0.1080, 0.0420
0.3600, 0.0000, 0.2190, 0.0890, 0.1890, 0.1052, 0.6800, 0.3000, 0.4369, 0.0960, 0.1110
0.5200, 0.0000, 0.2400, 0.0830, 0.1670, 0.0866, 0.7100, 0.2000, 0.3850, 0.0940, 0.0980
0.6100, 0.0000, 0.3120, 0.0790, 0.2350, 0.1568, 0.4700, 0.5000, 0.5050, 0.0960, 0.1640
0.4300, 0.0000, 0.2680, 0.1230, 0.1930, 0.1022, 0.6700, 0.3000, 0.4779, 0.0940, 0.0480
0.3500, 0.0000, 0.2040, 0.0650, 0.1870, 0.1056, 0.6700, 0.2790, 0.4277, 0.0780, 0.0960
0.2700, 0.0000, 0.2480, 0.0910, 0.1890, 0.1068, 0.6900, 0.3000, 0.4190, 0.0690, 0.0900
0.2900, 0.0000, 0.2100, 0.0710, 0.1560, 0.0970, 0.3800, 0.4000, 0.4654, 0.0900, 0.1620
0.6400, 1.0000, 0.2730, 0.1090, 0.1860, 0.1076, 0.3800, 0.5000, 0.5308, 0.0990, 0.1500
0.4100, 0.0000, 0.3460, 0.0873, 0.2050, 0.1426, 0.4100, 0.5000, 0.4673, 0.1100, 0.2790
0.4900, 1.0000, 0.2590, 0.0910, 0.1780, 0.1066, 0.5200, 0.3000, 0.4575, 0.0750, 0.0920
0.4800, 0.0000, 0.2040, 0.0980, 0.2090, 0.1394, 0.4600, 0.5000, 0.4771, 0.0780, 0.0830
0.5300, 0.0000, 0.2800, 0.0880, 0.2330, 0.1438, 0.5800, 0.4000, 0.5050, 0.0910, 0.1280
0.5300, 1.0000, 0.2220, 0.1130, 0.1970, 0.1152, 0.6700, 0.3000, 0.4304, 0.1000, 0.1020
0.2300, 0.0000, 0.2900, 0.0900, 0.2160, 0.1314, 0.6500, 0.3000, 0.4585, 0.0910, 0.3020
0.6500, 1.0000, 0.3020, 0.0980, 0.2190, 0.1606, 0.4000, 0.5000, 0.4522, 0.0840, 0.1980
0.4100, 0.0000, 0.3240, 0.0940, 0.1710, 0.1044, 0.5600, 0.3000, 0.3970, 0.0760, 0.0950
0.5500, 1.0000, 0.2340, 0.0830, 0.1660, 0.1016, 0.4600, 0.4000, 0.4522, 0.0960, 0.0530
0.2200, 0.0000, 0.1930, 0.0820, 0.1560, 0.0932, 0.5200, 0.3000, 0.3989, 0.0710, 0.1340
0.5600, 0.0000, 0.3100, 0.0787, 0.1870, 0.1414, 0.3400, 0.5500, 0.4060, 0.0900, 0.1440
0.5400, 1.0000, 0.3060, 0.1033, 0.1440, 0.0798, 0.3000, 0.4800, 0.5142, 0.1010, 0.2320
0.5900, 1.0000, 0.2550, 0.0953, 0.1900, 0.1394, 0.3500, 0.5430, 0.4357, 0.1170, 0.0810
0.6000, 1.0000, 0.2340, 0.0880, 0.1530, 0.0898, 0.5800, 0.3000, 0.3258, 0.0950, 0.1040
0.5400, 0.0000, 0.2680, 0.0870, 0.2060, 0.1220, 0.6800, 0.3000, 0.4382, 0.0800, 0.0590
0.2500, 0.0000, 0.2830, 0.0870, 0.1930, 0.1280, 0.4900, 0.4000, 0.4382, 0.0920, 0.2460
0.5400, 1.0000, 0.2770, 0.1130, 0.2000, 0.1284, 0.3700, 0.5000, 0.5153, 0.1130, 0.2970
0.5500, 0.0000, 0.3660, 0.1130, 0.1990, 0.0944, 0.4300, 0.4630, 0.5730, 0.0970, 0.2580
0.4000, 1.0000, 0.2650, 0.0930, 0.2360, 0.1470, 0.3700, 0.7000, 0.5561, 0.0920, 0.2290
0.6200, 1.0000, 0.3180, 0.1150, 0.1990, 0.1286, 0.4400, 0.5000, 0.4883, 0.0980, 0.2750
0.6500, 0.0000, 0.2440, 0.1200, 0.2220, 0.1356, 0.3700, 0.6000, 0.5509, 0.1240, 0.2810
0.3300, 1.0000, 0.2540, 0.1020, 0.2060, 0.1410, 0.3900, 0.5000, 0.4868, 0.1050, 0.1790
0.5300, 0.0000, 0.2200, 0.0940, 0.1750, 0.0880, 0.5900, 0.3000, 0.4942, 0.0980, 0.2000
0.3500, 0.0000, 0.2680, 0.0980, 0.1620, 0.1036, 0.4500, 0.4000, 0.4205, 0.0860, 0.2000
0.6600, 0.0000, 0.2800, 0.1010, 0.1950, 0.1292, 0.4000, 0.5000, 0.4860, 0.0940, 0.1730
0.6200, 1.0000, 0.3390, 0.1010, 0.2210, 0.1564, 0.3500, 0.6000, 0.4997, 0.1030, 0.1800
0.5000, 1.0000, 0.2960, 0.0943, 0.3000, 0.2424, 0.3300, 0.9090, 0.4812, 0.1090, 0.0840
0.4700, 0.0000, 0.2860, 0.0970, 0.1640, 0.0906, 0.5600, 0.3000, 0.4466, 0.0880, 0.1210
0.4700, 1.0000, 0.2560, 0.0940, 0.1650, 0.0748, 0.4000, 0.4000, 0.5526, 0.0930, 0.1610
0.2400, 0.0000, 0.2070, 0.0870, 0.1490, 0.0806, 0.6100, 0.2000, 0.3611, 0.0780, 0.0990
0.5800, 1.0000, 0.2620, 0.0910, 0.2170, 0.1242, 0.7100, 0.3000, 0.4691, 0.0680, 0.1090
0.3400, 0.0000, 0.2060, 0.0870, 0.1850, 0.1122, 0.5800, 0.3000, 0.4304, 0.0740, 0.1150
0.5100, 0.0000, 0.2790, 0.0960, 0.1960, 0.1222, 0.4200, 0.5000, 0.5069, 0.1200, 0.2680
0.3100, 1.0000, 0.3530, 0.1250, 0.1870, 0.1124, 0.4800, 0.4000, 0.4890, 0.1090, 0.2740
0.2200, 0.0000, 0.1990, 0.0750, 0.1750, 0.1086, 0.5400, 0.3000, 0.4127, 0.0720, 0.1580
0.5300, 1.0000, 0.2440, 0.0920, 0.2140, 0.1460, 0.5000, 0.4000, 0.4500, 0.0970, 0.1070
0.3700, 1.0000, 0.2140, 0.0830, 0.1280, 0.0696, 0.4900, 0.3000, 0.3850, 0.0840, 0.0830
0.2800, 0.0000, 0.3040, 0.0850, 0.1980, 0.1156, 0.6700, 0.3000, 0.4344, 0.0800, 0.1030
0.4700, 0.0000, 0.3160, 0.0840, 0.1540, 0.0880, 0.3000, 0.5100, 0.5199, 0.1050, 0.2720
0.2300, 0.0000, 0.1880, 0.0780, 0.1450, 0.0720, 0.6300, 0.2000, 0.3912, 0.0860, 0.0850
0.5000, 0.0000, 0.3100, 0.1230, 0.1780, 0.1050, 0.4800, 0.4000, 0.4828, 0.0880, 0.2800
0.5800, 1.0000, 0.3670, 0.1170, 0.1660, 0.0938, 0.4400, 0.4000, 0.4949, 0.1090, 0.3360
0.5500, 0.0000, 0.3210, 0.1100, 0.1640, 0.0842, 0.4200, 0.4000, 0.5242, 0.0900, 0.2810
0.6000, 1.0000, 0.2770, 0.1070, 0.1670, 0.1146, 0.3800, 0.4000, 0.4277, 0.0950, 0.1180
0.4100, 0.0000, 0.3080, 0.0810, 0.2140, 0.1520, 0.2800, 0.7600, 0.5136, 0.1230, 0.3170
0.6000, 1.0000, 0.2750, 0.1060, 0.2290, 0.1438, 0.5100, 0.4000, 0.5142, 0.0910, 0.2350
0.4000, 0.0000, 0.2690, 0.0920, 0.2030, 0.1198, 0.7000, 0.3000, 0.4190, 0.0810, 0.0600
0.5700, 1.0000, 0.3070, 0.0900, 0.2040, 0.1478, 0.3400, 0.6000, 0.4710, 0.0930, 0.1740
0.3700, 0.0000, 0.3830, 0.1130, 0.1650, 0.0946, 0.5300, 0.3000, 0.4466, 0.0790, 0.2590
0.4000, 1.0000, 0.3190, 0.0950, 0.1980, 0.1356, 0.3800, 0.5000, 0.4804, 0.0930, 0.1780
0.3300, 0.0000, 0.3500, 0.0890, 0.2000, 0.1304, 0.4200, 0.4760, 0.4927, 0.1010, 0.1280
0.3200, 1.0000, 0.2780, 0.0890, 0.2160, 0.1462, 0.5500, 0.4000, 0.4304, 0.0910, 0.0960
0.3500, 1.0000, 0.2590, 0.0810, 0.1740, 0.1024, 0.3100, 0.6000, 0.5313, 0.0820, 0.1260
0.5500, 0.0000, 0.3290, 0.1020, 0.1640, 0.1062, 0.4100, 0.4000, 0.4431, 0.0890, 0.2880
0.4900, 0.0000, 0.2600, 0.0930, 0.1830, 0.1002, 0.6400, 0.3000, 0.4543, 0.0880, 0.0880
0.3900, 1.0000, 0.2630, 0.1150, 0.2180, 0.1582, 0.3200, 0.7000, 0.4935, 0.1090, 0.2920
0.6000, 1.0000, 0.2230, 0.1130, 0.1860, 0.1258, 0.4600, 0.4000, 0.4263, 0.0940, 0.0710
0.6700, 1.0000, 0.2830, 0.0930, 0.2040, 0.1322, 0.4900, 0.4000, 0.4736, 0.0920, 0.1970
0.4100, 1.0000, 0.3200, 0.1090, 0.2510, 0.1706, 0.4900, 0.5000, 0.5056, 0.1030, 0.1860
0.4400, 0.0000, 0.2540, 0.0950, 0.1620, 0.0926, 0.5300, 0.3000, 0.4407, 0.0830, 0.0250
0.4800, 1.0000, 0.2330, 0.0893, 0.2120, 0.1428, 0.4600, 0.4610, 0.4754, 0.0980, 0.0840
0.4500, 0.0000, 0.2030, 0.0743, 0.1900, 0.1262, 0.4900, 0.3880, 0.4304, 0.0790, 0.0960
0.4700, 0.0000, 0.3040, 0.1200, 0.1990, 0.1200, 0.4600, 0.4000, 0.5106, 0.0870, 0.1950
0.4600, 0.0000, 0.2060, 0.0730, 0.1720, 0.1070, 0.5100, 0.3000, 0.4249, 0.0800, 0.0530
0.3600, 1.0000, 0.3230, 0.1150, 0.2860, 0.1994, 0.3900, 0.7000, 0.5472, 0.1120, 0.2170
0.3400, 0.0000, 0.2920, 0.0730, 0.1720, 0.1082, 0.4900, 0.4000, 0.4304, 0.0910, 0.1720
0.5300, 1.0000, 0.3310, 0.1170, 0.1830, 0.1190, 0.4800, 0.4000, 0.4382, 0.1060, 0.1310
0.6100, 0.0000, 0.2460, 0.1010, 0.2090, 0.1068, 0.7700, 0.3000, 0.4836, 0.0880, 0.2140
0.3700, 0.0000, 0.2020, 0.0810, 0.1620, 0.0878, 0.6300, 0.3000, 0.4025, 0.0880, 0.0590
0.3300, 1.0000, 0.2080, 0.0840, 0.1250, 0.0702, 0.4600, 0.3000, 0.3784, 0.0660, 0.0700
0.6800, 0.0000, 0.3280, 0.1057, 0.2050, 0.1164, 0.4000, 0.5130, 0.5493, 0.1170, 0.2200
0.4900, 1.0000, 0.3190, 0.0940, 0.2340, 0.1558, 0.3400, 0.7000, 0.5398, 0.1220, 0.2680
0.4800, 0.0000, 0.2390, 0.1090, 0.2320, 0.1052, 0.3700, 0.6000, 0.6107, 0.0960, 0.1520
0.5500, 1.0000, 0.2450, 0.0840, 0.1790, 0.1058, 0.6600, 0.3000, 0.3584, 0.0870, 0.0470
0.4300, 0.0000, 0.2210, 0.0660, 0.1340, 0.0772, 0.4500, 0.3000, 0.4078, 0.0800, 0.0740
0.6000, 1.0000, 0.3300, 0.0970, 0.2170, 0.1256, 0.4500, 0.5000, 0.5447, 0.1120, 0.2950
0.3100, 1.0000, 0.1900, 0.0930, 0.1370, 0.0730, 0.4700, 0.3000, 0.4443, 0.0780, 0.1010
0.5300, 1.0000, 0.2730, 0.0820, 0.1190, 0.0550, 0.3900, 0.3000, 0.4828, 0.0930, 0.1510
0.6700, 0.0000, 0.2280, 0.0870, 0.1660, 0.0986, 0.5200, 0.3000, 0.4344, 0.0920, 0.1270
0.6100, 1.0000, 0.2820, 0.1060, 0.2040, 0.1320, 0.5200, 0.4000, 0.4605, 0.0960, 0.2370
0.6200, 0.0000, 0.2890, 0.0873, 0.2060, 0.1272, 0.3300, 0.6240, 0.5434, 0.0990, 0.2250
0.6000, 0.0000, 0.2560, 0.0870, 0.2070, 0.1258, 0.6900, 0.3000, 0.4111, 0.0840, 0.0810
0.4200, 0.0000, 0.2490, 0.0910, 0.2040, 0.1418, 0.3800, 0.5000, 0.4796, 0.0890, 0.1510
0.3800, 1.0000, 0.2680, 0.1050, 0.1810, 0.1192, 0.3700, 0.5000, 0.4820, 0.0910, 0.1070
0.6200, 0.0000, 0.2240, 0.0790, 0.2220, 0.1474, 0.5900, 0.4000, 0.4357, 0.0760, 0.0640
0.6100, 1.0000, 0.2690, 0.1110, 0.2360, 0.1724, 0.3900, 0.6000, 0.4812, 0.0890, 0.1380
0.6100, 1.0000, 0.2310, 0.1130, 0.1860, 0.1144, 0.4700, 0.4000, 0.4812, 0.1050, 0.1850
0.5300, 0.0000, 0.2860, 0.0880, 0.1710, 0.0988, 0.4100, 0.4000, 0.5050, 0.0990, 0.2650
0.2800, 1.0000, 0.2470, 0.0970, 0.1750, 0.0996, 0.3200, 0.5000, 0.5380, 0.0870, 0.1010
0.2600, 1.0000, 0.3030, 0.0890, 0.2180, 0.1522, 0.3100, 0.7000, 0.5159, 0.0820, 0.1370
0.3000, 0.0000, 0.2130, 0.0870, 0.1340, 0.0630, 0.6300, 0.2000, 0.3689, 0.0660, 0.1430
0.5000, 0.0000, 0.2610, 0.1090, 0.2430, 0.1606, 0.6200, 0.4000, 0.4625, 0.0890, 0.1410
0.4800, 0.0000, 0.2020, 0.0950, 0.1870, 0.1174, 0.5300, 0.4000, 0.4419, 0.0850, 0.0790
0.5100, 0.0000, 0.2520, 0.1030, 0.1760, 0.1122, 0.3700, 0.5000, 0.4898, 0.0900, 0.2920
0.4700, 1.0000, 0.2250, 0.0820, 0.1310, 0.0668, 0.4100, 0.3000, 0.4754, 0.0890, 0.1780
0.6400, 1.0000, 0.2350, 0.0970, 0.2030, 0.1290, 0.5900, 0.3000, 0.4318, 0.0770, 0.0910
0.5100, 1.0000, 0.2590, 0.0760, 0.2400, 0.1690, 0.3900, 0.6000, 0.5075, 0.0960, 0.1160
0.3000, 0.0000, 0.2090, 0.1040, 0.1520, 0.0838, 0.4700, 0.3000, 0.4663, 0.0970, 0.0860
0.5600, 1.0000, 0.2870, 0.0990, 0.2080, 0.1464, 0.3900, 0.5000, 0.4727, 0.0970, 0.1220
0.4200, 0.0000, 0.2210, 0.0850, 0.2130, 0.1386, 0.6000, 0.4000, 0.4277, 0.0940, 0.0720
0.6200, 1.0000, 0.2670, 0.1150, 0.1830, 0.1240, 0.3500, 0.5000, 0.4788, 0.1000, 0.1290
0.3400, 0.0000, 0.3140, 0.0870, 0.1490, 0.0938, 0.4600, 0.3000, 0.3829, 0.0770, 0.1420
0.6000, 0.0000, 0.2220, 0.1047, 0.2210, 0.1054, 0.6000, 0.3680, 0.5628, 0.0930, 0.0900
0.6400, 0.0000, 0.2100, 0.0923, 0.2270, 0.1468, 0.6500, 0.3490, 0.4331, 0.1020, 0.1580
0.3900, 1.0000, 0.2120, 0.0900, 0.1820, 0.1104, 0.6000, 0.3000, 0.4060, 0.0980, 0.0390
0.7100, 1.0000, 0.2650, 0.1050, 0.2810, 0.1736, 0.5500, 0.5000, 0.5568, 0.0840, 0.1960
0.4800, 1.0000, 0.2920, 0.1100, 0.2180, 0.1516, 0.3900, 0.6000, 0.4920, 0.0980, 0.2220
0.7900, 1.0000, 0.2700, 0.1030, 0.1690, 0.1108, 0.3700, 0.5000, 0.4663, 0.1100, 0.2770
0.4000, 0.0000, 0.3070, 0.0990, 0.1770, 0.0854, 0.5000, 0.4000, 0.5338, 0.0850, 0.0990
0.4900, 1.0000, 0.2880, 0.0920, 0.2070, 0.1400, 0.4400, 0.5000, 0.4745, 0.0920, 0.1960
0.5100, 0.0000, 0.3060, 0.1030, 0.1980, 0.1066, 0.5700, 0.3000, 0.5148, 0.1000, 0.2020
0.5700, 0.0000, 0.3010, 0.1170, 0.2020, 0.1396, 0.4200, 0.5000, 0.4625, 0.1200, 0.1550
0.5900, 1.0000, 0.2470, 0.1140, 0.1520, 0.1048, 0.2900, 0.5000, 0.4511, 0.0880, 0.0770
0.5100, 0.0000, 0.2770, 0.0990, 0.2290, 0.1456, 0.6900, 0.3000, 0.4277, 0.0770, 0.1910
0.7400, 0.0000, 0.2980, 0.1010, 0.1710, 0.1048, 0.5000, 0.3000, 0.4394, 0.0860, 0.0700
0.6700, 0.0000, 0.2670, 0.1050, 0.2250, 0.1354, 0.6900, 0.3000, 0.4635, 0.0960, 0.0730
0.4900, 0.0000, 0.1980, 0.0880, 0.1880, 0.1148, 0.5700, 0.3000, 0.4394, 0.0930, 0.0490
0.5700, 0.0000, 0.2330, 0.0880, 0.1550, 0.0636, 0.7800, 0.2000, 0.4205, 0.0780, 0.0650
0.5600, 1.0000, 0.3510, 0.1230, 0.1640, 0.0950, 0.3800, 0.4000, 0.5043, 0.1170, 0.2630
0.5200, 1.0000, 0.2970, 0.1090, 0.2280, 0.1628, 0.3100, 0.8000, 0.5142, 0.1030, 0.2480
0.6900, 0.0000, 0.2930, 0.1240, 0.2230, 0.1390, 0.5400, 0.4000, 0.5011, 0.1020, 0.2960
0.3700, 0.0000, 0.2030, 0.0830, 0.1850, 0.1246, 0.3800, 0.5000, 0.4719, 0.0880, 0.2140
0.2400, 0.0000, 0.2250, 0.0890, 0.1410, 0.0680, 0.5200, 0.3000, 0.4654, 0.0840, 0.1850
0.5500, 1.0000, 0.2270, 0.0930, 0.1540, 0.0942, 0.5300, 0.3000, 0.3526, 0.0750, 0.0780
0.3600, 0.0000, 0.2280, 0.0870, 0.1780, 0.1160, 0.4100, 0.4000, 0.4654, 0.0820, 0.0930
0.4200, 1.0000, 0.2400, 0.1070, 0.1500, 0.0850, 0.4400, 0.3000, 0.4654, 0.0960, 0.2520
0.2100, 0.0000, 0.2420, 0.0760, 0.1470, 0.0770, 0.5300, 0.3000, 0.4443, 0.0790, 0.1500
0.4100, 0.0000, 0.2020, 0.0620, 0.1530, 0.0890, 0.5000, 0.3000, 0.4249, 0.0890, 0.0770
0.5700, 1.0000, 0.2940, 0.1090, 0.1600, 0.0876, 0.3100, 0.5000, 0.5333, 0.0920, 0.2080
0.2000, 1.0000, 0.2210, 0.0870, 0.1710, 0.0996, 0.5800, 0.3000, 0.4205, 0.0780, 0.0770
0.6700, 1.0000, 0.2360, 0.1113, 0.1890, 0.1054, 0.7000, 0.2700, 0.4220, 0.0930, 0.1080
0.3400, 0.0000, 0.2520, 0.0770, 0.1890, 0.1206, 0.5300, 0.4000, 0.4344, 0.0790, 0.1600
0.4100, 1.0000, 0.2490, 0.0860, 0.1920, 0.1150, 0.6100, 0.3000, 0.4382, 0.0940, 0.0530
0.3800, 1.0000, 0.3300, 0.0780, 0.3010, 0.2150, 0.5000, 0.6020, 0.5193, 0.1080, 0.2200
0.5100, 0.0000, 0.2350, 0.1010, 0.1950, 0.1210, 0.5100, 0.4000, 0.4745, 0.0940, 0.1540
0.5200, 1.0000, 0.2640, 0.0913, 0.2180, 0.1520, 0.3900, 0.5590, 0.4905, 0.0990, 0.2590
0.6700, 0.0000, 0.2980, 0.0800, 0.1720, 0.0934, 0.6300, 0.3000, 0.4357, 0.0820, 0.0900
0.6100, 0.0000, 0.3000, 0.1080, 0.1940, 0.1000, 0.5200, 0.3730, 0.5347, 0.1050, 0.2460
0.6700, 1.0000, 0.2500, 0.1117, 0.1460, 0.0934, 0.3300, 0.4420, 0.4585, 0.1030, 0.1240
0.5600, 0.0000, 0.2700, 0.1050, 0.2470, 0.1606, 0.5400, 0.5000, 0.5088, 0.0940, 0.0670
0.6400, 0.0000, 0.2000, 0.0747, 0.1890, 0.1148, 0.6200, 0.3050, 0.4111, 0.0910, 0.0720
0.5800, 1.0000, 0.2550, 0.1120, 0.1630, 0.1106, 0.2900, 0.6000, 0.4762, 0.0860, 0.2570
0.5500, 0.0000, 0.2820, 0.0910, 0.2500, 0.1402, 0.6700, 0.4000, 0.5366, 0.1030, 0.2620
0.6200, 1.0000, 0.3330, 0.1140, 0.1820, 0.1140, 0.3800, 0.5000, 0.5011, 0.0960, 0.2750
0.5700, 1.0000, 0.2560, 0.0960, 0.2000, 0.1330, 0.5200, 0.3850, 0.4318, 0.1050, 0.1770
0.2000, 1.0000, 0.2420, 0.0880, 0.1260, 0.0722, 0.4500, 0.3000, 0.3784, 0.0740, 0.0710
0.5300, 1.0000, 0.2210, 0.0980, 0.1650, 0.1052, 0.4700, 0.4000, 0.4159, 0.0810, 0.0470
0.3200, 1.0000, 0.3140, 0.0890, 0.1530, 0.0842, 0.5600, 0.3000, 0.4159, 0.0900, 0.1870
0.4100, 0.0000, 0.2310, 0.0860, 0.1480, 0.0780, 0.5800, 0.3000, 0.4094, 0.0600, 0.1250
0.6000, 0.0000, 0.2340, 0.0767, 0.2470, 0.1480, 0.6500, 0.3800, 0.5136, 0.0770, 0.0780
0.2600, 0.0000, 0.1880, 0.0830, 0.1910, 0.1036, 0.6900, 0.3000, 0.4522, 0.0690, 0.0510
0.3700, 0.0000, 0.3080, 0.1120, 0.2820, 0.1972, 0.4300, 0.7000, 0.5342, 0.1010, 0.2580
0.4500, 0.0000, 0.3200, 0.1100, 0.2240, 0.1342, 0.4500, 0.5000, 0.5412, 0.0930, 0.2150
0.6700, 0.0000, 0.3160, 0.1160, 0.1790, 0.0904, 0.4100, 0.4000, 0.5472, 0.1000, 0.3030
0.3400, 1.0000, 0.3550, 0.1200, 0.2330, 0.1466, 0.3400, 0.7000, 0.5568, 0.1010, 0.2430
0.5000, 0.0000, 0.3190, 0.0783, 0.2070, 0.1492, 0.3800, 0.5450, 0.4595, 0.0840, 0.0910
0.7100, 0.0000, 0.2950, 0.0970, 0.2270, 0.1516, 0.4500, 0.5000, 0.5024, 0.1080, 0.1500
0.5700, 1.0000, 0.3160, 0.1170, 0.2250, 0.1076, 0.4000, 0.6000, 0.5958, 0.1130, 0.3100
0.4900, 0.0000, 0.2030, 0.0930, 0.1840, 0.1030, 0.6100, 0.3000, 0.4605, 0.0930, 0.1530
0.3500, 0.0000, 0.4130, 0.0810, 0.1680, 0.1028, 0.3700, 0.5000, 0.4949, 0.0940, 0.3460
0.4100, 1.0000, 0.2120, 0.1020, 0.1840, 0.1004, 0.6400, 0.3000, 0.4585, 0.0790, 0.0630
0.7000, 1.0000, 0.2410, 0.0823, 0.1940, 0.1492, 0.3100, 0.6260, 0.4234, 0.1050, 0.0890
0.5200, 0.0000, 0.2300, 0.1070, 0.1790, 0.1237, 0.4250, 0.4210, 0.4159, 0.0930, 0.0500
0.6000, 0.0000, 0.2560, 0.0780, 0.1950, 0.0954, 0.9100, 0.2000, 0.3761, 0.0870, 0.0390
0.6200, 0.0000, 0.2250, 0.1250, 0.2150, 0.0990, 0.9800, 0.2000, 0.4500, 0.0950, 0.1030
0.4400, 1.0000, 0.3820, 0.1230, 0.2010, 0.1266, 0.4400, 0.5000, 0.5024, 0.0920, 0.3080
0.2800, 1.0000, 0.1920, 0.0810, 0.1550, 0.0946, 0.5100, 0.3000, 0.3850, 0.0870, 0.1160
0.5800, 1.0000, 0.2900, 0.0850, 0.1560, 0.1092, 0.3600, 0.4000, 0.3989, 0.0860, 0.1450
0.3900, 1.0000, 0.2400, 0.0897, 0.1900, 0.1136, 0.5200, 0.3650, 0.4804, 0.1010, 0.0740
0.3400, 1.0000, 0.2060, 0.0980, 0.1830, 0.0920, 0.8300, 0.2000, 0.3689, 0.0920, 0.0450
0.6500, 0.0000, 0.2630, 0.0700, 0.2440, 0.1662, 0.5100, 0.5000, 0.4898, 0.0980, 0.1150
0.6600, 1.0000, 0.3460, 0.1150, 0.2040, 0.1394, 0.3600, 0.6000, 0.4963, 0.1090, 0.2640
0.5100, 0.0000, 0.2340, 0.0870, 0.2200, 0.1088, 0.9300, 0.2000, 0.4511, 0.0820, 0.0870
0.5000, 1.0000, 0.2920, 0.1190, 0.1620, 0.0852, 0.5400, 0.3000, 0.4736, 0.0950, 0.2020
0.5900, 1.0000, 0.2720, 0.1070, 0.1580, 0.1020, 0.3900, 0.4000, 0.4443, 0.0930, 0.1270
0.5200, 0.0000, 0.2700, 0.0783, 0.1340, 0.0730, 0.4400, 0.3050, 0.4443, 0.0690, 0.1820
0.6900, 1.0000, 0.2450, 0.1080, 0.2430, 0.1364, 0.4000, 0.6000, 0.5808, 0.1000, 0.2410
0.5300, 0.0000, 0.2410, 0.1050, 0.1840, 0.1134, 0.4600, 0.4000, 0.4812, 0.0950, 0.0660
0.4700, 1.0000, 0.2530, 0.0980, 0.1730, 0.1056, 0.4400, 0.4000, 0.4762, 0.1080, 0.0940
0.5200, 0.0000, 0.2880, 0.1130, 0.2800, 0.1740, 0.6700, 0.4000, 0.5273, 0.0860, 0.2830
0.3900, 0.0000, 0.2090, 0.0950, 0.1500, 0.0656, 0.6800, 0.2000, 0.4407, 0.0950, 0.0640
0.6700, 1.0000, 0.2300, 0.0700, 0.1840, 0.1280, 0.3500, 0.5000, 0.4654, 0.0990, 0.1020
0.5900, 1.0000, 0.2410, 0.0960, 0.1700, 0.0986, 0.5400, 0.3000, 0.4466, 0.0850, 0.2000
0.5100, 1.0000, 0.2810, 0.1060, 0.2020, 0.1222, 0.5500, 0.4000, 0.4820, 0.0870, 0.2650
0.2300, 1.0000, 0.1800, 0.0780, 0.1710, 0.0960, 0.4800, 0.4000, 0.4905, 0.0920, 0.0940
0.6800, 0.0000, 0.2590, 0.0930, 0.2530, 0.1812, 0.5300, 0.5000, 0.4543, 0.0980, 0.2300
0.4400, 0.0000, 0.2150, 0.0850, 0.1570, 0.0922, 0.5500, 0.3000, 0.3892, 0.0840, 0.1810
0.6000, 1.0000, 0.2430, 0.1030, 0.1410, 0.0866, 0.3300, 0.4000, 0.4673, 0.0780, 0.1560
0.5200, 0.0000, 0.2450, 0.0900, 0.1980, 0.1290, 0.2900, 0.7000, 0.5298, 0.0860, 0.2330
0.3800, 0.0000, 0.2130, 0.0720, 0.1650, 0.0602, 0.8800, 0.2000, 0.4431, 0.0900, 0.0600
0.6100, 0.0000, 0.2580, 0.0900, 0.2800, 0.1954, 0.5500, 0.5000, 0.4997, 0.0900, 0.2190
0.6800, 1.0000, 0.2480, 0.1010, 0.2210, 0.1514, 0.6000, 0.4000, 0.3871, 0.0870, 0.0800
0.2800, 1.0000, 0.3150, 0.0830, 0.2280, 0.1494, 0.3800, 0.6000, 0.5313, 0.0830, 0.0680
0.6500, 1.0000, 0.3350, 0.1020, 0.1900, 0.1262, 0.3500, 0.5000, 0.4970, 0.1020, 0.3320
0.6900, 0.0000, 0.2810, 0.1130, 0.2340, 0.1428, 0.5200, 0.4000, 0.5278, 0.0770, 0.2480
0.5100, 0.0000, 0.2430, 0.0853, 0.1530, 0.0716, 0.7100, 0.2150, 0.3951, 0.0820, 0.0840
0.2900, 0.0000, 0.3500, 0.0983, 0.2040, 0.1426, 0.5000, 0.4080, 0.4043, 0.0910, 0.2000
0.5500, 1.0000, 0.2350, 0.0930, 0.1770, 0.1268, 0.4100, 0.4000, 0.3829, 0.0830, 0.0550
0.3400, 1.0000, 0.3000, 0.0830, 0.1850, 0.1072, 0.5300, 0.3000, 0.4820, 0.0920, 0.0850
0.6700, 0.0000, 0.2070, 0.0830, 0.1700, 0.0998, 0.5900, 0.3000, 0.4025, 0.0770, 0.0890
0.4900, 0.0000, 0.2560, 0.0760, 0.1610, 0.0998, 0.5100, 0.3000, 0.3932, 0.0780, 0.0310
0.5500, 1.0000, 0.2290, 0.0810, 0.1230, 0.0672, 0.4100, 0.3000, 0.4304, 0.0880, 0.1290
0.5900, 1.0000, 0.2510, 0.0900, 0.1630, 0.1014, 0.4600, 0.4000, 0.4357, 0.0910, 0.0830
0.5300, 0.0000, 0.3320, 0.0827, 0.1860, 0.1068, 0.4600, 0.4040, 0.5112, 0.1020, 0.2750
0.4800, 1.0000, 0.2410, 0.1100, 0.2090, 0.1346, 0.5800, 0.4000, 0.4407, 0.1000, 0.0650
0.5200, 0.0000, 0.2950, 0.1043, 0.2110, 0.1328, 0.4900, 0.4310, 0.4984, 0.0980, 0.1980
0.6900, 0.0000, 0.2960, 0.1220, 0.2310, 0.1284, 0.5600, 0.4000, 0.5451, 0.0860, 0.2360
0.6000, 1.0000, 0.2280, 0.1100, 0.2450, 0.1898, 0.3900, 0.6000, 0.4394, 0.0880, 0.2530
0.4600, 1.0000, 0.2270, 0.0830, 0.1830, 0.1258, 0.3200, 0.6000, 0.4836, 0.0750, 0.1240
0.5100, 1.0000, 0.2620, 0.1010, 0.1610, 0.0996, 0.4800, 0.3000, 0.4205, 0.0880, 0.0440
0.6700, 1.0000, 0.2350, 0.0960, 0.2070, 0.1382, 0.4200, 0.5000, 0.4898, 0.1110, 0.1720
0.4900, 0.0000, 0.2210, 0.0850, 0.1360, 0.0634, 0.6200, 0.2190, 0.3970, 0.0720, 0.1140
0.4600, 1.0000, 0.2650, 0.0940, 0.2470, 0.1602, 0.5900, 0.4000, 0.4935, 0.1110, 0.1420
0.4700, 0.0000, 0.3240, 0.1050, 0.1880, 0.1250, 0.4600, 0.4090, 0.4443, 0.0990, 0.1090
0.7500, 0.0000, 0.3010, 0.0780, 0.2220, 0.1542, 0.4400, 0.5050, 0.4779, 0.0970, 0.1800
0.2800, 0.0000, 0.2420, 0.0930, 0.1740, 0.1064, 0.5400, 0.3000, 0.4220, 0.0840, 0.1440
0.6500, 1.0000, 0.3130, 0.1100, 0.2130, 0.1280, 0.4700, 0.5000, 0.5247, 0.0910, 0.1630
0.4200, 0.0000, 0.3010, 0.0910, 0.1820, 0.1148, 0.4900, 0.4000, 0.4511, 0.0820, 0.1470
0.5100, 0.0000, 0.2450, 0.0790, 0.2120, 0.1286, 0.6500, 0.3000, 0.4522, 0.0910, 0.0970
0.5300, 1.0000, 0.2770, 0.0950, 0.1900, 0.1018, 0.4100, 0.5000, 0.5464, 0.1010, 0.2200
0.5400, 0.0000, 0.2320, 0.1107, 0.2380, 0.1628, 0.4800, 0.4960, 0.4913, 0.1080, 0.1900
0.7300, 0.0000, 0.2700, 0.1020, 0.2110, 0.1210, 0.6700, 0.3000, 0.4745, 0.0990, 0.1090
0.5400, 0.0000, 0.2680, 0.1080, 0.1760, 0.0806, 0.6700, 0.3000, 0.4956, 0.1060, 0.1910
0.4200, 0.0000, 0.2920, 0.0930, 0.2490, 0.1742, 0.4500, 0.6000, 0.5004, 0.0920, 0.1220
0.7500, 0.0000, 0.3120, 0.1177, 0.2290, 0.1388, 0.2900, 0.7900, 0.5724, 0.1060, 0.2300
0.5500, 1.0000, 0.3210, 0.1127, 0.2070, 0.0924, 0.2500, 0.8280, 0.6105, 0.1110, 0.2420
0.6800, 1.0000, 0.2570, 0.1090, 0.2330, 0.1126, 0.3500, 0.7000, 0.6057, 0.1050, 0.2480
0.5700, 0.0000, 0.2690, 0.0980, 0.2460, 0.1652, 0.3800, 0.7000, 0.5366, 0.0960, 0.2490
0.4800, 0.0000, 0.3140, 0.0753, 0.2420, 0.1516, 0.3800, 0.6370, 0.5568, 0.1030, 0.1920
0.6100, 1.0000, 0.2560, 0.0850, 0.1840, 0.1162, 0.3900, 0.5000, 0.4970, 0.0980, 0.1310
0.6900, 0.0000, 0.3700, 0.1030, 0.2070, 0.1314, 0.5500, 0.4000, 0.4635, 0.0900, 0.2370
0.3800, 0.0000, 0.3260, 0.0770, 0.1680, 0.1006, 0.4700, 0.4000, 0.4625, 0.0960, 0.0780
0.4500, 1.0000, 0.2120, 0.0940, 0.1690, 0.0968, 0.5500, 0.3000, 0.4454, 0.1020, 0.1350
0.5100, 1.0000, 0.2920, 0.1070, 0.1870, 0.1390, 0.3200, 0.6000, 0.4382, 0.0950, 0.2440
0.7100, 1.0000, 0.2400, 0.0840, 0.1380, 0.0858, 0.3900, 0.4000, 0.4190, 0.0900, 0.1990
0.5700, 0.0000, 0.3610, 0.1170, 0.1810, 0.1082, 0.3400, 0.5000, 0.5268, 0.1000, 0.2700
0.5600, 1.0000, 0.2580, 0.1030, 0.1770, 0.1144, 0.3400, 0.5000, 0.4963, 0.0990, 0.1640
0.3200, 1.0000, 0.2200, 0.0880, 0.1370, 0.0786, 0.4800, 0.3000, 0.3951, 0.0780, 0.0720
0.5000, 0.0000, 0.2190, 0.0910, 0.1900, 0.1112, 0.6700, 0.3000, 0.4078, 0.0770, 0.0960
0.4300, 0.0000, 0.3430, 0.0840, 0.2560, 0.1726, 0.3300, 0.8000, 0.5529, 0.1040, 0.3060
0.5400, 1.0000, 0.2520, 0.1150, 0.1810, 0.1200, 0.3900, 0.5000, 0.4701, 0.0920, 0.0910
0.3100, 0.0000, 0.2330, 0.0850, 0.1900, 0.1308, 0.4300, 0.4000, 0.4394, 0.0770, 0.2140
0.5600, 0.0000, 0.2570, 0.0800, 0.2440, 0.1516, 0.5900, 0.4000, 0.5118, 0.0950, 0.0950
0.4400, 0.0000, 0.2510, 0.1330, 0.1820, 0.1130, 0.5500, 0.3000, 0.4249, 0.0840, 0.2160
0.5700, 1.0000, 0.3190, 0.1110, 0.1730, 0.1162, 0.4100, 0.4000, 0.4369, 0.0870, 0.2630
Test data:
# diabetes_norm_test_100.txt
#
0.6400, 1.0000, 0.2840, 0.1110, 0.1840, 0.1270, 0.4100, 0.4000, 0.4382, 0.0970, 0.1780
0.4300, 0.0000, 0.2810, 0.1210, 0.1920, 0.1210, 0.6000, 0.3000, 0.4007, 0.0930, 0.1130
0.1900, 0.0000, 0.2530, 0.0830, 0.2250, 0.1566, 0.4600, 0.5000, 0.4719, 0.0840, 0.2000
0.7100, 1.0000, 0.2610, 0.0850, 0.2200, 0.1524, 0.4700, 0.5000, 0.4635, 0.0910, 0.1390
0.5000, 1.0000, 0.2800, 0.1040, 0.2820, 0.1968, 0.4400, 0.6000, 0.5328, 0.0950, 0.1390
0.5900, 1.0000, 0.2360, 0.0730, 0.1800, 0.1074, 0.5100, 0.4000, 0.4682, 0.0840, 0.0880
0.5700, 0.0000, 0.2450, 0.0930, 0.1860, 0.0966, 0.7100, 0.3000, 0.4522, 0.0910, 0.1480
0.4900, 1.0000, 0.2100, 0.0820, 0.1190, 0.0854, 0.2300, 0.5000, 0.3970, 0.0740, 0.0880
0.4100, 1.0000, 0.3200, 0.1260, 0.1980, 0.1042, 0.4900, 0.4000, 0.5412, 0.1240, 0.2430
0.2500, 1.0000, 0.2260, 0.0850, 0.1300, 0.0710, 0.4800, 0.3000, 0.4007, 0.0810, 0.0710
0.5200, 1.0000, 0.1970, 0.0810, 0.1520, 0.0534, 0.8200, 0.2000, 0.4419, 0.0820, 0.0770
0.3400, 0.0000, 0.2120, 0.0840, 0.2540, 0.1134, 0.5200, 0.5000, 0.6094, 0.0920, 0.1090
0.4200, 1.0000, 0.3060, 0.1010, 0.2690, 0.1722, 0.5000, 0.5000, 0.5455, 0.1060, 0.2720
0.2800, 1.0000, 0.2550, 0.0990, 0.1620, 0.1016, 0.4600, 0.4000, 0.4277, 0.0940, 0.0600
0.4700, 1.0000, 0.2330, 0.0900, 0.1950, 0.1258, 0.5400, 0.4000, 0.4331, 0.0730, 0.0540
0.3200, 1.0000, 0.3100, 0.1000, 0.1770, 0.0962, 0.4500, 0.4000, 0.5187, 0.0770, 0.2210
0.4300, 0.0000, 0.1850, 0.0870, 0.1630, 0.0936, 0.6100, 0.2670, 0.3738, 0.0800, 0.0900
0.5900, 1.0000, 0.2690, 0.1040, 0.1940, 0.1266, 0.4300, 0.5000, 0.4804, 0.1060, 0.3110
0.5300, 0.0000, 0.2830, 0.1010, 0.1790, 0.1070, 0.4800, 0.4000, 0.4788, 0.1010, 0.2810
0.6000, 0.0000, 0.2570, 0.1030, 0.1580, 0.0846, 0.6400, 0.2000, 0.3850, 0.0970, 0.1820
0.5400, 1.0000, 0.3610, 0.1150, 0.1630, 0.0984, 0.4300, 0.4000, 0.4682, 0.1010, 0.3210
0.3500, 1.0000, 0.2410, 0.0947, 0.1550, 0.0974, 0.3200, 0.4840, 0.4852, 0.0940, 0.0580
0.4900, 1.0000, 0.2580, 0.0890, 0.1820, 0.1186, 0.3900, 0.5000, 0.4804, 0.1150, 0.2620
0.5800, 0.0000, 0.2280, 0.0910, 0.1960, 0.1188, 0.4800, 0.4000, 0.4984, 0.1150, 0.2060
0.3600, 1.0000, 0.3910, 0.0900, 0.2190, 0.1358, 0.3800, 0.6000, 0.5421, 0.1030, 0.2330
0.4600, 1.0000, 0.4220, 0.0990, 0.2110, 0.1370, 0.4400, 0.5000, 0.5011, 0.0990, 0.2420
0.4400, 1.0000, 0.2660, 0.0990, 0.2050, 0.1090, 0.4300, 0.5000, 0.5580, 0.1110, 0.1230
0.4600, 0.0000, 0.2990, 0.0830, 0.1710, 0.1130, 0.3800, 0.4500, 0.4585, 0.0980, 0.1670
0.5400, 0.0000, 0.2100, 0.0780, 0.1880, 0.1074, 0.7000, 0.3000, 0.3970, 0.0730, 0.0630
0.6300, 1.0000, 0.2550, 0.1090, 0.2260, 0.1032, 0.4600, 0.5000, 0.5951, 0.0870, 0.1970
0.4100, 1.0000, 0.2420, 0.0900, 0.1990, 0.1236, 0.5700, 0.4000, 0.4522, 0.0860, 0.0710
0.2800, 0.0000, 0.2540, 0.0930, 0.1410, 0.0790, 0.4900, 0.3000, 0.4174, 0.0910, 0.1680
0.1900, 0.0000, 0.2320, 0.0750, 0.1430, 0.0704, 0.5200, 0.3000, 0.4635, 0.0720, 0.1400
0.6100, 1.0000, 0.2610, 0.1260, 0.2150, 0.1298, 0.5700, 0.4000, 0.4949, 0.0960, 0.2170
0.4800, 0.0000, 0.3270, 0.0930, 0.2760, 0.1986, 0.4300, 0.6420, 0.5148, 0.0910, 0.1210
0.5400, 1.0000, 0.2730, 0.1000, 0.2000, 0.1440, 0.3300, 0.6000, 0.4745, 0.0760, 0.2350
0.5300, 1.0000, 0.2660, 0.0930, 0.1850, 0.1224, 0.3600, 0.5000, 0.4890, 0.0820, 0.2450
0.4800, 0.0000, 0.2280, 0.1010, 0.1100, 0.0416, 0.5600, 0.2000, 0.4127, 0.0970, 0.0400
0.5300, 0.0000, 0.2880, 0.1117, 0.1450, 0.0872, 0.4600, 0.3150, 0.4078, 0.0850, 0.0520
0.2900, 1.0000, 0.1810, 0.0730, 0.1580, 0.0990, 0.4100, 0.4000, 0.4500, 0.0780, 0.1040
0.6200, 0.0000, 0.3200, 0.0880, 0.1720, 0.0690, 0.3800, 0.4000, 0.5784, 0.1000, 0.1320
0.5000, 1.0000, 0.2370, 0.0920, 0.1660, 0.0970, 0.5200, 0.3000, 0.4443, 0.0930, 0.0880
0.5800, 1.0000, 0.2360, 0.0960, 0.2570, 0.1710, 0.5900, 0.4000, 0.4905, 0.0820, 0.0690
0.5500, 1.0000, 0.2460, 0.1090, 0.1430, 0.0764, 0.5100, 0.3000, 0.4357, 0.0880, 0.2190
0.5400, 0.0000, 0.2260, 0.0900, 0.1830, 0.1042, 0.6400, 0.3000, 0.4304, 0.0920, 0.0720
0.3600, 0.0000, 0.2780, 0.0730, 0.1530, 0.1044, 0.4200, 0.4000, 0.3497, 0.0730, 0.2010
0.6300, 1.0000, 0.2410, 0.1110, 0.1840, 0.1122, 0.4400, 0.4000, 0.4935, 0.0820, 0.1100
0.4700, 1.0000, 0.2650, 0.0700, 0.1810, 0.1048, 0.6300, 0.3000, 0.4190, 0.0700, 0.0510
0.5100, 1.0000, 0.3280, 0.1120, 0.2020, 0.1006, 0.3700, 0.5000, 0.5775, 0.1090, 0.2770
0.4200, 0.0000, 0.1990, 0.0760, 0.1460, 0.0832, 0.5500, 0.3000, 0.3664, 0.0790, 0.0630
0.3700, 1.0000, 0.2360, 0.0940, 0.2050, 0.1388, 0.5300, 0.4000, 0.4190, 0.1070, 0.1180
0.2800, 0.0000, 0.2210, 0.0820, 0.1680, 0.1006, 0.5400, 0.3000, 0.4205, 0.0860, 0.0690
0.5800, 0.0000, 0.2810, 0.1110, 0.1980, 0.0806, 0.3100, 0.6000, 0.6068, 0.0930, 0.2730
0.3200, 0.0000, 0.2650, 0.0860, 0.1840, 0.1016, 0.5300, 0.4000, 0.4990, 0.0780, 0.2580
0.2500, 1.0000, 0.2350, 0.0880, 0.1430, 0.0808, 0.5500, 0.3000, 0.3584, 0.0830, 0.0430
0.6300, 0.0000, 0.2600, 0.0857, 0.1550, 0.0782, 0.4600, 0.3370, 0.5037, 0.0970, 0.1980
0.5200, 0.0000, 0.2780, 0.0850, 0.2190, 0.1360, 0.4900, 0.4000, 0.5136, 0.0750, 0.2420
0.6500, 1.0000, 0.2850, 0.1090, 0.2010, 0.1230, 0.4600, 0.4000, 0.5075, 0.0960, 0.2320
0.4200, 0.0000, 0.3060, 0.1210, 0.1760, 0.0928, 0.6900, 0.3000, 0.4263, 0.0890, 0.1750
0.5300, 0.0000, 0.2220, 0.0780, 0.1640, 0.0810, 0.7000, 0.2000, 0.4174, 0.1010, 0.0930
0.7900, 1.0000, 0.2330, 0.0880, 0.1860, 0.1284, 0.3300, 0.6000, 0.4812, 0.1020, 0.1680
0.4300, 0.0000, 0.3540, 0.0930, 0.1850, 0.1002, 0.4400, 0.4000, 0.5318, 0.1010, 0.2750
0.4400, 0.0000, 0.3140, 0.1150, 0.1650, 0.0976, 0.5200, 0.3000, 0.4344, 0.0890, 0.2930
0.6200, 1.0000, 0.3780, 0.1190, 0.1130, 0.0510, 0.3100, 0.4000, 0.5043, 0.0840, 0.2810
0.3300, 0.0000, 0.1890, 0.0700, 0.1620, 0.0918, 0.5900, 0.3000, 0.4025, 0.0580, 0.0720
0.5600, 0.0000, 0.3500, 0.0793, 0.1950, 0.1408, 0.4200, 0.4640, 0.4111, 0.0960, 0.1400
0.6600, 0.0000, 0.2170, 0.1260, 0.2120, 0.1278, 0.4500, 0.4710, 0.5278, 0.1010, 0.1890
0.3400, 1.0000, 0.2530, 0.1110, 0.2300, 0.1620, 0.3900, 0.6000, 0.4977, 0.0900, 0.1810
0.4600, 1.0000, 0.2380, 0.0970, 0.2240, 0.1392, 0.4200, 0.5000, 0.5366, 0.0810, 0.2090
0.5000, 0.0000, 0.3180, 0.0820, 0.1360, 0.0692, 0.5500, 0.2000, 0.4078, 0.0850, 0.1360
0.6900, 0.0000, 0.3430, 0.1130, 0.2000, 0.1238, 0.5400, 0.4000, 0.4710, 0.1120, 0.2610
0.3400, 0.0000, 0.2630, 0.0870, 0.1970, 0.1200, 0.6300, 0.3000, 0.4249, 0.0960, 0.1130
0.7100, 1.0000, 0.2700, 0.0933, 0.2690, 0.1902, 0.4100, 0.6560, 0.5242, 0.0930, 0.1310
0.4700, 0.0000, 0.2720, 0.0800, 0.2080, 0.1456, 0.3800, 0.6000, 0.4804, 0.0920, 0.1740
0.4100, 0.0000, 0.3380, 0.1233, 0.1870, 0.1270, 0.4500, 0.4160, 0.4318, 0.1000, 0.2570
0.3400, 0.0000, 0.3300, 0.0730, 0.1780, 0.1146, 0.5100, 0.3490, 0.4127, 0.0920, 0.0550
0.5100, 0.0000, 0.2410, 0.0870, 0.2610, 0.1756, 0.6900, 0.4000, 0.4407, 0.0930, 0.0840
0.4300, 0.0000, 0.2130, 0.0790, 0.1410, 0.0788, 0.5300, 0.3000, 0.3829, 0.0900, 0.0420
0.5500, 0.0000, 0.2300, 0.0947, 0.1900, 0.1376, 0.3800, 0.5000, 0.4277, 0.1060, 0.1460
0.5900, 1.0000, 0.2790, 0.1010, 0.2180, 0.1442, 0.3800, 0.6000, 0.5187, 0.0950, 0.2120
0.2700, 1.0000, 0.3360, 0.1100, 0.2460, 0.1566, 0.5700, 0.4000, 0.5088, 0.0890, 0.2330
0.5100, 1.0000, 0.2270, 0.1030, 0.2170, 0.1624, 0.3000, 0.7000, 0.4812, 0.0800, 0.0910
0.4900, 1.0000, 0.2740, 0.0890, 0.1770, 0.1130, 0.3700, 0.5000, 0.4905, 0.0970, 0.1110
0.2700, 0.0000, 0.2260, 0.0710, 0.1160, 0.0434, 0.5600, 0.2000, 0.4419, 0.0790, 0.1520
0.5700, 1.0000, 0.2320, 0.1073, 0.2310, 0.1594, 0.4100, 0.5630, 0.5030, 0.1120, 0.1200
0.3900, 1.0000, 0.2690, 0.0930, 0.1360, 0.0754, 0.4800, 0.3000, 0.4143, 0.0990, 0.0670
0.6200, 1.0000, 0.3460, 0.1200, 0.2150, 0.1292, 0.4300, 0.5000, 0.5366, 0.1230, 0.3100
0.3700, 0.0000, 0.2330, 0.0880, 0.2230, 0.1420, 0.6500, 0.3400, 0.4357, 0.0820, 0.0940
0.4600, 0.0000, 0.2110, 0.0800, 0.2050, 0.1444, 0.4200, 0.5000, 0.4533, 0.0870, 0.1830
0.6800, 1.0000, 0.2350, 0.1010, 0.1620, 0.0854, 0.5900, 0.3000, 0.4477, 0.0910, 0.0660
0.5100, 0.0000, 0.3150, 0.0930, 0.2310, 0.1440, 0.4900, 0.4700, 0.5252, 0.1170, 0.1730
0.4100, 0.0000, 0.2080, 0.0860, 0.2230, 0.1282, 0.8300, 0.3000, 0.4078, 0.0890, 0.0720
0.5300, 0.0000, 0.2650, 0.0970, 0.1930, 0.1224, 0.5800, 0.3000, 0.4143, 0.0990, 0.0490
0.4500, 0.0000, 0.2420, 0.0830, 0.1770, 0.1184, 0.4500, 0.4000, 0.4220, 0.0820, 0.0640
0.3300, 0.0000, 0.1950, 0.0800, 0.1710, 0.0854, 0.7500, 0.2000, 0.3970, 0.0800, 0.0480
0.6000, 1.0000, 0.2820, 0.1120, 0.1850, 0.1138, 0.4200, 0.4000, 0.4984, 0.0930, 0.1780
0.4700, 1.0000, 0.2490, 0.0750, 0.2250, 0.1660, 0.4200, 0.5000, 0.4443, 0.1020, 0.1040
0.6000, 1.0000, 0.2490, 0.0997, 0.1620, 0.1066, 0.4300, 0.3770, 0.4127, 0.0950, 0.1320
0.3600, 0.0000, 0.3000, 0.0950, 0.2010, 0.1252, 0.4200, 0.4790, 0.5130, 0.0850, 0.2200
0.3600, 0.0000, 0.1960, 0.0710, 0.2500, 0.1332, 0.9700, 0.3000, 0.4595, 0.0920, 0.0570
