Bottom line: I did an experiment where I used one-over-n-hot encoding instead of the standard one-hot encoding for categorical predictor variables for a neural network. The results were promising but not conclusive.
Suppose you want to predict a person’s income from their sex (M or F), age, State (Colorado, Michigan, Nebraska, Oklahoma), and political leaning (conservative, moderate, liberal). The raw data might look like:
F  24  michigan  lib  29500
M  39  oklahoma  mod  51200
F  63  nebraska  con  75800
M  36  colorado  mod  44500
F  27  nebraska  lib  28600
. . .
The standard way to encode the categorical binary sex variable is M = 0, F = 1 (or vice versa). The standard way to encode the State and political leaning variables is one-hot encoding. If you normalize predictor age by dividing by 100, and target income by dividing by 100,000, the normalized and encoded data looks like:
1  0.24  0 1 0 0  0 0 1  0.29500
0  0.39  0 0 0 1  0 1 0  0.51200
1  0.63  0 0 1 0  1 0 0  0.75800
0  0.36  1 0 0 0  0 1 0  0.44500
1  0.27  0 0 1 0  0 0 1  0.28600
. . .
So, Colorado = 1 0 0 0, Michigan = 0 1 0 0, Nebraska = 0 0 1 0, Oklahoma = 0 0 0 1. And conservative = 1 0 0, moderate = 0 1 0, liberal = 0 0 1.
I wondered if using one-over-n-hot encoding would be better than standard one-hot encoding: Colorado = 0.25 0 0 0, Michigan = 0 0.25 0 0, Nebraska = 0 0 0.25 0, Oklahoma = 0 0 0 0.25. And conservative = 0.3333 0 0, moderate = 0 0.3333 0, liberal = 0 0 0.3333. The binary sex variable is one-over-n-hot encoded as M = 0.5 0, F = 0 0.5.
The idea is that standard one-hot encoding doesn’t explicitly take into account how many possible values a categorical variable has, but one-over-n-hot does. Put another way, one-over-n-hot encoding adds information to the predictor variables.
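The encoding scheme is easy to generate programmatically. Here is a minimal sketch (the helper names `one_over_n_hot`, `encode_row`, and the category lists are my own, not from the demo programs) that produces a normalized, one-over-n-hot encoded predictor vector for one raw data row:

```python
import numpy as np

SEXES = ["M", "F"]
STATES = ["colorado", "michigan", "nebraska", "oklahoma"]
POLITICS = ["con", "mod", "lib"]

def one_over_n_hot(value, categories):
  # a vector of length n with 1/n in the slot for value, 0 elsewhere
  n = len(categories)
  vec = np.zeros(n, dtype=np.float32)
  vec[categories.index(value)] = 1.0 / n
  return vec

def encode_row(sex, age, state, politics):
  # age is normalized by dividing by 100
  return np.concatenate([one_over_n_hot(sex, SEXES),
                         [age / 100.0],
                         one_over_n_hot(state, STATES),
                         one_over_n_hot(politics, POLITICS)])

x = encode_row("F", 24, "michigan", "lib")
print(np.round(x, 4))  # predictors of the first raw data item
```

For standard one-hot encoding, the only change is to set the slot to 1.0 instead of 1.0 / n.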
So I spun up a couple of demo programs. I used PyTorch. The demo data has just 200 training items and 40 test items. The output of the baseline, standard one-hot encoding demo is:
Begin People predict income std one-hot

Creating People Dataset objects
Creating 9-(15-15)-1 neural network

bat_size = 10
loss = MSELoss()
optimizer = Adam
lrn_rate = 0.01

Starting training
epoch =    0 | loss = 0.6131
epoch =  200 | loss = 0.0185
epoch =  400 | loss = 0.0228
epoch =  600 | loss = 0.0166
epoch =  800 | loss = 0.0157
Done

Computing model accuracy (within 0.10 of true)
Accuracy on train data = 0.9400
Accuracy on test data = 0.9000

Predicting income for M 34 Oklahoma moderate:
$44968.94

End People income one-hot demo
And the output of the non-standard one-over-n-hot encoding version is:
Begin People predict income one-over-n-hot

Creating People Dataset objects
Creating 10-(15-15)-1 neural network

bat_size = 10
loss = MSELoss()
optimizer = Adam
lrn_rate = 0.01

Starting training
epoch =    0 | loss = 0.8670
epoch =  200 | loss = 0.0171
epoch =  400 | loss = 0.0158
epoch =  600 | loss = 0.0153
epoch =  800 | loss = 0.0161
Done

Computing model accuracy (within 0.10 of true)
Accuracy on train data = 0.9050
Accuracy on test data = 0.9500

Predicting income for M 34 Oklahoma moderate:
$43229.11

End People income one-over-n-hot demo
The one-over-n-hot version seems to generalize better than the standard one-hot version (95% vs. 90% test accuracy), but with only 40 test items that's a difference of just two predictions, so the results are too similar to draw any solid conclusions.
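One way to firm up such a comparison would be to run both versions over several random seeds and look at the mean and spread of the paired test-accuracy differences. The accuracy lists below are hypothetical placeholders to show the idea, not measured results:

```python
import statistics

# hypothetical test accuracies over five random seeds (not measured)
acc_one_hot    = [0.900, 0.925, 0.875, 0.900, 0.950]
acc_one_over_n = [0.950, 0.900, 0.925, 0.950, 0.925]

# paired per-seed differences: positive favors one-over-n-hot
diffs = [b - a for (a, b) in zip(acc_one_hot, acc_one_over_n)]
print("mean diff = %0.4f | stdev = %0.4f" %
  (statistics.mean(diffs), statistics.stdev(diffs)))
```

If the mean difference is small relative to the standard deviation, the apparent advantage is probably just noise.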
Now here’s a problem with this idea. Even if I spent several weeks investigating it rigorously and demonstrated that one-over-n-hot encoding gives slightly better prediction results than the usual one-hot encoding, nobody would be likely to adopt the technique: standard one-hot encoding has been used for decades in countless systems.
An interesting exploration.

There’s a close connection between chess and machine learning regression.
When I was in high school, I was a pretty good chess player. My Servite High School (Anaheim, CA) chess team won the Orange County Chess Championship in my junior and senior years. Other strong players on those teams included Bob Smith, Tom Law, Tom Quackenbush (all expert strength), Mike Ventriglia, Dan Musser, Ed Hernandez, Dennis Michel, and Pat Doyle.
In those days, Isaac Kashdan was the editor of the weekly chess column for the Los Angeles Times. I met Kashdan several times and he was a wonderful man.
I didn’t discover until years later that Kashdan was one of the strongest chess players in the world. If it hadn’t been for the Great Depression of the 1930s, he might have become world champion.
The graph above shows that Kashdan was the strongest rated player in the world for a short time in 1930 — ahead of Alekhine, Capablanca, Botvinnik, and Euwe — all former or future world champions.
Demo program. Where my blog editor chokes on Boolean operator symbols, “lt” means less-than, “gt” greater-than, “lte” less-than-or-equal, and “gte” greater-than-or-equal.
# people_income_special.py
# predict income from sex, age, State, politics
# one-over-n-hot encoding for sex, State, politics
# PyTorch 2.3.1-CPU Anaconda3-2023.09-0 Python 3.11.5
# Windows 11
import numpy as np
import torch as T
device = T.device('cpu') # apply to Tensor or Module
# -----------------------------------------------------------
class PeopleDataset(T.utils.data.Dataset):
  def __init__(self, src_file):
    # sex age state politics income
    # 0.5 0.0  0.45  0.25 0 0 0  0.3333 0 0  0.76100
    tmp_x = np.loadtxt(src_file,
      usecols=[0,1, 2, 3,4,5,6, 7,8,9],
      delimiter=",", comments="#", dtype=np.float32)
    tmp_y = np.loadtxt(src_file, usecols=10, delimiter=",",
      comments="#", dtype=np.float32)
    tmp_y = tmp_y.reshape(-1,1)  # 2D required

    self.x_data = T.tensor(tmp_x, dtype=T.float32).to(device)
    self.y_data = T.tensor(tmp_y, dtype=T.float32).to(device)

  def __len__(self):
    return len(self.x_data)

  def __getitem__(self, idx):
    preds = self.x_data[idx]
    incom = self.y_data[idx]
    return (preds, incom)  # as a tuple
# -----------------------------------------------------------
class Net(T.nn.Module):
  def __init__(self):
    super(Net, self).__init__()
    self.hid1 = T.nn.Linear(10, 15)  # 10-(15-15)-1
    self.hid2 = T.nn.Linear(15, 15)
    self.oupt = T.nn.Linear(15, 1)

    T.nn.init.xavier_uniform_(self.hid1.weight)
    T.nn.init.zeros_(self.hid1.bias)
    T.nn.init.xavier_uniform_(self.hid2.weight)
    T.nn.init.zeros_(self.hid2.bias)
    T.nn.init.xavier_uniform_(self.oupt.weight)
    T.nn.init.zeros_(self.oupt.bias)

  def forward(self, x):
    z = T.tanh(self.hid1(x))
    z = T.tanh(self.hid2(z))
    z = self.oupt(z)  # regression: no activation
    return z
# -----------------------------------------------------------
def accuracy(model, ds, pct_close):
  # item-by-item version
  # assumes model.eval()
  # correct if within pct of true income
  n_correct = 0; n_wrong = 0
  for i in range(len(ds)):
    X = ds[i][0]  # predictors
    Y = ds[i][1]  # target income
    with T.no_grad():
      oupt = model(X)  # computed income
    if T.abs(oupt - Y) < T.abs(pct_close * Y):
      n_correct += 1
    else:
      n_wrong += 1
  acc = (n_correct * 1.0) / (n_correct + n_wrong)
  return acc
# -----------------------------------------------------------
def accuracy_x(model, ds, pct_close):
  # all-at-once (quick)
  # assumes model.eval()
  X = ds.x_data  # all inputs
  Y = ds.y_data  # all targets
  n_items = len(X)
  with T.no_grad():
    pred = model(X)  # all predicted incomes
  n_correct = T.sum((T.abs(pred - Y) < T.abs(pct_close * Y)))
  result = (n_correct.item() / n_items)  # scalar
  return result
# -----------------------------------------------------------
def train(model, ds, bs, lr, me, le):
  # dataset, bat_size, lrn_rate, max_epochs, log interval
  train_ldr = T.utils.data.DataLoader(ds, batch_size=bs,
    shuffle=True)
  loss_func = T.nn.MSELoss()
  optimizer = T.optim.Adam(model.parameters(), lr=lr)

  for epoch in range(0, me):
    epoch_loss = 0.0  # for one full epoch
    for (b_idx, batch) in enumerate(train_ldr):
      X = batch[0]  # predictors
      y = batch[1]  # target income
      optimizer.zero_grad()
      oupt = model(X)
      loss_val = loss_func(oupt, y)  # a tensor
      epoch_loss += loss_val.item()  # accumulate
      loss_val.backward()  # compute gradients
      optimizer.step()  # update weights
    if epoch % le == 0:
      print("epoch = %4d | loss = %0.4f" % \
        (epoch, epoch_loss))
# -----------------------------------------------------------
def main():
  # 0. get started
  print("\nBegin People predict income one-over-n-hot ")
  T.manual_seed(1)
  np.random.seed(1)

  # 1. create Dataset objects
  print("\nCreating People Dataset objects ")
  train_file = ".\\Data\\people_train.txt"
  train_ds = PeopleDataset(train_file)  # 200 rows
  test_file = ".\\Data\\people_test.txt"
  test_ds = PeopleDataset(test_file)  # 40 rows

  # 2. create network
  print("\nCreating 10-(15-15)-1 neural network ")
  net = Net().to(device)

  # 3. train model
  print("\nbat_size = 10 ")
  print("loss = MSELoss() ")
  print("optimizer = Adam ")
  print("lrn_rate = 0.01 ")
  print("\nStarting training")
  net.train()
  train(net, train_ds, bs=10, lr=0.01, me=1000, le=200)
  print("Done ")

  # 4. evaluate model accuracy
  print("\nComputing model accuracy (within 0.10 of true) ")
  net.eval()
  acc_train = accuracy(net, train_ds, 0.10)  # item-by-item
  print("Accuracy on train data = %0.4f" % acc_train)
  acc_test = accuracy_x(net, test_ds, 0.10)  # all-at-once
  print("Accuracy on test data = %0.4f" % acc_test)

  # 5. make a prediction
  print("\nPredicting income for M 34 Oklahoma moderate: ")
  x = np.array([[0.5,0, 0.34, 0,0,0,0.25, 0,0.3333,0]],
    dtype=np.float32)
  x = T.tensor(x, dtype=T.float32).to(device)
  with T.no_grad():
    pred_inc = net(x)
  pred_inc = pred_inc.item()  # scalar
  print("$%0.2f" % (pred_inc * 100_000))  # un-normalized

  # 6. save model (state_dict approach)
  # print("\nSaving trained model state")
  # fn = ".\\Models\\people_income_model.pt"
  # T.save(net.state_dict(), fn)
  # model = Net()
  # model.load_state_dict(T.load(fn))
  # use model to make prediction(s)

  print("\nEnd People income one-over-n-hot demo ")

if __name__ == "__main__":
  main()
Training data:
# people_train.txt
# one-over-n-hot encoding
#
# male = 0.5 0.0, female = 0.0 0.5
# age / 100
# colorado = 0.25 0 0 0, michigan = 0 0.25 0 0,
# nebraska = 0 0 0.25 0, oklahoma = 0 0 0 0.25
# conservative = 0.3333 0 0, moderate = 0 0.3333 0, liberal = 0 0 0.3333
# income / 100,000
#
0.0, 0.5, 0.24, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.29500
0.5, 0.0, 0.39, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.51200
0.0, 0.5, 0.63, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.75800
0.5, 0.0, 0.36, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.44500
0.0, 0.5, 0.27, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.28600
0.0, 0.5, 0.50, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.56500
0.0, 0.5, 0.50, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.55000
0.5, 0.0, 0.19, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.32700
0.0, 0.5, 0.22, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.27700
0.5, 0.0, 0.39, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.47100
0.0, 0.5, 0.34, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.39400
0.5, 0.0, 0.22, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.33500
0.0, 0.5, 0.35, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.35200
0.5, 0.0, 0.33, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.46400
0.0, 0.5, 0.45, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.54100
0.0, 0.5, 0.42, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.50700
0.5, 0.0, 0.33, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.46800
0.0, 0.5, 0.25, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.30000
0.5, 0.0, 0.31, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.46400
0.0, 0.5, 0.27, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.32500
0.0, 0.5, 0.48, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.54000
0.5, 0.0, 0.64, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.71300
0.0, 0.5, 0.61, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.72400
0.0, 0.5, 0.54, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.61000
0.0, 0.5, 0.29, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.36300
0.0, 0.5, 0.50, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.55000
0.0, 0.5, 0.55, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.62500
0.0, 0.5, 0.40, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.52400
0.0, 0.5, 0.22, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.23600
0.0, 0.5, 0.68, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.78400
0.5, 0.0, 0.60, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.71700
0.5, 0.0, 0.34, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.46500
0.5, 0.0, 0.25, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.37100
0.5, 0.0, 0.31, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.48900
0.0, 0.5, 0.43, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.48000
0.0, 0.5, 0.58, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.65400
0.5, 0.0, 0.55, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.60700
0.5, 0.0, 0.43, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.51100
0.5, 0.0, 0.43, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.53200
0.5, 0.0, 0.21, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.37200
0.0, 0.5, 0.55, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.64600
0.0, 0.5, 0.64, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.74800
0.5, 0.0, 0.41, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.58800
0.0, 0.5, 0.64, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.72700
0.5, 0.0, 0.56, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.66600
0.0, 0.5, 0.31, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.36000
0.5, 0.0, 0.65, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.70100
0.0, 0.5, 0.55, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.64300
0.5, 0.0, 0.25, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.40300
0.0, 0.5, 0.46, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.51000
0.5, 0.0, 0.36, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.53500
0.0, 0.5, 0.52, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.58100
0.0, 0.5, 0.61, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.67900
0.0, 0.5, 0.57, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.65700
0.5, 0.0, 0.46, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.52600
0.5, 0.0, 0.62, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.66800
0.0, 0.5, 0.55, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.62700
0.5, 0.0, 0.22, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.27700
0.5, 0.0, 0.50, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.62900
0.5, 0.0, 0.32, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.41800
0.5, 0.0, 0.21, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.35600
0.0, 0.5, 0.44, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.52000
0.0, 0.5, 0.46, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.51700
0.0, 0.5, 0.62, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.69700
0.0, 0.5, 0.57, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.66400
0.5, 0.0, 0.67, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.75800
0.0, 0.5, 0.29, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.34300
0.0, 0.5, 0.53, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.60100
0.5, 0.0, 0.44, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.54800
0.0, 0.5, 0.46, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.52300
0.5, 0.0, 0.20, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.30100
0.5, 0.0, 0.38, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.53500
0.0, 0.5, 0.50, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.58600
0.0, 0.5, 0.33, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.42500
0.5, 0.0, 0.33, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.39300
0.0, 0.5, 0.26, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.40400
0.0, 0.5, 0.58, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.70700
0.0, 0.5, 0.43, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.48000
0.5, 0.0, 0.46, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.64400
0.0, 0.5, 0.60, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.71700
0.5, 0.0, 0.42, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.48900
0.5, 0.0, 0.56, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.56400
0.5, 0.0, 0.62, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.66300
0.5, 0.0, 0.50, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.64800
0.0, 0.5, 0.47, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.52000
0.5, 0.0, 0.67, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.80400
0.5, 0.0, 0.40, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.50400
0.0, 0.5, 0.42, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.48400
0.0, 0.5, 0.64, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.72000
0.5, 0.0, 0.47, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.58700
0.0, 0.5, 0.45, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.52800
0.5, 0.0, 0.25, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.40900
0.0, 0.5, 0.38, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.48400
0.0, 0.5, 0.55, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.60000
0.5, 0.0, 0.44, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.60600
0.0, 0.5, 0.33, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.41000
0.0, 0.5, 0.34, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.39000
0.0, 0.5, 0.27, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.33700
0.0, 0.5, 0.32, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.40700
0.0, 0.5, 0.42, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.47000
0.5, 0.0, 0.24, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.40300
0.0, 0.5, 0.42, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.50300
0.0, 0.5, 0.25, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.28000
0.0, 0.5, 0.51, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.58000
0.5, 0.0, 0.55, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.63500
0.0, 0.5, 0.44, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.47800
0.5, 0.0, 0.18, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.39800
0.5, 0.0, 0.67, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.71600
0.0, 0.5, 0.45, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.50000
0.0, 0.5, 0.48, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.55800
0.5, 0.0, 0.25, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.39000
0.5, 0.0, 0.67, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.78300
0.0, 0.5, 0.37, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.42000
0.5, 0.0, 0.32, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.42700
0.0, 0.5, 0.48, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.57000
0.5, 0.0, 0.66, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.75000
0.0, 0.5, 0.61, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.70000
0.5, 0.0, 0.58, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.68900
0.0, 0.5, 0.19, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.24000
0.0, 0.5, 0.38, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.43000
0.5, 0.0, 0.27, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.36400
0.0, 0.5, 0.42, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.48000
0.0, 0.5, 0.60, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.71300
0.5, 0.0, 0.27, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.34800
0.0, 0.5, 0.29, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.37100
0.5, 0.0, 0.43, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.56700
0.0, 0.5, 0.48, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.56700
0.0, 0.5, 0.27, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.29400
0.5, 0.0, 0.44, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.55200
0.0, 0.5, 0.23, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.26300
0.5, 0.0, 0.36, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.53000
0.0, 0.5, 0.64, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.72500
0.0, 0.5, 0.29, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.30000
0.5, 0.0, 0.33, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.49300
0.5, 0.0, 0.66, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.75000
0.5, 0.0, 0.21, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.34300
0.0, 0.5, 0.27, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.32700
0.0, 0.5, 0.29, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.31800
0.5, 0.0, 0.31, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.48600
0.0, 0.5, 0.36, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.41000
0.0, 0.5, 0.49, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.55700
0.5, 0.0, 0.28, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.38400
0.5, 0.0, 0.43, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.56600
0.5, 0.0, 0.46, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.58800
0.0, 0.5, 0.57, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.69800
0.5, 0.0, 0.52, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.59400
0.5, 0.0, 0.31, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.43500
0.5, 0.0, 0.55, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.62000
0.0, 0.5, 0.50, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.56400
0.0, 0.5, 0.48, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.55900
0.5, 0.0, 0.22, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.34500
0.0, 0.5, 0.59, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.66700
0.0, 0.5, 0.34, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.42800
0.5, 0.0, 0.64, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.77200
0.0, 0.5, 0.29, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.33500
0.5, 0.0, 0.34, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.43200
0.5, 0.0, 0.61, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.75000
0.0, 0.5, 0.64, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.71100
0.5, 0.0, 0.29, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.41300
0.0, 0.5, 0.63, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.70600
0.5, 0.0, 0.29, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.40000
0.5, 0.0, 0.51, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.62700
0.5, 0.0, 0.24, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.37700
0.0, 0.5, 0.48, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.57500
0.0, 0.5, 0.18, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.27400
0.0, 0.5, 0.18, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.20300
0.0, 0.5, 0.33, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.38200
0.5, 0.0, 0.20, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.34800
0.0, 0.5, 0.29, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.33000
0.5, 0.0, 0.44, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.63000
0.5, 0.0, 0.65, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.81800
0.5, 0.0, 0.56, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.63700
0.5, 0.0, 0.52, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.58400
0.5, 0.0, 0.29, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.48600
0.5, 0.0, 0.47, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.58900
0.0, 0.5, 0.68, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.72600
0.0, 0.5, 0.31, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.36000
0.0, 0.5, 0.61, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.62500
0.0, 0.5, 0.19, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.21500
0.0, 0.5, 0.38, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.43000
0.5, 0.0, 0.26, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.42300
0.0, 0.5, 0.61, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.67400
0.0, 0.5, 0.40, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.46500
0.5, 0.0, 0.49, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.65200
0.0, 0.5, 0.56, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.67500
0.5, 0.0, 0.48, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.66000
0.0, 0.5, 0.52, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.56300
0.5, 0.0, 0.18, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.29800
0.5, 0.0, 0.56, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.59300
0.5, 0.0, 0.52, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.64400
0.5, 0.0, 0.18, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.28600
0.5, 0.0, 0.58, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.66200
0.5, 0.0, 0.39, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.55100
0.5, 0.0, 0.46, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.62900
0.5, 0.0, 0.40, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.46200
0.5, 0.0, 0.60, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.72700
0.0, 0.5, 0.36, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.40700
0.0, 0.5, 0.44, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.52300
0.0, 0.5, 0.28, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.31300
0.0, 0.5, 0.54, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.62600
Test data:
# people_test.txt
# one-over-n-hot encoding
#
0.5, 0.0, 0.51, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.61200
0.5, 0.0, 0.32, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.46100
0.0, 0.5, 0.55, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.62700
0.0, 0.5, 0.25, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.26200
0.0, 0.5, 0.33, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.37300
0.5, 0.0, 0.29, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.46200
0.0, 0.5, 0.65, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.72700
0.5, 0.0, 0.43, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.3333, 0.0000, 0.51400
0.5, 0.0, 0.54, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.64800
0.0, 0.5, 0.61, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.72700
0.0, 0.5, 0.52, 0.00, 0.00, 0.25, 0.00, 0.3333, 0.0000, 0.0000, 0.63600
0.0, 0.5, 0.30, 0.00, 0.00, 0.25, 0.00, 0.0000, 0.0000, 0.3333, 0.33500
0.0, 0.5, 0.29, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.31400
0.5, 0.0, 0.47, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.59400
0.0, 0.5, 0.39, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.47800
0.0, 0.5, 0.47, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.52000
0.5, 0.0, 0.49, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.58600
0.5, 0.0, 0.63, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.67400
0.5, 0.0, 0.30, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.39200
0.5, 0.0, 0.61, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.69600
0.5, 0.0, 0.47, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.58700
0.0, 0.5, 0.30, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.34500
0.5, 0.0, 0.51, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.58000
0.5, 0.0, 0.24, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.38800
0.5, 0.0, 0.49, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.64500
0.0, 0.5, 0.66, 0.00, 0.00, 0.00, 0.25, 0.3333, 0.0000, 0.0000, 0.74500
0.5, 0.0, 0.65, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.76900
0.5, 0.0, 0.46, 0.25, 0.00, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.58000
0.5, 0.0, 0.45, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.3333, 0.0000, 0.51800
0.5, 0.0, 0.47, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.63600
0.5, 0.0, 0.29, 0.00, 0.25, 0.00, 0.00, 0.3333, 0.0000, 0.0000, 0.44800
0.5, 0.0, 0.57, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.69300
0.5, 0.0, 0.20, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.28700
0.5, 0.0, 0.35, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.43400
0.5, 0.0, 0.61, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.67000
0.5, 0.0, 0.31, 0.25, 0.00, 0.00, 0.00, 0.0000, 0.3333, 0.0000, 0.37300
0.0, 0.5, 0.18, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.20800
0.0, 0.5, 0.26, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.29200
0.5, 0.0, 0.28, 0.00, 0.25, 0.00, 0.00, 0.0000, 0.0000, 0.3333, 0.36400
0.5, 0.0, 0.59, 0.00, 0.00, 0.00, 0.25, 0.0000, 0.0000, 0.3333, 0.69400
