Updating My JavaScript Binary Classification Neural Network

Once or twice a year, I revisit my JavaScript implementations of a neural network. The system is complex enough that there are dozens of ideas to explore.

My latest binary classification version makes many small changes relative to previous versions. The primary change is that I refactored the train() method from a single, very large method into one that calls three helper functions: zeroOutGrads(), accumGrads(y), and updateWeights(lrnRate). This change required me to store the hidden node and output node gradients as class matrices and vectors rather than as objects local to the train() method.
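The zero/accumulate/update choreography behind the refactoring can be sketched on its own. This is a minimal illustration, not the actual network code: the tiny one-weight "network" and its squared-error gradient are stand-ins, but the three helpers play the same roles as the ones just described.

```javascript
// Minimal sketch of the refactored batch-training pattern.
// TinyNet has a single weight so the bookkeeping is easy to follow.
class TinyNet {
  constructor() { this.w = 0.5; this.grad = 0.0; }
  zeroOutGrads() { this.grad = 0.0; }
  accumGrads(x, y) {
    // gradient of squared error (w*x - y)^2 with respect to w
    this.grad += 2.0 * (this.w * x - y) * x;
  }
  updateWeights(lrnRate) { this.w += -1.0 * lrnRate * this.grad; }
}

function trainSketch(net, xs, ys, lrnRate, batSize, maxEpochs) {
  let n = xs.length;
  for (let epoch = 0; epoch < maxEpochs; ++epoch) {
    for (let start = 0; start < n; start += batSize) {
      net.zeroOutGrads();                     // fresh accumulators
      for (let i = start; i < Math.min(start + batSize, n); ++i)
        net.accumGrads(xs[i], ys[i]);         // sum over the batch
      net.updateWeights(lrnRate);             // one step per batch
    }
  }
  return net.w;
}
```

The real train() method additionally shuffles the order of training items on each epoch; that bookkeeping is omitted here.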

For my demo program, I used one of my standard synthetic datasets. The goal is to predict a person’s sex from age, State, income, and political leaning. The 240-item tab-delimited raw data looks like:

F   24   michigan   29500.00   liberal
M   39   oklahoma   51200.00   moderate
F   63   nebraska   75800.00   conservative
M   36   michigan   44500.00   moderate
F   27   nebraska   28600.00   liberal
. . .

I encoded the sex target as M = 0 and F = 1, the State predictor as Michigan = 100, Nebraska = 010, Oklahoma = 001, and the political leaning predictor as conservative = 100, moderate = 010, liberal = 001. I normalized the numeric predictors by dividing age values by 100 and income values by 100,000. The resulting encoded and normalized comma-delimited data looks like:

 1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1
 0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0
 1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0
 0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0
 1, 0.27, 0, 1, 0, 0.2860, 0, 0, 1
. . .

I split the data into a 200-item set of training data and a 40-item set of test data.
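The encoding and normalization scheme can be sketched as a small function. This is an illustrative sketch only (the function name and the whitespace-splitting are my assumptions, not part of the demo program):

```javascript
// Encode one tab-delimited raw line using the scheme described above:
// sex M = 0 / F = 1, age / 100, one-hot State, income / 100000,
// one-hot political leaning.
function encodeLine(line) {
  const stateMap = { michigan: [1, 0, 0], nebraska: [0, 1, 0],
    oklahoma: [0, 0, 1] };
  const politicsMap = { conservative: [1, 0, 0],
    moderate: [0, 1, 0], liberal: [0, 0, 1] };
  const [sex, age, state, income, politics] =
    line.trim().split(/\s+/);
  return [
    sex === "F" ? 1 : 0,
    parseFloat(age) / 100,
    ...stateMap[state.toLowerCase()],
    parseFloat(income) / 100000,
    ...politicsMap[politics.toLowerCase()]
  ];
}
```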

My neural architecture was 8-25-1 with tanh() hidden node activation and logistic-sigmoid output node activation. For training I used a batch size of 10, a learning rate of 0.005, and 10,000 epochs.
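One consequence of the 8-25-1 architecture is that the network has (8 * 25) + 25 + (25 * 1) + 1 = 251 trainable weights and biases. A quick sketch of the count (the function name is just for illustration; the same expression appears in the getWeights() method in the demo code):

```javascript
// Parameter count for a fully connected ni-nh-no network:
// input-to-hidden weights, hidden biases, hidden-to-output
// weights, output biases.
function numWeights(ni, nh, no) {
  return (ni * nh) + nh + (nh * no) + no;
}
```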

The resulting model scored 0.9400 accuracy on the training data (188 out of 200 correct) and 0.7500 accuracy on the test data (30 out of 40 correct):

Accuracy on training data = 0.9400
Accuracy on test data     = 0.7500

Constructing confusion matrix
actual = 0:   16   10
actual = 1:    0   14
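As a sanity check, the 0.7500 test accuracy can be recovered from the confusion matrix, because the correct predictions lie on the main diagonal: (16 + 14) / 40 = 0.75. A minimal helper (illustrative only, not part of the demo program):

```javascript
// Accuracy from a 2x2 confusion matrix where rows are the actual
// class and columns are the predicted class.
function accuracyFromConfusion(cm) {
  const correct = cm[0][0] + cm[1][1];
  const total = cm[0][0] + cm[0][1] + cm[1][0] + cm[1][1];
  return correct / total;
}
```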

These results are similar to those achieved by a PyTorch neural network and a LightGBM tree-based system.

Good fun!



Movies filmed in black and white aren't really an example of a binary system because the images are shades of gray. Objectively, many of the old science fiction films I like that were filmed in black and white would probably be better in color. That said, here are three movies that I think are better in their original black and white than they would have been in color.

Left: The original “Godzilla” (1954) and the Americanized version (1956) were terrifying to me as a young man. The black and white enhanced the menace and is much better than the colorized versions of later Godzilla movies that look cartoonish.

Center: “Them!” (1954) was the first giant insect movie. Atomic testing produces giant ants that threaten to dominate the world. The scenes in the desert at night and during sandstorms were very effective. The black and white aided in the suspension of disbelief and made the story more plausible.

Right: “Quatermass II” (1957) aka “Enemy From Space” told the story of invading aliens who use small parasitic blobs to take control of a small English village and an adjoining industrial complex. The aliens use the complex to manufacture food for the planned full-scale invasion. The black and white cinematography lends a nightmarish quality to many of the scenes.


Demo code. Very long! Replace “lt” (less than), “gt”, “lte”, “gte”, “and” with Boolean operator symbols.

// people_gender.js
// node.js  ES6

// NN binary classification
// tanh, log-sigmoid activation, BCEE loss

let U = require("..\\Utils\\utilities_lib.js");
let FS = require("fs");

// ----------------------------------------------------------

class NeuralNet
{
  constructor(numInput, numHidden, numOutput, seed)
  {
    this.rnd = new U.Erratic(seed);  // pseudo-random

    this.ni = numInput; 
    this.nh = numHidden;
    this.no = numOutput;

    this.iNodes = U.vecMake(this.ni, 0.0);
    this.hNodes = U.vecMake(this.nh, 0.0);
    this.oNodes = U.vecMake(this.no, 0.0);

    this.ihWeights = U.matMake(this.ni, this.nh, 0.0);
    this.hoWeights = U.matMake(this.nh, this.no, 0.0);

    this.hBiases = U.vecMake(this.nh, 0.0);
    this.oBiases = U.vecMake(this.no, 0.0);

    this.ihGrads = U.matMake(this.ni, this.nh, 0.0);
    this.hbGrads = U.vecMake(this.nh, 0.0);
    this.hoGrads = U.matMake(this.nh, this.no, 0.0);
    this.obGrads = U.vecMake(this.no, 0.0);

    this.initWeights();
  }

  initWeights()
  {
    let lo = -0.10;
    let hi = 0.10;
    for (let i = 0; i "lt" this.ni; ++i) {
      for (let j = 0; j "lt" this.nh; ++j) {
        this.ihWeights[i][j] = (hi - lo) * 
          this.rnd.next() + lo;
      }
    }

    for (let j = 0; j "lt" this.nh; ++j) {
      for (let k = 0; k "lt" this.no; ++k) {
        this.hoWeights[j][k] = (hi - lo) * 
          this.rnd.next() + lo;
      }
    }
  } 

  computeOutput(X)
  {
    let hSums = U.vecMake(this.nh, 0.0);
    let oSums = U.vecMake(this.no, 0.0);
    
    this.iNodes = X;

    for (let j = 0; j "lt" this.nh; ++j) {
      for (let i = 0; i "lt" this.ni; ++i) {
        hSums[j] += this.iNodes[i] * this.ihWeights[i][j];
      }
      hSums[j] += this.hBiases[j];
      this.hNodes[j] = U.hyperTan(hSums[j]);
    }

    for (let k = 0; k "lt" this.no; ++k) {
      for (let j = 0; j "lt" this.nh; ++j) {
        oSums[k] += this.hNodes[j] * this.hoWeights[j][k];
      }
      oSums[k] += this.oBiases[k];
    }

    // apply output activation
    // this.oNodes = U.softmax(oSums);  // multi-class
    // this.oNodes = U.identity(oSums); // regression
    for (let k = 0; k "lt" this.no; ++k) {
      this.oNodes[k] = U.logSig(oSums[k]);  // binary
    }

    return this.oNodes[0];  // a single scalar value
  } // computeOutput()

  setWeights(wts)
  {
    // order: ihWts, hBiases, hoWts, oBiases
    let p = 0;

    for (let i = 0; i "lt" this.ni; ++i) {
      for (let j = 0; j "lt" this.nh; ++j) {
        this.ihWeights[i][j] = wts[p++];
      }
    }

    for (let j = 0; j "lt" this.nh; ++j) {
      this.hBiases[j] = wts[p++];
    }

    for (let j = 0; j "lt" this.nh; ++j) {
      for (let k = 0; k "lt" this.no; ++k) {
        this.hoWeights[j][k] = wts[p++];
      }
    }

    for (let k = 0; k "lt" this.no; ++k) {
      this.oBiases[k] = wts[p++];
    }
  } // setWeights()

  getWeights()
  {
    // order: ihWts, hBiases, hoWts, oBiases
    let numWts = (this.ni * this.nh) + this.nh +
      (this.nh * this.no) + this.no;
    let result = U.vecMake(numWts, 0.0);
    let p = 0;
    for (let i = 0; i "lt" this.ni; ++i) {
      for (let j = 0; j "lt" this.nh; ++j) {
        result[p++] = this.ihWeights[i][j];
      }
    }

    for (let j = 0; j "lt" this.nh; ++j) {
      result[p++] = this.hBiases[j];
    }

    for (let j = 0; j "lt" this.nh; ++j) {
      for (let k = 0; k "lt" this.no; ++k) {
        result[p++] = this.hoWeights[j][k];
      }
    }

    for (let k = 0; k "lt" this.no; ++k) {
      result[p++] = this.oBiases[k];
    }
    return result;
  } // getWeights()

  shuffle(v)
  {
    // Fisher-Yates
    let n = v.length;
    for (let i = 0; i "lt" n; ++i) {
      let r = this.rnd.nextInt(i, n);
      let tmp = v[r];
      v[r] = v[i];
      v[i] = tmp;
    }
  }

  // --------------------------------------------------------
  // helpers for train(): zeroOutGrads(), accumGrads(y),
  //   updateWeights(lrnRate)
  // --------------------------------------------------------

  zeroOutGrads()
  {
    for (let i = 0; i "lt" this.ni; ++i)
      for (let j = 0; j "lt" this.nh; ++j)
        this.ihGrads[i][j] = 0.0;

    for (let j = 0; j "lt" this.nh; ++j)
      this.hbGrads[j] = 0.0;

    for (let j = 0; j "lt" this.nh; ++j)
      for (let k = 0; k "lt" this.no; ++k)
        this.hoGrads[j][k] = 0.0;

    for (let k = 0; k "lt" this.no; ++k)
      this.obGrads[k] = 0.0;
  }

  accumGrads(y)
  {
    // y is target scalar
    let oSignals = U.vecMake(this.no, 0.0);
    let hSignals = U.vecMake(this.nh, 0.0);

    // 1. compute output node scratch signals 
    for (let k = 0; k "lt" this.no; ++k) {
      let derivative = 1.0;  // CEE
      // let derivative =
      //  this.oNodes[k] * (1 - this.oNodes[k]); // MSE
      oSignals[k] = derivative *
        (this.oNodes[k] - y);  // CEE
    }

    // 2. accum hidden-to-output gradients 
    for (let j = 0; j "lt" this.nh; ++j)
      for (let k = 0; k "lt" this.no; ++k)
        this.hoGrads[j][k] += oSignals[k] * this.hNodes[j];

    // 3. accum output node bias gradients
    for (let k = 0; k "lt" this.no; ++k)
      this.obGrads[k] += oSignals[k] * 1.0;  // 1.0 dummy 

    // 4. compute hidden node signals
    for (let j = 0; j "lt" this.nh; ++j) {
      let sum = 0.0;
      for (let k = 0; k "lt" this.no; ++k)
        sum += oSignals[k] * this.hoWeights[j][k];

      let derivative =
        (1 - this.hNodes[j]) *
        (1 + this.hNodes[j]);  // assumes tanh
      hSignals[j] = derivative * sum;
    }

    // 5. accum input-to-hidden gradients
    for (let i = 0; i "lt" this.ni; ++i)
      for (let j = 0; j "lt" this.nh; ++j)
        this.ihGrads[i][j] += hSignals[j] * this.iNodes[i];

    // 6. accum hidden node bias gradients
    for (let j = 0; j "lt" this.nh; ++j)
      this.hbGrads[j] += hSignals[j] * 1.0;  // 1.0 dummy
  } // accumGrads
  
  updateWeights(lrnRate)
  {
    // assumes all gradients computed
    // 1. update input-to-hidden weights
    for (let i = 0; i "lt" this.ni; ++i) {
      for (let j = 0; j "lt" this.nh; ++j) {
        let delta = -1.0 * lrnRate * this.ihGrads[i][j];
        this.ihWeights[i][j] += delta;
      }
    }

    // 2. update hidden node biases
    for (let j = 0; j "lt" this.nh; ++j) {
      let delta = -1.0 * lrnRate * this.hbGrads[j];
      this.hBiases[j] += delta;
    }

    // 3. update hidden-to-output weights
    for (let j = 0; j "lt" this.nh; ++j) {
      for (let k = 0; k "lt" this.no; ++k) {
        let delta = -1.0 * lrnRate * this.hoGrads[j][k];
        this.hoWeights[j][k] += delta;
      }
    }

    // 4. update output node biases
    for (let k = 0; k "lt" this.no; ++k) {
      let delta = -1.0 * lrnRate * this.obGrads[k];
      this.oBiases[k] += delta;
    }
  } // updateWeights()

  // --------------------------------------------------------

  train(trainX, trainY, lrnRate, batSize, maxEpochs)
  {
    let n = trainX.length;  // 200
    let batchesPerEpoch = Math.trunc(n / batSize);  // 20
    let freq = Math.trunc(maxEpochs / 10);  // progress
    let indices = U.arange(n);

    // ----------------------------------------------------
    //
    // n = 200; bs = 10
    // batches per epoch = 200 / 10 = 20

    // for epoch = 0; epoch "lt" maxEpochs; ++epoch
    //   for batch = 0; batch "lt" bpe; ++batch
    //     for item = 0; item "lt" bs; ++item
    //       compute output
    //       accum grads
    //     end-item
    //     update weights
    //     zero-out grads
    //   end-batches
    //   shuffle indices
    // end-epochs
    //
    // ----------------------------------------------------

    for (let epoch = 0; epoch "lt" maxEpochs; ++epoch) {
      this.shuffle(indices);
      let ptr = 0;  // points into indices
      for (let batIdx = 0; batIdx "lt" batchesPerEpoch;
        ++batIdx) // 0, 1, . . 19
      {
        for (let i = 0; i "lt" batSize; ++i) { // 0 . . 9
          let ii = indices[ptr++];  // compute output
          let x = trainX[ii];
          let y = trainY[ii];
          this.computeOutput(x);  // into this.oNodes
          this.accumGrads(y);
        }
        this.updateWeights(lrnRate);
        this.zeroOutGrads(); // prep for next batch
      } // batches

      if (epoch % freq == 0) {
        // let mse = 
        // this.meanSqErr(trainX, trainY).toFixed(4);
        let mbce = 
          this.meanBCE(trainX, trainY).toFixed(4);
        let acc = this.accuracy(trainX,
          trainY).toFixed(4);

        let s1 = "epoch: " +
          epoch.toString().padStart(6, ' ');
        let s2 = "   MBCE = " + 
          mbce.toString().padStart(8, ' ');
        let s3 = "   acc = " + acc.toString();

        console.log(s1 + s2 + s3);
      }
    } // epoch
  } // train

  // -------------------------------------------------------- 

  meanBCE(trainX, trainY)
  {
    // mean binary cross entropy error
    // for arbitrary target yi and predicted pi:
    // bcee = -1 * [ (yi * log(pi)) + ((1-yi) * log(1-pi)) ]
    // 
    // if yi target == 0 or 1 only: 
    // when yi == 1, bcee = -1 * [log(pi)]
    // when yi == 0, bcee = -1 * [log(1-pi)]
    let n = trainX.length;
    let err = 0.0;
    for (let i = 0; i "lt" n; ++i) {
      let predY = this.computeOutput(trainX[i]);
      let actualY = trainY[i];   // 0.0 or 1.0
      if (Math.trunc(actualY) == 1) // target == 1
        err += -Math.log(predY);
      else                          // target == 0
        err += -Math.log(1.0 - predY);
    }
    return err / n;
  } 

  meanSqErr(dataX, dataY)
  {
    // for regression
    let sumSE = 0.0;
    for (let i = 0; i "lt" dataX.length; ++i) {
      let X = dataX[i];
      let Y = dataY[i];  // target value
      let oupt = this.computeOutput(X); 
      sumSE += (Y - oupt) * (Y - oupt); 
    }
    return sumSE / dataX.length;  // consider Root MSE
  } 

  accuracy(dataX, dataY)
  {
    let nc = 0; let nw = 0;
    for (let i = 0; i "lt" dataX.length; ++i) { 
      let X = dataX[i];
      let y = dataY[i];  // target sex (0 or 1)
      let pred = this.computeOutput(X);
      if (Math.trunc(y) == 0 "and" pred "lt" 0.5) {
        ++nc;
      }
      else if (Math.trunc(y) == 1 "and" pred "gte" 0.5) {
        ++nc;
      }
      else {
        ++nw;
      }
    }
    return nc / (nc + nw);
  }

  confusionMatrix(dataX, dataY)
  {
    let result = U.matMake(2, 2, 0);  // 2x2
    let n = dataX.length;
    for (let i = 0; i "lt" n; ++i) {
      let predY = this.computeOutput(dataX[i]); 
      let actualY = dataY[i];  // 0.0 or 1.0

      if (Math.trunc(actualY) == 0 "and" 
        predY "lt" 0.5) {
        ++result[0][0];
      }
      else if (Math.trunc(actualY) == 0 "and" 
        predY "gte" 0.5) {
        ++result[0][1];
      }
      else if (Math.trunc(actualY) == 1 "and" 
        predY "lt" 0.5) {
        ++result[1][0];
      }
      else if (Math.trunc(actualY) == 1 "and" 
        predY "gte" 0.5) {
        ++result[1][1];
      }
    }
    return result;
  }

  showConfusion(cm)
  {
    process.stdout.write("actual = 0: ");
    process.stdout.write(cm[0][0].toString().
      padStart(4, " ") + " ");
    console.log(cm[0][1].toString().
      padStart(4, " "));
    process.stdout.write("actual = 1: ");
    process.stdout.write(cm[1][0].toString().
      padStart(4, " ") + " ");
    console.log(cm[1][1].toString().
      padStart(4, " "));
  }

  saveWeights(fn)
  {
    let wts = this.getWeights();
    let n = wts.length;
    let s = "";
    for (let i = 0; i "lt" n-1; ++i) {
      s += wts[i].toString() + ",";
    }
    s += wts[n-1];

    FS.writeFileSync(fn, s);
  }

  loadWeights(fn)
  {
    let n = (this.ni * this.nh) + this.nh +
      (this.nh * this.no) + this.no;
    let wts = U.vecMake(n, 0.0);
    let all = FS.readFileSync(fn, "utf8");
    let strVals = all.split(",");
    let nn = strVals.length;
    if (n != nn) {
      throw("Size error in NeuralNet.loadWeights()");
    }
    for (let i = 0; i "lt" n; ++i) {
      wts[i] = parseFloat(strVals[i]);
    }
    this.setWeights(wts);
  }

} // NeuralNet

// ----------------------------------------------------------

function main()
{
  // process.stdout.write("\033[0m");  // reset
  // process.stdout.write("\x1b[1m" + "\x1b[37m"); // white
  console.log("\nBegin JavaScript binary classification ");
  console.log("Predict sex from age, State," +
    " income, politics ");
  
  // 1. load data
  //  0  0.29  1 0 0  0.65400  0 0 1
  //  1  0.36  0 0 1  0.58300  1 0 0
  console.log("\nLoading data into memory ");
  let trainX = U.loadTxt(".\\Data\\people_train.txt", ",",
    [1,2,3,4,5,6,7,8], "#");
  let trainY = U.loadTxt(".\\Data\\people_train.txt", ",",
    [0], "#");
  trainY = U.matToVec(trainY);
  let testX = U.loadTxt(".\\Data\\people_test.txt", ",",
    [1,2,3,4,5,6,7,8], "#");
  let testY = U.loadTxt(".\\Data\\people_test.txt", ",",
    [0], "#");
  testY = U.matToVec(testY);

  // 2. create network
  console.log("\nCreating 8-25-1 tanh, sigmoid BCEE NN ");
  let seed = 0;
  let nn = new NeuralNet(8, 25, 1, seed);

  // 3. train network
  let lrnRate = 0.005;
  let maxEpochs = 10000;
  console.log("\nSetting learn rate = 0.005 ");
  console.log("Setting bat size = 10 ");
  nn.train(trainX, trainY, lrnRate, 10, maxEpochs);
  console.log("Training complete ");

  // 4. evaluate model
  let trainAcc = nn.accuracy(trainX, trainY);
  let testAcc = nn.accuracy(testX, testY);
  console.log("\nAccuracy on training data = " +
    trainAcc.toFixed(4).toString()); 
  console.log("Accuracy on test data     = " +
    testAcc.toFixed(4).toString());

  // 4b. confusion
  console.log("\nConstructing confusion matrix ");
  let cm = nn.confusionMatrix(testX, testY);
  nn.showConfusion(cm);

  // 5. save trained model
  let fn = ".\\Models\\people_gender_wts.txt";
  console.log("\nSaving model weights and biases to: ");
  console.log(fn);
  nn.saveWeights(fn);

  // 6. use trained model
  console.log("\nPredict for 30 Oklahoma $40,000 moderate ");
  let x = [0.30, 0,0,1, 0.40000, 0,1,0];
  let predicted = nn.computeOutput(x);
  console.log("\nPredicted sex (0 = M, 1 = F): ");
  console.log(predicted.toFixed(4).toString());

  //process.stdout.write("\033[0m");  // reset
  console.log("\nEnd demo");
}

main()

Code for utility functions:

// utilities_lib.js
// ES6

let FS = require('fs');

// ----------------------------------------------------------

function loadTxt(fn, delimit, usecols, comment) {
  // efficient but mildly complicated
  let all = FS.readFileSync(fn, "utf8");  // giant string
  all = all.trim();  // strip final crlf in file
  let lines = all.split("\n");  // array of lines

  // count number non-comment lines
  let nRows = 0;
  for (let i = 0; i "lt" lines.length; ++i) {
    if (!lines[i].startsWith(comment))
      ++nRows;
  }
  let nCols = usecols.length;
  let result = matMake(nRows, nCols, 0.0); 
 
  let r = 0;  // into lines
  let i = 0;  // into result[][]
  while (r "lt" lines.length) {
    if (lines[r].startsWith(comment)) {
      ++r;  // next row
    }
    else {
      let tokens = lines[r].split(delimit);
      for (let j = 0; j "lt" nCols; ++j) {
        result[i][j] = parseFloat(tokens[usecols[j]]);
      }
      ++r;
      ++i;
    }
  }

  return result;
}

// ----------------------------------------------------------

function arange(n)
{
  let result = [];
  for (let i = 0; i "lt" n; ++i) {
    result[i] = Math.trunc(i);
  }
  return result;
}

// ----------------------------------------------------------

class Erratic
{
  constructor(seed)
  {
    this.seed = seed + 0.5;  // avoid 0
  }

  next()
  {
    let x = Math.sin(this.seed) * 1000;
    let result = x - Math.floor(x);  // [0.0,1.0)
    this.seed = result;  // for next call
    return result;
  }

  nextInt(lo, hi)
  {
    let x = this.next();
    return Math.trunc((hi - lo) * x + lo);
  }
}

// ----------------------------------------------------------

function vecMake(n, val)
{
  let result = [];
  for (let i = 0; i "lt" n; ++i) {
    result[i] = val;
  }
  return result;
}

function matMake(rows, cols, val)
{
  let result = [];
  for (let i = 0; i "lt" rows; ++i) {
    result[i] = [];
    for (let j = 0; j "lt" cols; ++j) {
      result[i][j] = val;
    }
  }
  return result;
}

function matToOneHot(m, n)
{
  // convert ordinal (0,1,2 . .) to one-hot
  let rows = m.length;
  let cols = m[0].length;
  let result = matMake(rows, n, 0.0);
  for (let i = 0; i "lt" rows; ++i) {
    let k = Math.trunc(m[i][0]);  // 0,1,2 . .
    result[i] = vecMake(n, 0.0);  // [0.0  0.0  0.0]
    result[i][k] = 1.0;  // [ 0.0  1.0  0.0]
  }

  return result;
}

function matToVec(m)
{
  let r = m.length;
  let c = m[0].length;
  let result = vecMake(r*c, 0.0);
  let k = 0;
  for (let i = 0; i "lt" r; ++i) {
    for (let j = 0; j "lt" c; ++j) {
      result[k++] = m[i][j];
    }
  }
  return result;
}

function vecShow(vec, dec, wid, nl)
{
  for (let i = 0; i "lt" vec.length; ++i) {
    let x = vec[i];
    if (Math.abs(x) "lt" 0.000001) x = 0.0  // avoid -0.00
    let xx = x.toFixed(dec);
    let s = xx.toString().padStart(wid, ' ');
    process.stdout.write(s);
    process.stdout.write(" ");
  }

  if (nl == true)
    process.stdout.write("\n");
}


function matShow(m, dec, wid)
{
  let rows = m.length;
  let cols = m[0].length;
  for (let i = 0; i "lt" rows; ++i) {
    for (let j = 0; j "lt" cols; ++j) {
      if (m[i][j] "gte" 0.0) {
        process.stdout.write(" ");  // + or - space
      }
      process.stdout.write(m[i][j].toFixed(dec));
      process.stdout.write("  ");
    }
    process.stdout.write("\n");
  }
}

function argmax(v)
{
  let result = 0;
  let m = v[0];
  for (let i = 0; i "lt" v.length; ++i) {
    if (v[i] "gt" m) {
      m = v[i];
      result = i;
    }
  }
  return result;
}

function hyperTan(x)
{
  if (x "lt" -10.0) {
    return -1.0;
  }
  else if (x "gt" 10.0) {
    return 1.0;
  }
  else {
    return Math.tanh(x);
  }
}

function logSig(x)
{
  if (x "lt" -10.0) {
    return 0.0;
  }
  else if (x "gt" 10.0) {
    return 1.0;
  }
  else {
    return 1.0 / (1.0 + Math.exp(-x));
  }
}

function vecMax(vec)
{
  let mx = vec[0];
  for (let i = 0; i "lt" vec.length; ++i) {
    if (vec[i] "gt" mx) {
      mx = vec[i];
    }
  }
  return mx;
}

function softmax(vec)
{
  //let m = Math.max(...vec);  // or 'spread' operator
  let m = vecMax(vec);
  let result = [];
  let sum = 0.0;
  for (let i = 0; i "lt" vec.length; ++i) {
    result[i] = Math.exp(vec[i] - m);
    sum += result[i];
  }
  for (let i = 0; i "lt" result.length; ++i) {
    result[i] = result[i] / sum;
  }
  return result;
}

module.exports = {
  vecMake,
  matMake,
  matToOneHot,
  matToVec,
  vecShow,
  matShow,
  argmax,
  loadTxt,
  arange,
  Erratic,
  hyperTan,
  logSig,
  vecMax,
  softmax
};

Training data:

# people_train.txt
# sex (0 = male, 1 = female) - dependent variable
# age, state (michigan, nebraska, oklahoma), income,
# politics type (conservative, moderate, liberal)
#
1, 0.24, 1, 0, 0, 0.2950, 0, 0, 1
0, 0.39, 0, 0, 1, 0.5120, 0, 1, 0
1, 0.63, 0, 1, 0, 0.7580, 1, 0, 0
0, 0.36, 1, 0, 0, 0.4450, 0, 1, 0
1, 0.27, 0, 1, 0, 0.2860, 0, 0, 1
1, 0.50, 0, 1, 0, 0.5650, 0, 1, 0
1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0
0, 0.19, 0, 0, 1, 0.3270, 1, 0, 0
1, 0.22, 0, 1, 0, 0.2770, 0, 1, 0
0, 0.39, 0, 0, 1, 0.4710, 0, 0, 1
1, 0.34, 1, 0, 0, 0.3940, 0, 1, 0
0, 0.22, 1, 0, 0, 0.3350, 1, 0, 0
1, 0.35, 0, 0, 1, 0.3520, 0, 0, 1
0, 0.33, 0, 1, 0, 0.4640, 0, 1, 0
1, 0.45, 0, 1, 0, 0.5410, 0, 1, 0
1, 0.42, 0, 1, 0, 0.5070, 0, 1, 0
0, 0.33, 0, 1, 0, 0.4680, 0, 1, 0
1, 0.25, 0, 0, 1, 0.3000, 0, 1, 0
0, 0.31, 0, 1, 0, 0.4640, 1, 0, 0
1, 0.27, 1, 0, 0, 0.3250, 0, 0, 1
1, 0.48, 1, 0, 0, 0.5400, 0, 1, 0
0, 0.64, 0, 1, 0, 0.7130, 0, 0, 1
1, 0.61, 0, 1, 0, 0.7240, 1, 0, 0
1, 0.54, 0, 0, 1, 0.6100, 1, 0, 0
1, 0.29, 1, 0, 0, 0.3630, 1, 0, 0
1, 0.50, 0, 0, 1, 0.5500, 0, 1, 0
1, 0.55, 0, 0, 1, 0.6250, 1, 0, 0
1, 0.40, 1, 0, 0, 0.5240, 1, 0, 0
1, 0.22, 1, 0, 0, 0.2360, 0, 0, 1
1, 0.68, 0, 1, 0, 0.7840, 1, 0, 0
0, 0.60, 1, 0, 0, 0.7170, 0, 0, 1
0, 0.34, 0, 0, 1, 0.4650, 0, 1, 0
0, 0.25, 0, 0, 1, 0.3710, 1, 0, 0
0, 0.31, 0, 1, 0, 0.4890, 0, 1, 0
1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0
1, 0.58, 0, 1, 0, 0.6540, 0, 0, 1
0, 0.55, 0, 1, 0, 0.6070, 0, 0, 1
0, 0.43, 0, 1, 0, 0.5110, 0, 1, 0
0, 0.43, 0, 0, 1, 0.5320, 0, 1, 0
0, 0.21, 1, 0, 0, 0.3720, 1, 0, 0
1, 0.55, 0, 0, 1, 0.6460, 1, 0, 0
1, 0.64, 0, 1, 0, 0.7480, 1, 0, 0
0, 0.41, 1, 0, 0, 0.5880, 0, 1, 0
1, 0.64, 0, 0, 1, 0.7270, 1, 0, 0
0, 0.56, 0, 0, 1, 0.6660, 0, 0, 1
1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0
0, 0.65, 0, 0, 1, 0.7010, 0, 0, 1
1, 0.55, 0, 0, 1, 0.6430, 1, 0, 0
0, 0.25, 1, 0, 0, 0.4030, 1, 0, 0
1, 0.46, 0, 0, 1, 0.5100, 0, 1, 0
0, 0.36, 1, 0, 0, 0.5350, 1, 0, 0
1, 0.52, 0, 1, 0, 0.5810, 0, 1, 0
1, 0.61, 0, 0, 1, 0.6790, 1, 0, 0
1, 0.57, 0, 0, 1, 0.6570, 1, 0, 0
0, 0.46, 0, 1, 0, 0.5260, 0, 1, 0
0, 0.62, 1, 0, 0, 0.6680, 0, 0, 1
1, 0.55, 0, 0, 1, 0.6270, 1, 0, 0
0, 0.22, 0, 0, 1, 0.2770, 0, 1, 0
0, 0.50, 1, 0, 0, 0.6290, 1, 0, 0
0, 0.32, 0, 1, 0, 0.4180, 0, 1, 0
0, 0.21, 0, 0, 1, 0.3560, 1, 0, 0
1, 0.44, 0, 1, 0, 0.5200, 0, 1, 0
1, 0.46, 0, 1, 0, 0.5170, 0, 1, 0
1, 0.62, 0, 1, 0, 0.6970, 1, 0, 0
1, 0.57, 0, 1, 0, 0.6640, 1, 0, 0
0, 0.67, 0, 0, 1, 0.7580, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3430, 0, 0, 1
1, 0.53, 1, 0, 0, 0.6010, 1, 0, 0
0, 0.44, 1, 0, 0, 0.5480, 0, 1, 0
1, 0.46, 0, 1, 0, 0.5230, 0, 1, 0
0, 0.20, 0, 1, 0, 0.3010, 0, 1, 0
0, 0.38, 1, 0, 0, 0.5350, 0, 1, 0
1, 0.50, 0, 1, 0, 0.5860, 0, 1, 0
1, 0.33, 0, 1, 0, 0.4250, 0, 1, 0
0, 0.33, 0, 1, 0, 0.3930, 0, 1, 0
1, 0.26, 0, 1, 0, 0.4040, 1, 0, 0
1, 0.58, 1, 0, 0, 0.7070, 1, 0, 0
1, 0.43, 0, 0, 1, 0.4800, 0, 1, 0
0, 0.46, 1, 0, 0, 0.6440, 1, 0, 0
1, 0.60, 1, 0, 0, 0.7170, 1, 0, 0
0, 0.42, 1, 0, 0, 0.4890, 0, 1, 0
0, 0.56, 0, 0, 1, 0.5640, 0, 0, 1
0, 0.62, 0, 1, 0, 0.6630, 0, 0, 1
0, 0.50, 1, 0, 0, 0.6480, 0, 1, 0
1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0
0, 0.67, 0, 1, 0, 0.8040, 0, 0, 1
0, 0.40, 0, 0, 1, 0.5040, 0, 1, 0
1, 0.42, 0, 1, 0, 0.4840, 0, 1, 0
1, 0.64, 1, 0, 0, 0.7200, 1, 0, 0
0, 0.47, 1, 0, 0, 0.5870, 0, 0, 1
1, 0.45, 0, 1, 0, 0.5280, 0, 1, 0
0, 0.25, 0, 0, 1, 0.4090, 1, 0, 0
1, 0.38, 1, 0, 0, 0.4840, 1, 0, 0
1, 0.55, 0, 0, 1, 0.6000, 0, 1, 0
0, 0.44, 1, 0, 0, 0.6060, 0, 1, 0
1, 0.33, 1, 0, 0, 0.4100, 0, 1, 0
1, 0.34, 0, 0, 1, 0.3900, 0, 1, 0
1, 0.27, 0, 1, 0, 0.3370, 0, 0, 1
1, 0.32, 0, 1, 0, 0.4070, 0, 1, 0
1, 0.42, 0, 0, 1, 0.4700, 0, 1, 0
0, 0.24, 0, 0, 1, 0.4030, 1, 0, 0
1, 0.42, 0, 1, 0, 0.5030, 0, 1, 0
1, 0.25, 0, 0, 1, 0.2800, 0, 0, 1
1, 0.51, 0, 1, 0, 0.5800, 0, 1, 0
0, 0.55, 0, 1, 0, 0.6350, 0, 0, 1
1, 0.44, 1, 0, 0, 0.4780, 0, 0, 1
0, 0.18, 1, 0, 0, 0.3980, 1, 0, 0
0, 0.67, 0, 1, 0, 0.7160, 0, 0, 1
1, 0.45, 0, 0, 1, 0.5000, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5580, 0, 1, 0
0, 0.25, 0, 1, 0, 0.3900, 0, 1, 0
0, 0.67, 1, 0, 0, 0.7830, 0, 1, 0
1, 0.37, 0, 0, 1, 0.4200, 0, 1, 0
0, 0.32, 1, 0, 0, 0.4270, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5700, 0, 1, 0
0, 0.66, 0, 0, 1, 0.7500, 0, 0, 1
1, 0.61, 1, 0, 0, 0.7000, 1, 0, 0
0, 0.58, 0, 0, 1, 0.6890, 0, 1, 0
1, 0.19, 1, 0, 0, 0.2400, 0, 0, 1
1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0
0, 0.27, 1, 0, 0, 0.3640, 0, 1, 0
1, 0.42, 1, 0, 0, 0.4800, 0, 1, 0
1, 0.60, 1, 0, 0, 0.7130, 1, 0, 0
0, 0.27, 0, 0, 1, 0.3480, 1, 0, 0
1, 0.29, 0, 1, 0, 0.3710, 1, 0, 0
0, 0.43, 1, 0, 0, 0.5670, 0, 1, 0
1, 0.48, 1, 0, 0, 0.5670, 0, 1, 0
1, 0.27, 0, 0, 1, 0.2940, 0, 0, 1
0, 0.44, 1, 0, 0, 0.5520, 1, 0, 0
1, 0.23, 0, 1, 0, 0.2630, 0, 0, 1
0, 0.36, 0, 1, 0, 0.5300, 0, 0, 1
1, 0.64, 0, 0, 1, 0.7250, 1, 0, 0
1, 0.29, 0, 0, 1, 0.3000, 0, 0, 1
0, 0.33, 1, 0, 0, 0.4930, 0, 1, 0
0, 0.66, 0, 1, 0, 0.7500, 0, 0, 1
0, 0.21, 0, 0, 1, 0.3430, 1, 0, 0
1, 0.27, 1, 0, 0, 0.3270, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3180, 0, 0, 1
0, 0.31, 1, 0, 0, 0.4860, 0, 1, 0
1, 0.36, 0, 0, 1, 0.4100, 0, 1, 0
1, 0.49, 0, 1, 0, 0.5570, 0, 1, 0
0, 0.28, 1, 0, 0, 0.3840, 1, 0, 0
0, 0.43, 0, 0, 1, 0.5660, 0, 1, 0
0, 0.46, 0, 1, 0, 0.5880, 0, 1, 0
1, 0.57, 1, 0, 0, 0.6980, 1, 0, 0
0, 0.52, 0, 0, 1, 0.5940, 0, 1, 0
0, 0.31, 0, 0, 1, 0.4350, 0, 1, 0
0, 0.55, 1, 0, 0, 0.6200, 0, 0, 1
1, 0.50, 1, 0, 0, 0.5640, 0, 1, 0
1, 0.48, 0, 1, 0, 0.5590, 0, 1, 0
0, 0.22, 0, 0, 1, 0.3450, 1, 0, 0
1, 0.59, 0, 0, 1, 0.6670, 1, 0, 0
1, 0.34, 1, 0, 0, 0.4280, 0, 0, 1
0, 0.64, 1, 0, 0, 0.7720, 0, 0, 1
1, 0.29, 0, 0, 1, 0.3350, 0, 0, 1
0, 0.34, 0, 1, 0, 0.4320, 0, 1, 0
0, 0.61, 1, 0, 0, 0.7500, 0, 0, 1
1, 0.64, 0, 0, 1, 0.7110, 1, 0, 0
0, 0.29, 1, 0, 0, 0.4130, 1, 0, 0
1, 0.63, 0, 1, 0, 0.7060, 1, 0, 0
0, 0.29, 0, 1, 0, 0.4000, 1, 0, 0
0, 0.51, 1, 0, 0, 0.6270, 0, 1, 0
0, 0.24, 0, 0, 1, 0.3770, 1, 0, 0
1, 0.48, 0, 1, 0, 0.5750, 0, 1, 0
1, 0.18, 1, 0, 0, 0.2740, 1, 0, 0
1, 0.18, 1, 0, 0, 0.2030, 0, 0, 1
1, 0.33, 0, 1, 0, 0.3820, 0, 0, 1
0, 0.20, 0, 0, 1, 0.3480, 1, 0, 0
1, 0.29, 0, 0, 1, 0.3300, 0, 0, 1
0, 0.44, 0, 0, 1, 0.6300, 1, 0, 0
0, 0.65, 0, 0, 1, 0.8180, 1, 0, 0
0, 0.56, 1, 0, 0, 0.6370, 0, 0, 1
0, 0.52, 0, 0, 1, 0.5840, 0, 1, 0
0, 0.29, 0, 1, 0, 0.4860, 1, 0, 0
0, 0.47, 0, 1, 0, 0.5890, 0, 1, 0
1, 0.68, 1, 0, 0, 0.7260, 0, 0, 1
1, 0.31, 0, 0, 1, 0.3600, 0, 1, 0
1, 0.61, 0, 1, 0, 0.6250, 0, 0, 1
1, 0.19, 0, 1, 0, 0.2150, 0, 0, 1
1, 0.38, 0, 0, 1, 0.4300, 0, 1, 0
0, 0.26, 1, 0, 0, 0.4230, 1, 0, 0
1, 0.61, 0, 1, 0, 0.6740, 1, 0, 0
1, 0.40, 1, 0, 0, 0.4650, 0, 1, 0
0, 0.49, 1, 0, 0, 0.6520, 0, 1, 0
1, 0.56, 1, 0, 0, 0.6750, 1, 0, 0
0, 0.48, 0, 1, 0, 0.6600, 0, 1, 0
1, 0.52, 1, 0, 0, 0.5630, 0, 0, 1
0, 0.18, 1, 0, 0, 0.2980, 1, 0, 0
0, 0.56, 0, 0, 1, 0.5930, 0, 0, 1
0, 0.52, 0, 1, 0, 0.6440, 0, 1, 0
0, 0.18, 0, 1, 0, 0.2860, 0, 1, 0
0, 0.58, 1, 0, 0, 0.6620, 0, 0, 1
0, 0.39, 0, 1, 0, 0.5510, 0, 1, 0
0, 0.46, 1, 0, 0, 0.6290, 0, 1, 0
0, 0.40, 0, 1, 0, 0.4620, 0, 1, 0
0, 0.60, 1, 0, 0, 0.7270, 0, 0, 1
1, 0.36, 0, 1, 0, 0.4070, 0, 0, 1
1, 0.44, 1, 0, 0, 0.5230, 0, 1, 0
1, 0.28, 1, 0, 0, 0.3130, 0, 0, 1
1, 0.54, 0, 0, 1, 0.6260, 1, 0, 0

Test data:

# people_test.txt
#
0, 0.51, 1, 0, 0, 0.6120, 0, 1, 0
0, 0.32, 0, 1, 0, 0.4610, 0, 1, 0
1, 0.55, 1, 0, 0, 0.6270, 1, 0, 0
1, 0.25, 0, 0, 1, 0.2620, 0, 0, 1
1, 0.33, 0, 0, 1, 0.3730, 0, 0, 1
0, 0.29, 0, 1, 0, 0.4620, 1, 0, 0
1, 0.65, 1, 0, 0, 0.7270, 1, 0, 0
0, 0.43, 0, 1, 0, 0.5140, 0, 1, 0
0, 0.54, 0, 1, 0, 0.6480, 0, 0, 1
1, 0.61, 0, 1, 0, 0.7270, 1, 0, 0
1, 0.52, 0, 1, 0, 0.6360, 1, 0, 0
1, 0.30, 0, 1, 0, 0.3350, 0, 0, 1
1, 0.29, 1, 0, 0, 0.3140, 0, 0, 1
0, 0.47, 0, 0, 1, 0.5940, 0, 1, 0
1, 0.39, 0, 1, 0, 0.4780, 0, 1, 0
1, 0.47, 0, 0, 1, 0.5200, 0, 1, 0
0, 0.49, 1, 0, 0, 0.5860, 0, 1, 0
0, 0.63, 0, 0, 1, 0.6740, 0, 0, 1
0, 0.30, 1, 0, 0, 0.3920, 1, 0, 0
0, 0.61, 0, 0, 1, 0.6960, 0, 0, 1
0, 0.47, 0, 0, 1, 0.5870, 0, 1, 0
1, 0.30, 0, 0, 1, 0.3450, 0, 0, 1
0, 0.51, 0, 0, 1, 0.5800, 0, 1, 0
0, 0.24, 1, 0, 0, 0.3880, 0, 1, 0
0, 0.49, 1, 0, 0, 0.6450, 0, 1, 0
1, 0.66, 0, 0, 1, 0.7450, 1, 0, 0
0, 0.65, 1, 0, 0, 0.7690, 1, 0, 0
0, 0.46, 0, 1, 0, 0.5800, 1, 0, 0
0, 0.45, 0, 0, 1, 0.5180, 0, 1, 0
0, 0.47, 1, 0, 0, 0.6360, 1, 0, 0
0, 0.29, 1, 0, 0, 0.4480, 1, 0, 0
0, 0.57, 0, 0, 1, 0.6930, 0, 0, 1
0, 0.20, 1, 0, 0, 0.2870, 0, 0, 1
0, 0.35, 1, 0, 0, 0.4340, 0, 1, 0
0, 0.61, 0, 0, 1, 0.6700, 0, 0, 1
0, 0.31, 0, 0, 1, 0.3730, 0, 1, 0
1, 0.18, 1, 0, 0, 0.2080, 0, 0, 1
1, 0.26, 0, 0, 1, 0.2920, 0, 0, 1
0, 0.28, 1, 0, 0, 0.3640, 0, 0, 1
0, 0.59, 0, 0, 1, 0.6940, 0, 0, 1