Somewhat unusually, I can get a lot of technical work done while I travel. When I'm sitting in an airport terminal or on a plane, there are no distractions and I do what I call mental coding.
On a recent trip, for some reason, the topic of Naive Bayes classification popped into my head. I spent the next couple of hours mentally coding up a minimal implementation, with a little bit of help from an airline paper napkin and a pen.
When I got back home, I opened up my laptop and coded my mental implementation using Python. There were several glitches of course, but I had all the main ideas correct.
For my demo, I created a 40-item dummy data file that looks like:
'A', 'R', 'T', '1'
'C', 'R', 'S', '0'
'Z', 'L', 'M', '0'
'Z', 'R', 'M', '1'
. . .
There are three predictor values followed by a 0 or a 1. The goal is to classify data as 0 or 1. The first variable can be one of (A, C, E, Z). The second variable can be one of (L, R). The third variable can be one of (S, M, T). There are versions of Naive Bayes that work with numeric predictor data, but the simplest form works with categorical predictor values. My demo is binary classification, but Naive Bayes easily extends to multiclass classification.
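A file in this format is easy to read with plain Python. Here is a minimal sketch of a parser, assuming the fields are single-quoted and comma-delimited as shown above; the function name is my own, not from the demo.

```python
# Sketch: parse lines of the form  'A', 'R', 'T', '1'
# into (predictors, label) pairs. Hypothetical helper, not the demo code.
def load_data(lines):
    data = []
    for line in lines:
        # strip whitespace and the surrounding single quotes from each field
        fields = [tok.strip().strip("'") for tok in line.split(',')]
        preds = fields[:3]        # three categorical predictor values
        label = int(fields[3])    # class label, 0 or 1
        data.append((preds, label))
    return data

sample = ["'A', 'R', 'T', '1'", "'C', 'R', 'S', '0'"]
print(load_data(sample))  # [(['A', 'R', 'T'], 1), (['C', 'R', 'S'], 0)]
```

In a real run the lines would come from the data file rather than an in-memory list.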
An implementation of Naive Bayes is relatively short but quite deep, and a full explanation would take several pages. Briefly, joint counts (such as the count of items with both 'E' and '0') are computed along with counts of the dependent values (0 and 1), and the counts are combined according to Bayes' theorem to yield probabilities.
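The count-and-combine scheme can be sketched in a few lines of Python. This is a minimal illustration, not the demo program: the tiny dataset is made up (it is not the 40-item file), the function name is my own, and I use simple add-one smoothing to avoid zero joint counts, which the demo may handle differently.

```python
# Minimal categorical Naive Bayes sketch. The tiny dataset below is
# made up for illustration -- it is NOT the demo's 40-item file.
data = [
    ('A', 'R', 'T', 1), ('C', 'R', 'S', 0),
    ('Z', 'L', 'M', 0), ('Z', 'R', 'M', 1),
    ('E', 'R', 'T', 1), ('E', 'L', 'S', 0),
    ('A', 'L', 'M', 0), ('E', 'R', 'M', 1),
]

def naive_bayes(x, data, n_classes=2):
    n = len(data)
    # counts of the dependent values (how many 0s, how many 1s)
    class_counts = [sum(1 for row in data if row[-1] == c)
                    for c in range(n_classes)]
    # number of distinct values each predictor takes (for smoothing)
    n_vals = [len(set(row[j] for row in data)) for j in range(len(x))]
    evidence = []
    for c in range(n_classes):
        e = class_counts[c] / n  # prior P(class = c)
        for j, v in enumerate(x):
            # joint count: items with predictor value v AND class c
            joint = sum(1 for row in data if row[j] == v and row[-1] == c)
            # add-one (Laplace) smoothing avoids zero probabilities
            e *= (joint + 1) / (class_counts[c] + n_vals[j])
        evidence.append(e)
    total = sum(evidence)
    return [e / total for e in evidence]  # normalized pseudo-probabilities

probs = naive_bayes(('E', 'R', 'T'), data)
print(probs)  # the larger value determines the predicted class
```

Because the evidence terms are products of counts-based fractions, a production version would typically work with sums of logs to avoid arithmetic underflow, but for three predictors the direct product is fine.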
One important implementation factor is the tradeoff between a specific implementation for a given problem and a completely general implementation that can be applied to most problems. I opted, as I usually do, for a mostly specific, non-general implementation.
In my demo, I classified inputs (E, R, T). The result is a pair of values (0.3855, 0.6145) which loosely represent the probabilities that the input is class 0 or class 1. Because the second value is larger, the prediction is class 1.

I have always been fascinated by model train buildings. Some are truly works of art. Most of the people I’m comfortable hanging out with enjoy modeling reality in some way (machine learning, art, games, etc.)
