For the past dozen months or so, I’ve been working with neural network libraries including TensorFlow, Keras, CNTK, and PyTorch. As a mental exercise, I decided to implement a neural network from scratch, using just raw Python and NumPy.
I’ve done this before. In fact, until about three years ago, implementing a neural network from scratch was just about the only option available because the neural libraries didn’t exist.
The exercise was a lot of fun. My implementation used online training — processing one line of training data at a time. Using mini-batch training is quite a bit more complicated so I’ll save that for another day. Additionally, I used mean squared error rather than the more common cross entropy error, mostly because back-propagation with MSE is a bit easier to understand than back-propagation with cross entropy.
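The idea of online training with mean squared error can be sketched in just a few lines of NumPy. This is a minimal illustration, not my actual implementation: a made-up 2-3-1 network with tanh hidden nodes, an identity output node, and a toy regression target, where the weights are updated after every single training item.

```python
import numpy as np

# A minimal sketch of online (one-item-at-a-time) training with mean
# squared error. The 2-3-1 architecture, toy data, learning rate, and
# epoch count here are all made up for illustration.
rng = np.random.default_rng(0)
W1 = rng.uniform(-0.1, 0.1, (2, 3))   # input-to-hidden weights
b1 = np.zeros(3)                      # hidden biases
W2 = rng.uniform(-0.1, 0.1, (3, 1))   # hidden-to-output weights
b2 = np.zeros(1)                      # output bias
lr = 0.05                             # learning rate

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = X.sum(axis=1, keepdims=True)      # toy target: sum of the two inputs

for epoch in range(2000):
    for x_i, t_i in zip(X, y):        # online: update after each item
        h = np.tanh(x_i @ W1 + b1)    # hidden layer activations
        o = h @ W2 + b2               # network output (identity activation)
        # MSE on one item: E = 0.5 * (o - t)^2, so dE/do = (o - t)
        grad_o = o - t_i
        grad_h = (W2 @ grad_o) * (1.0 - h * h)   # tanh derivative is 1 - tanh^2
        W2 -= lr * np.outer(h, grad_o)
        b2 -= lr * grad_o
        W1 -= lr * np.outer(x_i, grad_h)
        b1 -= lr * grad_h

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)
```

Because the gradient of MSE with respect to the output is simply (output minus target), the back-propagation step stays short and readable, which is exactly why it's the friendlier error function to learn on.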
As I was coding, I was mildly surprised to realize how much I’d learned over the past several months, while using Keras and CNTK. For example, when I initialized weights and biases, I used a Uniform random distribution, but I could have used Glorot Uniform or Glorot Normal, because I now understand those algorithms well.
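For the curious, the Glorot schemes are easy to hand-roll. The sketch below shows both variants for a single weight matrix; the function names are my own, but the formulas are the standard ones: the uniform variant samples from [-limit, +limit] with limit = sqrt(6 / (fan_in + fan_out)), and the normal variant uses standard deviation sqrt(2 / (fan_in + fan_out)).

```python
import numpy as np

# Hand-rolled Glorot (Xavier) initialization for a fan_in x fan_out
# weight matrix. Function names are illustrative, not from any library.
def glorot_uniform(fan_in, fan_out, rng=None):
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))          # uniform bound
    return rng.uniform(-limit, limit, (fan_in, fan_out))

def glorot_normal(fan_in, fan_out, rng=None):
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / (fan_in + fan_out))            # normal std dev
    return rng.normal(0.0, std, (fan_in, fan_out))

W = glorot_uniform(4, 7, np.random.default_rng(1))     # 4-input, 7-output layer
```

The intuition is that scaling by both fan-in and fan-out keeps the variance of signals roughly constant as they pass forward and backward through a layer, which plain Uniform initialization with a fixed range doesn't guarantee.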
The moral of the story is that the field of machine learning is moving very fast and it’s important to keep as up to date as possible. And that requires daily effort.

The F-104 jet fighter set a world speed record of 1404 miles per hour (2260 km/h) in 1958. That’s fast, even by the standards of modern jet fighters 60 years later.
