I wrote an article titled “Understanding Neural Network Batch Training: A Tutorial” in the August 2014 issue of Visual Studio Magazine. See http://visualstudiomagazine.com/articles/2014/08/01/batch-training.aspx.
By far the most common technique used to train a neural network is called back-propagation. But there are alternatives, including particle swarm optimization and simplex optimization. Each of these training techniques can be applied using one of two approaches, usually called "batch" and "online" training.
In batch training, in each iteration of the main training loop, all training data items are examined, and their accumulated errors (the differences between the computed output values and the actual output values) are then used to modify the network's weights and bias values.
In online training, in each iteration of the main training loop, weights and bias values are updated immediately after each individual training item is fed to the network.
Research suggests that online training is quite a bit superior to batch training when using back-propagation, but it is not clear from research results whether online training is better when using particle swarm or simplex optimization.
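The difference between the two approaches can be sketched with a toy example. The code below is not from the article; it is a minimal illustration using a single linear "neuron" trained by gradient descent, where the only difference between the two functions is when the weight and bias updates are applied.

```python
def batch_train(data, w, b, lr, epochs):
    """Batch approach: accumulate error over ALL items, then update once per pass."""
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in data:
            err = (w * x + b) - y   # computed output minus actual output
            grad_w += err * x
            grad_b += err
        n = len(data)
        w -= lr * grad_w / n        # single update per pass through the data
        b -= lr * grad_b / n
    return w, b

def online_train(data, w, b, lr, epochs):
    """Online approach: update weights immediately after EACH item."""
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x       # update happens inside the item loop
            b -= lr * err
    return w, b
```

On a small noiseless data set such as points from y = 2x + 1, both functions converge to roughly w = 2, b = 1, but the online version performs many more (smaller) updates per pass, which is one intuition for why it can outperform batch training with back-propagation.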
