I’m working on an article for Visual Studio Magazine on logistic regression. One of the most common mistakes I see on the Internet regarding logistic regression concerns the idea of linear separability. The topic is tricky, but briefly, a correct statement is that logistic regression can score 100% accuracy only on data that is linearly separable. A common error in many references is the suggestion that logistic regression can classify data that has an “S” shape.
For simplicity, suppose you have two predictor variables, x0 and x1, and each data item is class 0 or class 1. (The explanation applies to any number of predictor variables.) The goal of logistic regression is to create a math function that classifies a data item as either class 0 or class 1. As a concrete example, think of the problem of classifying a person as male (0) or female (1) based on height (x0) and weight (x1).
To perform logistic regression, you compute
z = (w0 * x0) + (w1 * x1) + b
p = 1.0 / (1.0 + exp(-z))
where w0 and w1 are numeric constants called weights and b is a numeric constant called the bias. The exp(v) function returns Euler’s number (approximately 2.718) raised to the power v. The resulting p value will always be between 0.0 and 1.0. If p is less than 0.5 the item is classified as class 0; otherwise the item is classified as class 1.
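In code, the computation looks like the following short Python sketch. The weight and bias values here are made up purely for illustration — real values come from training on data:

```python
import math

def predict(x0, x1, w0, w1, b):
    z = (w0 * x0) + (w1 * x1) + b    # linear combination of predictors plus bias
    p = 1.0 / (1.0 + math.exp(-z))   # logistic sigmoid squashes z into (0.0, 1.0)
    return p

# hypothetical weights and bias, for illustration only
p = predict(70.0, 150.0, 0.05, -0.01, -1.0)
print(p)                   # z = 1.0, so p = sigmoid(1.0), about 0.7311
c = 0 if p < 0.5 else 1    # p >= 0.5, so predicted class is 1
```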
The function f(z) = 1.0 / (1.0 + exp(-z)) is called the logistic sigmoid function. It accepts any value, from negative infinity to positive infinity, and returns a value between 0.0 and 1.0. If you graph logistic sigmoid, you get an “S” shaped curve. This is why some machine learning beginners incorrectly believe logistic regression can perfectly classify data that follows an “S” shape. However, the key is that the z function is a linear function, not that the logistic sigmoid function is non-linear. Because sigmoid is monotonically increasing and f(0) = 0.5, an item is classified as class 1 exactly when z >= 0, so the decision boundary is the straight line (w0 * x0) + (w1 * x1) + b = 0. (Note: complicating matters is the fact that “linear” has several different meanings in mathematics.)
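A tiny numerical sketch can confirm that the classification depends only on the sign of z, and that the z = 0 boundary is a straight line (the weight and bias values here are arbitrary, chosen only for illustration):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# sigmoid is monotonically increasing and sigmoid(0) = 0.5,
# so p >= 0.5 exactly when z >= 0
assert sigmoid(0.0) == 0.5
assert sigmoid(2.3) > 0.5 and sigmoid(-2.3) < 0.5

# the predicted class depends only on the sign of
# z = (w0 * x0) + (w1 * x1) + b; the decision boundary
# z = 0 is a straight line in (x0, x1) space
w0, w1, b = 1.0, -2.0, 0.5       # arbitrary illustrative values
x0 = 3.0
x1 = -(w0 * x0 + b) / w1         # solve w0*x0 + w1*x1 + b = 0 for x1
z = (w0 * x0) + (w1 * x1) + b    # a point exactly on the boundary
print(sigmoid(z))                # p = 0.5 on the boundary line
```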

Artificial data that follows the logistic sigmoid function.

Logistic regression creates a linear separating line and so it can’t perfectly classify the data. The best it can do is 70% accuracy on the artificial data (which may be good enough in some scenarios).
I coded up a demo to illustrate the idea that logistic regression cannot perfectly classify data that’s not linearly separable. In the graph above, the black line is the pure mathematical logistic sigmoid function. I created 60 artificial data points. The 30 red data points are class 0 and are each slightly less (by 0.02) than their pure logistic sigmoid value. The 30 green data points are class 1 and are each slightly greater (by 0.02) than their pure logistic sigmoid value.
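Data of this kind can be generated with a few lines of Python. The exact x range used in the demo isn’t stated, so the even spread over [-6, +6] below is my assumption:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

n = 30
xs = [-6.0 + (12.0 * i / (n - 1)) for i in range(n)]   # assumed range [-6, +6]
class0 = [(x, sigmoid(x) - 0.02) for x in xs]  # 30 red points, just below the curve
class1 = [(x, sigmoid(x) + 0.02) for x in xs]  # 30 green points, just above the curve

data = class0 + class1        # 60 items, two predictors (x, y) each
labels = [0] * n + [1] * n    # class 0 points first, then class 1
```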
I ran logistic regression on the data and the best possible model scores only 70.00% accuracy. That’s because there’s no straight line that will perfectly separate the two classes. If logistic regression could create an “S” shaped model, it’d be easy to score 100% accuracy.
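Here is a rough sketch of that kind of experiment using plain gradient-descent training with NumPy. The x range, training method, and hyperparameters are my assumptions, so the result won’t necessarily match the 70.00% figure, but because no straight line separates the classes, the accuracy must come out below 100%:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# recreate similar artificial data (the x range is an assumption)
xs = np.linspace(-6.0, 6.0, 30)
X = np.vstack((np.column_stack((xs, sigmoid(xs) - 0.02)),   # class 0, below curve
               np.column_stack((xs, sigmoid(xs) + 0.02))))  # class 1, above curve
y = np.array([0.0] * 30 + [1.0] * 30)

# plain gradient descent on log loss; hyperparameters are illustrative
w = np.zeros(2)
b = 0.0
lr = 0.10
for _ in range(20_000):
    p = sigmoid(X @ w + b)            # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)   # gradient of log loss w.r.t. weights
    grad_b = np.mean(p - y)           # gradient w.r.t. bias
    w -= lr * grad_w
    b -= lr * grad_b

preds = (sigmoid(X @ w + b) >= 0.5).astype(float)
acc = np.mean(preds == y)
print(acc)   # less than 1.0 -- no straight line separates the classes
```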
The moral of the story is that logistic regression is simple and often works fairly well. But logistic regression isn’t magic and it can only perfectly classify linearly separable data. (And another confusing detail is that logistic regression has difficulty with data that is perfectly linearly separable, but that’s another topic).

Chang was the stage name of magician Juan Jesorum (1890 – 1972). He was born in Panama but took on a Chinese persona, which was a very common thing for magicians in the early 1900s. He performed all over the world but was especially popular in South America and Spain because he could speak Spanish. I have several original posters of his touring show. The show also featured a Western magician who took on the Chinese persona Fak-Hong but I’ve never found any information about him.