Logistic Regression
Understanding Our S-Curve
Reading a logistic regression formula is slightly trickier than reading a standard straight-line formula. Because our line was bent into an S-curve, the coefficients no longer represent ordinary numbers: they represent log-odds.
Before we can intuitively understand Log-Odds, we have to understand plain old Odds!
What are "Odds"?
"Odds" are just another way to talk about probability! It's the ratio of something happening compared to it not happening.
For example, if the probability of a sunny day is
If there is a 50% chance of rain, what are the exact odds of rain?
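To check your answer, here's a tiny sketch of the odds calculation (the helper name `odds` is ours, just for illustration):

```python
def odds(p):
    """Odds = probability the event happens / probability it doesn't."""
    return p / (1 - p)

print(odds(0.5))   # 50% chance of rain -> odds of 1.0, i.e. 1-to-1 (even odds)
print(odds(0.75))  # 75% chance of sun  -> odds of 3.0, i.e. 3-to-1
```

So a 50% chance of rain means even odds: rain is exactly as likely as no rain.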
By taking the mathematical "logarithm" of the odds, we get log-odds. Our computer calculates everything under the hood using log-odds, but for us humans, we always convert the numbers back to normal odds or percentages because they are much easier to read!
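As a quick sketch of that round trip, `math.log` takes us into log-odds and `math.exp` brings us back out (the numbers here are just the 3-to-1 example from above):

```python
import math

log_odds = math.log(3.0)          # odds of 3-to-1 -> log-odds of about 1.0986
odds = math.exp(log_odds)         # exponentiating converts log-odds back to odds
probability = odds / (1 + odds)   # and odds convert back to a probability

print(round(odds, 2))             # 3.0
print(round(probability, 2))      # 0.75
```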
Select a tab to see exactly how we interpret our model's coefficients!
Yes/No Features
Example: Predicting if it will be Sunny based on if it's Foggy in the morning.
If our feature is a simple Yes/No (1 or 0), our coefficient tells us how much the odds of our prediction change when the feature is "Yes".
Whatever coefficient the math calculates, we convert it out of log-odds (by exponentiating it) and read it as a multiplier on the odds: a multiplier of 2, for example, would mean that foggy mornings double the odds of sunshine compared to clear mornings.
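Here's a minimal sketch of that conversion. The intercept and coefficient values are made up for illustration, not fitted from real weather data:

```python
import math

# Hypothetical fitted model (numbers are illustrative):
# log-odds(sunny) = intercept + coef_foggy * foggy
intercept = -0.5
coef_foggy = 0.8   # coefficient for the Yes/No "foggy morning" feature

# Exponentiating a coefficient converts it from log-odds to an odds multiplier.
odds_multiplier = math.exp(coef_foggy)
print(f"Foggy mornings multiply the odds of sun by {odds_multiplier:.2f}x")

# Check it directly: compare the odds when foggy = 1 versus foggy = 0.
odds_clear = math.exp(intercept)               # foggy = 0
odds_foggy = math.exp(intercept + coef_foggy)  # foggy = 1
print(round(odds_foggy / odds_clear, 2))       # the same multiplier
```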
Number Features
Example: Predicting if it will be Sunny based on the exact Temperature.
What if our feature is a continuous number, like 75 degrees?
In this case, our coefficient tells us how much the odds change for every single step forward. Whatever coefficient the math gives us, exponentiating it yields a per-unit multiplier: a multiplier of 1.1, for example, would mean every extra degree multiplies the odds of sunshine by 1.1, and those multipliers compound as the temperature keeps climbing.
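A short sketch of the per-degree reading, again with an illustrative (not fitted) coefficient:

```python
import math

# Hypothetical coefficient for Temperature (illustrative, not fitted):
coef_temp = 0.1  # change in the log-odds of sun per extra degree

per_degree = math.exp(coef_temp)       # odds multiplier for +1 degree
per_ten = math.exp(coef_temp * 10)     # multiplier for +10 degrees

print(round(per_degree, 3))            # about 1.105
print(round(per_ten, 3))               # about 2.718: per_degree ** 10, compounded
```

Notice that log-odds add while odds multiply, which is why ten one-degree steps compound into `per_degree ** 10` rather than `per_degree * 10`.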
Multiple Features
Example: Predicting if it will be Sunny based on Temperature AND Fog.
Real-world models look at many features at the exact same time! We call this a multivariate model.
We interpret the numbers identically to the previous tabs, with one incredibly important catch: we assume all other factors are completely "frozen".
So, if we look at the temperature coefficient, we are reading the change in the odds of sunshine for every degree the temperature rises, assuming the fogginess stays exactly the same.
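The "frozen" reading can be checked numerically. In this sketch all the coefficients are hypothetical, but the point holds for any logistic model: the per-degree odds multiplier is the same whether fog is held at 0 or at 1:

```python
import math

# Hypothetical multivariate model (coefficients are illustrative):
# log-odds(sunny) = b0 + b_temp * temperature + b_fog * foggy
b0, b_temp, b_fog = -6.0, 0.1, 0.8

def odds_of_sun(temperature, foggy):
    return math.exp(b0 + b_temp * temperature + b_fog * foggy)

# Raise the temperature by 1 degree while fog stays "frozen" at the same value:
ratio_foggy = odds_of_sun(71, 1) / odds_of_sun(70, 1)
ratio_clear = odds_of_sun(71, 0) / odds_of_sun(70, 0)
print(round(ratio_foggy, 3), round(ratio_clear, 3))  # both equal exp(b_temp)
```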
And just like that, you know how to read your very first Logistic Regression classification model!