
Logistic Regression: Understanding Our S-Curve

Reading time: ~10 min

Reading a Logistic Regression formula is slightly trickier than reading a standard straight-line formula. Because our line was bent into an S-curve, the coefficients no longer read like ordinary numbers: they represent Log-Odds.

Before we can intuitively understand Log-Odds, we have to understand plain old Odds!

What are "Odds"?

"Odds" are just another way to talk about probability! It's the ratio of something happening compared to it not happening.

For example, if the probability of a sunny day is 0.75 (75%), the probability of a rainy day is 0.25 (25%). The odds of a sunny day are simply 0.75 / 0.25 = 3. In sports or statistics terms, the odds of sunshine are 3 to 1!
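The sunshine example above can be sketched as a tiny helper function (the name `odds` is just for illustration):

```python
def odds(p):
    """Ratio of an event happening to it not happening."""
    return p / (1 - p)

# A 75% chance of sunshine gives odds of 3 (i.e. "3 to 1").
print(odds(0.75))
```

Try plugging in 0.5: even chances give odds of exactly 1, or "1 to 1".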


If there is a 50% chance of rain, what are the exact odds of rain?

1 to 1 (Even odds!)
50 to 1
0 to 1

By taking the mathematical "logarithm" of the odds, we get log-odds. Our computer calculates everything under the hood using log-odds, but for us humans, we always convert the numbers back to normal odds or percentages because they are much easier to read!
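Here is a minimal sketch of that round trip: taking the logarithm of the odds to get log-odds, then squashing a log-odds value back into a probability with the logistic (sigmoid) function. The function names are just for illustration:

```python
import math

def log_odds(p):
    """What the model works with under the hood."""
    return math.log(p / (1 - p))

def to_probability(z):
    """The logistic (sigmoid) function: log-odds back to a percentage."""
    return 1 / (1 + math.exp(-z))

z = log_odds(0.75)
print(round(z, 3))            # the log-odds of a 75% chance
print(to_probability(z))      # ...and back to roughly 0.75
```

Note that a 50% chance corresponds to log-odds of exactly 0, which is why the S-curve crosses its halfway point there.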

Select a tab to see exactly how we interpret our model's coefficients!

Yes/No Features

Example: Predicting if it will be Sunny based on if it's Foggy in the morning.

If our feature is a simple Yes/No (1 or 0), our coefficient tells us how much the odds of our prediction change when the feature is "Yes".

If our math calculates a coefficient that translates to 0.5 on the odds scale, it indicates that the odds of a sunny day are cut in half (multiplied by 0.5) whenever a morning is foggy! The computer has successfully learned that fog hurts the chances of sunshine.
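Concretely, the conversion from a raw coefficient to an odds multiplier is just exponentiation. Below is a sketch with a made-up coefficient value chosen so it lands near 0.5 on the odds scale:

```python
import math

# Hypothetical fitted coefficient for the "foggy morning" (0/1) feature,
# on the log-odds scale.
coef_fog = -0.693

# exp() converts log-odds into an odds multiplier.
odds_ratio = math.exp(coef_fog)
print(round(odds_ratio, 2))  # close to 0.5: fog cuts the odds of sunshine in half
```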

Number Features

Example: Predicting if it will be Sunny based on the exact Temperature.

What if our feature is a continuous number, like 75 degrees?

In this case, our coefficient tells us how much the odds change for every single step forward. If the math gives us a coefficient that translates to 2, it means that for every 1-degree increase in temperature, the odds of a sunny day double (multiply by 2)! It acts exactly like the slope we learned about in Linear Regression.
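One subtlety worth seeing in code: because each degree multiplies the odds, steps compound. A sketch with a made-up coefficient chosen to translate to roughly 2 on the odds scale:

```python
import math

coef_temp = 0.693  # hypothetical fitted coefficient, in log-odds per degree

per_degree = math.exp(coef_temp)            # odds multiplier per 1 degree: about 2
per_five_degrees = math.exp(5 * coef_temp)  # multipliers compound: about 2**5 = 32
print(round(per_degree, 2), round(per_five_degrees, 1))
```

So a 5-degree warm-up doesn't add to the odds five times; it multiplies them five times over.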

Multiple Features

Example: Predicting if it will be Sunny based on Temperature AND Fog.

Real-world models look at many features at the exact same time! We call this a multivariate model.

We interpret the numbers identically to the previous tabs, with one incredibly important catch: we assume all other factors are completely "frozen".

So, if we look at the temperature coefficient, we are reading the change in the odds of sunshine for every degree the temperature rises, assuming the fogginess stays exactly the same.
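The "frozen" idea can be checked directly: if we hold fog fixed and nudge temperature by one degree, the odds change by exactly exp of the temperature coefficient. A minimal sketch with made-up coefficient values:

```python
import math

def sunshine_odds(temp, foggy, intercept=-5.0, b_temp=0.08, b_fog=-0.7):
    """Odds of sunshine from a hypothetical two-feature model (coefficients on the log-odds scale)."""
    z = intercept + b_temp * temp + b_fog * foggy
    return math.exp(z)

# Fog held frozen at "yes"; temperature rises by one degree.
ratio = sunshine_odds(76, foggy=1) / sunshine_odds(75, foggy=1)
print(round(ratio, 4))           # matches exp(b_temp)
print(round(math.exp(0.08), 4))
```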

And just like that, you know how to read your very first Logistic Regression classification model!

Sina