
Underfitting and Overfitting

Introduction

Reading time: ~5 min

Have you ever met someone who always jumps to conclusions without looking at the facts? Or someone who overthinks every tiny detail until they completely miss the big picture?

Machine learning models suffer from the exact same personality flaws! When an AI jumps to massive conclusions using too little information, it suffers from Bias. When it overthinks and memorizes every single useless detail, it suffers from Variance.

Any time an AI makes a prediction error, that mistake can almost always be traced to a tug-of-war between these two extremes. If a model is too simple, it will naturally underfit the data. If it is far too complex, it will overfit!

The Core Concept

Balancing this tug-of-war is literally the foundation of all machine learning! In this chapter, we will interactively visualize exactly what underfitting (high bias) and overfitting (high variance) really look like, and learn how to find the perfect middle ground.
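Before the interactive visualizations, here is a minimal sketch of the same idea in code. It fits polynomials of increasing degree to noisy synthetic data (the data, degrees, and variable names are all illustrative, not from this course): a straight line underfits, a moderate degree fits well, and a very high degree memorizes the noise, so its training error falls while its test error does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: noisy samples from an underlying sine curve.
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)
x_test = np.sort(rng.uniform(0, 1, 200))
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 200)

def fit_and_score(degree):
    # Fit a polynomial of the given degree, then measure
    # mean squared error on the training and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for degree in (1, 3, 15):
    train_mse, test_mse = fit_and_score(degree)
    print(f"degree {degree:2d}: train MSE = {train_mse:.3f}, "
          f"test MSE = {test_mse:.3f}")
```

Degree 1 is the high-bias extreme (it cannot bend to follow the sine curve at all), while degree 15 is the high-variance extreme (with only 20 training points, it has enough freedom to chase the noise). The "perfect middle ground" this chapter looks for is a degree in between.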
