Built with Mathigon


Testing Your AI: See For Yourself

Reading time: ~5 min

To make K-Fold Cross-Validation crystal clear, go ahead and test it yourself! The interactive simulation below uses a simple linear regression model.

Try dragging the slider to change the value of k — the number of folds used in the Cross-Validation. Watch what happens as the original dataset gets sliced and rotated:

Because each "fold" trains the model on a slightly different combination of data points, every single fold produces a slightly different trendline. The final test score doesn't come from any one fold — it is computed by aggregating (typically averaging) the performance across all of the folds!
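The fold-and-average procedure above can be sketched in a few lines of plain NumPy. This is a minimal illustration, not the simulation's actual code: the synthetic data, the `k_fold_mse` helper, and the use of mean squared error as the score are all assumptions made for the example.

```python
import numpy as np

# Synthetic data for illustration: y is roughly a line with noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0.0, 1.0, 40)

def k_fold_mse(x, y, k):
    """Hypothetical helper: fit a line on k-1 folds, score on the held-out fold."""
    folds = np.array_split(np.arange(len(x)), k)
    scores = []
    for i, test_idx in enumerate(folds):
        # Train on every fold except fold i ("rotating" the held-out slice).
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        slope, intercept = np.polyfit(x[train_idx], y[train_idx], 1)
        pred = slope * x[test_idx] + intercept
        scores.append(float(np.mean((pred - y[test_idx]) ** 2)))
    return scores

scores = k_fold_mse(x, y, 5)
print("fold MSEs:", np.round(scores, 2))
print("final score (average):", round(float(np.mean(scores)), 2))
```

Each fold's trendline (its `slope` and `intercept`) differs slightly, but no single fold's score is the answer — the mean over all k folds is.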

A Hidden Tradeoff!

While playing around with the simulation, you may have observed something interesting: the lines of best fit jump around far more for low values of k than they do for higher values.
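One way to quantify that "jumpiness" is to compare how much the fitted slopes spread out across folds for a small k versus a large k. This sketch reuses the same kind of synthetic data as above; the `fold_slopes` helper and the choice of standard deviation as the spread measure are assumptions for illustration.

```python
import numpy as np

# Synthetic noisy linear data for illustration.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 60)
y = 2.0 * x + 1.0 + rng.normal(0.0, 2.0, 60)

def fold_slopes(x, y, k):
    """Hypothetical helper: the fitted slope from each of the k training sets."""
    folds = np.array_split(np.arange(len(x)), k)
    slopes = []
    for i in range(k):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        slope, _ = np.polyfit(x[train_idx], y[train_idx], 1)
        slopes.append(float(slope))
    return slopes

# Spread of the trendline slopes: low k vs. high k.
for k in (2, 10):
    spread = np.std(fold_slopes(x, y, k))
    print(f"k={k:2d}: slope spread = {spread:.4f}")
```

With low k, each training set is a much smaller slice of the data, so each fitted line depends more heavily on which points happened to land in it — the effect you saw in the simulation.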

This behavior isn't an optical illusion. It is a direct consequence of our old archnemesis: the Bias-Variance Tradeoff. Let's break down exactly why this happens in the next section!

Sina