Overfitting (Intuition) Math Example 4
Follow the full solution, then compare it with the other examples linked below.
Example 4
(hard) Explain the bias-variance tradeoff: how does increasing model complexity affect bias and variance, and where is the optimal model?
Solution
1. Underfitting (high bias): a simple model fails to capture the true relationship, producing systematic errors on both training and test data.
2. Overfitting (high variance): a complex model fits the noise in the training data, so small changes in the training set cause large changes in the fitted model.
3. As complexity increases, bias decreases while variance increases.
4. Optimal model: minimizes total error = bias² + variance (plus irreducible noise); typically a model of intermediate complexity, found via cross-validation (see the simulation sketch below).
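To make step 4 concrete, here is a minimal Python sketch (not part of the original solution) that estimates squared bias and variance empirically by refitting polynomials of several degrees on many freshly drawn training sets. The sine ground truth, noise level, and sample sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # Assumed ground-truth function (illustrative choice)
    return np.sin(2 * np.pi * x)

x_grid = np.linspace(0.0, 1.0, 50)        # fixed evaluation points
n_datasets, n_points, noise_sd = 200, 20, 0.3

for degree in (1, 3, 9):
    preds = np.empty((n_datasets, x_grid.size))
    for i in range(n_datasets):
        # Draw a fresh noisy training set each round
        x = rng.uniform(0.0, 1.0, n_points)
        y = true_f(x) + rng.normal(0.0, noise_sd, n_points)
        preds[i] = np.polyval(np.polyfit(x, y, degree), x_grid)
    mean_pred = preds.mean(axis=0)                        # average fit over datasets
    bias_sq = np.mean((mean_pred - true_f(x_grid)) ** 2)  # squared bias
    variance = np.mean(preds.var(axis=0))                 # variance of the fits
    print(f"degree {degree}: bias^2 = {bias_sq:.4f}, variance = {variance:.4f}")
```

With these settings, degree 1 typically shows high bias and low variance while degree 9 shows the reverse, matching step 3.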
Answer
Increasing complexity reduces bias but increases variance; the optimal model balances the two, minimizing total prediction error.
The bias-variance tradeoff is a fundamental concept in machine learning: simpler models have high bias (underfitting), while more complex models have high variance (overfitting). The sweet spot is the model complexity that minimizes the sum of squared bias and variance.
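For squared-error loss the tradeoff is exact: writing y = f(x) + ε with E[ε] = 0 and Var(ε) = σ², the expected test error at a point x decomposes as

```latex
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
  = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
  + \underbrace{\operatorname{Var}\big(\hat{f}(x)\big)}_{\text{variance}}
  + \underbrace{\sigma^2}_{\text{irreducible noise}}
```

where the expectation is taken over both the noise and the random training set.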
About Overfitting (Intuition)
Overfitting occurs when a model learns the noise in training data instead of just the underlying pattern, performing well on training data but poorly on new data.
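One quick way to see this symptom in code: the hypothetical sketch below fits a low-degree and a high-degree polynomial to the same small noisy sample and compares training and test error. The quadratic trend, sample sizes, and degrees are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 15 noisy samples of a quadratic trend
x_train = np.sort(rng.uniform(-1.0, 1.0, 15))
y_train = x_train ** 2 + rng.normal(0.0, 0.1, 15)
x_test = np.sort(rng.uniform(-1.0, 1.0, 200))
y_test = x_test ** 2 + rng.normal(0.0, 0.1, 200)

for degree in (2, 10):
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {mse_train:.4f}, test MSE {mse_test:.4f}")
```

The high-degree fit typically drives training MSE toward zero while test MSE grows, which is exactly the train/test gap described above.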
More Overfitting (Intuition) Examples
Example 1 (medium): A model fits 10 data points with a degree-9 polynomial (perfect fit, [formula]). A simpler linear mo…
Example 2 (hard): A machine learning model is trained on 1000 observations with 50 predictors. Training error is near…
Example 3 (easy): A student memorizes all 500 practice problems but performs poorly on the exam, which has new problem…