Least Squares Regression Line Math Example 4
Follow the full solution, then compare it with the other examples linked below.
Example 4
hard
The LSRL has the property of minimizing the sum of squared residuals, $\sum_i (y_i - \hat{y}_i)^2$. Explain why minimizing squared residuals (rather than absolute residuals) is preferred, and name two consequences of this choice.
Solution
- 1 Why squared: (1) mathematically convenient: the squared loss is differentiable, so a unique closed-form solution exists; (2) it penalizes larger errors more heavily than smaller ones, making the line more sensitive to influential points
- 2 Consequence 1: the LSRL is sensitive to outliers; a single point far from the line (large residual) has a disproportionate influence on the slope and intercept
- 3 Consequence 2: the LSRL always passes through the point of means $(\bar{x}, \bar{y})$, a result that follows mathematically from minimizing squared residuals
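Both consequences can be checked numerically. The sketch below (with made-up data values, roughly following $y = 2x$) fits the closed-form LSRL with and without a single extreme point: the slope shifts noticeably, while the line still passes through the point of means.

```python
def lsrl(xs, ys):
    """Closed-form least-squares slope and intercept."""
    n = len(xs)
    x_bar, y_bar = sum(xs) / n, sum(ys) / n
    b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
    a = y_bar - b * x_bar
    return a, b

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.1, 5.9, 8.2, 9.8]           # roughly y = 2x

a1, b1 = lsrl(xs, ys)                     # slope close to 2
a2, b2 = lsrl(xs + [6.0], ys + [30.0])    # same data plus one outlier

# Consequence 1: the outlier pulls the slope well above 2
print(round(b1, 2), round(b2, 2))

# Consequence 2: the fitted line passes through (x̄, ȳ)
x_bar = sum(xs) / len(xs)
y_bar = sum(ys) / len(ys)
assert abs((a1 + b1 * x_bar) - y_bar) < 1e-9
```

Removing or down-weighting the extreme point (as robust methods do) would bring the slope back toward 2, which is exactly the sensitivity the solution describes.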
Answer
Squared residuals: mathematical tractability + outlier sensitivity. The LSRL always passes through $(\bar{x}, \bar{y})$.
The choice of squared vs. absolute residuals has important implications. Squared residuals give a unique, computationally convenient solution but create sensitivity to outliers. Absolute residuals give a more robust line (less outlier-sensitive) but lack a closed-form solution.
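The heavier penalty on large errors can be seen directly by comparing the two loss functions on made-up residual profiles with the same total absolute error:

```python
# Two residual profiles: same total absolute error, different spread.
even  = [1.0, 1.0, 1.0, 1.0]   # four moderate residuals
spiky = [0.0, 0.0, 0.0, 4.0]   # one large residual

sae_even  = sum(abs(r) for r in even)    # sum of absolute errors
sae_spiky = sum(abs(r) for r in spiky)   # equal: absolute loss can't tell them apart
sse_even  = sum(r * r for r in even)     # sum of squared errors
sse_spiky = sum(r * r for r in spiky)    # 4x larger: squared loss strongly prefers 'even'

print(sae_even, sae_spiky, sse_even, sse_spiky)
```

Under absolute loss the two profiles are equally good; under squared loss the single large residual dominates, which is why the LSRL works hard to avoid any one big miss and is therefore outlier-sensitive.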
About Least Squares Regression Line
The unique straight line that minimizes the sum of squared vertical distances (residuals) between the observed data points and the line.
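Minimizing that sum leads to the standard closed-form coefficients (stated here for reference; $\bar{x}$ and $\bar{y}$ are the sample means):

$$\hat{b} = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad \hat{a} = \bar{y} - \hat{b}\,\bar{x},$$

so the fitted line $\hat{y} = \hat{a} + \hat{b}x$ gives $\hat{y} = \bar{y}$ at $x = \bar{x}$, which is the pass-through-the-means property used in the solution above.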
More Least Squares Regression Line Examples
Example 1 medium
Find the least-squares regression line for: [formula]: [formula]. Use [formula] and [formula].
Example 2 hardThe LSRL for predicting weight ([formula], kg) from height ([formula], cm) is [formula]. Interpret t
Example 3 easyGiven [formula]: (a) predict [formula] when [formula], (b) interpret the slope, (c) does the line pa