Practice Least Squares Regression Line in Math
Use these practice problems to test your understanding after reviewing the concept explanation and worked examples.
Quick Recap
The least-squares regression line (LSRL) is the unique straight line \hat{y} = a + bx that minimizes the sum of squared vertical distances (residuals) between the observed data points and the line.
Picture a scatter plot with points spread around a general trend. The LSRL is the line that gets as close as possible to all of the points simultaneously: it is the "best" straight line through the cloud, where "best" means it minimizes the total squared prediction error.
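To see this minimization property concretely, here is a minimal sketch (using small assumed data, not tied to any particular problem below): it computes the closed-form slope and intercept, then checks that nudging the line in any direction only increases the sum of squared residuals.

```python
# Closed-form least-squares fit, then verify it beats nearby perturbed lines.
def lsrl(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    sxx = sum((x - x_bar) ** 2 for x in xs)
    sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    b = sxy / sxx          # slope
    a = y_bar - b * x_bar  # intercept
    return a, b

def sse(a, b, xs, ys):
    # Sum of squared vertical distances (residuals) from the line y = a + b*x.
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
a, b = lsrl(xs, ys)
best = sse(a, b, xs, ys)

# Any small tilt or shift of the fitted line increases the squared error.
for da in (-0.1, 0.1):
    for db in (-0.1, 0.1):
        assert sse(a + da, b + db, xs, ys) > best
```

Because the squared-error criterion is strictly convex in (a, b), the minimizer is unique, which is why every perturbed line in the loop does strictly worse.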
Example 1
(medium) Find the least-squares regression line for the points (x, y): (1,2), (2,4), (3,5), (4,4), (5,5). Use b = r \frac{s_y}{s_x} and a = \bar{y} - b\bar{x}.
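A sketch of the computation for this data, following the stated formulas b = r \frac{s_y}{s_x} and a = \bar{y} - b\bar{x} with sample statistics (n - 1 divisor):

```python
import math

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

# Sample standard deviations and correlation coefficient.
s_x = math.sqrt(sum((x - x_bar) ** 2 for x in xs) / (n - 1))
s_y = math.sqrt(sum((y - y_bar) ** 2 for y in ys) / (n - 1))
r = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / ((n - 1) * s_x * s_y)

b = r * s_y / s_x      # slope
a = y_bar - b * x_bar  # intercept
print(round(a, 3), round(b, 3))  # -> 2.2 0.6, i.e. y-hat = 2.2 + 0.6x
```

Worth checking by hand: \bar{x} = 3, \bar{y} = 4, so the fitted line \hat{y} = 2.2 + 0.6x passes through the point of means (3, 4).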
Example 2
(hard) The LSRL for predicting weight (y, kg) from height (x, cm) is \hat{y} = -100 + 0.8x. Interpret the slope and intercept, predict the weight for a height of 175 cm, and explain why extrapolating to a height of 50 cm is problematic.
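A quick sketch of the prediction step, using the fitted line given in the problem; it also evaluates the line at 50 cm to make the extrapolation danger visible:

```python
# Fitted line from the problem statement: y-hat = -100 + 0.8x.
def predict_weight(height_cm):
    return -100 + 0.8 * height_cm

# Inside a plausible data range: about 40 kg.
print(predict_weight(175))

# Far outside the observed heights: about -60 kg, a physically impossible
# weight. The line was fit to (presumably adult) data and says nothing
# reliable about 50 cm heights.
print(predict_weight(50))
```

The negative prediction is the concrete symptom of the extrapolation problem: the linear pattern estimated from the observed heights need not hold outside their range.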
Example 3
(easy) Given \hat{y} = 5 + 3x: (a) predict y when x = 4, (b) interpret the slope, (c) does the line pass through the origin?
Example 4
(hard) The LSRL has the property of minimizing \sum e_i^2 = \sum (y_i - \hat{y}_i)^2. Explain why minimizing squared residuals (rather than absolute residuals) is preferred, and name two consequences of this choice.
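Two well-known consequences of the squared-residual criterion can be checked numerically. This sketch (with small assumed data) verifies both: the residuals of the fitted line sum to zero, and the line passes through the point of means (\bar{x}, \bar{y}).

```python
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)
x_bar, y_bar = sum(xs) / n, sum(ys) / n

# Least-squares slope and intercept via the covariance/variance form.
b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
     / sum((x - x_bar) ** 2 for x in xs))
a = y_bar - b * x_bar

residuals = [y - (a + b * x) for x, y in zip(xs, ys)]

# Consequence 1: the residuals sum to zero.
assert abs(sum(residuals)) < 1e-9

# Consequence 2: the line passes through (x-bar, y-bar).
assert abs((a + b * x_bar) - y_bar) < 1e-9
```

Both facts follow from setting the partial derivatives of \sum e_i^2 with respect to a and b to zero; neither is guaranteed if absolute residuals are minimized instead.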