Answer:
1. The sum of the residuals is always close to zero.
2. The coefficient of determination measures how much of the variation in the y-values is explained by the regression line.
3. In the equation of the least-squares regression line, ŷ is a predicted value when x is known.
Step-by-step explanation:
The least-squares regression line is the line that makes the squared vertical distances from the data points to the line as small as possible, so it is the line of best fit. Given a collection of paired values shown in a scatter plot, this is the single line that best fits those points.
Because the line minimizes the sum of the squared errors, the residuals sum to zero (or very close to it). R² measures the extent to which the variation in the y-values is explained by the regression line, and in the equation of this line, ŷ is the predicted value when x is known.
Hence, the true statements from the options given are (a short numeric sketch follows the list):
- Sum of residuals is always close to 0
- The coefficient of determination measures how much of the variation in the y-values is explained by the regression line.
- ŷ (y-hat) is the predicted value when x is known.
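As a quick check on these statements, here is a minimal sketch in Python using numpy; the data set is made up purely for illustration. It fits a least-squares line and shows that the residuals sum to essentially zero:

```python
import numpy as np

# Hypothetical data (made up for illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Least-squares line y_hat = b*x + c (np.polyfit minimizes the squared error)
b, c = np.polyfit(x, y, 1)
y_hat = b * x + c

# Residuals are observed y minus predicted y_hat; for a least-squares line they sum to ~0
residuals = y - y_hat
print("slope b =", round(b, 3), ", intercept c =", round(c, 3))
print("sum of residuals =", round(residuals.sum(), 10))
```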
The least-squares regression line is used to create a linear model of the form ŷ = bx + c.
The coefficient of determination, R², gives the proportion of the variation in y explained by the regression line.
The predicted value of y is obtained by substituting a given value of x into the least-squares regression model.
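Both of these ideas can be sketched with the same made-up data as above: the snippet computes R² from the sums of squares and then predicts ŷ for a known x (x = 6 here is an arbitrary example value).

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b, c = np.polyfit(x, y, 1)            # linear model y_hat = b*x + c
y_hat = b * x + c

# Coefficient of determination: proportion of the variation in y explained by the line
ss_res = np.sum((y - y_hat) ** 2)     # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot
print("R^2 =", round(r_squared, 4))

# Predicted value for a known x (x = 6 chosen arbitrarily)
x_new = 6.0
print("predicted y_hat at x =", x_new, ":", round(b * x_new + c, 3))
```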
Outlier values often affect the performance and accuracy of the least-squares regression line.
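To see why outliers matter, the sketch below (hypothetical data again, with one extreme point appended) refits the line with and without the outlier and compares the slopes.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit without the outlier
b_clean, c_clean = np.polyfit(x, y, 1)

# Add a single hypothetical outlier far above the trend and refit
x_out = np.append(x, 6.0)
y_out = np.append(y, 30.0)
b_out, c_out = np.polyfit(x_out, y_out, 1)

print("slope without outlier:", round(b_clean, 3))
print("slope with outlier:   ", round(b_out, 3))  # pulled upward by the outlier
```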
Learn more: https://brainly.com/question/18405415