BUAD 310g Chapter Notes - Chapter 13: Multicollinearity, Autocorrelation, Homoscedasticity
Document Summary
Multiple regression includes several independent variables, whereas simple regression contains only one. Statisticians make no sharp distinction between the two: simple regression is a special case of multiple regression.
Example: deciding how much to sell your home for. Multiple relationships usually exist, so estimates are biased if relevant predictors are omitted; a lack of fit does not show that x is unrelated to y if the true model is multivariate.
State a hypothesis about the sign of each coefficient before fitting. The regression equation can be estimated by a statistical package, e.g., Excel. It is risky to use predictor values outside the observed predictor range, since that would be extrapolating beyond the data.
When two explanations are otherwise equivalent, we prefer the simpler, more parsimonious one.
SST (total variation) = SSR (variation explained by the regression) + SSE (unexplained error). When Fcalc = MSR/MSE is close to 1, MSR and MSE are close in magnitude, meaning none of the predictors provides a good predictive model for y.
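The sum-of-squares decomposition and the F statistic above can be sketched numerically. This is a minimal illustration using synthetic data and hypothetical predictors x1 and x2 (none of these names or values come from the chapter); it fits a multiple regression by ordinary least squares and verifies that SST = SSR + SSE and that Fcalc = MSR/MSE.

```python
import numpy as np

# Hedged sketch: synthetic data with two made-up predictors, x1 and x2.
rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 10, n)
y = 3 + 2 * x1 - 1.5 * x2 + rng.normal(0, 1, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Sum-of-squares decomposition: SST = SSR + SSE.
sst = np.sum((y - y.mean()) ** 2)       # total variation
ssr = np.sum((y_hat - y.mean()) ** 2)   # explained by the regression
sse = np.sum((y - y_hat) ** 2)          # unexplained error

# F statistic with k = 2 predictors: MSR / MSE.
k = 2
msr = ssr / k
mse = sse / (n - k - 1)
f_calc = msr / mse

print("SST - (SSR + SSE):", round(sst - (ssr + sse), 6))
print("Fcalc:", f_calc)
```

Because the synthetic data were generated with strong true coefficients, Fcalc here comes out far above 1; with pure noise for y, MSR and MSE would be close in magnitude and Fcalc would be near 1, matching the note above.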