How do I solve this?
I. Decide whether the statement is true (T) or false (F). Each part is worth 2 points.

1. The sample covariance between the independent variables and the OLS residuals is zero. Consequently, the sample covariance between the OLS fitted values and the OLS errors is zero.
2. Omitting an important factor that is correlated with any of the independent variables causes Assumption 4 (No Perfect Collinearity) to fail and thus OLS to be biased.
3. Because an estimator is a fixed number, obtained from a particular sample, an estimator cannot be unbiased.
4. The OLS residual associated with observation i, û_i, is the difference between actual y_i and its fitted value (û_i = y_i − ŷ_i). If û_i is negative, the line under-predicts y_i; if û_i is positive, the line over-predicts y_i.
5. OLS no longer has the smallest variance among linear unbiased estimators in the presence of heteroskedasticity, which causes OLS to be biased.
6. A fitted value is defined as the estimated value of the dependent variable when the values of the independent variables for each observation are plugged into the OLS regression line. Generally, we exclude the intercept in obtaining the fitted values; with inclusion of the intercept, the answer could be very misleading.
7. The errors are computed from the data, whereas the residuals are never observable.
8. Multiple regression analysis allows us to explicitly control for many other factors which simultaneously affect the dependent variable.
9. R² never decreases when another explanatory variable is added to a regression. In other words, the sum of squared residuals (SSR) never increases when additional regressors are added to the model.
10. The F statistic is intended to detect whether a set of coefficients is different from zero, but it is never the best test for determining whether a single coefficient is different from zero.
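Several of these statements make numerical claims you can check for yourself before deciding T or F. Below is a minimal numpy sketch (the simulated data, variable names, and the `ols_fit` helper are my own, purely for illustration) that checks two of them empirically: the covariance claims in statement 1 and the SSR/R² claim in statement 9.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (purely illustrative): y depends on x1 and x2 plus noise.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

def ols_fit(y, X_cols):
    """OLS with an intercept; returns fitted values, residuals, and SSR."""
    X = np.column_stack([np.ones(len(y))] + list(X_cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    resid = y - fitted
    return fitted, resid, np.sum(resid ** 2)

# Statement 1: regressors and fitted values vs. residuals
fitted, resid, _ = ols_fit(y, [x1, x2])
print("cov(x1, resid)     =", np.cov(x1, resid)[0, 1])      # ~ 0
print("cov(x2, resid)     =", np.cov(x2, resid)[0, 1])      # ~ 0
print("cov(fitted, resid) =", np.cov(fitted, resid)[0, 1])  # ~ 0

# Statement 9: adding a regressor and watching SSR / R^2
junk = rng.normal(size=n)  # an irrelevant extra regressor
_, _, ssr_small = ols_fit(y, [x1])
_, _, ssr_big = ols_fit(y, [x1, junk])
sst = np.sum((y - y.mean()) ** 2)
print("SSR with x1 only:    ", ssr_small, " R^2 =", 1 - ssr_small / sst)
print("SSR with x1 and junk:", ssr_big,   " R^2 =", 1 - ssr_big / sst)
```

Running something like this won't tell you the answer key, but it shows what "sample covariance of residuals with the regressors/fitted values" and "SSR never increases when you add a regressor" look like in practice, so you can judge whether each statement is worded correctly.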