In a multiple linear regression analysis, $(y_i, x_{1i}, x_{2i})$, $i = 1, \ldots, n$, are statistically independent and satisfy the model (M1) given by

$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \varepsilon$,

where the response variable $y$ is continuous, the regressor vector $X = (X_1, X_2)'$ has mean vector $(\mu_{x_1}, \mu_{x_2})'$ and positive definite variance-covariance matrix $\Sigma_X$, and the random errors $\varepsilon_i$, conditional on $X_i$, are statistically independent and normally distributed with mean zero and variance $\sigma^2$, which does not depend on $X$.

After trying many multiple linear regression analyses based on the model M1 in Problem 1 above, a data analyst obtained the following statistics: the regression sum of squares $SS(\beta_0)$, obtained by excluding the two regressors $x_1, x_2$; the regression sums of squares $SS(\beta_0), SS(\beta_0, \beta_1)$, obtained by excluding the regressor $x_2$; the regression sums of squares $SS(\beta_0), SS(\beta_0, \beta_2)$, obtained by excluding the regressor $x_1$; and the regression sums of squares $SS(\beta_0), SS(\beta_0, \beta_1, \beta_2)$, that is, including all the regressors. The data analyst used statistical tests derived from these sums of squares to determine whether $x_1$ or $x_2$ should be excluded in arriving at the final regression model; that is, the final model may be a subset model of M1.

1) Discuss, with mathematical proof, whether the ordinary least-squares estimator(s) for the remaining regressor(s) in the final model is biased. [10 points]

2) Discuss, with mathematical proof, whether the ordinary least-squares estimator(s) for the remaining regressor(s) in the final model has a smaller variance than the respective ordinary least-squares estimator(s) obtained by fitting the model M1. [10 points]

3) The analyst also explored the addition of another regressor $x_3$; that is, model M1 could be extended to include $\beta_3 x_3$. Discuss, with mathematical proof, whether the ordinary least-squares estimator for $\beta_1$, say, based on the extended model is biased for $\beta_1$ if the true model is M1. [10 points]
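
For reference, a minimal sketch of how the listed sums of squares are typically combined into exclusion tests is given below. It assumes the standard extra-sum-of-squares (partial $F$) construction; the residual sum of squares $SS_{\mathrm{Res}}$ of the full model M1 and its degrees of freedom $n - 3$ are part of that assumed notation and are not stated in the problem itself.

% A minimal sketch, assuming the usual partial-F construction: each extra
% sum of squares measures the contribution of one regressor given the
% other, and SS_Res/(n-3) estimates sigma^2 under the full model M1.
\[
SS(\beta_1 \mid \beta_0, \beta_2) = SS(\beta_0, \beta_1, \beta_2) - SS(\beta_0, \beta_2),
\qquad
SS(\beta_2 \mid \beta_0, \beta_1) = SS(\beta_0, \beta_1, \beta_2) - SS(\beta_0, \beta_1),
\]
\[
F_j = \frac{SS(\beta_j \mid \beta_0, \beta_{3-j}) / 1}{SS_{\mathrm{Res}} / (n - 3)},
\qquad j = 1, 2.
\]
% Under H_0: beta_j = 0 and the normality assumptions of M1, F_j follows
% an F distribution with (1, n-3) degrees of freedom.

Under this construction the analyst would exclude $x_j$ when the corresponding $F_j$ is not significant, which is the sense in which the final model may be a subset of M1.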