Question:
6.11 (Requires calculus) Consider the regression model $Y_i = \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$ for $i = 1, \dots, n$. (Notice that there is no constant term in the regression.)
Following analysis like that used in Appendix 4.2:
a. Specify the least squares function that is minimized by OLS.
b. Compute the partial derivatives of the objective function with respect to $b_1$ and $b_2$.
c. Suppose that $\sum_{i=1}^{n} X_{1i} X_{2i} = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} X_{1i} Y_i \big/ \sum_{i=1}^{n} X_{1i}^2$.
d. Suppose that $\sum_{i=1}^{n} X_{1i} X_{2i} \neq 0$. Derive an expression for $\hat{\beta}_1$ as a function of the data $(Y_i, X_{1i}, X_{2i})$, $i = 1, \dots, n$.
e. Suppose that the model includes an intercept: $Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + u_i$. Show that the least squares estimators satisfy $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}_1 - \hat{\beta}_2 \bar{X}_2$.
f. As in (e), suppose that the model contains an intercept. Also suppose that $\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2) = 0$. Show that $\hat{\beta}_1 = \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}) \big/ \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2$. How does this compare to the OLS estimator of $\beta_1$ from the regression that omits $X_2$?
Step by Step Answer:
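The textbook's worked solution is not reproduced here. What follows is a sketch of the standard OLS algebra implied by the exercise, written from the question statement alone; it should be checked against Appendix 4.2 of the text.

a. The least squares objective is $S(b_1, b_2) = \sum_{i=1}^{n} (Y_i - b_1 X_{1i} - b_2 X_{2i})^2$.

b. The partial derivatives are $\partial S/\partial b_1 = -2\sum_{i=1}^{n} X_{1i}(Y_i - b_1 X_{1i} - b_2 X_{2i})$ and $\partial S/\partial b_2 = -2\sum_{i=1}^{n} X_{2i}(Y_i - b_1 X_{1i} - b_2 X_{2i})$.

c. Setting both derivatives to zero gives the normal equations $\sum X_{1i} Y_i = \hat{\beta}_1 \sum X_{1i}^2 + \hat{\beta}_2 \sum X_{1i} X_{2i}$ and $\sum X_{2i} Y_i = \hat{\beta}_1 \sum X_{1i} X_{2i} + \hat{\beta}_2 \sum X_{2i}^2$. If $\sum X_{1i} X_{2i} = 0$, the cross term drops out of the first equation, so $\hat{\beta}_1 = \sum X_{1i} Y_i / \sum X_{1i}^2$.

d. If $\sum X_{1i} X_{2i} \neq 0$, solve the two normal equations jointly (e.g., by Cramer's rule): $\hat{\beta}_1 = \dfrac{\sum X_{2i}^2 \sum X_{1i} Y_i - \sum X_{1i} X_{2i} \sum X_{2i} Y_i}{\sum X_{1i}^2 \sum X_{2i}^2 - \left(\sum X_{1i} X_{2i}\right)^2}$.

e. With an intercept the objective becomes $\sum_{i=1}^{n} (Y_i - b_0 - b_1 X_{1i} - b_2 X_{2i})^2$, and the first-order condition for $b_0$ is $\sum_{i=1}^{n} (Y_i - \hat{\beta}_0 - \hat{\beta}_1 X_{1i} - \hat{\beta}_2 X_{2i}) = 0$, i.e., $\bar{Y} = \hat{\beta}_0 + \hat{\beta}_1 \bar{X}_1 + \hat{\beta}_2 \bar{X}_2$, which rearranges to $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}_1 - \hat{\beta}_2 \bar{X}_2$.

f. Substituting this expression for $\hat{\beta}_0$ into the remaining first-order conditions rewrites the problem in deviations from means, so the algebra of part (c) applies with $X_{1i} - \bar{X}_1$, $X_{2i} - \bar{X}_2$, and $Y_i - \bar{Y}$ in place of $X_{1i}$, $X_{2i}$, and $Y_i$. When $\sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(X_{2i} - \bar{X}_2) = 0$ this yields $\hat{\beta}_1 = \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)(Y_i - \bar{Y}) \big/ \sum_{i=1}^{n} (X_{1i} - \bar{X}_1)^2$, which is exactly the OLS slope from the simple regression of $Y$ on $X_1$ that omits $X_2$: when the regressors are uncorrelated in the sample, dropping $X_2$ does not change the estimate of $\beta_1$.

As a quick numerical illustration (not part of the exercise), the short NumPy script below constructs regressors whose sample covariance is exactly zero and checks that the multiple regression slope on $X_1$ matches the simple regression slope from part (f). The variable names and coefficient values are arbitrary choices for the demonstration.

import numpy as np

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Residualize x2 on x1 so the sample covariance of x1 and x2 is exactly zero.
x2 = x2 - (x1 - x1.mean()) * (
    np.sum((x1 - x1.mean()) * (x2 - x2.mean())) / np.sum((x1 - x1.mean()) ** 2)
)
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Multiple regression of y on an intercept, x1, and x2.
X = np.column_stack([np.ones(n), x1, x2])
beta_multi = np.linalg.lstsq(X, y, rcond=None)[0]

# Simple regression slope of y on x1 alone (the part (f) formula).
beta1_simple = np.sum((x1 - x1.mean()) * (y - y.mean())) / np.sum((x1 - x1.mean()) ** 2)

print(beta_multi[1], beta1_simple)  # the two slope estimates agree up to rounding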
Source: James Stock and Mark Watson, Introduction to Econometrics, 3rd Global Edition, ISBN 9781292071367.