
Question:

Suppose X is n×p with rank p. Let βˆ = (X′X)⁻¹X′Y and e = Y − Xβˆ.

(a) Show that X′X is p×p, while X′Y is p×1.

(b) Show that X′X is symmetric. Hint: look at exercise 9(a).

(c) Show that X′X is invertible. Hint: look at exercise 10.

(d) Show that (X′X)⁻¹ is p×p, so βˆ = (X′X)⁻¹X′Y is p×1.

(e) Show that (X′X)⁻¹ is symmetric. Hint: look at exercise 9(b).

(f) Show that Xβˆ and e = Y − Xβˆ are n×1.
(g) Show that X′Xβˆ = X′Y, and hence X′e = 0p×1.
(h) Show that e ⊥ Xβˆ, so ‖Y‖² = ‖Xβˆ‖² + ‖e‖².
(i) If γ is p×1, show that ‖Y − Xγ‖² = ‖Y − Xβˆ‖² + ‖X(βˆ − γ)‖².
Hint: Y − Xγ = (Y − Xβˆ) + X(βˆ − γ); a worked expansion is sketched after part (m).
(j) Show that ‖Y − Xγ‖² is minimized when γ = βˆ.
(k) If β˜ is p×1 with Y − Xβ˜ ⊥ X, show that β˜ = βˆ. Notation: v ⊥ X if v is orthogonal to each column of X. Hint: what is X′(Y − Xβ˜)?
(l) Is XX′ invertible? Hint: if n > p, there is a nonzero 1×n vector c with cX = 01×p; what does that say about cXX′?
(m) Is (X′X)⁻¹ = X⁻¹(X′)⁻¹?
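For readers working through parts (h)–(j), here is a sketch of the expansion behind the hint in (i), using X′e = 0p×1 from part (g); notation is as in the problem, and this is only one way to organize the algebra:

```latex
\|Y - X\gamma\|^2
  = \|(Y - X\hat\beta) + X(\hat\beta - \gamma)\|^2
  = \|Y - X\hat\beta\|^2
    + 2(\hat\beta - \gamma)'X'(Y - X\hat\beta)
    + \|X(\hat\beta - \gamma)\|^2
  = \|Y - X\hat\beta\|^2 + \|X(\hat\beta - \gamma)\|^2
```

The cross term vanishes because X′(Y − Xβˆ) = X′e = 0p×1, which gives (i); and since ‖X(βˆ − γ)‖² ≥ 0, the left side is minimized at γ = βˆ, which is (j).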
Notes. The “OLS estimator” is βˆ, where OLS is shorthand for “ordinary least squares.” This exercise develops a lot of the theory for OLS estimators. The geometry in brief: X′e = 0p×1 means that e is orthogonal (perpendicular) to each column of X. Hence Yˆ = Xβˆ is the projection of Y onto the columns of X, and the closest point in column space to Y. Part (j) is Gauss’ theorem for multiple regression.
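As a numerical sanity check of the algebra above (not part of the original exercise; the sizes n = 50, p = 3 and the random data are purely illustrative), a minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3                                # illustrative n×p design with rank p
X = rng.normal(size=(n, p))
Y = rng.normal(size=n)

XtX = X.T @ X                               # X′X is p×p (part a)
beta_hat = np.linalg.solve(XtX, X.T @ Y)    # βˆ = (X′X)⁻¹X′Y is p×1 (part d)
e = Y - X @ beta_hat                        # residual e = Y − Xβˆ is n×1 (part f)

print(np.allclose(XtX, XtX.T))              # X′X symmetric (part b)
print(np.allclose(X.T @ e, 0))              # X′e = 0p×1 (part g)
print(np.isclose(e @ (X @ beta_hat), 0))    # e ⊥ Xβˆ (part h)
print(np.isclose(Y @ Y,                     # ‖Y‖² = ‖Xβˆ‖² + ‖e‖² (part h)
                 (X @ beta_hat) @ (X @ beta_hat) + e @ e))

# part (j): any other γ gives at least as large a residual sum of squares
gamma = beta_hat + rng.normal(size=p)
print(np.sum((Y - X @ gamma) ** 2) >= np.sum(e ** 2))
```

Each check prints True up to floating-point tolerance, matching parts (b), (g), (h), and (j).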
