Question:
Orthogonal explanatory variables. Suppose in the model
$$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \cdots + \beta_k X_{ki} + u_i$$
the regressors $X_2$ through $X_k$ are mutually uncorrelated. Such variables are called orthogonal variables. If this is the case:
a. What will be the structure of the $(\mathbf{X}'\mathbf{X})$ matrix?
b. How would you obtain $\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y}$?
c. What will be the nature of the variance–covariance matrix of $\hat{\boldsymbol{\beta}}$?
d. Suppose you have run the regression and afterward want to introduce another orthogonal variable, say $X_{k+1}$, into the model. Do you have to recompute all the previous coefficients $\hat{\beta}_1$ through $\hat{\beta}_k$? Why or why not? (A small numerical sketch illustrating these properties follows below.)
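Below is a minimal numerical sketch, not the textbook's worked answer, that illustrates parts (a) through (d) with simulated data. The sample size, the generated regressors $X_2$, $X_3$, $X_4$, the coefficient values, and the error scale are all assumed purely for illustration; the regressors are constructed to be exactly orthogonal and zero-mean via a QR decomposition so that the special structure of $\mathbf{X}'\mathbf{X}$ is easy to see.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # assumed sample size, for illustration only

# Build exactly orthogonal, zero-mean regressors: the QR decomposition of
# [1, random data] makes columns 1..3 of Q orthonormal and orthogonal to
# the constant column.
M = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
Q, _ = np.linalg.qr(M)
X2, X3, X4 = Q[:, 1], Q[:, 2], Q[:, 3]   # X4 is held back for part (d)

beta = np.array([1.0, 2.0, -1.5])        # assumed intercept, beta2, beta3
y = beta[0] + beta[1] * X2 + beta[2] * X3 + rng.normal(scale=0.1, size=n)

# (a) With orthogonal, zero-mean regressors, X'X is diagonal.
X = np.column_stack([np.ones(n), X2, X3])
XtX = X.T @ X
print(np.round(XtX, 8))

# (b) beta_hat = (X'X)^{-1} X'y; each slope equals the simple
#     one-regressor ratio sum(x*y) / sum(x^2).
beta_hat = np.linalg.solve(XtX, X.T @ y)
print(beta_hat)
print((X2 @ y) / (X2 @ X2), (X3 @ y) / (X3 @ X3))

# (c) var-cov(beta_hat) = sigma^2 (X'X)^{-1}, which is also diagonal here.
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - X.shape[1])
print(np.round(sigma2_hat * np.linalg.inv(XtX), 8))

# (d) Adding the orthogonal X4 leaves the earlier coefficient estimates unchanged.
X_big = np.column_stack([X, X4])
beta_big = np.linalg.solve(X_big.T @ X_big, X_big.T @ y)
print(beta_hat, beta_big[:3])
```

The printed output should show that the off-diagonal entries of $\mathbf{X}'\mathbf{X}$ and of the estimated variance–covariance matrix are numerically zero, that each slope estimate matches its simple bivariate formula, and that the first three estimates are unchanged when the orthogonal $X_4$ is introduced.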