
Question:

*Orthogonal polynomial contrasts: The polynomial regressors $X, X^2, \dots, X^{m-1}$ generated to represent a quantitative, discrete $X$ with values $1, 2, \dots, m$ are substantially correlated. It is convenient (but by no means essential) to remove these correlations. Suppose that there are equal numbers of observations in the different levels of $X$, so that it suffices to make the columns of the row basis of the model matrix for $X$ orthogonal. Working with the row basis, begin by subtracting the mean from $X$, calling the result $X^*$. Centering $X$ in this manner makes $X^*$ orthogonal to the constant regressor $\mathbf{1}$. (Why?) $X^2$ can be made orthogonal to the constant and $X^*$ by projecting the $X^2$ vector onto the subspace generated by $\mathbf{1}$ and $X^*$; call the residual from this projection $X^{*2}$. The remaining columns $X^{*3}, \dots, X^{*(m-1)}$ of the new row basis are formed in a similar manner, each orthogonal to the preceding ones.
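The procedure described above is Gram-Schmidt orthogonalization by successive least-squares projections. Here is a minimal sketch of it (assuming NumPy; the function and variable names are illustrative, not from the text):

```python
# A sketch of the construction described above: each raw polynomial term
# is projected onto the span of the preceding columns, and the residual
# from that projection is kept as the orthogonal contrast.
import numpy as np

def orthogonal_poly_contrasts(x, degree):
    """Columns 1, X*, X*^2, ..., X*^degree, each orthogonal to all of
    the preceding columns (Gram-Schmidt via least-squares projection)."""
    cols = [np.ones_like(x, dtype=float)]     # the constant regressor 1
    for p in range(1, degree + 1):
        v = x.astype(float) ** p              # raw polynomial term X^p
        Z = np.column_stack(cols)             # span of the preceding columns
        beta, *_ = np.linalg.lstsq(Z, v, rcond=None)
        cols.append(v - Z @ beta)             # residual from the projection
    return np.column_stack(cols)

# Row basis for X with values 1, ..., m (equal replication assumed, so
# orthogonalizing the row basis suffices, as stated above).
m = 5
B = orthogonal_poly_contrasts(np.arange(1, m + 1), m - 1)
print(np.round(B.T @ B, 10))                  # off-diagonal entries are all 0
```

Note that at the first step the projection onto $\mathbf{1}$ simply subtracts the mean, so the second column is exactly the centered $X^*$ described above.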

(a) Show that the orthogonal polynomial contrasts $\mathbf{1}, X^*, \dots, X^{*(m-1)}$ span the same subspace as the original polynomial regressors $\mathbf{1}, X, \dots, X^{m-1}$.
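The claim in part (a) can be checked numerically: each $X^{*p}$ is a triangular combination of $\mathbf{1}, X, \dots, X^p$, so stacking the two bases side by side should not increase the rank. A small illustrative sketch (names are invented for the demonstration):

```python
# Illustrative check that the orthogonal contrasts span the same subspace
# as the raw polynomial regressors: concatenating the two bases does not
# increase the rank.  (np.linalg.qr performs the same orthogonalization.)
import numpy as np

x = np.repeat(np.arange(1.0, 6.0), 3)               # m = 5 levels, 3 replicates
raw = np.column_stack([x**p for p in range(5)])     # 1, X, ..., X^{m-1}
Q, _ = np.linalg.qr(raw)                            # 1, X*, ..., X*^{(m-1)}, normalized
print(np.linalg.matrix_rank(raw),                   # 5
      np.linalg.matrix_rank(Q),                     # 5
      np.linalg.matrix_rank(np.hstack([raw, Q])))   # still 5: same column space
```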

(b) Show that the incremental sum of squares for each orthogonal contrast $X^*, X^{*2}, \dots, X^{*(m-1)}$ is the same as the step-down sum of squares for the corresponding regressor among the original (correlated) polynomial terms, $X^{m-1}, \dots, X^2, X$. (Hint: Remember that $X^*, X^{*2}, \dots, X^{*(m-1)}$ are uncorrelated.)
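The equality in part (b) holds because both quantities are differences of residual sums of squares between nested models, and the leading columns of the two bases span the same subspaces degree by degree. A hedged numerical illustration (the response y and all names below are invented for the demonstration):

```python
# Illustrative check that the incremental SS for each orthogonal contrast
# equals the step-down SS for the corresponding raw polynomial term.
import numpy as np

rng = np.random.default_rng(0)
m, r = 5, 4                                       # m levels, r replicates each
x = np.repeat(np.arange(1.0, m + 1.0), r)
y = 2 + 0.5 * x - 0.3 * x**2 + rng.normal(size=x.size)   # made-up response

raw = np.column_stack([x**p for p in range(m)])   # 1, X, ..., X^{m-1}
Q, _ = np.linalg.qr(raw)                          # orthogonalized basis

def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

for p in range(1, m):                             # X*^p enters at step p
    ss_incremental = rss(Q[:, :p], y) - rss(Q[:, :p + 1], y)
    ss_step_down = rss(raw[:, :p], y) - rss(raw[:, :p + 1], y)
    print(p, round(ss_incremental, 6), round(ss_step_down, 6))
```

Each printed pair agrees, as the exercise asks you to show analytically.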

(c) What, then, is the advantage of orthogonal polynomial contrasts?

(d) Can the same approach be applied to a continuous quantitative explanatory variable, defining, for example, a quadratic component that is orthogonal to the linear component and a cubic component orthogonal to both the linear and quadratic components?
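For part (d), note that the construction uses only projections, not the discreteness of the values $1, \dots, m$; in R, for instance, poly() computes such orthogonal polynomials for arbitrary numeric inputs. A brief sketch with made-up continuous values:

```python
# Illustrative sketch for part (d): the same orthogonalization applied to
# made-up continuous values of x rather than the discrete values 1, ..., m.
import numpy as np

x = np.sort(np.random.default_rng(1).uniform(0.0, 10.0, size=25))
V = np.column_stack([x**p for p in range(4)])   # 1, x, x^2, x^3
Q, _ = np.linalg.qr(V)                          # orthogonal (normalized) components
print(np.round(Q.T @ Q, 10))                    # identity: mutually orthogonal
```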
