
Question:

5.10 Consider the correlated regression model, defined in the text by (5.53), say, y = Zβ + x, where x has mean zero and covariance matrix Γ. In this case, we know that the weighted least squares estimator is (5.54), namely,

β̂w = (Z′Γ⁻¹Z)⁻¹ Z′Γ⁻¹ y.
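As a quick numerical illustration (not part of the original problem), the weighted least squares estimator above can be computed directly with NumPy. The design matrix, coefficients, and exponential error covariance below are all invented toy data.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 50, 2

Z = np.column_stack([np.ones(n), rng.normal(size=n)])  # n x q design matrix
beta = np.array([1.0, 2.0])

# Correlated errors x with an exponential covariance on a 1-D "spatial" grid
s = np.linspace(0.0, 1.0, n)
Gamma = np.exp(-np.abs(s[:, None] - s[None, :]) / 0.2)
x = rng.multivariate_normal(np.zeros(n), Gamma)
y = Z @ beta + x

# beta_w = (Z' Gamma^{-1} Z)^{-1} Z' Gamma^{-1} y, via linear solves
Gi_Z = np.linalg.solve(Gamma, Z)   # Gamma^{-1} Z
Gi_y = np.linalg.solve(Gamma, y)   # Gamma^{-1} y
beta_w = np.linalg.solve(Z.T @ Gi_Z, Z.T @ Gi_y)
print(beta_w)
```

By construction β̂w satisfies the weighted normal equations Z′Γ⁻¹(y − Zβ̂w) = 0, which is a convenient sanity check on the solve.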

Now, a problem of interest in spatial series can be formulated in terms of this basic model. Suppose yi = y(σi), i = 1, 2, . . . , n is a function of the spatial vector coordinates σi = (si1, si2, . . . , sir), the error is xi = x(σi), and the rows of Z are defined as z(σi), i = 1, 2, . . . , n. The Kriging estimator is defined as the best spatial predictor of y0 = z0′β + x0 using the estimator ŷ0 = a′y, subject to the unbiased condition E(ŷ0) = E(y0), and such that the mean square prediction error

MSE = E[(y0 − ŷ0)²]

is minimized.

(a) Prove the estimator is unbiased when Z′a = z0.

(b) Show the MSE is minimized by solving the equations Γa + Zλ = γ0 and Z′a = z0, where γ0 = E[xx0] represents the vector of covariances between the error vector of the observed data and the error at the new point, and λ is a q × 1 vector of Lagrange multipliers.
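A hedged numerical sketch of part (b): the two equations form a bordered linear system in (a, λ) that can be solved directly. The covariances and the new site below are made-up toy values, not from the text.

```python
import numpy as np

rng = np.random.default_rng(1)
n, q = 30, 2

Z = np.column_stack([np.ones(n), rng.normal(size=n)])
s = np.linspace(0.0, 1.0, n)
Gamma = np.exp(-np.abs(s[:, None] - s[None, :]) / 0.3)

# New site s0: its regressors z0 and error covariances gamma0 = E[x x0]
s0 = 0.55
z0 = np.array([1.0, rng.normal()])
gamma0 = np.exp(-np.abs(s - s0) / 0.3)

# Bordered (kriging) system from part (b):
#   [ Gamma  Z ] [a]     [gamma0]
#   [ Z'     0 ] [lam] = [z0    ]
K = np.block([[Gamma, Z], [Z.T, np.zeros((q, q))]])
sol = np.linalg.solve(K, np.concatenate([gamma0, z0]))
a, lam = sol[:n], sol[n:]

# The unbiasedness constraint Z'a = z0 holds by construction
print(np.allclose(Z.T @ a, z0))  # True
```

Solving the bordered system once gives both the weights a and the multipliers λ; this also makes part (a)'s condition Z′a = z0 easy to verify numerically.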

(c) Show the predicted value can be expressed as

ŷ0 = z0′β̂w + γ0′Γ⁻¹(y − Zβ̂w),

so the optimal prediction is a linear combination of the usual predictor and the least squares residuals.
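A hedged numerical check of this equivalence (all data invented): solve the constrained system of part (b) for the weights a, and compare a′y with the form z0′β̂w + γ0′Γ⁻¹(y − Zβ̂w), i.e., the usual predictor plus a weighted combination of the least squares residuals.

```python
import numpy as np

rng = np.random.default_rng(2)
n, q = 30, 2

Z = np.column_stack([np.ones(n), rng.normal(size=n)])
s = np.linspace(0.0, 1.0, n)
Gamma = np.exp(-np.abs(s[:, None] - s[None, :]) / 0.3)
y = Z @ np.array([1.0, -0.5]) + rng.multivariate_normal(np.zeros(n), Gamma)

s0, z0 = 0.4, np.array([1.0, 0.7])
gamma0 = np.exp(-np.abs(s - s0) / 0.3)

# Kriging weights a from the constrained system of part (b)
K = np.block([[Gamma, Z], [Z.T, np.zeros((q, q))]])
a = np.linalg.solve(K, np.concatenate([gamma0, z0]))[:n]

# Weighted least squares estimator and the part-(c) form of the predictor
Gi_Z = np.linalg.solve(Gamma, Z)
Gi_y = np.linalg.solve(Gamma, y)
beta_w = np.linalg.solve(Z.T @ Gi_Z, Z.T @ Gi_y)
pred_c = z0 @ beta_w + gamma0 @ np.linalg.solve(Gamma, y - Z @ beta_w)

print(np.isclose(a @ y, pred_c))  # True
```

The agreement of the two expressions is exactly the identity part (c) asks you to prove: eliminating λ from Γa + Zλ = γ0 using Z′a = z0 reproduces the residual form of the predictor.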
