Question:

5.15 There are various ways to seemingly generalize Theorems 5.5 and 5.9. However, if both the estimator and the loss function are allowed to depend on the covariance matrix Σ and the loss matrix Q, then linear transformations can usually reduce the problem.

Let X ∼ N_r(θ, Σ), and let the loss function be L(θ, δ) = (θ − δ)'Q(θ − δ). Consider the following "generalizations" of Theorems 5.5 and 5.9.

(a) δ(x) = (1 − c(x'Σ⁻¹x)/(x'Σ⁻¹x)) x,  Q = Σ⁻¹,

(b) δ(x) = (1 − c(x'Qx)/(x'Qx)) x,  Σ = I or Σ = Q,

(c) δ(x) = (1 − c(x'Σ^{−1/2}QΣ^{−1/2}x)/(x'Σ^{−1/2}QΣ^{−1/2}x)) x.

In each case, use transformations to reduce the problem to that of Theorem 5.5 or 5.9, and deduce the condition for minimaxity of δ.
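The improvement promised by the minimaxity condition can be seen numerically. Below is a minimal Monte Carlo sketch of case (a), assuming an arbitrary illustrative Σ and true mean θ (both chosen here, not given in the problem) and the constant choice c = r − 2, which lies inside the range 0 < c(·) < 2(r − 2) stated in the hint:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 6                                   # dimension; shrinkage requires r > 2
A = rng.standard_normal((r, r))
Sigma = A @ A.T + r * np.eye(r)         # an arbitrary covariance matrix, for illustration
Sigma_inv = np.linalg.inv(Sigma)
theta = np.full(r, 0.5)                 # an arbitrary fixed true mean

n = 200_000
X = rng.multivariate_normal(theta, Sigma, size=n)

# Case (a): delta(x) = (1 - c/(x' Sigma^{-1} x)) x with constant c = r - 2.
q = np.einsum('ij,jk,ik->i', X, Sigma_inv, X)   # quadratic form x' Sigma^{-1} x per row
D = (1.0 - (r - 2) / q)[:, None] * X

def risk(est):
    """Monte Carlo risk under the loss L(theta, d) = (theta - d)' Sigma^{-1} (theta - d)."""
    e = est - theta
    return np.einsum('ij,jk,ik->i', e, Sigma_inv, e).mean()

print(risk(X))   # approx. r (= 6), the constant risk of the estimator X itself
print(risk(D))   # smaller than risk(X): the shrinkage estimator improves on X
```

Because tr(Σ⁻¹Σ) = r, the risk of X equals r exactly under this loss, so the first printed value is near 6 up to Monte Carlo error, while the shrinkage estimator's risk is strictly below it.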

[Hint: For example, in (a) the transformation Y = Σ^{−1/2}X will show that δ is minimax if 0 < c(·) < 2(r − 2).]
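As a sketch of the reduction the hint describes for case (a) (using a prime for transpose):

```latex
\[
Y = \Sigma^{-1/2}X \sim N_r(\eta, I_r), \qquad \eta = \Sigma^{-1/2}\theta .
\]
With $Q = \Sigma^{-1}$, the loss becomes ordinary squared error in the new coordinates:
\[
(\theta-\delta)'\,\Sigma^{-1}(\theta-\delta) = (\eta - d)'(\eta - d),
\qquad d = \Sigma^{-1/2}\delta ,
\]
and since $x'\Sigma^{-1}x = y'y$, the estimator transforms as
\[
\Sigma^{-1/2}\delta(x) \;=\; \Bigl(1 - \frac{c(y'y)}{y'y}\Bigr)\, y ,
\]
which is exactly the form covered by Theorem 5.5; hence $\delta$ is minimax when
$0 < c(\cdot) < 2(r-2)$.
```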


Related Book:

Theory of Point Estimation, 2nd Edition, by Erich L. Lehmann and George Casella (ISBN: 9780387985022)
