An estimator $\hat{\theta}$ is said to be consistent if for any $\epsilon > 0$, $P(|\hat{\theta} - \theta| \ge \epsilon) \to 0$ as $n \to \infty$.

Question:

An estimator $\hat{\theta}$ is said to be consistent if for any $\epsilon > 0$, $P(|\hat{\theta} - \theta| \ge \epsilon) \to 0$ as $n \to \infty$. That is, $\hat{\theta}$ is consistent if, as the sample size gets larger, it is less and less likely that $\hat{\theta}$ will be further than $\epsilon$ from the true value of $\theta$.
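As an illustration of this definition (not part of the original exercise), the following Python sketch estimates $P(|\bar{X} - \mu| \ge \epsilon)$ by simulation for increasing sample sizes; the normal population, the values $\mu = 10$, $\sigma = 4$, $\epsilon = 0.5$, and the sample sizes are arbitrary choices made only for demonstration.

import numpy as np

# Illustrative sketch with assumed values: estimate P(|Xbar - mu| >= eps)
# by simulation for a normal population with mu = 10 and sigma = 4.
rng = np.random.default_rng(0)
mu, sigma, eps = 10.0, 4.0, 0.5
reps = 10_000

for n in (10, 100, 1000):
    xbars = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(xbars - mu) >= eps)
    print(f"n = {n:4d}: estimated P(|Xbar - mu| >= {eps}) = {prob:.4f}")

The estimated probability should shrink toward 0 as n increases, which is exactly what consistency of $\bar{X}$ requires.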

Show that $\bar{X}$ is a consistent estimator of $\mu$. What value of $K$ minimizes the mean squared error of the estimator $KS^2$ when the population distribution is normal? [Hint: It can be shown that $E[(S^2)^2] = (n+1)\sigma^4/(n-1)$. In general, it is difficult to find $\hat{\theta}$ to minimize $MSE(\hat{\theta})$, which is why we look only at unbiased estimators and minimize $V(\hat{\theta})$.]
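A minimal sketch of the reasoning, assuming the $X_i$ have mean $\mu$ and finite variance $\sigma^2$ (and a normal population for the second part):

$$P(|\bar{X} - \mu| \ge \epsilon) \le \frac{V(\bar{X})}{\epsilon^2} = \frac{\sigma^2}{n\epsilon^2} \to 0 \quad \text{as } n \to \infty,$$

by Chebyshev's inequality, so $\bar{X}$ is consistent for $\mu$. For the second part,

$$MSE(KS^2) = E[(KS^2 - \sigma^2)^2] = K^2 E[(S^2)^2] - 2K\sigma^2 E[S^2] + \sigma^4 = K^2\,\frac{(n+1)\sigma^4}{n-1} - 2K\sigma^4 + \sigma^4,$$

and setting the derivative with respect to $K$ equal to zero gives $K = \dfrac{n-1}{n+1}$.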

Let $X_1, \ldots, X_n$ be a random sample from a pdf that is symmetric about $\mu$. An estimator for $\mu$ that has been found to perform well for a variety of underlying distributions is the Hodges-Lehmann estimator. To define it, first compute for each $i \le j$ and each $j = 1, 2, \ldots, n$ the pairwise average $\bar{X}_{i,j} = (X_i + X_j)/2$. Then the estimator is $\hat{\mu} =$ the median of the $\bar{X}_{i,j}$'s. Compute the value of this estimate using the data of Exercise 44 of Chapter 1.

[Hint: Construct a square table with the $x_i$'s listed on the left margin and on top. Then compute averages on and above the diagonal.]
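The data of Exercise 44 of Chapter 1 are not reproduced on this page, so the Python sketch below uses hypothetical sample values purely to show the computation; substitute the actual data to obtain the requested estimate.

import numpy as np

def hodges_lehmann(x):
    # Median of the pairwise averages (x_i + x_j)/2 taken over i <= j,
    # i.e., the averages on and above the diagonal of the table in the hint.
    x = np.asarray(x, dtype=float)
    n = len(x)
    pairwise = [(x[i] + x[j]) / 2 for i in range(n) for j in range(i, n)]
    return float(np.median(pairwise))

# Hypothetical sample values (placeholders, not the Exercise 44 data).
sample = [11.3, 14.8, 9.6, 12.5, 30.2, 10.9]
print(hodges_lehmann(sample))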
