
Question:

Consider the continuous probability distribution \(f(x)\). Suppose that \(\theta\) is an unknown location parameter and that the density may be written as \(f(x-\theta)\) for \(-\infty<\theta<\infty\). Let \(x_{1}, x_{2}, \ldots, x_{n}\) be a random sample of size \(n\) from the density.

a. Show that the maximum-likelihood estimator of \(\theta\) is the solution to
\[
\sum_{i=1}^{n} \psi\left(x_{i}-\theta\right)=0
\]
that maximizes the logarithm of the likelihood function \(\ln L(\theta)=\sum_{i=1}^{n} \ln f\left(x_{i}-\theta\right)\), where \(\psi(x)=\rho^{\prime}(x)\) and \(\rho(x)=-\ln f(x)\).
b. If \(f(x)\) is a normal distribution, find \(\rho(x)\), \(\psi(x)\), and the corresponding maximum-likelihood estimator of \(\theta\).
c. If \(f(x)=(2 \sigma)^{-1} e^{-|x| / \sigma}\) (the double-exponential distribution), find \(\rho(x)\) and \(\psi(x)\). Show that the maximum-likelihood estimator of \(\theta\) is the sample median. Compare this estimator with the estimator found in part b. Does the sample median seem to be a reasonable estimator in this case?
d. If \(f(x)=\left[\pi\left(1+x^{2}\right)\right]^{-1}\) (the Cauchy distribution), find \(\rho(x)\) and \(\psi(x)\). How would you solve \(\sum_{i=1}^{n} \psi\left(x_{i}-\theta\right)=0\) in this case? (Brief sketches for parts a–d follow below.)
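As orientation for parts a–c, here is a minimal algebraic sketch written in the notation defined in part a; it is an outline under those definitions, not the textbook's step-by-step solution. Because \(\rho(x)=-\ln f(x)\), maximizing \(\ln L(\theta)=\sum_{i=1}^{n} \ln f\left(x_{i}-\theta\right)\) is equivalent to minimizing \(\sum_{i=1}^{n} \rho\left(x_{i}-\theta\right)\), and differentiating with respect to \(\theta\) gives
\[
\frac{\partial}{\partial \theta} \sum_{i=1}^{n} \rho\left(x_{i}-\theta\right)=-\sum_{i=1}^{n} \psi\left(x_{i}-\theta\right)=0 .
\]
For the normal density \(f(x)=\left(2 \pi \sigma^{2}\right)^{-1 / 2} e^{-x^{2} /\left(2 \sigma^{2}\right)}\),
\[
\rho(x)=\tfrac{1}{2} \ln \left(2 \pi \sigma^{2}\right)+\frac{x^{2}}{2 \sigma^{2}}, \qquad \psi(x)=\frac{x}{\sigma^{2}},
\]
so \(\sum_{i=1}^{n}\left(x_{i}-\theta\right) / \sigma^{2}=0\) yields \(\hat{\theta}=\bar{x}\), the sample mean. For the double-exponential density \(f(x)=(2 \sigma)^{-1} e^{-|x| / \sigma}\),
\[
\rho(x)=\ln (2 \sigma)+\frac{|x|}{\sigma}, \qquad \psi(x)=\frac{\operatorname{sign}(x)}{\sigma} \quad(x \neq 0),
\]
and \(\sum_{i=1}^{n} \operatorname{sign}\left(x_{i}-\theta\right)=0\) requires equal numbers of observations above and below \(\hat{\theta}\), i.e. \(\hat{\theta}\) is the sample median, which downweights extreme observations relative to the sample mean of part b.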


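For part d, \(\rho(x)=\ln \pi+\ln \left(1+x^{2}\right)\) and \(\psi(x)=2 x /\left(1+x^{2}\right)\), so \(\sum_{i=1}^{n} \psi\left(x_{i}-\theta\right)=0\) has no closed-form solution and must be solved iteratively, for example by Newton's method or by iteratively reweighted least squares; because the equation can have multiple roots, part a's instruction to take the root that maximizes \(\ln L(\theta)\) matters. The Python sketch below illustrates one such iterative scheme under assumptions of my own: the weight function \(w(u)=\psi(u)/u=2 /\left(1+u^{2}\right)\), the sample median as a starting value, and the simulated data are illustrative choices, not part of the exercise.

```python
import numpy as np

def psi_cauchy(u):
    """psi(u) = rho'(u) for the Cauchy density, with rho(u) = ln(pi) + ln(1 + u^2)."""
    return 2.0 * u / (1.0 + u**2)

def cauchy_location_mle(x, theta0=None, tol=1e-10, max_iter=500):
    """Solve sum(psi(x_i - theta)) = 0 by iteratively reweighted averaging.

    Writing psi(u) = w(u) * u with weight w(u) = 2 / (1 + u^2), the score
    equation becomes the fixed point theta = sum(w_i * x_i) / sum(w_i).
    Starting from the sample median is an assumption made here for robustness,
    not something specified in the exercise.
    """
    x = np.asarray(x, dtype=float)
    theta = np.median(x) if theta0 is None else float(theta0)
    for _ in range(max_iter):
        u = x - theta
        w = 2.0 / (1.0 + u**2)                # far-out observations get small weight
        theta_new = np.sum(w * x) / np.sum(w)
        if abs(theta_new - theta) < tol:
            break
        theta = theta_new
    return theta

# Illustrative (made-up) data: Cauchy sample with true location 3.0
rng = np.random.default_rng(1)
sample = 3.0 + rng.standard_cauchy(size=25)
theta_hat = cauchy_location_mle(sample)
print(theta_hat, np.isclose(np.sum(psi_cauchy(sample - theta_hat)), 0.0, atol=1e-8))
```

From a robust starting value such as the median, the fixed point typically lands on the likelihood-maximizing root, but comparing \(\ln L(\theta)\) across candidate roots is the safe way to satisfy the condition stated in part a.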

Related Book For

Introduction To Linear Regression Analysis

ISBN: 9781119578727

6th Edition

Authors: Douglas C. Montgomery, Elizabeth A. Peck, G. Geoffrey Vining
