Question:

4.12 For the mixture distribution of Example 4.7, that is, Xi ∼ θg(x) + (1 − θ)h(x), i = 1, . . . , n, independent, where g(·) and h(·) are known, an EM algorithm can be used to find the ML estimator of θ. Introduce latent variables Z1, . . . , Zn, where Zi indicates from which distribution Xi has been drawn, so that

Xi | Zi = 1 ∼ g(x),  Xi | Zi = 0 ∼ h(x).

(a) Show that the complete-data likelihood can be written

$$
L(\theta \mid \mathbf{x}, \mathbf{z}) = \prod_{i=1}^{n} \left[ z_i g(x_i) + (1 - z_i) h(x_i) \right] \theta^{z_i} (1 - \theta)^{1 - z_i}.
$$

(b) Show that

$$
E(Z_i \mid \theta, x_i) = \frac{\theta g(x_i)}{\theta g(x_i) + (1 - \theta) h(x_i)},
$$

and hence that the EM sequence is given by

$$
\hat{\theta}^{(j+1)} = \frac{1}{n} \sum_{i=1}^{n} \frac{\hat{\theta}^{(j)} g(x_i)}{\hat{\theta}^{(j)} g(x_i) + \left(1 - \hat{\theta}^{(j)}\right) h(x_i)}.
$$

(c) Show that $\hat{\theta}^{(j)} \to \hat{\theta}$, the ML estimator of $\theta$.

Step by Step Answer:
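
What follows is not reproduced from the book; it is a sketch of the standard arguments. For part (a), note that Zi ∼ Bernoulli(θ) and that, for zi ∈ {0, 1}, the joint density of (Xi, Zi) factors as

$$
f(x_i, z_i \mid \theta)
= \left[\theta g(x_i)\right]^{z_i} \left[(1 - \theta) h(x_i)\right]^{1 - z_i}
= \left[z_i g(x_i) + (1 - z_i) h(x_i)\right] \theta^{z_i} (1 - \theta)^{1 - z_i},
$$

where the second equality uses the identity $a^{z} b^{1-z} = z a + (1 - z) b$ for $z \in \{0, 1\}$, applied to $g(x_i)^{z_i} h(x_i)^{1 - z_i}$. Multiplying over the independent observations $i = 1, \dots, n$ gives the stated complete-data likelihood.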
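For part (b), Zi | xi is again Bernoulli, so its conditional expectation follows from Bayes' rule:

$$
E(Z_i \mid \theta, x_i) = P(Z_i = 1 \mid \theta, x_i)
= \frac{\theta g(x_i)}{\theta g(x_i) + (1 - \theta) h(x_i)}.
$$

In the complete-data log-likelihood, only the terms $z_i \log\theta + (1 - z_i)\log(1 - \theta)$ involve θ. The E-step replaces each $z_i$ by $w_i = E(Z_i \mid \hat{\theta}^{(j)}, x_i)$, and the M-step maximizes the resulting binomial-type objective $\sum_i \left[ w_i \log\theta + (1 - w_i)\log(1 - \theta) \right]$, whose maximizer is $\hat{\theta}^{(j+1)} = \frac{1}{n}\sum_{i=1}^{n} w_i$, which is exactly the stated EM sequence.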
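For part (c), one standard route is a fixed-point argument. Any interior fixed point $\hat{\theta} \in (0, 1)$ of the EM map satisfies

$$
n\hat{\theta} = \sum_{i=1}^{n} \frac{\hat{\theta} g(x_i)}{\hat{\theta} g(x_i) + (1 - \hat{\theta}) h(x_i)}
\quad\Longleftrightarrow\quad
\hat{\theta}(1 - \hat{\theta}) \sum_{i=1}^{n} \frac{g(x_i) - h(x_i)}{\hat{\theta} g(x_i) + (1 - \hat{\theta}) h(x_i)} = 0,
$$

and the sum on the right is exactly the score $\frac{d}{d\theta} \sum_i \log\left[\theta g(x_i) + (1 - \theta) h(x_i)\right]$ of the observed-data likelihood, so an interior fixed point solves the likelihood equation. Moreover, this log-likelihood is concave in θ, since its second derivative is $-\sum_i \left(g(x_i) - h(x_i)\right)^2 / \left[\theta g(x_i) + (1 - \theta) h(x_i)\right]^2 \le 0$, so the stationary point is the ML estimator; combined with the monotonicity of EM (the observed-data likelihood never decreases along the iterates), the sequence $\hat{\theta}^{(j)}$ converges to $\hat{\theta}$.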
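To see the iteration in action, here is a minimal, self-contained Python sketch. The component densities (g standard normal, h a shifted normal), the true mixing weight 0.3, the sample size, and the starting value are illustrative assumptions, not part of the exercise.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative assumptions: g = N(0, 1), h = N(3, 1), true theta = 0.3.
g = norm(loc=0.0, scale=1.0).pdf
h = norm(loc=3.0, scale=1.0).pdf

n, theta_true = 500, 0.3
z = rng.random(n) < theta_true               # latent component labels
x = np.where(z, rng.normal(0.0, 1.0, n),     # draw from g when z_i = 1
                rng.normal(3.0, 1.0, n))     # draw from h when z_i = 0

theta = 0.5                                  # arbitrary starting value
for j in range(200):
    # E-step: w_i = theta g(x_i) / [theta g(x_i) + (1 - theta) h(x_i)]
    w = theta * g(x) / (theta * g(x) + (1 - theta) * h(x))
    # M-step: the update from part (b) is the average posterior weight
    theta_new = w.mean()
    if abs(theta_new - theta) < 1e-10:       # stop once the sequence stabilizes
        theta = theta_new
        break
    theta = theta_new

print(f"EM estimate after {j + 1} iterations: {theta:.4f} (true value {theta_true})")
```

The loop body is exactly the map from part (b); at convergence the printed value is a fixed point of that map and hence, per part (c), solves the likelihood equation.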

Related Book:

Theory of Point Estimation, 2nd Edition
Erich L. Lehmann and George Casella
ISBN: 9780387985022
