
Question:

Refer to the previous exercise. Unlike the conditional ML estimator of β₁, the unconditional ML estimator is inconsistent (Andersen 1980, pp. 244–245).

a. Averaging over the population, explain why

π₂₁ = E[ {1/[1 + exp(β₀ᵢ)]} · {exp(β₀ᵢ + β₁)/[1 + exp(β₀ᵢ + β₁)]} ],

where the expectation refers to the distribution of {β₀ᵢ} and {π_ab} are the probabilities for the population analog of Table 9.1. Similarly, state π₁₂.

For a random sample of size n → ∞, explain why n₂₁/n₁₂ converges in probability to exp(β₁).
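The key to (a) is that the ratio of the two discordant-cell probabilities equals exp(β₁) for every value of the subject intercept β₀ᵢ, so the ratio survives averaging over any distribution for {β₀ᵢ}. A minimal numeric sketch (the β₀ᵢ values and β₁ = 0.7 are arbitrary illustrative choices):

```python
import math

def cell_probs(a, beta1):
    """Given subject intercept a = beta_0i, return (pi21_i, pi12_i), the
    probabilities of the two discordant response patterns (0,1) and (1,0)."""
    p1 = math.exp(a) / (1 + math.exp(a))                  # P(y_i1 = 1)
    p2 = math.exp(a + beta1) / (1 + math.exp(a + beta1))  # P(y_i2 = 1)
    return (1 - p1) * p2, p1 * (1 - p2)

beta1 = 0.7
for a in (-2.0, -0.5, 0.0, 1.3, 3.0):
    p21, p12 = cell_probs(a, beta1)
    # The ratio is exp(beta1) regardless of a, so pi21/pi12 = exp(beta1)
    # after averaging, and n21/n12 -> exp(beta1) as n -> infinity.
    print(a, p21 / p12)   # each ratio equals exp(0.7)
```

Because the integrand ratio is constant in β₀ᵢ, no assumption about the mixing distribution is needed.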

b. Find the log-likelihood. Show that the likelihood equations are y₊ⱼ = Σᵢ P(yᵢⱼ = 1) and yᵢ₊ = Σⱼ P(yᵢⱼ = 1). Substituting exp(β₀ᵢ)/[1 + exp(β₀ᵢ)] + exp(β₀ᵢ + β₁)/[1 + exp(β₀ᵢ + β₁)] in the second likelihood equation, show that β̂₀ᵢ = −∞ for the n₂₂ subjects with yᵢ₊ = 0, β̂₀ᵢ = ∞ for the n₁₁ subjects with yᵢ₊ = 2, and β̂₀ᵢ = −β̂₁/2 for the n₂₁ + n₁₂ subjects with yᵢ₊ = 1.
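The three solutions in (b) can be checked directly from the second likelihood equation, yᵢ₊ = P(yᵢ₁ = 1) + P(yᵢ₂ = 1), viewed as an equation in a = β₀ᵢ with β₁ held fixed (β₁ = 1.4 below is an arbitrary illustrative value):

```python
import math

def fitted_total(a, beta1):
    """Right-hand side of the second likelihood equation:
    P(y_i1 = 1) + P(y_i2 = 1) as a function of a = beta_0i."""
    return (math.exp(a) / (1 + math.exp(a))
            + math.exp(a + beta1) / (1 + math.exp(a + beta1)))

beta1 = 1.4
# For y_i+ = 1, a = -beta1/2 solves the equation exactly: the two
# probabilities become expit(-beta1/2) and expit(beta1/2), which sum to 1.
print(fitted_total(-beta1 / 2, beta1))  # 1.0
# As a -> -infinity the total -> 0 (so y_i+ = 0 forces beta0i_hat = -inf);
# as a -> +infinity the total -> 2 (so y_i+ = 2 forces beta0i_hat = +inf).
print(fitted_total(-30, beta1), fitted_total(30, beta1))
```

The symmetry expit(−x) + expit(x) = 1 is what pins the discordant subjects' intercepts at −β̂₁/2.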

c. By breaking Σᵢ P(yᵢⱼ = 1) into components for the sets of subjects having yᵢ₊ = 0, yᵢ₊ = 2, and yᵢ₊ = 1, show that the first likelihood equation is, for j = 1,

y₊₁ = n₂₂(0) + n₁₁(1) + (n₂₁ + n₁₂) exp(−β̂₁/2)/[1 + exp(−β̂₁/2)].

Explain why y₊₁ = n₁₁ + n₁₂, and solve the first likelihood equation to show that β̂₁ = 2 log(n₂₁/n₁₂). Hence, as a result of (a), β̂₁ converges in probability to 2β₁, twice the true value, so the unconditional ML estimator is inconsistent.
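The algebra in (c) can be verified numerically: with hypothetical cell counts (the values below are arbitrary), β̂₁ = 2 log(n₂₁/n₁₂) satisfies the first likelihood equation exactly.

```python
import math

# Hypothetical cell counts for the 2x2 table of Table 9.1 (illustrative only)
n11, n12, n21, n22 = 25, 10, 20, 45

beta1_hat = 2 * math.log(n21 / n12)   # the unconditional ML estimate from (c)

# First likelihood equation for j = 1:
# left side:  y_+1 = n11 + n12 (subjects with y_i1 = 1)
# right side: n11 plus the discordant subjects' fitted contribution
lhs = n11 + n12
rhs = n11 + (n21 + n12) * math.exp(-beta1_hat / 2) / (1 + math.exp(-beta1_hat / 2))
print(beta1_hat, lhs, rhs)   # lhs and rhs agree
```

By part (a), n₂₁/n₁₂ → exp(β₁), so β̂₁ = 2 log(n₂₁/n₁₂) → 2β₁, whereas the conditional ML estimator log(n₂₁/n₁₂) is consistent.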
