
Question

Genetic study. A genetic study has divided n = 197 animals into four categories, with observed counts y = (y1, y2, y3, y4) = (125, 18, 20, 34). A genetic model gives the population cell probabilities

  (1/2 + θ/4, (1 − θ)/4, (1 − θ)/4, θ/4),

and thus the sampling model is a multinomial distribution:

  p(y | θ) = n! / (y1! y2! y3! y4!) · (1/2 + θ/4)^y1 · ((1 − θ)/4)^y2 · ((1 − θ)/4)^y3 · (θ/4)^y4,

where n = y1 + y2 + y3 + y4. Assume the prior distribution for θ is Uniform(0, 1). To find the posterior distribution of θ, a Gibbs sampling algorithm can be implemented by splitting the first category into two counts (y0, y1 − y0) with probabilities (1/2, θ/4). Here y0 can be viewed as another parameter (a latent variable). Thus

  p(θ, y0 | y) ∝ n! / (y0! (y1 − y0)! y2! y3! y4!) · (1/2)^y0 · (θ/4)^(y1 − y0) · ((1 − θ)/4)^(y2 + y3) · (θ/4)^y4.

1. Derive the full conditional distributions of θ and y0.
2. Implement Gibbs sampling in R, Matlab, Python, or WinBUGS and obtain the posterior distribution of θ (plot the density).
3. Find the estimate and 95% credible interval of θ.

1 Approved Answer

In the given data, n = 197 animals are distributed multinomially into four categories with observed counts y = (y1, y2, y3, y4) = (125, 18, 20, 34), cell probabilities (1/2 + θ/4, (1 − θ)/4, (1 − θ)/4, θ/4), and a Uniform(0, 1) prior on θ. The sampling model is the multinomial distribution

  f(y | θ) = n! / (y1! y2! y3! y4!) · (1/2 + θ/4)^y1 · ((1 − θ)/4)^(y2 + y3) · (θ/4)^y4.

The log-likelihood (up to an additive constant) is then

  ℓ(θ | y) = y1 log(2 + θ) + (y2 + y3) log(1 − θ) + y4 log θ.

Differentiating with respect to θ, we get

  dℓ/dθ = y1/(2 + θ) − (y2 + y3)/(1 − θ) + y4/θ,

and the Fisher information is

  I(θ) = −d²ℓ/dθ² = y1/(2 + θ)² + (y2 + y3)/(1 − θ)² + y4/θ².

Although this log-likelihood can be maximized explicitly, we use the example to illustrate the EM algorithm. Regard the data as a multinomial experiment with five categories and complete data x = (y11, y12, y2, y3, y4), with cell probabilities (1/2, θ/4, (1 − θ)/4, (1 − θ)/4, θ/4). That is, we split the first category into two, but we can only observe the sum y1 = y11 + y12; the counts y11 and y12 are unobservable (latent). The complete-data likelihood is

  f(y11, y12, y2, y3, y4 | θ) = n! / (y11! y12! y2! y3! y4!) · (1/2)^y11 · (θ/4)^y12 · ((1 − θ)/4)^(y2 + y3) · (θ/4)^y4,

so the complete-data log-likelihood (up to a constant) is

  ℓ(θ | x) = (y12 + y4) log θ + (y2 + y3) log(1 − θ).

Since y12 is unobservable, we cannot maximize this directly. The obstacle is overcome by the E-step. Let θ′ be the current guess for θ. The E-step requires computing

  Q(θ | θ′) = E_θ′[ ℓ(θ; Y11, Y12, y2, y3, y4) | y1, y2, y3, y4 ]
            = (E_θ′[Y12 | y1] + y4) log θ + (y2 + y3) log(1 − θ).

We need the conditional expectation of Y12 given y1 at θ = θ′. This conditional distribution is Binomial with sample size y1 and success probability p = (θ′/4) / (1/2 + θ′/4) = θ′/(2 + θ′), hence the expected value is

  y12′ = E_θ′[Y12 | y1] = y1 θ′/(2 + θ′),

and the expression for Q(θ | θ′) becomes

  Q(θ | θ′) = (y12′ + y4) log θ + (y2 + y3) log(1 − θ).

In the M-step we maximize this with respect to θ, which gives

  θ″ = (y12′ + y4) / (y12′ + y2 + y3 + y4).

Iterating the E- and M-steps to convergence finally gives the estimate of θ.
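The two-step EM iteration derived above can be sketched in a few lines of Python (a minimal illustration; the function name `em`, the starting value, and the iteration count are my own choices, not from the source):

```python
# Observed counts from the genetic study
y1, y2, y3, y4 = 125, 18, 20, 34

def em(theta=0.5, n_iter=50):
    """Run the EM iteration for the genetic linkage model."""
    for _ in range(n_iter):
        # E-step: expected latent count, E[Y12 | y1, theta] = y1 * theta / (2 + theta)
        y12 = y1 * theta / (2 + theta)
        # M-step: closed-form maximizer of Q(theta | theta')
        theta = (y12 + y4) / (y12 + y2 + y3 + y4)
    return theta

theta_hat = em()
print(round(theta_hat, 4))  # ≈ 0.6268, the MLE for this data
```

Because the M-step is available in closed form, each iteration is one line of arithmetic; 50 iterations is far more than needed for this example to converge.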

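For the Gibbs-sampling question itself, the joint density p(θ, y0 | y) stated in the question factors into the full conditionals θ | y0, y ~ Beta(y1 − y0 + y4 + 1, y2 + y3 + 1) and y0 | θ, y ~ Binomial(y1, (1/2)/(1/2 + θ/4)). A minimal stdlib-only Python sketch under those conditionals (sampler settings and variable names are illustrative, not prescribed by the source):

```python
import random

# Observed counts from the genetic study
y1, y2, y3, y4 = 125, 18, 20, 34

def gibbs(n_samples=5000, burn_in=500, seed=1):
    """Alternate draws from the two full conditionals; return post-burn-in theta draws."""
    rng = random.Random(seed)
    theta, draws = 0.5, []
    for i in range(n_samples + burn_in):
        # y0 | theta, y ~ Binomial(y1, (1/2) / (1/2 + theta/4)) = Binomial(y1, 2/(2+theta))
        p = 2.0 / (2.0 + theta)
        y0 = sum(rng.random() < p for _ in range(y1))
        # theta | y0, y ~ Beta(y1 - y0 + y4 + 1, y2 + y3 + 1)
        theta = rng.betavariate(y1 - y0 + y4 + 1, y2 + y3 + 1)
        if i >= burn_in:
            draws.append(theta)
    return draws

draws = sorted(gibbs())
mean = sum(draws) / len(draws)
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"posterior mean ~ {mean:.3f}, 95% credible interval ~ ({lo:.3f}, {hi:.3f})")
```

The printed mean and interval answer part 3; a histogram or kernel density plot of `draws` (e.g. with matplotlib) would answer the density-plot request in part 2.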
