
Question


Exercise 11.6 EM for a finite scale mixture of Gaussians (Source: Jaakkola.) Consider the graphical model in Figure 11.23, which defines the following model:

$$p(x \mid \theta) = \sum_{j=1}^{m} \sum_{k=1}^{K} p_j\, q_k\, \mathcal{N}(x; \mu_j, \sigma_k^2) \qquad (11.125)$$

where $\theta = \{p_1, \ldots, p_m, \mu_1, \ldots, \mu_m, q_1, \ldots, q_K, \sigma_1^2, \ldots, \sigma_K^2\}$ are all the parameters. Here $p_j \triangleq P(J = j)$ and $q_k \triangleq P(K = k)$ are the equivalent of mixture weights. We can think of this as a mixture of $m$ non-Gaussian components, where each component distribution is a scale mixture,

$$p(x \mid J = j, \theta) = \sum_{k=1}^{K} q_k\, \mathcal{N}(x; \mu_j, \sigma_k^2),$$

combining Gaussians with different variances (scales). We will now derive a generalized EM algorithm for this model. (Recall that in generalized EM, we do a partial update in the M step, rather than finding the exact maximum.)

a. Derive an expression for the responsibilities, $P(J_n = j, K_n = k \mid x_n, \theta)$, needed for the E step.

b. Write out a full expression for the expected complete-data log likelihood,

$$Q(\theta^{\text{new}}, \theta^{\text{old}}) = E\left[ \sum_n \log p(x_n, J_n, K_n \mid \theta^{\text{new}}) \right] \qquad (11.126)$$

c. Solving the M step would require us to jointly optimize the means $\mu_1, \ldots, \mu_m$ and the variances $\sigma_1^2, \ldots, \sigma_K^2$. It turns out to be simpler to first solve for the $\mu_j$'s given fixed $\sigma_k^2$'s, and subsequently solve for the $\sigma_k^2$'s given the new $\mu_j$'s. For brevity, we will just do the first part. Derive an expression for the maximizing $\mu_j$'s given fixed $\sigma_{1:K}^2$, i.e., solve $\partial Q / \partial \mu_j = 0$.
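All three parts start from the complete-data model implied by Figure 11.23. Since the latent indicators $J_n$ and $K_n$ are independent a priori, the joint density of one observation and its indicators factorizes as (this is just (11.125) before marginalizing out $j$ and $k$):

$$p(x_n, J_n = j, K_n = k \mid \theta) = p_j\, q_k\, \mathcal{N}(x_n; \mu_j, \sigma_k^2).$$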

Step by Step Solution

There are three steps involved.

Step: 1

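For part (a), a minimal sketch assuming the joint model above: writing $r^n_{jk}$ as shorthand (my notation) for the responsibility $P(J_n = j, K_n = k \mid x_n, \theta)$, Bayes' rule gives

$$r^n_{jk} = \frac{p_j\, q_k\, \mathcal{N}(x_n; \mu_j, \sigma_k^2)}{\sum_{j'=1}^{m} \sum_{k'=1}^{K} p_{j'}\, q_{k'}\, \mathcal{N}(x_n; \mu_{j'}, \sigma_{k'}^2)}.$$

The denominator is exactly the marginal likelihood $p(x_n \mid \theta)$ from (11.125).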


Step: 2

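For part (b), expanding $\log p(x_n, J_n, K_n \mid \theta^{\text{new}})$ with the factorized joint and taking the expectation under the old responsibilities gives the standard EM form:

$$Q(\theta^{\text{new}}, \theta^{\text{old}}) = \sum_{n=1}^{N} \sum_{j=1}^{m} \sum_{k=1}^{K} r^n_{jk} \left[ \log p_j + \log q_k - \tfrac{1}{2}\log(2\pi\sigma_k^2) - \frac{(x_n - \mu_j)^2}{2\sigma_k^2} \right],$$

where every $r^n_{jk}$ is evaluated at $\theta^{\text{old}}$ and the bracketed parameters are the new ones.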

Step: 3

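For part (c), the usual stationary-point calculation: hold the $\sigma_k^2$ fixed and set $\partial Q / \partial \mu_j = 0$. Only the quadratic term depends on $\mu_j$, so

$$\sum_{n} \sum_{k} r^n_{jk}\, \frac{x_n - \mu_j}{\sigma_k^2} = 0 \quad\Longrightarrow\quad \hat{\mu}_j = \frac{\sum_{n} \sum_{k} r^n_{jk}\, x_n / \sigma_k^2}{\sum_{n} \sum_{k} r^n_{jk} / \sigma_k^2},$$

a precision-weighted average of the data assigned to component $j$: points explained by a small-variance scale component count for more.

To make the updates concrete, here is a minimal NumPy sketch of one generalized EM iteration under the model above. It is an illustration under my own naming and shape conventions (e_step, partial_m_step, x of shape (N,), mu of shape (m,), sigma2 of shape (K,)), not the verified solution; the variance update, which the exercise skips, is omitted, so sigma2 stays fixed throughout.

```python
import numpy as np

def e_step(x, p, q, mu, sigma2):
    """Responsibilities r[n, j, k] = P(J_n = j, K_n = k | x_n, theta)."""
    diff2 = (x[:, None, None] - mu[None, :, None]) ** 2       # (N, m, 1)
    var = sigma2[None, None, :]                               # (1, 1, K)
    log_pdf = -0.5 * (np.log(2 * np.pi * var) + diff2 / var)  # log N(x_n; mu_j, sigma_k^2)
    log_r = np.log(p)[None, :, None] + np.log(q)[None, None, :] + log_pdf
    log_r -= log_r.max(axis=(1, 2), keepdims=True)            # for numerical stability
    r = np.exp(log_r)
    return r / r.sum(axis=(1, 2), keepdims=True)              # normalize over (j, k)

def partial_m_step(x, r, sigma2):
    """Generalized M step: update mu_j with the variances held fixed (part c)."""
    w = r / sigma2[None, None, :]                              # r^n_{jk} / sigma_k^2
    mu = (w * x[:, None, None]).sum(axis=(0, 2)) / w.sum(axis=(0, 2))
    # Closed-form mixture-weight updates (standard EM, not asked for in part c):
    p = r.sum(axis=(0, 2)) / len(x)
    q = r.sum(axis=(0, 1)) / len(x)
    return p, q, mu
```

A quick usage sketch on synthetic data (all sizes and initial values are illustrative):

```python
rng = np.random.default_rng(0)
x = rng.normal(size=500)
p, q = np.full(3, 1.0 / 3), np.full(2, 0.5)
mu, sigma2 = np.array([-1.0, 0.0, 1.0]), np.array([0.5, 2.0])
for _ in range(50):                      # generalized EM iterations
    r = e_step(x, p, q, mu, sigma2)
    p, q, mu = partial_m_step(x, r, sigma2)
```

A full generalized EM sweep would follow each mean update with a variance update given the new means, as the exercise's part (c) preamble describes.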


