In some variance component models, several pedigrees share the same theoretical mean vector and variance matrix

Question:

In some variance component models, several pedigrees share the same theoretical mean vector $\mu$ and variance matrix $\Omega$. Maximum likelihood computations can be accelerated by taking advantage of this redundancy. In concrete terms, we would like to replace a random sample $y_1,\dots,y_k$ from a multivariate normal distribution with a smaller random sample $z_1,\dots,z_l$ in a manner that leaves the loglikelihood invariant up to a known multiplicative constant. Show that this requirement can be expressed formally as

$$-\frac{k}{2}\ln|\det\Omega| - \frac{k}{2}\operatorname{tr}\Bigl[\Omega^{-1}\frac{1}{k}\sum_{j=1}^{k}(y_j-\mu)(y_j-\mu)^t\Bigr]
= -c\Bigl\{\frac{l}{2}\ln|\det\Omega| + \frac{l}{2}\operatorname{tr}\Bigl[\Omega^{-1}\frac{1}{l}\sum_{j=1}^{l}(z_j-\mu)(z_j-\mu)^t\Bigr]\Bigr\}$$
for some constant $c$. Matching terms involving $\ln|\det\Omega|$ forces the choice $c = k/l$. Given $c$, we then take

$$\frac{1}{l}\sum_{j=1}^{l}(z_j-\mu)(z_j-\mu)^t = \frac{1}{k}\sum_{j=1}^{k}(y_j-\mu)(y_j-\mu)^t.$$
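The displayed requirement rests on writing the Gaussian loglikelihood in trace form, using the identity $\sum_j d_j^t W d_j = \operatorname{tr}\bigl(W \sum_j d_j d_j^t\bigr)$. A minimal numerical sketch (numpy; the helper name `loglik_core` is my own, not part of the problem) checking that the trace form agrees with the usual sum of quadratic forms:

```python
import numpy as np

rng = np.random.default_rng(1)
k, m = 8, 3
mu = rng.normal(size=m)
A = rng.normal(size=(m, m))
omega = A @ A.T + m * np.eye(m)          # a positive definite Omega

y = rng.normal(size=(k, m))

def loglik_core(sample, mu, omega):
    """Omega-dependent part of the normal loglikelihood, in trace form."""
    n = len(sample)
    d = sample - mu
    s_mu = d.T @ d / n                   # (1/n) sum_j (x_j - mu)(x_j - mu)^t
    logdet = np.linalg.slogdet(omega)[1]
    return -n / 2 * logdet - n / 2 * np.trace(np.linalg.solve(omega, s_mu))

# The same quantity via the usual quadratic forms, term by term.
direct = (-k / 2 * np.linalg.slogdet(omega)[1]
          - 0.5 * sum(d @ np.linalg.solve(omega, d) for d in y - mu))
assert np.isclose(loglik_core(y, mu, omega), direct)
```

Once both samples are written this way, equating the two sides for all $\Omega$ amounts exactly to matching the $\ln|\det\Omega|$ coefficients and the $\mu$-centered second moments.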
Prove that this last equality holds for all $\mu$ if and only if $\bar z = \bar y$ and

$$\frac{1}{l}\sum_{j=1}^{l}(z_j-\bar z)(z_j-\bar z)^t = \frac{1}{k}\sum_{j=1}^{k}(y_j-\bar y)(y_j-\bar y)^t = S. \tag{8.18}$$
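One route to the "if and only if" is the decomposition $\frac{1}{n}\sum_j (x_j-\mu)(x_j-\mu)^t = S_x + (\bar x-\mu)(\bar x-\mu)^t$, which reduces equality for all $\mu$ to matching sample means and sample variances. A quick numpy check of this decomposition (variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 6, 3
x = rng.normal(size=(n, m))
mu = rng.normal(size=m)
xbar = x.mean(axis=0)

S_x = (x - xbar).T @ (x - xbar) / n      # sample variance
d = x - mu
lhs = d.T @ d / n                        # mu-centered second moment
rhs = S_x + np.outer(xbar - mu, xbar - mu)
assert np.allclose(lhs, rhs)
```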
Until this point, we have not specified the reduced sample size l. If each yj has m components, we claim that we can take l = m + 1.
This claim is based on constructing $m+1$ vectors $v_1,\dots,v_{m+1}$ in $\mathbb{R}^m$ satisfying

$$\sum_{j=1}^{m+1} v_j = 0, \qquad \frac{m}{m+1}\sum_{j=1}^{m+1} v_j v_j^t = I_{m\times m}, \tag{8.19}$$

where $I_{m\times m}$ is the $m \times m$ identity matrix. Given these vectors and given the Cholesky decomposition $S = MM^t$ of the sample variance of the sequence $y_1,\dots,y_k$, we define $z_j = \sqrt{m}\,M v_j + \bar y$. Show that this construction yields the sample mean equality $\bar z = \bar y$ and the sample variance equality (8.18).
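To see the construction in action for $m = 2$, note that three unit vectors spaced $120^\circ$ apart happen to satisfy (8.19); that particular choice is an assumption of this sketch, not part of the problem. The numpy check below confirms that $z_j = \sqrt{m}\,M v_j + \bar y$ reproduces the sample mean $\bar y$ and the sample variance $S$:

```python
import numpy as np

rng = np.random.default_rng(3)
m = 2
k = 9
y = rng.normal(size=(k, m))
ybar = y.mean(axis=0)
S = (y - ybar).T @ (y - ybar) / k        # sample variance of y

# Three unit vectors 120 degrees apart: they sum to zero and
# (m/(m+1)) sum_j v_j v_j^t = I, so they satisfy (8.19) for m = 2.
angles = np.pi / 2 + 2 * np.pi * np.arange(3) / 3
v = np.column_stack([np.cos(angles), np.sin(angles)])
assert np.allclose(v.sum(axis=0), 0)
assert np.allclose(m / (m + 1) * v.T @ v, np.eye(m))

M = np.linalg.cholesky(S)                # S = M M^t
z = np.sqrt(m) * v @ M.T + ybar          # rows are z_j = sqrt(m) M v_j + ybar

assert np.allclose(z.mean(axis=0), ybar)             # zbar = ybar
assert np.allclose((z - ybar).T @ (z - ybar) / (m + 1), S)   # (8.18)
```

The reduction is dramatic: the $k$ original observations are replaced by only $m + 1$ synthetic ones with the same sufficient statistics.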
Thus, it remains to construct the sequence $v_1,\dots,v_{m+1}$. Although geometrically the vectors form the vertices of a regular tetrahedron, we proceed in a purely algebraic fashion. For $m = 1$, the trivial choice $v_1 = (1)$ and $v_2 = (-1)$ clearly meets the stated requirements. If $v_1,\dots,v_m$ work in $\mathbb{R}^{m-1}$, then verify by induction that the vectors

$$w_j = \begin{pmatrix} \sqrt{1 - m^{-2}}\, v_j \\ -m^{-1} \end{pmatrix}, \quad 1 \le j \le m, \qquad w_{m+1} = \begin{pmatrix} 0 \\ \vdots \\ 0 \\ 1 \end{pmatrix}$$

satisfy the two equalities (8.19) in $\mathbb{R}^m$.
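The inductive construction translates directly into a short recursion. The sketch below (the function name `simplex_vectors` is my own) builds the vectors and checks both equalities in (8.19) for several values of $m$:

```python
import numpy as np

def simplex_vectors(m):
    """m + 1 vectors in R^m with sum 0 and (m/(m+1)) sum_j v_j v_j^t = I."""
    if m == 1:
        return np.array([[1.0], [-1.0]])      # base case v_1 = (1), v_2 = (-1)
    v = simplex_vectors(m - 1)                # m vectors working in R^(m-1)
    w = np.zeros((m + 1, m))
    w[:m, : m - 1] = np.sqrt(1 - m ** -2) * v # w_j = (sqrt(1 - m^-2) v_j, -1/m)
    w[:m, m - 1] = -1.0 / m
    w[m, m - 1] = 1.0                         # w_{m+1} = (0, ..., 0, 1)
    return w

for m in range(1, 6):
    w = simplex_vectors(m)
    assert np.allclose(w.sum(axis=0), 0)                   # first equality
    assert np.allclose(m / (m + 1) * w.T @ w, np.eye(m))   # second equality
```

Here `w.T @ w` is exactly $\sum_j w_j w_j^t$ with the vectors stored as rows, so the two asserts mirror the two conditions in (8.19).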
