Question
Let $X_0, X_1, \ldots$ be a Markov chain with state space $S$, such that $i_j$ is the value that the $j$th state $X_j$ takes. One of the properties that it satisfies is the Markov property:
$$P(X_n = i_n \mid X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_n = i_n \mid X_{n-1} = i_{n-1}), \quad \text{for all } i_0, i_1, \ldots, i_n \in S,\ n \in \mathbb{Z}_{>0}.$$
Use the Markov property and the total probability theorem to prove the following.
a) $P(X_3 = i_3 \mid X_2 = i_2, X_1 = i_1) = P(X_3 = i_3 \mid X_2 = i_2)$, for all $i_1, i_2, i_3 \in S$.
Note: This is not exactly the Markov property, because it does not condition on $X_0$.
b) $P(X_3 = i_3 \mid X_1 = i_1, X_0 = i_0) = P(X_3 = i_3 \mid X_1 = i_1)$, for all $i_0, i_1, i_3 \in S$.
c) $P(X_1 = i_1 \mid X_2 = i_2, X_3 = i_3) = P(X_1 = i_1 \mid X_2 = i_2)$, for all $i_1, i_2, i_3 \in S$.
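One way to sketch the three derivations, written as a LaTeX fragment, is given below. It is an outline rather than a full formal solution; it uses only the stated Markov property, the total probability theorem, and the definition of conditional probability, and it assumes every conditioning event has positive probability so that the conditional probabilities are defined.

% Sketch of the three derivations (assumes every conditioning event has
% positive probability, so each conditional probability is well defined).

% (a) Sum out X_0 with the total probability theorem, then apply the
%     Markov property with n = 3 to the first factor.
\begin{align*}
P(X_3=i_3 \mid X_2=i_2, X_1=i_1)
  &= \sum_{i_0 \in S} P(X_3=i_3 \mid X_2=i_2, X_1=i_1, X_0=i_0)\,
                      P(X_0=i_0 \mid X_2=i_2, X_1=i_1) \\
  &= P(X_3=i_3 \mid X_2=i_2) \sum_{i_0 \in S} P(X_0=i_0 \mid X_2=i_2, X_1=i_1)
   = P(X_3=i_3 \mid X_2=i_2).
\end{align*}

% (b) Sum out X_2; the Markov property (with n = 3 and n = 2) reduces both
%     factors. Decomposing P(X_3=i_3 | X_1=i_1) over X_2 in the same way
%     and applying part (a) gives the identical sum, so the sides agree.
\begin{align*}
P(X_3=i_3 \mid X_1=i_1, X_0=i_0)
  &= \sum_{i_2 \in S} P(X_3=i_3 \mid X_2=i_2, X_1=i_1, X_0=i_0)\,
                      P(X_2=i_2 \mid X_1=i_1, X_0=i_0) \\
  &= \sum_{i_2 \in S} P(X_3=i_3 \mid X_2=i_2)\, P(X_2=i_2 \mid X_1=i_1)
   = P(X_3=i_3 \mid X_1=i_1).
\end{align*}

% (c) Write the left side as a ratio of joint probabilities; part (a)
%     makes the factors involving X_3 cancel.
\begin{align*}
P(X_1=i_1 \mid X_2=i_2, X_3=i_3)
  &= \frac{P(X_3=i_3 \mid X_2=i_2, X_1=i_1)\, P(X_2=i_2, X_1=i_1)}
          {P(X_3=i_3 \mid X_2=i_2)\, P(X_2=i_2)} \\
  &= \frac{P(X_1=i_1, X_2=i_2)}{P(X_2=i_2)}
   = P(X_1=i_1 \mid X_2=i_2).
\end{align*}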
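As a follow-up check (not part of the original question), the three identities can also be verified numerically on a small chain by enumerating all paths of length four. The 3-state transition matrix and initial distribution below are made-up illustrative values, not taken from the problem.

# Numerical sanity check of identities (a)-(c) on an arbitrary 3-state chain.
from itertools import product

import numpy as np

S = range(3)                                  # state space {0, 1, 2}
P = np.array([[0.5, 0.3, 0.2],                # P[i, j] = P(X_{n+1}=j | X_n=i)
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
pi0 = np.array([0.2, 0.5, 0.3])               # distribution of X_0

def joint(i0, i1, i2, i3):
    """Joint probability of the full path (i0, i1, i2, i3)."""
    return pi0[i0] * P[i0, i1] * P[i1, i2] * P[i2, i3]

def prob(fixed):
    """P(X_t = s for all (t, s) in `fixed`), by summing over matching paths."""
    return sum(joint(*path) for path in product(S, repeat=4)
               if all(path[t] == s for t, s in fixed.items()))

def cond(target, given):
    """Conditional probability P(target | given)."""
    return prob({**target, **given}) / prob(given)

for i1, i2, i3 in product(S, repeat=3):
    # (a) P(X3=i3 | X2=i2, X1=i1) = P(X3=i3 | X2=i2)
    assert np.isclose(cond({3: i3}, {2: i2, 1: i1}), cond({3: i3}, {2: i2}))
    # (c) P(X1=i1 | X2=i2, X3=i3) = P(X1=i1 | X2=i2)
    assert np.isclose(cond({1: i1}, {2: i2, 3: i3}), cond({1: i1}, {2: i2}))

for i0, i1, i3 in product(S, repeat=3):
    # (b) P(X3=i3 | X1=i1, X0=i0) = P(X3=i3 | X1=i1)
    assert np.isclose(cond({3: i3}, {1: i1, 0: i0}), cond({3: i3}, {1: i1}))

print("All three identities hold numerically for this chain.")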