Question:
Consider the Markov chain which at each transition either goes up 1 with probability p or down 1 with probability q = 1 − p, and let S_n denote its position after n transitions. Argue that (q/p)^{S_n}, n ≥ 1, is a martingale.
Step by Step Answer:
Let X_n = (q/p)^{S_n}. From state S_n, the chain moves to S_n + 1 with probability p and to S_n − 1 with probability q, so conditioning on the history up to time n gives

E[X_{n+1} | S_1, ..., S_n] = p (q/p)^{S_n + 1} + q (q/p)^{S_n − 1}
                           = (q/p)^{S_n} [ p (q/p) + q (p/q) ]
                           = (q/p)^{S_n} (q + p)
                           = (q/p)^{S_n} = X_n.

Moreover, |S_n| ≤ |S_0| + n, so X_n is bounded for each fixed n and hence E[|X_n|] < ∞. The two conditions together show that (q/p)^{S_n}, n ≥ 1, is a martingale.
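The one-step calculation can be sanity-checked numerically. The sketch below (with hypothetical values p = 0.3 and current state s = 4; the function name is mine, not from the text) verifies that the conditional expectation of (q/p)^{S_{n+1}} given S_n = s equals (q/p)^s:

```python
import math

def conditional_expectation(p, s):
    """E[(q/p)^{S_{n+1}} | S_n = s] for the up-1 / down-1 chain."""
    q = 1.0 - p
    r = q / p
    # next state is s + 1 with probability p, s - 1 with probability q
    return p * r ** (s + 1) + q * r ** (s - 1)

p, s = 0.3, 4                      # hypothetical parameter and state
lhs = conditional_expectation(p, s)
rhs = ((1.0 - p) / p) ** s         # (q/p)^s, the martingale property target
assert math.isclose(lhs, rhs)
```

The assertion holds for any 0 < p < 1 and any integer s, mirroring the algebraic identity p(q/p) + q(p/q) = q + p = 1.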