Question:

Consider the Markov chain which at each transition either goes up 1 with probability p or down 1 with probability q = 1 - p. Argue that (q/p)^(S_n), n >= 1, is a martingale, where S_n denotes the state of the chain after the nth transition.

Step by Step Answer:

Let S_n denote the state of the chain after n transitions and set M_n = (q/p)^(S_n). Each transition moves the chain up 1 with probability p or down 1 with probability q, so conditioning on the history up to time n gives

E[M_{n+1} | S_1, ..., S_n] = p (q/p)^(S_n + 1) + q (q/p)^(S_n - 1)
= (q/p)^(S_n) [ p (q/p) + q (p/q) ]
= (q/p)^(S_n) (q + p)
= (q/p)^(S_n)
= M_n.

Since M_n > 0 and, by the above, E[M_n] = (q/p)^(S_0) < infinity for every n, the process {M_n, n >= 1} is a martingale.
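The one-step identity above can be checked numerically. The sketch below uses exact rational arithmetic so the equality holds with no rounding error; the function name and the particular choice p = 1/3 are illustrative, not part of the original problem.

```python
from fractions import Fraction

def one_step_expectation(p, s):
    """Exact E[(q/p)^(S_{n+1}) | S_n = s] for the +/-1 random walk."""
    q = 1 - p
    r = q / p                      # the martingale ratio q/p
    # Up 1 with probability p, down 1 with probability q:
    return p * r ** (s + 1) + q * r ** (s - 1)

# The martingale property says this equals (q/p)^s in every state s.
p = Fraction(1, 3)                 # illustrative choice; any 0 < p < 1 works
r = (1 - p) / p                    # here r = 2
for s in range(-5, 6):
    assert one_step_expectation(p, s) == r ** s
```

Using Fraction rather than floats makes the check an exact identity, mirroring the algebraic cancellation p(q/p) + q(p/q) = q + p = 1.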

Related Book:

Stochastic Processes

ISBN: 9780471120629

2nd Edition

Authors: Sheldon M. Ross
