Question:
2. A Markov chain with transition probability matrix P = (p_ij) is called regular if, for some positive integer n, p_ij^n > 0 for all i and j. Let {X_n : n = 0, 1, . . .} be a Markov chain with state space {0, 1} and transition probability matrix
Is {X_n : n = 0, 1, . . .} regular? Why or why not?
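One way to test regularity numerically is to check whether some power of P has all entries strictly positive; for an m-state chain, Wielandt's theorem says it suffices to check powers up to (m - 1)^2 + 1. The sketch below is a minimal illustration, not the book's solution: since the exercise's transition matrix is not reproduced above, it uses a hypothetical two-state matrix, and the helper name `is_regular` is an assumption.

```python
import numpy as np

def is_regular(P, max_power=None):
    """Check whether the finite Markov chain with transition matrix P is regular,
    i.e. whether P^n has all entries strictly positive for some positive integer n.

    By Wielandt's theorem it is enough to check powers up to (m - 1)^2 + 1,
    where m is the number of states.
    """
    P = np.asarray(P, dtype=float)
    m = P.shape[0]
    if max_power is None:
        max_power = (m - 1) ** 2 + 1
    Q = np.eye(m)
    for n in range(1, max_power + 1):
        Q = Q @ P                  # Q now holds P^n
        if np.all(Q > 0):
            return True, n         # regular: P^n > 0 entrywise
    return False, None             # no power up to the bound is positive

# Hypothetical example (not the matrix from the exercise): a two-state chain
# that flips state deterministically. Every power of this P contains zeros,
# so the chain is not regular.
P = [[0, 1],
     [1, 0]]
print(is_regular(P))   # (False, None)
```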
Related Book For
Fundamentals of Probability with Stochastic Processes, 4th Edition
Authors: Saeed Ghahramani
ISBN: 9780429856273