
Question

4. Consider a discrete-time Markov chain with the following probability transition matrix:

    P = (4 x 4 transition matrix whose entries involve two parameters x and y; the matrix did not transcribe legibly, apart from a final row of 0 0 1 0)

Is it possible to choose values for x and y so that the Markov chain has the following properties? In each case, state the values of x and y, or give a brief reason why it is not possible.
(a) The Markov chain has period 2.
(b) The Markov chain is reducible.
(c) The Markov chain has at least one transient state.
(d) The Markov chain has invariant distribution (1/4, 1/4, 1/4, 1/4).

For each of the following transition matrices, determine whether the Markov chain with that transition matrix is regular.
(1) Is the Markov chain whose transition matrix is
        0    0.5  0.5
        0.5  0    0.5
        0    0    1
    regular? (Yes or No)
(2) Is the Markov chain whose transition matrix is
        0    1    0
        0.3  0    0.7
        0    0    1
    regular? (Yes or No)
(3) Is the Markov chain whose transition matrix is
        0    1    0
        0.6  0    0.4
        1    0    0
    regular? (Yes or No)
(4) Is the Markov chain whose transition matrix is
        0 1 0 0 0.6 0 0.4   (entries as transcribed; the rest of this matrix is not legible)
    regular? (Yes or No)
(5) Is the Markov chain whose transition matrix is
        0    1    0
        0.3  0.2  0.5
        0    1    0
    regular? (Yes or No)

2. (15 pts) Consider a Markov chain {Xn} with state space S = {0, 1, 2} and transition matrix P (a 3 x 3 matrix whose entries did not transcribe legibly).
(1) Let the mapping f : S → S satisfy f(0) = 0 and f(2) = 1, and assume that f(1) ≠ f(2). If Yn = f(Xn), when is {Yn} a Markov chain? Is {Yn} always a Markov chain? In other words, are functions of Markov chains always Markov chains?

2. Markov chain transitions. P = [Pij] = (3 x 3 matrix of simple fractions; the entries did not transcribe legibly). Let X1 be distributed uniformly over the states {0, 1, 2}, and let {Xi} be a Markov chain with transition matrix P; thus P(Xn+1 = j | Xn = i) = Pij for i, j in {0, 1, 2}.
(a) Is the information source stationary?
(b) Find the stationary distribution of the Markov chain.
(c) Find the entropy rate of the Markov chain.

Illustrative numerical checks for each of these questions are sketched below.
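The matrix for question 4 did not transcribe, but the checks it asks for are easy to automate once the matrix is known. The Python sketch below is illustrative only: the helper make_P and the matrix it builds are hypothetical stand-ins (parameterised by x and y, not the matrix from the question); is_irreducible and is_invariant then test reducibility and whether (1/4, 1/4, 1/4, 1/4) is invariant.

```python
import numpy as np

def is_irreducible(P):
    # A finite chain is irreducible iff every state can reach every other;
    # equivalently (I + P)^(n-1) has all entries strictly positive.
    n = P.shape[0]
    reach = np.linalg.matrix_power(np.eye(n) + P, n - 1)
    return bool(np.all(reach > 0))

def is_invariant(P, pi, tol=1e-12):
    # pi is an invariant distribution for P exactly when pi P = pi.
    pi = np.asarray(pi, dtype=float)
    return bool(np.allclose(pi @ P, pi, atol=tol))

def make_P(x, y):
    # Hypothetical 4-state matrix parameterised by x and y -- a stand-in
    # for the matrix in the question, which did not transcribe legibly.
    return np.array([
        [0.0,         x,   0.0, 1.0 - x],
        [1.0 - x - y, 0.0, y,   x],
        [0.0,         y,   0.0, 1.0 - y],
        [0.0,         0.0, 1.0, 0.0],
    ])

P = make_P(x=0.25, y=0.25)
assert np.allclose(P.sum(axis=1), 1.0)  # every row of a transition matrix sums to 1
print("irreducible:", is_irreducible(P))
print("(1/4, 1/4, 1/4, 1/4) invariant:", is_invariant(P, [0.25] * 4))
```

The period could be checked in the same spirit, for instance as the gcd of the lengths of return paths to a fixed state.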

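For the regularity questions, a transition matrix is regular when some power of it has all entries strictly positive; Wielandt's bound says that for an n-state chain it is enough to check powers up to n^2 - 2n + 2. A minimal NumPy sketch of that check, run on matrix (3) above, the one matrix of the five that transcribed in full:

```python
import numpy as np

def is_regular(P, max_power=None):
    # P is regular if some power of P is strictly positive.  Wielandt's
    # bound: a primitive n x n matrix satisfies P^(n^2 - 2n + 2) > 0,
    # so it suffices to check powers up to that exponent.
    n = P.shape[0]
    if max_power is None:
        max_power = n * n - 2 * n + 2
    Q = np.eye(n)
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

# Matrix (3) from the list above -- the only one transcribed in full.
P3 = np.array([[0.0, 1.0, 0.0],
               [0.6, 0.0, 0.4],
               [1.0, 0.0, 0.0]])
print("matrix (3) regular:", is_regular(P3))  # True: irreducible and aperiodic
```

The same function settles the other four matrices once their entries are confirmed.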

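For the question about Yn = f(Xn), the standard tool is strong lumpability (Kemeny and Snell): Yn is a Markov chain for every initial distribution exactly when, for each pair of blocks of the partition induced by f, every state in a block has the same total transition probability into the other block. Neither the matrix nor the value of f(1) transcribed, so the sketch below uses hypothetical stand-ins purely to show the check.

```python
import numpy as np

def is_lumpable(P, f, n_states):
    # Strong lumpability: for each pair of blocks (B, C) of the partition
    # induced by f, sum_{j in C} P[i, j] must not depend on which i in B.
    blocks = {}
    for i in range(n_states):
        blocks.setdefault(f(i), []).append(i)
    for B in blocks.values():
        for C in blocks.values():
            mass = [P[i, C].sum() for i in B]
            if not np.allclose(mass, mass[0]):
                return False
    return True

# Hypothetical stand-ins: the matrix and the value of f(1) in the question
# did not transcribe, so these are illustrative only.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
f = {0: 0, 1: 0, 2: 1}.get   # f(0) = 0, f(2) = 1, and here f(1) = 0
print("Yn = f(Xn) is Markov from every start:", is_lumpable(P, f, n_states=3))
```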

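For the last question, the stationary distribution is a left eigenvector of P with eigenvalue 1, and for a stationary Markov chain the entropy rate is H = -sum_i pi_i sum_j Pij log2(Pij). The matrix did not transcribe, so the sketch below assumes, purely for illustration, a doubly stochastic 3 x 3 matrix with entries 1/2 and 1/4; any doubly stochastic matrix has the uniform distribution as its stationary distribution, which is also what makes a uniformly distributed X1 yield a stationary source.

```python
import numpy as np

def stationary_distribution(P):
    # Left eigenvector of P for eigenvalue 1, normalised to sum to 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, k])
    return pi / pi.sum()

def entropy_rate_bits(P, pi):
    # H = -sum_i pi_i sum_j P_ij log2 P_ij, with 0 log 0 taken as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(P > 0, P * np.log2(P), 0.0)
    return float(-(pi @ terms.sum(axis=1)))

# Illustrative doubly stochastic matrix -- a stand-in for the matrix in the
# question, which did not transcribe legibly.
P = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])
pi = stationary_distribution(P)
print("stationary distribution:", np.round(pi, 4))       # uniform: 1/3 each
print("entropy rate (bits):", entropy_rate_bits(P, pi))  # H(1/2, 1/4, 1/4) = 1.5
```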