Question:

The following table is a transportation matrix showing the cost ($) per unit of shipping from Plant i to Distributor j. The objective is to minimize shipping costs while meeting the demand placed by each Distributor. There is no shipment between Plant 2 and Distributor 3.

            Distributor 1   Distributor 2   Distributor 3   Supply
Plant 1           9              22              24            430
Plant 2          14              10              --            360
Plant 3          18              12              13            220
Demand          400             390             190

Let x_ij = the number of units shipped from Plant i to Distributor j.

Solve the transportation problem and use the result to fill the boxes below.

All answers are equally weighted.

a) At optimal solution:

x11=

x12=

x13=

x21=

x22=

x23=

x31=

x32=

x33=

Minimum Cost = $

b) At optimal solution, how much unused capacity is available at each Plant?

Slack at Plant 1 = units

Slack at Plant 2 = units

Slack at Plant 3 = units
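The problem above can be solved with any LP or transportation solver. Below is a minimal sketch using SciPy's `linprog` (one possible tool; the question does not prescribe a method). The forbidden Plant 2 to Distributor 3 route is modeled here by fixing that variable's bounds to (0, 0), which is an assumption about how to encode the restriction; supply rows are inequalities because total supply (1010) exceeds total demand (980).

```python
import numpy as np
from scipy.optimize import linprog

# Unit shipping costs; the Plant 2 -> Distributor 3 entry is a placeholder
# because that route is disallowed (enforced via bounds, not via the cost).
cost = np.array([
    [ 9, 22, 24],
    [14, 10,  0],
    [18, 12, 13],
], dtype=float)
supply = [430, 360, 220]
demand = [400, 390, 190]

c = cost.flatten()  # variables ordered x11, x12, x13, x21, ..., x33

# Supply constraints: sum over j of x_ij <= supply_i
A_ub = np.zeros((3, 9))
for i in range(3):
    A_ub[i, 3 * i:3 * i + 3] = 1

# Demand constraints: sum over i of x_ij = demand_j
A_eq = np.zeros((3, 9))
for j in range(3):
    A_eq[j, j::3] = 1

bounds = [(0, None)] * 9
bounds[1 * 3 + 2] = (0, 0)  # no shipment from Plant 2 to Distributor 3

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=bounds, method="highs")
x = res.x.reshape(3, 3)
slack = np.array(supply) - x.sum(axis=1)  # unused capacity at each plant
print(x)        # optimal shipment plan
print(res.fun)  # minimum total cost
print(slack)    # slack per plant
```

For this data the solver should report x11 = 400, x22 = 360, x32 = 30, x33 = 190 (all other x_ij = 0), a minimum cost of $10,030, and slack of 30 units at Plant 1 with none at Plants 2 and 3; checking the reduced costs of the nonbasic routes confirms this plan is optimal.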

