Questions and Answers of Probability And Stochastic Modeling
3.4.4 A coin is tossed repeatedly until two successive heads appear. Find the mean number of tosses required. Hint: Let X_n be the cumulative number of successive heads. The state space is 0, 1, 2, and
3.4.3 Consider the Markov chain whose transition probability matrix is given by P. (a) Starting in state 1, determine the probability that the Markov chain ends in state 0. (b) Determine the mean time to
3.4.2 Consider the Markov chain whose transition probability matrix is given by P. (a) Starting in state 1, determine the probability that the Markov chain ends in state 0. (b) Determine the mean time to
3.4.1 Find the mean time to reach state 3 starting from state 0 for the Markov chain whose transition probability matrix is
P =
      0    1    2    3
 0  0.4  0.3  0.2  0.1
 1  0.0  0.7  0.2  0.1
 2  0.0  0.0  0.9  0.1
 3  0.0  0.0  0.0  1.0
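A minimal numerical sketch for exercise 3.4.1 (the use of numpy and the variable names are my own, not from the source): the mean times v_i to reach the absorbing state 3 satisfy v = 1 + Qv over the transient states {0, 1, 2}, where Q is P restricted to those states, so v solves (I − Q)v = 1; here the solution works out to 10 from each transient state.

    # Mean time to reach state 3, starting from each transient state 0, 1, 2.
    import numpy as np

    P = np.array([[0.4, 0.3, 0.2, 0.1],
                  [0.0, 0.7, 0.2, 0.1],
                  [0.0, 0.0, 0.9, 0.1],
                  [0.0, 0.0, 0.0, 1.0]])
    Q = P[:3, :3]                                   # transitions among transient states only
    v = np.linalg.solve(np.eye(3) - Q, np.ones(3))  # solve (I - Q) v = 1
    print(v)                                        # [10. 10. 10.]; v[0] is the mean time from state 0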
3.3.10 Consider a discrete-time, periodic review inventory model and let ξ_n be the total demand in period n, and let X_n be the inventory quantity on hand at the end of period n. An (s, S) inventory
3.3.9 Suppose that two urns A and B contain a total of N balls. Assume that at time t, there are exactly k balls in A. At time t + 1, a ball and an urn are chosen with probability depending on the
3.3.8 Two urns A and B contain a total of N balls. Assume that at time t, there were exactly k balls in A. At time t + 1, an urn is selected at random in proportion to its contents (i.e., A is chosen
3.3.7 A component in a system is placed into service, where it operates until its failure, whereupon it is replaced at the end of the period with a new component having statistically identical
3.3.6 Two teams, A and B, are to play a best of seven series of games. Suppose that the outcomes of successive games are independent, and each is won by A with probability p and won by B with
3.3.5 You are going to successively flip a quarter until the pattern HHT appears, that is, until you observe two successive heads followed by a tails. In order to calculate some properties of this
3.3.4 Consider the queueing model of Section 3.4. Now, suppose that at most a single customer arrives during a single period, but that the service time of a customer is a random variable Z with the
3.3.3 Consider the inventory model of Section 3.3.1. Suppose that unfulfilled demand is not back ordered but is lost. (a) Set up the corresponding transition probability matrix for the end-of-period
3.3.2 Three fair coins are tossed, and we let X_1 denote the number of heads that appear. Those coins that were heads on the first trial (there were X_1 of them) we pick up and toss again, and now we
3.3.1 An urn contains six tags, of which three are red and three are green. Two tags are selected from the urn. If one tag is red and the other is green, then the selected tags are discarded and two
3.3.5 An urn initially contains a single red ball and a single green ball. A ball is drawn at random, removed, and replaced by a ball of the opposite color, and this process repeats so that there are
3.3.4 Consider the inventory model of Section 3.3.1. Suppose that S = 3 and that the probability distribution for demand is Pr{ξ = 0} = 0.1, Pr{ξ = 1} = 0.4, Pr{ξ = 2} = 0.3, and Pr{ξ = 3} = 0.2. Set
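A sketch of how the transition matrix asked for in exercise 3.3.4 can be assembled, under assumptions the truncated statement above does not show: reorder point s = 0 and back-ordered demand, as in the book's Section 3.3.1 model, so that X_{n+1} = S − ξ when X_n ≤ s and X_{n+1} = X_n − ξ otherwise.

    # Build the transition matrix for the end-of-period inventory level X_n,
    # assuming s = 0, S = 3, and back-ordered demand (levels may go negative).
    demand = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}
    s, S = 0, 3
    states = list(range(s + 1 - max(demand), S + 1))    # -2, -1, 0, 1, 2, 3
    P = {x: {y: 0.0 for y in states} for x in states}
    for x in states:
        stock = S if x <= s else x                      # order up to S when at or below s
        for d, prob in demand.items():
            P[x][stock - d] += prob
    for x in states:
        print(x, [round(P[x][y], 2) for y in states])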
3.3.3 Consider the inventory model of Section 3.3.1. Suppose that S = 3. Set up the corresponding transition probability matrix for the end-of-period inventory level X_n.
3.3.2 Consider two urns A and B containing a total of N balls. An experiment is performed in which a ball is selected at random (all selections equally likely) at time t (t = 1, 2, …) from among
3.3.1 Consider a spare parts inventory model in which either 0, 1, or 2 repair parts are demanded in any period, with the demand probabilities as given, and suppose s = 0 and S = 3. Determine the transition probability matrix for the
3.2.5 A Markov chain has the transition probability matrix P. The Markov chain starts at time zero in state X_0 = 0. Let T = min{n ≥ 0 : X_n = 2} be the first time that the process reaches state 2.
3.2.4 Suppose X_n is a two-state Markov chain whose transition probability matrix is P. Then, Z_n = (X_{n−1}, X_n) is a Markov chain having the four states (0, 0), (0, 1), (1, 0), and (1, 1). Determine the
3.2.3 Let X_n denote the quality of the nth item produced by a production system with X_n = 0 meaning “good” and X_n = 1 meaning “defective.” Suppose that X_n evolves as a Markov chain whose
3.2.2 Consider the problem of sending a binary message, 0 or 1, through a signal channel consisting of several stages, where transmission through each stage is subject to a fixed probability of error
3.2.1 Consider the Markov chain whose transition probability matrix is given by P. Suppose that the initial distribution is p_i = 1/4 for i = 0, 1, 2, 3. Show that Pr{X_n = k} = 1/4, k = 0, 1, 2, 3, for all n.
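The matrix P for exercise 3.2.1 is not reproduced in this listing; the stated conclusion Pr{X_n = k} = 1/4 for all n is exactly what holds when P is doubly stochastic (columns as well as rows summing to one). A quick check with a stand-in doubly stochastic matrix of my own choosing, not the matrix from the exercise:

    # A uniform initial distribution stays uniform under a doubly stochastic matrix.
    import numpy as np

    P = np.array([[0.1, 0.2, 0.3, 0.4],   # hypothetical example: rows and columns
                  [0.2, 0.3, 0.4, 0.1],   # each sum to one
                  [0.3, 0.4, 0.1, 0.2],
                  [0.4, 0.1, 0.2, 0.3]])
    p = np.full(4, 0.25)
    for n in range(5):
        print(n, p)                       # remains [0.25 0.25 0.25 0.25] for every n
        p = p @ P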
3.2.6 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P and initial distribution p_0 = 0.5 and p_1 = 0.5. Determine the probabilities Pr{X_2 = 0} and Pr{X_3 = 0}. [P over states 0, 1, 2; row 0: 0.3, 0.2, 0.5]
3.2.5 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. Determine the conditional probabilities Pr{X_3 = 1 | X_1 = 0} and Pr{X_2 = 1 | X_0 = 0}. [P over states 0, 1, 2; rows: (0.1, 0.1, 0.8), (0.2, 0.2, 0.6), (0.3, …)]
3.2.4 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. If it is known that the process starts in state X_0 = 1, determine the probability Pr{X_2 = 2}. [P over states 0, 1, 2; rows: (0.6, 0.3, 0.1), (0.3, 0.3, …)]
3.2.3 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. Determine the conditional probabilities Pr{X_3 = 1 | X_0 = 0} and Pr{X_4 = 1 | X_0 = 0}. [P over states 0, 1, 2]
3.2.2 A particle moves among the states 0, 1, 2 according to a Markov process whose transition probability matrix is P. Let X_n denote the position of the particle at the nth move. Calculate Pr{X_n = 0 | X_0 =
3.2.1 A Markov chain {X_n} on the states 0, 1, 2 has the transition probability matrix P. (a) Compute the two-step transition matrix P². (b) What is Pr{X_3 = 1 | X_1 = 0}? (c) What is Pr{X_3 = 1 | X_0 = 0}? [P over states 0, 1, 2; row 0 begins 0.1, …]
3.1.4 The random variables ξ_1, ξ_2, … are independent with the common probability mass function given. Set X_0 = 0, and let X_n = max{ξ_1, …, ξ_n} be the largest value observed to date. Determine the
3.1.3 Consider a sequence of items from a production process, with each item being graded as good or defective. Suppose that a good item is followed by another good item with probability α and is
3.1.2 Consider the problem of sending a binary message, 0 or 1, through a signal channel consisting of several stages, where transmission through each stage is subject to a fixed probability of error
3.1.1 A simplified model for the spread of a disease goes this way: The total population size is N = 5, of which some are diseased and the remainder are healthy. During any single period of time, two
3.1.5 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P and initial distribution p_0 = 0.5 and p_1 = 0.5. Determine the probabilities Pr{X_0 = 1, X_1 = 1, X_2 = 0} and Pr{X_1 = 1, X_2 = 1, X_3
3.1.4 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. Determine the conditional probabilities Pr{X_1 = 1, X_2 = 1 | X_0 = 0} and Pr{X_2 = 1, X_3 = 1 | X_1 = 0}. [P over states 0, 1, 2; rows: (0.1, 0.1, 0.8), (0.2, 0.2, …)]
3.1.3 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. If it is known that the process starts in state X_0 = 1, determine the probability Pr{X_0 = 1, X_1 = 0, X_2 = 2}. [P over states 0, 1, 2; row 0: (0.6, 0.3, …)]
3.1.2 A Markov chain X_0, X_1, X_2, … has the transition probability matrix P. Determine the conditional probabilities Pr{X_2 = 1, X_3 = 1 | X_1 = 0} and Pr{X_1 = 1, X_2 = 1 | X_0 = 0}. [P over states 0, 1, 2; rows: (0.7, 0.2, 0.1), (0, 0.6, …)]
3.1.1 A Markov chain X_0, X_1, … on states 0, 1, 2 has the transition probability matrix P and initial distribution p_0 = Pr{X_0 = 0} = 0.3, p_1 = Pr{X_0 = 1} = 0.4, and p_2 = Pr{X_0 = 2} = 0.3. Determine
2.5.5 Consider a stochastic process that evolves according to the following laws: If X_n = 0, then X_{n+1} = 0, whereas if X_n > 0, then (a) Show that X_n is a nonnegative martingale. (b) Suppose that X_0
2.5.4 Let ξ_1, ξ_2, … be independent Bernoulli random variables with parameter p, 0 <
2.5.3 Let S_0 = 0, and for n ≥ 1, let S_n = ξ_1 + ··· + ξ_n be the sum of n independent random variables, each exponentially distributed with mean E[ξ] = 1. Show that X_n = 2^n exp(−S_n), n ≥ 0, defines a
2.5.2 Let U_1, U_2, … be independent random variables each uniformly distributed over the interval (0, 1]. Show that X_0 = 1 and X_n = 2^n U_1 ··· U_n for n = 1, 2, … defines a martingale.
2.5.1 Use the law of total probability for conditional expectations E[E{X | Y, Z} | Z] = E[X | Z] to show E[X_{n+2} | X_0, …, X_n] = E[E{X_{n+2} | X_0, …, X_{n+1}} | X_0, …, X_n]. Conclude that when X_n is a
2.5.3 Let ξ be a random variable with mean μ and standard deviation σ, and let X = (ξ − μ)². Apply Markov's inequality to X to deduce Chebyshev's inequality: Pr{|ξ − μ| ≥ t} ≤ σ²/t² for any t > 0.
2.5.2 Let X be a Bernoulli random variable with parameter p. Compare Pr{X ≥ 1} with the Markov inequality bound.
2.5.1 Let X be an exponentially distributed random variable with mean E[X] = 1. For x = 0.5, 1, and 2, compare Pr{X > x} with the Markov inequality bound E[X]/x.
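For exercise 2.5.1, the exact tail of a unit-mean exponential is Pr{X > x} = e^(−x), while Markov's inequality only guarantees Pr{X > x} ≤ E[X]/x = 1/x, which is vacuous for x ≤ 1. A quick numeric comparison (my own sketch):

    # Exact exponential tail versus the Markov bound E[X]/x, with E[X] = 1.
    import math

    for x in (0.5, 1.0, 2.0):
        exact = math.exp(-x)
        bound = 1.0 / x
        print(f"x={x}: Pr(X>x)={exact:.4f}, Markov bound={bound:.4f}")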
2.4.8 Let X and Y have the normal density given in Chapter 1, in (1.47). Show that the conditional density function for X, given that Y = y, is normal with mean μ_X + ρ(σ_X/σ_Y)(y − μ_Y) and standard deviation σ_X √(1 − ρ²).
2.4.7 Suppose that X and Y are independent random variables, each having the same exponential distribution with parameter λ. What is the conditional probability density function for X, given that Z =
2.4.6 Let X_0, X_1, X_2, … be independent identically distributed nonnegative random variables having a continuous distribution. Let N be the first index k for which X_k > X_0. That is, N = 1 if X_1 >
2.4.5 Let X and Y be jointly distributed random variables whose joint probability mass function is given in the following table. Show that the covariance between X and Y is zero even though X and Y
2.4.4 Suppose X and Y are independent random variables having the same Poisson distribution with parameter λ, but where λ is also random, being exponentially distributed with parameter θ. What is the
2.4.3 Let X have a Poisson distribution with parameter λ > 0. Suppose λ itself is random, following an exponential density with parameter θ. (a) What is the marginal distribution of X? (b) Determine
2.4.2 Let N have a Poisson distribution with parameter λ > 0. Suppose that, conditioned on N = n, the random variable X is binomially distributed with parameters N = n and p. Set Y = N − X. Show
2.4.1 Suppose that the outcome X of a certain chance mechanism depends on a parameter p according to Pr{X = 1} = p and Pr{X = 0} = 1 − p, where 0 ≤ p ≤ 1. Suppose that p is chosen at random,
2.4.5 Let U be uniformly distributed over the interval [0, L], where L follows the gamma density f_L(x) = x e^(−x) for x ≥ 0. What is the joint density function of U and V = L − U?
2.4.4 Suppose X and Y are independent random variables, each exponentially distributed with parameter λ. Determine the probability density function for Z = X/Y.
2.4.3 A random variable T is selected that is uniformly distributed over the interval (0, 1]. Then, a second random variable U is chosen, uniformly distributed on the interval (0, T]. What is the
2.4.2 Suppose that three components in a certain system each function with probability p and fail with probability 1 − p, each component operating or failing independently of the others. But the
2.4.1 Suppose that three contestants on a quiz show are each given the same question and that each answers it correctly, independently of the others, with probability p. But the difficulty of the
2.3.5 To form a slightly different random sum, let ξ_0, ξ_1, … be independent identically distributed random variables and let N be a nonnegative integer-valued random variable, independent of ξ_0, ξ_1,
2.3.4 Suppose ξ_1, ξ_2, … are independent and identically distributed random variables having mean μ and variance σ². Form the random sum S_N = ξ_1 + ··· + ξ_N. (a) Derive the mean and variance of S_N
2.3.3 Suppose that ξ_1, ξ_2, … are independent and identically distributed with Pr{ξ_k = ±1} = 1/2. Let N be independent of ξ_1, ξ_2, … and follow the geometric probability mass function given, where 0 <
2.3.2 For each given p, let Z have a binomial distribution with parameters p and N. Suppose that N is itself binomially distributed with parameters q and M. Formulate Z as a random sum and show that Z
2.3.1 The following experiment is performed: An observation is made of a Poisson random variable N with parameter λ. Then N independent Bernoulli trials are performed, each with probability p of
2.3.5 The number of accidents occurring in a factory in a week is a Poisson random variable with mean 2. The number of individuals injured in different accidents is independently distributed, each
2.3.4 A six-sided die is rolled, and the number N on the uppermost face is recorded. From a jar containing 10 tags numbered 1, 2, …, 10 we then select N tags at random without replacement. Let X be
2.3.3 Suppose that upon striking a plate a single electron is transformed into a number N of electrons, where N is a random variable with mean μ and standard deviation σ. Suppose that each of these
2.3.2 Six nickels are tossed, and the total number N of heads is observed. Then N dimes are tossed, and the total number Z of tails among the dimes is observed. Determine the mean and variance of Z.
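For exercise 2.3.2, the random-sum identities E[Z] = E[N]E[ξ] and Var(Z) = E[N]Var(ξ) + Var(N)(E[ξ])², with N ~ Binomial(6, 1/2) and ξ the indicator of a tail on a fair dime, give E[Z] = 3 · 1/2 = 1.5 and Var(Z) = 3 · 1/4 + 1.5 · 1/4 = 1.125. A short simulation (my own sketch, not the book's solution) agrees:

    # Monte Carlo check of the random-sum mean and variance for exercise 2.3.2.
    import random

    def one_trial():
        n = sum(random.random() < 0.5 for _ in range(6))      # heads among six nickels
        return sum(random.random() < 0.5 for _ in range(n))   # tails among the n dimes

    samples = [one_trial() for _ in range(200_000)]
    mean = sum(samples) / len(samples)
    var = sum((z - mean) ** 2 for z in samples) / len(samples)
    print(mean, var)   # close to 1.5 and 1.125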
2.3.1 A six-sided die is rolled, and the number N on the uppermost face is recorded. Then a fair coin is tossed N times, and the total number Z of heads to appear is observed. Determine the mean and
2.2.2 Consider a pair of dice that are unbalanced by the addition of weights in the following manner: Die #1 has a small piece of lead placed near the four side, causing the appearance of the outcome
2.2.1 Let X_1, X_2, … be independent identically distributed positive random variables whose common distribution function is F. We interpret X_1, X_2, … as successive bids on an asset offered for
2.2.3 Determine the win probability when the dice are shaved on the 1–6 faces and p_+ = 0.206666 and p_− = 0.146666.
2.2.2 Verify the win probability of 0.5029237 by substituting from (2.21) into (2.20).
2.2.1 A red die is rolled a single time. A green die is rolled repeatedly. The game stops the first time that the sum of the two dice is either 4 or 7. What is the probability that the game stops
2.1.10 Do men have more sisters than women have? In a certain society, all married couples use the following strategy to determine the number of children that they will have: If the first child is a
2.1.9 Let N have a Poisson distribution with parameter λ = 1. Conditioned on N = n, let X have a uniform distribution over the integers 0, 1, …, n + 1. What is the marginal distribution for X?
2.1.8 Initially an urn contains one red and one green ball. A ball is drawn at random from the urn, observed, and then replaced. If this ball is red, then an additional red ball is placed in the urn.
2.1.7 The probability that an airplane accident that is due to structural failure is correctly diagnosed is 0.85, and the probability that an airplane accident that is not due to structural failure
2.1.6 A dime is tossed repeatedly until a head appears. Let N be the trial number on which this first head occurs. Then, a nickel is tossed N times. Let X count the number of times that the nickel
2.1.5 A nickel is tossed 20 times in succession. Every time that the nickel comes up heads, a dime is tossed. Let X count the number of heads appearing on tosses of the dime. Determine Pr{X = 0}.
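For exercise 2.1.5, each of the 20 nickel tosses yields a dime head with probability (1/2)(1/2) = 1/4, independently of the others, so Pr{X = 0} = (3/4)^20. A one-line check:

    print((3 / 4) ** 20)   # Pr{X = 0}, approximately 0.00317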
2.1.4 Suppose that X has a binomial distribution with parameters p = 1/2 and N, where N is also random and follows a binomial distribution with parameters q = 1/4 and M = 20. What is the mean of X?
2.1.3 Let X and Y denote the respective outcomes when two fair dice are thrown. Let U = min{X, Y}, V = max{X, Y}, and S = U + V, T = V − U. (a) Determine the conditional probability mass function for U
2.1.2 A card is picked at random from N cards labeled 1, 2, …, N, and the number that appears is X. A second card is picked at random from cards numbered 1, 2, …, X and its number is Y.
2.1.1 Let M have a binomial distribution with parameters N and p. Conditioned on M, the random variable X has a binomial distribution with parameters M and π. (a) Determine the marginal distribution
2.1.6 Suppose U and V are independent and follow the geometric distribution given. Define the random variable Z = U + V. (a) Determine the joint probability mass function p_{U,Z}(u, z) = Pr{U = u, Z = z}. (b) Determine
2.1.5 Let X be a Poisson random variable with parameter λ. Find the conditional mean of X given that X is odd.
2.1.4 A six-sided die is rolled, and the number N on the uppermost face is recorded. From a jar containing 10 tags numbered 1, 2, …, 10, we then select N tags at random without replacement. Let X
2.1.3 A poker hand of five cards is dealt from a normal deck of 52 cards. Let X be the number of aces in the hand. Determine Pr{X > 1 | X ≥ 1}. This is the probability that the hand contains more than
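A counting sketch for exercise 2.1.3, using Python's math.comb (the approach here is my own, not necessarily the textbook's): Pr{X ≥ 1} = 1 − C(48,5)/C(52,5), Pr{X = 1} = 4·C(48,4)/C(52,5), and the requested conditional probability is Pr{X > 1}/Pr{X ≥ 1}.

    # Probability that a five-card hand has more than one ace, given at least one ace.
    from math import comb

    total = comb(52, 5)
    p_ge1 = 1 - comb(48, 5) / total            # at least one ace
    p_gt1 = p_ge1 - 4 * comb(48, 4) / total    # more than one ace
    print(p_gt1 / p_ge1)                       # roughly 0.122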
2.1.2 Four nickels and six dimes are tossed, and the total number N of heads is observed. If N = 4, what is the conditional probability that exactly two of the nickels were heads?
2.1.1 I roll a six-sided die and observe the number N on the uppermost face. I then toss a fair coin N times and observe X, the total number of heads to appear. What is the probability that N = 3 and
1.5.9 A flashlight requires two good batteries in order to shine. Suppose, for the sake of this academic exercise, that the lifetimes of batteries in use are independent random variables that are
1.5.8 Let U_1, U_2, …, U_n be independent uniformly distributed random variables on the unit interval [0, 1]. Define the minimum V_n = min{U_1, U_2, …, U_n}. (a) Show that Pr{V_n > v} = (1 − v)^n for 0
1.5.7 Let X_1, X_2, …, X_n be independent random variables that are exponentially distributed with respective parameters λ_1, λ_2, …, λ_n. Identify the distribution of the minimum V = min{X_1, X_2, …
1.5.6 Determine the upper tail probabilities Pr{V > t} and mean E[V] for a random variable V having the exponential density f_V(v) = 0 for v < 0 and f_V(v) = λ e^(−λv) for v ≥ 0, where λ is a fixed positive parameter.
1.5.5 Show that E[W²] = ∫_0^∞ 2y[1 − F_W(y)] dy for a nonnegative random variable W.
1.5.4 Let V be a continuous random variable taking both positive and negative values and whose mean exists. Derive the formula E[V] = ∫_0^∞ [1 − F_V(v)] dv − ∫_{−∞}^0 F_V(v) dv.
1.5.3 Suppose that X is a discrete random variable having the geometric distribution whose probability mass function is p(k) = p(1 − p)^k for k = 0, 1, …. (a) Determine the upper tail
1.5.2 Let X_1, X_2, …, X_n be independent random variables, all exponentially distributed with the same parameter λ. Determine the distribution function for the minimum Z = min{X_1, …, X_n}.
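For exercise 1.5.2, Pr{Z > z} = Pr{X_1 > z, …, X_n > z} = (e^(−λz))^n = e^(−nλz), so the minimum is again exponentially distributed, with parameter nλ. A small simulation check, with illustrative values λ = 2 and n = 5 that are not part of the exercise:

    # Empirical tail of the minimum of n exponentials versus exp(-n*lam*z).
    import math, random

    lam, n, z, trials = 2.0, 5, 0.1, 200_000
    hits = sum(min(random.expovariate(lam) for _ in range(n)) > z for _ in range(trials))
    print(hits / trials, math.exp(-n * lam * z))   # both near exp(-1) = 0.3679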
1.5.1 Let X_1, X_2, … be independent and identically distributed random variables having the cumulative distribution function F(x) = Pr{X ≤ x}. For a fixed number ξ, let N be the first index k for