Questions and Answers: Theory of Probability
Let X and Y have a common negative binomial distribution. Find the conditional probability P{X = j | X + Y = k} and show that the identity II,(12.16) now becomes obvious without any calculations.
If two random variables X and Y assume only two values each, and if Cov (X, Y) = 0, then X and Y are independent.
Birthdays. For a group of n people find the expected number of days of the year which are birthdays of exactly k people. (Assume 365 days and that all arrangements are equally probable.)
Continuation. Find the expected number of multiple birthdays. How large should n be to make this expectation exceed 1?
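Both expectations in the two birthday problems above can be checked numerically. The sketch below uses the standard per-day binomial count (the helper names and the formulas are mine, not quoted from the text): a fixed day is the birthday of exactly k of n people with probability C(n, k)(1/365)^k (364/365)^(n-k), and summing over the 365 days gives the expectation.

```python
from math import comb

DAYS = 365

def expected_days_with_exactly_k(n: int, k: int) -> float:
    """Expected number of days that are birthdays of exactly k of n people:
    365 * C(n, k) * (1/365)^k * (364/365)^(n-k)."""
    return DAYS * comb(n, k) * (1 / DAYS) ** k * (1 - 1 / DAYS) ** (n - k)

def expected_multiple_birthdays(n: int) -> float:
    """Expected number of days that are birthdays of two or more people:
    365 * (1 - P(day has no birthday) - P(day has exactly one))."""
    q = 1 - 1 / DAYS
    return DAYS * (1 - q ** n - n * (1 / DAYS) * q ** (n - 1))

# Smallest n for which the expected number of multiple birthdays exceeds 1
n = 1
while expected_multiple_birthdays(n) <= 1:
    n += 1
print(n)
```

Under this reading of "multiple birthdays" (days shared by at least two people) the expectation first exceeds 1 at n = 29; the rougher pair-counting approximation n(n-1)/730 > 1 suggests n = 28, so the exact formula matters here.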
A man with n keys wants to open his door and tries the keys independently and at random. Find the mean and variance of the number of trials (a) if unsuccessful keys are not eliminated from further selections; (b) if they are.
Let (X, Y) be random variables whose joint distribution is the trinomial defined by (1.8). Find E(X), Var(X), and Cov(X, Y) (a) by direct computation, (b) by representing X and Y as sums of n independent variables.
Find the covariance of the number of ones and sixes in n throws of a die.
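For small n the covariance in the preceding problem can be verified by exhaustive enumeration; the multinomial covariance formula gives Cov(X, Y) = -n(1/6)(1/6) = -n/36. A quick exact check (the helper name is mine):

```python
from itertools import product
from fractions import Fraction

def cov_ones_sixes(n: int) -> Fraction:
    """Exact covariance of (#ones, #sixes) in n throws of a fair die,
    by enumerating all 6**n equally likely outcomes."""
    e_x = e_y = e_xy = Fraction(0)
    for outcome in product(range(1, 7), repeat=n):
        x = outcome.count(1)  # number of ones
        y = outcome.count(6)  # number of sixes
        e_x += x
        e_y += y
        e_xy += x * y
    total = 6 ** n
    e_x, e_y, e_xy = e_x / total, e_y / total, e_xy / total
    return e_xy - e_x * e_y

# Agrees with the multinomial formula Cov(X, Y) = -n/36
for n in (1, 2, 3):
    assert cov_ones_sixes(n) == Fraction(-n, 36)
```

The covariance is negative, as it must be: a throw that is a one cannot also be a six.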
In the animal trapping problem 24 of VI,10, prove that the expected number of animals trapped at the νth trapping is nap^(ν-1).
If X has the geometric distribution P{X = k} = q^k p (where k = 0, 1, ...), show that Var(X) = q/p^2. Conclude that the negative binomial distribution {f(k; r, p)} has variance rq/p^2 provided r is a positive integer.
In the waiting time problem (3.d) prove that Var(S_N) = Σ_{k=1}^{N} (k-1)N/(N-k+1)^2. Conclude that N^(-2) Var(S_N) → Σ k^(-2). (Incidentally, the value of this series is π²/6.) Hint: Use the variance of the geometric distribution found in the preceding problem.
Continuation. Let Y_r be the number of drawings required to include r preassigned elements (instead of any r different elements as in the text). Find E(Y_r) and Var(Y_r). (Note: The exact distribution ...)
The blood-testing problem. A large number, N, of people are subject to a blood test. This can be administered in two ways. (i) Each person can be tested separately; in this case N tests are required. (ii) The blood samples of k persons can be pooled and analyzed together. If the test is negative, this one test suffices for the k people; if it is positive, each of the k persons must then be tested separately, and in all k + 1 tests are required. Assume that the probability p of a positive test is the same for all people and that the people are stochastically independent. Find the probability that the test for a pooled sample of k people will be positive, and the expected number of tests under plan (ii).
Sample structure. A population consists of r classes whose sizes are in the proportion p_1 : p_2 : ... : p_r. A random sample of size n is taken with replacement. Find the expected number of classes not represented in the sample.
Let X be the number of alpha runs in a random arrangement of r_1 alphas and r_2 betas. The distribution of X is given in problem 23 of II,11. Find E(X) and Var(X).
In Polya's urn scheme [V,(2.c)] let X_n be one or zero according as the nth trial results in black or red. Prove that ρ(X_n, X_m) = c/(b + r + c) for n ≠ m.
Continuation. Let S_n be the total number of black balls extracted in the first n drawings (that is, S_n = X_1 + ... + X_n). Find E(S_n) and Var(S_n). Verify the result by means of the recursion formula in problem ...
Stratified sampling. A city has n blocks of which n_ν have x_ν inhabitants each (n_1 + n_2 + ... = n). Let m = Σ n_ν x_ν / n be the mean number of inhabitants per block and put σ² = n^(-1) Σ n_ν x_ν² - m². In sampling without replacement ...
Length of random chains. A chain in the x,y-plane consists of n links, each of unit length. The angle between two consecutive links is ±α, where α is a positive constant; each possibility has probability 1/2. Find the expected value of the square of the distance between the two ends of the chain.
In a random placement of r balls into n cells the probability of finding exactly m cells empty satisfies the recursion formula II,(11.8). Let m_r be the expected number of empty cells. From the recursion formula derive a difference equation for m_r, and verify that m_r = n(1 - 1/n)^r.
Let S_n be the number of successes in n Bernoulli trials. Prove that E(|S_n - np|) = 2νq b(ν; n, p), where ν is the integer such that np < ν ≤ np + 1.
Let {X_k} be a sequence of mutually independent random variables with a common distribution. Suppose that the X_k assume only positive values and that E(X_k) = a and E(X_k^(-1)) = b exist. Let S_n = X_1 + ... + X_n. Prove that E(X_k/S_n) = 1/n.
Continuation. Prove that E(S_m/S_n) = m/n if m ≤ n, and E(S_m/S_n) = 1 + (m - n)a E(S_n^(-1)) if m ≥ n.
Let X_1, ..., X_n be mutually independent random variables with a common distribution; let its mean be m, its variance σ². Let X̄ = (X_1 + ... + X_n)/n. Prove that E(Σ (X_k - X̄)²) = (n - 1)σ².
Let X_1, ..., X_n be mutually independent random variables. Let U be a function of X_1, ..., X_k and V a function of X_{k+1}, ..., X_n. Prove that U and V are mutually independent random variables.
Generalized Chebyshev inequality. Let φ(x) > 0 for x > 0 be monotonically increasing, and suppose that E(φ(|X|)) = M exists. Prove that P{|X| ≥ t} ≤ M/φ(t).
Schwarz inequality. For any two random variables with finite variances one has E²(XY) ≤ E(X²)E(Y²). Prove this from the fact that the quadratic polynomial E((tX + Y)²) is non-negative.
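The inequality, and the discriminant argument behind the suggested proof, can be illustrated numerically on a finite joint distribution (the sample points and weights below are invented for illustration):

```python
# Joint distribution of (X, Y) on three points: ((x, y), probability)
pts = [((1.0, 2.0), 0.2), ((3.0, -1.0), 0.5), ((-2.0, 0.5), 0.3)]

e_xy = sum(p * x * y for (x, y), p in pts)
e_x2 = sum(p * x * x for (x, y), p in pts)
e_y2 = sum(p * y * y for (x, y), p in pts)

# Schwarz inequality: E(XY)^2 <= E(X^2) E(Y^2)
assert e_xy ** 2 <= e_x2 * e_y2

# The proof's quadratic E((tX + Y)^2) = t^2 E(X^2) + 2t E(XY) + E(Y^2) is
# non-negative for every t, so its discriminant 4E(XY)^2 - 4E(X^2)E(Y^2) <= 0.
for t in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    q = t * t * e_x2 + 2 * t * e_xy + e_y2
    assert q >= -1e-12
```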
Prove that the law of large numbers applies in example (5.a) also when ... ; the central limit theorem holds if ... .
Decide whether the law of large numbers and the central limit theorem hold for the sequences of mutually independent variables X_k with distributions defined as follows (k ≥ 1): (a) P{X_k = ±2^k} = 1/2; (b) P{X_k = ...} ...
Ljapunov's condition (1901). Show that Lindeberg's condition is satisfied if for some δ > 0, s_n^(-(2+δ)) Σ_{k=1}^{n} E(|X_k|^(2+δ)) → 0.
Let the X_k be mutually independent random variables such that X_k assumes the 2k + 1 values 0, ±L_k, ±2L_k, ..., ±kL_k, each with probability 1/(2k + 1). Find conditions on the constants L_k which will ensure that the law of large numbers and the central limit theorem hold.
Do the same problem if X_k assumes the values a_k, -a_k, and 0 with probabilities p_k, p_k, and 1 - 2p_k. Note: The following seven problems treat the weak law of large numbers for dependent variables.
In problem 13 of V,8 let X_k = 1 if the kth throw results in red, and X_k = 0 otherwise. Show that the law of large numbers does not apply.
Let the {X_k} be mutually independent and have a common distribution with mean μ and finite variance. If S_n = X_1 + ... + X_n, prove that the law of large numbers does not hold for the sequence {S_n} but holds for ...
Let {X_k} be a sequence of random variables such that X_k may depend on X_{k-1} and X_{k+1} but is independent of all other X_j. Show that the law of large numbers holds, provided the X_k have bounded variances.
If the joint distribution of (X_1, ..., X_n) is defined for every n so that the variances are bounded and all covariances are negative, the law of large numbers applies.
Continuation. Replace the condition Cov(X_j, X_k) ≤ 0 by the assumption that Cov(X_j, X_k) → 0 uniformly as |j - k| → ∞. Prove that the law of large numbers holds.
If |S_n| ≤ cn and Var(S_n) ≥ an², then the law of large numbers does not apply to {X_k}.
In the Polya urn scheme [example V,(2.c)] let X_k equal 1 or 0 according to whether the kth ball drawn is black or red. Then S_n is the number of black balls in n drawings. Prove that the law of large numbers does not apply to {X_k}.
The mutually independent random variables X_k assume the values r = 2, 3, 4, ... with probability p_r = c/(r² log r), where c is a constant such that Σ p_r = 1. Show that the generalized law of large numbers (4.1) holds even though no expectation exists.
Let {X_k} be a sequence of mutually independent random variables such that X_k = ±1 each with probability (1 - 2^(-k))/2, and X_k = ±2^k each with probability 2^(-k-1). Prove that both the weak and the strong law of large numbers hold.
Example of an unfavorable "fair" game. Let the possible values of the gain at each trial be 0, 2, 2², 2³, ...; the probability of the gain being 2^k is (8.1) p_k = 1/(2^k k(k+1)), and the probability of 0 is the complementary 1 - Σ p_k.
Converse to the strong law of large numbers. Under the assumption of problem 16 there is probability one that |S_n| > An for infinitely many n.
A converse to Kolmogorov's criterion. If Σ σ_k²/k² diverges, then there exists a sequence {X_k} of mutually independent random variables with Var(X_k) = σ_k² for which the strong law of large numbers does not hold.
Let X be a random variable with generating function P(s). Find the generating functions of X + 1 and 2X.
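One way to sanity-check the intended answers: the standard identities are P_{X+1}(s) = s P(s) and P_{2X}(s) = P(s²), which act on the coefficient sequence as a right shift and as an interleaving with zeros, respectively. A small check with X uniform on {0, 1, 2} (the distribution is chosen only for illustration):

```python
# X uniform on {0, 1, 2}; p[k] = P{X = k}
p = [1/3, 1/3, 1/3]

# Generating function of X + 1 is s * P(s): coefficients shift one place right.
p_plus_1 = [0.0] + p

# Generating function of 2X is P(s^2): P{X = k} becomes the coefficient of s^(2k).
p_twice = []
for coeff in p:
    p_twice.extend([coeff, 0.0])
p_twice.pop()  # drop the zero beyond the highest power

assert p_plus_1 == [0.0, 1/3, 1/3, 1/3]
assert p_twice == [1/3, 0.0, 1/3, 0.0, 1/3]
```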
Find the generating functions of (a) P{X ≤ n}, (b) P{X < n}, (c) P{X ≥ n}, (d) P{X = n + 1}, (e) P{X = 2n}.
In a sequence of Bernoulli trials let u_n be the probability that the combination SF occurs for the first time at trials number n - 1 and n. Find the generating function, mean, and variance.
Discuss which of the formulas of II, 12, represent convolutions and where generating functions have been used.
Let a_n be the number of ways in which the score n can be obtained by throwing a die any number of times. Show that the generating function of {a_n} is {1 - s - s² - s³ - s⁴ - s⁵ - s⁶}^(-1).
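The stated generating function is equivalent to the recursion a_n = a_{n-1} + ... + a_{n-6} (with a_0 = 1), obtained by conditioning on the last throw. That recursion can be checked against a brute-force count of ordered throw sequences (the function names are mine):

```python
from functools import lru_cache

def scores_by_recursion(limit: int) -> list:
    """a_n from the generating function {1 - s - ... - s^6}^(-1):
    a_0 = 1 and a_n = a_{n-1} + ... + a_{n-6}."""
    a = [1]
    for n in range(1, limit + 1):
        a.append(sum(a[max(0, n - 6):n]))
    return a

@lru_cache(maxsize=None)
def scores_by_counting(n: int) -> int:
    """Brute force: number of ordered sequences of throws (values 1..6) summing to n."""
    if n == 0:
        return 1
    return sum(scores_by_counting(n - v) for v in range(1, 7) if n >= v)

a = scores_by_recursion(12)
assert all(a[n] == scores_by_counting(n) for n in range(13))
print(a[:7])  # [1, 1, 2, 4, 8, 16, 32]
```

The counts double up to n = 6 and then fall behind 2^(n-1), since throws larger than six are unavailable.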
Let a_n be the number of ways in which a convex polygon P_0 P_1 ... P_n with n + 1 sides can be partitioned into triangles by drawing n - 2 (non-intersecting) diagonals. Put a_1 = 1. Show that for n ≥ 2, a_n = a_1 a_{n-1} + a_2 a_{n-2} + ... + a_{n-1} a_1.
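This recursion is the Catalan convolution, so the solution should be the Catalan numbers, a_n = C(2(n-1), n-1)/n. A short check of the recursion against that closed form (the closed form is the standard Catalan formula, not quoted from the text):

```python
from math import comb

def triangulation_counts(limit: int) -> list:
    """a_1 = 1 and, for n >= 2, a_n = a_1 a_{n-1} + a_2 a_{n-2} + ... + a_{n-1} a_1."""
    a = [0, 1]  # a[0] is unused padding
    for n in range(2, limit + 1):
        a.append(sum(a[k] * a[n - k] for k in range(1, n)))
    return a

a = triangulation_counts(10)
# Catalan closed form: a_n = C(2(n-1), n-1) / n
for n in range(1, 11):
    assert a[n] == comb(2 * (n - 1), n - 1) // n
print(a[1:6])  # [1, 1, 2, 5, 14]
```

For example a_3 = 2 (the two triangulations of a quadrilateral) and a_4 = 5 (a pentagon).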
(a) The probability that a return to equilibrium occurs at or before the nth trial is given by the coefficient of s^n in (1 - s)^(-1) F(s). (b) Conclude: The generating function for the probability that S_j ≠ 0 for j = 1, ..., n is given by (1 - F(s))/(1 - s).
The generating function for the probabilities that no return to equilibrium occurs after the nth trial (exclusive) is given by (1 - s)^(-1) U(s) |p - q|.
(a) The generating function for P{S_n = r} (with r > 0 fixed) is given by Q^r(s) U(s). (b) When is this also the generating function for the probability that S_k = r for exactly one subscript k ≤ n?
(a) Find the generating function for the probabilities that the event S_n = r will occur exactly k times (r > 0 and k > 0 fixed). (b) Do the same problem with "exactly" replaced by "at most."
(a) Find the generating function for the probability that the first return to equilibrium following a first passage through r > 0 occurs at trial number n. (b) Do the same problem with the words "the first return" ...
In the waiting time example IX,(3.d) find the generating function of S_r (for r fixed). Verify formula IX,(3.3) for the mean and calculate the variance.
Continuation. The following is an alternative method for deriving the same result. Let p_n(r) be the probability that n drawings yield exactly r distinct elements. Prove the recursion formula (7.1) p_{n+1}(r) = (r/N) p_n(r) + ((N - r + 1)/N) p_n(r - 1). Derive the generating function from it.
Solve the two preceding problems for preassigned elements (instead of arbitrary ones).
Let the sequence of Bernoulli trials up to the first failure be called a turn. Find the generating function and the probability distribution of the accumulated number S_r of successes in r turns.
Continuation. (a) Let R_ν be the number of successive turns up to the νth success (that is, the νth success occurs during the R_νth turn). Find E(R_ν) and Var(R_ν). Prove that P{R_ν = r} = ... (b) Consider ...
Let X assume the values 0, 1, ..., r - 1 each with the same probability 1/r. When r is a composite number, say r = ab, it is possible to represent X as the sum of two independent integral-valued random variables.
Let S_n = X_1 + ... + X_n be the sum of mutually independent variables each assuming the values 1, 2, ..., a with probability 1/a. Show that the generating function is given by P(s) = [s(1 - s^a)/(a(1 - s))]^n, whence for j ≥ n, P{S_n = j} = a^(-n) Σ_ν (-1)^ν C(n, ν) C(j - aν - 1, n - 1).
Continuation. The probability P{S_n ≤ j} ...
Moment generating functions. Let X be a random variable with generating function P(s), and suppose that Σ p_n s^n converges for some s_0 > 1. Then all moments m_r = E(X^r) exist ...
The distribution (1.1) of the random sum S_N has mean E(N)E(X) and variance E(N) Var(X) + Var(N) E²(X). Verify this (a) using the generating function, (b) directly from the definition and the notion of conditional expectations.
Animal trapping [example (1.b)]. If {g_n} is a geometric distribution, so is the resulting distribution. If {g_n} is a logarithmic distribution [cf. (2.8)], there results a logarithmic distribution with ...
In N Bernoulli trials, where N is a random variable with a Poisson distribution, the numbers of successes and failures are stochastically independent variables. Generalize this to the multinomial case.
Randomization. Let N have a Poisson distribution with mean λ, and let N balls be placed randomly into n cells. Show without calculation that the probability of finding exactly m cells empty is C(n, m) e^(-λm/n) (1 - e^(-λ/n))^(n-m).
Continuation. Show that when a fixed number r of balls is placed randomly into n cells the probability of finding exactly m cells empty equals the coefficient of λ^r/r! in e^λ times the expression above.
Mixtures of probability distributions. Let {f_n} and {g_n} be two probability distributions, α > 0, β > 0, α + β = 1. Then {αf_n + βg_n} is again a probability distribution. Discuss its meaning and the relation between the corresponding generating functions.
Continuation. If n > m show that E(X_m X_n) = μ^(n-m) E(X_m²).
Continuation. Show that the bivariate generating function of (X_m, X_n) is P_m(s_1 P_{n-m}(s_2)). Use this to verify the assertion in problem 8.
Consider the changes introduced in the branching process when each individual has a fixed probability p of dying before producing descendants.
Branching processes with two types of individuals. Assume that each individual can have descendants of either kind; the numbers of descendants of the two types are regulated by two bivariate generating functions.
Suppose that F(s) is a polynomial. Prove for this case all theorems of section 3, using the partial fraction method of XI, 4.
Let r coins be tossed repeatedly and let ℰ be the recurrent event that for each of the coins the accumulated numbers of heads and tails are equal. Is ℰ persistent or transient? For the smallest r for which ℰ is transient, ...
In a sequence of independent throws of a perfect die let ℰ stand for the event that the accumulated numbers of ones, twos, ..., sixes are equal. Show that ℰ is a transient (periodic) recurrent event.
In a sequence of Bernoulli trials let ℰ occur when the accumulated number of successes equals λ times the accumulated number of failures; here λ is a positive integer. [See example (1.c).] Show that ℰ is ...
In a sequence of Bernoulli trials we say that ℰ occurs when the accumulated number of successes is twice the accumulated number of failures and the ratio has never exceeded 2. Show that ℰ is transient.
Let the X_k be independent integral-valued random variables with a common distribution. Assume that these variables assume both positive and negative values. Prove that the event defined by S_n = 0, S_j ...
Geiger counters. [See examples (1.g) and (4.e).] Denote by N_n and Z_n, respectively, the number of occurrences of ℰ and the number of registrations up to and including epoch n. Discuss the ...
In Geiger counters of type II every arriving particle (whether registered or not) locks the counter for exactly r time units (that is, at the r - 1 trials following the arrival). The duration of the ...
A more general type of Geiger counter. As in problem 8 we assume that every arriving particle completely obliterates the effect of the preceding ones, but we assume now that the time for which a ...
For a delayed recurrent event ℰ the probabilities u_n are constant only when the generating function of the first occurrence of ℰ is given by B(s) = [1 - F(s)]/[μ(1 - s)], that is, when b_n = (f_{n+1} + f_{n+2} + ...)/μ.
Find an approximation to the probability that in 10,000 tossings of a coin the number of head runs of length 3 will lie between 700 and 730.
In a sequence of tossings of a coin let ℰ stand for the pattern HTH. Let q_n be the probability that ℰ does not occur in n trials. Find the generating function and use the partial fraction method to obtain an approximation to q_n.
In example (8.b) show that the expected duration of the game is ...
The possible outcomes of each trial are A, B, and C; the corresponding probabilities are α, β, γ (α + β + γ = 1). Find the generating function of the probability that in n trials there is no run of ...
Continuation. Find the probability that the first A-run of length r precedes the first B-run of length ρ and terminates at the nth trial. [Note that this problem does not reduce to that of example ...]
Self-renewing aggregates. In example (10.d) find the limiting age distribution assuming that the lifetime distribution is geometric: f_k = q^k p.
Continuation. The initial age distribution {β_k} is called stationary if it perpetuates itself for all times. Show (without computation) that this is the case only when β_k = (f_{k+1} + f_{k+2} + ...)/μ.
Continuation. Denote by W_k(n) the expected number of elements at epoch n that are of age k. Find the determining equations and verify from them that the population size remains constant. Furthermore, ...
Let ℰ be a persistent aperiodic recurrent event. Assume that the recurrence time has finite mean μ and variance σ². Put q_n = f_{n+1} + f_{n+2} + ... and r_n = q_{n+1} + q_{n+2} + ... Show that the generating functions Q(s) and R(s) ...
Let ℰ be a persistent recurrent event and N_n the number of occurrences of ℰ in n trials. Prove that (12.3) E(N_n²) = ... and hence that E(N_n²) is the coefficient of s^n in (12.4) [F²(s) + F(s)] / [(1 - F(s))²(1 - s)].
In a sequence of Bernoulli trials let q_{kn} be the probability that exactly n success runs of length r occur in k trials. Using problem 22, show that the generating function Q_n(s) = Σ_k q_{kn} s^k is ...
Continuation. The Poisson distribution of long runs. If the number k of trials and the length r of runs both tend to infinity, so that kqp^r → λ, then the probability of having exactly n runs of length r tends to e^(-λ) λ^n / n!.
In a random walk starting at the origin find the probability that the point a > 0 will be reached before the point -b < 0.
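The classical answer is the gambler's-ruin formula: with q = 1 - p and r = q/p, the probability is (1 - r^b)/(1 - r^(a+b)) for p ≠ q, and b/(a + b) for p = q. A sketch that checks the formula against the first-step equations it solves (the function names are mine):

```python
def hit_a_before_minus_b(p: float, a: int, b: int) -> float:
    """Probability that a simple random walk from 0 (step +1 w.p. p, -1 w.p. q)
    reaches a > 0 before -b < 0: the classical gambler's-ruin formula."""
    q = 1 - p
    if p == q:
        return b / (a + b)
    r = q / p
    return (1 - r ** b) / (1 - r ** (a + b))

def hit_by_iteration(p: float, a: int, b: int, sweeps: int = 20000) -> float:
    """Same probability from the first-step equations u(z) = p*u(z+1) + q*u(z-1)
    with boundary values u(a) = 1 and u(-b) = 0, solved by repeated sweeps."""
    q = 1 - p
    u = {z: 0.0 for z in range(-b, a + 1)}
    u[a] = 1.0
    for _ in range(sweeps):
        for z in range(-b + 1, a):
            u[z] = p * u[z + 1] + q * u[z - 1]
    return u[0]

for p, a, b in [(0.5, 3, 2), (0.6, 4, 3), (0.3, 2, 5)]:
    assert abs(hit_a_before_minus_b(p, a, b) - hit_by_iteration(p, a, b)) < 1e-9
```

For the symmetric walk the formula reduces to b/(a + b), e.g. 2/5 for a = 3, b = 2.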
Prove that with the notations of section 2: (a) In a random walk starting at the origin the probability of reaching the point a > 0 before returning to the origin equals p(1 - q_1). (b) In a random walk ...
If q ≥ p, conclude from the preceding problem: In a random walk starting at the origin the number of visits to the point a > 0 that take place before the first return to the origin has a geometric distribution.
Using the preceding two problems prove the theorem: The number of visits to the point a > 0 that take place prior to the first return to the origin has expectation (p/q)^a when p < q and ...
A particle moves at each step two units to the right or one unit to the left, with corresponding probabilities p and q (p + q = 1). If the starting position is 0, find the probability a that the particle will ever reach the position -1.
Continuation. Show that a equals the probability that in a sequence of Bernoulli trials the accumulated number of failures will ever exceed twice the accumulated number of successes. [When p ≥ q this ...]