Question
1. Toss a biased coin (with probability p of heads and 1 − p of tails) until the first head appears. Let N be the number of tosses required to get the first head. Now toss a fair die N times, and let S be the sum of the results of the N tosses of the die. Use conditioning to compute E[S] and Var[S] under the following circumstances:
(a) The die has every face numbered 3.
(b) The die has the faces numbered 1, 2, 3, 4, 5, 6.
2. Let X and Y be independent random variables. Show that g(X) and h(Y) are also independent, where g and h are functions from R to R.
3. Let X1 and X2 be independent geometric random variables with parameters p1 and p2, respectively.
(a) Find P(X1 > X2).
(b) Find P(X1 = X2).
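For part (b), a small numerical sketch (assuming the geometric variables count trials, i.e. take values in {1, 2, ...}): summing P(X1 = k)·P(X2 = k) over k gives a geometric series, and the code checks the truncated series against the closed form that series sums to.

```python
def p_equal(p1, p2, terms=200):
    """P(X1 = X2) by direct summation:
    sum over k >= 1 of (1-p1)^(k-1) p1 * (1-p2)^(k-1) p2."""
    return sum((1 - p1)**(k - 1) * p1 * (1 - p2)**(k - 1) * p2
               for k in range(1, terms + 1))

def p_equal_closed(p1, p2):
    # geometric series with ratio (1-p1)(1-p2):
    return p1 * p2 / (1 - (1 - p1) * (1 - p2))

print(p_equal(0.3, 0.5), p_equal_closed(0.3, 0.5))
```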
4. Let X1 and X2 be independent geometric random variables with parameters p1 and p2, respectively. Let D = X1 − X2 and M = min(X1, X2).
(a) Find the joint PMF of D and M.
(b) Compute the marginal PMFs of D and of M.
(c) Are D and M independent? Explain.
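Part (c) can be explored numerically before attempting a proof. The sketch below (names are mine; geometric support again taken as {1, 2, ...}) enumerates the joint PMF of D and M over a truncated grid, forms the marginals, and measures how far the joint is from the product of marginals:

```python
def joint_DM(p1, p2, K=200):
    """Joint PMF of D = X1 - X2 and M = min(X1, X2), by enumerating
    (x1, x2) up to K; the truncation error is geometrically small."""
    pmf = {}
    for x1 in range(1, K + 1):
        for x2 in range(1, K + 1):
            p = (1 - p1)**(x1 - 1) * p1 * (1 - p2)**(x2 - 1) * p2
            key = (x1 - x2, min(x1, x2))
            pmf[key] = pmf.get(key, 0.0) + p
    return pmf

pmf = joint_DM(0.3, 0.6)
pD, pM = {}, {}
for (d, m), p in pmf.items():          # marginals by summing the joint
    pD[d] = pD.get(d, 0.0) + p
    pM[m] = pM.get(m, 0.0) + p
# independence check: joint should equal product of marginals
err = max(abs(p - pD[d] * pM[m]) for (d, m), p in pmf.items())
print(err)
```

A near-zero maximum discrepancy suggests the joint PMF factorizes, which is a strong hint about what the algebraic answer to (c) should look like.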
5. Let X1 and X2 be independent Poisson random variables with the same parameter λ. What is the distribution of S = X1 + X2? This is Poisson merging (superposition), the opposite of Poisson thinning.
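A direct convolution check for problem 5 (pure-Python sketch; λ = 2 is an arbitrary test value): the PMF of S computed by the convolution sum is compared against a Poisson PMF with doubled parameter, the natural candidate answer.

```python
from math import exp, factorial

def pois(k, lam):
    """Poisson(lam) PMF at k."""
    return exp(-lam) * lam**k / factorial(k)

lam, K = 2.0, 40
# convolution: P(S = s) = sum over k of P(X1 = k) P(X2 = s - k)
conv = [sum(pois(k, lam) * pois(s - k, lam) for k in range(s + 1))
        for s in range(K)]
cand = [pois(s, 2 * lam) for s in range(K)]   # candidate: Poisson(2*lam)
print(max(abs(a - b) for a, b in zip(conv, cand)))
```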
6. Let X1 and X2 be independent random variables that are uniformly distributed on {1, ..., n}. What is the PMF of S = X1 + X2?
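For problem 6, a brute-force sketch using exact rational arithmetic; the closed form in the docstring is stated as a candidate to be checked against direct enumeration of the n² equally likely pairs, not as a given.

```python
from fractions import Fraction

def pmf_sum_uniform(n):
    """Candidate PMF of S = X1 + X2 for independent uniforms on {1,...,n}:
    P(S = s) = (n - |s - (n+1)|) / n^2 for s in {2, ..., 2n} (triangular)."""
    return {s: Fraction(n - abs(s - (n + 1)), n * n) for s in range(2, 2 * n + 1)}

# brute-force check: count the pairs (x1, x2) giving each sum
n = 6
counts = {}
for x1 in range(1, n + 1):
    for x2 in range(1, n + 1):
        counts[x1 + x2] = counts.get(x1 + x2, 0) + 1
brute = {s: Fraction(c, n * n) for s, c in counts.items()}
print(pmf_sum_uniform(n) == brute)
```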