Questions and Answers of Introduction To Probability Statistics
Let X ∼ N(0, 1) and W ∼ Bernoulli(1/2) be independent random variables. Define the random variable Y as a function of X and W: Y = h(X, W) = { X, if W = 0; −X, if W = 1 }. Find the PDFs of Y and X + Y.
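For orientation, a sketch of the usual argument (this assumes the ±X reading of the definition reconstructed above):
\[ F_Y(y) = \tfrac{1}{2}P(X \le y) + \tfrac{1}{2}P(-X \le y) = \tfrac{1}{2}\Phi(y) + \tfrac{1}{2}\Phi(y) = \Phi(y), \]
so Y ∼ N(0, 1). Meanwhile X + Y equals 2X when W = 0 and 0 when W = 1, so the distribution of X + Y is a half-half mixture of N(0, 4) and a point mass at 0.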
Let X and Y be two independent N(0, 1) random variables and Z = 7 + X + Y, W = 1 + Y. Find ρ(Z, W).
Let X ∼ Uniform(1, 3) and Y |X ∼ Exponential(X). Find Cov(X, Y).
Let X and Y be two independent N(0, 1) random variables and Z = 1 + X + XY², W = 1 + X. Find Cov(Z, W).
Let X and Y be two random variables. Suppose that σ²_X = 4 and σ²_Y = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find Cov(X, Y) and ρ(X, Y).
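A sketch of the intended computation, for reference (independence of Z and W forces Cov(Z, W) = 0, and covariance is bilinear):
\[ 0 = \mathrm{Cov}(2X - Y,\ X + Y) = 2\sigma^2_X + \mathrm{Cov}(X, Y) - \sigma^2_Y = 8 + \mathrm{Cov}(X, Y) - 9, \]
so Cov(X, Y) = 1 and ρ(X, Y) = Cov(X, Y)/(σ_X σ_Y) = 1/6.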
Let X₁, X₂, ⋯, X_n be i.i.d. random variables, where X_i ∼ Bernoulli(p). Define Y₁ = X₁X₂, Y₂ = X₂X₃, ⋯, Y_{n−1} = X_{n−1}X_n, Y_n = X_nX₁. If Y = Y₁ + Y₂ + ⋯ + Y_n, find 1. E[Y], 2. Var(Y).
Let X and Y be jointly normal random variables with parameters μ_X = 2, σ²_X = 4, μ_Y = 1, σ²_Y = 9, and ρ = −1/2. a. Find E[Y | X = 3]. b. Find Var(Y | X = 2). c. Find P(X + 2Y ≤ 5 | X + Y = 3).
Let X and Y be jointly normal random variables with parameters μ_X, σ²_X, μ_Y, σ²_Y, and ρ. Find the conditional distribution of Y given X = x.
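For reference, the standard bivariate-normal fact this question is after:
\[ Y \mid X = x \ \sim\ N\Big(\mu_Y + \rho\,\sigma_Y \frac{x - \mu_X}{\sigma_X},\ (1 - \rho^2)\,\sigma^2_Y\Big). \]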
Let X ∼ Exponential(λ). Find the MGF of X, M_X(s), and all of its moments, E[X^k].
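A sketch of the computation (the integral converges for s < λ):
\[ M_X(s) = \int_0^\infty e^{sx}\,\lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - s}, \qquad E[X^k] = \frac{k!}{\lambda^k}, \]
where the moments follow from expanding λ/(λ − s) = Σ_{k≥0} (s/λ)^k and matching coefficients of s^k/k! in the MGF's Taylor series.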
In this problem, our goal is to find the variance of the hypergeometric distribution. Let's remember the random experiment behind the hypergeometric distribution. You have a bag that contains b blue
Let X ∼ Poisson(λ). Find the MGF of X, M_X(s).
If X ∼ Geometric(p), find the MGF of X.
For a random variable X, we know that M_X(s) = 2/(2 − s), for s ∈ (−2, 2). Find the distribution of X.
If M_X(s) = 1/3 + (1/3)e^s + (1/3)e^{2s}, find EX and Var(X).
Using MGFs show that if X ∼ N(μ_X, σ²_X) and Y ∼ N(μ_Y, σ²_Y) are independent, then X + Y ∼ N(μ_X + μ_Y, σ²_X + σ²_Y).
Using MGFs prove that if X ∼ Binomial(m, p) and Y ∼ Binomial(n, p) are independent, then X + Y ∼ Binomial(m + n, p).
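The core of the MGF argument, as a one-line sketch:
\[ M_{X+Y}(s) = M_X(s)M_Y(s) = (1 - p + pe^s)^m (1 - p + pe^s)^n = (1 - p + pe^s)^{m+n}, \]
which is the Binomial(m + n, p) MGF, and uniqueness of MGFs gives the conclusion.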
Let X be a continuous random variable with the following PDF: f_X(x) = (λ/2) e^{−λ|x|}, for all x ∈ ℝ. Find the MGF of X, M_X(s).
If X ∼ Exponential(λ), show that φ_X(ω) = λ/(λ − jω).
Let X, Y, and Z be three independent N(1, 1) random variables. Find E[XY |Y +Z = 1].
For each of the following random variables, find the MGF. a. X is a discrete random variable, with PMF P_X(k) = { 1/3, k = 1; 2/3, k = 2 }. b. Y is a Uniform(0, 1) random variable.
Suppose that X, Y, and Z are three independent random variables. If X, Y ∼ N(0, 1) and Z ∼ Exponential(1), find 1. E[XY | Z = 1], 2. E[X²Y²Z² | Z = 1].
Let X, Y, and Z be three jointly continuous random variables with joint PDF … 1. Find the joint PDF of X and Y. 2. Find the marginal PDF of X. 3. Find the conditional PDF f_{XY|Z}(x, y | z) using … 4. Are X
Let X, Y, and Z be three jointly continuous random variables with joint PDF f_{XYZ}(x, y, z) = { c(x + 2y + 3z), 0 ≤ x, y, z ≤ 1; 0, otherwise }. 1. Find the constant c. 2. Find the marginal PDF of X.
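A sketch of part 1 (assuming the PDF as reconstructed above; the constant comes from normalizing the PDF):
\[ \int_0^1\!\!\int_0^1\!\!\int_0^1 c(x + 2y + 3z)\,dx\,dy\,dz = c\Big(\tfrac{1}{2} + 1 + \tfrac{3}{2}\Big) = 3c = 1, \qquad \text{so } c = \tfrac{1}{3}. \]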
Let X and Y be jointly normal random variables with parameters μ_X = 1, σ²_X = 1, μ_Y = 0, σ²_Y = 4, and ρ_{XY} = …. a. Find P(2X + Y ≤ 3). b. Find Cov(X + Y, 2X − Y). c. Find P(Y > 1 | X = 2).
Remember that a continuous random variable X is said to have a Gamma distribution with parameters α > 0 and λ > 0, shown as X ∼ Gamma(α, λ), if its PDF is given by f_X(x) = { λ^α x^{α−1} e^{−λx} / Γ(α), x > 0; 0, otherwise }. If X ∼ Gamma(α, λ),
Using MGFs show that if Y = X₁ + X₂ + ⋯ + X_n, where the X_i's are independent Exponential(λ) random variables, then Y ∼ Gamma(n, λ).
For a random vector X, show that C_X = R_X − (EX)(EX)^T.
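For reference, the identity follows in one line from linearity of expectation:
\[ C_X = E\big[(X - EX)(X - EX)^T\big] = E[XX^T] - (EX)(EX)^T = R_X - (EX)(EX)^T. \]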
A sensor network consists of n sensors that are distributed randomly on the unit square. Each node's location is uniform over the unit square and is independent of the locations of the other nodes. A
Let B_n be the event that a graph randomly generated according to the G(n, p) model has at least one isolated node. Show that P(B_n) ≤ n(1 − p)^{n−1}, and conclude that for any
A system consists of 4 components in a series, so the system works properly if all of the components are functional. In other words, the system fails if and only if at least one of its components
Let X = [X₁, X₂, X₃]^T be a normal random vector with the following mean and covariance matrices: … Find the MGF of X.
Let X and Y be two jointly normal random variables with X ∼ N(μ_X, σ_X), Y ∼ N(μ_Y, σ_Y), and ρ(X, Y) = ρ. Show that the above PDF formula for the PDF of [X Y]^T is the same as f_{X,Y}(x, y) given in
Let X be a normal random vector with the following mean and covariance matrices: … Let also Y = …. 1. Find P(X₂ > 0). 2. Find the expected value vector of Y, m_Y = EY. 3. Find the covariance matrix of Y, C_Y. 4. Find
Let X be an n-dimensional random vector. Let A be a fixed (non-random) invertible n-by-n matrix, and b be a fixed n-dimensional vector. Define the random vector Y as Y = AX + b. Find the PDF of Y in
Let X ∼ Uniform(0, 1). Suppose that given X = x, Y and Z are independent and Y | X = x ∼ Uniform(0, x) and Z | X = x ∼ Uniform(0, 2x). Define the random vector U as U = …. 1. Find the PDFs of Y and Z. 2.
Let X be an n-dimensional random vector and the random vector Y be defined as Y = AX + b, where A is a fixed m-by-n matrix and b is a fixed m-dimensional vector. Show that
Let X ∼ Geometric(p). Using Markov's inequality, find an upper bound for P(X ≥ a), for a positive integer a. Compare the upper bound with the real value of P(X ≥ a).
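A quick numerical version of the requested comparison, as a sketch (the value p = 0.3 is illustrative, not from the problem; X counts trials up to and including the first success, so P(X ≥ a) = (1 − p)^{a−1}):

# Markov: P(X >= a) <= E[X]/a = 1/(p*a); exact geometric tail: (1-p)**(a-1).
p = 0.3  # illustrative parameter
for a in [2, 5, 10, 20]:
    markov = 1 / (p * a)
    exact = (1 - p) ** (a - 1)
    print(f"a={a:2d}  Markov bound={markov:.4f}  exact tail={exact:.4f}")

The bound decays only like 1/a, while the true tail decays geometrically, so the gap widens quickly.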
A bank teller serves customers standing in the queue one by one. Suppose that the service time Xi for customer i has mean EXi = 2 (minutes) and Var(Xi) = 1. We assume that service times for different
The number of accidents in a certain city is modeled by a Poisson random variable with an average rate of 10 accidents per day. Suppose that the numbers of accidents on different days are independent.
In a communication system each data packet consists of 1000 bits. Due to the noise, each bit may be received in error with probability 0.1. It is assumed bit errors occur independently. Find the
Consider the following random experiment: A fair coin is tossed repeatedly forever. Here, the sample space S consists of all possible sequences of heads and tails. We define the sequence of random
In a communication system, each codeword consists of 1000 bits. Due to the noise, each bit may be received in error with probability 0.1. It is assumed bit errors occur independently. Since error
If X₁, X₂, X₃, ⋯ is a sequence of i.i.d. random variables with CDF F_X(x), then X_n →d X. This is because F_{X_n}(x) = F_X(x), for all x.
The amount of time needed for a certain machine to process a job is a random variable with mean EX_i = 10 minutes and Var(X_i) = 2 minutes². The times needed for different jobs are independent from
You have a fair coin. You toss the coin n times. Let X be the proportion of times that you observe heads. How large does n have to be so that you are 95% sure that 0.45 ≤ X ≤ 0.55? In other words, how
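A sketch of the two standard ways to size n here (Chebyshev gives a distribution-free guarantee, the CLT figure is an approximation; both take X to be the fraction of heads, with Var(X) = 1/(4n)):

from math import ceil

eps, delta = 0.05, 0.05  # want P(|X - 0.5| >= eps) <= delta = 1 - 0.95
# Chebyshev: P(|X - 0.5| >= eps) <= Var(X)/eps**2 = 1/(4*n*eps**2) <= delta.
n_chebyshev = ceil(1 / (4 * eps**2 * delta))
# CLT approximation: need 1.96 * sqrt(1/(4*n)) <= eps (1.96 is the 97.5% normal quantile).
n_clt = ceil((1.96 / (2 * eps)) ** 2)
print(n_chebyshev, n_clt)  # 2000 (guaranteed) vs. 385 (approximate)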
Let X₂, X₃, X₄, ⋯ be a sequence of random variables such that F_{X_n}(x) = { 1 − (1 − 1/n)^{nx}, x > 0; 0, otherwise }. Show that X_n converges in distribution to Exponential(1).
An engineer is measuring a quantity q. It is assumed that there is a random error in each measurement, so the engineer will take n measurements and reports the average of the measurements as the
Let X₁, X₂, X₃, ⋯ be a sequence of random variables such that X_n ∼ Binomial(n, λ/n), for n ∈ ℕ, n > λ, where λ > 0 is a constant. Show that X_n converges in distribution to Poisson(λ).
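The core limit computation, as a sketch:
\[ P(X_n = k) = \binom{n}{k}\Big(\frac{\lambda}{n}\Big)^k\Big(1 - \frac{\lambda}{n}\Big)^{n-k} \longrightarrow \frac{e^{-\lambda}\lambda^k}{k!}, \]
since \(\binom{n}{k}/n^k \to 1/k!\) and \((1 - \lambda/n)^{n-k} \to e^{-\lambda}\) as n → ∞.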
Let X₂, X₃, X₄, ⋯ be a sequence of random variables such that F_{X_n}(x) = { e^{n(x−1)}/(1 + e^{n(x−1)}), x > 0; 0, otherwise }. Show that X_n converges in distribution to X = 1.
Let X₂, X₃, X₄, ⋯ be a sequence of random variables such that F_{X_n}(x) = { (e^{nx} + x eⁿ)/(e^{nx} + ((n+1)/n) eⁿ), x ≥ 0; 0, otherwise }. Show that X_n converges in distribution to Uniform(0, 1).
Let X_n ∼ Exponential(n). Show that X_n →p 0; that is, the sequence X₁, X₂, X₃, ⋯ converges in probability to the zero random variable X = 0.
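The one-line check, for reference: for every ε > 0,
\[ P(|X_n - 0| \ge \varepsilon) = P(X_n \ge \varepsilon) = e^{-n\varepsilon} \longrightarrow 0, \]
which is exactly the definition of X_n →p 0.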
Let X be a random variable, and X_n = X + Y_n, where EY_n = 1/n and Var(Y_n) = σ²/n for a constant σ > 0. Show that X_n →p X.
Consider a sequence {X_n, n = 1, 2, 3, ⋯} such that X_n = { n, with probability 1/n²; 0, with probability 1 − 1/n² }. Show that X_n →p 0.
Let X_n ∼ Uniform(0, 1/n). Show that X_n →Lʳ 0, for any r ≥ 1.
Consider a sequence {X_n, n = 1, 2, 3, ⋯} such that X_n = { n², with probability 1/n; 0, with probability 1 − 1/n }. Show that a. X_n →p 0. b. X_n does not converge in the rth mean for any r ≥ 1.
Consider the following random experiment: A fair coin is tossed once. Here, the sample space has only two elements S = {H,T}. We define a sequence of random variables X1, X2, X3, ⋯ on this sample
We perform the following random experiment. We put n ≥ 10 blue balls and n red balls in a bag. We pick 10 balls at random (without replacement) from the bag. Let Xn be the number of blue balls. We
Consider the sample space S = [0, 1] with a probability measure that is uniform on this space, i.e., P([a, b]) = b − a for all 0 ≤ a ≤ b ≤ 1. Define the sequence {X_n, n = 1, 2, ⋯} as follows: … Also, define the random variable X on this sample
Find two sequences of random variables {X_n, n = 1, 2, ⋯} and {Y_n, n = 1, 2, ⋯} such that X_n →d X and Y_n →d Y, but X_n + Y_n does not converge in distribution to X + Y.
Let X₁, X₂, X₃, ⋯ be a sequence of continuous random variables such that f_{X_n}(x) = (n/2) e^{−n|x|}. Show that X_n converges in probability to 0.
Consider a sequence {X_n, n = 1, 2, 3, ⋯} such that X_n = { n, with probability 1/n²; 0, with probability 1 − 1/n² }. Show that X_n →a.s. 0.
Let X₁, X₂, X₃, ⋯ be a sequence of continuous random variables such that f_{X_n}(x) = { 1/(n x²), x > 1/n; 0, otherwise }. Show that X_n converges in probability to 0.
Let Y₁, Y₂, Y₃, ⋯ be a sequence of i.i.d. random variables with mean EY_i = μ and finite variance Var(Y_i) = σ². Define the sequence {X_n, n = 2, 3, …} as X_n = (Y₁Y₂ + Y₂Y₃ + ⋯ + Y_{n−1}Y_n)/n.
Let Y₁, Y₂, Y₃, ⋯ be a sequence of positive i.i.d. random variables with 0 < E[ln Y_i] = γ < ∞. Define the sequence {X_n, n = 1, 2, 3, …} as X_n = (Y₁Y₂Y₃ ⋯ Y_{n−1}Y_n)^{1/n}, for n = 1, 2, 3, ⋯. Show that X_n →p e^γ.
Let X₁, X₂, X₃, ⋯ be a sequence of random variables such that X_n ∼ Poisson(nλ), where λ > 0 is a constant. Define a new sequence Y_n as Y_n = X_n/n. Show that Y_n converges in mean square to λ, i.e., Y_n →m.s. λ.
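A sketch of the mean-square computation (assuming the Poisson(nλ) reading reconstructed above, so EX_n = Var(X_n) = nλ):
\[ E\big[(Y_n - \lambda)^2\big] = \mathrm{Var}(Y_n) + (EY_n - \lambda)^2 = \frac{n\lambda}{n^2} + 0 = \frac{\lambda}{n} \longrightarrow 0. \]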
Let {X_n, n = 1, 2, ⋯} and {Y_n, n = 1, 2, ⋯} be two sequences of random variables, defined on the sample space S. Suppose that we know X_n →Lʳ X and Y_n →Lʳ Y. Prove that X_n + Y_n →Lʳ X + Y. You may want to use Minkowski's inequality.
Let X₁, X₂, X₃, ⋯ be a sequence of random variables such that X_n ∼ Rayleigh(1/n), i.e., f_{X_n}(x) = { n²x exp(−n²x²/2), x > 0; 0, otherwise }. Show that X_n →a.s. 0.
Let Y₁, Y₂, ⋯ be independent random variables, where Y_n ∼ Bernoulli(n/(n + 1)) for n = 1, 2, 3, ⋯. We define the sequence {X_n, n = 2, 3, 4, ⋯} as X_n = Y₁Y₂Y₃ ⋯ Y_{n−1}. Show that X_n →a.s. 0.
Let X be the weight of a randomly chosen individual from a population of adult men. In order to estimate the mean and variance of X, we observe a random sample X1,X2,⋯ ,X10. Thus, the Xi's are
Let X₁, X₂, X₃, X₄ be a random sample from the Uniform(0, 1) distribution, and let X_(1), X_(2), X_(3), X_(4) be the corresponding order statistics. Find the PDFs of X_(1), X_(2), and X_(4).
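For reference, the standard order-statistic PDF for a Uniform(0, 1) sample of size n, and what it gives here with n = 4:
\[ f_{X_{(k)}}(x) = \frac{n!}{(k-1)!\,(n-k)!}\,x^{k-1}(1 - x)^{n-k}, \quad 0 < x < 1, \]
so f_{X_(1)}(x) = 4(1 − x)³, f_{X_(2)}(x) = 12x(1 − x)², and f_{X_(4)}(x) = 4x³.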
Let X₁, X₂, X₃, …, X_n be a random sample with unknown mean EX_i = μ and unknown variance Var(X_i) = σ². Suppose that we would like to estimate θ = μ². We define the estimator Θ̂ as
Let X_i be i.i.d. Uniform(0, 1). We define the sample mean as M_n = (X₁ + X₂ + ⋯ + X_n)/n. a. Find E[M_n] and Var(M_n) as a function of n. b. Using Chebyshev's inequality, find an upper bound on … c. Using your bound, show that M_n →p 1/2.
Let X be a random variable with EX = 1 and range R_X = (0, 2). If Y = X³ − 6X², show that EY ≤ −5.
Let X be a positive random variable with EX = 10. What can you say about the following quantities? 1. E[X − X³]. 2. E[X ln √X]. 3. E[|2 − X|].
Prove that for two random variables X and Y with finite moments and 1 ≤ p < ∞, (E[|X + Y|^p])^{1/p} ≤ (E[|X|^p])^{1/p} + (E[|Y|^p])^{1/p} (Minkowski's inequality). Note that |X + Y|^p = |X + Y|^{p−1}|X + Y| ≤ |X + Y|^{p−1}(|X| + |Y|) = |X + Y|^{p−1}|X| + |X + Y|^{p−1}|Y|. Therefore, E[|X + Y|^p] ≤ E[|X + Y|^{p−1}|X|] + E[|X + Y|^{p−1}|Y|]. Now, apply Hölder's inequality.
Let X ∼ Binomial(n, p). Using Markov's inequality, find an upper bound on P(X ≥ αn), where p < α < 1. Evaluate the bound for p = 1/2 and α = 3/4.
Let X ∼ Geometric(p). Using Chebyshev's inequality, find an upper bound for P(|X − EX| ≥ b).
Let X ∼ Binomial(n, p). Using Chebyshev's inequality, find an upper bound on P(X ≥ αn), where p < α < 1. Evaluate the bound for p = 1/2 and α = 3/4.
Let X be a random variable with EX = 0 and Var(X) = σ². We would like to prove that for any a > 0, we have P(X ≥ a) ≤ σ²/(σ² + a²). This inequality is sometimes called the one-sided Chebyshev inequality. One way to show
Let X ∼ Binomial (n, p). Using Chernoff bounds, find an upper bound on P(X ≥ αn), where p < α < 1. Evaluate the bound for p = 1/2 and α = 3/4.
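For a numerical feel across this and the two preceding questions, a sketch evaluating the exact binomial tail against the Markov, Chebyshev, and optimized (KL-form) Chernoff bounds at p = 1/2, α = 3/4 (n = 100 is an illustrative choice):

from math import comb, exp, log

n, p, alpha = 100, 0.5, 0.75  # n is illustrative; p and alpha from the questions
a = int(alpha * n)
# Exact tail P(X >= alpha*n) for X ~ Binomial(n, p).
exact = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(a, n + 1))
markov = p / alpha                                 # Markov: E[X]/(alpha*n) = p/alpha
chebyshev = p * (1 - p) / (n * (alpha - p) ** 2)   # via P(|X - np| >= (alpha - p)n)
kl = alpha * log(alpha / p) + (1 - alpha) * log((1 - alpha) / (1 - p))
chernoff = exp(-n * kl)                            # optimized exponential (Chernoff) bound
print(f"exact={exact:.3g}  Markov={markov:.3g}  "
      f"Chebyshev={chebyshev:.3g}  Chernoff={chernoff:.3g}")

Markov gives the constant 2/3, Chebyshev decays like 4/n, and only the Chernoff bound matches the exponential decay of the true tail.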
The number of customers visiting a store during a day is a random variable with mean EX = 100 and variance Var(X) = 225. 1. Using Chebyshev's inequality, find an upper bound on the probability of having more than 120
Using the Cauchy-Schwarz inequality, show that for any two random variables X and Y, |ρ(X, Y)| ≤ 1. Also, |ρ(X, Y)| = 1 if and only if Y = aX + b for some constants a, b ∈ ℝ.
Let Xi be i.i.d. and Xi ∼ Exponential(λ). Using Chernoff bounds find an upper bound for P(X1 +X2 +⋯+Xn ≥ a), where a > n/λ. Show that the bound goes to zero exponentially fast as a function
Let X be a positive random variable. Compare E[X^a] with (E[X])^a for all values of a ∈ ℝ.
Let X and Y be two independent Uniform(0, 1) random variables. Let the random vectors U and V be defined as U = [X², Y]^T and V = [X, Y, X + Y]^T. Determine whether C_U and C_V are positive definite.
Let X and Y be two jointly continuous random variables with joint PDF f_{X,Y}(x, y) = … and let the random vector U be defined as U = …. 1. Find the mean vector of U, EU. 2. Find the correlation matrix of U, R_U. 3. Find the
Let X and Y be two jointly continuous random variables with joint PDF f_{X,Y}(x, y) = { (3/2)x² + y, 0 ≤ x, y ≤ 1; 0, otherwise }, and let the random vector U be defined as U = …. Find the correlation and covariance matrices of U.
Let X be a random variable with characteristic function φ_X(ω). If Y = aX + b, show that φ_Y(ω) = e^{jωb} φ_X(aω).
Let X₁, X₂, X₃, …, X_n be a random sample from a distribution with mean EX_i = θ and variance Var(X_i) = σ². Consider the following two estimators for θ: Θ̂₁ = … and Θ̂₂ = …. Find MSE(Θ̂₁) and MSE(Θ̂₂) and show
Let X₁, …, X₄ be a random sample from a Geometric(p) distribution. Suppose we observed (x₁, x₂, x₃, x₄) = (2, 3, 3, 5). Find the likelihood function using P_{X_i}(x_i; p) = p(1 − p)^{x_i − 1} as the PMF.
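The resulting likelihood, as a sketch (here Σx_i = 2 + 3 + 3 + 5 = 13):
\[ L(x_1, \ldots, x_4; p) = \prod_{i=1}^{4} p(1-p)^{x_i - 1} = p^4 (1-p)^{13-4} = p^4 (1-p)^9, \]
which, if one goes on to maximize it, peaks at p̂ = 4/13.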
Let X₁, X₂, X₃, …, X_n be a random sample with mean EX_i = θ and variance Var(X_i) = σ². Show that Θ̂_n = X̄ is a consistent estimator of θ.
Let X₁, …, X₄ be a random sample from an Exponential(θ) distribution. Suppose we observed (x₁, x₂, x₃, x₄) = (2.35, 1.55, 3.25, 2.65). Find the likelihood function using f_{X_i}(x_i; θ) = θe^{−θx_i}, for x_i ≥ 0, as the PDF.
Let X₁, X₂, X₃, …, X_n be a random sample with mean EX_i = μ and variance Var(X_i) = σ². Suppose that we use σ̂² = (1/n) Σ_{k=1}^{n} (X_k − X̄)² to estimate σ². Find the bias of this estimator.
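For reference, the standard computation behind the answer:
\[ E[\hat{\sigma}^2] = \frac{1}{n}\sum_{k=1}^{n} E\big[(X_k - \bar{X})^2\big] = \frac{n-1}{n}\,\sigma^2, \qquad B(\hat{\sigma}^2) = E[\hat{\sigma}^2] - \sigma^2 = -\frac{\sigma^2}{n}. \]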
Often, when working with maximum likelihood estimation, we maximize the log-likelihood rather than the likelihood itself for convenience when finding the maximum likelihood estimator. Why is maximizing L(x; θ) as a
Let T be the time that is needed for a specific task in a factory to be completed. In order to estimate the mean and variance of T, we observe a random sample T1,T2,⋯, T6. Thus, Ti's are i.i.d. and
Let X be one observation from a N(0, σ²) distribution. a. Find an unbiased estimator of σ². b. Find the log-likelihood, log L(x; σ²), using f_X(x; σ²) = (1/√(2πσ²)) e^{−x²/(2σ²)} as the PDF. c. Find the Maximum Likelihood Estimate (MLE) for
In this problem, we would like to find the CDFs of the order statistics. Let X₁, …, X_n be a random sample from a continuous distribution with CDF F_X(x) and PDF f_X(x). Define X_(1), …, X_(n) as the order statistics.
For the following random samples, find the likelihood function: 1. X_i ∼ Binomial(3, θ), and we have observed (x₁, x₂, x₃, x₄) = (1, 3, 2, 2). 2. X_i ∼ Exponential(θ) and we have observed
Let X₁, …, X_n be a random sample from a Poisson(λ) distribution. a. Find the likelihood function, L(x₁, …, x_n; λ), using P_{X_i}(x_i; λ) = e^{−λ}λ^{x_i}/x_i! as the PMF. b. Find the log-likelihood function and use that to obtain the MLE
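A sketch of where part b lands:
\[ \log L = -n\lambda + \Big(\sum_i x_i\Big)\log\lambda - \sum_i \log(x_i!), \qquad \frac{d}{d\lambda}\log L = -n + \frac{\sum_i x_i}{\lambda} = 0 \;\Rightarrow\; \hat{\lambda} = \bar{x}. \]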
Suppose that we have observed the random sample X₁, X₂, X₃, …, X_n, where X_i ∼ N(θ₁, θ₂), so f_{X_i}(x_i; θ₁, θ₂) = (1/√(2πθ₂)) e^{−(x_i − θ₁)²/(2θ₂)}. Find the maximum likelihood estimators for θ₁ and θ₂.