Questions and Answers of Statistics
(a) For the hierarchy in Example 4.4.6, show that the variance of X can be written Var X = nEP(1 - EP) + n(n - 1)Var P. (The first term reflects binomial variation with success probability EP, and the
A generalization of the hierarchy in Exercise 4.34 is described by D. G. Morrison (1978), who gives a model for forced binary choices. A forced binary choice occurs when a person is forced to choose
(The gamma as a mixture of exponentials) Gleser (1989) shows that, in certain cases, the gamma distribution can be written as a scale mixture of exponentials, an identity suggested by different
Let (X1,..., Xn) have a multinomial distribution with m trials and cell probabilities p1,... ,pn (see Definition 4.6.2). Show that, for every i and j,
A pdf is defined by (a) Find the value of C. (b) Find the marginal distribution of X. (c) Find the joint cdf of X and Y. (d) Find the pdf of the random variable Z = 9/(X + 1)².
Let X and Y be independent random variables with means μX, μY and variances σ²X, σ²Y. Find an expression for the correlation of XY and Y in terms of these means and variances.
Let X1, X2, and X3 be uncorrelated random variables, each with mean μ and variance σ². Find, in terms of μ and σ², Cov(X1 + X2, X2 + X3) and Cov(X1 + X2, X1 - X2).
Prove the following generalization of Theorem 4.5.6: For any random vector (X1, ..., Xn)
Show that if (X, Y) ~ bivariate normal(μX, μY, σ²X, σ²Y, ρ), then the following are true. (a) The marginal distribution of X is n(μX,
Let Z1 and Z2 be independent n(0,1) random variables, and define new random variables X and Y by where aX, bX, cX, aY, bY, and cY are constants. (a) Show that (b) If we define the constants aX, bX,
Let X and Y be independent n(0,1) random variables, and define a new random variable Z by(a) Show that Z has a normal distribution. (b) Show that the joint distribution of Z and Y is not bivariate
where 0 (a) Show that the marginal distributions are given by fX(x) = af1(x) + (1 - a)f2(x) and fY(y) = ag1(y) + (1 - a)g2(y). (b) Show that X and Y are independent if and only if |f1(x)
Let X, Y, and Z be independent uniform(0,1) random variables. (a) Find P(X/Y < t) and P(XY < t). (Pictures will help.) (b) Find P(XY/Z < t).
Let A, B, and C be independent random variables, uniformly distributed on (0, 1). What is the probability that Ax² + Bx + C has real roots? (If X ~ uniform(0,1), then - log X ~ exponential. The sum
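Before attempting the exponential-sum argument the hint suggests, the answer can be sanity-checked by simulation: the quadratic has real roots exactly when the discriminant B² - 4AC is nonnegative. A minimal Monte Carlo sketch in Python (the function name is illustrative, not part of the exercise):

```python
import random

def prob_real_roots(trials=100_000, seed=0):
    """Monte Carlo estimate of P(B^2 >= 4AC) for independent A, B, C ~ uniform(0,1).

    Ax^2 + Bx + C has real roots iff its discriminant B^2 - 4AC >= 0.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a, b, c = rng.random(), rng.random(), rng.random()
        if b * b >= 4 * a * c:
            hits += 1
    return hits / trials

est = prob_real_roots()  # comes out near 0.25
```

The simulated value gives a target against which the exact integration can be checked.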
Find the pdf of the product X1 X2 ··· Xn, where the Xi's are independent uniform(0,1) random variables. (Try to calculate the cdf, and remember the relationship between uniforms and exponentials.)
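The hint's uniform-exponential relationship can be checked numerically: since -log Xi ~ exponential(1), the sum -log(X1 ··· Xn) is gamma(n, 1), which has mean n. A quick sketch using only the standard library (names illustrative):

```python
import math
import random

def neg_log_product(n, rng):
    """-log of a product of n independent uniform(0,1) draws.

    Each -log U is exponential(1), so the sum is gamma(n, 1); inverting
    the log then gives the cdf of the product itself.
    """
    return -sum(math.log(rng.random()) for _ in range(n))

rng = random.Random(1)
n, trials = 5, 50_000
samples = [neg_log_product(n, rng) for _ in range(trials)]
mean = sum(samples) / trials  # gamma(n, 1) has mean n, so this should be near 5
```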
A parallel system is one that functions as long as at least one component of it functions. A particular parallel system is composed of three independent components, each of which has a lifelength
Refer to Miscellanea 4.9.2. (a) Show that A1 is the arithmetic mean, A-1 is the harmonic mean, and A0 = lim r→0 Ar is the geometric mean. (b) The arithmetic-geometric-harmonic mean inequality
For any three random variables X, Y, and Z with finite variances, prove (in the spirit of Theorem 4.4.7) the covariance identity Cov(X, Y) = E(Cov(X, Y|Z)) + Cov(E(X|Z), E(Y|Z)), where Cov(X, Y|Z) is
A and B agree to meet at a certain place between 1 PM and 2 PM. Suppose they arrive at the meeting place independently and randomly during the hour. Find the distribution of the length of time that A
DeGroot (1986) gives the following example of the Borel Paradox (Miscellanea 4.9.3): Suppose that X1 and X2 are iid exponential(1) random variables, and define Z = (X2 - 1)/X1. The probability-zero
A random variable X is defined by Z = log X, where EZ = 0. Is EX greater than, less than, or equal to 1?
This exercise involves a well-known inequality known as the triangle inequality (a special case of Minkowski's Inequality). (a) Prove (without using Minkowski's Inequality) that for any numbers a and
Prove the Covariance Inequality by generalizing the argument given in the text immediately preceding the inequality.
A woman leaves for work between 8 AM and 8:30 AM and takes between 40 and 50 minutes to get there. Let the random variable X denote her time of departure, and the random variable Y the travel time.
Prove that if the joint cdf of X and Y satisfies FX,Y(x, y) = FX(x)FY(y), then for any pair of intervals (a, b) and (c, d), P(a < X < b, c < Y < d) = P(a < X < b)P(c < Y < d).
Color blindness appears in 1% of the people in a certain population. How large must a sample be if the probability of its containing a color-blind person is to be .95 or more? (Assume that the
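Assuming independent draws, the requirement is 1 - (0.99)^n ≥ .95, that is, n ≥ log(.05)/log(.99) ≈ 298.1, so the smallest sample size is 299. A one-line check (function name illustrative):

```python
import math

def min_sample_size(p=0.01, target=0.95):
    """Smallest n with P(at least one color-blind person) = 1 - (1-p)^n >= target,
    treating the sample as n independent Bernoulli(p) trials."""
    return math.ceil(math.log(1 - target) / math.log(1 - p))

n = min_sample_size()  # 299 for p = .01, target = .95
```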
Let X1,..., Xn be a random sample from a n(μ, σ²) population. (a) Find expressions for θ1,...,θ4, as defined in Exercise 5.8, in terms of μ and σ². (b) Use the results of Exercise 5.8, together
Suppose X̄ and S² are calculated from a random sample X1,...,Xn drawn from a population with finite variance σ². We know that ES² = σ². Prove that ES ≤ σ, and if σ² > 0, then ES < σ.
Let X1,..., Xn be iid n(μ, σ²). Find a function of S², the sample variance, say g(S²), that satisfies Eg(S²) = σ. (Hint: Try g(S²) = c√S², where c is a constant.)
Establish the following recursion relations for means and variances. Let X̄n and S²n be the mean and variance, respectively, of X1,...,Xn. Then suppose another observation, Xn+1, becomes available.
Let Xi, i = 1, 2, 3, be independent with n(i, i²) distributions. For each of the following situations, use the Xi's to construct a statistic with the indicated distribution. (a) Chi squared with 3
Let X be a random variable with an Fp,q distribution. (a) Derive the pdf of X. (b) Derive the mean and variance of X. (c) Show that 1/X has an Fq,p distribution. (d) Show that (p/q)X/[1 + (p/q)X] has a
Let X be a random variable with a Student's t distribution with p degrees of freedom. (a) Derive the mean and variance of X. (b) Show that X² has an F distribution with 1 and p degrees of freedom. (c)
(a) Prove that the χ² distribution is stochastically increasing in its degrees of freedom; that is, if p > q, then for any a, P(χ²p > a) ≥ P(χ²q > a), with strict inequality for some
a. We can see that the t distribution is a mixture of normals using the following argument: where Tν is a t random variable with ν degrees of freedom. Using the Fundamental Theorem of Calculus and
What is the probability that the larger of two continuous iid random variables will exceed the population median? Generalize this result to samples of size n.
Let X and Y be iid n(0,1) random variables, and define Z = min(X, Y). Prove that Z² ~ χ²1.
Let Ui, i = 1, 2, ..., be independent uniform(0,1) random variables, and let X have distribution P(X = x) = c/x!, x = 1, 2, 3, ..., where c = 1/(e - 1). Find the distribution of Z = min{U1,...
Let X1,..., Xn be a random sample from a population with pdfLet X(1)
As a generalization of the previous exercise, let X1,... ,Xn be iid with pdfLet X(1)
Let X1,...,Xn be iid with pdf fX(x) and cdf FX(x), and let X(1) < ··· < X(n) be the order statistics. (a) Find an expression for the conditional pdf of X(i) given X(j) in terms of fX and FX. (b)
A manufacturer of booklets packages them in boxes of 100. It is known that, on the average, the booklets weigh 1 ounce, with a standard deviation of .05 ounce. The manufacturer is interested in
Let X1,..., Xn be iid random variables with continuous cdf FX, and suppose EXi = μ. Define the random variables Y1,..., Yn by Find the distribution of Y1 + ··· + Yn.
If X̄1 and X̄2 are the means of two independent samples of size n from a population with variance σ², find a value for n so that P(|X̄1 - X̄2| < σ/5) ≈ .99. Justify your calculations.
Suppose X̄ is the mean of 100 observations from a population with mean μ and variance σ² = 9. Find limits between which X̄ - μ will lie with probability at least .90. Use both Chebychev's
Let X1, X2, ... be a sequence of random variables that converges in probability to a constant a. Assume that P(Xi > 0) = 1 for all i. a. Verify that the sequences defined by Yi = √Xi and Y′i =
Let Xn be a sequence of random variables that converges in distribution to a random variable X. Let Yn be a sequence of random variables with the property that for any finite number c,Show that for
Let X1,..., Xn be a random sample from a population with mean μ and variance σ². Show that Thus, the normalization of X̄n in the Central Limit Theorem gives random variables that have the
Stirling's Formula (derived in Exercise 1.28), which gives an approximation for factorials, can be easily derived using the CLT. (a) Argue that, if Xi ~ exponential(1), i = 1, 2, ..., all independent,
In Example 5.5.16, a normal approximation to the negative binomial distribution was given. Just as with the normal approximation to the binomial distribution given in Example 3.3.2, the approximation
This exercise, and the two following, will look at some of the mathematical details of convergence. (a) Prove Theorem 5.5.4. (Hint: Since h is continuous, given ε > 0 we can find a δ such that
Prove Theorem 5.5.13; that is, show that a. Set ε = |x - μ| and show that if x > μ, then P(Xn ≤ x) ≥ P(|Xn - μ| < ε). Deduce the ⇒ implication. b. Use the fact that {x : |x -
Fill in the details in the proof of Theorem 5.5.24.(a) Show that if √n(Yn - μ) → n(0, σ2) in distribution, then Yn → μ in probability.(b) Give the details for the application of Slutsky's
For the situation of Example 5.6.1, calculate the probability that at least 75% of the components last 150 hours when (a) c = 300, X ~ gamma(a, b), a = 4,b = 5. (b) c = 100, X ~ gamma(a, b), a = 20,
Verify the distributions of the random variables in (5.6.5).
Let U ~ uniform(0,1). (a) Show that both - log U and - log(1 - U) are exponential random variables. (b) Show that X = log[U/(1 - U)] is a logistic(0,1) random variable. (c) Show how to generate a
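Parts (a) and (b) are inverse-transform facts that are easy to sanity-check by simulation. The sketch below (function names are illustrative) draws from both transforms and compares sample means to the known values, 1 for exponential(1) and 0 for logistic(0,1):

```python
import math
import random

rng = random.Random(42)

def exp_from_uniform(rng):
    # -log(U) is exponential(1); -log(1 - U) works equally well
    # because 1 - U is also uniform(0,1).
    return -math.log(rng.random())

def logistic_from_uniform(rng):
    # X = log(U / (1 - U)) inverts the logistic(0,1) cdf F(x) = 1/(1 + e^{-x}).
    u = rng.random()
    return math.log(u / (1 - u))

n = 100_000
exp_mean = sum(exp_from_uniform(rng) for _ in range(n)) / n    # should be near 1
logi_mean = sum(logistic_from_uniform(rng) for _ in range(n)) / n  # should be near 0
```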
Let X1,..., Xn be iid with pdf fX(x), and let X̄ denote the sample mean. Show that fX̄(x) = n fX1+···+Xn(nx),
One of the earlier methods (not one of the better ones) of generating pseudo-random standard normal random variables from uniform random variables is to take X = U1 + ··· + U12 - 6, where the Ui's are
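This generator is a direct CLT application: the sum of 12 uniform(0,1) draws has mean 6 and variance 12 × 1/12 = 1, so subtracting 6 gives mean 0 and variance 1. A sketch of the method (illustrative, and crude by modern standards since the output has bounded support [-6, 6]):

```python
import random

def crude_normal(rng):
    """Sum of 12 uniform(0,1) draws minus 6: mean 0, variance 12 * (1/12) = 1.

    A CLT-based approximation to n(0,1); the thin tails are its main defect.
    """
    return sum(rng.random() for _ in range(12)) - 6.0

rng = random.Random(0)
n = 50_000
xs = [crude_normal(rng) for _ in range(n)]
mean = sum(xs) / n                  # near 0
var = sum(x * x for x in xs) / n    # near 1
```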
For each of the distributions in the previous exercise: (a) Generate 1,000 variables from the indicated distribution. (b) Compare the mean, variance, and histogram of the generated random variables
Park et al. (1996) describe a method for generating correlated binary variables based on the following scheme. Let X1, X2, X3 be independent Poisson random variables with means λ1, λ2, λ3,
Prove that the algorithm of Example 5.6.7 generates a beta (a,b) random variable.
If X has pdf fX(x) and Y, independent of X, has pdf fY(y), establish formulas, similar to (5.2.3), for the random variable Z in each of the following situations. (a) Z = X - Y (b) Z = XY (c) Z = X/Y
(a) Suppose it is desired to generate Y ~ beta(a, b), where a and b are not integers. Show that using V ~ beta([a], [b]) will result in a finite value of M = supy fY(y)/fV(y). (b) Suppose it is
For generating Y ~ n(0,1) using an Accept/Reject Algorithm, we could generate U ~ uniform, V ~ exponential(λ), and attach a random sign to V (± each with equal probability). What value of λ will
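Working the optimization, the bound M(λ) = √(2/π) e^(λ²/2)/λ is smallest at λ = 1, giving M = √(2e/π) ≈ 1.32. A hedged sketch of the whole scheme (half-normal target via an exponential candidate, then a random sign; the function name and seed are illustrative):

```python
import math
import random

def normal_accept_reject(rng, lam=1.0):
    """Accept/Reject draw of Z ~ n(0,1): candidate V ~ exponential(lam) for |Z|,
    then a random sign.

    The envelope constant M(lam) = sqrt(2/pi) * exp(lam^2 / 2) / lam is
    minimized at lam = 1, which gives the fewest rejections.
    """
    m = math.sqrt(2.0 / math.pi) * math.exp(lam ** 2 / 2.0) / lam
    while True:
        v = -math.log(rng.random()) / lam                            # exponential(lam)
        target = math.sqrt(2.0 / math.pi) * math.exp(-v * v / 2.0)   # half-normal pdf
        candidate = lam * math.exp(-lam * v)                         # exponential pdf
        if rng.random() <= target / (m * candidate):                 # accept w.p. f/(M g)
            return v if rng.random() < 0.5 else -v

rng = random.Random(7)
n = 20_000
zs = [normal_accept_reject(rng) for _ in range(n)]
mean = sum(zs) / n                  # near 0
var = sum(z * z for z in zs) / n    # near 1
```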
A variation of the importance sampling algorithm of Exercise 5.64 can actually produce an approximate sample from f. Again let X ~ f and generate Y1, Y2,..., Ym, iid from g. Calculate qi =
In many instances the Metropolis Algorithm is the algorithm of choice because either (i) there are no obvious candidate densities that satisfy the Accept/Reject supremum condition, or (ii) the
Show that the pdf fY(y) is a stable point of the Metropolis Algorithm. That is, if Zi ~ fY(y), then Zi+1 ~ fY(y).
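The stability property can be illustrated empirically: run a random-walk Metropolis chain targeting, say, the standard normal, and check that the long-run moments match the target. A minimal sketch (step size, seed, and burn-in length are arbitrary choices, and the normal target is only an example):

```python
import math
import random

def metropolis_normal(n, rng, step=1.0):
    """Random-walk Metropolis chain targeting f(z) proportional to exp(-z^2 / 2).

    Each move proposes z + uniform(-step, step) and accepts with probability
    min(1, f(prop)/f(z)); once Z_i has the target density, Z_{i+1} does too.
    """
    z = 0.0
    out = []
    for _ in range(n):
        prop = z + rng.uniform(-step, step)  # symmetric candidate density
        # acceptance ratio f(prop)/f(z) for the standard normal kernel
        if rng.random() < math.exp((z * z - prop * prop) / 2.0):
            z = prop
        out.append(z)
    return out

rng = random.Random(3)
chain = metropolis_normal(50_000, rng)
burned = chain[5_000:]
mean = sum(burned) / len(burned)                  # near 0
var = sum(z * z for z in burned) / len(burned)    # near 1
```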
In Example 5.2.10, a partial fraction decomposition is needed to derive the distribution of the sum of two independent Cauchy random variables. This exercise provides the details that are skipped in
Let X1,..., Xn be a random sample, where X̄ and S² are calculated in the usual way. (a) Assume now that the Xi's have a finite fourth moment, and denote θ1 = EXi, θj = E(Xi - θ1)^j, j = 2, 3, 4. (b) Show that Var S²
Establish the Lagrange Identity, that for any numbers a1, a2,..., an and b1,b2,..., bn,Use the identity to show that the correlation coefficient is equal to 1 if and only if all of the sample points
Let X be one observation from a n(0, σ²) population. Is |X| a sufficient statistic?
Show that the minimal sufficient statistic for the uniform(θ, θ + 1), found in Example 6.2.15, is not complete.
Refer to the pdfs given in Exercise 6.9. For each, let X(1) < ··· < X(n) be the ordered sample, and define Yi = X(n) - X(i), i = 1,..., n - 1. a. For each of the pdfs in Exercise 6.9, verify that
A natural ancillary statistic in most problems is the sample size. For example, let N be a random variable taking values 1, 2,... with known probabilities p1, p2, . . ., where ∑pi = 1. Having
Suppose X1 and X2 are iid observations from the pdf f(x|α) = axα-1 e-xα, x > 0, α > 0. Show that (logX1)/(logX2) is an ancillary statistic.
Let X1,..., Xn be a random sample from a location family. Show that M - X̄ is an ancillary statistic, where M is the sample median.
Let X1,..., Xn be iid n(θ, aθ²), where a is a known constant and θ > 0. a. Show that the parameter space does not contain a two-dimensional open set. b. Show that the statistic T = (X̄, S²) is a
Let X1,..., Xn be iid with geometric distribution Pθ(X = x) = θ(1 - θ)^(x-1), x = 1, 2,..., 0 < θ < 1. Show that ∑Xi is sufficient for θ, and find the family of distributions of ∑Xi. Is the
Let X1,..., Xn be iid Poisson(λ). Show that the family of distributions of ∑Xi is complete. Prove completeness without using Theorem 6.2.25.
The random variable X takes the values 0, 1, 2 according to one of the following distributions:In each case determine whether the family of distributions of X is complete.
Let X1,..., Xn be independent random variables with densitiesProve that T = mini(Xi/i) is a sufficient statistic for θ.
For each of the following pdfs let X1,...,Xn be iid observations. Find a complete sufficient statistic, or show that one does not exist.a.b.c.d.e.
Let X be one observation from the pdf a. Is X a complete sufficient statistic? b. Is |X| a complete sufficient statistic? c. Does f(x|θ) belong to the exponential class?
Let X1,..., Xn be a random sample from a population with pdf f(x|θ) = θx^(θ-1), 0 < x < 1, θ > 0. a. Is ∑Xi sufficient for θ? b. Find a complete sufficient statistic for θ.
Let X1,..., Xn be a random sample from a uniform distribution on the interval (θ, 2θ), θ > 0. Find a minimal sufficient statistic for θ. Is the statistic complete?
Consider the following family of distributions: P = {Pλ(X = x): Pλ(X = x) = λ^x e^(-λ)/x!; x = 0, 1, 2,...; λ = 0 or 1}. This is a Poisson family with λ restricted to be 0 or 1. Show that the
We have seen a number of theorems concerning sufficiency and related concepts for exponential families. Theorem 5.2.11 gave the distribution of a statistic whose sufficiency is characterized in
Let X1,... ,Xn be a random sample from the inverse Gaussian distribution with pdfa. Show that the statistics are sufficient and complete. b. For n = 2, show that has an inverse Gaussian
The concept of minimal sufficiency can be extended beyond parametric families of distributions. Show that if X1,..., Xn are a random sample from a density f that is unknown, then the order statistics
Let X1,..., Xn be a random sample from the pdf Find a two-dimensional sufficient statistic for (μ, σ).
Let X1,..., Xn be a random sample from the pdf f(x|μ) = e^(-(x-μ)), where -∞ < μ < x < ∞. a. Show that X(1) = mini Xi is a complete sufficient statistic. b. Use Basu's Theorem to show that X(1)
Boos and Hughes-Oliver (1998) detail a number of instances where application of Basu's Theorem can simplify calculations. Here are a few. a. Let X1,..., Xn be iid n(μ, σ²),
Prove the Likelihood Principle Corollary. That is, assuming both the Formal Sufficiency Principle and the Conditionality Principle, prove that if E = (X, θ, {f(x|θ)}) is an experiment, then Ev(E,
Fill in the gaps in the proof of Theorem 6.3.6, Birnbaum's Theorem.a. Define g(t|θ) = g((j, xj)|θ) = f* ((j, xj)|θ) andShow that T(j, xj) is a sufficient
A risky experimental treatment is to be given to at most three patients. The treatment will be given to one patient. If it is a success, then it will be given to a second. If it is a success, it will
Joshi and Nabar (1989) examine properties of linear estimators for the parameter in the so-called "Problem of the Nile," where (X, Y) has the joint densityf(x, y|θ) =
Measurement equivariance requires the same inference for two equivalent data points: x, measurements expressed in one scale, and y, exactly the same measurements expressed in a different scale.
Prove Theorem 6.2.10.Let X1,..., Xn be iid observations from a pdf or pmf f(x|θ) that belongs to an exponential family given bywhere θ = (θ1, θ2,
Let X1,..., Xn be iid observations from a location-scale family. Let T1(X1,..., Xn) and T2(X1,..., Xn) be two statistics that both satisfy Ti(ax1 + b,..., axn + b) = aTi(x1,..., xn) for all
Suppose that for the model in Example 6.4.6, the inference to be made is an estimate of the mean μ. Let T(x) be the estimate used if X = x is observed. If ga(X) = Y = y is observed, then let T*(y)
Showing questions 70600-70700 of 88274