Questions and Answers of Theory of Probability
1.2 In model (1.9), suppose that n = 2 and that f satisfies f(−x1, −x2) = f(x2, x1). Show that the distribution of (X1 + X2)/2 given X2 − X1 = y is symmetric about 0. Note that if X1 and X2 are …
1.1 Prove the parts of Theorem 1.4 relating to (a) risk and (b) variance.
6.12 Prove the Bhattacharyya inequality (6.29) and show that the condition of equality is as stated.
6.11 Prove that (6.26) is necessary for equality in (6.25). [Hint: Problem 6.9(a).]
6.10 (a) Show that if the matrix A is nonsingular, then for any vector x, (x′Ax)(x′A^{-1}x) > (x′x)². (b) Show that, in the notation of Theorem 6.6 and the following discussion, [(∂/∂θi) Eθ δ]² …
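A minimal numerical sketch of the inequality in part (a), assuming additionally that A is positive definite (a case in which the bound certainly holds, by Cauchy-Schwarz); the dimension and random seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T + 5 * np.eye(5)          # positive definite, hence nonsingular
x = rng.standard_normal(5)

lhs = (x @ A @ x) * (x @ np.linalg.solve(A, x))   # (x'Ax)(x'A^{-1}x)
rhs = (x @ x) ** 2                                # (x'x)^2
print(lhs >= rhs)                                 # True
```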
6.9 (a) Let A = (a b′; b C), where a is a scalar and b a column matrix, and suppose that A is positive definite. Show that |A| ≤ a|C|, with equality holding if and only if b = 0.
6.8 Let A = (A11 A12; A21 A22) be a partitioned matrix with A22 square and nonsingular, and let B = (I −A12 A22^{-1}; 0 I). Show that |A| = |A11 − A12 A22^{-1} A21| · |A22|.
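A quick numpy check of the determinant identity in 6.8; the 6×6 random matrix and the 3/3 block split are arbitrary (a random A22 is nonsingular with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
A11, A12, A21, A22 = A[:3, :3], A[:3, 3:], A[3:, :3], A[3:, 3:]

schur = A11 - A12 @ np.linalg.solve(A22, A21)     # A11 - A12 A22^{-1} A21
print(np.isclose(np.linalg.det(A),
                 np.linalg.det(schur) * np.linalg.det(A22)))   # True
```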
6.7 Verify the expressions (6.20) and (6.21).
6.6 If p(x) = (1 − ε)φ(x − ξ) + (ε/τ)φ[(x − ξ)/τ], where φ is the standard normal density, find I(ε, ξ, τ).
6.5 Verify (a) the information matrices of Table 6.1 and (b) Equations (6.15) and (6.16).
6.4 Prove (6.11) under the assumptions of the text.
6.3 An alternate proof of Theorem 6.1 uses the method of Lagrange (or undetermined) multipliers. Show that, for fixed γ, the maximum value of a′γ, subject to the constraint that a′Ca = 1, is …
6.2 In this problem, we establish some facts about eigenvalues and eigenvectors of square matrices. (For a more general treatment, see, for example, Marshall and Olkin 1979, Chapter 20.) We use the …
6.1 For any random variables (ψ1, ..., ψs), show that the matrices ‖E(ψi ψj)‖ and C = ‖cov(ψi, ψj)‖ are positive semidefinite.
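For 6.1, the key observation is that both matrices are Gram matrices: for any vector a = (a1, ..., as)′,

$$ a'\,\|E(\psi_i\psi_j)\|\,a \;=\; E\Big(\sum_i a_i\psi_i\Big)^{2} \;\ge\; 0, $$

and the covariance case follows by applying the same identity to the centered variables ψi − Eψi.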
5.33 Let F be the class of all unimodal symmetric densities or, more generally, densities symmetric around zero and satisfying f(x) ≤ f(0) for all x. Show that min_{f∈F} ∫ x² f(x) dx = 1/12, and …
5.32 Brown and Gajek (1990) give two different sufficient conditions for (8.2) to hold, which are given below. Show that each implies (8.2). (Note that, in the progression from (a) to (b), the …
5.31 (a) Show that if (5.38) holds, then the family of densities is strongly differentiable (see Note 8.6). (b) Show that weak differentiability is implied by strong differentiability.
5.30 Show that (5.38) is satisfied if either of the following is true: (a) |∂ log pθ/∂θ| is bounded; (b) [pθ+Δ(x) − pθ(x)]/Δ → ∂pθ(x)/∂θ uniformly.
5.29 Show that the assumption (5.36(b)) implies (5.38), so Theorem 5.15 is, in fact, a corollary of Theorem 5.10.
5.28 Extend condition (5.38) to vector-valued parameters, and show that it is satisfied by the exponential family (1.5.1) for s > 1.
5.27 Verify directly that the following families of densities satisfy (5.38).
5.26 Kiefer inequality. (a) Let X have density (with respect to µ) p(x, θ) which is > 0 for all x, and let Λ1 and Λ2 be two distributions on the real line with finite first moments. Then, any …
5.25 Under the assumptions of the preceding problem, let X̄* be the integer closest to X̄. (a) The estimator X̄* is unbiased for the restricted parameter θ. (b) There exist positive …
5.24 If X1, ..., Xn are iid as N(θ, σ²) where σ is known and θ is known to have one of the values 0, ±1, ±2, ..., the inequality of the preceding problem shows that any unbiased estimator δ of the …
5.23 Let X1, ..., Xn be iid according to a density p(x, θ) which is positive for all x. Then, the variance of any unbiased estimator δ of θ satisfies …
5.22 Let X1, ..., Xn be a sample from the Poisson (λ) distribution truncated on the left at 0, i.e., with sample space X = {1, 2, 3, ...} (see Problem 3.20). Show that the Cramér-Rao lower bound for …
5.21 (Liu and Brown 1993) Let X be an observation from the normal mixture density pθ(x) = (1/(2√(2π)))[e^{−(1/2)(x−θ)²} + e^{−(1/2)(x+θ)²}], θ ∈ Θ, where Θ is any neighborhood of zero. Thus, the …
5.20 If Eθ δ = g(θ), the information inequality lower bound is IB(θ) = [g′(θ)]²/I(θ). If θ = h(ξ), where h is differentiable, show that IB(ξ) = IB(θ).
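The invariance claim in 5.20 follows from the chain rule: writing g̃(ξ) = g(h(ξ)) and using I(ξ) = I(θ)[h′(ξ)]²,

$$ IB(\xi) = \frac{[\tilde g'(\xi)]^2}{I(\xi)} = \frac{[g'(\theta)h'(\xi)]^2}{I(\theta)[h'(\xi)]^2} = \frac{[g'(\theta)]^2}{I(\theta)} = IB(\theta). $$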
5.19 Show that if Eθ δ = g(θ) and var(δ) attains the information inequality bound (5.31), then δ(x) = g(θ) + [g′(θ)/I(θ)] ∂ log pθ(x)/∂θ.
5.18 Show that if a given function g(θ) has an unbiased estimator, there exists an unbiased estimator δ which, for all θ values, attains the lower bound (5.1) for some ψ(x, θ) satisfying (5.2) if …
5.17 If pθ(x) is given by (1.5.1) with s = 1 and T(x) = δ(x), show that var[δ(X)] attains the lower bound (5.31) and is the only estimator to do so. [Hint: Use (5.18) and (1.5.15).]
5.16 (a) For the scale family with density (1/θ)f(x/θ), θ > 0, the amount of information a single observation X has about θ is (1/θ²) ∫ [y f′(y)/f(y) + 1]² f(y) dy. (b) Show that the information X …
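As a sanity check of the formula in 5.16(a), take f(y) = e^{−y} on (0, ∞), for which y f′(y)/f(y) + 1 = 1 − y and the information should come out to 1/θ². A sketch with scipy doing the integral numerically (θ = 2 is an arbitrary choice):

```python
import numpy as np
from scipy import integrate

theta = 2.0
# f(y) = exp(-y) on (0, inf): y f'(y)/f(y) + 1 = 1 - y
val, _ = integrate.quad(lambda y: (1 - y) ** 2 * np.exp(-y), 0, np.inf)
print(val / theta**2, 1 / theta**2)   # both equal 0.25
```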
5.15 For the location family generated by Student's t density, calculate the information inequality bound for unbiased estimators of θ.
5.14 Verify (a) formula (5.25) and (b) formula (5.27).
5.13 For the distribution with density (5.24), show that I(θ) is independent of θ.
5.12 Evaluate (5.25) when f is the density of Student's t-distribution with ν degrees of freedom. [Hint: Use the fact that ∫_{−∞}^{∞} dx/(1 + x²)^k = Γ(1/2)Γ(k − 1/2)/Γ(k).]
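The hinted integral is a beta integral; a numerical spot check for one value of k (k = 3 here; any k > 1/2 works):

```python
import numpy as np
from scipy import integrate, special

k = 3.0
numeric, _ = integrate.quad(lambda x: (1 + x**2) ** (-k), -np.inf, np.inf)
closed = special.gamma(0.5) * special.gamma(k - 0.5) / special.gamma(k)
print(np.isclose(numeric, closed))   # True
```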
5.11 Verify the entries of Table 5.2.
5.10 Show that (5.13) can be differentiated by differentiating under the integral sign when pθ(x) is given by (5.24), for each of the distributions of Table 5.2. [Hint: Form the difference quotient …]
5.9 For inverse binomial sampling (see Example 3.2): (a) Show that the best unbiased estimator of p is given by δ*(Y) = (m − 1)/(Y + m − 1). (b) Show that the information contained in Y about p is …
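Unbiasedness of δ*(Y) = (m − 1)/(Y + m − 1) in 5.9(a) is easy to check by simulation. The sketch below takes Y to be the number of failures before the m-th success (numpy's negative_binomial convention, which is assumed to match the problem's sampling scheme); p and m are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
p, m, reps = 0.3, 5, 200_000

Y = rng.negative_binomial(m, p, size=reps)   # failures before the m-th success
delta = (m - 1) / (Y + m - 1)
print(delta.mean(), p)                       # Monte Carlo mean is close to p
```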
5.8 Find a function of θ for which the amount of information is independent of θ: (a) for the gamma distribution Γ(α, β) with α known and with θ = β; (b) for the binomial distribution b(p, n) …
5.7 Verify the following statements, asserted by Basu (1988, Chapter 1), which illustrate the relationship between information, sufficiency, and ancillarity. Suppose that we let I(θ) = …
5.6 If X is distributed as P(λ), show that the information it contains about √λ is independent of λ.
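The reparametrization behind 5.6: with ξ = √λ, so λ = ξ², the Poisson information I(λ) = 1/λ transforms as

$$ I(\xi) = I(\lambda)\left(\frac{d\lambda}{d\xi}\right)^{2} = \frac{1}{\xi^{2}}\,(2\xi)^{2} = 4, $$

which is free of λ.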
5.5 Find I(p) for the negative binomial distribution.
5.4 If X is normal with mean zero and standard deviation σ, determine I(σ).
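For 5.4, differentiating the N(0, σ²) log density gives ∂ log p/∂σ = −1/σ + x²/σ³, and using EX² = σ², EX⁴ = 3σ⁴:

$$ I(\sigma) = E\left(-\frac{1}{\sigma} + \frac{X^{2}}{\sigma^{3}}\right)^{2} = \frac{1}{\sigma^{2}} - \frac{2\sigma^{2}}{\sigma^{4}} + \frac{3\sigma^{4}}{\sigma^{6}} = \frac{2}{\sigma^{2}}. $$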
5.3 Verify I (θ) for the distributions of Table 5.1.
5.2 Determine the conditions under which equality holds in (5.1).
5.1 Under the assumptions of Problem 1.3, determine for each p1 the value LV(p1) of the LMVU estimator of p at p1, and compare the function LV(p), 0 …
4.10 Let (X1, Y1), ..., (Xn, Yn) be iid with F ∈ F, where F is the family of all bivariate densities. Show that the sufficient statistic T, which generalizes the order statistics to the bivariate …
4.9 Under the assumptions of the preceding problem, find the UMVU estimator of (a) P(Xi ≤ Yi); (b) P(Xi ≤ Xj and Yi ≤ Yj), i ≠ j.
4.8 Let (X1, Y1), ..., (Xn, Yn) be iid F ∈ F, where F is the family of all distributions with probability density and finite second moments. Show that δ(X, Y) = Σ(Xi − X̄)(Yi − Ȳ)/(n − 1) …
4.7 Under the assumptions of Problem 4.5, let ξ = E Xi and η = E Yj. Show that ξ²η² possesses an unbiased estimator if and only if m ≥ 2 and n ≥ 2.
4.6 Under the assumptions of the preceding problem, find the UMVU estimator of P(Xi < Yj ).
4.5 If X1, ..., Xm and Y1, ..., Yn are independently distributed according to F and G ∈ F0, defined in Problem 4.1, the order statistics X(1) < ··· < X(m) and Y(1) < ··· < Y(n) …
4.4 Let X1, ..., Xn be iid with distribution F ∈ F, where F is the class of all symmetric distributions with a probability density. There exists no UMVU estimator of the center of symmetry θ of F (if …
4.3 In the preceding problem, show that 1/var_F(Xi) does not have an unbiased estimator for any n.
4.2 Let F be the class of all univariate distribution functions F that have a probability density function f and finite mth moment. (a) Let X1, ..., Xn be independently distributed with common …
4.1 Let X1, ..., Xn be iid with distribution F. (a) Characterize the totality of functions f(X1, ..., Xn) which are unbiased estimators of zero for the class F0 of all distributions F having a …
3.30 In Example 3.10, show that the estimator δ1 of pijk is unbiased for the model (3.20). [Hint: Problem 3.29.]
3.29 Let X, Y, and g be such that E[g(X, Y)|y] is independent of y. Then, E[f(Y)g(X, Y)] = E[f(Y)] E[g(X, Y)].
3.28 Verify (3.20).
3.27 In Example 3.9, show that independence of A and B implies that (n1+, ..., nI+) and (n+1, ..., n+J) are independent with multinomial distributions as stated.
3.26 In Example 3.9 when pij = pi+ p+j, determine the variances of the two unbiased estimators δ0 = nij/n and δ1 = ni+ n+j/n² of pij, and show directly that var(δ0) > var(δ1) for all n > 1.
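The variance ordering in 3.26 can also be seen by simulation. A sketch for a 2×2 table with independent margins, estimating p11 (the margin probabilities, n, and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
p_row, p_col, n, reps = np.array([0.6, 0.4]), np.array([0.3, 0.7]), 20, 100_000
p = np.outer(p_row, p_col).ravel()           # p_ij = p_{i+} p_{+j}

counts = rng.multinomial(n, p, size=reps)    # columns: n11, n12, n21, n22
n11, n1p = counts[:, 0], counts[:, 0] + counts[:, 1]   # cell and row margin
np1 = counts[:, 0] + counts[:, 2]                      # column margin

delta0 = n11 / n
delta1 = n1p * np1 / n**2
print(delta0.var(), delta1.var())            # var(delta0) > var(delta1)
```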
3.25 For the multinomial distribution of Example 3.8, (a) show that p0^{r0} ··· ps^{rs} has an unbiased estimator provided r0, ..., rs are nonnegative integers with Σ ri ≤ n; (b) find the totality of …
3.24 If X1,...,Xn are iid according to the logarithmic series distribution of Problem 1.5.14, evaluate the estimators (3.13) and (3.14) for n = 1, 2, and 3.
3.23 If X1, ..., Xn are iid P(λ), consider estimation of e^{−bλ}, where b is known. (a) Show that δ* = (1 − b/n)^t is the UMVU estimator of e^{−bλ}. (b) For b > n, describe the behavior of δ*, and …
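Unbiasedness in 3.23(a) follows from the Poisson generating function E s^T = e^{nλ(s−1)} with s = 1 − b/n and T = ΣXi ~ P(nλ). A quick Monte Carlo sketch (the values of λ, b, and n are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, b, n, reps = 2.0, 1.5, 10, 200_000

T = rng.poisson(n * lam, size=reps)     # T = sum of n iid Poisson(lam) draws
delta = (1 - b / n) ** T
print(delta.mean(), np.exp(-b * lam))   # both close to exp(-3)
```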
3.22 For the negative binomial distribution truncated at zero, evaluate the estimators (3.13) and (3.14) for m = 1, 2, and 3.
3.21 Suppose that X has the Poisson distribution truncated on the right at a, so that it has the conditional distribution of Y given Y ≤ a, where Y is distributed as P(λ). Show that λ does not have an unbiased estimator.
3.20 Let X1, ..., Xn be a sample from the Poisson distribution truncated on the left at 0, i.e., with sample space X = {1, 2, 3, ...}. (a) For t = Σxi, the UMVU estimator of λ is (Tate and Goen 1958) …
3.19 If X1,...,Xn are iid according to (3.18), the Poisson distribution truncated on the left at 0, find the UMVU estimator of θ when (a) n = 1 and (b) n = 2.
3.18 If X has the Poisson distribution P(θ), show that 1/θ does not have an unbiased estimator.
3.17 Generalize the preceding problem to the case that two points (r1, s1) and (r2, s2) with ri + si < n are added to the boundary. Assume that these two points are such that all n + 1 points x + y = …
3.16 Consider n binomial trials with success probability p, and let r and s be two positive integers with r + s …
3.15 Use (3.3) to determine A(t, n) in (3.11) for the negative binomial distribution with m = n, and evaluate the estimators (3.13) of q^r and (3.14).
3.14 For any sequential binomial sampling plan under which the point (1, 1) is reached with positive probability but is not a stopping point, find an unbiased estimator of pq depending only on (X, Y).
3.13 Consider any closed sequential binomial sampling plan with a set B of stopping points, and let B′ be the set B ∪ {(x0, y0)}, where (x0, y0) is a point not in B that has positive probability of being reached …
3.12 For any sequential binomial sampling plan, the coordinates (X, Y ) of the end point of the sample path are minimal sufficient.
3.11 Curtailed single sampling. Let a, b …
3.10 In Example 3.4(ii), (a) show that the plan is closed but not simple, (b) show that (X, Y) is not complete, and (c) evaluate the unbiased estimator (3.7) of p.
3.9 Consider sequential binomial sampling with the stopping points (0, 1) and (2, y), y = 0, 1, .... (a) Show that this plan is closed and simple. (b) Show that (X, Y) is not complete by finding a …
3.8 Verify Equation (3.7) with the appropriate definition of N(x, y) (a) for the estimation of p and (b) for the estimation of p^a q^b.
3.7 Suppose that binomial sampling is continued until the number of successes equals the number of failures. (a) This rule is closed if p = 1/2 but not otherwise. (b) If p = 1/2 and N denotes the …
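Part (a) of 3.7 is the recurrence dichotomy for the success-minus-failure random walk: it returns to 0 with probability 1 iff p = 1/2 (and with probability 1 − |p − q| otherwise). A rough simulation sketch, with a finite horizon standing in for "ever":

```python
import numpy as np

rng = np.random.default_rng(6)

def stops(p, horizon=10_000):
    steps = np.where(rng.random(horizon) < p, 1, -1)   # +1 success, -1 failure
    return np.any(np.cumsum(steps) == 0)               # counts ever equal?

for p in (0.5, 0.7):
    print(p, np.mean([stops(p) for _ in range(2_000)]))
    # close to 1 for p = 0.5; near 1 - |p - q| = 0.6 for p = 0.7
```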
3.6 If binomial sampling is continued until m successes have been obtained, let Xi (i = 1, ..., m) be the number of failures between the (i − 1)st and ith success. (a) The Xi are iid according to the …
3.5 Consider the scheme in which binomial sampling is continued until at least a successes and b failures have been obtained. Show how to calculate a reasonable estimator of log(p/q). [Hint: To …]
3.4 If Y is distributed according to (3.3), use Method 1 of Section 2.1(a) to show that the UMVU estimator of p^r (r …
3.3 (a) Use the method leading to (3.2) to find the UMVU estimator πk(T) of P[X1 + ··· + Xm = k] = (m choose k) p^k q^{m−k} (m ≤ n). (b) For fixed t and varying k, show that the πk(t) are the …
3.2 If T is distributed as b(p, n), find an unbiased estimator δ(T) of p^m (m ≤ n) by Method 1, that is, using (1.10). [Hint: Example 1.13.]
3.1 (a) In Example 3.1, show that Σ(Xi − X̄)² = T(n − T)/n. (b) The variance of T(n − T)/[n(n − 1)] in Example 3.1 is (pq/n)[(q − p)² + 2pq/(n − 1)].
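The identity in 3.1(a) is a one-liner for 0/1 data: since Σ Xi² = T, we get Σ(Xi − X̄)² = T − T²/n = T(n − T)/n. A two-line numeric confirmation (sample size 25 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.integers(0, 2, size=25)   # a binary sample
T, n = x.sum(), x.size
print(np.isclose(((x - x.mean()) ** 2).sum(), T * (n - T) / n))   # True
```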
2.29 Under the assumptions of Lemma 2.7, show that: (a) If b is replaced by any random variable B which is independent of X and not 0 with probability 1, then R_δ(θ) < R_δ*(θ). (b) If squared …
2.28 Verify (2.26).
2.27 In Example 2.6(b), show that (a) the bias of the ML estimator is 0 when ξ = u; (b) at ξ = u, the ML estimator has smaller expected squared error than the UMVU estimator. [Hint: In (b), note that …]
2.26 Verify the ML estimators given in (2.24).
2.25 Let X1, ..., Xm and Y1, ..., Yn be iid as U(0, θ) and U(0, θ′), respectively. If n > 1, determine the UMVU estimator of θ′/θ.
2.24 Let X1, ..., Xn be iid according to the uniform distribution U(ξ − b, ξ + b). If ξ and b are both unknown, find the UMVU estimators of ξ, b, and ξ/b. [Hint: Problem 1.6.30.]
2.23 In Problem 2.21, suppose that a = a′. (a) Show that the complete sufficient statistic of Problem 2.21(a) is still minimal sufficient but no longer complete. (b) Show that a UMVU estimator for a = a′ …
2.22 In the preceding problem, suppose that b = b′. (a) Show that X(1), Y(1), and Σ[Xi − X(1)] + Σ[Yj − Y(1)] are sufficient and complete. (b) Find the UMVU estimators of b and (a′ − a)/b.
2.21 Let X1, ..., Xm and Y1, ..., Yn be independently distributed as E(a, b) and E(a′, b′), respectively. (a) If a, b, a′, and b′ are completely unknown, X(1), Y(1), Σ[Xi − X(1)], and Σ[Yj − Y(1)] jointly are sufficient and complete.
2.20 Find the UMVU estimator of P(X1 ≥ u) for the family (2.22) when both a and b are unknown.
2.19 For the family (2.22) with b = 1, find the UMVU estimator of P(X1 ≥ u) and of the density e^{−(u−a)} of X1 at u. [Hint: Obtain the estimator δ(X(1)) of the density by applying Method 2 of …]
2.18 Show that the estimators (2.23) are UMVU. [Hint: Problem 1.6.18.]
2.17 For the family (2.22), show that the UMVU estimator of a when b is known and the UMVU estimator of b when a is known are as stated in Example 2.5. [Hint: Problem 6.18.]