Questions and Answers of Statistical Techniques in Business
In the situation of Problem 12.63, consider the hypothesis of marginal homogeneity H : p_{i+} = p_{+i} for all i, where p_{i+} = Σ_{j=1}^a p_{ij} and p_{+i} = Σ_{j=1}^a p_{ji}. (i) The maximum-likelihood estimates of the
The hypothesis of symmetry in a square two-way contingency table arises when one of the responses A1,...,Aa is observed for each of n subjects on two occasions (e.g. before and after some
Consider the following model which therefore generalizes model (iii) of Section 4.7. A sample of n_i subjects is obtained from class A_i (i = 1,...,a), the samples from different classes being
The problem is to test independence in a contingency table. Specifically, suppose X_1,...,X_n are i.i.d., where each X_i is cross-classified, so that X_i = (r, s) with probability p_{r,s}, r = 1,...,R, s =
Prove (iii) of Theorem 12.4.2. Hint: If θ0 satisfies the null hypothesis g(θ0) = 0, then testing Ω0 behaves asymptotically like testing the null hypothesis D(θ0)(θ − θ0) = 0, which is a
Provide the details of the proof to part (ii) of Theorem 12.4.2.
Prove (12.86).
In Example 12.4.6, show that 2 log(R_n) − Q_n → 0 in probability under the null hypothesis.
In Example 12.4.6, show that Rao’s Score test is exactly Pearson’s Chi-squared test.
(i) In Example 12.4.6, check that the MLE is given by p̂_j = Y_j/n. (ii) Show (12.82).
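As a numerical companion to the multinomial items above, a minimal Monte Carlo sketch (Python standard library only; the null probabilities and sample size are illustrative, not taken from the text) comparing the likelihood ratio statistic 2 log(R_n) with Pearson's chi-squared Q_n, which should nearly agree under the null for large n:

```python
import math
import random

random.seed(0)

# Multinomial model with k cells and null cell probabilities p0
# (illustrative values). Draw n observations, form the MLE of the cell
# probabilities, and compare the LRT statistic with Pearson's chi-squared.
p0 = [0.2, 0.3, 0.5]
n = 20000
counts = [0] * len(p0)
for _ in range(n):
    u = random.random()
    acc = 0.0
    for j, pj in enumerate(p0):
        acc += pj
        if u < acc:
            counts[j] += 1
            break
    else:  # guard against floating-point rounding of the cumulative sum
        counts[-1] += 1

phat = [c / n for c in counts]                    # MLE of cell probabilities
lrt = 2 * sum(c * math.log(ph / pj)
              for c, ph, pj in zip(counts, phat, p0) if c > 0)
pearson = sum((c - n * pj) ** 2 / (n * pj)
              for c, pj in zip(counts, p0))
# Under the null, lrt and pearson differ by o_P(1).
```

Both statistics are approximately chi-squared with k − 1 = 2 degrees of freedom here; the simulation only illustrates their asymptotic equivalence, not a proof.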
Suppose (X1, Y1),..., (Xn, Yn) are i.i.d., with Xi also independent of Yi. Further suppose Xi is normal with mean µ1 and variance 1, and Yi is normal with mean µ2 and variance 1. It is known that
Suppose X_1,...,X_n are i.i.d. with the gamma Γ(g, b) density f(x) = [Γ(g) b^g]^{−1} x^{g−1} e^{−x/b}, x > 0, with both parameters unknown (and positive). Consider testing the null hypothesis that g = 1, i.e.,
Suppose X_1,...,X_n are i.i.d. N(µ, σ²) with both parameters unknown. Consider testing the simple null hypothesis (µ, σ²) = (0, 1). Find and compare the Wald test, Rao's Score test, and the
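For this problem, a small simulation sketch (Python standard library; sample size and seed are illustrative) computing the three statistics for the simple null (µ, σ²) = (0, 1), using the standard Fisher information I(µ, σ²) = diag(1/σ², 1/(2σ⁴)); under the null all three are approximately χ²₂ and agree up to o_P(1):

```python
import math
import random

random.seed(1)

n = 50000
x = [random.gauss(0.0, 1.0) for _ in range(n)]    # null holds

xbar = sum(x) / n
s2 = sum((xi - xbar) ** 2 for xi in x) / n        # MLE of sigma^2

# Wald statistic: n (theta_hat - theta_0)' I(theta_hat) (theta_hat - theta_0)
wald = n * (xbar ** 2 / s2 + (s2 - 1.0) ** 2 / (2 * s2 ** 2))

# Rao score statistic: score at the null, standardized by I(0, 1)^{-1}
sum_x = sum(x)
sum_x2 = sum(xi * xi for xi in x)
score = sum_x ** 2 / n + (sum_x2 - n) ** 2 / (2 * n)

# Likelihood ratio statistic: 2 * (sup log-lik - log-lik at (0, 1))
lrt = sum_x2 - n - n * math.log(s2)
```

The closed form for `lrt` follows from plugging the normal MLEs into the log-likelihood; this is a sketch of the comparison the problem asks for, not a substitute for the derivation.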
In Example 12.4.5, determine the distribution of the likelihood ratio statistic against an alternative, both for the simple and composite null hypotheses.
In Example 12.4.5, consider the case of a composite null hypothesis with Ω_0 given by (12.79). Show that the null distribution of the likelihood ratio statistic given by (12.80) is χ²_p. Hint:
Prove (12.76). Then, show that [Σ^{(r)}(θ)]^{−1} ≤ I^{(r)}(θ). What is the statistical interpretation of this inequality?
Suppose X_1,...,X_n are i.i.d. P_θ, with θ ∈ Ω, an open subset of R^k. Assume the family is q.m.d. at θ_0 and consider testing the simple null hypothesis θ = θ_0. Suppose θ̂_n is an estimator
In Example 12.4.7, verify (12.88) and (12.89).
Suppose X_1,...,X_n are i.i.d. N(µ, σ²) with both parameters unknown. Consider testing µ = 0 versus µ ≠ 0. Find the likelihood ratio test statistic, and determine its limiting distribution under
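A numerical sketch for this problem (illustrative parameters; Python stdlib only): with σ² unknown, the likelihood ratio statistic reduces to 2 log(λ_n) = n log(1 + t_n²/(n − 1)), where t_n is the usual t-statistic, so under the null it is close to t_n² and hence approximately χ²₁:

```python
import math
import random

random.seed(4)

n = 5000
x = [random.gauss(0.0, 3.0) for _ in range(n)]    # null mu = 0, sigma = 3

xbar = sum(x) / n
s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)  # unbiased variance estimate
t = math.sqrt(n) * xbar / math.sqrt(s2)           # t-statistic

# LRT statistic via the closed form n * log(1 + t^2 / (n - 1))
lrt = n * math.log(1.0 + t * t / (n - 1))
# lrt is within O(1/n) of t^2, which is approximately chi^2_1 under the null
```

The closed form comes from the ratio of the restricted and unrestricted variance MLEs; verifying it is part of the exercise.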
For a q.m.d. model with θ̂_n satisfying (12.62), find the limiting behavior of the Wald statistic given in the left side of (12.71) under θ_n = θ_0 + h n^{−1/2}.
Verify that h̃_n in (12.61) maximizes L̃_{n,h}.
Suppose X_1,...,X_n are i.i.d., uniformly distributed on [0, θ]. Find the maximum likelihood estimator θ̂_n of θ. Determine a sequence τ_n such that τ_n(θ̂_n − θ) has a limiting distribution, and
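A simulation sketch for this problem (illustrative θ, sample sizes, and seed; Python stdlib only): the MLE is the sample maximum, and the correct rate is τ_n = n (not the usual √n), with n(θ − θ̂_n) converging to an exponential distribution with mean θ:

```python
import random

random.seed(2)

theta = 2.0
n = 1000
reps = 4000
vals = []
for _ in range(reps):
    # MLE of theta for Uniform(0, theta) is the sample maximum
    mx = max(random.uniform(0.0, theta) for _ in range(n))
    vals.append(n * (theta - mx))          # rescaled estimation error

# The rescaled errors should look exponential with mean theta = 2.
mean_val = sum(vals) / reps
```

The nonstandard n-rate reflects that this family is not q.m.d. at the boundary-determined parameter, which is the point of the exercise.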
Let X_1, ..., X_n be i.i.d. with density f(x, θ) = [1 + θ cos(x)]/(2π), where the parameter θ satisfies |θ| < 1 and x ranges between 0 and 2π. (The observations X_i may be interpreted as
Let X_1,...,X_n be i.i.d. N(θ, θ²). Compare the asymptotic distribution of X̄_n² with that of an efficient likelihood estimator sequence.
Let X_1,...,X_n be a sample from a Cauchy location model with density f(x − θ), where f(z) = 1/[π(1 + z²)]. Compare the limiting distribution of the sample median with that of an efficient likelihood
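A quick simulation sketch for this comparison (illustrative sample sizes and seed; Python stdlib only): the sample median has asymptotic variance 1/[4f(0)²] = π²/4 ≈ 2.47, while the efficient variance is 1/I(θ) = 2, since the Fisher information of the Cauchy location family is 1/2:

```python
import math
import random
import statistics

random.seed(3)

n = 400
reps = 3000
meds = []
for _ in range(reps):
    # Standard Cauchy draws via the inverse CDF: tan(pi * (U - 1/2))
    sample = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
    meds.append(statistics.median(sample))

# n * Var(median) should be near pi^2/4 ~ 2.47, strictly above the
# efficient bound 1/I(theta) = 2.
var_scaled = n * statistics.pvariance(meds)
```

So the median is consistent at the √n rate but loses efficiency relative to the MLE; the exercise asks for exactly this comparison analytically.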
Let (X_i, Y_i), i = 1,...,n be i.i.d. such that X_i and Y_i are independent and normally distributed, X_i has variance σ², Y_i has variance τ², and both have common mean µ. (i) If σ and τ are known,
Prove Corollary 12.4.1. Hint: Simply define θ̂_n = θ_0 + n^{−1/2} I^{−1}(θ_0) Z_n and apply Theorem 12.4.1.
Generalize Example 12.4.2 to multiparameter exponential families.
Suppose X1,...,Xn are i.i.d. Pθ according to the lognormal model of Example 12.2.7. Write down the likelihood function and show that it is unbounded.
In Example 12.4.1, show that the likelihood equations have a unique solution which corresponds to a global maximum of the likelihood function.
Suppose X1,...,Xn are i.i.d. according to a model {Pθ : θ ∈Ω}, where Ω is an open subset of Rk. Assume that the model is q.m.d. Show that there cannot exist an estimator sequence Tn
Generalize Corollary 12.3.2 in the following way. Suppose T_n = (T_{n,1},...,T_{n,k}) ∈ R^k. Assume that, under P_n, (T_{n,1},...,T_{n,k}, log(L_n)) converges in distribution to (T_1,...,T_k, Z), where (T_1,...,T_k, Z) is multivariate
Assume X_1,...,X_n are i.i.d. according to a family {P_θ} which is q.m.d. at θ_0. Suppose, for some statistic T_n = T_n(X_1,...,X_n) and some function µ(θ) assumed differentiable at θ_0, n^{1/2}(T_n −
Suppose P_θ is the uniform distribution on (0, θ). Fix h and determine whether or not P_1^n and P_{1+h/n}^n are mutually contiguous. Consider both h > 0 and h < 0.
Suppose X_1,...,X_n are i.i.d. according to a model which is q.m.d. at θ_0. For testing θ = θ_0 versus θ = θ_0 + h n^{−1/2}, consider the test ψ_n that rejects H if log(L_{n,h}) exceeds z_{1−α}σ_h − 1/2
Verify (12.53) and evaluate it in the case where f(x) = exp(−|x|)/2 is the double exponential density.
Show that σ_{1,2} in (12.52) reduces to h/√π.
Prove the convergence (12.40).
Suppose Q is absolutely continuous with respect to P. If P{En} → 0, then Q{En} → 0.
Suppose X_n has distribution P_n or Q_n and T_n = T_n(X_n) is sufficient. Let P_n^T and Q_n^T denote the distribution of T_n under P_n and Q_n, respectively. Prove or disprove: Q_n is contiguous to P_n if and
Suppose, under P_n, X_n = Y_n + o_{P_n}(1); that is, X_n − Y_n → 0 in P_n-probability. Suppose Q_n is contiguous to P_n. Show that X_n = Y_n + o_{Q_n}(1).
Consider a sequence {P_n, Q_n} with likelihood ratio L_n defined in (12.36). Assume L(L_n | P_n) converges in distribution to W, where P{W = 0} = 0. Deduce that P_n is contiguous to Q_n. Also, under the assumptions of Corollary
Suppose Q_n is contiguous to P_n and let L_n be the likelihood ratio defined by (12.36). Show that E_{P_n}(L_n) → 1. Is the converse true?
Fix two probabilities P and Q and let P_n = P^n and Q_n = Q^n. Show that {P_n} and {Q_n} are contiguous iff P = Q.
Fix two probabilities P and Q and let P_n = P and Q_n = Q. Show that {P_n} and {Q_n} are contiguous iff P and Q are mutually absolutely continuous.
Show the convergence (12.35).
Prove (12.31).
Assume {P_θ, θ ∈ Ω} is L_1-differentiable, so that (12.90) holds. For simplicity, assume k = 1 (but the problem generalizes). Let φ(·) be uniformly bounded and set β(θ) = E_θ[φ(X)]. Show,
Suppose {P_θ, θ ∈ Ω} is a model with Ω an open subset of R^k, and having densities p_θ(x) with respect to µ. Define the model to be L_1-differentiable at θ_0 if there exists a vector of
Suppose X_1,...,X_n are i.i.d. and uniformly distributed on (0, θ). Let p_θ(x) = θ^{−1} I{0
To see what might happen when the parameter space is not open, let f_0(x) = x I{0 ≤ x ≤ 1} + (2 − x) I{1 < x ≤ 2}. Consider the family of densities indexed by θ ∈ [0, 1) defined by p_θ(x) =
Suppose {P_θ} is q.m.d. at θ_0. Show P_{θ_0+h}{x : p_{θ_0}(x) = 0} = o(|h|²) as |h| → 0. Hence, if X_1,...,X_n are i.i.d. with likelihood ratio L_{n,h} defined by (12.12), show that P^n_{θ_0+h n^{−1/2}}{L_{n,h} = ∞}
Suppose {P_θ} is q.m.d. at θ_0 with derivative η(·, θ_0). Show that, on {x : p_{θ_0}(x) = 0}, we must have η(x, θ_0) = 0, except possibly on a µ-null set. Hint: On {p_{θ_0}(x) = 0}, write 0 ≤ n^{1/2}
Prove Theorem 12.2.2 using an argument similar to the proof of Theorem 12.2.1.
In Example 12.2.5, show that ∫{[f′(x)]²/f(x)} dx is finite iff β > 1/2.
In Examples 12.2.3 and 12.2.4, find the quadratic mean derivative and I(θ).
Show that the definition of I(θ) in Definition 12.2.2 does not depend on the choice of dominating measure µ.
to construct a family of distributions P_θ with θ ∈ R², defined for all small |θ|, such that P_{0,0} = P, the family is q.m.d. at θ = (0, 0) with score vector at θ = (0, 0) given by (u_1(x),
Fix a probability P on S and functions u_i(x) such that ∫u_i(x) dP(x) = 0 and ∫u_i²(x) dP(x) < ∞, for i = 1, 2. Adapt
Fix a probability P. Let u(x) satisfy ∫u(x) dP(x) = 0. (i) Assume sup_x |u(x)| < ∞, so that p_θ(x) = [1 + θu(x)] defines a family of densities (with respect to P) for all small |θ|. Show this family is
Suppose X and Y are independent, with X distributed as P_θ and Y as P̄_θ, as θ varies in a common index set Ω. Assume the families {P_θ} and {P̄_θ} are q.m.d. with Fisher Information matrices
Suppose g_n is a sequence of functions in L_2(µ) and, for some function g, ∫(g_n − g)² dµ → 0. If ∫h² dµ < ∞, show that ∫h g_n dµ → ∫h g dµ.
Suppose g_n is a sequence of functions in L_2(µ); that is, ∫g_n² dµ < ∞. Assume, for some function g, ∫(g_n − g)² dµ → 0. Prove that ∫g² dµ < ∞.
Generalize Example 12.2.2 to the case of a multiparameter exponential family. Compare with the result of Problem 12.1.
Generalize Example 12.2.1 to the case where X is multivariate normal with mean vector θ and nonsingular covariance matrix Σ.
Let Y_{n,1},...,Y_{n,n} be i.i.d. Bernoulli variables with success probability p_n, where np_n = λ and λ^{1/2} = δ. Let U_{n,1},...,U_{n,n} be i.i.d. uniform variables on (−τ_n, τ_n), where τ_n² = 3p_n². Then,
Prove the second equality in (11.81). In the proof of Lemma 11.4.2, show that κn(n) → 0.
Consider the problem of testing µ(F) = 0 versus µ(F) ≠ 0, for F ∈ F_0, the class of distributions supported on [0, 1]. Let φ_n be Anderson's test. (i) If |n^{1/2} µ(F_n)| ≥ δ > 2s_{n,1−α}, then
Prove Lemma 11.4.5.
In the proof of Theorem 11.4.4, prove Sn/σ(Fn) → 1 in probability.
Suppose F satisfies the conditions of Theorem 11.4.6. Assume there exists φ_n such that sup_{F∈F: µ(F)=0} E_F(φ_n) → α. Show that lim sup_n E_F(φ_n) ≤ α for every F ∈ F.
Let φn be the classical t-test for testing the mean is zero versus the mean is positive, based on n i.i.d. observations from F. Consider the power of this test against the distribution N(µ, 1).
Assuming F is absolutely continuous with 4 moments, verify (11.76).
When sampling from a normal distribution, one can derive an Edgeworth expansion for the t-statistic as follows. Suppose X_1,...,X_n are i.i.d. N(µ, σ²) and let t_n = n^{1/2}(X̄_n − µ)/S_n, where S_n² is
Let X_1,...,X_n be a sample from N(ξ, σ²), and consider the UMP invariant level-α test of H : ξ/σ ≤ θ_0 (Section 6.4). Let α_n(F) be the actual significance level of this test when X_1,...,X_n is
Show that the test derived in … is not robust against nonnormality.
In the preceding problem, investigate the rejection probability when the Fi have different variances. Assume min ni → ∞ and ni/n → ρi.
For i = 1,...,s and j = 1,...,n_i, let X_{i,j} be independent, with X_{i,j} having distribution F_i, where F_i is an arbitrary distribution with mean µ_i and finite common variance σ². Consider testing µ_1 =
The size of each of the following tests is robust against nonnormality: (i) the test (7.24) as b → ∞, (ii) the test (7.26) as mb → ∞, (iii) the test (7.28) as m → ∞.
If Π_{i,i} are defined as in (11.56), show that Σ_{i=1}^n Π²_{i,i} = s. Hint: Since the Π_{i,i} are independent of A, take A to be orthogonal.
If ξ_i = α + βt_i + γu_i, express the condition (11.57) in terms of the t's and u's.
with c_n = n^k.
Let c_n = u_0 + u_1 n + ··· + u_k n^k, u_i ≥ 0 for all i. Then c_n satisfies (11.48). What if c_n = 2^n? Hint: Apply
Let {c_n} and {c′_n} be two increasing sequences of constants such that c_n/c′_n → 1 as n → ∞. Then {c_n} satisfies (11.48) if and only if {c′_n} does.
Show that (11.48) holds whenever cn tends to a finite nonzero limit, but the condition need not hold if cn → 0.
Suppose (11.57) holds for some particular sequence Π_Ω^{(n)} with fixed s. Then it holds for any sequence Π′_Ω^{(n)} ⊂ Π_Ω^{(n)} of dimension s′ < s. Hint: If Π_Ω is spanned by the s columns of A,
In the two-way layout of the preceding problem give examples of submodels Π_Ω^{(1)} and Π_Ω^{(2)} of dimensions s_1 and s_2, both less than ab, such that in one case the condition (11.57) continues to
Let X_{ijk} (k = 1,...,n_{ij}; i = 1,...,a; j = 1,...,b) be independently normally distributed with mean E(X_{ijk}) = ξ_{ij} and variance σ². Then the test of any linear hypothesis concerning the ξ_{ij} has a
In Example 11.3.3, verify the Huber Condition holds.
Verify (11.52).
Verify the claims made in Example 11.3.1.
Prove Lemma 11.3.3. Hint: For part (ii), use Problem 11.61.
Prove (i) of Lemma 11.3.2.
Determine the maximum asymptotic level of the one-sided t-test when α = .05 and m = 2, 4, 6: (i) in Model A; (ii) in Model B.
Show that the conditions of Lemma 11.3.1 are satisfied and γ has the stated value: (i) in Model B; (ii) in Model C.
In Model A, suppose that the number of observations in group i is n_i. If n_i ≤ M and s → ∞, show that the assumptions of Lemma 11.3.1 are satisfied and determine γ.
Verify the formula for Var(X̄) in Model A.
(i) Given ρ, find the smallest and largest value of (11.42) as σ²/τ² varies from 0 to ∞. (ii) For nominal level α = .05 and ρ = .1, .2, .3, .4, determine the smallest and the largest asymptotic
Under the assumptions of Lemma 11.3.1, compute Cov(X_i², X_j²) in terms of ρ_{i,j} and σ². Show that Var(n^{−1} Σ_{i=1}^n X_i²) → 0 and hence n^{−1} Σ_{i=1}^n X_i² → σ² in probability.
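The covariance in this problem can be checked numerically. Assuming joint normality (as in the motivating models), the Isserlis/Wick identity gives Cov(X_i², X_j²) = 2ρ²σ⁴; a small Monte Carlo sketch (illustrative ρ, σ², and replication count; Python stdlib only):

```python
import random

random.seed(5)

# Jointly normal (Xi, Xj), mean 0, unit variances, correlation rho,
# built from independent standard normals z1, z2.
rho, sigma2 = 0.4, 1.0
reps = 200000
acc = 0.0
for _ in range(reps):
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    xi = z1
    xj = rho * z1 + (1 - rho ** 2) ** 0.5 * z2
    acc += xi * xi * xj * xj

# Cov(Xi^2, Xj^2) = E[Xi^2 Xj^2] - E[Xi^2] E[Xj^2]; target 2 * rho^2 * sigma2^2
cov = acc / reps - sigma2 * sigma2
```

The identity is what makes Var(n⁻¹ Σ X_i²) → 0 under the summability conditions on ρ_{i,j} in the lemma.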