Questions and Answers of Statistical Sampling To Auditing
8. Let (X_α1, ..., X_αp)′, α = 1, ..., n, be a sample from any p-variate distribution with zero mean and finite nonsingular covariance matrix Σ. Then the distribution of T² defined by (10) …
7. Null distribution of Hotelling's T². The statistic W = Y S⁻¹ Y′ defined by (6), where Y is a row vector, has the distribution of a ratio, of which the numerator and denominator are distributed …
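A hedged numerical sketch (not from the book) of the classical scaling behind Problems 7 and 8: under multivariate normality the one-sample Hotelling statistic T² = n x̄′S⁻¹x̄ satisfies (n − p)T²/(p(n − 1)) ~ F(p, n − p). The sample sizes and seeds below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, reps = 30, 3, 20000
scaled_t2 = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal((n, p))           # sample with true mean 0
    xbar = x.mean(axis=0)
    S = np.cov(x, rowvar=False)               # unbiased sample covariance
    t2 = n * xbar @ np.linalg.solve(S, xbar)  # Hotelling's T^2
    scaled_t2[r] = (n - p) / (p * (n - 1)) * t2

# Compare the simulated upper 5% point with the F(p, n - p) quantile.
print(np.quantile(scaled_t2, 0.95), stats.f.ppf(0.95, p, n - p))
```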
6. Let Z be the m × p matrix (Z_αi), where p ≤ m and the Z_αi are independently distributed as N(0,1); let S = Z′Z, and let S₁ be the matrix obtained by omitting the last row and column of S. Then …
5. Let Z_αi (α = 1, ..., m; i = 1, ..., p) be independently distributed as N(0,1), and let Q = Q(Y) be an orthogonal m × m matrix depending on a random variable Y that is independent of the Z …
4. In the case r = 1, the statistic W given by (6) is maximal invariant under the group induced by G₂ and G₃ on the statistics Y_i, V_αi (i = 1, ..., p; α = 1, ..., s − 1), and S = Z′Z. [There …
3. (i) If A and B are k × m and m × k matrices respectively, then the product matrices AB and BA have the same nonzero characteristic roots. (ii) This provides an alternative derivation of the fact …
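A quick numerical illustration of Problem 3(i), added here as an aside: for rectangular A (k × m) and B (m × k), the nonzero eigenvalues of AB and BA coincide.

```python
import numpy as np

rng = np.random.default_rng(1)
k, m = 3, 5
A = rng.standard_normal((k, m))
B = rng.standard_normal((m, k))

ev_ab = np.linalg.eigvals(A @ B)            # k eigenvalues
ev_ba = np.linalg.eigvals(B @ A)            # m eigenvalues, m - k of them near 0
nonzero_ba = ev_ba[np.abs(ev_ba) > 1e-10]

print(np.sort_complex(ev_ab))
print(np.sort_complex(nonzero_ba))          # same values up to ordering and round-off
```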
2. [(ii): The V's are eliminated through G₂. Since the r + m row vectors of the matrices Y and Z may be assumed to be linearly independent, any such set of vectors can be transformed into any other …
1. (i) If m < p, the matrix S, and hence the matrix S/m (which is an unbiased estimate of the unknown covariance matrix of the underlying p-variate distribution), is singular. If m ≥ p, it is …
71. In the regression model of Problem 8, generalize the confidence bands of Example 12 to the regression surfaces (i) h₁(e₁, ..., e_s) = Σ_{j=1}^s e_j β_j; (ii) h₂(e₂, ..., e_s) = β₁ + Σ_{j=2}^s e_j β_j.
70. In generalization of Problem 66, show how to extend the Dunnett intervals of Problem 69 to the set of all contrasts. [Use the fact that the event |Y_i − Y_0| ≤ Δ for i = 1, ..., s is …
69. Dunnett's method. Let X_0j (j = 1, ..., m) and X_ik (i = 1, ..., s; k = 1, ..., n) represent measurements on a standard and s competing new treatments, and suppose the X's are independently …
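As an aside on the setting of Problem 69, recent SciPy releases (1.11 and later, an assumption about the reader's environment) provide scipy.stats.dunnett for many-to-one comparisons; the data below are simulated placeholders, not the book's example.

```python
import numpy as np
from scipy.stats import dunnett

rng = np.random.default_rng(2)
control = rng.normal(0.0, 1.0, size=20)               # the standard treatment
treatments = [rng.normal(mu, 1.0, size=20) for mu in (0.2, 0.8, 1.1)]

res = dunnett(*treatments, control=control)           # many-to-one comparisons
print(res.statistic)   # one t-type statistic per new treatment
print(res.pvalue)      # p-values adjusted for simultaneous comparison
```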
68. Construct an example [i.e., choose values n₁ = ... = n_s = n and α and a particular contrast (c₁, ..., c_s)] for which the Tukey confidence intervals (121) are shorter than the Scheffé …
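The comparison requested in Problem 68 can be previewed numerically. For a pairwise contrast in a balanced one-way layout with s groups of n observations and ν = s(n − 1) error degrees of freedom, the Tukey half-width carries the factor q(α; s, ν)/√2 and the Scheffé half-width the factor √((s − 1)F(α; s − 1, ν)); the sketch below is an illustration, not the book's solution.

```python
from math import sqrt
from scipy.stats import studentized_range, f

s, n, alpha = 4, 10, 0.05
nu = s * (n - 1)                                   # error degrees of freedom

q = studentized_range.ppf(1 - alpha, s, nu)        # Tukey critical value
tukey_factor = q / sqrt(2.0)
scheffe_factor = sqrt((s - 1) * f.ppf(1 - alpha, s - 1, nu))

# For a pairwise contrast such as c = (1, -1, 0, 0) both methods use the same
# estimated standard error, so comparing the factors compares interval lengths.
print(tukey_factor, scheffe_factor)                # the Tukey factor is smaller
```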
67. (i) Let X_ij (j = 1, ..., n; i = 1, ..., s) be independent N(ξ_i, σ²), σ² unknown. Then the problem of obtaining simultaneous confidence intervals for all differences ξ_j − ξ_i is invariant under …
66. In the preceding problem consider arbitrary contrasts Σ c_i ξ_i, Σ c_i = 0. The event (120) |(X̄_j − X̄_i) − (ξ_j − ξ_i)| ≤ Δ for all i ≠ j is equivalent to the event (121) |Σ c_i X̄_i − Σ c_i ξ_i| ≤ (Δ/2) Σ |c_i| …
65. Tukey's T-method. Let X_i (i = 1, ..., r) be independent N(ξ_i, 1), and consider simultaneous confidence intervals (116) L[(i, j); x] ≤ ξ_j − ξ_i ≤ M[(i, j); x] for all i ≠ j. The problem of …
64. Let (X_1j1, ..., X_1jn; X_2j1, ..., X_2jn; ...; X_aj1, ..., X_ajn), j = 1, ..., b, be a sample from an (a·n)-variate normal distribution. Let E(X_ijk) = ξ_i and denote by Σ_ii the matrix of covariances of (X_ij1, …
63. Among all tests that are both unbiased and invariant under suitable groups under the assumptions of Problem 62, there exist UMP tests of (i) H₁: α₁ = ... = α_a = 0; (ii) H₂: …/(… + σ²) ≤ C; (iii) …
62. Formal analogy with the model of Problem 61 suggests the mixed model X_ijk = μ + α_i + B_j + C_ij + U_ijk, with the B's, C's, and U's as in Problem 61. Reduce this model to a canonical form involving …
61. Permitting interactions in the model of Problem 57 leads to the model X_ijk = μ + A_i + B_j + C_ij + U_ijk (i = 1, ..., a; j = 1, ..., b; k = 1, ..., n), where the A's, B's, C's, and U's are …
60. Under the assumptions of the preceding problem, determine the UMP invariant test (with respect to a suitable G) of H: ξ₁ = ... = ξ_p. [Show that this model agrees with that of Problem 58 if p = …
59. Let (X_1j, ..., X_pj), j = 1, ..., n, be a sample from a p-variate normal distribution with mean … −1/(p − 1). [For fixed σ and ρ < 0, the quadratic form (1/σ²) ΣΣ a_ij y_i y_j = Σ y_i² + ρ ΣΣ y_i y_j takes …
58. For the mixed model X_ij = μ + α_i + B_j + U_ij (i = 1, ..., a; j = 1, ..., n), where the B's and U's are as in Problem 57 and the α's are constants adding to zero, determine (with respect to a suitable …
57. Consider the additive random-effects model X_ijk = μ + A_i + B_j + U_ijk (i = 1, ..., a; j = 1, ..., b; k = 1, ..., n), where the A's, B's, and U's are independent normal with zero means and variances …
56. Under the assumptions of the preceding problem, the null distribution of W* is independent of q and hence the same as in the normal case, namely, F with r and n − s degrees of freedom. [See …
55. Consider the following generalization of the univariate linear model of Section 1. The variables X_i (i = 1, ..., n) are given by X_i = ξ_i + U_i, where (U₁, ..., U_n) have a joint density which is …
54. Consider the mixed model obtained from (115) by replacing the random variables A_i by unknown constants α_i satisfying Σα_i = 0. With (ii) replaced by (ii′) Σα_i²/(n… + σ²), there again exist tests …
53. Consider the model II analogue of the two-way layout of Section 6, according to which … constant (which may be zero): (i) σ_C²/σ²; (ii) σ_A²/(nσ_C² + σ²); (iii) σ_B²/(nσ_C² + σ²). Note that the test of (i) …
52. The general nested classification with a constant number of observations per cell, under model II, has the structure X_ijk... = μ + A_i + B_ij + C_ijk + ... + U_ijk..., i = 1, ..., a; j = 1, …
51. If X_ij is given by (95) but the number n_i of observations per batch is not constant, obtain a canonical form corresponding to (96) by letting Y_i1 = √n_i X̄_i·. Note that the set of sufficient …
50. The tests (102) and (103) are UMP unbiased.
49. In the model (95), the correlation coefficient ρ between two observations X_ij, X_ik belonging to the same class, the so-called intraclass correlation coefficient, is given by ρ = σ_A²/(σ_A² + σ²).
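A hedged simulation sketch of the quantity in Problem 49 (the model and estimator below are the standard balanced one-way random-effects ones, assumed rather than quoted from the book): ρ = σ_A²/(σ_A² + σ²) is estimated by ρ̂ = (MSB − MSW)/(MSB + (n − 1)MSW).

```python
import numpy as np

rng = np.random.default_rng(3)
a, n = 50, 8                     # a classes, n observations per class
sigma_A, sigma = 2.0, 1.0        # true rho = 4 / (4 + 1) = 0.8

effects = rng.normal(0.0, sigma_A, size=(a, 1))
x = 5.0 + effects + rng.normal(0.0, sigma, size=(a, n))

class_means = x.mean(axis=1)
msb = n * class_means.var(ddof=1)                              # between-class mean square
msw = ((x - class_means[:, None]) ** 2).sum() / (a * (n - 1))  # within-class mean square

rho_hat = (msb - msw) / (msb + (n - 1) * msw)
print(rho_hat)   # close to 0.8 for moderately large a
```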
48. (i) The test (97) of H: Δ ≤ Δ₀ is UMP unbiased. (ii) Determine the UMP unbiased test of H: Δ = Δ₀ and the associated uniformly most accurate unbiased confidence sets for Δ.
47. (i) In Example 10, the simultaneous confidence intervals (89) reduce to (93). (ii) What change is needed in the confidence intervals of Example 10 if the v's are not required to satisfy (92), …
46. (ii) The most general confidence sets (87) which are equivariant under G₁, G₂, and G₃ are of the form (88). (i) In Example 11, the set of linear functions Σ w_i α_i … for all w can also be …
45. (i) The confidence intervals L(u; y, S) = Σ u_i y_i − c(S) … are equivariant under G₃ if and only if L(u; by, bS) = bL(u; y, S) for all b > 0.
44. Let X_i (i = 1, ..., r) be independent N(ξ_i, 1). (i) The only simultaneous confidence intervals equivariant under G₀ are those given by (80). (ii) The inequalities (80) and (82) are equivalent. (iii) …
43. (i) A function L is equivariant under G₂ if and only if it satisfies (64). For the confidence sets (70), equivariance under G₁ and G₃ reduces to (71) and (72) respectively. (ii) For fixed …
42. (i) A function L satisfies the first equation of (62) for all u, x, and orthogonal transformations Q if and only if it depends on u and x only through u′x, x′x, and u′u. (ii) …
41. Give an example of an analysis of covariance (46) in which (56) does not hold but the level of the F-test of H: α₁ = ... = α_b is robust against nonnormality.
40. Show how to weaken (56) if a robustness condition is required only for testing a particular subspace Π_ω of Π_Ω. [Suppose that Π_ω is given by β₁ = ... = β_r = 0, and use (54).]
39. Show that Σ_{i=1}^n Π_ii = s. [Since the Π_ii are independent of A, take A to be orthogonal.]
38. If ξ_i = α + βt_i + γu_i, express the condition (56) in terms of the t's and u's.
37. (i) Under the assumptions of Problem 30, express the condition (56) in terms of the t's. (ii) Determine whether the condition of part (i) is equivalent to (51).
36. Let c_n = u₀ + u₁n + ... + u_k n^k, u_i ≥ 0 for all i. Then c_n satisfies (56). [Apply Problem 35 with c′_n = n^k.]
35. Let {c_n} and {c′_n} … → ∞. Then {c_n} satisfies (56) if and only if {c′_n} does.
34. Suppose (56) holds for some particular sequence Π_Ω^(n) with fixed s. Then it holds for any sequence Π_ω^(n) ⊂ Π_Ω^(n) of dimension s′ < s. [If Π_Ω is spanned by the s columns of A, let Π_ω be spanned …
33. In the two-way layout of the preceding problem give examples of submodels Π_Ω^(1) and Π_Ω^(2) of dimensions s₁ and s₂, both less than ab, such that in one case the condition (56) continues to require …
32. Let X_ijk (k = 1, ..., n_ij; i = 1, ..., a; j = 1, ..., b) be independently normally distributed with mean E(X_ijk) = ξ_ij and variance σ². Then the test of any linear hypothesis concerning the …
31. Verify the claims made in Example 8.
30. Let X₁, ..., X_n be independently normally distributed with common variance σ² and means ξ_i = α + βt_i + γt_i², where the t_i are known. If the coefficient vectors (t₁^k, ..., t_n^k), k = 0, 1, 2, are …
29. Let X₁, ..., X_m; Y₁, ..., Y_n be independently normally distributed with common variance σ² and means E(X_i) = α + β(u_i − ū), E(Y_j) = γ + δ(v_j − v̄), where the u's and v's are known …
28. In a regression situation, suppose that the observed values X_j and Y_j of the independent and dependent variable differ from certain true values X_j′ and Y_j′ by errors U_j, V_j which are independently …
27. In the three-factor situation of the preceding problem, suppose that a = b = m. The hypothesis H can then be tested on the basis of m² observations as follows. At each pair of levels (i, j) of …
26. Let X_ijk (i = 1, ..., a; j = 1, ..., b; k = 1, ..., m) be independently normally distributed with common variance σ² and mean E(X_ijk) = μ + α_i + β_j + γ_k (Σα_i = Σβ_j = Σγ_k = 0). Determine …
25. Let χ²_λ denote a random variable distributed as noncentral χ² with f degrees of freedom and noncentrality parameter λ². Then χ²_λ′ is stochastically larger than χ²_λ if λ < λ′. [It is enough to …
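Problem 25's stochastic-ordering claim is easy to visualize; the purely illustrative check below compares survival functions of two noncentral χ² laws with the same degrees of freedom, using SciPy's convention that the noncentrality argument is λ².

```python
import numpy as np
from scipy.stats import ncx2

df = 5
lam_small, lam_large = 1.0, 2.0            # lambda < lambda'
v = np.linspace(0.1, 40, 200)

sf_small = ncx2.sf(v, df, lam_small**2)    # P(chi2_lambda  > v)
sf_large = ncx2.sf(v, df, lam_large**2)    # P(chi2_lambda' > v)

print(np.all(sf_large >= sf_small))        # True: chi2_lambda' is stochastically larger
```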
24. The size of each of the following tests is robust against nonnormality: (i) the test (35) as b → ∞, (ii) the test (37) as mb → ∞, (iii) the test (39) as m → ∞. Note. Nonrobustness …
23. In the two-way layout of Section 6 with a = b = 2, denote the first three terms in the partition of ΣΣΣ(X_ijk − X̄_···)² by S_A², S_B², and S_AB², corresponding to the A, B, and AB effects (i.e. the α's, …
22. The linear-hypothesis test of the hypothesis of no interaction in a two-way layout with m observations per cell is given by (39).
21. The Tukey T-method leads to the simultaneous confidence intervals (114) |(X̄_j· − X̄_i·) − (μ_j − μ_i)| ≤ cS/√(sn(n − 1)) for all i, j. [The probability of (114) is independent of the μ's and hence …
20. Show that the Tukey levels (vi) satisfy (29) when s is even but not when s is odd.
19. Prove Lemma 2 when s is odd.
18. In Lemma 1, show that α_{s−1} = α_s is necessary for admissibility.
17. (i) For the validity of Theorem 1 it is only required that the probability of rejecting homogeneity of any set containing {μ_i₁, ..., μ_iᵥ} as a proper subset tends to 1 as the …
16. Show that Σ_{i=1}^{r+1} Y_i² − (Y₁ + ... + Y_{r+1})²/(r + 1) − [Σ_{i=1}^{r} Y_i² − (Y₁ + ... + Y_r)²/r] ≥ 0.
15. (i) If X₁, ..., X_n is a sample from a Poisson distribution with mean E(X_i) = λ, then √n(√X̄ − √λ) tends in law to N(0, 1/4) as n → ∞. (ii) If X has the binomial distribution b(p, n), then …
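A brief simulation (mine, not the book's) of part (i) of Problem 15: the variance-stabilized statistic √n(√X̄ − √λ) for a Poisson sample has limiting variance 1/4.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, reps = 3.0, 500, 20000

x = rng.poisson(lam, size=(reps, n))
stat = np.sqrt(n) * (np.sqrt(x.mean(axis=1)) - np.sqrt(lam))

print(stat.var())   # close to 0.25, the N(0, 1/4) limit
```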
14. Let Z₁, ..., Z_s be independently distributed as N(ζ_i, a_i²), i = 1, ..., s, where the a_i are known constants. (i) With respect to a suitable group of linear transformations there exists a UMP …
13. If the variables X_ij (j = 1, ..., n_i; i = 1, ..., s) are independently distributed as N(μ_i, σ²), then E[Σ n_i(X̄_i· − X̄_··)²] = (s − 1)σ² + Σ n_i(μ_i − μ̄_·)², E[ΣΣ(X_ij − X̄_i·)²] = (n − s)σ².
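The two expectations in Problem 13 can be checked by simulation; the sketch below assumes μ̄ is the n_i-weighted mean of the μ_i, which is the convention under which the stated identity holds.

```python
import numpy as np

rng = np.random.default_rng(5)
mu = np.array([0.0, 1.0, 2.5])
n_i = np.array([4, 6, 10])
sigma, reps = 1.5, 20000
n, s = n_i.sum(), len(mu)
mu_bar = (n_i * mu).sum() / n

between, within = np.zeros(reps), np.zeros(reps)
for r in range(reps):
    groups = [rng.normal(m, sigma, size=k) for m, k in zip(mu, n_i)]
    means = np.array([g.mean() for g in groups])
    grand = np.concatenate(groups).mean()
    between[r] = (n_i * (means - grand) ** 2).sum()
    within[r] = sum(((g - g.mean()) ** 2).sum() for g in groups)

print(between.mean(), (s - 1) * sigma**2 + (n_i * (mu - mu_bar) ** 2).sum())
print(within.mean(), (n - s) * sigma**2)
```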
12. Under the assumptions of the preceding problem suppose that E(X_i) = ξ_i = Σ_{j=1}^s a_ij β_j, E(Y_i) = η_i = Σ_{j=1}^s b_ij β_j, with the n × s matrices A = (a_ij) and B = (b_ij) of rank s. Then the experiment based on the …
11. Consider two experiments with observations (X₁, ..., X_n) and (Y₁, ..., Y_n) respectively, where the X_i and Y_i are independent normal with variance σ² = 1 and means E(X_i) = c_i θ_i, E(Y_i) = θ_i.
10. Let X₁, ..., X_n be independently normally distributed with known variance σ₀² and means E(X_i) = ξ_i, and consider any linear hypothesis with s ≤ n (instead of s < n which is required when the variance …
9. Let X_ij (j = 1, ..., m_i) and Y_ik (k = 1, ..., n_i) be independently normally distributed with common variance σ² and means E(X_ij) = ξ_i and E(Y_ik) = ξ_i + Δ. Then the UMP invariant test of H: Δ = …
8. Under the assumptions of Section 1 suppose that the means ξ_i are given by ξ_i = Σ_{j=1}^s a_ij β_j, where the constants a_ij are known and the matrix A = (a_ij) has full rank, and where the β_j are unknown …
7. Given any ψ₂ > 0, apply Theorem 9 and Lemma 3 of Chapter 6 to obtain the F-test (7) as a Bayes test against a set Ω′ of alternatives contained in the set 0 < ψ ≤ ψ₂ …
6. Use Theorem 8 of Chapter 6 to show that the F-test (7) is α-admissible against Ω′: ψ ≥ ψ₁ for any ψ₁ > 0.
5. Best average power. (i) Consider the general linear hypothesis H in the canonical form given by (2) and (3) of Section 1, and for any η_{r+1}, ..., η_s, σ, and ρ let S = S(η_{r+1}, ..., η_s, …
4. (i) The noncentral χ² and F distributions have strictly monotone likelihood ratio. (ii) Under the assumptions of Section 1, the hypothesis H′: ψ² ≤ ψ₀² (ψ₀ > 0 given) remains invariant under the …
3. Noncentral F- and beta-distribution. Let Y₁, ..., Y_r; Y_{s+1}, ..., Y_n be independently normally distributed with common variance σ² and means E(Y_i) = η_i (i = 1, ..., r); E(Y_i) = 0 (i = s …
2. Noncentral χ²-distribution. (i) If X is distributed as N(ψ, 1), the probability density of V = X² is p_ψ(v) = Σ_{k=0}^∞ P_k(ψ) f_{2k+1}(v), where P_k(ψ) = (ψ²/2)^k e^{−ψ²/2}/k! and where f_{2k+1} …
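The Poisson-mixture representation in Problem 2 can be verified directly; a hedged numerical check follows, comparing the sum Σ_k P_k(ψ) f_{2k+1}(v) with SciPy's packaged noncentral χ² density (whose second shape parameter is the noncentrality ψ²).

```python
import numpy as np
from scipy.stats import chi2, ncx2, poisson

psi = 1.7
v = np.linspace(0.05, 15, 50)

# Poisson(psi^2 / 2) weights on central chi^2 densities with 2k + 1 df.
mixture = sum(
    poisson.pmf(k, psi**2 / 2) * chi2.pdf(v, 2 * k + 1)
    for k in range(200)
)

print(np.max(np.abs(mixture - ncx2.pdf(v, 1, psi**2))))   # ~0
```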
1. Expected sums of squares. The expected values of the numerator and denominator of the statistic W* defined by (7) are E[Σ_{i=1}^r Y_i²/r] = σ² + (1/r) Σ_{i=1}^r η_i² and E[Σ_{i=s+1}^n Y_i²/(n − s)] = σ².
81. Under the assumptions of Problem 79, suppose that a family of confidence sets S(x) is equivariant under G*. Then there exists a set B in the range space of the pivotal V such that (70) holds. In …
80. Under the assumptions of the preceding problem, the confidence set S(x) is equivariant under G*.
79. (i) If G is transitive over 𝒳 × ω and V(X, θ) is maximal invariant under G, then V(X, θ) is pivotal. (ii) By (i), any quantity W(X, θ) which is invariant under G is pivotal; give an example …
78. Let V(X, θ) be any pivotal quantity [i.e., have a fixed probability distribution independent of (θ, ϑ)], and let B be any set in the range space of V with probability P(V ∈ B) = 1 − α. Then the …
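Problem 78's recipe, inverting a pivotal quantity to obtain confidence sets, is the familiar route to the t interval; here is a minimal sketch with the pivot √n(X̄ − θ)/S ~ t(n − 1), my example rather than the book's.

```python
import numpy as np
from scipy.stats import t

rng = np.random.default_rng(6)
x = rng.normal(10.0, 2.0, size=25)
n, alpha = len(x), 0.05

xbar, s = x.mean(), x.std(ddof=1)
crit = t.ppf(1 - alpha / 2, n - 1)   # B = {|V| <= crit} has probability 1 - alpha

# Inverting V = sqrt(n) * (xbar - theta) / s over B gives the confidence set for theta.
print(xbar - crit * s / np.sqrt(n), xbar + crit * s / np.sqrt(n))
```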
77. (i) Let X₁, ..., X_m; Y₁, ..., Y_n be i.i.d. according to a continuous distribution F, let the ranks of the Y's be S₁ < ... < S_n, and let T = h(S₁) + ... + h(S_n). Then if either m = n or …
76. The Kolmogorov test (56) for testing H: F = F₀ (F₀ continuous) is consistent against any alternative F₁ ≠ F₀, that is, its power against any fixed F₁ tends to 1 as n → ∞. [The critical value Δ …
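The consistency claim of Problem 76 shows up clearly in a small power study; the illustrative sketch below tests H: F = N(0, 1) with scipy.stats.kstest against one fixed shifted alternative and lets n grow.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(7)
alpha, reps = 0.05, 500

for n in (20, 80, 320):
    rejections = 0
    for _ in range(reps):
        sample = rng.normal(0.3, 1.0, size=n)        # fixed alternative F1 != F0
        if kstest(sample, "norm").pvalue < alpha:    # H0: standard normal
            rejections += 1
    print(n, rejections / reps)                      # rejection rate climbs toward 1
```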
75. The totality of permutations of K distinct numbers a₁, ..., a_K for varying a₁, ..., a_K can be represented as a subset C_K of Euclidean K-space R^K, and the group G of Example 8 as the union of …
74. Let X₁, ..., X_n be a sample from N(ξ, σ²), and consider the UMP invariant level-α test of H: ξ/σ ≤ θ₀ (Section 6.4). Let α_n(F) be the actual significance level of this test when X₁, ..., X_n is a …
73. The following UMP unbiased tests of Chapter 5 are also UMP invariant under change in scale: (i) The test of g ≤ g₀ in a gamma distribution (Problem 73 of Chapter 5). (ii) The test of b₁ ≤ b₂ …
72. The UMP invariant test of Problem 69 is also UMP similar. [Consider the problem of testing a = 0 vs. a > 0 in the two-parameter exponential family with density C(a, τ) exp(−(a/2τ²) Σx_i² − ((1 − a)/τ) Σ|x_i|) …
71. Show that the test of Problem 5(i) reduces to (i) [x_(n) − x_(1)]/S < c for normal vs. uniform; (ii) [x̄ − x_(1)]/S < c for normal vs. exponential; (iii) [x̄ − x_(1)]/[x_(n) − x_(1)] < c for uniform vs. …
70. Uniform vs. triangular. (i) For f₀(x) = 1 (0 < x < 1), f₁(x) = 2x (0 < x < 1), the test of Problem 68 reduces to rejecting when T = x_(n)/x̃ < C. (ii) Under f₀, the statistic 2n log T is …
69. Normal vs. double exponential. For f₀(x) = e^{−x²/2}/√(2π), f₁(x) = e^{−|x|}/2, the test of the preceding problem reduces to rejecting when √(Σx_i²)/Σ|x_i| …
68. Let X₁, ..., X_n be a sample from a distribution with density (1/τⁿ) f(x₁/τ) ··· f(x_n/τ), where f(x) is either zero for x < 0 or symmetric about zero. The most powerful scale-invariant test for …
67. Consider the problem of obtaining a (two-sided) confidence band for an unknown continuous cumulative distribution function F. (i) Show that this problem is invariant both under strictly …
66. If the confidence sets S(x) are equivariant under the group G, then the probability P_θ{θ ∈ S(X)} of their covering the true value is invariant under the induced group Ḡ.
65. Let X_ij (j = 1, ..., n_i; i = 1, ..., s) be samples from the exponential distribution E(ξ_i, σ). Determine the smallest equivariant confidence sets for (ξ₁, ..., ξ_s) with respect to the group …
64. Let X₁, ..., X_n be a sample from the exponential distribution E(ξ, σ). With respect to the transformations X_i′ = bX_i + a determine the smallest equivariant confidence sets (i) for σ, both when …
63. Solve the problem corresponding to Example 20 when (i) X₁, ..., X_n is a sample from the exponential density E(a, σ), and the parameter being estimated is σ; (ii) X₁, ..., X_n is a sample from the …
62. Generalize the confidence sets of Example 18 to the case that the X_i are N(ξ_i, d_i²σ²) where the d's are known constants.
61. Let X₁, ..., X_m; Y₁, ..., Y_n be independently normally distributed as N(ξ, σ²) and N(η, σ²) respectively. Determine the equivariant confidence sets for η − ξ that have smallest …
Showing 100–200 of 3033