Questions and Answers of Applications of the Time Value of Money
Derive the difference equation (7.21) without introducing the distribution P(a, k) by applying the law of total probability to expectations.
How large must a/c be when c = 10 to ensure no more than a 50% chance of absorption at 0 (going broke) when p = 0.4, p = 0.5 and p = 0.6? Repeat the calculation when c = 20. What conclusion do you draw?
Suppose a random walk is defined by X_n = X_{n-1} + Z_n, n = 1, 2, 3, ..., where now the Z_n are i.i.d. random variables with Pr(Z_n = +1) = p, Pr(Z_n = 0) = r, Pr(Z_n = -1) = q and p + q + r = 1. (i) Sketch a
If P_a(p) denotes the probability that a simple random walk is absorbed at zero when there are absorbing states at 0 and c and the probability of a step up is p, deduce that P_a(p) = 1 - P_{c-a}(1 - p).
To solve the difference equation for the probability that the simple random walk is absorbed at 0, with boundary conditions P_0 = 1, P_c = 0, proceed as follows, the method being analogous to that of
Deduce that P_a = 1 - a/c (equation (7.16)) when p = q = 1/2 by a limiting argument from (7.17). Do this by setting p = 1/2 + Δ, q = 1/2 - Δ and letting Δ → 0.
In certain gambling situations (e.g. horse racing, dogs) the following is an approximate description. At each trial a gambler bets $m, assumed fixed. With probability q he loses all the $m and with
Let X_n = X_{n-1} + Z_n, n = 1, 2, ..., describe a random walk in which the Z_n are independent normal random variables each with mean μ and variance σ². Find the exact probability law of X_n if X_0 = x_0 with
For a simple random walk enumerate all possible sample paths that lead to the value X_4 = -2 after 4 steps. Hence verify formula (7.5) for Pr(X_4 = -2).
Let X = {X_0, X_1, X_2, ...} be a random process in discrete time and with a discrete state space. Given that successive increments X_1 - X_0, X_2 - X_1, ... are independent, show that X is a Markov process.
Give physical examples of the four kinds of random process ((a)-(d) in Section 7.1). State in each case whether the process is a Markov process.
Prove that if X_n → X then
Let {Y_n} be a sequence of random variables with E(Y_n) → θ and Var(Y_n) → 0 as n → ∞. Show that Y_n is a consistent estimator of θ. (Hint: Use Chebyshev's inequality to find an upper bound for Pr
Let X have probability density f(x) = … for x ≥ a and f(x) = 0 for x < a. Let X_1, X_2, ..., X_n be a random sample of size n for X. Show that min(X_1, X_2, ..., X_n) is a consistent estimator of a.
Let X be uniformly distributed on (0, a). Let X_1, X_2, ..., X_n be a random sample of size n for X. Show that max(X_1, X_2, ..., X_n) is a consistent estimator of a.
In statistics, if a sequence of random variables {θ̂_n, n = 1, 2, ...} converges in probability to a parameter θ, then θ̂_n is called a consistent estimator of θ. (See, for example, Hogg and Craig,
How many observations are required to ensure that the probability is at most 0.1 that the relative frequency of an event differs from its actual (unknown) probability by more than 0.1? What if it
Let X be uniformly distributed on (0, 1) and for n = 1, 2, ... let X_n be uniformly distributed on (0, 1 + 1/n). Show that X_n converges to X in distribution but does not converge to X in probability.
For n = 1, 2, ... let Pr(X_n = n²) = 1/n and Pr(X_n = 0) = 1 - 1/n. Show that X_n → 0 in probability yet E(X_n) → ∞ as n → ∞.
Let c be a constant. Prove that if X_n → c in distribution, then X_n → c in probability. Hence give an alternative proof of Corollary 3 to Theorem 6.12 using the result of Exercise 13.
Show that in Example 1 of Section 6.3, X_n converges to X in one mode of convergence but not in another.
Prove Corollaries 1, 2 and 3 to Theorem 6.12.
If c is a real number and X is a random variable, prove that for any ε, n > 0, Pr{|X - c| ≥ ε} ≤ E(|X - c|^n)/ε^n.
Prove Theorem 6.11.
Show that if the conditions of Theorem 6.10 are satisfied, then for any k > 0, Pr{|X - μ| ≥ kσ} ≤ 1/k².
Let X_λ be a Poisson random variable with parameter λ, so that E(X_λ) = Var(X_λ) = λ. Show that as λ → ∞, (X_λ - λ)/√λ converges in distribution to N(0, 1). This is the basis for the normal approximation to the Poisson
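The normal approximation mentioned in this exercise is easy to check numerically. The following is a small Python sketch (not part of the text) comparing the Poisson distribution function with Φ((k + 1/2 - λ)/√λ) for the illustrative value λ = 100:

```python
# Sketch: compare the Poisson distribution function with its normal
# approximation (with continuity correction) for a large parameter lam.
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam), by summing the pmf."""
    term = total = math.exp(-lam)
    for j in range(1, k + 1):
        term *= lam / j
        total += term
    return total

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam = 100.0
for k in (80, 90, 100, 110, 120):
    approx = std_normal_cdf((k + 0.5 - lam) / math.sqrt(lam))
    print(k, round(poisson_cdf(k, lam), 4), round(approx, 4))
```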
Let {X_k, k = 1, 2, ...} be independent and identically distributed with finite means μ and variances σ². Prove that the sample mean X̄_n → μ as n → ∞.
If for n = 1, 2, ..., Pr(X_n = 0) = 1/n = 1 - Pr(X_n = 1) and Pr(X = 1) = 1, prove that X_n → X.
Prove that if E(X^n) < ∞ then E(X^{n-1}) < ∞, n = 1, 2, .... In particular, if E(X²) < ∞ then E(X) < ∞. Hence if the second moment of X is finite then so too are its mean and variance. Thus, the
Let X_1 and X_2 be independent random variables both uniformly distributed on (-1, 1). Show that the density of X = X_1 + X_2 is f(x) = (2 - |x|)/4, x ∈ (-2, 2), and that E(X) = 0, Var(X) = 2/3.
Prove the De Moivre-Laplace form of the central limit theorem directly from the characteristic function of a binomial random variable, φ(t) = (q + pe^{it})^n.
Let {X_n, n = 1, 2, ...} be normal random variables with E(X_n) = 1/n, Var(X_n) = 1. Sketch the densities of X_1, X_2, .... Prove that X_n converges in distribution to N(0, 1) as n → ∞.
Use characteristic functions to show that if X_1 and X_2 are independent normal random variables with means μ_1, μ_2 and variances σ_1², σ_2², then X = X_1 + X_2 is normal with mean μ = μ_1 + μ_2 and variance σ² = σ_1² + σ_2².
If X is N(0, 1), show that for k = 1, 2, 3, ..., E(X^{2k}) = (2k)!/(2^k k!).
Verify the relation (6.4): …
Use characteristic functions to show that if X_1 and X_2 are independent Poisson random variables with parameters λ_1 and λ_2, then X = X_1 + X_2 is Poisson with parameter λ = λ_1 + λ_2. Generalize to the sum of n independent Poisson random variables.
Use Theorem 6.3 to obtain the characteristic function of a normal random variable with mean μ and variance σ² from that of a standard normal random variable.
Show that the characteristic function of a random variable which is gamma distributed with parameters n (a positive integer) and λ is φ(t) = [λ/(λ - it)]^n.
Show that the characteristic function of a Poisson random variable with parameter λ is φ(t) = exp[λ(e^{it} - 1)].
Let X be a random variable with Pr(X = 1) = Pr(X = -1) = 1/2. Consider the following sequence which is supposed to be a random sample for X: {-1, 1, -1, -1, -1, 1, -1, 1, 1, 1, -1, 1, -1, 1,
Let X have distribution function F(x) = x^n, 0 < x ≤ 1, where n is a positive integer ≥ 2. Give two methods of finding random numbers for X from uniform random numbers. (Hint: Use the result of
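For illustration only (n = 3 is an arbitrary choice), here is a Python sketch of the two standard constructions one might propose: the inverse transform X = U^{1/n}, and the maximum of n independent uniforms, whose distribution function is also x^n. Both sample means should be near n/(n + 1):

```python
# Sketch: two ways to sample X with F(x) = x**n on (0, 1], shown for n = 3.
import random

n, trials = 3, 100_000
inv = [random.random() ** (1.0 / n) for _ in range(trials)]           # inverse transform
mx = [max(random.random() for _ in range(n)) for _ in range(trials)]  # max of n uniforms
print(sum(inv) / trials, sum(mx) / trials, n / (n + 1))
```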
How might random numbers for a random variable which is gamma distributed with p = n, a positive integer ≥ 2, be obtained from uniform random numbers?
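One standard answer, sketched below under the integer-shape parametrisation (shape p = n, rate λ; the values n = 4 and λ = 2 are illustrative only): a gamma variate with integer shape is a sum of n independent Exp(λ) variables, each obtained from a uniform by the inverse transform.

```python
# Sketch: gamma(p = n, lam) variates from uniforms, as a sum of n Exp(lam) variables.
import math, random

def gamma_from_uniforms(n, lam):
    return -sum(math.log(1.0 - random.random()) for _ in range(n)) / lam

n, lam, trials = 4, 2.0, 100_000
xs = [gamma_from_uniforms(n, lam) for _ in range(trials)]
mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials
print(mean, n / lam)      # sample mean should be near n/lam = 2.0
print(var, n / lam ** 2)  # sample variance should be near n/lam**2 = 1.0
```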
Verify that the four partial derivatives of G_1 and G_2 are as given in (5.4)-(5.7). Hence verify that J(x, y) = -(1/2π) exp[-(x² + y²)/2].
Show that in a linear congruential sequence with n = 2^m, μ odd and λ ≡ 1 (mod 4) the maximum period is attained.
Consider the linear congruential sequence with λ = 15, μ = 17, n = 49 and N_1 = 15. Is the maximum period (49) attained? What if N_1 = 16? (Ans: yes, yes.)
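A brute-force period check is straightforward. The sketch below assumes the exercise's symbols stand for multiplier λ, increment μ, modulus n and seed N_1 (an assumption about the notation) and simply iterates N_{k+1} = (λ N_k + μ) mod n until the seed recurs:

```python
# Sketch: count the period of a linear congruential sequence by iterating until
# the starting value reappears.  Parameter roles are assumed:
# lam = multiplier, mu = increment, n = modulus, seed = N_1.
def lcg_period(lam, mu, n, seed):
    x, steps = (lam * seed + mu) % n, 1
    while x != seed:
        x = (lam * x + mu) % n
        steps += 1
    return steps

print(lcg_period(15, 17, 49, 15))  # 49, i.e. the full period
print(lcg_period(15, 17, 49, 16))  # 49 again
```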
What is the maximum period attainable with a multiplicative congruential sequence with λ = 11 and n = 16? What will the period be with N_1 = 1? N_1 = 2? (Ans: 4, 4, 2.)
Instead of using the relation X = -(1/λ) ln(1 - U) to generate random numbers from an exponential distribution we may use X = -(1/λ) ln U. Why?
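Both transformations yield Exp(λ) samples because 1 - U has the same U(0, 1) distribution as U. A quick empirical sketch (λ = 2 is an arbitrary choice):

```python
# Sketch: -ln(1 - U)/lam and -ln(U)/lam produce samples with the same exponential
# distribution; compare the two sample means with 1/lam.
import math, random

lam, trials = 2.0, 100_000
a = [-math.log(1.0 - random.random()) / lam for _ in range(trials)]
b = [-math.log(u) / lam for u in (random.random() for _ in range(trials)) if u > 0.0]
print(sum(a) / len(a), sum(b) / len(b), 1.0 / lam)
```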
Let X have distribution function F. Prove that if F is continuous, then F(X) is U(0, 1).
Find expressions for the reliability functions of the systems in Fig. 4.8.
(a) Verify formula (4.20). (b) Use the following general result (inclusion-exclusion) for sets A_i, i = 1, 2, ..., n: P(∪_{i=1}^n A_i) = Σ_i P(A_i) - Σ_{distinct pairs} P(A_i ∩ A_j) + Σ_{distinct triples} P(A_i ∩ A_j ∩ A_k) - ... + (-1)^{n+1} P(A_1 ∩ A_2 ∩ ... ∩ A_n).
A plane has four engines whose failure times are gamma distributed with mean 100 hours and p = 4. If at least three engines must operate for the plane to fly, what is the probability that a flight
Show that for n components in parallel, R(t) = 1 - ∏_{i=1}^n [1 - R_i(t)].
For n independent components in series deduce that the failure time is T = min(T_1, T_2, ..., T_n), whereas for n components in parallel T = max(T_1, T_2, ..., T_n). Hence deduce that
After a person reaches age 50 the failure times of his heart and liver are gamma distributed with means of 10 years. For the heart p = 3, whereas for the liver p = 2. Assuming the heart and liver
If T_1, T_2 are independent and gamma distributed with parameters λ_1, n_1 and λ_2, n_2, prove that (cf. (4.13)) …
A system has m locations at which the same component is used, each component having an exponentially distributed failure time with mean 1/λ. When a component at any location fails it is replaced by
This problem shows the crucial importance of back-up systems. The main power supply at a hospital has a failure time which is gamma distributed with λ = 1/1000 per hour and p = 4. A standby power
A machine component has an exponentially distributed failure time with a mean of 20 hours. (i) If there are an original and just one spare, find the probability that the component is operational
Prove that if the failure rate is constant then the failure time is exponentially distributed.
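A sketch of the argument, assuming the usual definition of the failure rate as r(t) = f(t)/(1 - F(t)) with F(0) = 0:

```latex
r(t) = \frac{f(t)}{1 - F(t)} = \lambda
\;\Longrightarrow\;
-\frac{\mathrm{d}}{\mathrm{d}t}\,\ln\bigl(1 - F(t)\bigr) = \lambda
\;\Longrightarrow\;
1 - F(t) = e^{-\lambda t},
\quad\text{so}\quad F(t) = 1 - e^{-\lambda t}.
```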
Prove that if T_1, T_2 are independent and gamma distributed with parameters λ, p_1 and λ, p_2, then T = T_1 + T_2 is gamma distributed with parameters λ, p = p_1 + p_2. (Hint: Use the relation between
Let T be gamma distributed with parameters p and λ = 1. (a) Show that … (b) Hence deduce that the failure rate is a decreasing function of t if p < 1 and an increasing function of t if
Let T have a truncated normal density f(t) = k exp[-(t - μ)²/2σ²], t > 0. Find a formula for k.
If T has a Weibull distribution with parameters λ and p, prove E(T) = λ^{-1/p} Γ(1 + 1/p), Var(T) = λ^{-2/p}[Γ(1 + 2/p) - Γ²(1 + 1/p)].
With time in hours a machine has a failure time which has a Weibull distribution with p = 0.5 and λ = … hours^{-1/2}. (a) What are the mean and variance of the failure time? (b) What is the expectation
Prove that the distribution function of a random variable which is gamma distributed with parameters λ and p = n is F(t) = 1 - e^{-λt}[1 + λt + (λt)²/2! + ... + (λt)^{n-1}/(n - 1)!].
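The stated distribution function can be sanity-checked by simulation, using the fact that a gamma variate with integer shape n is a sum of n independent Exp(λ) variables. A sketch (the values n = 3, λ = 0.5, t = 8 are arbitrary choices):

```python
# Sketch: compare the series form of the Erlang distribution function,
#   F(t) = 1 - exp(-lam*t) * sum_{k=0}^{n-1} (lam*t)**k / k!,
# with a Monte Carlo estimate based on sums of n Exp(lam) variables.
import math, random

def erlang_cdf(t, n, lam):
    s = sum((lam * t) ** k / math.factorial(k) for k in range(n))
    return 1.0 - math.exp(-lam * t) * s

n, lam, t, trials = 3, 0.5, 8.0, 100_000
hits = sum(
    -sum(math.log(1.0 - random.random()) for _ in range(n)) / lam <= t
    for _ in range(trials)
)
print(erlang_cdf(t, n, lam), hits / trials)
```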
Show that the mean and variance of a random variable T with a gamma density are E(T) = p/λ, Var(T) = p/λ².
Show that if n is a positive integer, then Γ(n) = (n - 1)!. (Hint: First show Γ(n) = (n - 1)Γ(n - 1).)
Verify the substitution property ∫ f(x)δ(x) dx = f(0), e.g. by evaluating lim_{ε→0} ∫ f(x)δ_ε(x) dx when (a) f(x) = cos x, (b) f(x) = …
Let N be binomial with parameters n and p and let {X_k, k = 1, 2, ...} be i.i.d. with mean μ and variance σ². Let S_N = X_1 + X_2 + ... + X_N. Show that E[S_N] = μnp and Var[S_N] = np[μ²(1 - p) + σ²].
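The two moment formulas can be checked by simulation. In the sketch below the X_k are taken to be Exp(1), so that μ = σ² = 1; that particular choice of summand distribution (and of n and p) is illustrative, not part of the exercise:

```python
# Sketch: simulate S_N = X_1 + ... + X_N with N ~ Binomial(n, p) and X_k ~ Exp(1),
# and compare sample mean/variance with mu*n*p and n*p*(mu**2*(1-p) + sigma**2).
import math, random

n, p, trials = 20, 0.3, 200_000
mu = sigma2 = 1.0  # mean and variance of Exp(1)
samples = []
for _ in range(trials):
    N = sum(random.random() < p for _ in range(n))
    samples.append(sum(-math.log(1.0 - random.random()) for _ in range(N)))
m = sum(samples) / trials
v = sum((s - m) ** 2 for s in samples) / trials
print(m, mu * n * p)                            # both near 6.0
print(v, n * p * (mu ** 2 * (1 - p) + sigma2))  # both near 10.2
```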
When a needle of length L < 1 is dropped onto a surface marked with parallel lines distance 1 apart, the probability of an intersection is 2L/π. If when the needle lands it breaks into N + 1
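The quoted probability 2L/π is easy to confirm by Monte Carlo; a sketch with an arbitrary needle length L = 0.8:

```python
# Sketch: Buffon's needle with short needle length L < 1 and unit line spacing;
# the crossing frequency should approach 2*L/pi.
import math, random

L, trials = 0.8, 200_000
hits = 0
for _ in range(trials):
    x = random.random() / 2.0                # centre-to-nearest-line distance
    theta = random.random() * math.pi / 2.0  # acute angle between needle and lines
    if x <= (L / 2.0) * math.sin(theta):
        hits += 1
print(hits / trials, 2.0 * L / math.pi)
```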
Using a suitable subset of the Lansing Woods picture (Fig. 3.7), test by one or more of the methods outlined in Section 3.6 (or otherwise) the Poisson forest assumption. When doing this ignore the
Let {X_i, i = 1, 2, ..., n} be a random sample of point-to-plant distances. Show that under the assumption of a Poisson forest an unbiased estimator of the reciprocal of λ (where λ is the expected number
If R is the distance to the nearest object in a Poisson forest with intensity λ, show that R² is exponentially distributed with mean 1/(λπ). Use this result in Exercise 21.
Using the result of Exercise 18, test whether the spatial distribution in Fig. 3.14 is a Poisson forest. (Use χ².)
A finite Poisson forest with an average number λ of trees per unit area is divided into N cells of equal area A. Show that the expected number of cells containing k trees is n_k = N e^{-λA}(λA)^k/k!
Let an HPPP start at t = 0. Let s > 0 be an arbitrary but fixed time point. Find the density of the time interval back from s to the most recent event.
Let N be an HPPP on the line. Show that Cov[N(t_1, t_2), N(t_3, t_4)] = λ|(t_1, t_2) ∩ (t_3, t_4)|, where |(s, t)| = t - s denotes the length of an interval.
Let N be an HPPP on the line. Given that one event occurred in (0, t], show that its time of occurrence is uniformly distributed on (0, t).
With the same setup as in Exercise 13, show that the probability that the distance between two bacteria is greater than 10 cm is p = exp(-2π/15) ≈ 0.66.
Water in a reservoir is contaminated with bacteria which occur randomly with mean rate 1 per 10 litres. If a person drinks a litre of water every day, show that the probability he swallows one or
Consider an HPPP with intensity λ in three dimensions. Show that the distance to the nearest point has density f_R(r) = 4λπr² exp(-4λπr³/3).
Northbound cars arrive at an intersection regularly at times a, 2a, 3a, ..., whereas eastbound cars arrive at random (HPPP) with mean rate λ. If a northbound and eastbound car arrive within c of each
Vehicles, consisting of cars and trucks, arrive at a checkpoint as an HPPP in time with intensity λ. If the probability that a given vehicle is a car is p, show that the arrival times of cars are HPPP
If N_1 and N_2 are independent homogeneous Poisson point processes (HPPP) with intensities λ_1 and λ_2, show that N = N_1 + N_2 is an HPPP with intensity λ = λ_1 + λ_2.
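Superposition can be illustrated by merging two simulated streams and checking that the combined count over [0, T] has mean and variance close to (λ_1 + λ_2)T. The parameter values below are arbitrary:

```python
# Sketch: merge two independent Poisson streams on [0, T]; the combined count
# should behave like a Poisson variable with mean (lam1 + lam2) * T.
import math, random

def poisson_count(lam, T):
    t, count = 0.0, 0
    while True:
        t += -math.log(1.0 - random.random()) / lam  # Exp(lam) inter-event gaps
        if t > T:
            return count
        count += 1

lam1, lam2, T, trials = 2.0, 3.0, 1.0, 50_000
counts = [poisson_count(lam1, T) + poisson_count(lam2, T) for _ in range(trials)]
m = sum(counts) / trials
v = sum((c - m) ** 2 for c in counts) / trials
print(m, v, (lam1 + lam2) * T)  # all three should be close to 5.0
```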
Show that the time interval between two events in a homogeneous Poisson point process with intensity λ has an exponential distribution with mean 1/λ. (Hint: Use the law of total probability.)
A population of size N contains initially M marked individuals but with probability q each marked individual loses its mark. A sample of size n is subsequently obtained. Under the approximation of
(a) For inverse sampling show that the maximum likelihood estimate for N is … (b) An estimator is unbiased if its expectation is equal to the quantity it estimates. We saw in Exercise 3 that for
In the inverse sampling method of capture-recapture, the sampling continues until a predetermined number, m, of marked individuals is obtained. Thus the sample size is a random variable Y. Under
A bank dispenses ten thousand $1 bills and finds out later that 100 of them are counterfeit. Subsequently 100 of the $1 bills are recovered. (a) Write down an exact expression for the probability p
Without doing any calculations deduce that the expected value of N is infinite.
A lake contains an unknown number of fish. One thousand fish are caught, marked and returned to the lake. At a later time a thousand fish are caught and 100 of them have marks. Estimate the total number of fish in the lake.
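For reference, the standard capture-recapture (Lincoln-Petersen) point estimate with these numbers is N̂ = Mn/m = 1000 × 1000 / 100 = 10 000; as a trivial check:

```python
# Sketch: Lincoln-Petersen estimate N_hat = M * n / m for the numbers in the question.
M, n, m = 1000, 1000, 100  # marked fish, size of second catch, marked recaptures
print(M * n / m)           # 10000.0
```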
Show by direct calculation from E(X) = Σ_k k Pr(X = k) that if X is a hypergeometric random variable with parameters M, n and N, then E(X) = Mn/N. (Hint: Assuming M < n, write E(X) = Σ_{k=0}^{M} k Pr(X = k) and factor nM/N out of
Let X_1, X_2, Y_1, Y_2 be independent and uniform on (0, 1). Let X = |X_2 - X_1| and Y = |Y_2 - Y_1|. We have seen in Section 2.2 that the densities of X and Y are f_X(x) = 2(1 - x), f_Y(y) = 2(1 - y),
A point is chosen at random within a square of unit side. If U is the square of the distance from the point to the nearest corner of the square, show that the distribution function of U is F_U(u)
With reference to the dropping of two points randomly in a circle, (i) Show that Pr{r < R ≤ r + dr | at least one point is in S} ≈ … arccos(…) … (ii) Complete the proof that the density of R is given by (2.1). (iii)
Let U, V, W be independent random variables taking values in (0, ∞). Show that the density of Y = U + V + W is f_Y(y) = ∫∫ f_U(u) f_V(z - u) f_W(y - z) du dz.
Let X and Y be independent random variables, both exponentially distributed with means 1/λ_1 and 1/λ_2. Find the density of Z = X + Y.
(Laplace's extension of the Buffon needle problem.) A needle of length L is thrown at random onto a rectangular grid. The rectangles of the grid have sides A and B where A, B > L. (a) Find the
Consider Buffon's needle problem with parallel lines distance 1 apart and a needle of length L, 1 < L < 2. Show that the probability of one intersection is p_1 = (2/π)[L + 2 arccos(1/L) - 2(L² - 1)^{1/2}].
Two people agree to meet between 12 noon and 1 p.m., but both forget the exact time of their appointment. If they arrive at random and wait for only ten minutes for the other person to show, prove
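A Monte Carlo sketch of this classic meeting problem; the exact probability of meeting, with a ten-minute wait inside a sixty-minute window, is 1 - (5/6)² = 11/36:

```python
# Sketch: estimate the probability that two uniformly random arrival times in a
# one-hour window differ by at most ten minutes; compare with 11/36.
import random

trials = 200_000
meet = sum(abs(random.random() - random.random()) <= 10.0 / 60.0 for _ in range(trials))
print(meet / trials, 11.0 / 36.0)
```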