Theory of Probability: Questions and Answers
The random variables \(\xi\) and \(\eta\) are independent and identically distributed with the density function\[ p_{\xi}(x)=p_{\eta}(x)=\frac{C}{1+x^{4}} \]Find the constant \(C\) and prove that
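Since \(\int_{-\infty}^{\infty} \frac{dx}{1+x^{4}}=\frac{\pi}{\sqrt{2}}\), the normalizing constant is \(C=\frac{\sqrt{2}}{\pi}\). A quick numerical check of this value (illustrative only; the grid and cutoff below are arbitrary):

```python
import numpy as np

# The integral of 1/(1 + x^4) over the real line is pi/sqrt(2),
# so the normalizing constant is C = sqrt(2)/pi ≈ 0.4502.
C = np.sqrt(2) / np.pi

# Riemann-sum check that the density integrates to 1; the tails beyond
# |x| = 200 contribute less than 1e-7 because the integrand decays like x^-4.
x = np.linspace(-200.0, 200.0, 2_000_001)
dx = x[1] - x[0]
total = np.sum(C / (1.0 + x**4)) * dx
print(total)  # ~1.0
```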
The random variables \(\xi\) and \(\eta\) are independent and their density functions are, respectively, given by\[ p_{\xi}(x)=\frac{1}{\pi \sqrt{1-x^{2}}} \ (|x|<1), \quad p_{\eta}(x)=x e^{-\frac{x^{2}}{2}} \ (x>0) \]Prove that the variable \(\xi \eta\) is normally distributed.
Let \(\xi\) and \(\zeta\) be independent and let them have the density functions\[ p_{\xi}(x)=p_{\zeta}(x)= \begin{cases}0 & \text { for } x \leqslant 0 \\ \lambda e^{-\lambda x} & \text { for }
The random variables \(\xi\) and \(\eta\) are independent and are uniformly distributed on the interval \((-1,1)\). Compute the probability that the roots of the equation \(x^{2}+\xi x+\eta=0\) are
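The roots are real iff the discriminant \(\xi^{2}-4 \eta\) is nonnegative, and averaging over the uniform pair gives \(P=\frac{1}{2}+\frac{M \xi^{2}}{8}=\frac{13}{24}\). A Monte Carlo sanity check (not the requested derivation; seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
xi = rng.uniform(-1.0, 1.0, n)
eta = rng.uniform(-1.0, 1.0, n)

# x^2 + xi*x + eta = 0 has real roots iff the discriminant xi^2 - 4*eta >= 0.
p_real = np.mean(xi**2 - 4.0 * eta >= 0.0)

# Exact value: P(eta <= xi^2/4) = 1/2 + M(xi^2)/8 = 1/2 + 1/24 = 13/24.
print(p_real)  # ~0.5417
```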
A random variable \(\xi\) takes on only integral nonnegative values with probabilities(a) \(\mathrm{P}(\xi=k)=\frac{a^{k}}{(1+a)^{k+1}}, a>0\) is a constant (this is the Pascal distribution).(b)
Let \(\mu\) be the number of occurrences of an event \(A\) in \(n\) independent trials, in each of which \(P(A)=p\). Find(a) \(M \mu^{3}\),(b) \(M \mu^{4}\),(c) \(M|\mu-n p|\)
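For part (a), writing \(\mu^{3}=\mu(\mu-1)(\mu-2)+3 \mu(\mu-1)+\mu\) and using the factorial moments of the binomial law gives \(M \mu^{3}=n p+3 n(n-1) p^{2}+n(n-1)(n-2) p^{3}\). This can be checked by direct enumeration (the particular \(n\) and \(p\) below are arbitrary):

```python
from math import comb

# Check the third-moment formula for mu ~ Binomial(n, p) by enumerating
# the whole distribution for a small n.
n, p = 7, 0.3
exact = sum(k**3 * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
formula = n*p + 3*n*(n - 1)*p**2 + n*(n - 1)*(n - 2)*p**3
print(exact, formula)  # the two agree
```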
The probability that event \(A\) will occur in the \(i\) th trial is \(p_{i}\). Let \(\mu\) be the number of occurrences of \(A\) in the first \(n\) independent trials. Find(a) \(M \mu\),(b)
Prove that, given the conditions of the preceding problem, \(D \mu\) reaches a maximum for the given value of \(a=\frac{1}{n} \sum_{1}^{n} p_{i}\) provided\[ p_{1}=p_{2}=\ldots=p_{n}=a \]
Let \(\mu\) be the number of occurrences of an event \(A\) in \(n\) independent trials, in each of which \(P(A)=p\). Also, let a variable \(\eta\) be 0 or 1 depending on whether \(\mu\) proves to be
The density function of a random variable \(\xi\) is\[ p(x)=\frac{1}{2 \alpha} e^{-\frac{|x-a|}{\alpha}} \](the Laplace distribution). Find \(M \xi\) and \(D \xi\).
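The answers here are \(M \xi=a\) and \(D \xi=2 \alpha^{2}\). A simulation check (numpy's Laplace sampler uses the same parametrization, with loc \(=a\) and scale \(=\alpha\); the particular values and seed are arbitrary):

```python
import numpy as np

# For the Laplace density p(x) = exp(-|x - a|/alpha) / (2*alpha),
# M xi = a and D xi = 2*alpha^2.
rng = np.random.default_rng(1)
a, alpha = 2.0, 1.5
sample = rng.laplace(loc=a, scale=alpha, size=1_000_000)
print(sample.mean())  # ~ a = 2.0
print(sample.var())   # ~ 2 * alpha**2 = 4.5
```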
The density function of the absolute speed of a molecule is given by the Maxwell distribution\[ p(x)=\frac{4 x^{2}}{\alpha^{3} \sqrt{\pi}} e^{-\frac{x^{2}}{\alpha^{2}}} \text { for } x>0 \]and
The probability density that a molecule in Brownian motion will be at a distance \(x\) from a reflecting wall at time \(t\), if at time \(t_{0}\) it was at a distance \(x_{0}\), is given by the
Prove that for an arbitrary random variable \(\xi\), the possible values of which lie in the interval \((a, b)\), the following inequalities are valid:\[ a \leqslant M \xi \leqslant b \text { and }
Let \(x_{1}, x_{2}, \ldots, x_{k}\) be possible values of a random variable \(\xi\). Prove that as \(n \rightarrow \infty\)(a) \(\frac{M \xi^{n+1}}{M \xi^{n}} \rightarrow \max _{1 \leqslant j
Let \(F(x)\) be the distribution function of \(\xi\). Prove that if \(\mathbf{M} \xi\) exists, then\[ M \xi=\int_{0}^{\infty}[1-F(x)+F(-x)] d x \]and for the existence of \(\boldsymbol{M} \xi\) it
Two points are dropped at random on the line-segment \((0, l)\). Find the expectation, variance and the expectation of the nth power of the distance between them.
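The known answers are \(M d=\frac{l}{3}\), \(D d=\frac{l^{2}}{18}\) and, in general, \(M d^{n}=\frac{2 l^{n}}{(n+1)(n+2)}\) for the distance \(d\) between the points. A Monte Carlo sanity check with \(l=1\) (illustrative, not a derivation):

```python
import numpy as np

# Distance between two independent uniform points on (0, l):
# M d = l/3, D d = l^2/18, M d^n = 2*l^n/((n + 1)*(n + 2)).
rng = np.random.default_rng(2)
l = 1.0
x = rng.uniform(0.0, l, 1_000_000)
y = rng.uniform(0.0, l, 1_000_000)
d = np.abs(x - y)
print(d.mean())       # ~ 1/3
print(d.var())        # ~ 1/18 ≈ 0.0556
print(np.mean(d**3))  # n = 3 case: ~ 2/(4*5) = 0.1
```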
A random variable \(\xi\) is distributed according to the logarithmic normal law; i.e., for \(x>0\) the density function of \(\xi\) is\[ p(x)=\frac{1}{x \beta \sqrt{2 \pi}} e^{-\frac{1}{2
A random variable \(\xi\) is normally distributed. Find \(M|\xi-a|\) where \(a=M \xi\).
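The answer is \(M|\xi-a|=\sigma \sqrt{\frac{2}{\pi}}\). A simulation check (the parameters and seed below are arbitrary):

```python
import numpy as np

# For xi ~ N(a, sigma^2), the mean absolute deviation about the mean is
# M|xi - a| = sigma * sqrt(2/pi).
rng = np.random.default_rng(3)
a, sigma = 1.0, 2.0
xi = rng.normal(a, sigma, 1_000_000)
mad = np.mean(np.abs(xi - a))
print(mad)  # ~ sigma * sqrt(2/pi) ≈ 1.596
```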
A box contains \(2^{n}\) tickets; the number \(i\;(i=0,1, \ldots, n)\) is written on \(C_{n}^{i}\) of them. A total of \(m\) tickets are drawn at random; \(s\) is the sum of the numbers written on
The random variables \(\xi_{1}, \xi_{2}, \ldots, \xi_{n+m}(n>m)\) are independent, identically distributed and have a finite variance. Find the correlation coefficient of the sums\[
The random variables \(\xi\) and \(\eta\) are independent and are normally distributed with the same parameters \(a\) and \(\sigma\). Find the correlation coefficient of the quantities \(\alpha
A random vector \((\xi, \eta)\) is normally distributed; \(M \xi=a\), \(M \eta=b\), \(D \xi=\sigma_{1}^{2}\), \(D \eta=\sigma_{2}^{2}\), and \(R\) is the correlation
Let \(x_{1}\) and \(x_{2}\) be the results of two independent observations of a normally distributed variable \(\xi\). Prove that \(M \max \left(x_{1}, x_{2}\right)=a+\frac{\sigma}{\sqrt{\pi}}\)
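Here \(a=M \xi\) and \(\sigma^{2}=D \xi\). A Monte Carlo sanity check of the claimed identity (illustrative; parameters and seed are arbitrary):

```python
import numpy as np

# M max(x1, x2) = a + sigma/sqrt(pi) for two independent N(a, sigma^2)
# observations.
rng = np.random.default_rng(4)
a, sigma = 0.5, 1.0
x1 = rng.normal(a, sigma, 1_000_000)
x2 = rng.normal(a, sigma, 1_000_000)
m = np.maximum(x1, x2).mean()
print(m)  # ~ 0.5 + 1/sqrt(pi) ≈ 1.064
```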
A random vector \((\xi, \eta)\) is normally distributed, \(\mathbf{M} \xi=\mathbf{M} \eta=0, \mathbf{D} \xi=\mathbf{D} \eta=1\), \(\mathbf{M} \xi \eta=R\). Prove that\[ M \max (\xi,
The unevenness in length of cotton fibre is given by\[ \lambda=\frac{a^{\prime \prime}-a^{\prime}}{a} \]where \(a\) is the expectation of fibre length, \(a^{\prime \prime}\) is the expectation of
The random variables \(\xi_{1}, \xi_{2}, \ldots, \xi_{n}, \ldots\) are independent and uniformly distributed over \((0,1)\). Let \(v\) be a random variable equal to the \(k\) for which the sum\[
Let \(\xi\) be a random variable with density function\[ p_{\xi}(x)=\frac{1}{\pi} \frac{1}{1+x^{2}} \]Find \(M \min (|\xi|, 1)\).
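Splitting the integral at \(|x|=1\) gives \(M \min (|\xi|, 1)=\frac{1}{2}+\frac{\ln 2}{\pi}\). Since \(\min (|\xi|, 1)\) is bounded, a Monte Carlo check converges despite the Cauchy tails (seed and sample size arbitrary):

```python
import numpy as np

# Direct integration gives M min(|xi|, 1) = 1/2 + ln(2)/pi for the
# standard Cauchy density 1/(pi*(1 + x^2)).
rng = np.random.default_rng(5)
xi = rng.standard_cauchy(1_000_000)
val = np.minimum(np.abs(xi), 1.0).mean()
print(val)  # ~ 0.5 + log(2)/pi ≈ 0.7206
```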
Prove that if the random variable \(\xi\) is such that \(M e^{a \xi}\) exists \((a>0\) is a constant), then\[ \mathbf{P}\{\xi \geqslant \varepsilon\} \leqslant \frac{M
Let \(f(x)>0\) be a nondecreasing function. Prove that if \(M f(|\xi-M \xi|)\) exists, then\[ \mathbf{P}\{|\xi-\mathbf{M} \xi| \geqslant \varepsilon\} \leqslant \frac{\mathbf{M}
A sequence of independent and identically distributed random variables \(\left\{\xi_{i}\right\}\) is defined by the equalities(a) \(\mathbf{P}\left\{\xi_{n}=2^{k-\log k-2 \log \log
Prove that the law of large numbers may be applied to a sequence of independent random variables \(\left\{\xi_{n}\right\}\) such that\[
Prove that if the independent random variables \(\xi_{1}, \xi_{2}, \ldots, \xi_{n}, \ldots\) are such that\[ \max _{1 \leqslant k \leqslant n} \int_{|x| \geqslant A}|x| d F_{k}(x) \rightarrow 0 \text
Using the result of the preceding problem, prove that if for a sequence of independent random variables \(\left\{\xi_{n}\right\}\) there exist numbers \(\alpha>1\) and \(\beta\) such that
Given a sequence of random variables \(\xi_{1}, \xi_{2}, \ldots\), for which \(D \xi_{n} \leqslant C\) and \(R_{i j} \rightarrow 0\) as \(|i-j| \longrightarrow \infty\left(R_{i j}\right.\) is the
Prove that the functions\[ f_{1}(t)=\sum_{k=0}^{\infty} a_{k} \cos k t, \quad f_{2}(t)=\sum_{k=0}^{\infty} a_{k} e^{i \lambda_{k} t} \]where \(a_{k} \geqslant 0\) and \(\sum_{k=0}^{\infty}
Find the characteristic function for the following probability density functions:(a) \(p(x)=\frac{a}{2} e^{-a|x|}\);(b) \(p(x)=\frac{a}{\pi\left(a^{2}+x^{2}\right)}\);(c)
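For (a) the answer is \(\frac{a^{2}}{a^{2}+t^{2}}\). This can be checked by numerical integration of \(\int p(x) e^{i t x}\, d x\) (the values \(a=2\), \(t=1.5\) below are arbitrary):

```python
import numpy as np

# Characteristic function of p(x) = (a/2) e^{-a|x|}; the claimed closed
# form is a^2/(a^2 + t^2).
a, t = 2.0, 1.5
x = np.linspace(-60.0, 60.0, 1_200_001)
dx = x[1] - x[0]
# By symmetry the imaginary part vanishes, so it suffices to integrate
# p(x) * cos(t*x).
phi = np.sum((a / 2.0) * np.exp(-a * np.abs(x)) * np.cos(t * x)) * dx
print(phi)                   # ~ 0.64
print(a**2 / (a**2 + t**2))  # = 0.64
```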
Prove that the functions\[ \varphi_{1}(t)=\frac{1}{\cosh t}, \varphi_{2}(t)=\frac{t}{\sinh t}, \varphi_{3}(t)=\frac{1}{\cosh ^{2} t} \]are characteristic functions of the density functions \[
Find the probability distributions of random variables whose characteristic functions are(a) \(\cos t\);(b) \(\cos ^{2} t\);(c) \(\frac{a}{1+i t}\);(d) \(\frac{\sin a t}{a t}\)
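For (a), \(\cos t\) is the characteristic function of a variable equal to \(\pm 1\) with probability \(\frac{1}{2}\) each; for (b), \(\cos ^{2} t=\frac{1}{2}+\frac{1}{4} e^{2 i t}+\frac{1}{4} e^{-2 i t}\) corresponds to the values \(0\) (probability \(\frac{1}{2}\)) and \(\pm 2\) (probability \(\frac{1}{4}\) each). A numerical confirmation on a grid of \(t\) values:

```python
import numpy as np

# (a) cos t  <->  xi = ±1, each with probability 1/2.
# (b) cos^2 t = 1/2 + (1/4) e^{2it} + (1/4) e^{-2it}
#     <->  xi = 0 w.p. 1/2 and xi = ±2 w.p. 1/4 each.
t = np.linspace(-5.0, 5.0, 101)
phi_a = 0.5 * np.exp(1j * t) + 0.5 * np.exp(-1j * t)
phi_b = 0.5 + 0.25 * np.exp(2j * t) + 0.25 * np.exp(-2j * t)
print(np.allclose(phi_a, np.cos(t)))     # True
print(np.allclose(phi_b, np.cos(t)**2))  # True
```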
Prove that one can find independent random variables \(\xi_{1}, \xi_{2}, \xi_{3}\) such that the probability distributions of \(\xi_{2}\) and \(\xi_{3}\) are different, while the distribution
Prove that if \(f(t)\) is a characteristic function equal to zero when \(|t| \geqslant a\), then the function \(\varphi(t)\) defined by the equations\[\varphi(t)= \begin{cases}f(t) & \text { when
Prove that if \(f(t)\) is a characteristic function, then the function\[ \varphi(t)=e^{f(t)-1} \]is also a characteristic function.
Prove that if the function \(f(t)\) is a characteristic function, then the function\[ \varphi(t)=\frac{1}{t} \int_{0}^{t} f(z) d z \]is also a characteristic function.
Prove that for any real characteristic function \(\varphi(t)\) the inequality\[ 1-\varphi(2 t) \leqslant 4\{1-\varphi(t)\} \]holds and, hence, for any characteristic function the inequality \[
Prove that for any real characteristic function the inequality\[ 1+\varphi(2 t) \geqslant 2\{\varphi(t)\}^{2} \]holds.
Prove that if \(F(x)\) is a distribution and \(f(t)\) the corresponding characteristic function, then for any value of \(x\) the equation\[ \lim _{T \rightarrow \infty} \frac{1}{2 T} \int_{-T}^{T}
Prove that if \(F(x)\) is a distribution function, \(f(t)\) the corresponding characteristic function, and \(x_{v}\) are abscissas of jumps in the function \(F(x)\), then\[ \lim _{T \rightarrow
Prove that if a random variable has a density function, then its characteristic function tends to zero as \(t \rightarrow \infty\).
A random variable \(\xi\) is Poisson distributed; \(\mathbf{M} \xi=\lambda\). Prove that as \(\lambda \rightarrow \infty\), the distribution of the variable \(\frac{\xi-\lambda}{\sqrt{\lambda}}\)
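An empirical illustration of this limit (the proof goes through characteristic functions; \(\lambda\), the sample size and the seed below are arbitrary):

```python
import numpy as np

# For large lambda, the standardized Poisson variable
# (xi - lambda)/sqrt(lambda) behaves like N(0, 1).
rng = np.random.default_rng(6)
lam = 10_000
z = (rng.poisson(lam, 1_000_000) - lam) / np.sqrt(lam)
print(z.mean(), z.std())  # ~ 0 and ~ 1
print(np.mean(z <= 1.0))  # ~ Phi(1) ≈ 0.84
```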
A random variable \(\xi\) has the density function\[ p(x)= \begin{cases}0 & \text { for } x \leqslant 0 \\ \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x} & \text { for }
Prove that if \(\varphi(t)\) is a characteristic function and the function \(\psi(t)\) is such that for some sequence \(\left\{h_{n}\right\}\left(h_{n} \rightarrow \infty\right.\) as \(\left.n
Prove that as \(n\) tends to infinity\[ \frac{1}{\Gamma\left(\frac{n}{2}\right)} \sqrt{\left(\frac{n}{2}\right)^{n}} \int_{0}^{1+t \sqrt{\frac{2}{n}}} z^{\frac{n}{2}-1} e^{-\frac{n z}{2}}
The random variables\[ \xi_{n}=\left\{\begin{array}{l} -n^{\alpha} \text { with probability } \frac{1}{2} \\ +n^{\alpha} \text { with probability } \frac{1}{2} \end{array}\right. \]are independent.
Prove that as \(n \rightarrow \infty\),\[ e^{-n} \sum_{k=0}^{n} \frac{n^{k}}{k!} \rightarrow \frac{1}{2} \]Apply the Lyapunov theorem to the sum of the Poisson distributed random variables with
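The partial sum can be evaluated directly to watch the convergence (a numerical illustration, not the Lyapunov-theorem argument the problem asks for):

```python
import math

def partial_poisson_sum(n):
    # e^{-n} * sum_{k=0}^{n} n^k / k!, accumulated term by term so that
    # no intermediate value overflows (fine for n up to ~700 in doubles).
    term = math.exp(-n)  # k = 0 term
    total = term
    for k in range(1, n + 1):
        term *= n / k
        total += term
    return total

for n in (10, 100, 500):
    print(n, partial_poisson_sum(n))  # decreases toward 1/2 as n grows
```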
The probability of occurrence of an event \(A\) in the \(i\) th trial is equal to \(p_{i} ; \mu\) is the number of occurrences of \(A\) in \(n\) independent trials. Prove that\[
Prove that under the conditions of the preceding problem, the requirement that \(\sum_{i=1}^{\infty} p_{i} q_{i}=+\infty\) is sufficient not only for the integral theorem but for the local theorem as
Prove that the distributions of(a) Pascal (Exercise 1 (a) of Chapter 5),(b) Polya (Exercise 1 (b) of Chapter 5),(c) Cauchy (Example 5, Sec. 24)are infinitely divisible.Exercise 1:(a)
Prove that a random variable with density function\[ p(x)=\left\{\begin{array}{cc} 0 & \text { for } x \leqslant 0 \\ \frac{\beta^{\alpha}}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x} & \text { for
Prove that, no matter what the constants \(\alpha>0\) and \(\beta>0\),\[ \varphi(t)=\left(1+\frac{t^{2}}{\beta^{2}}\right)^{-\alpha} \]is an infinitely divisible characteristic function.From this
Prove that if the sum of two independent infinitely divisible random variables is distributed according to(a) the Poisson law,(b) the normal law, then every summand is Poisson distributed in case