P(carp | L2) = 0.7, P(tench | L2) = 0.1, P(perch | L2) = 0.1, P(pike | L2) = 0.1
P(carp | L3) = 0.3, P(tench | L3) = 0.3, P(perch | L3) = 0.2, P(pike | L3) = 0.2

Deliverables:

- The man returns from a day's fishing with 1 tench, 2 carp and 1 pike. Which lake did he go to? Show all your workings in full.
- Now suppose that our man camps by the same lake for three days and fishes each day:
  - on day 1 he catches 2 carp, 1 tench and 1 perch
  - on day 2 he catches 1 carp, 1 tench and 2 pike
  - on day 3 he catches 1 carp and 3 tench
  Which lake did he go to? What assumptions have we made, and are they reasonable? Show all your workings in full.

11. Bayesian Inference

8. Let X ~ N(μ, 1). Consider testing H0 : μ = 0 versus H1 : μ ≠ 0. Take P(H0) = P(H1) = 1/2. Let the prior for μ under H1 be μ ~ N(0, b²). Find an expression for P(H0 | X = x). Compare P(H0 | X = x) to the p-value of the Wald test. Do the comparison numerically for a variety of values of x and b. Now repeat the problem using a sample of size n. You will see that the posterior probability of H0 can be large even when the p-value is small, especially when n is large. This disagreement between Bayesian and frequentist testing is called the Jeffreys-Lindley paradox.

Bayesian inference for a proportion. For this problem we will return to the Beta-Binomial setup from earlier classes/HWs:

θ ~ Beta(α, β),    X1, ..., Xn | θ ~ iid Bernoulli(θ).

Recall that the posterior distribution is

θ | X1, ..., Xn ~ Beta(α + T, β + n − T),

where T = X1 + ... + Xn is the total number of successes. You can use any facts we've stated before about the Beta(α, β) distribution, including: its density is (1/B(α, β)) x^(α−1) (1 − x)^(β−1), and its mean is α/(α + β).

(a) What is the posterior mean of θ, i.e. its expected value under the posterior distribution, E(θ | X1, ..., Xn)?
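As a numerical check on the fishing problem, the likelihood of a day's catch under each lake can be computed by treating each fish as an independent draw from that lake's species distribution (this independence, and an equal prior over the lakes, are assumptions; only the L2 and L3 conditionals appear in this excerpt). A minimal sketch:

```python
from math import prod

# Species catch probabilities for each lake, from the problem statement.
probs = {
    "L2": {"carp": 0.7, "tench": 0.1, "perch": 0.1, "pike": 0.1},
    "L3": {"carp": 0.3, "tench": 0.3, "perch": 0.2, "pike": 0.2},
}

def likelihood(lake, catch):
    """P(catch | lake), treating each fish as an independent draw.

    The multinomial coefficient is identical for every lake, so it
    cancels when comparing lakes and is omitted here."""
    return prod(probs[lake][species] ** n for species, n in catch.items())

# The single day's catch: 1 tench, 2 carp, 1 pike.
catch = {"tench": 1, "carp": 2, "pike": 1}
for lake in probs:
    print(lake, likelihood(lake, catch))
# L2 -> 0.1 * 0.7^2 * 0.1 = 0.0049
# L3 -> 0.3 * 0.3^2 * 0.2 = 0.0054 (up to floating point)
```

With equal priors, the posterior over lakes is proportional to these likelihoods; the three-day variant multiplies the per-day likelihoods, which is where the independence-across-days assumption enters.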
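For problem 8, the numerical comparison can be sketched as follows. With X1, ..., Xn iid N(μ, 1), the sufficient statistic is X̄ ~ N(μ, 1/n); marginalising μ ~ N(0, b²) under H1 gives X̄ ~ N(0, 1/n + b²). The choice of x̄ = 1.96/√n below (holding the p-value near 0.05 while n grows) is mine, made to exhibit the Jeffreys-Lindley effect:

```python
from math import sqrt, pi, exp, erf

def npdf(x, var):
    """Density of N(0, var) at x."""
    return exp(-x * x / (2 * var)) / sqrt(2 * pi * var)

def ncdf(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def posterior_h0(xbar, n, b):
    """P(H0 | data) with P(H0) = P(H1) = 1/2 and mu ~ N(0, b^2) under H1.

    Marginal of Xbar: N(0, 1/n) under H0, N(0, 1/n + b^2) under H1."""
    m0 = npdf(xbar, 1 / n)
    m1 = npdf(xbar, 1 / n + b * b)
    return m0 / (m0 + m1)

def wald_pvalue(xbar, n):
    """Two-sided Wald p-value for H0: mu = 0, based on sqrt(n) * Xbar."""
    z = abs(xbar) * sqrt(n)
    return 2 * (1 - ncdf(z))

# Hold the p-value near 0.05 while n grows: the posterior
# probability of H0 climbs towards 1 (the Jeffreys-Lindley paradox).
for n in (1, 100, 10000):
    xbar = 1.96 / sqrt(n)
    print(n, round(wald_pvalue(xbar, n), 4), round(posterior_h0(xbar, n, b=1), 4))
```

Varying b in `posterior_h0` covers the "variety of values of x and b" part of the exercise.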
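Part (a) of the Beta-Binomial problem follows directly from the two stated facts: the posterior is Beta(α + T, β + n − T), and a Beta distribution's mean is its first parameter over the sum of both, giving E(θ | X1, ..., Xn) = (α + T)/(α + β + n). A quick check (the Beta(1, 1) prior and example data are mine):

```python
def posterior_mean(alpha, beta, data):
    """E(theta | X_1, ..., X_n) under the Beta(alpha, beta) prior with
    Bernoulli(theta) data: the posterior is Beta(alpha + T, beta + n - T),
    and Beta(a, b) has mean a / (a + b)."""
    n, T = len(data), sum(data)
    return (alpha + T) / (alpha + beta + n)

# Uniform Beta(1, 1) prior, 7 successes in 10 trials:
data = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
print(posterior_mean(1, 1, data))  # (1 + 7) / (1 + 1 + 10) = 8/12
```

Note how the posterior mean sits between the prior mean α/(α + β) and the sample proportion T/n, shrinking towards the data as n grows.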