Q1. In class we discussed the multidimensional linear regression problem. We learned that if a problem has two features $\{x_1, x_2\}$ and a response $y$, a linear fit of the form $\hat{y} = \beta_1 x_1 + \beta_2 x_2$ can be obtained by minimizing the RSS loss:
$$R = \sum_{i=1}^{n} \left( y_i - \beta_1 x_{i,1} - \beta_2 x_{i,2} \right)^2, \qquad (1)$$
where $(x_{i,1}, x_{i,2})$ represent the features for sample $i$, and $y_i$ represents the response for sample $i$.

Now consider a system where the features and the response are continuously time dependent. Basically, at each time $t$, we have $(x_1(t), x_2(t))$ as the time-dependent features, and $y(t)$ as the system response.

(a) We decide to fit a model of the form $\hat{y}(t) = \beta_1 x_1(t) + \beta_2 x_2(t)$ for $0 \le t \le t_0$. Following a generalization of the RSS notion above, explain why we should consider minimizing the following RSS:
$$R = \int_0^{t_0} \left( y(t) - \beta_1 x_1(t) - \beta_2 x_2(t) \right)^2 \, dt. \qquad (2)$$

(b) Suppose that $t_0 = 1$, and the variables $x_1$, $x_2$ and $y$ are the following functions of $t$:
$$x_1(t) = e^t, \quad x_2(t) = 3t + 2, \quad y(t) = t^2 - 3.$$
Find the parameters of the fit, $\beta_1$ and $\beta_2$, by minimizing (2). Report the precise values of $\beta_1$ and $\beta_2$, along with the derivation. You may find some integral tables, such as [https://en.wikipedia.org/wiki/List_of_integrals_of_exponential_functions], useful.

(c) Stewie Griffin, who is a student of machine learning at Quahog Tech, decides to come up with a way of bypassing the math and solving the problem through a computer program. For $n = 10$, he discretizes the interval between 0 and 1 into $n + 1 = 11$ equispaced points
$$t_i = \frac{i - 1}{n}, \quad i = 1, \ldots, n + 1.$$
He then forms a data table with 11 rows and 3 columns, where the columns are formed by $x_{i,1}$, $x_{i,2}$ and $y_i$, defined as
$$x_{i,1} = e^{t_i}, \quad x_{i,2} = 3t_i + 2, \quad y_i = t_i^2 - 3.$$
He then uses the `lm` command in R to fit a model to this data table. Follow the steps that Stewie has taken, and report the values of $\beta_1$ and $\beta_2$ that you get through this approach.

(d) How close are the results you got in part (c) vs. part (b)? Repeat part (c) for $n = 1000$, and again compare the obtained $\beta_1$ and $\beta_2$ with those of part (b). Share the code that you wrote to do this in R or Python.
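The snippet below is a minimal sketch of the discretization-and-fit procedure described in parts (c) and (d), written in Python rather than R. The use of `numpy.linalg.lstsq` in place of R's `lm`, and the choice to fit without an intercept so as to match the model $\hat{y}(t) = \beta_1 x_1(t) + \beta_2 x_2(t)$ from part (a), are assumptions of this sketch; it illustrates the mechanics only and does not report the values you are asked to derive.

```python
import numpy as np

def fit_discretized(n):
    """Discretize [0, 1] into n + 1 equispaced points and fit
    y ~ beta1 * x1 + beta2 * x2 (no intercept) by least squares.
    Assumes t_i = (i - 1)/n for i = 1, ..., n + 1, as in part (c)."""
    t = np.linspace(0.0, 1.0, n + 1)   # n + 1 equispaced points in [0, 1]
    x1 = np.exp(t)                     # x1(t) = e^t
    x2 = 3.0 * t + 2.0                 # x2(t) = 3t + 2
    y = t**2 - 3.0                     # y(t) = t^2 - 3
    X = np.column_stack([x1, x2])      # 2-column design matrix, no intercept (assumption)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta                        # array([beta1, beta2])

# Example usage, as in part (d): compare the n = 10 and n = 1000 fits.
print(fit_discretized(10))
print(fit_discretized(1000))
```

If you work in R instead, an analogous no-intercept fit on the same three-column data table would be something like `lm(y ~ x1 + x2 + 0, data = df)`, where `df` is a hypothetical data frame holding the columns `x1`, `x2`, and `y`.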