Question
Consider a data fitting problem, with first basis function $f_1(x) = 1$, and data set $x^{(1)}, \ldots, x^{(N)}$, $y^{(1)}, \ldots, y^{(N)}$. Assume the matrix $A$ of the fitting problem (whose first column is the all-ones vector $\mathbf{1}$, since $f_1(x) = 1$) has linearly independent columns, and let $\hat\theta$ denote the parameter values that minimize the mean square prediction error over the data set. Let the $N$-vector $\hat r^{\mathrm{d}}$ denote the prediction errors using the optimal model parameter $\hat\theta$. Show that $\operatorname{avg}(\hat r^{\mathrm{d}}) = 0$.
In other words: with the least squares fit, the mean of the prediction errors over the data set is zero.
Hint. Use the orthogonality principle, $(Az) \perp \hat r^{\mathrm{d}}$, with $z = e_1$.
Step by Step Solution
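One way the argument can go, following the hint (a sketch written out here, not the step-by-step solution behind this page):

The optimal parameter $\hat\theta$ minimizes $\|A\theta - y^{\mathrm{d}}\|^2$, and the prediction errors are $\hat r^{\mathrm{d}} = y^{\mathrm{d}} - A\hat\theta$ (the sign convention does not affect the result). The orthogonality principle states that the optimal residual is orthogonal to everything in the range of $A$:
$$ (Az) \perp \hat r^{\mathrm{d}} \quad \text{for every vector } z. $$
Take $z = e_1$. Since the first basis function is $f_1(x) = 1$, the first column of $A$ is the all-ones vector, so $Ae_1 = \mathbf{1}$. Orthogonality then gives $\mathbf{1}^T \hat r^{\mathrm{d}} = 0$, and therefore
$$ \operatorname{avg}(\hat r^{\mathrm{d}}) = \frac{1}{N}\,\mathbf{1}^T \hat r^{\mathrm{d}} = 0. $$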
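A quick numerical sanity check (hypothetical data and variable names, not part of the exercise): fit a model whose first basis function is the constant $1$ and confirm that the residuals average to zero up to floating-point error.

```python
# Hypothetical check: least squares fit with a constant basis function,
# then verify that the prediction errors average to (numerically) zero.
import numpy as np

rng = np.random.default_rng(0)
N = 100
x = rng.normal(size=N)
y = 3.0 + 2.0 * x + rng.normal(size=N)   # synthetic data set

A = np.column_stack([np.ones(N), x])     # first column is all ones (f1(x) = 1)
theta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
r_hat = y - A @ theta_hat                # prediction errors with the optimal parameter

print(np.mean(r_hat))                    # ~1e-16, i.e. zero up to round-off
```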