Question
J(U, V) = \sum_{(a,i)\in D} \frac{(Y_{ai} - [UV^T]_{ai})^2}{2} + \frac{\lambda}{2}\sum_a \|U^{(a)}\|^2 + \frac{\lambda}{2}\sum_i \|V^{(i)}\|^2
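As a minimal sketch of the objective above (assuming numpy, and that the observed index set D is represented as a list of (a, i) pairs; the function name is hypothetical):

```python
import numpy as np

def objective(Y, U, V, lam, D):
    """Regularized squared-error objective J for the factorization Y ~ U V^T.

    Y   : (n, m) matrix; only the entries indexed by D are observed
    U   : (n, k) row factors, V : (m, k) column factors
    lam : regularization weight (lambda)
    D   : list of observed (a, i) index pairs
    """
    X = U @ V.T
    # Squared reconstruction error over observed entries only, each divided by 2
    data_term = sum((Y[a, i] - X[a, i]) ** 2 for (a, i) in D) / 2.0
    # (lambda/2) times the squared Frobenius norms of U and V
    reg_term = (lam / 2.0) * (np.sum(U ** 2) + np.sum(V ** 2))
    return data_term + reg_term
```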
In order to break a big optimization problem into smaller pieces that we know how to
solve, we fix U and find the best V for that U. But a subtle and important point is
that even if V is best for U, the U that is best for that V might not be the
original U. It's like how we might be some lonely person's best friend, even though
they are not our best friend. In light of this, we repeat, like this: we fix U and solve
for V, then fix V to be the result from the previous step and solve for U, and repeat
this alternating process until we converge to a solution. This is an example of iterative
optimization, where we greedily take steps in a good direction, but as we do so the
context shifts, so 'the good direction' evolves. Gradient descent, which we've already
seen, has the same structure.
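The alternating scheme above can be sketched as follows. This is a minimal illustration, not a definitive implementation: it assumes numpy, represents D as a list of (a, i) pairs, and uses the standard closed-form ridge-regression update for each fixed-factor subproblem (the function name `als` is hypothetical):

```python
import numpy as np

def als(Y, D, k, lam, iters=50, seed=0):
    """Alternating minimization for Y ~ U V^T.

    Each pass fixes one factor and solves the resulting regularized
    least-squares problem for each row of the other factor in closed form.
    """
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((m, k))
    for _ in range(iters):
        # Fix V; for each row a, solve (V_a^T V_a + lam I) U[a] = V_a^T y_a
        for a in range(n):
            idx = [i for (aa, i) in D if aa == a]
            if idx:
                Vi = V[idx]                                # (|idx|, k)
                A = Vi.T @ Vi + lam * np.eye(k)
                b = Vi.T @ Y[a, idx]
                U[a] = np.linalg.solve(A, b)
        # Fix U; symmetric update for each row of V
        for i in range(m):
            idx = [a for (a, ii) in D if ii == i]
            if idx:
                Ua = U[idx]
                A = Ua.T @ Ua + lam * np.eye(k)
                b = Ua.T @ Y[idx, i]
                V[i] = np.linalg.solve(A, b)
    return U, V
```

On a fully observed low-rank matrix with small lam, a few alternations already reconstruct Y closely.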
Consider the case k = 1. The matrices U and V reduce to vectors u and v such
that UV^T = uv^T and [uv^T]_{ai} = u_a v_i.
When v is fixed, finding the u that minimizes J becomes equivalent to finding, for each a, the u_a that
minimizes \frac{\lambda}{2} u_a^2 + \sum_{i:(a,i)\in D} \frac{(Y_{ai} - u_a v_i)^2}{2}.
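Since this per-row objective is a quadratic in the single scalar u_a, setting its derivative to zero gives a closed form: \lambda u_a + \sum_{i:(a,i)\in D}(u_a v_i - Y_{ai})v_i = 0, so u_a = \big(\sum_i Y_{ai} v_i\big) / \big(\lambda + \sum_i v_i^2\big), with sums over the items observed for row a. A minimal sketch (the function name and argument layout are assumptions):

```python
def best_u_a(Ya, v, obs, lam):
    """Closed-form minimizer of the k=1 per-row objective with v fixed.

    Ya  : the a-th row of Y (indexable by item)
    v   : the fixed factor vector
    obs : indices i with (a, i) in D
    lam : regularization weight (lambda >= 0)
    """
    num = sum(Ya[i] * v[i] for i in obs)       # sum_i Y_ai v_i
    den = lam + sum(v[i] ** 2 for i in obs)    # lambda + sum_i v_i^2
    return num / den
```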