
Question

J = \sum_{(a,i) \in D} \frac{(Y_{ai} - [UV^T]_{ai})^2}{2} + \frac{\lambda}{2} \left( \sum_{a,k} U_{ak}^2 + \sum_{i,k} V_{ik}^2 \right)
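As a quick sanity check on what this objective measures, here is a minimal NumPy sketch that evaluates it; the function name compute_J and the representation of D as a list of (a, i) index pairs are illustrative choices, not part of the original problem.

```python
import numpy as np

def compute_J(Y, U, V, D, lam):
    """Squared reconstruction error over the observed entries D, plus
    L2 penalties on both factor matrices, each weighted by lam / 2."""
    X = U @ V.T                                   # reconstructed matrix [U V^T]
    data_term = sum((Y[a, i] - X[a, i]) ** 2 for (a, i) in D) / 2.0
    reg_term = (lam / 2.0) * (np.sum(U ** 2) + np.sum(V ** 2))
    return data_term + reg_term
```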
In order to break a big optimization problem into smaller pieces that we know how to solve, we fix U and find the best V for that U. A subtle and important point, though, is that even if V* is best for U, the U* that is best for V* might not be the original U! It's like how we might be some lonely person's best friend even though they are not our best friend. In light of this, we repeat: fix U and solve for V, then fix V to be the result of the previous step and solve for U, and continue this alternating process until the iterates settle on a solution. This is an example of iterative optimization, where we greedily take steps in a good direction, but as we do so the context shifts, so "the good direction" evolves. Gradient descent, which we've already seen, has the same structure.
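To make the alternating scheme concrete, here is a minimal NumPy sketch of the U half-step for general k, under the assumption that Y is an n-by-m array, D is a list of observed (a, i) index pairs, and lam is the regularization strength; the function name als_step_U is an illustrative choice, not part of the problem. With V fixed, each row u_a has its own closed-form ridge-regression solution.

```python
import numpy as np

def als_step_U(Y, V, D, lam):
    """With V held fixed, solve for the U that minimizes J (assumes lam > 0).

    Each row u_a appears only in the residuals of user a's observed entries,
    so it has an independent closed-form solution:
        (lam * I + sum_i v_i v_i^T) u_a = sum_i Y_ai v_i,
    where the sums run over the items i rated by user a.
    """
    n, k = Y.shape[0], V.shape[1]
    U = np.zeros((n, k))
    for a in range(n):
        A = lam * np.eye(k)
        b = np.zeros(k)
        for (aa, i) in D:
            if aa == a:                  # only user a's observed entries matter
                A += np.outer(V[i], V[i])
                b += Y[a, i] * V[i]
        U[a] = np.linalg.solve(A, b)
    return U
```

The V half-step is symmetric (swap the roles of U and V and of the index pairs), and alternating the two half-steps is exactly the iterative scheme described above.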
Consider the case k = 1. The matrices U and V reduce to vectors u and v, such that u_a = U_{a1} and v_i = V_{i1}.
When v is fixed, finding the u that minimizes J becomes equivalent to finding the u that minimizes which of the following expressions?
\frac{(Y_{ai} - u_a v_i)^2}{2} + \frac{\lambda}{2} \sum_a u_a^2
\sum_{(a,i) \in D} \frac{(Y_{ai} - u_a v_i)^2}{2} + \frac{\lambda}{2} \sum_a u_a^2
\sum_{(a,i) \in D} \frac{(Y_{ai} - u_a v_i)^2}{2}
\sum_{(a,i) \in D} \frac{(Y_{ai} - u_a v_i)^2}{2} + \frac{\lambda}{2} \sum_i v_i^2
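To see which pieces of J actually depend on u once v is fixed, here is a short derivation sketch in LaTeX; it assumes the objective reconstructed above, with regularization strength λ applied to both u and v.

```latex
% With v fixed, the penalty on v is a constant with respect to u, so minimizing J
% over u is equivalent to minimizing the data term plus the penalty on u alone.
\begin{align*}
\min_{u} J
  &= \min_{u} \Bigg[ \sum_{(a,i)\in D} \frac{(Y_{ai} - u_a v_i)^2}{2}
     + \frac{\lambda}{2}\sum_a u_a^2 \Bigg]
     + \underbrace{\frac{\lambda}{2}\sum_i v_i^2}_{\text{constant in } u} \\
% Setting the derivative with respect to a single u_a to zero gives a closed form.
0 &= \frac{\partial J}{\partial u_a}
   = -\sum_{i:(a,i)\in D} (Y_{ai} - u_a v_i)\, v_i + \lambda u_a
\quad\Longrightarrow\quad
u_a = \frac{\sum_{i:(a,i)\in D} Y_{ai} v_i}{\lambda + \sum_{i:(a,i)\in D} v_i^2}.
\end{align*}
```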
