Question
Consider the following convex optimization problem over matrices:
\[
  \min_{X \in \mathcal{C}} \; F(X) = f(X) + \lambda \|X\|_* ,
  \qquad
  \mathcal{C} = \{ X \in \mathbb{R}^{n \times d} \;:\; \|X\|_F \le M \},
\]
where $\lambda > 0$ is a regularization parameter and $\|X\|_*$ denotes the nuclear (trace) norm of a matrix $X$, which is the sum, or equivalently the $\ell_1$ norm, of the singular values of $X$.

(a) Show that the projection onto the set $\mathcal{C}$ is
\[
  \Pi_{\mathcal{C}}(X) = \min\left\{ 1, \frac{M}{\|X\|_F} \right\} X .
\]

(b) What is the subdifferential set $\partial F(X)$ of the objective function? You might need to read [AW].

(c) Consider the projected subgradient descent algorithm for solving the above optimization problem, which starts from the initial solution $X_1 = 0$ and iteratively updates
\[
  X_{t+1} = \Pi_{\mathcal{C}}(X_t - \eta_t G_t),
  \qquad
  G_t \in \partial F(X_t)
\]
(the gradient of $f$ plus a subgradient of the trace norm from part (b)). Show that the convergence rate after $T$ iterations, for the averaged iterate $\bar X_T = \frac{\sum_{t=1}^{T} \eta_t X_t}{\sum_{t=1}^{T} \eta_t}$, can be bounded by
\[
  F(\bar X_T) - F(X_*)
  \;\le\;
  \frac{\|X_1 - X_*\|_F^2 + G^2 \sum_{t=1}^{T} \eta_t^2}{2 \sum_{t=1}^{T} \eta_t},
\]
where $X_*$ is a minimizer and $G$ is an upper bound on the gradient of $f(X)$, i.e. $\|\nabla f(X)\|_2 \le G$.

(d) Decide an optimal value for the learning rate and simplify the convergence rate.
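For part (d), a standard way to choose the learning rate is sketched below; it assumes a constant step size $\eta_t \equiv \eta$, that $\|X_1 - X_*\|_F \le M$ (which holds for $X_1 = 0$ and $X_* \in \mathcal{C}$), and that $G$ is the same constant that bounds the subgradient norms in the bound from part (c):
\[
  F(\bar X_T) - F(X_*)
  \;\le\;
  \frac{\|X_1 - X_*\|_F^2 + G^2 \sum_{t=1}^{T} \eta_t^2}{2 \sum_{t=1}^{T} \eta_t}
  \;\le\;
  \frac{M^2 + G^2 T \eta^2}{2 T \eta}.
\]
Minimizing the right-hand side over $\eta$ gives $\eta = \frac{M}{G\sqrt{T}}$, under which the bound simplifies to
\[
  F(\bar X_T) - F(X_*) \;\le\; \frac{M G}{\sqrt{T}} = O\!\left(\frac{1}{\sqrt{T}}\right).
\]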
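As a numerical companion to parts (a)-(c), here is a minimal NumPy sketch of the method. It is not the graded solution: the least-squares loss $f(X) = \tfrac12\|AX - B\|_F^2$, the names A, B, M, lam, the crude subgradient bound L, and the $\eta_t = M/(L\sqrt{t})$ step-size schedule are all illustrative assumptions. The three functions implement the projection from part (a), the $UV^\top$ nuclear-norm subgradient used in part (b), and the update from part (c).

import numpy as np

def project_frobenius_ball(X, M):
    # Part (a): Pi_C(X) = min{1, M / ||X||_F} * X
    norm = np.linalg.norm(X, 'fro')
    return X if norm <= M else (M / norm) * X

def nuclear_norm_subgradient(X):
    # Part (b): U V^T from a thin SVD lies in the subdifferential of ||X||_*
    U, _, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ Vt

def projected_subgradient(A, B, M, lam, T=2000):
    # Part (c): X_{t+1} = Pi_C(X_t - eta_t * G_t), G_t = grad f(X_t) + lam * subgrad ||X_t||_*
    n, d = A.shape[1], B.shape[1]
    X = np.zeros((n, d))                       # initial solution X_1 = 0
    X_avg, weight = np.zeros_like(X), 0.0
    # Crude bound L on subgradient norms over C, used in the step size eta_t = M / (L * sqrt(t))
    L = np.linalg.norm(A, 2) ** 2 * M + np.linalg.norm(A.T @ B, 'fro') + lam * np.sqrt(min(n, d))
    for t in range(1, T + 1):
        eta = M / (L * np.sqrt(t))
        grad_f = A.T @ (A @ X - B)             # gradient of the assumed least-squares f
        G = grad_f + lam * nuclear_norm_subgradient(X)
        X = project_frobenius_ball(X - eta * G, M)
        X_avg += eta * X                       # step-size-weighted running average of iterates
        weight += eta
    return X_avg / weight

# Usage: recover a planted low-rank X from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
X_true = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 10))
B = A @ X_true + 0.01 * rng.standard_normal((50, 10))
X_hat = projected_subgradient(A, B, M=2 * np.linalg.norm(X_true, 'fro'), lam=0.1)
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))

The printed quantity is the relative error of the step-size-weighted average of the iterates, which is the point controlled by the bound in part (c).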