
Question

1 Approved Answer

Please fill the blanks:

```matlab
clearvars
clc
addpath('../Generation')
addpath('../Basic_blocks')
addpath('../Algorithms')

% Loading scenarios
% ===========================
scenario=2;
[data_class, set_up]=scenarios_regression(scenario);

% Definition of the problem
% ===================================
loss_lasso =
```

The completed script, with the blanks filled in:

```matlab
clearvars
clc
addpath('../Generation')
addpath('../Basic_blocks')
addpath('../Algorithms')
```

```matlab
% Loading scenarios
% ===========================
scenario=2;
[data_class, set_up]=scenarios_regression(scenario);
```

```matlab
% Definition of the problem
% ===================================
loss_lasso    = @(N,U,x,y,lambda) (1/N*(U*x-y)'*(U*x-y) + lambda*norm(x,1));
subgrad_lasso = @(N,U,x,y,lambda) (2/N*U'*(U*x-y) + lambda*sign(x));
grad_LS       = @(N,U,x,y,lambda) (2/N*U'*(U*x-y));
```
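To see why these three handles are defined this way, here is a NumPy sketch of the same objects (function names mirror the MATLAB ones; the data shapes are assumptions for illustration). `subgrad_lasso` is a valid subgradient of the nonsmooth LASSO loss (taking `sign(0) = 0`), while `grad_LS` is the gradient of the smooth least-squares part only, which is what the proximal methods need:

```python
import numpy as np

def loss_lasso(N, U, x, y, lam):
    """Empirical LASSO loss: (1/N)*||U x - y||^2 + lam*||x||_1."""
    r = U @ x - y
    return (r @ r) / N + lam * np.sum(np.abs(x))

def subgrad_lasso(N, U, x, y, lam):
    """One subgradient of the LASSO loss (np.sign(0) is 0)."""
    return (2.0 / N) * U.T @ (U @ x - y) + lam * np.sign(x)

def grad_LS(N, U, x, y, lam):
    """Gradient of the smooth least-squares term alone."""
    return (2.0 / N) * U.T @ (U @ x - y)
```

At any point where no coordinate of `x` is exactly zero, the loss is differentiable and `subgrad_lasso` coincides with its true gradient, which can be checked by finite differences.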

```matlab
% Solution of the empirical risk using CVX
% =========================================
x_lasso_cvx=solver_cvx(set_up,@(N,A,x,y,lambda) loss_lasso(N,A,x,y,lambda));
loss_opt=loss_lasso(set_up.Niter_train,set_up.Utrain(:,1:set_up.M+1),x_lasso_cvx,set_up.ytrain(:,1),set_up.Lambda);
```

```matlab
% Subgradient descent (fixed and decaying step size)
out_subgd       = grad_FOM(set_up,@(N,A,x,y,lambda) subgrad_lasso(N,A,x,y,lambda));
out_subgd_decay = grad_FOM_decay(set_up,@(N,A,x,y,lambda) subgrad_lasso(N,A,x,y,lambda));
loss_subgrad       = eval_loss(out_subgd,set_up,@(N,A,x,y,lambda) loss_lasso(N,A,x,y,lambda));
loss_subgrad_decay = eval_loss(out_subgd_decay,set_up,@(N,A,x,y,lambda) loss_lasso(N,A,x,y,lambda));
```

```matlab
% ISTA and FISTA (proximal gradient methods)
out_ista  = ista_lasso(set_up,@(N,A,x,y,lambda) grad_LS(N,A,x,y,lambda));
out_fista = fista_lasso(set_up,@(N,A,x,y,lambda) grad_LS(N,A,x,y,lambda));
loss_ista  = eval_loss(out_ista,set_up,@(N,A,x,y,lambda) loss_lasso(N,A,x,y,lambda));
loss_fista = eval_loss(out_fista,set_up,@(N,A,x,y,lambda) loss_lasso(N,A,x,y,lambda));
```
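Note that `ista_lasso` and `fista_lasso` receive only `grad_LS`, the gradient of the smooth part: the l1 term is handled by the proximal (soft-thresholding) step inside the solver. As an illustrative NumPy sketch of that mechanism (the step size `1/L` from the Lipschitz constant of the smooth gradient is an assumption; the real helpers' internals are not shown in the exercise):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1, applied elementwise."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(U, y, lam, n_iter):
    """ISTA for (1/N)||Ux-y||^2 + lam*||x||_1 with step 1/L."""
    N, M = U.shape
    L = (2.0 / N) * np.linalg.norm(U, 2) ** 2   # Lipschitz const. of grad_LS
    x = np.zeros(M)
    for _ in range(n_iter):
        g = (2.0 / N) * U.T @ (U @ x - y)       # grad_LS only
        x = soft_threshold(x - g / L, lam / L)  # prox handles the l1 term
    return x

def fista(U, y, lam, n_iter):
    """FISTA: ISTA plus Nesterov-style momentum on an auxiliary point z."""
    N, M = U.shape
    L = (2.0 / N) * np.linalg.norm(U, 2) ** 2
    x = np.zeros(M); z = x.copy(); t = 1.0
    for _ in range(n_iter):
        g = (2.0 / N) * U.T @ (U @ z - y)
        x_new = soft_threshold(z - g / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

FISTA's momentum improves the worst-case rate from O(1/k) to O(1/k^2), which is why its learning curve drops fastest in the plot below.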

```matlab
% Plot of learning curves
% =======================
plot(1:set_up.Niter_train,10*log10(sum((loss_subgrad-loss_opt*ones(1,set_up.Niter_train)).^2,1)),'b','LineWidth',3), hold on
plot(1:set_up.Niter_train,10*log10(sum((loss_subgrad_decay-loss_opt*ones(1,set_up.Niter_train)).^2,1)),'r','LineWidth',3)
plot(1:set_up.Niter_train,10*log10(sum((loss_ista-loss_opt*ones(1,set_up.Niter_train)).^2,1)),'m','LineWidth',3)
plot(1:set_up.Niter_train,10*log10(sum((loss_fista-loss_opt*ones(1,set_up.Niter_train)).^2,1)),'c','LineWidth',3), hold off
legend('Subgradient. Fixed','Subgradient. Decay','ISTA','FISTA'), grid on
xlabel('Iterations')
ylabel('MSE (dB)')
title('Lasso. Different implementations')
```

