
RegressionSample.m:

/** start **/
clc
clear
close all

% Add LIBSVM to the path.
addpath('libsvm-3.11\matlab\');

% Instance matrix is n by d, where d is the number of features.
[label_vector, instance_matrix] = libsvmread('Abalone.txt');

% Normalize features so that each feature vector has unit length.
instance_matrix = NormalizeFea(instance_matrix, 1);

num_sample = length(label_vector);

% Make a 70/30 training/test split.
n_train = floor(num_sample * 0.7);
train_data  = instance_matrix(1:n_train, :);
train_label = label_vector(1:n_train);
test_data   = instance_matrix(n_train+1:num_sample, :);
test_label  = label_vector(n_train+1:num_sample);

% We show how to use ridge regression here: b = ridge(y, D, k), where
%   y - label vector
%   D - data matrix
%   k - lambda value; k may also be a vector of candidate values
k = [0.001, 0.01, 0.1, 1, 10, 100, 1000];

% Each column of B is a weight vector corresponding to one lambda in k.
B = ridge(train_label, train_data, k);

% Display the 7 weight vectors side by side. The norm of the weights
% shrinks as lambda grows, which can be seen in the bar heights.
bar(B')

% Compute the average squared error on the test set.
regression_result = test_data * B;
sqr_diff = (regression_result - repmat(test_label, [1, length(k)])).^2;
sqr_diff = sum(sqr_diff, 1) / length(test_label);
/** end **/
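The MATLAB sample above depends on LIBSVM and the Statistics Toolbox. As a rough cross-check, the same pipeline can be sketched in plain NumPy. The data below is synthetic stand-in data (not the Abalone set), and the closed-form solve here penalizes the raw normal equations rather than centering and scaling predictors the way MATLAB's ridge does by default, so the numbers will differ:

```python
import numpy as np

# Hypothetical stand-in data: 8 features per sample, as in the Abalone task.
rng = np.random.default_rng(0)
X = rng.random((100, 8))
y = X @ rng.random(8) + 0.1 * rng.standard_normal(100)

# Normalize each feature vector (row) to unit length, mirroring NormalizeFea(..., 1).
X = X / np.linalg.norm(X, axis=1, keepdims=True)

# 70/30 train/test split, taken in order, as in the MATLAB script.
n_train = int(np.floor(len(y) * 0.7))
X_tr, y_tr = X[:n_train], y[:n_train]
X_te, y_te = X[n_train:], y[n_train:]

# Ridge regression in closed form: w = (X'X + k*I)^-1 X'y, one column per k.
ks = [0.001, 0.01, 0.1, 1, 10, 100, 1000]
d = X.shape[1]
B = np.column_stack([
    np.linalg.solve(X_tr.T @ X_tr + k * np.eye(d), X_tr.T @ y_tr)
    for k in ks
])

# Average squared test error for each value of k.
pred = X_te @ B                                  # shape (n_test, len(ks))
sqr_diff = ((pred - y_te[:, None]) ** 2).mean(axis=0)
print(sqr_diff)
```

As with the MATLAB version, the column norms of B shrink as the regularization strength grows.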

Abalone.txt:

15 1:1 2:0.455 3:0.365 4:0.095 5:0.514 6:0.2245 7:0.101 8:0.15

7 1:1 2:0.35 3:0.265 4:0.09 5:0.2255 6:0.0995 7:0.0485 8:0.07

9 1:2 2:0.53 3:0.42 4:0.135 5:0.677 6:0.2565 7:0.1415 8:0.21

10 1:1 2:0.44 3:0.365 4:0.125 5:0.516 6:0.2155 7:0.114 8:0.155

7 1:3 2:0.33 3:0.255 4:0.08 5:0.205 6:0.0895 7:0.0395 8:0.055

8 1:3 2:0.425 3:0.3 4:0.095 5:0.3515 6:0.141 7:0.0775 8:0.12

20 1:2 2:0.53 3:0.415 4:0.15 5:0.7775 6:0.237 7:0.1415 8:0.33

16 1:2 2:0.545 3:0.425 4:0.125 5:0.768 6:0.294 7:0.1495 8:0.26

9 1:1 2:0.475 3:0.37 4:0.125 5:0.5095 6:0.2165 7:0.1125 8:0.165

19 1:2 2:0.55 3:0.44 4:0.15 5:0.8945 6:0.3145 7:0.151 8:0.32

14 1:2 2:0.525 3:0.38 4:0.14 5:0.6065 6:0.194 7:0.1475 8:0.21

10 1:1 2:0.43 3:0.35 4:0.11 5:0.406 6:0.1675 7:0.081 8:0.135

11 1:1 2:0.49 3:0.38 4:0.135 5:0.5415 6:0.2175 7:0.095 8:0.19

10 1:2 2:0.535 3:0.405 4:0.145 5:0.6845 6:0.2725 7:0.171 8:0.205

10 1:2 2:0.47 3:0.355 4:0.1 5:0.4755 6:0.1675 7:0.0805 8:0.185

12 1:1 2:0.5 3:0.4 4:0.13 5:0.6645 6:0.258 7:0.133 8:0.24

7 1:3 2:0.355 3:0.28 4:0.085 5:0.2905 6:0.095 7:0.0395 8:0.115

10 1:2 2:0.44 3:0.34 4:0.1 5:0.451 6:0.188 7:0.087 8:0.13

7 1:1 2:0.365 3:0.295 4:0.08 5:0.2555 6:0.097 7:0.043 8:0.1

9 1:1 2:0.45 3:0.32 4:0.1 5:0.381 6:0.1705 7:0.075 8:0.115

11 1:1 2:0.355 3:0.28 4:0.095 5:0.2455 6:0.0955 7:0.062 8:0.075

10 1:3 2:0.38 3:0.275 4:0.1 5:0.2255 6:0.08 7:0.049 8:0.085

12 1:2 2:0.565 3:0.44 4:0.155 5:0.9395 6:0.4275 7:0.214 8:0.27

9 1:2 2:0.55 3:0.415 4:0.135 5:0.7635 6:0.318 7:0.21 8:0.2

10 1:2 2:0.615 3:0.48 4:0.165 5:1.1615 6:0.513 7:0.301 8:0.305

11 1:2 2:0.56 3:0.44 4:0.14 5:0.9285 6:0.3825 7:0.188 8:0.3

11 1:2 2:0.58 3:0.45 4:0.185 5:0.9955 6:0.3945 7:0.272 8:0.285

12 1:1 2:0.59 3:0.445 4:0.14 5:0.931 6:0.356 7:0.234 8:0.28

15 1:1 2:0.605 3:0.475 4:0.18 5:0.9365 6:0.394 7:0.219 8:0.295

11 1:1 2:0.575 3:0.425 4:0.14 5:0.8635 6:0.393 7:0.227 8:0.2

10 1:1 2:0.58 3:0.47 4:0.165 5:0.9975 6:0.3935 7:0.242 8:0.33

15 1:2 2:0.68 3:0.56 4:0.165 5:1.639 6:0.6055 7:0.2805 8:0.46

18 1:1 2:0.665 3:0.525 4:0.165 5:1.338 6:0.5515 7:0.3575 8:0.35

19 1:2 2:0.68 3:0.55 4:0.175 5:1.798 6:0.815 7:0.3925 8:0.455

13 1:2 2:0.705 3:0.55 4:0.2 5:1.7095 6:0.633 7:0.4115 8:0.49

8 1:1 2:0.465 3:0.355 4:0.105 5:0.4795 6:0.227 7:0.124 8:0.125

16 1:2 2:0.54 3:0.475 4:0.155 5:1.217 6:0.5305 7:0.3075 8:0.34

8 1:2 2:0.45 3:0.355 4:0.105 5:0.5225 6:0.237 7:0.1165 8:0.145

11 1:2 2:0.575 3:0.445 4:0.135 5:0.883 6:0.381 7:0.2035 8:0.26

9 1:1 2:0.355 3:0.29 4:0.09 5:0.3275 6:0.134 7:0.086 8:0.09

9 1:2 2:0.45 3:0.335 4:0.105 5:0.425 6:0.1865 7:0.091 8:0.115

14 1:2 2:0.55 3:0.425 4:0.135 5:0.8515 6:0.362 7:0.196 8:0.27

5 1:3 2:0.24 3:0.175 4:0.045 5:0.07 6:0.0315 7:0.0235 8:0.02

5 1:3 2:0.205 3:0.15 4:0.055 5:0.042 6:0.0255 7:0.015 8:0.012

4 1:3 2:0.21 3:0.15 4:0.05 5:0.042 6:0.0175 7:0.0125 8:0.015

7 1:3 2:0.39 3:0.295 4:0.095 5:0.203 6:0.0875 7:0.045 8:0.075

9 1:1 2:0.47 3:0.37 4:0.12 5:0.5795 6:0.293 7:0.227 8:0.14

7 1:2 2:0.46 3:0.375 4:0.12 5:0.4605 6:0.1775 7:0.11 8:0.15

6 1:3 2:0.325 3:0.245 4:0.07 5:0.161 6:0.0755 7:0.0255 8:0.045

9 1:2 2:0.525 3:0.425 4:0.16 5:0.8355 6:0.3545 7:0.2135 8:0.245

8 1:3 2:0.52 3:0.41 4:0.12 5:0.595 6:0.2385 7:0.111 8:0.19

7 1:1 2:0.4 3:0.32 4:0.095 5:0.303 6:0.1335 7:0.06 8:0.1

10 1:1 2:0.485 3:0.36 4:0.13 5:0.5415 6:0.2595 7:0.096 8:0.16

10 1:2 2:0.47 3:0.36 4:0.12 5:0.4775 6:0.2105 7:0.1055 8:0.15

7 1:1 2:0.405 3:0.31 4:0.1 5:0.385 6:0.173 7:0.0915 8:0.11

8 1:2 2:0.5 3:0.4 4:0.14 5:0.6615 6:0.2565 7:0.1755 8:0.22

8 1:1 2:0.445 3:0.35 4:0.12 5:0.4425 6:0.192 7:0.0955 8:0.135

8 1:1 2:0.47 3:0.385 4:0.135 5:0.5895 6:0.2765 7:0.12 8:0.17

4 1:3 2:0.245 3:0.19 4:0.06 5:0.086 6:0.042 7:0.014 8:0.025

7 1:2 2:0.505 3:0.4 4:0.125 5:0.583 6:0.246 7:0.13 8:0.175

7 1:1 2:0.45 3:0.345 4:0.105 5:0.4115 6:0.18 7:0.1125 8:0.135

9 1:1 2:0.505 3:0.405 4:0.11 5:0.625 6:0.305 7:0.16 8:0.175

10 1:2 2:0.53 3:0.41 4:0.13 5:0.6965 6:0.302 7:0.1935 8:0.2

7 1:1 2:0.425 3:0.325 4:0.095 5:0.3785 6:0.1705 7:0.08 8:0.1

8 1:1 2:0.52 3:0.4 4:0.12 5:0.58 6:0.234 7:0.1315 8:0.185

8 1:1 2:0.475 3:0.355 4:0.12 5:0.48 6:0.234 7:0.1015 8:0.135

12 1:2 2:0.565 3:0.44 4:0.16 5:0.915 6:0.354 7:0.1935 8:0.32

13 1:2 2:0.595 3:0.495 4:0.185 5:1.285 6:0.416 7:0.224 8:0.485

10 1:2 2:0.475 3:0.39 4:0.12 5:0.5305 6:0.2135 7:0.1155 8:0.17

6 1:3 2:0.31 3:0.235 4:0.07 5:0.151 6:0.063 7:0.0405 8:0.045

13 1:1 2:0.555 3:0.425 4:0.13 5:0.7665 6:0.264 7:0.168 8:0.275

8 1:2 2:0.4 3:0.32 4:0.11 5:0.353 6:0.1405 7:0.0985 8:0.1

20 1:2 2:0.595 3:0.475 4:0.17 5:1.247 6:0.48 7:0.225 8:0.425

11 1:1 2:0.57 3:0.48 4:0.175 5:1.185 6:0.474 7:0.261 8:0.38

13 1:2 2:0.605 3:0.45 4:0.195 5:1.098 6:0.481 7:0.2895 8:0.315

15 1:2 2:0.6 3:0.475 4:0.15 5:1.0075 6:0.4425 7:0.221 8:0.28

9 1:1 2:0.595 3:0.475 4:0.14 5:0.944 6:0.3625 7:0.189 8:0.315

10 1:2 2:0.6 3:0.47 4:0.15 5:0.922 6:0.363 7:0.194 8:0.305

11 1:2 2:0.555 3:0.425 4:0.14 5:0.788 6:0.282 7:0.1595 8:0.285

14 1:2 2:0.615 3:0.475 4:0.17 5:1.1025 6:0.4695 7:0.2355 8:0.345

9 1:2 2:0.575 3:0.445 4:0.14 5:0.941 6:0.3845 7:0.252 8:0.285

12 1:1 2:0.62 3:0.51 4:0.175 5:1.615 6:0.5105 7:0.192 8:0.675

16 1:2 2:0.52 3:0.425 4:0.165 5:0.9885 6:0.396 7:0.225 8:0.32

21 1:1 2:0.595 3:0.475 4:0.16 5:1.3175 6:0.408 7:0.234 8:0.58

14 1:1 2:0.58 3:0.45 4:0.14 5:1.013 6:0.38 7:0.216 8:0.36

12 1:2 2:0.57 3:0.465 4:0.18 5:1.295 6:0.339 7:0.2225 8:0.44

13 1:1 2:0.625 3:0.465 4:0.14 5:1.195 6:0.4825 7:0.205 8:0.4

10 1:1 2:0.56 3:0.44 4:0.16 5:0.8645 6:0.3305 7:0.2075 8:0.26

9 1:2 2:0.46 3:0.355 4:0.13 5:0.517 6:0.2205 7:0.114 8:0.165

12 1:2 2:0.575 3:0.45 4:0.16 5:0.9775 6:0.3135 7:0.231 8:0.33

15 1:1 2:0.565 3:0.425 4:0.135 5:0.8115 6:0.341 7:0.1675 8:0.255

12 1:1 2:0.555 3:0.44 4:0.15 5:0.755 6:0.307 7:0.1525 8:0.26

13 1:1 2:0.595 3:0.465 4:0.175 5:1.115 6:0.4015 7:0.254 8:0.39

10 1:2 2:0.625 3:0.495 4:0.165 5:1.262 6:0.507 7:0.318 8:0.39

15 1:1 2:0.695 3:0.56 4:0.19 5:1.494 6:0.588 7:0.3425 8:0.485

14 1:1 2:0.665 3:0.535 4:0.195 5:1.606 6:0.5755 7:0.388 8:0.48

9 1:1 2:0.535 3:0.435 4:0.15 5:0.725 6:0.269 7:0.1385 8:0.25

8 1:1 2:0.47 3:0.375 4:0.13 5:0.523 6:0.214 7:0.132 8:0.145

7 1:1 2:0.47 3:0.37 4:0.13 5:0.5225 6:0.201 7:0.133 8:0.165

10 1:2 2:0.475 3:0.375 4:0.125 5:0.5785 6:0.2775 7:0.085 8:0.155

7 1:3 2:0.36 3:0.265 4:0.095 5:0.2315 6:0.105 7:0.046 8:0.075

15 1:1 2:0.55 3:0.435 4:0.145 5:0.843 6:0.328 7:0.1915 8:0.255

15 1:1 2:0.53 3:0.435 4:0.16 5:0.883 6:0.316 7:0.164 8:0.335

10 1:1 2:0.53 3:0.415 4:0.14 5:0.724 6:0.3105 7:0.1675 8:0.205

12 1:1 2:0.605 3:0.47 4:0.16 5:1.1735 6:0.4975 7:0.2405 8:0.345

12 1:2 2:0.52 3:0.41 4:0.155 5:0.727 6:0.291 7:0.1835 8:0.235

11 1:2 2:0.545 3:0.43 4:0.165 5:0.802 6:0.2935 7:0.183 8:0.28

10 1:2 2:0.5 3:0.4 4:0.125 5:0.6675 6:0.261 7:0.1315 8:0.22

9 1:2 2:0.51 3:0.39 4:0.135 5:0.6335 6:0.231 7:0.179 8:0.2

9 1:2 2:0.435 3:0.395 4:0.105 5:0.3635 6:0.136 7:0.098 8:0.13

9 1:1 2:0.495 3:0.395 4:0.125 5:0.5415 6:0.2375 7:0.1345 8:0.155

9 1:1 2:0.465 3:0.36 4:0.105 5:0.431 6:0.172 7:0.107 8:0.175

9 1:3 2:0.435 3:0.32 4:0.08 5:0.3325 6:0.1485 7:0.0635 8:0.105

9 1:1 2:0.425 3:0.35 4:0.105 5:0.393 6:0.13 7:0.063 8:0.165

11 1:2 2:0.545 3:0.41 4:0.125 5:0.6935 6:0.2975 7:0.146 8:0.21

11 1:2 2:0.53 3:0.415 4:0.115 5:0.5915 6:0.233 7:0.1585 8:0.18

11 1:2 2:0.49 3:0.375 4:0.135 5:0.6125 6:0.2555 7:0.102 8:0.22

10 1:1 2:0.44 3:0.34 4:0.105 5:0.402 6:0.1305 7:0.0955 8:0.165

9 1:2 2:0.56 3:0.43 4:0.15 5:0.8825 6:0.3465 7:0.172 8:0.31

8 1:1 2:0.405 3:0.305 4:0.085 5:0.2605 6:0.1145 7:0.0595 8:0.085

9 1:2 2:0.47 3:0.365 4:0.105 5:0.4205 6:0.163 7:0.1035 8:0.14

7 1:3 2:0.385 3:0.295 4:0.085 5:0.2535 6:0.103 7:0.0575 8:0.085

14 1:2 2:0.515 3:0.425 4:0.14 5:0.766 6:0.304 7:0.1725 8:0.255

6 1:1 2:0.37 3:0.265 4:0.075 5:0.214 6:0.09 7:0.051 8:0.07
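The lines above are in LIBSVM's sparse text format: a label followed by 1-based index:value pairs. If libsvmread is unavailable, a minimal parser can be sketched as follows (the function name read_libsvm and the hard-coded feature count are illustrative choices, not part of LIBSVM):

```python
import numpy as np

def read_libsvm(lines, n_features):
    """Parse LIBSVM-format lines ('label idx:val ...') into (y, X)."""
    y = np.zeros(len(lines))
    X = np.zeros((len(lines), n_features))
    for i, line in enumerate(lines):
        parts = line.split()
        y[i] = float(parts[0])
        for tok in parts[1:]:
            idx, val = tok.split(":")
            X[i, int(idx) - 1] = float(val)   # LIBSVM indices are 1-based
    return y, X

# First two rows of the sample above.
sample = [
    "15 1:1 2:0.455 3:0.365 4:0.095 5:0.514 6:0.2245 7:0.101 8:0.15",
    "7 1:1 2:0.35 3:0.265 4:0.09 5:0.2255 6:0.0995 7:0.0485 8:0.07",
]
y, X = read_libsvm(sample, 8)
print(y, X[0])
```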

Question:

Please use the Abalone dataset in the folder for this task. There are 8 different measurements as input features and 1 output (age); the regression model will predict the age of an abalone from these 8 measurements. Assume the feature matrix is A, the regression parameter vector is β, and the age vector is y. A simple linear regression model then tries to solve Aβ ≈ y, where β is unknown. Could you use:

a. Linear regression (with least-squares loss)
b. Lasso (linear regression + l1-norm regularization)
c. Ridge regression (linear regression + l2-norm regularization)

to solve this problem? (Hint: a sample code called "RegressionSample.m" has been included in the folder.) In each regression experiment, please show the following:

a. Use a 70/30 training/testing split, and report the average squared error (1/n) * sum_i (y'_i - y_i)^2, where y'_i is the predicted value, y_i is the ground-truth value, and n is the total number of testing samples. (Hint: pick a λ and show the best results if you have regularization terms.)
b. In Lasso and Ridge regression, pick a good value for the regularization parameter λ (e.g., 10^-4, 10^-3, 10^-1, 1, 10, etc.), and visualize the different β obtained for different λ.
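In MATLAB, parts (a) and (c) map onto least squares and ridge, and part (b) onto lasso. For intuition about the l1 penalty, lasso can also be sketched from scratch with ISTA (proximal gradient descent, i.e. a gradient step followed by soft-thresholding). The function lasso_ista and the synthetic data below are illustrative assumptions, not the assignment's required method:

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize 0.5*||Xw - y||^2 + lam*||w||_1 via ISTA (illustrative sketch)."""
    n, d = X.shape
    w = np.zeros(d)
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, L = squared spectral norm of X
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)             # gradient of the least-squares term
        z = w - step * grad
        # Soft-thresholding: the proximal operator of the l1 norm.
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return w

# Synthetic data with a sparse ground-truth weight vector.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8))
w_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0, 0.0, 0.0, 0.0])
y = X @ w_true + 0.01 * rng.standard_normal(200)

w = lasso_ista(X, y, lam=5.0)
print(np.round(w, 3))
```

The soft-thresholding step sets small coordinates exactly to zero, which is why lasso produces sparse β, in contrast to ridge, which only shrinks weights toward zero.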
