
Question


Problem 1: Implement the line search Newton's method with Hessian modification to minimize the Rosenbrock function:

f(x) = 100(x2 - x1^2)^2 + (1 - x1)^2.

Choose the step length to satisfy the strong Wolfe conditions with c1 = 10^-4 and c2 = 0.1. Set the initial step length α0 = 1 and the initial point x0 = (0, 1)^T. Terminate the algorithm once ||∇f(xk)|| <= 10^-4. Report the objective function value f, the step length α, and the gradient norm ||∇f(xk)|| for the last 10 iterations. (The code for finding a step length that satisfies the strong Wolfe conditions with c1 = 10^-4 and c2 = 0.1 is the same as posted earlier on Canvas.)

1) The Hessian is modified by adding a matrix of minimum Frobenius norm so that all eigenvalues of the resulting modified Hessian are at least δ = 10^-3 (namely, the non-diagonal modification studied in class; see the MATLAB sketch below).

2) The Hessian is modified by adding a matrix of minimum Euclidean norm so that all eigenvalues of the resulting modified Hessian are at least δ = 10^-3 (namely, the diagonal modification studied in class; see the MATLAB sketch below).

3) The Hessian is modified by the modified Cholesky factorization approach with δ = 10^-3 and β = 10^-6 (see the MATLAB sketch below).

4) Compare the results obtained by the three Hessian modification schemes above and conclude which one is best.


Please show the MATLAB work!
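Below is a minimal MATLAB sketch of the requested driver, not a definitive implementation. The routine names strongwolfe and modifyHessian are assumed placeholders: strongwolfe stands in for the strong Wolfe line search code posted on Canvas (not reproduced here, so its actual signature may differ), and modifyHessian stands for whichever of the three schemes is being tested.

% Line search Newton's method with Hessian modification for the
% Rosenbrock function f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2.
f = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
g = @(x) [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
           200*(x(2) - x(1)^2)];
H = @(x) [1200*x(1)^2 - 400*x(2) + 2, -400*x(1);
          -400*x(1),                   200];

x  = [0; 1];              % initial point x0 = (0, 1)^T
c1 = 1e-4;  c2 = 0.1;     % strong Wolfe parameters
iterLog = [];             % rows: [f(x), alpha, norm of gradient]

while norm(g(x)) > 1e-4
    B = modifyHessian(H(x));        % placeholder: one of schemes 1)-3)
    p = -(B \ g(x));                % modified Newton direction
    % placeholder call to the strong Wolfe line search from Canvas,
    % started from alpha0 = 1; the actual signature may differ
    alpha = strongwolfe(f, g, x, p, 1, c1, c2);
    x = x + alpha*p;
    iterLog(end+1, :) = [f(x), alpha, norm(g(x))]; %#ok<AGROW>
end
disp(iterLog(max(1, end-9):end, :))   % last 10 iterations

Running this once per modifyHessian variant sketched next produces the three tables of f, α, and ||∇f|| that part 4) asks to compare.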

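A sketch of scheme 1, the minimum-Frobenius-norm (non-diagonal) modification: diagonalize the Hessian and lift every eigenvalue below δ = 10^-3 up to δ. The function name is an assumed placeholder.

function Bmod = modifyHessianFro(B, delta)
% Scheme 1 (sketch): minimum-Frobenius-norm, non-diagonal modification.
% Diagonalize B and lift every eigenvalue below delta up to delta.
if nargin < 2, delta = 1e-3; end
[Q, Lam] = eig((B + B')/2);      % symmetrize to guard against round-off
lam  = max(diag(Lam), delta);    % lift small/negative eigenvalues
Bmod = Q * diag(lam) * Q';
end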

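A sketch of scheme 2, the minimum-Euclidean-norm (diagonal) modification: add τI with τ = max(0, δ - λmin(B)), the smallest multiple of the identity that brings all eigenvalues up to at least δ. The function name is again a placeholder.

function Bmod = modifyHessianEuc(B, delta)
% Scheme 2 (sketch): minimum-Euclidean-norm, diagonal modification.
% Add tau*I with tau = max(0, delta - lambda_min(B)), the smallest
% multiple of the identity lifting all eigenvalues to at least delta.
if nargin < 2, delta = 1e-3; end
tau  = max(0, delta - min(eig((B + B')/2)));
Bmod = B + tau * eye(size(B, 1));
end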

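A sketch of scheme 3, assuming the Gill-Murray-style LDL^T modified Cholesky as described in Nocedal and Wright; the exact variant studied in class may differ. Each pivot d_j is kept at least δ, and β enters through the bound d_j = max(|c_jj|, (θ_j/β)^2, δ), where θ_j is the largest subdiagonal entry of the current column.

function Bmod = modifyHessianMC(A, delta, beta)
% Scheme 3 (sketch): Gill-Murray-style LDL^T modified Cholesky.
% Pivots d(j) are forced to be at least delta; beta bounds the growth
% of subdiagonal entries via d(j) >= (theta_j / beta)^2. The exact
% variant used in class may differ from this one.
if nargin < 2, delta = 1e-3; end
if nargin < 3, beta  = 1e-6; end
n = size(A, 1);
L = eye(n);  d = zeros(n, 1);  C = zeros(n);
for j = 1:n
    C(j,j) = A(j,j) - (L(j,1:j-1).^2) * d(1:j-1);
    for i = j+1:n
        C(i,j) = A(i,j) - (L(i,1:j-1) .* L(j,1:j-1)) * d(1:j-1);
    end
    theta = 0;
    if j < n, theta = max(abs(C(j+1:n, j))); end
    d(j) = max([abs(C(j,j)), (theta/beta)^2, delta]);
    L(j+1:n, j) = C(j+1:n, j) / d(j);
end
Bmod = L * diag(d) * L';        % the modified Hessian
end

Note that a very small β makes the (θ_j/β)^2 term dominate and can over-modify the Hessian, so whether β = 10^-6 plays this exact role is worth checking against the class notes.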
