
Question


1. Program in Python or MATLAB the gradient descent algorithm, using the backtracking line search described below to find the step sizes α_k. Use them to minimize Rosenbrock's function

f(x_1, x_2) = 100(x_2 − x_1^2)^2 + (1 − x_1)^2

and terminate the algorithm when ‖∇f(x)‖ ≤ ε = 10^−4. Set the initial step length ᾱ = 1 and print the step length used by each method at each iteration. First try the initial point x_0 = (1.2, 1.2) and then the more difficult point x_0 = (−1.2, 1).

Here is the algorithm for the backtracking line search:

Algorithm (Backtracking Line Search with search direction p_k)
    Choose ᾱ > 0, ρ ∈ (0, 1), c ∈ (0, 1); set α ← ᾱ;
    repeat until f(x_k + α p_k) ≤ f(x_k) + c α ∇f(x_k)^T p_k
        α ← ρ α
    end (repeat)
    Terminate with α_k = α
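
A minimal Python sketch of one way to implement this, assuming the steepest-descent direction p_k = −∇f(x_k) and the hypothetical backtracking parameters ρ = 0.5 and c = 10^−4, which the question leaves open:

import numpy as np

def f(x):
    # Rosenbrock's function: f(x1, x2) = 100(x2 - x1^2)^2 + (1 - x1)^2
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def grad_f(x):
    # Analytic gradient of Rosenbrock's function
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def backtracking(x, p, alpha_bar=1.0, rho=0.5, c=1e-4):
    # Shrink alpha until the sufficient-decrease condition holds:
    # f(x + alpha*p) <= f(x) + c * alpha * grad_f(x)^T p
    alpha = alpha_bar
    fx, slope = f(x), grad_f(x) @ p
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

def gradient_descent(x0, eps=1e-4, max_iter=50000):
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:   # terminate when ||grad f(x)|| <= 1e-4
            break
        p = -g                         # steepest-descent search direction
        alpha = backtracking(x, p)     # backtracking starts from alpha_bar = 1
        print(f"iteration {k}: step length = {alpha:.6g}")
        x = x + alpha * p
    return x

print(gradient_descent([1.2, 1.2]))
print(gradient_descent([-1.2, 1.0]))

Steepest descent converges slowly on Rosenbrock's function, especially from (−1.2, 1), so max_iter is set generously; the two calls at the bottom run both starting points required by the question.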

Step by Step Solution

There are 3 steps involved in it; the contents of Steps 1–3 appear only as blurred preview images.

