Question

Consider the unconstrained NLP

max f(x1, x2) = x1·x2 − 5(x1 − 2)^4 − 3(x2 − 5)^4

once with the starting point (1, 3) and again with the starting point (4, 0).

(a) Use graphing software to produce a contour map of the objective function for x1 ∈ [1, 4], x2 ∈ [2, 8].
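Part (a) asks for graphing software; as a minimal sketch (assuming Python is an acceptable tool, with an arbitrary 61-point grid density), the objective can be tabulated over the requested window and the resulting grid handed to any contouring tool:

```python
# Tabulate f(x1, x2) = x1*x2 - 5(x1-2)^4 - 3(x2-5)^4 over the
# window x1 in [1, 4], x2 in [2, 8] from part (a).  The grid of
# values can be passed to any contouring tool (e.g. matplotlib's
# plt.contour, gnuplot, or a spreadsheet).

def f(x1, x2):
    return x1 * x2 - 5 * (x1 - 2) ** 4 - 3 * (x2 - 5) ** 4

steps = 61                                           # arbitrary grid density
x1_vals = [1 + 3 * i / (steps - 1) for i in range(steps)]   # x1 in [1, 4]
x2_vals = [2 + 6 * j / (steps - 1) for j in range(steps)]   # x2 in [2, 8]
grid = [[f(x1, x2) for x1 in x1_vals] for x2 in x2_vals]
```

Note that at (2, 5) both quartic penalty terms vanish, so f(2, 5) = 10, which gives a quick sanity check on the tabulated values.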

(b) Compute the move direction that would be pursued by the gradient search algorithm:

Step 0: Initialization. Choose any starting solution x(0), pick stopping tolerance ε > 0, and set solution index t ← 0.

Step 1: Gradient. Compute the objective function gradient ∇f(x(t)) at the current point x(t).

Step 2: Stationary Point. If gradient norm ‖∇f(x(t))‖ < ε, stop; x(t) is sufficiently close to a stationary point.

Step 3: Direction. Choose gradient move direction Δx(t+1) ← ±∇f(x(t)) (+ for maximize and − for minimize).

Step 4: Line Search. Solve (at least approximately) the corresponding one-dimensional line search, max or min f(x(t) + λ·Δx(t+1)), to compute λ(t+1).

Step 5: New Point. Update x(t+1) ← x(t) + λ(t+1)·Δx(t+1).

Step 6: Advance. Increment t ← t + 1, and return to Step 1.

Then compute the gradient norm ‖∇f(x(t))‖ = √( Σj (∂f/∂xj)² ) at x(0) = (1, 3).
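For part (b), the partial derivatives can be taken by hand and evaluated at x(0) = (1, 3); a small sketch checking that arithmetic (function and variable names are my own):

```python
import math

# Hand-differentiated gradient of
# f(x1, x2) = x1*x2 - 5(x1-2)^4 - 3(x2-5)^4:
#   df/dx1 = x2 - 20(x1 - 2)^3
#   df/dx2 = x1 - 12(x2 - 5)^3

def grad(x1, x2):
    return (x2 - 20 * (x1 - 2) ** 3, x1 - 12 * (x2 - 5) ** 3)

g = grad(1, 3)          # move direction for a maximize problem: +gradient
norm = math.hypot(*g)   # square root of the sum of squared partials
print(g, norm)          # (23, 97), norm about 99.69
```

So at x(0) = (1, 3) the maximize move direction is Δx(1) = +∇f(x(0)) = (23, 97).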

(c) State the line search problem implied by your direction.

(d) Solve your line search problem graphically and compute the next search point x(1).
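For parts (c)–(d): with the direction from part (b), the implied line search is max over λ ≥ 0 of f((1, 3) + λ·(23, 97)). As a numeric stand-in for the graphical solution, here is a golden-section sketch; the search bracket [0, 0.1] is my assumption, chosen because the quartic penalties keep the best step small:

```python
def f(x1, x2):
    return x1 * x2 - 5 * (x1 - 2) ** 4 - 3 * (x2 - 5) ** 4

def golden_max(phi, a, b, tol=1e-7):
    """Golden-section search for a maximizer of phi on [a, b]."""
    r = (5 ** 0.5 - 1) / 2                 # golden-ratio conjugate
    c, d = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if phi(c) > phi(d):                # maximizer lies in [a, d]
            b, d = d, c
            c = b - r * (b - a)
        else:                              # maximizer lies in [c, b]
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

x0, step = (1.0, 3.0), (23.0, 97.0)        # x(0) and gradient direction
lam = golden_max(lambda t: f(x0[0] + t * step[0], x0[1] + t * step[1]),
                 0.0, 0.1)                 # bracket [0, 0.1]: my assumption
x1_pt = (x0[0] + lam * step[0], x0[1] + lam * step[1])
print(lam, x1_pt, f(*x1_pt))
```

The objective rises from f(1, 3) = −50 to roughly 8.6 at the resulting x(1), consistent with what the graphical solution should show.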

(e) Do two additional iterations of the algorithm above to compute x(2) and x(3).
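For part (e), the loop can be sketched end to end, repeating the gradient and golden-section pieces so the block is self-contained (three iterations from x(0) = (1, 3) produce x(1), x(2), x(3); the [0, 0.1] bracket is again my assumption):

```python
def f(x1, x2):
    return x1 * x2 - 5 * (x1 - 2) ** 4 - 3 * (x2 - 5) ** 4

def grad(x1, x2):
    # Hand-differentiated partials of f.
    return (x2 - 20 * (x1 - 2) ** 3, x1 - 12 * (x2 - 5) ** 3)

def golden_max(phi, a, b, tol=1e-7):
    """Golden-section search for a maximizer of phi on [a, b]."""
    r = (5 ** 0.5 - 1) / 2
    c, d = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if phi(c) > phi(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

x = (1.0, 3.0)                      # x(0)
path = [x]
for t in range(3):                  # yields x(1), x(2), x(3)
    dx = grad(*x)                   # maximize: move along +gradient
    lam = golden_max(lambda s: f(x[0] + s * dx[0], x[1] + s * dx[1]),
                     0.0, 0.1)      # bracket [0, 0.1]: my assumption
    x = (x[0] + lam * dx[0], x[1] + lam * dx[1])
    path.append(x)

for t, pt in enumerate(path):
    print(t, pt, f(*pt))            # objective climbs at every step
```

Printing the visited points as (x1, x2) pairs also gives exactly the coordinates needed to plot the search path on the part (a) contour map for part (f).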

(f) Plot progress of the search on the contour map of part (a).
