Question:
In Section 4.8 we considered Newton's method for approximating a root r of the equation f(x) = 0: starting from an initial approximation x_1, successive approximations x_2, x_3, ... are obtained from

x_{n+1} = x_n - f(x_n)/f'(x_n)
Use Taylor's Inequality with n = 1, a = x_n, and x = r to show that if f''(x) exists on an interval I containing r, x_n, and x_{n+1}, and |f''(x)| ≤ M, |f'(x)| ≥ K for all x ∈ I, then

|x_{n+1} - r| ≤ (M/(2K)) |x_n - r|^2
[This means that if x_n is accurate to d decimal places, then x_{n+1} is accurate to about 2d decimal places. More precisely, if the error at stage n is at most 10^(-m), then the error at stage n + 1 is at most (M/(2K)) · 10^(-2m).]
Step by Step Answer:
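The posted solution is not reproduced on this page. The following LaTeX sketch outlines one standard argument; it assumes Taylor's Inequality in the form |f(x) - T_1(x)| ≤ (M/2)|x - a|^2, where T_1 is the first-degree Taylor polynomial of f centered at a.

```latex
% Sketch of the argument, not the posted step-by-step answer.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Apply Taylor's Inequality with $n = 1$, $a = x_n$, evaluated at $x = r$.
Since $r$ is a root, $f(r) = 0$, so with the first-degree Taylor polynomial
$T_1(r) = f(x_n) + f'(x_n)(r - x_n)$ we get
\[
  \bigl| f(x_n) + f'(x_n)(r - x_n) \bigr| = \bigl| f(r) - T_1(r) \bigr|
  \le \frac{M}{2}\,|r - x_n|^2 .
\]
Dividing by $|f'(x_n)| \ge K > 0$ and using the Newton iteration
$x_{n+1} = x_n - f(x_n)/f'(x_n)$ gives
\[
  |x_{n+1} - r|
  = \left| \frac{f(x_n)}{f'(x_n)} + r - x_n \right|
  \le \frac{M}{2K}\,|x_n - r|^2 .
\]
\end{document}
```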
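To illustrate the bracketed remark about the number of accurate decimal places roughly doubling, here is a small numerical check; the example f(x) = x^2 - 2, the interval I = [1, 2], and the constants M = 2, K = 2 (so M/(2K) = 0.5) are choices made here for illustration and are not part of the original question.

```python
import math

# Newton's method on f(x) = x^2 - 2, whose positive root is r = sqrt(2).
# On I = [1, 2]: |f''(x)| = 2, so M = 2; |f'(x)| = 2x >= 2, so K = 2.
# The bound from the problem then reads |x_{n+1} - r| <= 0.5 * |x_n - r|^2.

def f(x):
    return x * x - 2.0

def fprime(x):
    return 2.0 * x

r = math.sqrt(2.0)
x = 1.5            # initial approximation x_1, inside I = [1, 2]
prev_err = abs(x - r)

print(f"{'n':>2} {'x_n':>20} {'error |x_n - r|':>17} {'bound 0.5*err^2':>17}")
print(f"{1:>2} {x:>20.15f} {prev_err:>17.2e} {'':>17}")

for n in range(2, 5):
    x = x - f(x) / fprime(x)        # Newton iteration
    err = abs(x - r)
    bound = 0.5 * prev_err ** 2     # (M/(2K)) * |x_{n-1} - r|^2
    print(f"{n:>2} {x:>20.15f} {err:>17.2e} {bound:>17.2e}")
    prev_err = err
```

At each step the error drops from roughly 10^(-m) to at most 0.5 · 10^(-2m), so the number of accurate decimal places approximately doubles (after a few more iterations the error falls below double-precision roundoff and the pattern is no longer visible).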