Question

Let f(x) be a polynomial and let r be a root of f(x). If x_1 is sufficiently close to r, then x_2 = i(x_1) is closer, x_3 = i(x_2) is closer still, and so on. Here i(x) = x - f(x)/f'(x) is what we called the improvement function.


a. Let f(x) = x^2 - 10. Compute i(x) in simplified form (i.e., everything in one big fraction involving x). Let r = sqrt(10) and x_1 = 3. Show a hand computation of x_2 and then x_3, expressing both answers first as fractions and then (using a calculator if you like) as decimals.
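The hand computation in part (a) can be checked with a short Python sketch using exact rational arithmetic. This is only a verification aid, not the requested hand work; the function name `improve` is an illustrative label, not from the assignment.

```python
from fractions import Fraction

def improve(x):
    # Newton improvement function for f(x) = x^2 - 10:
    # i(x) = x - (x^2 - 10)/(2x) = (x^2 + 10)/(2x)
    return (x * x + 10) / (2 * x)

x1 = Fraction(3)
x2 = improve(x1)   # exact fraction 19/6
x3 = improve(x2)   # exact fraction 721/228
print(x2, float(x2))
print(x3, float(x3))
```

Because `Fraction` inputs propagate through `improve`, the iterates come out as exact fractions, matching what the hand computation should produce.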

b. A calculator says that sqrt(10) is 3.162277660168379331... Using your answer to (a), compute x_3 - sqrt(10) to five significant figures, writing your answer without using scientific notation. (If you're using Mathematica, something like N[Pi, 30] to get 30 digits of accuracy would be useful.)
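Part (b) can likewise be sanity-checked with Python's decimal module. The choice of 30 working digits here is an assumption, mirroring the N[..., 30] suggestion; the value 721/228 for x_3 comes from part (a).

```python
from decimal import Decimal, getcontext

getcontext().prec = 30               # 30 digits, like N[..., 30]
x3 = Decimal(721) / Decimal(228)     # x_3 from part (a)
error = x3 - Decimal(10).sqrt()      # x_3 - sqrt(10)
print(error)                         # about 0.0000030416 to five significant figures
```

Working at 30 digits keeps the subtraction well clear of the roundoff floor, so the leading five significant figures of the error are trustworthy.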

c. Hand-draw (copying from a calculator or Mathematica is fine) a reasonably accurate picture of the situation. Your picture must show the graph of f(x), the x-values x_1, x_2, x_3, and r, and the two tangent lines relevant to your computation. If you think it's clearer, you can draw two different pictures at different scales.

Homework Answers

Answer #1


Similar Questions
Let R be the polynomial ring in infinitely many variables x_1, x_2, ... with coefficients in a field F. Let M be the cyclic R-module R itself. Prove that the submodule (x_1, x_2, ...) cannot be generated by any finite set.
Let f(x) = |x| for −1 ≤ x ≤ 1 and extend f periodically to R by f(x + 2) = f(x). Complete the following: (a) Draw a picture of f. (b) Calculate the Fourier series for f thought of as an element of L^2[−1, 1]. (PDE)
Let R = R[x], f ∈ R \ {0}, and I = (f). Show that R/I is an integral domain if and only if f is an irreducible polynomial.
Let R = R[x], f ∈ R \ {0}, and I = (f). Show that R/I is a real vector space of dimension equal to deg(f).
Let R be the region bounded above by f(x) = 3 times the square root of x and the x-axis between x = 4 and x = 16. Approximate the area of R using a midpoint Riemann sum with n = 6 subintervals. Sketch a graph of R and illustrate how you are approximating its area with rectangles. Round your answer to three decimal places.
Let f : [0, 1] → R be defined by f(x) := x. Show that f is Riemann integrable on [0, 1] and compute the integral of f from 0 to 1 using both the definition of the integral and Riemann (Darboux) sums.
Let f : R → R be defined by f(x) = x^3 + 3x, for all x. (i) Prove that if y > 0, then there is a solution x to the equation f(x) = y, for some x > 0. Conclude that f(R) = R. (ii) Prove that the function f : R → R is strictly monotone. (iii) By (i)–(ii), denote the inverse function (f^−1)' : R → R. Explain why the derivative of the inverse function,...
Let f : R \ {1} → R be given by f(x) = 1/(1 − x). (a) Prove by induction that f^(n)(x) = n!/(1 − x)^(n+1) for all n ∈ N. Note: f^(n)(x) denotes the n-th derivative of f. You may use the usual differentiation rules without further proof. (b) Compute the Taylor series of f about x = 0. (You must provide justification by relating this specific Taylor series to...
Let f : R → R^+ be defined by the formula f(x) = 10^(2−x). Show that f is injective and surjective, and find the formula for f^−1(x). Suppose f : A → B and g : B → A. Prove that if f is injective and f ∘ g = i_B, then g = f^−1.
1. (a) Let f(x) = exp(x), x ∈ R. Show that f is invertible and compute the derivative of f^−1(y) in terms of y. [5] (b) Find the Taylor series and radius of convergence for g(x) = log(1 + x) about x = 0. [6]