Suppose that $f$ is an infinitely differentiable function on $\mathbb{R}$. Assume that there exist constants $0 < C_1, C_2 < \infty$ such that $|f''(x)| \ge C_1$ and $|f'''(x)| \le C_2$ for all $x \in \mathbb{R}$. Fix $x_0 \in \mathbb{R}$ and for each $n \in \mathbb{N}$ set
$$x_n = x_{n-1} - \frac{f'(x_{n-1})}{f''(x_{n-1})}.$$
Let $x^*$ be the unique value of $x$ at which $f$ attains its minimum. Prove that
$$|x^* - x_{n+1}| \le \frac{C_2}{2C_1}\,|x^* - x_n|^2 \quad \text{for all } n \in \mathbb{N}.$$
[Hint: Express $f'(x^*)$ in terms of the Taylor series for $f'$ at $x_n$ using the Lagrange form of the remainder:
$$f'(x^*) = f'(x_n) + f''(x_n)(x^* - x_n) + \tfrac{1}{2} f'''(y_n)(x^* - x_n)^2,$$
where $y_n$ is between $x_n$ and $x^*$.]
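One way the hint can be organized (a sketch only, using the optimality condition $f'(x^*) = 0$ at the minimizer together with the update rule):
$$0 = f'(x^*) = f'(x_n) + f''(x_n)(x^* - x_n) + \tfrac{1}{2} f'''(y_n)(x^* - x_n)^2,$$
so dividing by $f''(x_n) \ne 0$ and using $x_{n+1} = x_n - f'(x_n)/f''(x_n)$ gives
$$x^* - x_{n+1} = -\frac{f'''(y_n)}{2 f''(x_n)}\,(x^* - x_n)^2,$$
and the stated bound then follows from the assumptions $|f'''(y_n)| \le C_2$ and $|f''(x_n)| \ge C_1$.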