(a) Let \(f(x)=x^{100}\). Then \(f^{\prime}(x)=100 x^{99}\) and the Newton Method iteration is

\[

x_{n+1}=x_{n}-\frac{x_{n}^{100}}{100 x_{n}^{99}}=\frac{99}{100} x_{n} .

\]

So, to calculator accuracy, \(x_{1}=0.099\), \(x_{2}=0.09801\), \(x_{3}=0.0970299\), \(x_{4}=0.096059601\), and \(x_{5}=0.095099004\).

Note the slow rate of progress. The root is 0, of course, but after 5 steps we have barely inched closer to it.
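The slow shrinkage by a factor of \(99/100\) per step can be checked with a short script (a minimal sketch; the starting estimate \(x_{0}=0.1\) is an assumption, consistent with \(x_{1}=0.099\) above):

```python
def newton_step(x):
    # Newton iteration for f(x) = x**100:
    # x - f(x)/f'(x) = x - x/100 = (99/100) * x
    return x - x**100 / (100 * x**99)

x = 0.1  # assumed starting estimate, consistent with x1 = 0.099
estimates = []
for _ in range(5):
    x = newton_step(x)
    estimates.append(x)
print(estimates)  # each estimate is 99% of the previous one
```

Since the error is multiplied by only \(0.99\) per step, roughly 69 steps are needed just to halve it.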

(b) Let \(f(x)=3 x^{1 / 3}\). Then \(f^{\prime}(x)=x^{-2 / 3}\), and the Newton Method iteration becomes

\[

x_{n+1}=x_{n}-\frac{3 x_{n}^{1 / 3}}{x_{n}^{-2 / 3}}=x_{n}-3 x_{n}=-2 x_{n} .

\]

Now everything is easy. The next 10 estimates are \(-0.2,0.4,-0.8\), \(1.6,-3.2,6.4,-12.8,25.6,-51.2\), and \(102.4\). It is obvious that things are going badly. In fact, if we start with any non-zero estimate, the Newton Method estimates oscillate more and more wildly: each step doubles the distance from the root.
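The doubling can be confirmed numerically (a sketch; the starting estimate \(x_{0}=0.1\) is assumed, and a real cube root helper is used because Python's `**` returns a complex value for a negative base with a fractional exponent):

```python
import math

def cbrt(x):
    # real cube root, valid for negative x as well
    return math.copysign(abs(x) ** (1 / 3), x)

def newton_step(x):
    # Newton iteration for f(x) = 3*x**(1/3):
    # f'(x) = x**(-2/3), so f(x)/f'(x) = 3x and the step is x - 3x = -2x
    f = 3 * cbrt(x)
    fprime = cbrt(x) ** -2  # x**(-2/3)
    return x - f / fprime

x = 0.1  # assumed starting estimate
seq = []
for _ in range(10):
    x = newton_step(x)
    seq.append(x)
print(seq)  # magnitude doubles each step while the sign alternates
```

Geometrically, the tangent line at any point of \(y=3x^{1/3}\) crosses the axis on the far side of the root, twice as far away, so no starting point recovers.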

**Note.** The above two examples, with very slow convergence in (a) and total failure in (b), are not at all typical. Ordinarily the Newton Method is marvellously efficient, at least if the initial estimate is close enough to the truth.

Note that in part (a), successive estimates were quite close to each other, but not really close to the truth. So we need to be a little cautious about the usual rule of thumb that we can stop when two successive estimates agree to the number of decimals we are interested in. But still, in most cases, the rule of thumb is a good one.
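This caution can be made concrete (a sketch, again assuming the starting estimate \(x_{0}=0.1\)): iterate the simplified step from part (a) until two successive estimates agree to 6 decimal places, and compare the stopping point with the true root 0.

```python
# Stop when two successive Newton estimates for f(x) = x**100
# agree to 6 decimal places, then compare with the true root 0.
x = 0.1  # assumed starting estimate
steps = 0
while True:
    x_new = 0.99 * x  # the simplified Newton step from part (a)
    steps += 1
    if abs(x_new - x) < 0.5e-6:
        break
    x = x_new
print(steps, x_new)
```

The difference between successive estimates is only \(x_{n}/100\), so the stopping test is passed while the estimate itself is still about \(5\times 10^{-5}\): the rule of thumb reports two more correct decimals than we actually have.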