The Iteration Formula

xₙ₊₁ = xₙ − f(xₙ)/f'(xₙ)

Each iteration draws the tangent line at the current guess and finds where it hits the x-axis. That crossing point becomes the next (better) guess. The convergence is quadratic — correct digits roughly double each step.
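The update rule translates directly into a short loop. Here is a minimal sketch (the function name `newton` and its parameters are illustrative, not from any library):

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Repeat x <- x - f(x)/f'(x) until successive guesses agree."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / df(x)
        if abs(x_new - x) < tol:   # stop once the step is tiny
            return x_new
        x = x_new
    return x

# Solve x^2 - 2 = 0 from x0 = 1: converges to sqrt(2) in a few steps
print(newton(lambda x: x*x - 2, lambda x: 2*x, 1.0))
```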

Example 1 — Find √3 (Solve x²=3)

📋 f(x) = x²−3, f'(x) = 2x. Start x₀ = 2
Formula: xₙ₊₁ = xₙ − (xₙ²−3)/(2xₙ) = (xₙ + 3/xₙ)/2
x₁= (2 + 3/2)/2 = 1.75
x₂= (1.75 + 3/1.75)/2 = 1.7321...
x₃= 1.73205... ← √3 correct to 5 decimal places

Example 2 — Find √5

📋 f(x) = x²−5. Start x₀ = 2
x₁= (2 + 5/2)/2 = 2.25
x₂= (2.25 + 5/2.25)/2 = 2.2361...
x₃= 2.23607... ← √5 to 5 decimal places
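Examples 1 and 2 use the same simplified recurrence xₙ₊₁ = (xₙ + a/xₙ)/2, which works for any a > 0 (it is the ancient Babylonian/Heron method, recovered by applying Newton's Method to f(x) = x² − a). A sketch with an illustrative `newton_sqrt` helper:

```python
def newton_sqrt(a, x=2.0, steps=4):
    """Babylonian/Heron form of Newton's method for f(x) = x^2 - a."""
    for _ in range(steps):
        x = (x + a / x) / 2   # x <- x - (x^2 - a)/(2x), simplified
    return x

print(newton_sqrt(3))  # ≈ 1.7320508...
print(newton_sqrt(5))  # ≈ 2.2360679...
```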

Example 3 — Cubic Equation x³−x−1=0

📋 f(x) = x³−x−1, f'(x) = 3x²−1. Start x₀ = 1.5
x₁= 1.5 − (1.5³−1.5−1)/(3(1.5)²−1) = 1.5 − (3.375−2.5)/5.75 = 1.3478
x₂= 1.3252
x₃= 1.32472... (converged)
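The three iterates above can be reproduced with a short loop (the lambda names are illustrative):

```python
f  = lambda x: x**3 - x - 1      # the cubic from Example 3
df = lambda x: 3*x**2 - 1        # its derivative

x = 1.5
for n in range(1, 4):
    x = x - f(x) / df(x)
    print(n, round(x, 5))        # 1.34783, 1.32520, 1.32472
```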

Example 4 — Transcendental: cos(x) = x

📋 f(x) = cos(x)−x, f'(x) = −sin(x)−1. Start x₀ = 1
x₁= 1 − (cos1−1)/(−sin1−1) = 1 − (−0.4597)/(−1.8415) = 0.7504
x₂= 0.7391
x₃= 0.73909... ← the fixed point of cos(x), called the Dottie number
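The Dottie number computation is easy to check with Python's `math` module:

```python
import math

x = 1.0
for _ in range(5):
    # Newton step for f(x) = cos(x) - x, f'(x) = -sin(x) - 1
    x = x - (math.cos(x) - x) / (-math.sin(x) - 1)
print(round(x, 6))  # 0.739085, the Dottie number
```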

Example 5 — Find the Real Root of x³−2x−5=0

📋 f(x) = x³−2x−5, f'(x) = 3x²−2. Start x₀ = 2 (near the real root)
x₁= 2 − (8−4−5)/(12−2) = 2 − (−1)/10 = 2.1
x₂= 2.0946
x₃= 2.09455... (converged)
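A quick way to see the convergence is to track the residual |f(xₙ)|, which shrinks roughly quadratically (an illustrative sketch):

```python
f  = lambda x: x**3 - 2*x - 5    # the cubic from Example 5
df = lambda x: 3*x**2 - 2

x = 2.0
for n in range(1, 5):
    x = x - f(x) / df(x)
    print(n, round(x, 6), abs(f(x)))  # residual collapses toward zero
```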

Example 6 — Failure Case: f(x) = x^(1/3) Starting at x₀ = 1

📋 Demonstrates when Newton's Method fails
f(x)= x^(1/3), f'(x) = (1/3)x^(−2/3)
x₁= 1 − (1)/((1/3)) = 1 − 3 = −2
x₂= −2 − ((−2)^(1/3))/((1/3)(−2)^(−2/3)) = −2 + 6 = 4
Pattern: Each step simplifies to xₙ₊₁ = −2xₙ, so the iterates alternate sign and double in magnitude: 1, −2, 4, −8, ..., diverging away from the root x=0
Why: f has a vertical tangent at x=0 (f' → ∞ there), so each tangent line crosses the x-axis farther from the root than the previous guess. Newton's Method can also fail when f'(xₙ) is zero or near zero, which sends the next iterate far away.
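The divergence is easy to reproduce. One subtlety: in Python, `x ** (1/3)` yields a complex value for negative x, so this sketch uses a signed cube-root helper (`cbrt` is an illustrative name, not the standard library's):

```python
import math

def cbrt(x):
    # Real cube root; plain x ** (1/3) returns a complex value for x < 0.
    return math.copysign(abs(x) ** (1/3), x)

x = 1.0
for _ in range(4):
    # Newton step for f(x) = x^(1/3); algebraically this is x <- -2x
    x = x - cbrt(x) / ((1/3) * abs(x) ** (-2/3))
    print(x)  # roughly 1 -> -2 -> 4 -> -8 -> 16, up to rounding
```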

Frequently Asked Questions

How do I know when to stop iterating?
Stop when |x_{n+1} − x_n| is smaller than your desired accuracy (e.g., < 0.0001), or when |f(x_n)| is sufficiently close to zero. In practice, 4–6 iterations usually give 10+ decimal places of accuracy for well-behaved functions.
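Both stopping rules can be combined in code, along with an iteration cap as a safety net. A sketch (names such as `newton_solve`, `xtol`, and `ftol` are illustrative):

```python
def newton_solve(f, df, x0, xtol=1e-10, ftol=1e-12, max_iter=100):
    """Stop on a tiny step OR a tiny residual, with an iteration cap."""
    x = x0
    for n in range(max_iter):
        step = f(x) / df(x)
        x = x - step
        if abs(step) < xtol or abs(f(x)) < ftol:
            return x, n + 1              # estimate and iterations used
    raise RuntimeError("did not converge within max_iter")

root, iters = newton_solve(lambda x: x*x - 3, lambda x: 2*x, 2.0)
print(root, iters)
```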
Why does Newton's Method converge so fast?
The error at each step satisfies |e_{n+1}| ≈ C·|e_n|² — the error is squared. This quadratic convergence means: if you start with 2 correct digits, the next step gives 4, then 8, then 16. By comparison, bisection gains about one binary digit per step (linear convergence) and the secant method has order ≈ 1.62, so among simple root-finding methods Newton's is the fastest.
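You can watch the digits double by running the iteration at high precision with Python's `decimal` module (here for √2):

```python
from decimal import Decimal, getcontext

getcontext().prec = 60               # work with 60 significant digits
target = Decimal(2).sqrt()           # high-precision reference for sqrt(2)

x = Decimal(1)
for n in range(6):
    x = (x + 2 / x) / 2              # Newton step for f(x) = x^2 - 2
    print(n + 1, abs(x - target))    # error roughly squares each iteration
```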

The Formula

Starting from an initial guess x₀, each iteration improves the estimate:

xₙ₊₁ = xₙ − f(xₙ)/f'(xₙ)

The convergence is quadratic: if xₙ has k correct decimal places, then xₙ₊₁ typically has 2k correct places. Starting with 2 correct digits gives 4, then 8, then 16 in subsequent steps.

More Worked Examples

📋 Solve e^x = 3x (find intersection)
f(x)= eˣ − 3x, f'(x) = eˣ − 3. Sign check: f(1) = e−3 < 0, f(2) = e²−6 > 0 → root in (1,2)
x₀ = 1.5: f(1.5) = 4.4817−4.5 = −0.0183. x₁ = 1.5 − (−0.0183)/(4.4817−3) = 1.5 + 0.0124 = 1.5124
x₂= 1.5121 ← converged to 4 decimal places
📋 Find ∛7 (cube root of 7)
f(x)= x³ − 7, f'(x) = 3x². Formula: xₙ₊₁ = xₙ − (xₙ³−7)/(3xₙ²) = (2xₙ + 7/xₙ²)/3
x₀ = 2: x₁ = (4 + 7/4)/3 = (4+1.75)/3 = 1.917
x₂= 1.9129 ← ∛7 ≈ 1.9129 ✓
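Both worked examples can be checked with a few lines of direct iteration (a sketch; the starting points and iteration counts are chosen for illustration):

```python
import math

# e^x = 3x: iterate x <- x - (e^x - 3x)/(e^x - 3) from x0 = 1.5
x = 1.5
for _ in range(6):
    x = x - (math.exp(x) - 3*x) / (math.exp(x) - 3)
print(round(x, 4))  # 1.5121

# cube root of 7: the simplified form x <- (2x + 7/x^2)/3 from x0 = 2
y = 2.0
for _ in range(6):
    y = (2*y + 7 / y**2) / 3
print(round(y, 4))  # 1.9129
```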

Newton's Method in Computing

The Newton-Raphson method (as it is formally called) is used in virtually every scientific computing library. Hardware floating-point units often refine reciprocal and reciprocal-square-root estimates with Newton-Raphson iterations, and the "Fast InvSqrt" routine in Quake III Arena's source code became famous for combining one Newton step with a remarkable bit-level initial guess. MATLAB's solver routines, SciPy's `optimize.newton`, and implicit ODE integrators all rely on Newton's method at their core.
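For the reciprocal square root specifically, applying Newton's Method to f(y) = 1/y² − x gives the division-free update y ← y(1.5 − 0.5·x·y²), which is the refinement step used after the famous bit-trick guess. A sketch that replaces the bit trick with a plain rough guess for clarity:

```python
def refine_inv_sqrt(x, y):
    # One Newton step for f(y) = 1/y^2 - x; note there is no division.
    return y * (1.5 - 0.5 * x * y * y)

x = 2.0
y = 0.7                       # crude starting guess for 1/sqrt(2) ~ 0.7071
for _ in range(3):
    y = refine_inv_sqrt(x, y)
print(y)                      # converges to 1/sqrt(2) ≈ 0.7071067812
```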

When you use a financial calculator to find the yield of a bond, or a GPS receiver to triangulate position, Newton's method is running somewhere under the hood.

Dr. Aisha Malik, PhD Mathematics
Senior Lecturer in Applied Mathematics · 12 years teaching calculus
Dr. Malik holds a PhD in Applied Mathematics from the University of Edinburgh and has taught calculus to over 4,000 students. She has reviewed this article for mathematical accuracy and pedagogical clarity.
Technically reviewed by: Prof. James Chen, Stanford Mathematics Department · April 2026