The Problem
Most equations f(x) = 0 cannot be solved algebraically. Newton's Method uses derivatives to solve them numerically — starting from an initial guess and improving it rapidly.
xₙ₊₁ = xₙ − f(xₙ)/f'(xₙ)
At each step, draw the tangent line to f at the current point. The tangent line crosses the x-axis at the next (better) guess.
📋 Find √2 using Newton's Method
f(x) = x² − 2, f'(x) = 2x. Formula: xₙ₊₁ = (xₙ + 2/xₙ)/2. Start x₀ = 1.
x₁ = (1 + 2)/2 = 1.5
x₂ = (1.5 + 1.3333)/2 = 1.41667
x₃ = 1.41422 ← correct to 5 decimal places after 3 steps
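If you want to reproduce this on a computer, here is a minimal Python sketch of the same iteration (the function name sqrt2_newton and the printing are illustrative, not from any standard library):

```python
def sqrt2_newton(x0=1.0, steps=3):
    """Apply x_{n+1} = (x_n + 2/x_n) / 2, the Newton update for f(x) = x**2 - 2."""
    x = x0
    for n in range(steps):
        x = (x + 2.0 / x) / 2.0
        print(f"x_{n + 1} = {x:.6f}")
    return x

sqrt2_newton()  # prints 1.500000, 1.416667, 1.414216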
Quadratic Convergence
Newton's Method converges quadratically — the number of correct decimal places roughly doubles with each iteration. Starting with 2 correct digits, the next iteration gives 4, then 8, then 16.
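One way to see the doubling is to print the error at each step. The sketch below does this for the √2 iteration from the example above; the variable names are illustrative:

```python
import math

x, root = 1.0, math.sqrt(2)
for n in range(4):
    x = (x + 2.0 / x) / 2.0          # Newton step for f(x) = x**2 - 2
    print(f"step {n + 1}: error = {abs(x - root):.1e}")
# errors: 8.6e-02, 2.5e-03, 2.1e-06, 1.6e-12 (the number of correct digits roughly doubles)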
⚠ When It Fails
Newton's Method can fail if the initial guess is far from the root, if f'(xₙ) = 0 (division by zero), or if the iterates fall into a cycle and oscillate between the same values. Always check that your answer satisfies f(x) ≈ 0.
Frequently Asked Questions
How do I choose the initial guess?
Plot the function or use physical intuition. The closer to the actual root, the faster the convergence. If you have no idea, bisect the interval first to narrow down the region.
Is Newton's Method still used today?
Absolutely. It underpins modern numerical solvers in engineering, physics simulations, and machine learning optimisation. It is one of the most practically important algorithms in applied mathematics.
Where the Formula Comes From
At the current guess xₙ, the tangent line to f is: y = f(xₙ) + f'(xₙ)(x − xₙ). Set y = 0 to find where the tangent crosses the x-axis: 0 = f(xₙ) + f'(xₙ)(x − xₙ), which gives x = xₙ − f(xₙ)/f'(xₙ). This is the formula. Every iteration replaces the curve with its tangent and finds where that tangent hits zero.
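Translated into code, the derivation is a single line. Here is one possible Python rendering (the helper name newton_step is my own, not a library function):

```python
def newton_step(f, fprime, x):
    """Return where the tangent line to f at x crosses the x-axis."""
    return x - f(x) / fprime(x)

# One step towards the root of x**2 - 2, starting from x = 1:
print(newton_step(lambda x: x**2 - 2, lambda x: 2 * x, 1.0))  # 1.5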
More Worked Examples
📋 Solve x⁴ − 5 = 0 (find ⁴√5)
Setup: f(x) = x⁴ − 5, f'(x) = 4x³. Formula: xₙ₊₁ = xₙ − (xₙ⁴ − 5)/(4xₙ³)
x₀ = 1.5: f(1.5) = 0.0625, f'(1.5) = 13.5. x₁ = 1.5 − 0.0625/13.5 = 1.4954
x₂ = 1.4953 ← ⁴√5 correct to 4 decimal places in 2 steps
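A short sketch that reproduces these two steps (the variable names are illustrative):

```python
x = 1.5
for _ in range(2):
    x = x - (x**4 - 5) / (4 * x**3)   # Newton step for f(x) = x**4 - 5
print(round(x, 4))                    # 1.4953, the fourth root of 5 to 4 decimal places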
📋 Solve ln(x) = 2 − x
Setup: f(x) = ln(x) − 2 + x, f'(x) = 1/x + 1. Start x₀ = 1
x₁ = 1 − (0 − 2 + 1)/(1 + 1) = 1 + 0.5 = 1.5
x₂ = 1.5 − (0.405 − 0.5)/(0.667 + 1) = 1.557. Converging to ≈ 1.557
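The same kind of check for this transcendental example, using Python's math.log for ln (names illustrative):

```python
import math

f = lambda x: math.log(x) - 2 + x     # f(x) = ln(x) - 2 + x
fp = lambda x: 1 / x + 1              # f'(x) = 1/x + 1
x = 1.0
for _ in range(4):
    x = x - f(x) / fp(x)
print(round(x, 5))                    # 1.55715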
When to Stop
Stop iterating when |xₙ₊₁ − xₙ| < your desired tolerance (e.g. 0.0001), or when |f(xₙ)| < tolerance. Always verify: plug your answer back into f(x) and confirm it is close to zero. Three to six iterations typically give ten or more decimal places of accuracy for well-behaved functions.
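One possible way to encode these stopping rules; the tolerance values and names here are arbitrary illustrative choices, not recommendations:

```python
def newton_solve(f, fprime, x0, tol=1e-10, max_iter=50):
    """Iterate until successive guesses agree to within tol, then sanity-check f(x)."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / fprime(x)
        if abs(x_next - x) < tol:       # stopping rule: guesses have stopped moving
            x = x_next
            break
        x = x_next
    if abs(f(x)) > 1e-8:                # verification: plug the answer back in, f(x) should be ≈ 0
        raise RuntimeError("not converged; try a different starting point")
    return x

print(newton_solve(lambda x: x**2 - 2, lambda x: 2 * x, 1.0))  # 1.4142135623730951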
Failure Modes in Detail
📋 When Newton's method fails
f'(xₙ) = 0: Division by zero. The tangent is horizontal and never crosses the x-axis. Fix: pick a different starting point.
Far from root: The tangent may point to a completely different root, or the iterates may diverge. Fix: use bisection to narrow down first.
Multiple roots: If roots are close together, the method may jump between them. Fix: plot f first to see the root locations.
Cycle: Some functions produce x₁ → x₂ → x₁ → x₂ ... (see the sketch below). Fix: choose a different starting point.
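To make the cycling failure concrete, here is a sketch using f(x) = x³ − 2x + 2 with x₀ = 0, a standard textbook example of a two-cycle, chosen here purely for illustration:

```python
f = lambda x: x**3 - 2 * x + 2        # cycles when started from x0 = 0
fp = lambda x: 3 * x**2 - 2
x = 0.0
for n in range(6):
    x = x - f(x) / fp(x)
    print(f"x_{n + 1} = {x}")
# output alternates 1.0, 0.0, 1.0, 0.0, ... and never converges;
# starting from x0 = -2 instead converges to the real root near -1.769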
Two More Examples
📋 Find all roots of x³ − x − 1 = 0
Analysis: f(1) = −1 < 0, f(2) = 5 > 0 → one real root near x = 1.3. f'(x) = 3x² − 1. Start x₀ = 1.5
x₁ = 1.5 − (3.375 − 1.5 − 1)/(6.75 − 1) = 1.5 − 0.875/5.75 = 1.3478
x₂ = 1.3252
x₃ = 1.3247 ← converged
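A brief sketch reproducing these steps from x₀ = 1.5 (names illustrative):

```python
f = lambda x: x**3 - x - 1
fp = lambda x: 3 * x**2 - 1
x = 1.5                               # x0
for _ in range(3):
    x = x - f(x) / fp(x)
print(round(x, 4))                    # 1.3247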
📋 Minimise f(x) = x⁴ − 4x² using Newton's method on f'
Goal: Find zeros of f'(x) = 4x³ − 8x. Apply Newton to g(x) = f'(x) with g'(x) = f''(x) = 12x² − 8
Start x₀ = 1.5: g(1.5) = 13.5 − 12 = 1.5. g'(1.5) = 27 − 8 = 19. x₁ = 1.5 − 1.5/19 = 1.421
x₂ = 1.414 ← √2, where f has its minimum (by symmetry, x = −√2 is the other minimum)
Newton's Method vs Bisection
Bisection is slower (linear convergence — one more decimal place per step) but guaranteed to converge if you start with a bracket [a,b] where f(a) and f(b) have opposite signs. Newton's method is far faster (quadratic convergence) but can fail. In practice, use bisection to find a rough bracket, then switch to Newton's method to refine quickly. This is the "hybrid" approach used by production numerical solvers.
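A rough sketch of the hybrid idea (the step counts and tolerances are illustrative; production solvers such as SciPy's brentq add further safeguards):

```python
def hybrid_root(f, fprime, a, b, bisect_steps=5, tol=1e-12, newton_steps=20):
    """Bisection to shrink the bracket, then Newton's method to refine."""
    assert f(a) * f(b) < 0, "need a bracket [a, b] with a sign change"
    for _ in range(bisect_steps):            # slow but guaranteed phase
        m = (a + b) / 2
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
    x = (a + b) / 2                           # fast phase starts from the bracket midpoint
    for _ in range(newton_steps):
        x_next = x - f(x) / fprime(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Root of x**3 - x - 1 starting from the bracket [1, 2]:
print(hybrid_root(lambda x: x**3 - x - 1, lambda x: 3 * x**2 - 1, 1.0, 2.0))  # ~1.32472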
Solving Optimisation Problems Numerically
Newton's method applies directly to optimisation: to minimise f(x), find zeros of f'(x) by applying Newton's method to g(x) = f'(x). The iteration becomes xₙ₊₁ = xₙ − f'(xₙ)/f''(xₙ). This is Newton's method applied one level up — and it is the foundation of second-order optimisation algorithms used in machine learning.
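As a sketch, the optimisation form of the iteration might look like this, applied to the earlier example f(x) = x⁴ − 4x² (the function name newton_minimise is illustrative):

```python
def newton_minimise(fprime, fsecond, x0, steps=10):
    """Find a stationary point of f by applying Newton's method to f'."""
    x = x0
    for _ in range(steps):
        x = x - fprime(x) / fsecond(x)
    return x

# The earlier example: f(x) = x**4 - 4x**2, so f'(x) = 4x**3 - 8x and f''(x) = 12x**2 - 8.
x_star = newton_minimise(lambda x: 4 * x**3 - 8 * x,
                         lambda x: 12 * x**2 - 8,
                         1.5)
print(x_star)  # ~1.4142135623730951, i.e. sqrt(2), matching the worked example
```

Note that this iteration only looks for points where f'(x) = 0, so it can land on a maximum or a saddle just as easily as a minimum; checking that f''(x) > 0 at the answer confirms it is a minimum.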
Dr. Aisha Malik, PhD Mathematics
Senior Lecturer in Applied Mathematics · 12 years teaching calculus
Dr. Malik holds a PhD in Applied Mathematics from the University of Edinburgh and has taught calculus to over 4,000 students. She has reviewed this article for mathematical accuracy and pedagogical clarity.
Technically reviewed by: Prof. James Chen, Stanford Mathematics Department · April 2026