| Input | Result |
|---|---|
| d/dx[sin(x²)] | 2x·cos(x²) |
| d/dx[e^(3x+1)] | 3·e^(3x+1) |
| d/dx[(x²+5)⁶] | 12x·(x²+5)⁵ |
| d/dx[ln(cos x)] | −tan(x) |
The Chain Rule — Complete Guide
The Chain Rule differentiates composite functions: if y = f(g(x)), then dy/dx = f'(g(x))·g'(x). In words: differentiate the outer function (leaving the inner intact), then multiply by the derivative of the inner function. In Leibniz notation: dy/dx = (dy/du)·(du/dx) where u = g(x).
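To see the rule concretely, here is a short sympy check (an independent sketch, not this calculator's engine) confirming that f'(g(x))·g'(x) matches direct differentiation for the sample pair f(u) = sin(u), g(x) = x²:

```python
import sympy as sp

x, u = sp.symbols('x u')

# Sample composition for this sketch: outer f(u) = sin(u), inner g(x) = x**2.
outer = sp.sin(u)
inner = x**2

composed = outer.subs(u, inner)                               # sin(x**2)
direct = sp.diff(composed, x)                                 # differentiate the composition directly
chain = sp.diff(outer, u).subs(u, inner) * sp.diff(inner, x)  # f'(g(x)) * g'(x)

print(direct)                       # 2*x*cos(x**2)
print(sp.simplify(direct - chain))  # 0: both routes agree
```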
Identifying compositions: The Chain Rule is needed whenever the argument of a function is not plain x.
- sin(3x) — outer: sin, inner: 3x
- e^(x²) — outer: eˣ, inner: x²
- (2x+1)⁵ — outer: (·)⁵, inner: 2x+1
- √(x²+1) — outer: √·, inner: x²+1
Examples:
- d/dx[sin(5x)] = cos(5x)·5 = 5cos(5x)
- d/dx[e^(3x²)] = e^(3x²)·6x
- d/dx[(x³+1)⁴] = 4(x³+1)³·3x² = 12x²(x³+1)³
- d/dx[ln(cos x)] = (1/cos x)·(−sin x) = −tan x
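If you want to verify these four results yourself, a quick sympy sketch (again, an illustrative check rather than the calculator's internals):

```python
import sympy as sp

x = sp.Symbol('x')

# (expression, expected derivative) pairs taken from the examples above.
examples = [
    (sp.sin(5*x),       5*sp.cos(5*x)),
    (sp.exp(3*x**2),    6*x*sp.exp(3*x**2)),
    ((x**3 + 1)**4,     12*x**2*(x**3 + 1)**3),
    (sp.log(sp.cos(x)), -sp.tan(x)),
]

for expr, expected in examples:
    # simplify(...) == 0 confirms the computed derivative matches the expected form
    assert sp.simplify(sp.diff(expr, x) - expected) == 0
print("all four examples check out")
```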
Chain Rule with Product Rule: d/dx[x²·sin(x³)] = 2x·sin(x³) + x²·cos(x³)·3x² = 2x·sin(x³) + 3x⁴·cos(x³).
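The same kind of check works for the combined Product-plus-Chain example:

```python
import sympy as sp

x = sp.Symbol('x')

# Product Rule on x**2 * sin(x**3); the Chain Rule contributes the 3*x**2
# factor when differentiating sin(x**3).
expr = x**2 * sp.sin(x**3)
expected = 2*x*sp.sin(x**3) + 3*x**4*sp.cos(x**3)
print(sp.simplify(sp.diff(expr, x) - expected))  # 0: they match
```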
How to Use This Chain Rule Calculator
Enter your expression in the input box above using standard mathematical notation: ^ for exponents (e.g., x^3 for x³), * for explicit multiplication, and sin(), cos(), tan(), ln(), and sqrt() for the standard functions. Then click Calculate to get your answer with full step-by-step working.
This calculator handles polynomial, trigonometric, exponential, and logarithmic expressions, and combinations thereof. Results are shown in simplified form where possible, with each step of the working displayed below the answer.
For best results, enter expressions unambiguously. Use parentheses to group terms: (x^2 + 1)/(x - 1) rather than x^2+1/x-1, since the calculator follows the standard order of operations.
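This calculator's parser isn't reproduced here, but sympy's parse_expr with the convert_xor transformation accepts the same ^-style input and makes the grouping point visible; a sketch:

```python
from sympy import Symbol, diff
from sympy.parsing.sympy_parser import (parse_expr, standard_transformations,
                                        convert_xor)

x = Symbol('x')
# convert_xor lets ^ act as the exponent operator, matching calculator-style input.
transformations = standard_transformations + (convert_xor,)

grouped   = parse_expr("(x^2 + 1)/(x - 1)", transformations=transformations)
ungrouped = parse_expr("x^2 + 1/x - 1",     transformations=transformations)

print(grouped)           # (x**2 + 1)/(x - 1)
print(ungrouped)         # a different function from the grouped one
print(diff(grouped, x))  # the derivative of the grouped form
```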
The Chain Rule — Deep Understanding
The Chain Rule is calculus's answer to a fundamental question: how do you differentiate a function that is built by nesting one function inside another? The answer is elegant: differentiate the outer function (treating the inner as a single unit), then multiply by the derivative of the inner function. This "outer times inner derivative" structure reflects something deeper — the chain of dependencies between variables. If y depends on u, and u depends on x, then how fast y changes with x equals how fast y changes with u, scaled by how fast u changes with x.
In Leibniz notation this is: dy/dx = (dy/du)·(du/dx). This looks like fraction multiplication — the du's "cancel" — and while this isn't literally what happens (du is not a fraction), the notation captures the right intuition. The formal proof uses the definition of the derivative and requires a small technical argument to handle the case where g'(x) = 0, but the result itself is exactly what the notation suggests.
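You can watch the "rates multiply" intuition numerically with finite differences. In this sketch (assuming y = sin(u) and u = x², values chosen only for illustration), the slope of the composition matches the product of the two stage-wise slopes to within the approximation error:

```python
import math

# Assumed example for this sketch: y = sin(u), u = x**2, near x0 = 1.0.
x0, h = 1.0, 1e-6
u = lambda x: x**2
y = lambda v: math.sin(v)

du_dx = (u(x0 + h) - u(x0)) / h        # how fast u changes with x
dy_du = (y(u(x0) + h) - y(u(x0))) / h  # how fast y changes with u
dy_dx = (y(u(x0 + h)) - y(u(x0))) / h  # how fast y changes with x, directly

print(dy_dx)           # ~ 2*cos(1) = 1.0806...
print(dy_du * du_dx)   # essentially the same number
```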
Recognising Composite Functions
The critical skill is identifying compositions before differentiating. A function f(x) is a composition whenever evaluating it at a specific number, say x = 2, requires a two-step process: first compute something (the inner function), then apply another operation to that result (the outer function). For sin(x²): compute x² first (inner), then apply sin (outer). For e^(3x+1): compute 3x+1 first (inner), then exponentiate (outer). For √(x²−4): compute x²−4 first (inner), then take the square root (outer).
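The two-step evaluation is easy to see in code; a minimal sketch at x = 2 for the three examples above:

```python
import math

x = 2.0

# sin(x**2): step 1 computes the inner x**2 = 4.0, step 2 applies sin.
print(math.sin(x**2))

# e^(3x+1): inner 3x+1 = 7.0 first, then exponentiate.
print(math.exp(3*x + 1))

# sqrt(x**2 - 4): inner x**2 - 4 = 0.0 first, then the square root.
print(math.sqrt(x**2 - 4))
```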
Contrast with non-compositions: sin(x)·x² is a product, not a composition — both factors depend directly on x, not on each other. It needs the Product Rule, not the Chain Rule. The distinction is structural: are two functions layered (one inside the other), or side by side (multiplied, added, or divided)?
The Chain Rule in Multiple Dimensions
The Chain Rule generalises to multivariable calculus in an important way. If z = f(x,y) and both x and y are functions of t, then dz/dt = (∂f/∂x)·(dx/dt) + (∂f/∂y)·(dy/dt). Each path from t to z through an intermediate variable contributes one term — the partial derivative along that path times the rate of change along it. This multivariable Chain Rule is the mathematical heart of backpropagation in neural networks, where the loss function depends on weights through many layers of compositions.
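A sympy sketch of the multivariable rule, using an assumed example z = x²y with x = cos t and y = t² (chosen only for illustration): summing one term per path reproduces direct differentiation.

```python
import sympy as sp

t = sp.Symbol('t')
xs, ys = sp.symbols('x y')

# Assumed example for this sketch: z = f(x, y) = x**2 * y, x = cos(t), y = t**2.
f = xs**2 * ys
x_t, y_t = sp.cos(t), t**2
subs = {xs: x_t, ys: y_t}

# One term per path from t to z: (df/dx)*(dx/dt) + (df/dy)*(dy/dt).
via_paths = (sp.diff(f, xs).subs(subs) * sp.diff(x_t, t)
             + sp.diff(f, ys).subs(subs) * sp.diff(y_t, t))

# Substitute first, then differentiate directly in t.
direct = sp.diff(f.subs(subs), t)

print(sp.simplify(via_paths - direct))  # 0: the two computations agree
```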