The chain rule is the statement that differentiation $d : Diff \to Diff$ is a functor:
given two smooth functions between smooth manifolds $f : X \to Y$ and $g : Y \to Z$ we have

$$
  d(g \circ f) = d g \circ d f \,.
$$
If one thinks of a tangent vector $v \in T_x X$ as an equivalence class of smooth paths $\gamma_v : [-\epsilon,\epsilon] \to X$, for some $\epsilon \gt 0$, with $\gamma_v(0) = x$, then the chain rule is the associativity of the composite

$$
  [-\epsilon,\epsilon] \overset{\gamma_v}{\longrightarrow} X \overset{f}{\longrightarrow} Y \overset{g}{\longrightarrow} Z \,.
$$
Bracketed as $(g \circ f)\circ \gamma_v$ this represents $d(g \circ f)(v)$. Bracketed as $g \circ (f \circ \gamma_v)$ it represents $d g (d f (v))$.
Alternatively, in a context of synthetic differential geometry, where, with $D$ the infinitesimal interval, we may identify $v$ with a map $v : D \to X$, the chain rule is the associativity of

$$
  D \overset{v}{\longrightarrow} X \overset{f}{\longrightarrow} Y \overset{g}{\longrightarrow} Z \,.
$$
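As an illustrative aside (not part of the standard account), the tangent map $d f : (x, v) \mapsto (f(x), f'(x) \cdot v)$ and its compatibility with composition can be checked concretely with a minimal dual-number (forward-mode differentiation) sketch; the names `Dual`, `d_sin`, and `d_sq` are ad hoc choices for this example:

```python
import math

class Dual:
    """A tangent vector (x, v): a base point x together with a velocity v."""
    def __init__(self, x, v):
        self.x, self.v = x, v

def d_sin(t):
    # d(sin) : (x, v) |-> (sin x, cos(x) * v)
    return Dual(math.sin(t.x), math.cos(t.x) * t.v)

def d_sq(t):
    # d(x^2) : (x, v) |-> (x^2, 2x * v)
    return Dual(t.x ** 2, 2 * t.x * t.v)

# Functoriality: pushing a tangent vector through d(sq) and then d(sin)
# agrees with the tangent map of the composite sin(x^2).
t = Dual(0.7, 1.0)
lhs = d_sin(d_sq(t))                        # d(sin)(d(sq)(t))
rhs = Dual(math.sin(0.7 ** 2),              # (sin(x^2),
           math.cos(0.7 ** 2) * 2 * 0.7)    #  cos(x^2) * 2x * v), with v = 1
assert abs(lhs.x - rhs.x) < 1e-12 and abs(lhs.v - rhs.v) < 1e-12
```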
Let $X = Y = Z = \mathbb{R}$ be the real line. Then the tangent bundle $T X$ is canonically identified with $\mathbb{R} \times \mathbb{R}$.
Given two functions $f, g : \mathbb{R} \to \mathbb{R}$, their derivatives are traditionally regarded again as functions $f', g' : \mathbb{R} \to \mathbb{R}$, though strictly speaking we are to think of them as the maps $d f, d g : \mathbb{R} \times \mathbb{R} \to \mathbb{R} \times \mathbb{R}$
given by

$$
  d f : (x, v) \mapsto (f(x), f'(x) \cdot v)
$$

and

$$
  d g : (x, v) \mapsto (g(x), g'(x) \cdot v) \,.
$$

The composite $d g \circ d f$
is therefore the map

$$
  (x, v) \mapsto (g(f(x)), g'(f(x)) \cdot f'(x) \cdot v) \,.
$$

Therefore we have

$$
  (g \circ f)'(x) = g'(f(x)) \cdot f'(x) \,.
$$
This is the form in which the chain rule is usually introduced in elementary calculus.
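As a numerical sanity check (a sketch with the arbitrarily chosen functions $f = \exp$, $g = \sin$ and point $x = 0.3$), one can compare a central-difference approximation of $(g \circ f)'(x)$ with the product $g'(f(x)) \cdot f'(x)$:

```python
import math

f, fp = math.exp, math.exp   # f and its derivative f'
g, gp = math.sin, math.cos   # g and its derivative g'

x, h = 0.3, 1e-6
# Central difference approximating (g . f)'(x)
finite_diff = (g(f(x + h)) - g(f(x - h))) / (2 * h)
# The chain rule: g'(f(x)) * f'(x)
chain_rule = gp(f(x)) * fp(x)
assert abs(finite_diff - chain_rule) < 1e-6
```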
While the chain rule is of great theoretical importance, it is completely unnecessary for the working out of derivatives of elementary functions in ordinary calculus (including the multivariable case). Every result giving the derivative of an elementary function corresponds to a rule (along the lines of the product rule) for the application of that function to any expression. For example, instead of learning that

$$
  \sin' = \cos
$$

and then applying both this fact and the chain rule to find the derivative of an expression of the form $\sin u$, just learn

$$
  \frac{\mathrm{d}}{\mathrm{d}x} \sin u = \cos u \, \frac{\mathrm{d}u}{\mathrm{d}x}
$$

and apply this rule directly; the original fact is the special case in which $u \coloneqq x$. Even better, learn the rule as

$$
  \mathrm{d}(\sin u) = \cos u \, \mathrm{d}u \,;
$$
then it applies without further modification to multivariable calculus (as well as implicit differentiation, related rates, integration by substitution, and other stock features of one-variable calculus).
The chain rule could still be used in the proof of this ‘sine rule’. Even so, it is quite possible to prove the sine rule directly (much as one proves the product rule directly rather than using the two-variable chain rule and the partial derivatives of the function $x, y \mapsto x y$). In any case, the chain rule is not directly needed when working out specific derivatives. As a rule of differentiation, the chain rule is needed only when an unspecified differentiable function $f$ appears, and then may be given in the form

$$
  \frac{\mathrm{d}}{\mathrm{d}x} f(u) = f'(u) \, \frac{\mathrm{d}u}{\mathrm{d}x}
$$

or

$$
  \mathrm{d}(f(u)) = f'(u) \, \mathrm{d}u
$$

to match other rules.
Although the chain rule is often written as

$$
  \frac{\mathrm{d}y}{\mathrm{d}x} = \frac{\mathrm{d}y}{\mathrm{d}u} \, \frac{\mathrm{d}u}{\mathrm{d}x} \,,
  \qquad (1)
$$
this is an oversimplification. Upon choosing an independent variable, it is possible (and easy) to give a rigorous definition of the differential $\mathrm{d}u$, and then (1) is a triviality. However, with such a definition of differential, (1) is not the chain rule! The reason is that, when (1) is used as a mnemonic for the chain rule, ${\mathrm{d}u}/{\mathrm{d}x}$ uses $x$ as the independent variable, but ${\mathrm{d}y}/{\mathrm{d}u}$ uses $u$. Either may be chosen to define differentials^{1}, but one must (a priori) be consistent. Now, it so happens that the choice of independent variable is entirely irrelevant; differentials have the same meaning no matter which independent variable is used. But this fact requires proof; it is the chain rule (or at least a prerequisite for using (1) as the chain rule), and it is not a triviality. In this form, the chain rule is also known as Cauchy’s invariant rule.
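This invariance can be illustrated numerically (a sketch with the hypothetical choices $y = \sin u$, $u = x^2$): the differential $\mathrm{d}y$ comes out the same whether it is computed as $(\mathrm{d}y/\mathrm{d}u) \, \mathrm{d}u$ or as $(\mathrm{d}y/\mathrm{d}x) \, \mathrm{d}x$:

```python
import math

# Cauchy's invariant rule, numerically: with y = sin(u) and u = x^2,
# the differential dy has the same value via either independent variable.
x = 0.9
dx = 1e-3                  # an arbitrary increment of the independent variable x
u = x ** 2
du = 2 * x * dx            # du = (du/dx) dx, defined using x as independent variable
dy_via_u = math.cos(u) * du                  # dy = (dy/du) du
dy_via_x = math.cos(x ** 2) * 2 * x * dx     # dy = (dy/dx) dx
assert abs(dy_via_u - dy_via_x) < 1e-12
```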
On the Goodwillie chain rule in Goodwillie calculus:
John R. Klein, John Rognes, _A chain rule in the calculus of homotopy functors_, Geom. Topol. 6 (2002) 853–887 (arXiv:math/0301079)
Jacob Lurie, section 6.3 of Higher Algebra
Well, either may be chosen as long as its differential is nowhere zero, which is exactly what must be true for (1) to make sense. More precisely, given that $x$ works as an independent variable, the Chain Rule tells us that $u$ works just as well so long as $\mathrm{d}u$ (as defined using $x$) is nowhere zero. This may be related to the easy but wrong proof of the Chain Rule that founders on a division by zero in exactly the same place (where $\mathrm{d}u$ would be), although I don't see a direct connection. ↩