I am teaching Calc I, for the first time, and I haven't seriously revisited the subject in quite some time. An interesting pedagogy question came up: How misleading is it to regard $\frac{dy}{dx}$ as a fraction?

There is one strong argument against this: We tell students that $dy$ and $dx$ mean "a really small change in $y$" and "a really small change in $x$", respectively, but these notions aren't at all rigorous, and until you start talking about nonstandard analysis or cotangent bundles, the symbols $dy$ and $dx$ don't actually mean anything.

But it gives the right intuition! For example, the Chain Rule says $\frac{dy}{dx} = \frac{dy}{du} \cdot \frac{du}{dx}$ (under appropriate hypotheses), and it looks as if you just "cancel the $du$". You can't literally do this, but it is exactly this intuition that one turns into a proof, and indeed, if one assumes that $\frac{du}{dx} \neq 0$, this intuition gets you most of the way there.
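To spell out the sketch I have in mind (the hypothesis $\frac{du}{dx} \neq 0$ ensures $\Delta u \neq 0$ for all sufficiently small $\Delta x \neq 0$, so the finite-difference "cancellation" is legitimate):

```latex
% Chain rule via cancellation of finite differences,
% assuming \Delta u \neq 0 for small \Delta x \neq 0:
\[
  \frac{\Delta y}{\Delta x}
    \;=\; \frac{\Delta y}{\Delta u} \cdot \frac{\Delta u}{\Delta x}
  \;\xrightarrow{\;\Delta x \to 0\;}\;
  \frac{dy}{du} \cdot \frac{du}{dx},
\]
% since \Delta u \to 0 as \Delta x \to 0 by continuity of u.
```

The standard proof has to work around the case $\Delta u = 0$, which is exactly where the fraction picture breaks down.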

The debate about how rigorous to be when teaching calculus is old, and I want to steer clear of it. But this leaves an honest mathematical question: Is treating $\frac{dy}{dx}$ as a fraction the road to perdition, for reasons beyond the above that have not occurred to me? For example, what (if any) false statements and wrong formulas will it lead to?

(Note: Please don't worry, I have no intention of telling students that $\frac{dy}{dx}$ *is* a fraction; only, perhaps, that it can usually be treated as one.)