Teaching calculus with infinitesimals and differentials



ORIGINAL PAPER

Robert Ely¹

Accepted: 5 October 2020 · © FIZ Karlsruhe 2020

Abstract
Several new approaches to calculus in the U.S. have been studied recently that are grounded in infinitesimals or differentials rather than limits. These approaches seek to restore to differential notation the direct referential power it had during the first century after calculus was developed. In these approaches, a differential equation like dy = 2x·dx is a relationship between increments of x and y, making dy/dx an actual quotient rather than code language for lim_{h→0} [f(x+h) − f(x)]/h. An integral ∫_a^b 2x dx is a sum of pieces of the form 2x·dx, not the limit of a sequence of Riemann sums. One goal is for students to develop understandings of calculus notation that are imbued with more direct referential meaning, enabling them to better interpret and model situations by means of this notation. In this article I motivate and describe some key elements of differentials-based calculus courses, and I summarize research indicating that students in such courses develop robust quantitative meanings for notations in single- and multi-variable calculus.

Keywords  Calculus · Differentials · Infinitesimals · Nonstandard analysis · Definite integral
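To make the quotient and sum readings above concrete, here is a short worked illustration in the spirit of the abstract's example dy = 2x·dx (a sketch, not taken from the paper itself; it assumes y = x² and an infinitesimal increment dx):

```latex
% Sketch: dy/dx as an actual quotient, and the integral as a sum of pieces,
% assuming y = x^2 and an infinitesimal increment dx (illustrative, not from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
If $y = x^2$, an infinitesimal increment $dx$ in $x$ produces the increment
\[
  dy = (x + dx)^2 - x^2 = 2x\,dx + dx^2,
\]
so the quotient of increments is literally
\[
  \frac{dy}{dx} = 2x + dx \approx 2x,
\]
equal to $2x$ up to the negligible infinitesimal $dx$. Likewise, the integral
is read as a sum of the pieces $2x\,dx$, which telescopes the increments of $x^2$:
\[
  \int_a^b 2x\,dx = \sum 2x\,dx \approx \sum dy = b^2 - a^2,
\]
the two sums differing only by a negligible $\sum dx^2$.
\end{document}
```

Discarding dx² as negligible relative to dx is exactly the Leibnizian move these courses license, and the move that nonstandard analysis (see the keywords above) makes rigorous.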

1 Introduction

Long before there was The Calculus, there were the “differential calculus,” the “integral calculus,” and “the infinitesimal calculus,” several among many calculuses that emerged in the seventeenth and eighteenth centuries (Hutton 1795). The fundamental notations for these calculi, which are still the main notations of calculus today, were invented by Leibniz in the mid-1670s. He used the differential (e.g., dx) to denote an infinitesimal difference, and the integral sign ∫ (a big S, for summa) to denote an infinite sum of such infinitesimal quantities. Isaac Newton, of course, independently developed similar foundational ideas, although his notations are less commonly used today. During the first century in the life of what is now calculus, mathematicians on the continent used Leibniz’ notation, treating the differential calculus and infinitesimal calculus as being fundamentally about (unsurprisingly) differentials and infinitesimals. What exactly these continental mathematicians imagined “infinitesimals” to be is a matter of scholarly debate,1 but nonetheless two things are clear enough about their views:

* Robert Ely, [email protected]

¹ University of Idaho, Moscow, ID, USA

1. The fundamental object of the differential calculus was the differential, not the derivative, and certainly not the limit.
2. A differential represented an infinitesimal difference between two values of a variable.

By the end of the nineteenth century, various calculuses had become The Calculus. The differential was no longer the primary object, nor did it any longer refer to an infinitesimal. By the mid-twentieth century, the story had become about how infinitesimals were jettisoned due to their lack of rigor. Bertrand Russell sums up this oft-told tale