A correction to the definition of ∫f(x)dx


Abstract. This essay points out a mistake that appears widely in textbooks concerning the definition of the indefinite integral notation $\int f(x)\,dx$.

Some textbooks define $\int f(x)\,dx$ as a set of antiderivatives of $f$, such as Thomas' Calculus¹:

The collection of all antiderivatives of $f$ is called the indefinite integral of $f$ with respect to $x$, and is denoted by

$$\int f(x)\,dx$$

The symbol $\int$ is an integral sign. The function $f$ is the integrand of the integral, and $x$ is the variable of integration.

And also in another widely used book, James Stewart's Calculus²:

…an indefinite integral $\int f(x)\,dx$ is a function (or family of functions)…

Note the phrase enclosed in parentheses: a family of functions is a collection of functions whose equations are related³, hence the statement above also indicates that $\int f(x)\,dx$ represents a set.

However, $\int f(x)\,dx$ is also defined as a primitive function of $f$ in some books, such as Richard Courant & Fritz John's Introduction to Calculus and Analysis, Volume I⁴:

Every primitive function $F(x)$ of a given function $f(x)$ continuous on an interval can be represented in the form

$$F(x) = c + \phi(x) = c + \int_{a}^{x} f(u)\,du$$

where $c$ and $a$ are constants, and conversely, for any constant values of $a$ and $c$ chosen arbitrarily this expression always represents a primitive function.

And also in The Fundamentals of Mathematical Analysis, Volume 1, by G. M. Fikhtengol'ts⁵:

Consequently the expression $F(x) + C$, where $C$ is an arbitrary constant, is the general form of the function which has the derivative $f(x)$ or the differential $f(x)\,dx$. This expression is called the indefinite integral of $f(x)$ and is denoted by

$$\int f(x)\,dx$$

which implicitly contains the arbitrary constant.

So which definition of $\int f(x)\,dx$ is right?

There is no doubt that if $F(x)$ is an antiderivative of $f(x)$ on an interval $I$, then the general antiderivative of $f(x)$ on $I$ is $F(x) + C$, where $C$ is an arbitrary constant, and the set of antiderivatives of $f(x)$ on $I$ is $\{F(x) + C \mid C \in \mathbb{R}\}$. So if one insists that $\int f(x)\,dx$ represents the set of antiderivatives of $f(x)$, then there should be

$$\int f(x)\,dx = \{F(x) + C \mid C \in \mathbb{R}\}$$

But the conventional way of writing $\int f(x)\,dx$ in practice is

$$\int f(x)\,dx = F(x) + C$$

where $C$ is any real number, so that, for example,

$$\int e^{x}\,dx = e^{x} + C$$

$$\int \cos x\,dx = \sin x + C$$

etc. That is, $\int f(x)\,dx$ is NOT treated as a set of antiderivatives of $f(x)$ in actual use: in the equation $\int e^{x}\,dx = e^{x} + C$, the right-hand side is a single expression containing an arbitrary constant, not a set, so reading the left-hand side as a set would make the equation ill-typed. According to this de facto convention, $\int f(x)\,dx$ should be unambiguously defined as **the general form of the antiderivative of $f(x)$** rather than as a set of antiderivatives of $f(x)$.
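Incidentally, computer algebra systems follow the same convention. The short SymPy sketch below (an illustration added here, not part of the cited sources) shows that `integrate` returns a single antiderivative expression, leaving the arbitrary constant implicit, rather than returning any set of functions:

```python
import sympy as sp

x = sp.Symbol('x')

# integrate returns one antiderivative expression, not a set
# (the arbitrary constant C is left implicit, as in F(x) + C):
F1 = sp.integrate(sp.exp(x), x)   # exp(x)
F2 = sp.integrate(sp.cos(x), x)   # sin(x)

# The result is a single symbolic expression, so it can be
# differentiated, evaluated, and so on; none of that would
# make sense if the integral denoted a set of functions.
assert isinstance(F1, sp.Expr)
assert sp.diff(F1, x) == sp.exp(x)
assert sp.diff(F2, x) == sp.cos(x)
print(F1, F2)
```

This matches the "general form of the antiderivative" reading: the output is one expression standing for the whole family, not the family itself as a set object.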


  1. Thomas' Calculus, 14th Edition, Joel R. Hass, Christopher E. Heil, Maurice D. Weir, p. 236 ↩︎

  2. Calculus, 8th Edition, James Stewart, p. 331 ↩︎

  3. Calculus, 8th Edition, James Stewart, p. 29 ↩︎

  4. Introduction to Calculus and Analysis, Volume I, Reprint of the 1989 edition, Richard Courant, Fritz John, p. 188 ↩︎

  5. The Fundamentals of Mathematical Analysis, Volume 1, 1st Edition, G. M. Fikhtengol'ts, p. 300 ↩︎
