## March 19, 2012

### Reader Survey: log|x| + C

#### Posted by Tom Leinster

The semester is nearly over here — just one more week of teaching to go! I’m profoundly exhausted, but as the end comes into sight, I feel my spirits lifting. As soon as it’s over, I’ll be heading to Ohio to spend a couple of weeks working with Mark Meckes. The trip is close enough now that I’m starting to get that excited anticipation; soon I’ll be back exploring the wide world of new ideas.

But not so fast: there’s one teaching-related matter to deal with first.

Have you ever taught calculus? If so, what did you tell your students was the answer to $\displaystyle\int \frac{1}{x} d x$?

Here we tell them that it’s $\log|x| + C$, where $C$ is the famous ‘constant of integration’. I’m pretty sure that’s what I was taught myself.

But it’s wrong. At least, it’s wrong if you interpret the question and the answer in what I think is the obvious way. It’s wrong for reasons that won’t surprise many readers, and although I’ll explain those reasons, I don’t think that’s such an interesting point in itself.

What I’m more interested in hearing about is the pedagogy. If you think it’s bad to teach students things that are flat-out incorrect, what do you do instead? I’m not talking about advanced students here: these are 17- and 18-year-olds, many of whom won’t take any further math courses. What do you tell them about $\displaystyle\int \frac{1}{x} d x$?

Here, we tell our students explicitly that to ‘solve’ an indefinite integral $\displaystyle\int f(x) d x$ is to find the general antiderivative of $f$, that is, to find the general solution $F$ to the differential equation $F' = f$. So, when one says that $\displaystyle\int \frac{1}{x} d x = \log|x| + C$, one is saying that the general solution $F$ to $F'(x) = 1/x$ is $F(x) = \log|x| + C$, where $C$ is a constant.

This is simply not the case. The general solution is

$F(x) = \begin{cases} \log|x| + C^- &\text{if }  x \lt 0\\ \log|x| + C^+ &\text{if }  x \gt 0 \end{cases}$

where $C^-$ and $C^+$ are constants. So, the space of solutions is two-dimensional, not one-dimensional.

It’s implicit here that $f$ and $F$ are supposed to be real-valued functions defined on $\mathbf{R}\setminus\{0\}$. Courses at this level don’t usually pay much attention to domains and codomains, but since the question itself involves a term $1/x$, it’s clear that the value $x = 0$ is forbidden.

If we ignore the concerns of teaching for a moment, probably the best way to say it is that the general antiderivative of $1/x$ on $U = \mathbf{R}\setminus\{0\}$ is $x \mapsto \log|x| + C$, where $C$ is not a constant but a locally constant function on $U$.

More generally, if $U$ is an open subset of $\mathbf{R}$ then the functions $F\colon U \to \mathbf{R}$ satisfying $F' = 0$ are exactly the locally constant functions. The dimension of the space of solutions is, therefore, the number of connected components of $U$. So, if $f\colon U \to \mathbf{R}$ is a function with at least one antiderivative, then the dimension of the space of antiderivatives is also the number of connected components of $U$. In the case at hand, it’s two.
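None of this needs machinery to check. Here is a quick numerical sanity check in Python (an illustration of my own, with the two constants chosen arbitrarily): a function equal to $\log|x|$ plus a *different* constant on each component still has derivative $1/x$ everywhere on its domain.

```python
import math

def F(x, c_neg=7.0, c_pos=-3.0):
    """log|x| plus a *different* arbitrary constant on each
    component of the domain (the negative and positive reals)."""
    return math.log(abs(x)) + (c_pos if x > 0 else c_neg)

def deriv(f, x, h=1e-6):
    # symmetric difference quotient, a standard numerical derivative
    return (f(x + h) - f(x - h)) / (2 * h)

# F is not log|x| + C for any single constant C, yet F'(x) = 1/x
# on both components of the domain:
for x in (-2.0, -0.5, 0.5, 3.0):
    assert abs(deriv(F, x) - 1.0 / x) < 1e-5
```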

As I said, none of that is profound or difficult. All the same, it came as a bit of a shock to learn that the hallowed formula ‘$\log|x| + C$’ that I’ve carried around in my head for so long isn’t really the correct answer to anything — at least, not if $C$ is a constant.

So what do we do about it?

I’m all for giving informal explanations. I learned to differentiate before I knew the definition of differentiation, and I learned the definition of differentiation before I saw a rigorous treatment of the real numbers. That’s how teaching traditionally goes at this level. We don’t work our way through Bourbaki.

But I don’t like the idea of teaching things that are outright wrong. So, I don’t want to tell my students that $\displaystyle\int \frac{1}{x} d x = \log|x| + C$ where $C$ is a constant.

What do we do instead? Are we really going to tell these students — who, remember, might be 17 years old and not interested in mathematics at all — that the constant of integration $C$ is actually a ‘constant that varies’? Do we give them the explicit formula

$\int \frac{1}{x} d x = \begin{cases} \log|x| + C^- &\text{if }  x \lt 0\\ \log|x| + C^+ &\text{if }  x \gt 0 \end{cases}$

where $C^-$ and $C^+$ are constants? Or do we simply cop out, by avoiding integrating $1/x$ over disconnected domains?

I think I know what I think — but I want to hear your answers first.

Posted at March 19, 2012 5:16 AM UTC

## 74 Comments & 0 Trackbacks

### Re: Reader Survey: log|x| + C

Simplest thing to do, I think, is think of C as being locally constant, rather than globally constant. (After all, this is how one would need to generalise the fundamental theorem of calculus to disconnected domains anyway.)

Posted by: Terry on March 19, 2012 5:39 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Right, that’s how I think of it. But are you saying that’s how you’d put it to beginning calculus students? Would you actually use the word “locally” and explain what it meant, or would you find some other way?

Of course, they don’t know what “disconnected” means. Connectedness is an easy enough thing to explain intuitively, but there’s a question of how much time can be spent on explaining just one integral.

Posted by: Tom Leinster on March 19, 2012 6:07 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Students seeing integral calculus for the first time should understand what it means for a function to be continuous on an interval. The standard approach is just to point out that an equation which involves an indefinite integral, such as

$\int f(x) dx = F(x) +C$

only holds on an interval where $f(x)$ is defined and continuous. Then you can explain why the constant is an issue by taking a few cases like $f(x) = \ln(x)$ or $f(x) = \frac{x^2}{1-x^2}$ and explicitly verifying by counterexample that $C$ is not globally defined in general.

In my experience, students have no problem picking up on the idea that the constant $C$ is locally defined, without any need to make such a definition. A wise man once told me that students have no problems with abstraction, so long as they’re comfortable with the concrete cases they’re abstracting from.

Posted by: Brendan Cordy on March 21, 2012 5:26 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

My approach is basically what you called a cop out: to avoid integrating $1/x$ — or indeed any other function — over a disconnected domain. My justification for this is that the fundamental theorem of calculus, which most students at the level you’re talking about quickly start to believe is the definition of a definite integral, only holds over an interval.

Here’s a trickier cousin of your example to consider. Blindly attempting to apply the fundamental theorem of calculus, you might write $\int_{-1}^1 \frac{1}{x^2} dx = -\frac{1}{x}\bigr]_{-1}^1 = -2.$ This is clearly wrong — if the integral exists at all, its value must be positive. Searching for the mistake, if you’re not sophisticated enough to think you’d better check the hypotheses of the fundamental theorem, you might think you had the antiderivative wrong. And in fact you do have “the” antiderivative wrong — the real antiderivative of $1/x^2$ is $-1/x + C$, where $C$ is a locally constant function on $\mathbb{R}\setminus \{ 0 \}$. By adjusting the values of $C$ on the two components of the domain, you can make the formal calculation above yield any finite value, but not the correct value of $+\infty$.
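The wrong calculation and the blow-up are easy to replay in Python (a sketch of my own; the cutoff values below are arbitrary choices):

```python
def F(x):
    """-1/x, an antiderivative of 1/x**2 on each component
    of the nonzero reals."""
    return -1.0 / x

# Blindly applying FTC across the singularity at 0 gives a negative
# answer for a strictly positive integrand:
naive = F(1) - F(-1)
assert naive == -2.0

# Doing it honestly on [eps, 1] alone: the value 1/eps - 1 grows
# without bound as eps -> 0, so the integral diverges to +infinity.
partials = [F(1) - F(eps) for eps in (1e-1, 1e-3, 1e-6)]
assert [round(p) for p in partials] == [9, 999, 999999]
```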

As far as I can see the best solution at this level is to keep integration essentially restricted to intervals on which the integrand is everywhere defined.

Posted by: Mark Meckes on March 19, 2012 6:06 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Nice! I’m sure I’ve seen that example before, but I’d forgotten it.

Posted by: Tom Leinster on March 19, 2012 6:22 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Someone told me once that an early version of one of the popular commercial computer algebra systems made precisely that mistake.

Posted by: Mark Meckes on March 19, 2012 6:30 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I teach my students that $\log|x|$ is an anti-derivative of $\frac{1}{x}$. I rarely ask them to give all anti-derivatives of a given function.

And I tell them that whenever the integrand becomes singular, this means that there are some hidden limits. Thus, I’d say that $\int_{-1}^1\frac{1}{x^2}dx$ is defined as $\int_{-1}^1\frac{1}{x^2}dx = \lim_{\epsilon\to0+} \int_{-1}^{-\epsilon}\frac{1}{x^2}dx + \lim_{\delta\to0+} \int_{\delta}^{1}\frac{1}{x^2}dx.$ Now they can use any two anti-derivatives of $\frac{1}{x^2}$ in order to evaluate the definite integrals, and will always get the correct answer $\infty$.
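Spelling out the second of those limits with the anti-derivative $-\frac{1}{x} + C$ (any choice of $C$ cancels, which is the point):

$\lim_{\delta\to0+} \int_{\delta}^{1}\frac{1}{x^2}dx = \lim_{\delta\to0+}\left[-\frac{1}{x} + C\right]_{\delta}^{1} = \lim_{\delta\to0+}\left(\frac{1}{\delta} - 1\right) = \infty.$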

Posted by: Stefan Keppeler on March 19, 2012 5:40 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

It’s been several years since I taught that material, and on reflection I think what I’ve actually done in the past is essentially what you describe. However, the issue remains that many textbooks (although maybe not you) state that if $F$ is an antiderivative of $f$, then the general antiderivative is $F + C$ where $C$ is an arbitrary constant, and as Tom points out that’s simply not true.

Posted by: Mark Meckes on March 19, 2012 9:06 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

> (although maybe not you)

Oh, I’m pretty sure I also taught the wrong statement before.

Posted by: Stefan Keppeler on March 20, 2012 10:14 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Interesting question! I don’t understand what the “cop out” option means, though. The question is about an indefinite integral, which doesn’t come with any domain at all. Right? If you’re talking about definite integrals, then I’ve never even tried (or seen any beginning calculus class that tried) to teach people to integrate over any domain other than a closed interval, and I think Mark is exactly right that then the integrand should be defined over the entire interval. But I don’t see how that solves the problem of what to tell them the indefinite integral of $1/x$ is.

Posted by: Mike Shulman on March 19, 2012 3:29 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I suppose by “cop out” I meant sticking to definite integrals over subintervals of either $(0, \infty)$ or $(-\infty, 0)$. It’s a cop-out because we’re deliberately not mentioning the indefinite integral of $1/x$. (Of course, cop-outs are sometimes pedagogically wise.)

As Mark says, students at this level tend to treat integration as the inverse of differentiation anyway. So to them, the difference between an indefinite integral $\displaystyle\int f(x) d x$ and a definite integral $\displaystyle\int_a^b f(x) d x$ isn’t so great.

To do the first, you (i.e. they) find some $F$ whose derivative is $f$, and then the “answer” is $F(x) + C$. (And only the most conscientious bother writing “where $C$ is the constant of integration”.) To do the second, you do the same thing but without mentioning a $C$, and then calculate $F(b) - F(a)$. I’d imagine that as far as they’re concerned, the main difference between indefinite and definite integrals is that in the second case, you have to do this extra little step of calculation.

Posted by: Tom Leinster on March 19, 2012 3:38 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I once taught an honors second-term calculus course for new students — these were students just beginning university who had a lot of mathematical ability and a good but not extensive high school math background. On the first day I gave them a quiz to find out which topics they had and hadn’t seen already, and to check their understanding of the fundamentals. Among the questions I asked were for the meanings of the terms “definite integral” and “indefinite integral”. All but two of them gave answers along the lines of “an integral with limits” and “an integral without limits”.

I decided to devote the second class to clearing that up before moving on.

Posted by: Mark Meckes on March 20, 2012 4:15 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I don’t see how you avoid mentioning the indefinite integral of $1/x$. Are you saying that you just don’t talk about indefinite integrals at all, i.e. only write down the symbol $\int$ when it has an upper and a lower limit? I like that idea a lot, but it would be difficult to do with all the calculus textbooks that I’ve used.

Posted by: Mike Shulman on March 19, 2012 11:15 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I just meant not mentioning the indefinite integral of $1/x$ (or similarly problematic functions). Mentioning definite integrals of $1/x$ over intervals not including $0$, and mentioning indefinite integrals of nicer functions, should be OK. I’m not sure that’s a good solution; I was just presenting it as an option.

In fact, my role this semester has just been to teach exercise classes, not lecture, so I don’t get to decide how things are presented anyway. (We don’t tend to use books here to nearly the same extent as I gather is normal in the US. While that certainly has some disadvantages, it does have the advantage that the lecturer isn’t constrained by what the book does.)

Posted by: Tom Leinster on March 19, 2012 11:34 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I guess I’m just lazy, or a heretic. I tend to tell them the integral of 1/x is log(x) + C, no absolute values. So implicitly I’m telling them not to play with x < 0. But yes, if you’re going to claim it’s log|x|, then something has to be done about the non-constant constant. Of course, all this gets even messier when you’re solving simple differential equations. At that point the non-constant constant often turns into a not-necessarily-positive e^C.

Posted by: Fernando on March 19, 2012 4:23 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Thanks for this post, I had never even realized this issue.

One sneaky way out would be to speak of “a” general antiderivative, rather than “THE” general antiderivative. log|x|+C is a general antiderivative of 1/x, just not the most general possible one.

To support this, look at why exactly we teach freshmen constants of integration in the first place. It’s not for the sake of calculating definite integrals– the constants cancel there, provided the fundamental theorem is actually applicable (as opposed to Stefan Keppeler’s brilliant example). Nor (apparently) is it for the sake of the deeper theory, since as this blog post points out, if that’s the purpose, we’re doing it wrong! We teach constants of integration to make initial-value differential equations solvable. For that purpose, only “a” general antiderivative is needed, rather than “the most” general one.

(At least provided the question makes sense, not like “A car’s velocity function is 1/t, at time t=-1 it has position 0, find its position at time t=1”)

Posted by: Sam Alexander on March 19, 2012 9:19 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

On a somewhat tangential note, in physics applications, it really should be $\log(Cx)$, since if $x$ is a physical quantity such as length, $\log x$ doesn’t actually make sense.
Posted by: Harald Hanche-Olsen on March 19, 2012 10:36 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Posted by: Tom Leinster on March 19, 2012 11:07 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

While I’m at it, here’s another piece of widely taught wrongitude:

Definition  A group is a set $G$ together with a binary operation $\star$, such that:

1. for all $x, y, z \in G$,   $(x \star y) \star z = x \star (y \star z)$
2. there exists $e \in G$ such that for all $x \in G$,   $x \star e = x = e \star x$
3. for all $x \in G$ there exists $x' \in G$ such that $x \star x' = e = x' \star x$.

This seems to be quite common. It’s the “definition” contained in the notes for a course I took over this year. It’s also the “definition” I was given as an undergraduate myself. I remember thinking at the time that this was the height of logical rigour. I only realized years later that it was, in fact, nonsense.

It’s (3) that doesn’t make sense. It refers to an element called “$e$”, but no element called $e$ has been defined. Presumably it’s meant to refer to “the” element $e$ satisfying the condition in (2), but in principle, there could be many such elements $e$ — that is, many identity elements. In principle, (3) might be true for some identity elements and not others. Is (3) required to hold for all identity elements $e$? Or just at least one of them? Can the identity element(s) for which it holds depend on $x$? And so on: it’s just not a logically well-formed statement.
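Written with explicit quantifiers, the trouble is visible at a glance: (2) and (3) are

$\text{(2)}\quad \exists e \in G\ \forall x \in G:\ x \star e = x = e \star x, \qquad \text{(3)}\quad \forall x \in G\ \exists x' \in G:\ x \star x' = e = x' \star x,$

and the $e$ in (3) falls outside the scope of the $\exists e$ in (2), so it is simply a free variable that has never been bound.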

Another way to describe the problem: (2) is logically equivalent to:

there exists $f \in G$ such that for all $x \in G$,   $x \star f = x = f \star x$.

If we replace (2) by this statement (which we can, because it’s logically equivalent) then (3) is clearly nonsensical — the letter $e$ hasn’t even been used before.

I know two and a half ways to say it properly. The first is to pause after (1) and (2), prove that identities are in fact unique, then proceed to (3). The second is to say “a group is a set $G$ together with a binary operation $\star$ and an element $e$ satisfying…”. The two-and-a-halfth is the same, but including inverses as part of the structure too (rather than asserting the existence of inverses as a property). Whichever way you do it, you’re going to have to prove statements on uniqueness of identities and inverses sooner or later, anyway.

I don’t find this as pedagogically challenging as $\log|x| + C$, because in this case I can see several ways of presenting the material that I’m comfortable with.

Posted by: Tom Leinster on March 19, 2012 11:51 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I am aghast that I never noticed this before. I believe I’ve seen textbooks that give the definition correctly, but I know I’ve seen textbooks that do it incorrectly, and that I’ve failed to notice the difference and taught the wrong version to students.

Posted by: Mark Meckes on March 20, 2012 1:06 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Wow. When I read what you just wrote, I immediately thought “would any experienced mathematician actually write that?” But then I went and picked up the first abstract algebra book I owned (Dummit & Foote) and was horrified to discover that that’s the definition they give! (My first abstract algebra class didn’t follow the book, though, and I don’t think I still have the notes, so I can’t say what definition I was taught.)

How can anyone write that and fail to notice that it is nonsense? Maybe mathematicians are actually constructive type theorists and don’t know it.

By the way, I think your first proper way to say it is inferior to the other one-and-a-half. As I’m sure you know, the identity, and perhaps the inverses too, should be considered structure rather than a property, even though they are unique as soon as they exist — because you want them to be preserved by homomorphisms. In the case of groups, by accident it happens that preserving the multiplication implies preserving the identity and inverses, so you can get away with considering the other two as properties — but it’s not a good habit to get into. (-:

Posted by: Mike Shulman on March 20, 2012 1:38 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

What a good game! I hadn’t thought of looking at the definitions in textbooks.

I just tried it with the first three relevant books I could find: Lang’s Algebra, Herstein’s Topics in Algebra, and Allenby’s Rings, Fields and Groups. Lang, unsurprisingly, does it respectably: he defines monoid first, then says that a group is a monoid with inverses (thus taking your least favourite correct route). But Herstein and Allenby both fall into the trap I described.

Posted by: Tom Leinster on March 20, 2012 1:49 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Loyal category theorists will wince: Mac Lane and Birkhoff get it wrong in their text Algebra, too. They employ a curious phrasing:

A group $G$ is a set $G$ together with a binary operation $G \times G \to G$, written $(a, b) \mapsto a b$, such that

1. this operation is associative;
2. there is an element $u \in G$ such that $u a = a = a u$ for all $a \in G$;
3. for this element $u$, there is to each element $a \in G$ an element $a' \in G$ with $a a' = u = a' a$

(my emphasis). It’s almost as if they know it’s not legit. But they don’t get around to remarking that identities are unique for another four pages.

Posted by: Tom Leinster on March 20, 2012 1:55 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I think the definition of Birkhoff and Mac Lane is not as bad as the first version, because there are many times in mathematics when we give definitions that depend on choices in order to be well defined, and then we prove that they are well defined. So I think the Birkhoff–Mac Lane definition would be legit if they immediately followed it up with a proof that e is unique.
Posted by: Benjamin Steinberg on March 20, 2012 4:18 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

> I hadn’t thought of looking at the definitions in textbooks.

Wow, that really does prove the truth of your assertion that

> We don’t tend to use books here to nearly the same extent as I gather is normal in the US.

The definition of Mac Lane and Birkhoff really sounds to me like they are being constructive type theorists and not realizing it. Fundamentally, it seems to me as though we want to write all definitions like a dependent record type in Coq or Agda, treating structure and axioms on the same footing, and where each item in the definition is allowed to depend on the items that come before it.

To be honest, I’ve actually started writing definitions in a style like that myself. It seems fine to me to write

A group is a set $G$ together with the following structure and axioms.

1. A binary operation $G\times G\to G$, written $(a,b)\mapsto a\cdot b$.
2. This operation is associative.
3. An element $e\in G$ such that $a\cdot e = a = e\cdot a$ for all $a\in G$.
4. For each $g\in G$, an $h\in G$ such that $g\cdot h = e = h\cdot g$.

In other words, I think it’s fine to mix structure and properties as long as the wording makes it clear which is which. Do you agree?
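For instance, rendered as a dependent record in Lean 4 (a sketch; the field names are arbitrary), the definition above becomes:

```lean
-- Structure and axioms mixed on the same footing: fields 1 and 3 are
-- structure, the others are axioms, and each field may refer to the
-- fields before it -- in particular `e` is in scope for the inverse laws.
structure RawGroup (G : Type) where
  mul       : G → G → G                                    -- 1. binary operation
  assoc     : ∀ a b c, mul (mul a b) c = mul a (mul b c)   -- 2. associativity
  e         : G                                            -- 3. identity element
  id_left   : ∀ a, mul e a = a
  id_right  : ∀ a, mul a e = a
  inv       : G → G                                        -- 4. inverses
  inv_left  : ∀ a, mul (inv a) a = e
  inv_right : ∀ a, mul a (inv a) = e
```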

Posted by: Mike Shulman on March 20, 2012 4:40 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Oh yes, I think it’s fine without question. And I don’t strenuously object to definitions in the style of “a set $X$ is called a semigroup if there is an associative binary operation on $X$”. I wouldn’t write it like that myself and I guess it’s a bad influence on the innocent, but it is at least clear what’s meant, whereas for the group one there really is an issue of “which identity $e$ do you mean?” until uniqueness of identities has been proved.

Posted by: Tom Leinster on March 20, 2012 9:10 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Speaking as a constructive type theorist, you mean most mathematicians don’t do that?

I had assumed that what you describe was completely ordinary mathematical practice, since (a) the category of sets is complete and co-complete, which lets us interpret sigma- and pi-types in the obvious way without any problems, and (b) it has a truth-values object, which lets you treat properties as structure via characteristic functions.

IOW, I thought of dependent sums/products as giving a more hygienic syntax for defining indexed families of sets, in the same way that lambda-abstraction is a more hygienic syntax for defining functions.

Posted by: Neel Krishnaswami on March 22, 2012 11:52 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

To most mathematicians, “there exists” means that something exists but is not specified as data. Basically no one besides constructive type theorists uses unmodified propositions-as-types logic; the existential quantifier is interpreted not by a Sigma-type but by the squashing thereof.

Posted by: Mike Shulman on March 22, 2012 4:01 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

There’s another way of fixing it that I prefer: it seems that it is correct if you interpret the existential quantification to range over 3 as well as the second half of 2.

Posted by: Tom Ellis on March 20, 2012 7:32 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

> interpret the existential quantification to range over 3 as well as the second half of 2.

How would you indicate to the reader the intended scope of the quantifier? For instance, the universal quantifier in 1 does not range over 2 and 3 as well.

Posted by: Mike Shulman on March 20, 2012 6:00 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

In fact, Wikipedia also gets it wrong. The unit axiom is awkwardly phrased in that it suddenly turns an indefinite article into a definite one (emphasis mine):

Identity element: There exists an element $e$ in $G$, such that for every element $a$ in $G$, the equation $e\cdot a = a \cdot e = a$ holds. The identity element […]

What do you think about inserting here, after the period, something along the lines of “Such an $e$ is called an identity element and can be shown to be unique”?

Posted by: Tobias Fritz on March 21, 2012 7:07 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I vaguely remembered that I was given a more carefully phrased version of the axioms but was not sure. So I looked it up in Lüneburg (Vorlesungen über Lineare Algebra) and indeed the conditions are given as follows (my translation):

(a) the binary operation on G is associative
(b) there is an element e in G such that
(b1) ea = a for all a in G
(b2) for each a in G there is some b in G with ba = e

Posted by: Marc Olschok on April 2, 2012 6:27 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

> I know two and a half ways to say it properly. The first is to pause after (1) and (2), prove that identities are in fact unique, then proceed to (3). The second is to say “a group is a set $G$ together with a binary operation $\star$ and an element $e$ satisfying…”. The two-and-a-halfth is the same, but including inverses as part of the structure too (rather than asserting the existence of inverses as a property). Whichever way you do it, you’re going to have to prove statements on uniqueness of identities and inverses sooner or later, anyway.

There is another way, taken, I think, by van der Waerden; it is to replace (2) and (3) by:

($(2 + 3)'$) For all $x, y \in G$, there is a unique solution $z \in G$ to $x \star z = y$.

This is a very specific work-around that builds in uniqueness, and thus doesn’t address the spirit of your question; but I rather like the way that it unifies the original (2) and (3).

Posted by: L Spice on December 23, 2013 2:09 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I’m still thinking about my answer to your original question, but I thought I would point out that the problem is not confined to $\int \frac{1}{x}$, but spreads out from there to infect other antiderivatives as well, sometimes even more virulently. For instance, we teach that

$\int \frac{1}{\cos(x)}\, dx = \log{|\tan x|} + C$

but this is (if possible) even wronger than $\int \frac{1}{x}\, dx = \log{|x|} + C$, since in this case the domain has infinitely many connected components and the constant could be different on each one.

Perhaps the real culprit of this problem is the reprehensible habit of implicitly assuming the domain of every function to be “the largest subset of the real line on which the given algebraic expression is defined”. But that’s probably not going to go away any time soon. (-:

Posted by: Mike Shulman on March 20, 2012 1:53 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Yes, good point. We get an infinite-dimensional space of antiderivatives before we know it.

Seeing as I’m in a mood for going round obnoxiously pointing out other people’s mistakes (and thereby inevitably setting myself up for a fall), I can’t resist pointing out that you probably meant either

$\int \frac{1}{\cos(x)} \;d x = \log\;{|\sec(x) + \tan(x)|} + C$

or

$\int \tan(x) \;d x = \log\;{|\sec(x)|} + C.$

But the point you were making is unaffected.

Posted by: Tom Leinster on March 20, 2012 2:08 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Blarg. One year goes by without teaching any calculus and I forget all my derivatives. That doesn’t bode well.

Posted by: Mike Shulman on March 20, 2012 4:31 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I lectured the same first-year calculus course three years running. It included a section on solving ordinary differential equations, involving all the usual basic tricks and techniques. Every year I taught it, every year I duly forgot it, and every year I had to re-learn it. It’s like weightlifters who are ripped in season and flabby for the rest of the year. Get me at the right time of year and I can do these things, get me at the wrong time of year and it’s hopeless.

Posted by: Tom Leinster on March 20, 2012 9:16 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Good evening shulman and leinster,

How do you remember all your derivatives in the first place? They just look like a bunch of symbols to me. I’m not a mathematician so I’m curious to know.

Posted by: kim on March 22, 2012 3:17 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

You could give the traditional definition given to students at this level, and then end by saying, “What I just said is not technically correct. The real situation is actually more complicated than I described. If you are interested in learning more about it, you can see me after class.” This would be enough to pique the curiosity of the few students actually interested. If I were that age, I would be eager to ask the teacher why the supposed correct answer is actually wrong. That way, you would not be wasting the time of, or causing distraction or confusion for, all the other students who could not care less.

Posted by: Jeffery Winkler on March 20, 2012 10:25 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I suppose I have a resistance to saying things that are “not technically correct”, in that “not technically correct” means “wrong”. A large part of the battle at this level of teaching, I find, is to train students not to write down things that are wrong.

For example, I routinely apply red ink to expressions such as

$\ldots = \sqrt{2} = 1.414,$

telling them that $\sqrt{2}$ is no more equal to $1.414$ than $0$ is equal to $1$. I don’t want them to be able to come back to me and say “well, I know it’s not technically correct. But you sometimes write things that aren’t technically correct, so why can’t I?”

Posted by: Tom Leinster on March 21, 2012 1:51 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Let’s say you are teaching a class in classical Newtonian mechanics. Everything you would say in the class would be technically wrong since you would be ignoring special relativity, general relativity, quantum mechanics, and quantum field theory. Would you refuse to write a non-relativistic classical equation on the board because it’s wrong?

Posted by: Jeffery Winkler on April 1, 2012 7:06 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Well, I’ve never taught physics. If I were to do so, I imagine I would want to make clear that physical models tend to be only approximations to reality. But mathematics is a different matter.

Posted by: Tom Leinster on April 1, 2012 8:35 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I agree with what Tom said, but I think it’s worth emphasizing also that everything you say in a class on Newtonian mechanics can be technically correct as a description of Newtonian mechanics. Newtonian mechanics is a well-defined theory, even if it is only an approximation of “reality”.

Posted by: Mike Shulman on April 2, 2012 3:24 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I suppose I have a resistance to saying things that are “not technically correct”, in that “not technically correct” means “wrong”.

Yeah, I’m with you. It’s okay to lie in a seminar talk, since you’re talking to mathematicians who know perfectly well what it means to be precise, but when we’re trying to teach people to think mathematically, it’s dangerous to allow ourselves to be sloppy.

Posted by: Mike Shulman on March 21, 2012 4:01 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

It’s okay to lie in a seminar talk

I agree, but I’m glad you said that. I’ll be introducing you when you give your seminar talk in Glasgow, and I’d been wondering what to say. Now I know :-)

Posted by: Tom Leinster on March 21, 2012 4:09 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I think my inclination is to go the “constant that varies” route. But I would not say “constant that varies” or “locally constant function”, or introduce the notion of connectedness — I would just say that the constant can be different on each interval in the domain, but that we still just write $C$ even when the domain consists of more than one interval. And emphasize, when we come to FTC, that it is only valid when the function is continuous over the whole interval (so that in particular the constant in any antiderivative must be the same at both ends).

I think this is consistent with the level at which we talk about most things in a calculus class; all subsets of the real line we think about are disjoint unions of intervals, and any theorems that we state explicitly are usually stated in terms of functions defined and continuous on intervals.

Now I want to hear what you (Tom) think!

Posted by: Mike Shulman on March 21, 2012 12:24 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Well, I’m afraid I don’t think anything very interesting, and it’s been altered a bit by what people have said here. When I wrote the post, my opinion was that one should avoid as far as possible talking about the indefinite integral of $1/x$ (and $1/\cos(x)$ etc), but if necessary, giving the two-cases formula.

Now I’m not so keen on the two-cases formula. I’m thinking that if there’s time to get into the indefinite integral at all, the thing to do is to make something interesting out of it: to say “look, the hyperbola has two different branches, and we get a different constant on each”. (Maybe both called “$C$”.) Present it as something exotic and try to attract the interest of the more imaginative students, rather than making it just one more integral to learn or mumbling and sweeping it under the carpet. Even if the full story isn’t told (and it probably can’t be), my current inclination would be to try to turn this difficulty into a point of interest — maybe a teaser for future courses on differential equations.

Posted by: Tom Leinster on March 21, 2012 6:36 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I think Mike’s suggestion sounds like the right one. I’ll try to remember it when (or rather, if) I teach single-variable calculus again.

Posted by: Mark Meckes on March 21, 2012 2:25 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Although I had never appreciated this issue (thanks!), when I taught calculus last year my approach successfully sidestepped it: I restricted the domain of $\frac{1}{x}$ to $(0,\infty)$, then integrated it to $\log(x)+C$. Later on, I restricted $\frac{1}{x}$ to $(-\infty,0)$ and integrated it to $\log(-x)+C$. Finally, in the lead-up to the test, I introduced $\int\frac{1}{x}\,dx=\log|x|+C$ as a shorthand or as a mnemonic.

Accidentally, I think this turned out to be a really good idea, i.e. I was lucky. It avoids teaching something wrong, it’s clear and intuitive to students, and it avoids integrating over non-connected domains. I don’t think any students got annoyed or confused by it, either.

Posted by: Daniel Moskovich on March 21, 2012 1:55 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I feel like something of a fuddy-duddy now, in that I really want to say that the function $x\mapsto \frac{1}{x}$ simply isn’t integrable on any interval that includes $0$. I think Daniel gets closest to this fact of non-integrability with his “sidestep”, and the emphasis (perhaps I’m reading it in?) that the integrals for $\frac{1}{x}$ are one-sided things. It’s all very well to say that all the functions $x\mapsto \log |x| + C + c H(x)$ have the same derivative function; but not a single one of them is of the form $x\mapsto( k + \int_a^x \frac{1}{t} dt)$ for any $k$ or $a$. And I’m more than happy to distinguish between a function having a primitive or prederivative (“antiderivative” is a term that will have to die!), as against a function being integrable on intervals. Certainly, one mustn’t suggest that all primitives are integrals; the upshot of the Fundamental Theorems should be seen as

• assuming good circumstances, one has a construction of a continuous primitive
• when applied to the derivative of a (necessarily, differentiable) function, this construction produces a vertical shift of that function.

It’s worthwhile pointing out cases in which these theorems do not apply: as the example that instigated this post shows, there are primitives that are more general than any integral; there are also integrable functions which aren’t the derivative of anything (toss in not-too-many-too-large removable discontinuities, e.g.)
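The family of primitives in question is easy to check symbolically. Here is a minimal sketch (not part of the original comment; it assumes SymPy and writes the two independent constants as a piecewise expression, as in the post):

```python
import sympy as sp

x = sp.symbols('x', real=True)
C_minus, C_plus = sp.symbols('C_minus C_plus')  # one constant per branch

# The general primitive of 1/x on its disconnected domain:
F = sp.Piecewise((sp.log(-x) + C_minus, x < 0),
                 (sp.log(x) + C_plus, x > 0))

dF = sp.diff(F, x)

# Whatever the two constants are, the derivative is 1/x on each branch
print(dF.subs(x, -2))  # -1/2
print(dF.subs(x, 3))   # 1/3
```

The two constants vanish on differentiation independently, which is exactly why no single value of $C$ recovers the whole family.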

Posted by: Jesse C. McKeown on March 21, 2012 5:00 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Well, if “prederivative” works for you, what if we shift the prefix to a different one that also connotes “before”? The “antederivative”, anyone?

Posted by: John Armstrong on March 21, 2012 5:17 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

What’s wrong with “antiderivative”? It’s obvious what it means, as contrasted with “primitive”. And a “prederivative” sounds like something you do along the way to finding a derivative. Maybe “antiderivative” is a bit unlovely, but we don’t have to write songs about it.

Posted by: Mike Shulman on March 21, 2012 5:52 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Well, I agree with Mark that actually trying to change the accepted terminology would take a long time, for there are many books that would have to go out of print first (happily, they re-edit these calculus texts pretty much every two or three years, which is otherwise insane and unnecessary, but…) and then, of course, who knows what else wouldn’t change and confuse things further?

My own beef with “antiderivative” is that it’s not (the output of) an operator until one knows about torsors; whereas the derivative and the integral are both definitely (families of) operators. It’s also a terribly isolated usage of the “anti-” prefix, in maths; within calculus one talks of “antisymmetric” functions, which aren’t things that produce symmetric functions (unless they’re also differentiable, in which case the derivative is… though the derivative of a differentiable symmetric function is anti-… ! But this is punning).

The advantage of the “pre-” prefix for a primitive is that a pre-derivative definitely belongs to the pre-image of a singleton, under the operation “form the derivative”; and if in a calculus course one hasn’t talked of preimages for functions generally, then… I just don’t know. It is, after all, part of by far the easiest way to spell “continuous”. I suppose “pre-” is also a fairly overloaded morpheme, what with “presheaf” and “pretopos” and “presymplectic” (in which case “pre” means there’s some fiddly stuff/structure/property we don’t ask for yet, but which is particularly interesting when it is there); but also “prefix”, “pre-compose” and “predict”…

And I quite agree that “primitive” would again be an isolated usage (though it’s already current), while “prederivative” is quite as ugly a word as “antiderivative”.

Btw, John’s suggestion made me laugh.

Posted by: Jesse McKeown on March 22, 2012 4:19 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I don’t care much for “antiderivative”, since it reinforces the mistaken notion (which I try hard to beat out of my students) that one “derives” $f$ in order to find $f'$.

In fact, of course, $f'$ is called the “derivative” of $f$ because it is derived from $f$. This is a very unfortunate bit of terminology, because there are many other functions which one may derive from $f$, including the antiderivatives of $f$. For some reason, the act of deriving $f'$ is also called “differentiating” $f$. Since in non-mathematical English that would mean telling the difference between $f$ and something else, that’s another pretty unfortunate bit of terminology. To “antidifferentiate” is logical enough, once you’re used to the mathematical meaning of “differentiate”, but an antiderivative is neither the opposite of a derivative, nor the result of doing the opposite of deriving (whatever that would mean).

With that said, crusading for logical, self-consistent terminology in a field this established is a waste of time. Students need to learn to correctly understand and use terminology as it is used by the rest of the world.

Posted by: Mark Meckes on March 21, 2012 2:47 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

One thing I dislike about teaching differential equations is the archaic terminology. Here I’m talking about the most basic, methods-y, ODE stuff — nothing advanced. The students are all supposed to learn ancient and mystical expressions such as “complementary function”, “auxiliary equation”, “general solution”, “particular integral” and “particular solution” (and learn that the last two are not in fact synonymous). Actually, in their written work it quickly becomes a succession of two-letter abbreviations, which is a relief; it’s much more efficient and hardly less meaningful. It does, however, deprive me of the amusement of seeing it misspelled as “complimentary function”, which makes me imagine a function that declares “Hey! Great hair!”

Now that I think about it, terms such as “the general solution”, or indeed “the general antiderivative” are similar to an old-fashioned English turn of phrase that in my mind is inextricably associated with stereotyping. “The Chinaman is as a rule a hard worker”, that kind of thing. This is like “the general solution is …”, where in standard modern mathematical language, we’d say “the set of solutions is …”.

I can see some advantages to the old-fashioned language. At the same time, I can’t shake the feeling that damage is caused by the old-fashioned and non-rigorous way in which differential equations are often taught. Many talented undergraduates with a pure-mathematical bent seem to find differential equations instinctively revolting, only revising their opinion years later if at all. This is a shame. But perhaps the demands of departments of engineering, chemistry, physics etc. are simply too strong to resist, forcing us into teaching differential equations in a “methods” style.

Posted by: Tom Leinster on March 23, 2012 6:29 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Seriously? Now you’re just trying to find something to complain about.

Yes, saying “the general x P” instead of “for all x in X, x P” is not a completely contemporary phrasing, but it’s perfectly straightforward. If anything, it avoids drowning the concept in explicit universal quantifiers, which (non-math) students also hate. And drawing a tenuous parallel to racism is an extremely dirty rhetorical trick.

Yes, ODE courses are stylistically awkward, and could possibly do with a revamp. They occupy a particularly difficult niche as the most mathematically sophisticated material non-majors are usually required to take; teaching a real understanding would expand the course to at least two semesters’ worth of material – especially since the basics have been crowded out of calculus courses – and engineering departments aren’t interested in buying the product at that price. So we’re left with the blue-light special: a collection of ersatz tricks that are useful in some situations and little to tie them together into a coherent theory.

But the difficulty can hardly be attributed to the language.

Incidentally, this situation is not actually unique to ODEs; I have the same complaint about Ramsey theory, in that every time someone tries to tell me about it I get a flood of toy problems with superficially unrelated solutions.

Posted by: John Armstrong on March 23, 2012 12:00 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

So we’re left with the blue-light special: a collection of ersatz tricks that are useful in some situations and little to tie them together into a coherent theory.

The only real change we have to make is to tell the students that that’s what we’re doing. The whole point of an elementary ODE course is that there is no silver bullet for ODEs (unlike, say, linear systems and Gaussian Elimination) so we do the best we can. The best we can is to figure out some types of ODE where we can solve them, and then hope that our unknown ODE is one of these.

One can even make the point that whilst a random ODE is unsolvable, just as a random continuous function has no “nice” ant(i|e)derivative, most ODEs that we meet are solvable, or at least amenable to the techniques that we have. The more intelligent ones can be left to ponder why this is so (are the ODEs that tend to crop up the ones that are more studied because they are the ones that tend to crop up more often, or do they tend to crop up more often because they are the ones more studied?), whilst the ones who just want recipes at least get an understanding of why we only teach recipes for solving ODEs.

I agree with Tom, though, that current ODE courses get so bogged down in useless terminology and spend so much time on “auxiliary equations” and “transfer functions” that we never have the time to explain what’s really going on. So many students try to learn the auxiliary equation for an ODE with constant coefficients (and therefore get hopelessly confused if we teach them Euler–Cauchy ODEs as well), whereas if they learnt a few general principles it would be so much easier. And those principles aren’t hard, not even conceptually hard. They are:

1. A solution of the ODE is just something that works - the method used to find it is not relevant to whether or not it is a solution.
2. There is no “silver bullet” for ODEs, so the best method is simply to guess.
3. A first-order constant-coefficient ODE leads to an exponential solution, so for a second-order constant-coefficient ODE we should start with an exponential guess with a bit of arbitrariness built in and see what happens.
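Principle 3 can be sketched symbolically (an editorial illustration, assuming SymPy; the ODE $y'' + 3y' + 2y = 0$ is just an example, not one from the thread):

```python
import sympy as sp

t, r = sp.symbols('t r')

# Guess y = exp(r t) for y'' + 3 y' + 2 y = 0 and see what happens:
y = sp.exp(r * t)
residual = sp.diff(y, t, 2) + 3 * sp.diff(y, t) + 2 * y

# The never-zero exponential cancels, leaving a polynomial in r.
# This is all the "auxiliary equation" ever was: it falls out of the guess.
auxiliary = sp.simplify(residual / y)
print(auxiliary)               # r**2 + 3*r + 2
print(sp.solve(auxiliary, r))  # the roots -2 and -1, in some order
```

The point of the sketch is that nothing needs to be memorised: the auxiliary equation is a consequence of the guess, not a separate fact.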

Yes, I know it’s not perfect in practice, but compared with how we usually teach solving ODEs it’s pretty darned good.

I’m really trying hard to get in a reference to the line “her ante-penultimate breath” but failing dismally.

A completely pointless activity.

Posted by: Andrew Stacey on March 26, 2012 8:45 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Having read this and the comments (so far) I have to say that that’s what we get for trying to teach students about Fredholm operators before they are ready!

I think that my “solution” to this would be to try to avoid the problem altogether and never, if at all possible, to talk about “indefinite integrals”. I think that the possibilities for confusion would be much less if instead of

$\int \cos(x) dx = \sin(x)$

(and it wasn’t until I’d written that that I realised that I’d also made the classic error of scoping of variables) one wrote

$\int_a^x \cos(t) dt = \sin(x) - \sin(a)$

or one could write:

$\sin(x) = \sin(a) + \int_a^x \cos(t) dt$

which would be my favourite.

So one still gets the integral as a function (of its upper limit) but without the potential bizarreness of the indefinite integral.

(and it might mean that we stop saying things like $\int \cos(x) = \sin(x)$ which, although not technically wrong, falls into that trap of using the same label for two different variables)

Posted by: Andrew Stacey on March 22, 2012 3:26 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I think that my “solution” to this would be to try to avoid the problem altogether and never, if at all possible, to talk about “indefinite integrals”.

I would definitely agree wholeheartedly with this, except for the likelihood that in some future class (or maybe even in the real world?) someone will expect the students to know what the phrase “indefinite integral” means.

Posted by: Mark Meckes on March 22, 2012 3:37 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Sad but true.

Before writing the above, I tried really hard to think why we teach about indefinite integrals. All I could come up with was habit.

I’m leaning towards classing them with determinants and invertible matrices: relics of an older time when it was thought that these were important. That’s not to say that they are never used, just that they aren’t used as much as the prominence we give them in teaching would suggest.

I think I’d teach about indefinite integrals just after teaching about solving linear systems. After seeing a few examples of $A x = b$ with different numbers of solutions, where the solution spaces are parametrised by the solutions of $A x = 0$, it shouldn’t be too hard a leap to see that the number of solutions to $D f = g$ depends on the number of solutions of $D f = 0$; nor is it a large leap (I think) to see that this can vary considerably from problem to problem.
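The analogy can be made concrete with a small sketch (an editorial illustration, assuming SymPy; the singular system below is made up for the purpose):

```python
import sympy as sp

# A rank-deficient system, so A x = b has infinitely many solutions
A = sp.Matrix([[1, 2], [2, 4]])
b = sp.Matrix([3, 6])

sol, params = A.gauss_jordan_solve(b)
print(sol)            # a particular solution plus a free parameter
print(A.nullspace())  # the solutions of A x = 0

# Any two solutions of A x = b differ by an element of the null space,
# just as any two solutions of F' = f differ by a solution of F' = 0
```

The parametrised `sol` plays exactly the role of “particular antiderivative plus arbitrary constant(s)”, with the null space playing the role of the locally constant functions.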

Posted by: Andrew Stacey on March 22, 2012 8:10 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

What’s the problem with invertible matrices?

Posted by: Tom Ellis on March 23, 2012 7:49 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

What’s wrong with invertible matrices? Where shall I start?

Seriously, there’s nothing wrong with them in and of themselves, but there’s a lot wrong with the emphasis that they receive in a beginning linear algebra course, and thus with the impression that students take away from it. The most dangerous is the idea that, when solving $A x = b$, figuring out whether or not $A$ is invertible might actually be of some use. A simple role-play reveals the inanity of that:

• Teacher: How are you going to test to see if $A$ is invertible?
• Student: Compute $\det(A)$.
• Teacher: How?
• Student: I’ll row reduce $A$, keeping track of row swaps and scales.
• Teacher: Right, you’ve done that and $A$ happens to be invertible. Now what?
• Student: I’ll compute $A^{-1}$.
• Teacher: How?
• Student: I’ll row reduce $[A \mid I_n]$ to get $[I_n \mid A^{-1}]$.
• Teacher: Right, you’ve done that and computed $A^{-1}$. Now what?
• Student: Multiply out to get $x = A^{-1}b$.

No, the fact that a matrix is invertible is only useful if you already know its inverse. And then it is not whether $A$ is invertible or not that is the key. What you do is have a sack of invertible matrices for which you know the inverse, and when solving $A x = b$ you pick out something from this sack, say $E$, and apply it to $A x = b$ to get $E A x = E b$. Then you hope that this is easier to solve than the original. Because $E$ is invertible and you know its inverse, it is easy to go back and forth between $A x = b$ and $E A x = E b$. Admittedly, for solving you only need to go in one direction (though you need to know that the other direction is theoretically possible), but for more general questions it is important to be able to go both ways, and to go both ways easily.

If you think about it, this is exactly what Gaussian Elimination is all about.
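The contrast can be sketched numerically (an editorial illustration, assuming NumPy; the small system is made up). The library’s `solve` routine goes straight to elimination and never forms $A^{-1}$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# The role-play route: a detour through the inverse
x_via_inverse = np.linalg.inv(A) @ b

# The sensible route: Gaussian elimination (LU factorisation),
# which is what solve does under the hood; A^{-1} is never formed
x_direct = np.linalg.solve(A, b)

print(np.allclose(x_via_inverse, x_direct))  # True: same answer either way
```

For a $2\times 2$ system the difference is invisible, but for large systems the direct route is both cheaper and numerically better behaved.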

(There’s a similar case to be made about eigenvalues and the characteristic polynomial. Because students only ever see simple examples with $2\times 2$ and $3\times 3$ matrices, they think that solving the characteristic polynomial is the method for finding eigenvalues. Of course, here in Norway it’s easier to explain the ridiculousness of this notion.)

Posted by: Andrew Stacey on March 26, 2012 8:57 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Sounds like you’re saying there’s a benefit to being a constructive type theorist again: to “be invertible” should mean “to be equipped with an inverse”. (-:

Posted by: Mike Shulman on March 26, 2012 7:47 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I don’t know that I would have known to put it that way, but yes.

I guess I’m heavily influenced by the course that I’m currently teaching in this. It’s a first course in linear algebra with a bit of ODEs thrown in for good measure. The course is taught to everyone who wants to attend, not just maths students, so when I can think of a way to link what I’m teaching to something concrete then I do so. By that, I don’t mean introducing topics by contrived and over-simplified “real world” examples, I mean that when there’s a way to link a concept to something real then I try to present it. For example, when motivating matrices as a way of thinking of linear systems, then I try to get the students to think about what information they would actually need to enter into a computer to tell it about the linear system. Doesn’t take them long to get that all they need are the coefficients and the row/column nature. Hey presto: a matrix.

Mind you, this does get me into a bit of trouble where the “standard” window dressing for a subject is a bit out of date. Determinants are the worst, but eigenvectors and eigenvalues aren’t far off. It’s easy to say that we’re interested in eigenvalues and eigenvectors because then we can easily calculate high powers of the matrix. But it’s wrong. What is correct is to say that we’re interested in eigenvalues and eigenvectors because then we don’t have to calculate high powers of the matrix. To see that the original line is false, one just has to look at the routines for computing eigenvalues and eigenvectors: they compute high powers of the matrix and look for convergence.

(I read recently that if you ask one of the major maths software products to compute the characteristic polynomial of a matrix then what it actually does is to compute the eigenvalues via some fast numerical approximation and then work out the polynomial from its roots. So teaching students that to find the eigenvalues we factor the characteristic polynomial is a Big Fat Lie.)
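That anecdote is easy to reproduce in, for example, NumPy, whose `poly` routine builds the characteristic polynomial from numerically computed eigenvalues rather than the other way round (an editorial sketch; the matrix is made up):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])  # eigenvalues 2 and 3, char. poly x^2 - 5x + 6

# numpy.poly on a square matrix computes the eigenvalues numerically,
# then multiplies out prod(x - lambda_i) to recover the coefficients
print(np.poly(A))  # approximately [1, -5, 6]
```

So teaching “factor the characteristic polynomial to find the eigenvalues” really is the reverse of what production software does.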

Posted by: Andrew Stacey on March 27, 2012 9:35 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

You can motivate eigenvalues for solutions of ODEs, by showing students that they give the frequencies of normal modes. E.g., in a series of masses connected by springs, eigenvalues for the matrix of the linked ODEs had all better be negative, or else the oscillations would grow exponentially.
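For instance (an editorial sketch, assuming NumPy; the two-mass chain is the standard example): two unit masses between walls, joined by unit springs, give $\ddot{y} = K y$ with the stiffness matrix below, and both eigenvalues come out negative, so every normal mode oscillates rather than grows.

```python
import numpy as np

# Two unit masses between walls, joined by unit springs:
#   y1'' = -2 y1 + y2,   y2'' = y1 - 2 y2,   i.e.  y'' = K y
K = np.array([[-2.0, 1.0],
              [1.0, -2.0]])

eigenvalues, modes = np.linalg.eigh(K)
print(eigenvalues)  # [-3. -1.]: both negative, so the normal modes
                    # oscillate (at frequencies sqrt(3) and 1) rather than grow
```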

Posted by: David Corfield on March 27, 2012 6:37 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Oh, I can motivate eigenvalues easy enough. I just can’t motivate calculating eigenvalues.

Having said that, I find modes somewhat unmotivating. Take two identical coupled pendula. It’s easy enough to explain that one mode corresponds to their average position and another mode to the difference and you can imagine two “virtual” pendula at those positions and these are uncoupled. But you can uncouple them further to produce first order ODEs and then when you look at the modes you get things like “the average position plus three times the average angular velocity”. How does that make eigenvectors and eigenvalues any more motivating? And if you’re going to use modes to motivate eigenvalues and eigenvectors then you have to go the whole way, not just stop when it becomes physically unrealistic.

Moreover, using modes to motivate eigenvectors is problematic. If I throw in a few numbers, I can get a system where one mode is $y_a + y_b + z_a + z_b$ (the $z$s are the velocities). This corresponds to the eigenvalue $-4$ of this particular system, but the eigenvector is $\begin{bmatrix} -1 \\ -1 \\ 4 \\ 4 \end{bmatrix}$! How does that eigenvector correspond to that mode?

(I know the answer; my point is that if one is using modes to motivate eigenvalues/eigenvectors then the connection is not so direct as one might like, whereupon the pedagogical value of using them is considerably lessened, to the point where the extra baggage needed to explain the connection makes things worse than just saying “Using eigenvectors makes it theoretically easy to solve the ODE. Knowing what the answer looks like in theory, we can work out what information we need to answer the question we’re actually interested in, and then look for more direct ways to get that information numerically, guided by the fact that we know how to do it theoretically.”)

Posted by: Andrew Stacey on March 29, 2012 8:57 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I envy you getting to teach a class to people who want to attend. (-:

Posted by: Mike Shulman on March 27, 2012 4:44 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I don’t find these examples upsetting. I think they reflect that an average class will be of mixed ability: some students will be happy when given a recipe and (possibly) the basic idea, and some won’t be happy to find that they’ve been misled slightly (I’m in this camp). If we can live with a certain level of ambiguity in the ordinary everyday language we use, surely it’s allowable to have some in textbooks?

Perhaps definitions should be marked with some sort of symbol to indicate what level of mathematical scrutiny they will survive :)

Anyway, isn’t this sort of ‘sloppy’ definition fairly rampant in physics? E.g. the path integral.

Posted by: mozibur ullah on March 24, 2012 11:33 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Did we really make it to the end of this thread without anyone saying that, whatever you do for that integral, at least it is clear what

$\int 1/cabin d(cabin)$

is??

(A joke I read in a Pynchon novel, back when the world was young)

Posted by: Silly Mood on March 27, 2012 7:05 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Piling one corny joke on top of the other (and to turn back slightly in the direction of the post, sort of), Halmos asked his students this question and told them the actual answer was “houseboat” (log cabin plus sea).

Posted by: Todd Trimble on March 27, 2012 9:43 PM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

Funny you should post this now: I just got back from a class where I covered exactly this!

My approach is to talk about antiderivatives graphically first, before ever writing down a single formula. I draw the graph of a function $f$ on the board and ask the students to draw the graph of a function $F$ whose derivative is $f$. I give the students a few minutes to do this.

Then I ask the students if there are any other functions besides the one they drew which would work. Some student will realize that shifting their function vertically will work.

Then I draw a function with an obviously disconnected domain, and repeat the process. (Note: students think that $(0,1)\cup(2,3)$ is a lot more disconnected than $(0,1)\cup(1,2)$; they will often have the insight that the two pieces can be vertically shifted independently with the first interval, but not with the second.)

I do one more example where the domain is an interval with a single point deleted.

At this point the students have already realized for themselves that antiderivatives differ from each other by a locally constant function.

After we have been drawing graphs long enough, I start writing down formulas, but we always make sure the formulas make sense graphically (checking the increasing/decreasing behavior and concavity of the antiderivative). When we get to $1/x$, the students provide the correct form without a second thought.
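This graphical exercise can even be mimicked numerically (an editorial sketch, assuming NumPy; the details are made up): integrate an $f$ separately on each piece of $(0,1)\cup(2,3)$ with independently chosen constants, and check that both pieces differentiate back to $f$.

```python
import numpy as np

def f(x):
    return np.cos(x)  # any continuous integrand will do

def antiderivative(xs, constant):
    """Cumulative trapezoid rule on one component, plus a free constant."""
    dx = np.diff(xs)
    F = np.concatenate([[0.0], np.cumsum(dx * (f(xs[:-1]) + f(xs[1:])) / 2)])
    return F + constant

# The two components of the disconnected domain, each with its own constant
xs1 = np.linspace(0.0, 1.0, 1001)
xs2 = np.linspace(2.0, 3.0, 1001)
F1 = antiderivative(xs1, 5.0)   # constant 5 on (0, 1)
F2 = antiderivative(xs2, -7.0)  # constant -7 on (2, 3)

# Despite the independent shifts, each piece differentiates back to f
print(np.allclose(np.gradient(F1, xs1), f(xs1), atol=1e-3))  # True
print(np.allclose(np.gradient(F2, xs2), f(xs2), atol=1e-3))  # True
```

Plotting `F1` and `F2` for several pairs of constants gives exactly the picture the students draw by hand.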

Posted by: Steven Gubkin on March 28, 2012 1:50 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

When we get to $1/x$, the students provide the correct form without a second thought.

To coin a phrase, the proof of the teaching is in the learning. Forget the pedagogical theory, if it works, I’ll try it.

Posted by: Andrew Stacey on March 29, 2012 9:00 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

I’ve only read the question today, but I’d like to point out that, typically, in France we apparently have a different approach to the whole integral thing. The teaching of mathematics in France has been very much influenced by Bourbaki’s books, by the way.

I’ve never taught calculus myself, but if I recollect my school years correctly: from the start we use the integral sign only for integration over an interval (is that what English speakers call definite integrals?) on which the function is defined. When we ask about antiderivatives (which we call primitives) we spell it out (usually asking for an antiderivative, rather than the general form). We are also usually explicit about the domain of the input function.

That said, it might not prevent students from learning the wrong theorem about the general form of antiderivatives. As students we are usually very inattentive; it might be valuable for the teacher to stress the conditions of the theorem, and even to give Tom’s example to illustrate why they matter.

Btw: on the definition of groups, it may be worth pointing out that the two-and-a-halfth version (with $*$, $e$ and inverses given as operations) is also the version advertised by universal algebra.

Posted by: Arnaud Spiwack on April 1, 2012 10:40 AM | Permalink | Reply to this

### Re: Reader Survey: log|x| + C

But is $\log|x| + C$ really wrong? You are suggesting using $\int \frac{1}{x}\,dx=\mathrm{Log}\left[Cx\right]+D\,\mathrm{Sign}\left[x\right]$ instead, which means $\frac{1}{x}=\frac{1}{x}+D\,\mathrm{DiracDelta}\left[x\right]$, so $D=0$?

Posted by: h.a on April 1, 2012 10:43 AM | Permalink | Reply to this

Post a New Comment