“Simplifying” is subjective. Also, crucial.

As a math teacher, I spend a lot of time belaboring a certain distinction. To some students, it’s inscrutable hair-splitting, as subtle as John Adams vs. John Quincy Adams. To others, it’s barely worth saying aloud, as obvious as John Adams vs. Amy Adams.

The distinction is wrong vs. weird.

Mathematical notation is a system of communication. Thus, a mathematical statement can be held to two distinct standards: (1) Content: Is it saying something true? (2) Form: Is it clear, concise, and written in observance of all the familiar conventions?

The same dichotomy holds in any language. There are a lot of crisp, idiomatic ways to say something wrong, and there are weird, off-putting ways to say something true.

Students don’t always see math this way. For many of them, rules are rules are rules, no matter whether they concern form or content. I’ve seen teachers and textbooks blur the distinction, too. For them, there are no mathematical misdemeanors, no minor slips of the tongue. All sins are mortal sins. An unrationalized denominator is murder most foul.

To me, that’s silly. Mathematics is both (1) a world of ideas, and (2) a language for describing that world. We must not mistake one for the other.

One key testing ground for these ideas is the concept of “simplification.”

I’ve seen exercises that use the word “simplify” in maddening, capricious ways. After all, simplicity is a matter of taste and judgment. Which is simpler: a(b+c), or ab + ac? Depends who you’re talking to, what you’re talking about, where you want to steer the conversation next.

Simplicity is subjective. It’s tangled up in our mathematical customs and conventions.

Should we then banish the word “simplify”? No, no, no! It’d be like English teachers abandoning the idea of an elegant sentence. Just as a sentence can be true and also hideous, a mathematical statement can be correct and also a hard-to-interpret mess that costs me my sanity and hair.

Something subjective can still be real and important.

For example, why do we reduce fractions? To reveal hidden equalities (how else would you notice that 34/51 is the same as 58/87?). Why write coefficients first, followed by variables in alphabetical order? To speed up operations like addition (just try adding 3abc + cb5d + b2ac + dbc7). Why arrange a polynomial by descending powers? So we can know its degree at a glance.
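
(If you want to see that first point in action, here is a minimal sketch in Python, using the standard library’s fractions module; reducing to lowest terms is exactly what exposes the hidden equality.)

```python
from fractions import Fraction

# Fraction reduces to lowest terms on construction,
# which is what reveals that 34/51 and 58/87 name the same number.
print(Fraction(34, 51))                      # 2/3
print(Fraction(58, 87))                      # 2/3
print(Fraction(34, 51) == Fraction(58, 87))  # True
```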

Even where simplification is not a matter of consensus, it’s worth exploring the different possible forms, and debating the advantages and disadvantages of each, just as we’d try on outfits before making a purchase.

Mathematics, at its best, teaches us to distinguish the necessary from the arbitrary, the content from the form. What better place to begin those lessons than with the language in which they are taught? I want my students to learn the conventions, and also to learn that they are precisely that: conventions.

16 thoughts on ““Simplifying” is subjective. Also, crucial.”

  1. There was a brief period of my life when I was working on a graphical formula editor. You could use it to move pieces of algebra around graphically, and it would make sure that everything updated correctly. I think it’s technically still out there, because the Internet neither forgives nor forgets.

    I dropped it because it has no market. People who understand it don’t need it, and people who don’t understand it can barely use it.

    But man, this makes it clear to me why the “simplify” menu item was so absurdly hard to program.

    1. Mmm, I think you just captured a huge dilemma of computational technology and math education in a single pithy sentence: “People who understand it don’t need it, and people who don’t understand it can barely use it.”

      One of many reasons that it’s hard to fold computational technologies into the classroom in a natural and effective way.

    2. I used to have a program that did exactly that back when I was younger. You could drag terms around, and if you dragged, say, a +x to the other side of an equals sign, it would turn into a -x. Specifically, the Graphing Calculator program that came with every Macintosh computer did that (in addition, of course, to graphing functions), up until Mac OS X came out and they replaced it with Grapher, which doesn’t have that feature. I remember using it to sort of figure out how derivatives worked before I started calculus, and also trying (unsuccessfully) to solve a problem that I didn’t know how to solve yet. I don’t remember that far back well enough to say whether it helped me learn algebra, but it might have.

      I seem to remember it having more options than just “Simplify”—at least I think there was an “Expand” option, and maybe some others.

      (…hmm… maybe I should write my own program that does that…)

    3. There’s some interesting discussion about the difficulty in computer algebra here: https://en.wikipedia.org/w/index.php?title=Computer_algebra&oldid=1062766157#Simplification
      “Some rewriting rules sometimes increase and sometimes decrease the size of the expressions to which they are applied. …
      “As there is no way to make a good general choice of applying or not such a rewriting rule, such rewritings are done only when explicitly asked for by the user. …
      “… it may be the case, like for expressions involving radicals, that a canonical form, if it exists, depends on some arbitrary choices and that these choices may be different for two expressions that have been computed independently.”
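
      (To make that concrete, here is a minimal sketch using SymPy, assuming it is installed: the same rewriting rule, expansion, shrinks one expression and balloons another, so no single rule can serve as “the” simplifier.)

      ```python
      import sympy as sp

      x = sp.symbols('x')

      # The same rewriting rule (expansion) can shrink or grow an expression:
      print(sp.expand((x + 1) * (x - 1)))  # x**2 - 1, smaller than the product form
      print(sp.expand((x + 1) ** 10))      # eleven terms, much larger than the power form
      ```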

  2. As someone who understands the wrong/weird distinction well, I’m curious as to what might be causing the lack of distinction.
    Do teachers misunderstand this distinction themselves? Are they intentionally ignoring it so students feel more pressure to give standardized responses? Would teachers feel bad about penalizing “weird” answers, and so feel the need to label them “wrong”? Is the distinction being devalued because both “weird” and “wrong” answers lead to the same grade on standardized tests? (Do they?)

    1. Good questions, and your speculations are at least as good as mine!

      I suspect some teachers don’t so much misunderstand my distinction as disagree with it. They’re linguistic prescriptivists, so to speak. They’d acknowledge that ab + ba = 2ab, but still view the left-hand side as generally “the wrong thing to write.”

      If I had to situate the trend in a broader narrative, I’d say something like this: Mathematics plays a gatekeeping role in society. We want our gatekeepers to be impartial, objective, and consistent. So we demand that mathematical assessments be graded in unambiguous, black-and-white ways. This, in turn, shapes our view of the subject itself, making us see it as more black-and-white than it really is.

    1. Touché!

      Somewhere in my notes I have plans for a whole post collecting examples like this, where by coincidence the naive rule winds up working, e.g., where (f/g)’ = f’/g’…
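
      (For what it’s worth, here is one such coincidence, checked with a quick SymPy sketch, assuming SymPy is available: for f = e^(4x) and g = e^(2x), differentiating top and bottom separately happens to give the same answer as the genuine quotient rule.)

      ```python
      import sympy as sp

      x = sp.symbols('x')
      f = sp.exp(4 * x)
      g = sp.exp(2 * x)

      naive = sp.diff(f, x) / sp.diff(g, x)  # the naive rule: (f/g)' = f'/g'
      correct = sp.diff(f / g, x)            # the actual derivative of f/g

      print(sp.simplify(naive - correct))    # 0: for this pair, the naive rule works
      ```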

      1. On a more serious note, there are a couple of places in my teaching when I force this discussion to come up. For example, early in the discussion of trigonometry, we discover that the sine of one eighth of a trip around the circle is 1/sqrt(2). I then stare at my students for a while, seeing who is going to be the one to object (because someone *always* says “Shouldn’t it be sqrt(2)/2?”).

        We then get to have a discussion about what it actually means to simplify an expression. Why should we consider sqrt(2)/2 to be more simplified than 1/sqrt(2)? Does it matter? Why?

        From these discussions, I get the impression that most of my students’ previous instructors are quite prescriptivist, and don’t really understand the whys and wherefores of simplification. It is just a thing you are supposed to do. I try to convince my students that simplification is more like poetry—you want things to be aesthetically pleasing (where “aesthetically pleasing” means good for computation in many settings).
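
        (For the record, the two forms differ only by multiplying top and bottom by sqrt(2): 1/sqrt(2) = sqrt(2)/(sqrt(2)*sqrt(2)) = sqrt(2)/2. Same number, different outfit; the only real question is which outfit we’ve agreed to call “dressed up.”)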

  3. When my dad was wearing his logician hat, he was interested in modal logic, the logic of necessity and possibility. Modal logic studies sentences like “If the Queen’s parents were human, the Queen must be [= is necessarily] human” and “Andy might have been named Tom [= Andy is possibly named Tom = It is false that Andy is necessarily not named Tom].” It is straightforward to represent this using propositional logic plus a primitive operator spelled “nec” and meaning “is necessarily.”

    We can prove various modal theorems, such as “If a = b then a nec = b” (equality holds necessarily if it holds at all), and we have a good model where “nec” means “is true in all possible worlds.” What we have not got, and there is some reason to believe we never will have, is a reliable method of simplification for modal formulae. Every known normalization algorithm simplifies some formulae and complexifies others, and that makes it damned hard to study, so much so that most logicians don’t even bother.

    There is a straightforward-looking way to tame these formulae: rewrite “A white horse is necessarily a horse” as “The sentence ‘A white horse is a horse’ is necessarily true.” The trouble is that you cannot convert this to “For each white horse X, the sentence ‘X is a horse’ is necessarily true,” because a quantified variable can’t appear inside a quotation. By the same token, there are 8 planets, but “8 nec > 5” does not entail “The number of planets nec > 5.”

  4. Since I teach at a community college and often teach algebra and trig classes, this post is something that I really want to consider. I think that the term “simplify” is very nebulous, and I saw someone on Math Twitter suggest the word “standardize” instead.

    1. That’s nice! I suspect each is useful in its own way.

      Rationalizing the denominator is definitely “standardizing” (as the final form isn’t always simpler).

      Same for writing your polynomial in descending order.

      But I suspect “simplify” is still useful language for things like 2x + 3x = 5x, or 5a^2/10a = a/2.

      Maybe “simplify” is to be avoided as an instruction in technical exercises (in favor of more precise verbs such as “reduce,” “expand,” “condense,” etc.) but useful for discussions about the goals and the nature of algebraic manipulations?

  5. We can replace ∃ with ¬∀¬ (that is, ∃x P(x) is equivalent to ¬∀x ¬P(x)), eliminating the need for the existential quantifier! “There is something rotten in the state of Denmark” becomes “Not everything in Denmark isn’t rotten.”

    Math is a language, and as in any language, clarity is important.

    When the inputs are large, the highest-degree term of a polynomial dominates, and hence should come first. But when that is not the case, as with a power series expansion evaluated near its center, the smaller-degree terms drive the behavior of the function (see the quick numeric check at the end of this comment).

    One problem I see is the myth that there is only one correct answer, and a single best approach for finding it. Some people point to this as fundamental to mathematics. But it just isn’t true. And those stuck on this one-track thinking show the limits of their depth of understanding.
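
    (Here is the quick numeric check promised above, a tiny Python sketch of the “which term dominates” point:)

    ```python
    # Which term of x**3 + x dominates depends on where you evaluate.
    for x in (0.01, 100.0):
        cubic, linear = x**3, x
        print(f"x = {x}: cubic term = {cubic}, linear term = {linear}")
    # Near zero the linear term dwarfs the cubic one; far from zero it is the reverse.
    ```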

  6. I’m a coder, often dealing with low-level optimization. There, it’s important to have a wide array of alternative ways to write the same thing. There is rarely a single best way, echoing your “it’s simpler, but is it more practical” caption. All too often, there isn’t even a single criterion for what “best” might be. Even with “faster” as “best,” I often end up with different code for different processors. It helps to know these different ways are equivalent. A standard form does help with factorizing, though, so the approaches aren’t mutually exclusive; every tool is useful in its own time and place.
