One of my pet peeves is the lack of deep understanding I have regarding mathematics. The feeling of confusion is very similar to the way I feel about long-running television soap operas with a cast of characters spanning generations. I guess what I’m looking for is the *story* of mathematics, where the characters are not mathematicians but the concepts themselves. In other words: **how were these concepts born, and how do they relate to each other?**

A few days ago I started browsing the Internet to cobble together an overview of what math education looks like these days; I figured grabbing K-12 curricula would give me some insight into what students were being prepared for, and an idea of the overall pattern and intention behind it. As far as I could tell, the intention was to introduce the right concepts at the right time, but the original motivation for doing so was inaccessible to me.

It occurred to me that I actually do have one preconception about math: the **systems of thought** behind the concepts should make sense. However, those systems of thought are buried beneath hundreds of years of foregone assumptions, and it’s these assumptions that I think I need to categorize and lay out in a systematic fashion. Some of them are so basic that it’s only through digging into the philosophy of mathematical foundations that their distinguishing properties are illuminated. For example, I have been starting with what I think would be the basics: **natural numbers**. Natural numbers, as I have just re-learned, are the “counting numbers”, and there are two main areas of interest. In pure mathematics, there’s the exploration of the properties and “behavior patterns” of such numbers, which is called Number Theory. The second is the applied side concerned with counting and arranging things; this is called Combinatorics. What’s common to both is the huge number of properties, operations, and relationships that have been discovered, conjectured, invented, or derived. I’m thinking that there can’t be THAT many of them in foundational math up to the college level, and yet I’ve never seen a really good diagram of how it all hangs together.

It could also be that I am just missing the point, or am not sufficiently clear about which point I want to focus on. Frankly, I have no idea what my choices are, so I’m leveraging Wikipedia to build a holistic understanding of what mathematicians *talk about*. In the way that calculus is the study of functions, I’m attempting to develop a road map of mathematics based on studying the conversation itself. This wasn’t possible before the Internet without a lot of back-and-forth trips to the library, and so I am optimistic.

The first insight I had was that the development of mathematics is just like any other field of endeavor. It starts with a **need**: in this case, the need to record what you had (counting) and to figure out ways of manipulating the counts to make projections (accounting). What followed was the need to solve certain classes of problems relating what you had to something else you wanted, which led to the development of algebra. Simultaneously, the need to construct things with some level of accuracy gave rise to the study of shapes (geometry) and how to construct and measure them. Eventually, the two fields merged and started to share similar methods for solving problems, using inventions of manipulation that are shown to be self-consistent (and therefore reliable). This is not unlike craft. A few people can’t help but notice that some of the things they’re doing are pretty nifty in their own right, possessed of a certain aesthetic unto themselves, and fields of study become devoted to this (essentially, what I might call art). Over time, specialized fields of study develop, each with its own language and set of important principles.

I think what I’m looking for is to develop at least a little mastery of the fundamental principles behind the big questions and answers that Mathematics has given us. I don’t know exactly what I mean by that, but I am thinking it might be analogous to my understanding of computers and computer programming. To me, a computer is a bunch of on/off switches that have been combined in remarkable ways to create ever-more sophisticated behaviors. Computer programming is the same thing to me: raw boolean operations simulated by taming the wild electronic forces that scamper between semiconducting substrates that gate them into a semblance of order. They have been shaped by a few generations of scientists and engineers into being able to do some pretty cool things, but underneath it all I sense its original nature and see the lineage. It’s like perceiving the grain in a piece of wood, and being able to tell when that grain is obeyed and when it is violated. The hardest part about learning how to program a computer these days is understanding the ways in which people have imposed their own logical systems on the grain of the bare metal underlying it all, and most of this understanding is locked away in the heads of the programming architects or expressed as inadequate documentation. There are a lot of programmers these days who code by invoking magic words in the order they learned, which is possible now because the true wizards have packaged the magic into a form that CAN be deployed. It makes my life a lot easier, but I’m very grateful that I was born in a time when I could witness the advent of computing. Can I learn to “read the grain” of mathematics too? Is the trail too cold? I sense it’s there to be rediscovered, but this could be wishful thinking.

## 13 Comments

Your mention of the “story” of mathematics reminded me of this series of articles by Steven Strogatz: http://topics.nytimes.com/top/opinion/series/steven_strogatz_on_the_elements_of_math/index.html Some insightful things there about counting, geometry, calculus, etc.

In my graduate studies, I’ve had to review some college-level calculus and then study more advanced calculus on my own, so I’ve also asked myself the question of finding the roots of all of these properties, operations, and theorems. It’s all very interesting, but ultimately, the typical user of mathematics has no time to delve into “reading the grain” of mathematics. The mathematicians did the work; we enjoy the results by using the tools (e.g. derivatives and integrals) appropriately. It may help to understand the problems that motivated the tools in order to know which tools to apply where. But sometimes users of the tools find (and prove) new uses that the inventor of the tool did not conceive.

Back to programming, yes, it helps to know how a computer works at the level of bits and circuits. But the purpose of high-level programming languages and APIs is to create a layer of abstraction so that you do not need to worry about what goes on under the hood in most circumstances. Thus, when I create or destroy an object in Java or C++, I don’t need to worry about the 0’s and 1’s blipping through registers and buses. What IS important to know would be why programmers need to create or destroy objects and in what situations you would need to do that (as opposed to using statically-allocated objects). That is, ultimately, understanding the tool and how it works and when it can be used is much more important and useful than knowing why the tool was invented.

Of course, there are exceptions. If you’re writing a compiler or operating system code, then you do need to pay close attention to the hardware.

Hey, consider reading articles by Steven Strogatz in the NYT: http://opinionator.blogs.nytimes.com/author/steven-strogatz/. They are a nice starter for math.

Dave, Try The Story of Maths, an interesting BBC4 programme covering the historical evolution of some of the key tenets of mathematics: http://www.bbc.co.uk/programmes/b00dxjls

I think it’s viewable on iPlayer over the net, for free.

Posco, Pavel: That’s a great link to Steven Strogatz! I wasn’t familiar with his work before, and it looks like just the kind of thing I’m looking for!

GrifterGirl7: That looks like a tasty link also… thanks! Sadly, the BBC prevents the show from being shown on browsers outside the UK. However, the Wikipedia entry does at least outline what they cover.

On Programming: I get what you’re saying, Posco, and object-oriented dynamic memory management is one of those things that really does benefit from abstraction. I’d argue, though, that there’s a beauty to good memory management that comes from understanding the nature of the underlying system, which doesn’t drift too far from the fundamentals of registers and bits. But you’re right that programming to the abstraction does allow one to make much greater leaps, and it’s been a long time since I’ve done anything at the machine-language level. I’m still aware of the fundamentals, though, and even the abstractions have gotchas that benefit from knowing the bones of the system. Take dynamic object allocation in C++, which leaves you to manage object lifetimes yourself (say, by rolling your own reference counting). Successors like Java and C# do automatic garbage collection for you, which is great right up until the point where garbage collection starts to kill you in an unexpected way, and you learn to pre-allocate objects and assign them from a pool, and become much more aware of when things are created and destroyed. Maybe this isn’t as low-level an operation as I was thinking, but the “grain” it’s aligned with is an awareness of memory, CPU resources, coroutines, interrupts, and order of execution. This is much easier to see at the lowest levels of code than it is in systems of abstraction, and it makes me sad that this is increasingly lost because programmers today think nothing of allocating a megabyte of memory to use a linked list of objects to contain a simple string that leaks memory over time. I feel like an old man running around the house turning off the lights to save electricity, but conceptually it bugs me.

My first “real” programming language was C. So when I learned about the C++ STL’s vectors and lists and maps, I was excited and a bit saddened at the same time. Writing good code takes time and depth of understanding. But a lot of people have very rapidly coded good ideas with not-so-good code, which can be a good thing, too, at least for prototyping. Let’s just hope that the appreciation for good code doesn’t die!
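The pre-allocate-and-assign-from-a-pool idea mentioned above can be sketched in a few lines. This is a minimal illustration in Python for brevity (the comment has C++/Java/C# in mind), and `ObjectPool` is just a name made up for the sketch:

```python
class ObjectPool:
    """Hand out pre-allocated objects on demand instead of allocating one
    per use, so steady-state code creates no garbage for the collector."""

    def __init__(self, factory, size):
        # Allocate everything up front.
        self._free = [factory() for _ in range(size)]

    def acquire(self):
        if not self._free:
            raise RuntimeError("pool exhausted")  # or grow, or block
        return self._free.pop()

    def release(self, obj):
        self._free.append(obj)

# Demo: plain dicts stand in for game objects / particles / bullets.
pool = ObjectPool(factory=dict, size=2)
obj = pool.acquire()
pool.release(obj)             # returned to the pool, not left for the GC
assert pool.acquire() is obj  # the very same object comes back
```

The point of the exercise is exactly the awareness the comment describes: with a pool, you always know when objects are created and when they are merely recycled.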

Once upon a time people had only the number sense that comes in our heads. Studies show that it works kind of like a bucket: you put a cupful of water in for every thing you want to count, and you take a cupful of water out for everything that goes away. Which means your brain is very accurate up to 3, and decently accurate up to a dozen or so, but kind of loses track after that.

And so were invented the natural numbers. They answer questions like, “Is this enough food to get through to the next harvest?” which – previously – were answered with “I dunno. It looks like a lot.” If you take a pebble for each day you’ll be out hunting, and create a pile of food that represents one day’s rations, you’ll need one pile of food for each pebble. If you use your fingers instead of pebbles, you can carry your counting system with you everywhere. If you invent a word that means “I’m talking about an amount equal to the number of fingers I have” (we’ll use the word “ten” for convenience here), you can dispense with the fingers thing entirely.

This system is good for counting things that only come in units — bottlecaps or people or rocks — but there are some things that don’t quite fit. Like say you count the number of steps from here to there… and end up with half a step. Or pile all your grain into baskets so you’ll know how much you have, and the 20th basket isn’t full. Do you have 19 baskets or 20 baskets? More critically, if you’re building something, is this wall supposed to be 45 feet or 46? The answer just isn’t there. The answer is a number that’s IN BETWEEN 45 and 46.

The Greeks (possibly Pythagoras himself, of the famous theorem, but possibly not, I’m terrible with names) figured out a way to quantify that inbetween-ness. If the number you want is the same distance from 45 and 46, then it is to 1 foot as 1 is to 2: it is 1/2. The number you want is 45 1/2. If it’s closer to 46 than 45, you may have 45 3/4, or 45 5/6…. but you can describe it using a ratio: you divide the distance into equal-sized pieces, and say how many of those pieces you’re actually using. And so were invented the rational numbers (because they’re made with ratios).

This system worked really really well for making buildings. And after a while, you start to realize that you’re solving basically the same problem over and over again, just with different numbers. The Greeks were also the first to say “This is the same problem, and it doesn’t matter what numbers are in it. I can find the answer once, and then plug the specific numbers in to the formula to get the answer for this particular problem, without having to solve the whole thing over again.” This started Algebra (if you were talking about counting stuff) and Geometry (if you were talking about measuring distances), but they couldn’t figure out a way to put them together, so they were mostly separate disciplines for the moment.

But Algebra caused its own problems. Once they’d figured out formulas, they started plugging in numbers they’d never tried before, numbers that had never come up in the problems they’d tried to solve. And some of those numbers were just entertaining (If I built a house with the front wall a mile long, the door would have to be this tall) but some were flat-out bizarre. They’d give you distances that were less than none, which clearly didn’t make any sense. Or they’d give you numbers that were in between natural numbers, but couldn’t — no matter how hard they tried — be expressed as a ratio. But, frankly, the system worked well enough for making buildings, whether normal or outsized, and those problems were basically ignored. And so we proceed to the European Renaissance.

Now keep in mind that Roman Numerals are called that because the Romans used them. And the entire European mess was the result of the Roman empire — they all used Roman Numerals, too, because that was what the Romans had used, that’s what their grandparents had used, and they were too backwards, xenophobic, and concerned with day-to-day survival to consider other methods. So when they wanted to count up their bushels of wheat, they marked the VII they had left over from last year, and also the XXIV that they’d harvested this year… and then added them up on their fingers, because there is no good way to add VII to XXIV. So you can see how mathematical manipulation was kind of stunted for many centuries.

But a better system had been invented, only it was invented in India, and it took a while to migrate to the backwaters of Europe. This system used a base — in this case, ten — and used numbers to indicate how many 1s you had, how many 10s, how many 10x10s, and so on. This wasn’t previously possible, because what if you had a 10×10 and 3 1s, but no 10s? You couldn’t tell 13 from 103, which is a fairly crucial distinction, especially if we’re talking about how much money someone owes you. But some bright fellow in India invented a symbol that means “Nothing to see here, move along.” And now you COULD tell 13 from 103, and furthermore you could add them together, quickly and easily, just by lining them up. 3+3 = 6, 1+0=1,0+1=1, put ’em together and you have 116. Sweet, ja? Mathematical development therefore went MUCH faster between 1600 and 2000 than it did between 0 and 1600.
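The lining-up trick described above is mechanical enough to sketch as code; here is a hypothetical `add_positional`, working from the least significant digit up, exactly as you would on paper:

```python
def add_positional(a, b):
    # Line the digits up by place value (least significant first) and add
    # column by column, carrying whenever a column exceeds 9 -- the
    # mechanical trick that Roman numerals have no equivalent of.
    xs = [int(d) for d in str(a)][::-1]
    ys = [int(d) for d in str(b)][::-1]
    out, carry = [], 0
    for i in range(max(len(xs), len(ys))):
        x = xs[i] if i < len(xs) else 0
        y = ys[i] if i < len(ys) else 0
        carry, digit = divmod(x + y + carry, 10)
        out.append(digit)
    if carry:
        out.append(carry)
    return int("".join(map(str, out[::-1])))

print(add_positional(13, 103))  # 116, just as in the example above
print(add_positional(7, 24))    # VII + XXIV, without any fingers: 31
```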

It was about this time that someone pointed out that there are some cases where a number less than 0 does make sense. Oh, not for distances or bottlecaps. But suppose you’re talking about how much money you have, and you have no money, and furthermore you owe your landlord last month’s rent? It could fairly be said that you have less than none. Negative numbers let you put a figure on “Net Worth”, and indicate with a single number whether the person in question was wealthy or in debt. And so were born the negative numbers, completing the number line.

Well, with a number system that could actually be used (and allowed you to perform basic arithmetic operations in less than 3 days), and a lot of free time now that the black plague had created a labor shortage, driven up wages, and created a middle class, the story of mathematics starts to split. There were a bunch of people hanging out, entertaining themselves by messing around with numbers and mathematical problems and having a competition to see who could send whom the weirdest letter starting off “Hey! Look what I can do!” I therefore can’t continue to tell you the entire story. Some of my favorite highlights:

Since I mentioned it earlier, Rene Descartes figures out how to combine algebra with geometry: if, instead of designating points as “a” and “b” and defining the distance between a and b, you designate a latitude and longitude for a and for b, you can then CALCULATE the distance between a and b using algebraic methods. Anything that can be done in geometry can now be done in algebra. Geometry remains useful, however, for its examination of the methods of mathematics.
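Descartes’ move fits in one formula: once a and b have coordinates, their distance falls out of the Pythagorean theorem applied to the grid. A minimal sketch:

```python
import math

def distance(a, b):
    # a and b are (x, y) coordinate pairs; the straight-line distance
    # between them is sqrt((x2 - x1)**2 + (y2 - y1)**2).
    return math.hypot(b[0] - a[0], b[1] - a[1])

print(distance((0, 0), (3, 4)))  # 5.0: the classic 3-4-5 right triangle
```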

Isaac Newton (a lunatic if mathematics ever saw one) spends all his time thinking crazy things like “Why do apples fall down instead of up? Why, if the Earth is NOT the center of the universe, does the moon orbit us anyway? What if the answer to those two questions is THE SAME THING?” His friends, hearing this, urge him to take a long vacation in the country (psychiatrists hadn’t been invented yet, so long trips to the country were the prescribed treatment for being a little loopy). This has the opposite of the intended effect, because it gives him lots of time to think about falling apples and falling moons and what have you. He works out a way to describe things that are changing continuously, which allows him to calculate what would have to happen to make the moon stay in orbit, which happens to exactly explain falling apples (or thrown apples, or more relevantly, thrown cannonballs). As a note, NEVER take a class in algebra-based physics. Newton found it easier to invent calculus than to do algebra-based physics.

Blaise Pascal, after independently inventing several parts of geometry and proving the theorem that still bears his name at age 16, is sent to Paris by his father to “get a life! Drink! Run with loose women! Gamble, for heaven’s sake!” Dutifully, Pascal goes to the gambling houses, and gets drawn into a dispute: two gamblers have to end their game early, and are arguing about how to divide the pot. Pascal starts thinking about the fairest way to do so, and concludes that if one person is in a better position — more likely to win — then he is entitled to a larger share of the pot. His attempts to quantify “more likely” and how much larger his share should be lead to the earliest work in probability.
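Pascal’s puzzle became known as the “problem of points”, and his fair-division reasoning can be sketched recursively, assuming evenly matched players where every round is a fair coin flip (the names here are mine, not Pascal’s notation):

```python
from fractions import Fraction

def share(a_needs, b_needs):
    # Player A's fair share of the pot equals the probability that A would
    # win the match, given how many rounds each player still needs to win.
    if a_needs == 0:
        return Fraction(1)   # A has already won
    if b_needs == 0:
        return Fraction(0)   # B has already won
    # Next round: A wins it or B does, each with probability 1/2.
    return (share(a_needs - 1, b_needs) + share(a_needs, b_needs - 1)) / 2

# If A needs 1 more win and B needs 2, A is entitled to 3/4 of the pot.
print(share(1, 2))  # 3/4
```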

Hope this was at least a little fun to read; I enjoyed writing it (maybe that’s just because I’m procrastinating).

Amanda

Amanda, that was a lovely story! Put it in your book! :) It makes me realize, too, that a multi-pass approach to mathematics might work for me. The sweeping arc of story might be helpful, as would the application of mathematics to current situations. Then there’s the part where you get to the detail work (where I traditionally get lost).

Maybe what I’d like to see is something like “Iron Chef” for mathematicians, with expert commentators.

Couple of things:

Someone recently mentioned the “standard history of physics” that is often taught in introductory or survey classes, and how it really short-changes the progenitors and misrepresents the process of developing science. (For example, Einstein had, and had to have, a very good grasp of the work of Michelson and Morley, whose work was instrumental in his developing the theories of relativity. But all the latter two usually get is a reference to a “failed” experiment to detect the presence of cosmic ether, if even that.) But sometimes to teach things you have to start out with representations that are overly simplistic and abstract to reach competency, after which you can move into more nuanced and more historically accurate presentations. Whether that’s of use to someone (like yourself) seeking a high-level overview is something I’m not nearly qualified to answer.

The history of math not only has practical roots (in accounting and engineering) and theoretical roots (in what has become number theory and logic), but philosophical and mystical ones. One example is Pythagoras, easily the most famous mathematical thinker in history, who had an elaborate philosophy based on what we now call numerology that governed many aspects of daily life and the nature of reality. (I think numerology is a great example of the importance people instinctively place on math, given how widespread it is.)

Did you see the London Underground-style map of the history of science? While I’m not as much of a fan of the style as some, it definitely was food for thought. The comments at Crispian Jago’s original post are awesome and reflect the strong scientific presence on the Interwebs, but I think they also illustrate one of the problems inherent in such a project: at what scale do you present this sort of history? How important does a contributor or fork or whatever have to be to get included?

I wish I could point you to some accessible books on the history of math and the relationships between math concepts, but I haven’t read any. (To be fair, I haven’t read any bad ones, either. Whether this is because there aren’t any or that my reading list hasn’t strayed into math much is debatable.)

If you do find something or some things that meet your criteria, please post about it. I would definitely be interested.

One of the great books I’ve found on the subject is The Universal History of Numbers by Georges Ifrah. That was one of those amazing library finds that blows your mind. He goes into huge detail about how some cultures counted without having a number system that went greater than 2.

The thing that made me blow a gasket, however, was reading about how multiplication developed. If you’ve encountered “Everyday Math” you might have come across diagonal or matrix multiplication. This book talks about some experimental multiplication algorithms from the Renaissance; guess which one was among them. :( There’s a reason that the Standard Algorithm was used for so long.
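For the curious, the diagonal (lattice) method mentioned above reduces to a tidy rule: multiply every digit pair once, and note that lattice cells on the same diagonal share a place value. A sketch, with names of my own choosing:

```python
def lattice_multiply(a, b):
    # Cell (i, j) of the lattice holds the product of one digit from each
    # number; cells on diagonal i + j share the place value 10**(i + j),
    # so the answer is the diagonal sums with carries propagated.
    xs = [int(d) for d in str(a)][::-1]  # least significant digit first
    ys = [int(d) for d in str(b)][::-1]
    diagonals = [0] * (len(xs) + len(ys))
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            diagonals[i + j] += x * y
    digits, carry = [], 0
    for d in diagonals:
        carry, digit = divmod(d + carry, 10)
        digits.append(digit)
    return int("".join(map(str, digits[::-1])))

print(lattice_multiply(97, 86))  # 8342
```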

I’d love to see what you come up with.

It’s also the case that most K-12 math education presents things in a more or less HISTORICAL order, rather than a LOGICAL one. Read through Amanda’s post above, and you’ll see that K-12 math is taught roughly in the same order that Amanda goes through it.

That’s not actually where to start, at ALL, if you’re trying to learn real mathematics and have it make sense. Start with set theory and formal logic. Those are basically the building blocks of ALL of the rest of mathematics. They’ll help you find the grain.
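The suggestion to start from set theory can be made concrete with a toy sketch: the standard von Neumann construction builds the natural numbers out of nothing but sets, where 0 is the empty set and n + 1 is n ∪ {n}. Python’s `frozenset` stands in for a pure set here, and the names are mine:

```python
# The von Neumann construction: every number is the set of all smaller numbers.
zero = frozenset()

def succ(n):
    # n + 1 is defined as n union {n}
    return n | {n}

one = succ(zero)
two = succ(one)
three = succ(two)

# Counting and ordering come for free:
assert len(three) == 3   # cardinality is just set size
assert two in three      # "2 < 3" is just set membership
assert two < three       # ...or proper-subset comparison
```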

Also: arithmetic != mathematics. Heck, most of the mathematicians I know can’t do arithmetic in their heads at all. Numbers are just symbols, and once you get to a certain point, they’re not even symbols that work all that well.

Pierce: Awesome link on the subway map of science… that’s a definite keeper. That neatly provides a working context for further exploration. I wonder how it could be represented as a database structure that could be annotated? I’m thinking something like this gives the history of science a “terrain”, which can only help absorption. Your comment also makes me realize that while the history/context is interesting, ultimately the point is to be able to DO the math, which means getting into the details. I suppose that’s the next step: to gather a set of problems that provide insight for each stage of application and historical development. Amanda’s story, for example, would be a good structure for that; it just needs problem sets to go with each stage. Awesome comment, btw.

Family Lifeboat: Awesome! I just ordered a used copy… thanks for the heads-up! I’m looking forward to blowing a gasket :)

King’s Rook: Do you know of a logical roadmap to mathematics? I guess this also raises the question, “what is real math?” In my own case, there are specifics: I never really mastered all the various rules of algebra, differential equations never really came to me, I hated linear algebra and geometry, and calculus seemed rarefied and somewhat lifeless. I wish I had a more intuitive grasp of what I was doing with the more advanced concepts, because I couldn’t manipulate them well. So I am going back to the basics to see if I’ve somehow missed the point, and want to rebuild a solid foundation. I remember one Indian student in grad school who had a phenomenal mathematics foundation and was making connections between equations I couldn’t grasp at all. I did very poorly in that class (I think it was numerical methods), and the experience has stuck with me. Coupled with the general lack of connection I had with mathematics from elementary school through undergraduate college, despite it being important for several classes, I have the desire to revisit it now to see if any of it will make more sense. What I know now about how I learn might make the difference.

There are a lot of very good popular books on mathematics out there. Some of my favorites are: Infinity and the Mind by Rudy Rucker; The Mathematical Experience by Hersh and Davis; Gödel, Escher, Bach by Douglas Hofstadter; anything by Eli Maor, Martin Gardner, and Morris Kline. Most of these are historical and/or philosophical in their approach, which works well for me, but YMMV.

I’d also recommend looking at some problem/puzzle websites like MathsChallenge or Project Euler. Finally I’ve collected some lists of books for learning math on delicious.

Good luck and have fun.

I meant to add two other resources for seeing/learning the scope of mathematics. The first is The Math-Atlas which has a lot of information hidden behind an old-style graphics presentation. The other is another book The Princeton Companion to Mathematics which is more of a single volume encyclopedia of topics on math but is very well edited and contains some excellent chapters.