1 + 1 = uhmm...

In my years at MIT, I have noticed something interesting. Not a single one of us is capable of basic arithmetic any longer. Math? Sure. Integrals, algorithms, proofs? No problem. But how much do I owe for my share of dinner?

I'm quite certain that when I started at MIT, I had the ability to add, subtract, multiply and divide. Somewhere along the line, I lost it. Perhaps it popped off the top while getting filled with other skills, or got displaced onto an array of electronic gadgets that can do computations far more efficiently than I can.

Thus is born the terror of dining with geeks. It's not uncommon to go out to dinner with a crowd of people, maybe even as many as 8 or 10. The food is good, the conversation is great, and then the bill comes. A pall of silence falls over the table. How much does everyone owe? This could take hours!

First off, math majors aren't allowed to even get near the bill. Once you get out of high school, mathematics has nothing to do with numbers. It's symbolic manipulation and proofs. A math major might be able to prove that it is theoretically possible to find an algorithm to compute what people owe, and may even show that such a system works when graphed in non-Euclidean space, but will stare at you blankly when asked for an actual total. Not only that, but any math major will consider basic arithmetic beneath their station in society.

Then, there are the computer scientists. As a friend points out, computer scientists only know about 0, 1, and infinity; everything else is a special case. (That's a joke, you can laugh now.) CS folk are probably among the most ornery people, along with humanities majors who call themselves "artists" and pronounce it "arteeeeest." CS majors dwell on interesting systems and odd occurrences. (Hey, look at who's writing this page.) It's like when you have something caught between your teeth, and you can't stop exploring it with your tongue.

Enough with the bad analogies, though. Theoretical computer scientists might be able to prove that the algorithm used to arrange tables in the restaurant provides the optimal placement in O(n log n) time, but computing a 15% tip will leave them blubbering on the floor. Software engineers may discuss the design of a program for the Palm Pilot that would efficiently collate and compute billing information, and then broadcast totals to everyone via IR...but probably won't get around to writing it, and certainly won't figure out the bill themselves.
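For what it's worth, the program the software engineers keep not writing is about five lines long. Here's a minimal sketch (in Python rather than on a Palm Pilot), assuming an even split and a flat 15% tip; it makes no attempt to handle who ordered dessert:

import math

# A rough sketch, not anyone's actual system: even split, flat 15% tip,
# each share rounded up to the next cent so the table isn't left short.
def split_bill(total, people, tip_rate=0.15):
    """Return each person's share of the check, tip included."""
    with_tip = total * (1 + tip_rate)
    return math.ceil(with_tip / people * 100) / 100

# Nine diners and a $187.40 check come to $23.95 apiece.
print(split_bill(187.40, 9))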

There are only two resolutions to such a dinner trip. If you're with a good crowd, everyone will throw in a Yuppie Food Coupon (a 20 dollar bill) and the total is good.

If you're unlucky, everyone throws in some money, and it's $10 too short. People begrudgingly argue about the check, realize they forgot they ordered dessert, play with electronic toys, and throw in a dollar or two more. Repeat until you get a reasonable total.

Whatever the reason for this arithmetic abyss, this numerical nightmare, this mathematical menagerie, it's a good thing that computer geeks are well paid today. If not, they could quickly starve into extinction. There's probably a simple solution to this problem; it's likely as easy as two plus two...if only I could remember what that is.
