25 March 2014

I didn’t study computer science in college; I studied math, with a bit of creative writing on the side. If I mention this to someone, they sometimes respond with a comment about how that “makes sense” because programming and math are both such “logical” disciplines.

That’s a silly sentiment. Are anthropology and biology somehow less logical disciplines than math or computer science? Does programming demand that much more abstract reasoning than being a lawyer or doctor?

If you set aside the left brain/right brain pseudoscience, though, I admit I did learn a few useful things studying math that I might not have learned studying English. They don’t have anything to do with abstract algebra or number theory. Rather, I picked up some tricks for attacking and understanding large, complicated systems. Here are a few of them.

Notation is an *invention* and a *technology* as much as a transistor or plane is. Rigorous notation is the foundation of modern mathematics; concise algebraic notation may be one of the greatest inventions of the last thousand years.

Don’t try to outsmart yourself – use this wonderful notation technology. Write things down, and write them down in as rigorous and concise a way as possible. Use notation to guide you. Causality diagrams, state machine diagrams, graph theory, even UML can be invaluable for understanding programs. Make the effort to set your problem down in notation.

All programs are special cases of more general programs. This blog is a special case of a web page, which is a special case of a client-server program. Try to imagine which generic class of program your particular program is a special case of. By attacking a problem in an idealized form, you detach yourself from unnecessary details and frivolities. You’re clear to focus on what’s important.

A concrete example of this is the application of Dijkstra’s algorithm to link-state routing. Sure, you could try to imagine a routing algorithm by thinking about packets, routers, and links, but the problem is much cleaner when imagined with vertices and edges.
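To make the idealization concrete, here is a minimal sketch of Dijkstra’s algorithm over plain vertices and edges; the toy `network` and its weights are my own illustration, not anything from a real routing table.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source.

    graph: dict mapping each vertex to a list of (neighbor, weight) pairs.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# A toy "network": routers are just vertices, links just weighted edges.
network = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(network, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```

Nothing here knows about packets or routers; that’s the point – the generic graph version is the one worth understanding.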

Genericity is the lack of constraints on a model, but we often gain leverage on a model by *applying* constraints to it. Thus, genericity is somewhat opposed to leverage, and you can imagine them as forming a sliding scale.

However, this scale is not smooth. Some models offer a great deal of leverage yet keep plenty of genericity. These models are solid gold: you should take special care to identify and understand them.

In mathematics, models based on relations, functions, sets, and groups are dominant. In programming, special emphasis is often placed on graphs, functions, and partially-ordered sets.

The dominant lower-level model in programming is undoubtedly the state machine, which provides a flexible but powerful way of understanding all kinds of programs. Take special care to recognize when you can model your program as a state machine.
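As a sketch of what “model your program as a state machine” can mean in practice, here is the classic turnstile example – a set of states, an alphabet of events, and a transition table. The turnstile itself is my illustration, not the post’s.

```python
# States: "locked", "unlocked". Events: "coin", "push".
# The whole program is just a lookup in a transition table.
TRANSITIONS = {
    ("locked", "coin"): "unlocked",
    ("locked", "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def run(state, events):
    """Feed a sequence of events through the machine; return the final state."""
    for event in events:
        state = TRANSITIONS[(state, event)]
    return state

print(run("locked", ["coin", "push", "push"]))  # locked
```

Once the states and transitions are written out like this, every behavior of the program is enumerable and checkable, which is exactly the leverage the model buys you.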

Programs, like mathematical theories, are often too complicated to understand all at once. So start small; try to understand n=2 or n=3 before tackling n=1000. Try a few corner cases and try to understand how they relate to the general case. You don’t gain understanding by staring at a problem for hours until you have an epiphany – you gain understanding by chipping away at the edges until the core is laid bare.

All these techniques proved useful to me in time, but the best lesson I ever learned from studying math was humility. Studying math, I quickly reached a point where intuition alone could not provide me with answers.

Nowadays, when I make mistakes while programming, I can almost always trace them back to some arrogance on my part. So my last piece of advice is to take your time. Write things down, draw a diagram or two, try to think about the more general case. You’d be surprised how helpful a bit of humility and hesitation can be.
