Naming

Digital Research Inc (yes, the CP/M folks) had a debugger once. The author of the debugger was a linguist, and thus all of the identifier names were in Russian. There were new control structures invented out of macros (things like unless and reprise and whenever, all poorly armored against side effects). There were clever constructs, such as memory moves that deliberately overwrote each other in order to accomplish fills (the author was proud of that bit, I believe, and crowed about it in the comments). There was a bunch of fluff code (some C programmers feel compelled to reinvent the standard C library with every new project), and a fair amount of stuff that was just random and bad. But worst of all were the names. Russian wasn’t the author’s native language; hallway rumors had it that he’d written other programs in assorted Scandinavian and European languages. Anyone wading into the code was going to have a tough time. And sure enough, that debugger was buggy as hell.

I guess it was cute. I’m sure it was a disaster. DRI isn’t around anymore, and part of the reason (I believe) was the culture that allowed this kind of bullshit to happen.
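To make the “poorly armored” complaint concrete, here’s a sketch of the kind of macro trouble involved. The names unless and reprise come from the story above, but the original DRI macros are long gone; these definitions are reconstructed stand-ins, not quotes from that debugger:

    #include <stdio.h>

    /* Reconstructed stand-ins -- not the actual DRI macros. */
    #define unless(cond)     if (!(cond))  /* armored: argument parenthesized */
    #define unless_bad(cond) if (!cond)    /* unarmored: precedence bites     */
    #define reprise(n)       for (int i = 0; i < (n); i++)

    int main(void)
    {
        int a = 1, b = 0;

        unless (a && b)            /* expands to if (!(1 && 0)) -- fires    */
            printf("fires, as intended\n");

        unless_bad (a && b)        /* expands to if (!1 && 0) -- never does */
            printf("you will not see this\n");

        /* reprise() re-evaluates its argument on every pass, so any side
         * effect hidden in it runs once per iteration: */
        int x = 3, passes = 0;
        reprise (x--)              /* the bound shrinks as the loop runs */
            passes++;
        printf("passes = %d (a plain loop to 3 would give 3)\n", passes);

        return 0;
    }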

Names are important. I and J and K are probably okay for loop control variables (though I doubt it), but as globals they are right out. The declarations of things like Klepesta and Barbados had better have damned good comments. I once called something DefineGuidRightFuckingNowDammit and it got the message across. It also had a comment next to it explaining the name (and the time of day, which was like 2 AM, just before a release).

Tradition has it that if you know something’s true name, you have power over it. I’m guessing that quite a few programmers don’t know the true names of the things they are manipulating, and are correspondingly powerless when it comes to figuring out what the code does.

Names get even more critical in object-oriented design. Experienced designers know that seemingly great names like Entity or Process are so overloaded as to be meaningless. One of the best tools in a designer’s bag of tricks is a good thesaurus (I’m partial to Roget’s). Often you need a small pantheon of somewhat-related names, and this is why I believe that good designers are also compulsive readers. Command of language is hard to overestimate.

And if I see another set of variables named ii and kk I am going to scream.

Land Grabs

Keywords are bad, yes? Good languages are defined with a minimum of keywords. Bad languages (e.g., early relics like COBOL) grew up before the age of such minimalism, and are shot through with keywords.

Food for thought: Every time you use #define you’re defining a new keyword. Not so cute any more.
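A tiny illustration of that point. The macro here is made up, not from any real header, but once a #define like it lands in a shared header, the identifier is effectively reserved in every file that includes it:

    #include <stdio.h>

    /* Hypothetical header line -- "buffer" is now off-limits as an
     * identifier everywhere this is included. */
    #define buffer 256

    int main(void)
    {
        char data[buffer];      /* reads like a built-in constant     */
        /* int buffer = 0; */   /* expands to "int 256 = 0;" and dies */
        printf("%zu bytes\n", sizeof data);
        return 0;
    }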

For years (and this may still persist) the Macintosh had a "#define nil 0" in the core headers, and all of the sample code used it; in other words, Apple had added the keyword nil to all Macintosh programs written in C. The values of true, TRUE, false and FALSE are so varied and re-re-re-defined that dependencies on these are one of the first things to clean up when merging one body of code into another. The number of programs that mis-define NULL is astounding. The variations on “a typedef for a 32-bit integer” seem nearly infinite, and are sometimes frighteningly wrong.
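Here’s a sketch of why those dependencies hurt, with made-up values and a hypothetical file_exists() rather than quotes from any real SDK header:

    #include <stdio.h>

    /* One code base's idea of truth... */
    #define TRUE 1
    /* ...and another's; uncomment when merging and enjoy the fallout.
    #define TRUE (-1)
    */

    typedef long int32;   /* "a 32-bit integer" -- frighteningly wrong on
                             LP64 systems, where long is 64 bits wide    */

    static int file_exists(const char *path)
    {
        (void)path;
        return 2;         /* perfectly true in C, yet equal to nobody's TRUE */
    }

    int main(void)
    {
        if (file_exists("x") == TRUE)  /* tests TRUE-ness, not truth */
            printf("found\n");
        else
            printf("missed it\n");     /* this is what prints: 2 != 1 */

        printf("sizeof(int32) = %zu\n", sizeof(int32));
        return 0;
    }

The cleanup during a merge is mechanical but tedious: hunt down every == TRUE and every local typedef before the two header sets ever meet.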

Rules of Thumb

If you think it’s cute, it probably won’t be in the morning.

There are three kinds of cleverness: Smart, Coyote and Stupid. Smart-clever is great fun and will earn you points in the afterlife. Coyote-clever will grate on people’s nerves, but they’ll respect you for it. Stupid-clever will get you dumped off of a tall building, and people will pee on your grave.

Rewrite dodgy code as early as possible. Small bad decisions grow into large bad decisions, and they are more easily corrected before they metastasize or solidify.

The right forum for obfuscation is the International Obfuscated C Code Contest.


One Response to Naming

  1. Steve says:

    “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?” — Brian Kernighan

    Or, paraphrased and more to the point:

    “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.” — (scavenged off of Google)