"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." — Brian W. Kernighan*
This principle is not observed by programming beginners, who know enough to get themselves in trouble, but often not enough to get out of it. Anyone who has been coding a long time (and believe me, I have been coding a long time) knows not to get too smart lest they write code they cannot debug. It's slightly different when you are struggling to establish an architecture (are we a library or a framework?), but once the production environment is established you really want to be able to crank it out without having to rethink each corner case.
I even once went so far as to write a pre-processor for BASIC PLUS that operated on BASIC lines (which had to be numbered, whereas the input to the pre-processor did not) much as an assembler did to lines of symbolic machine code. In other words, I made BASIC code relocatable and made library sharing between projects much simpler. Because it was really more like a linking loader than an assembler (though it had features in common with both), I called it Blink. But that was thirty-odd years ago, when I was writing accounting systems. It certainly let us crank the code out. Happy days.
Of course Blink no longer exists. I am not one of those programmers who keep every line of code they have ever written; I regard most of it as ephemeral: built to perform a task, and no longer relevant once the task is complete. I do not envy the curators of computer museums, who must decide what is worth keeping and what can be kept running. You can do that with hardware, just about, but with software the profusion makes it impossible to track what's going on. Perhaps soon the open source world will find fixes for this bug; certainly the appearance of public DVCS systems will help. Some software, however, seemingly goes on for ever and ever.
It was very pleasing earlier this year to see a bunch of BBC Micros of ancient vintage still doing what they were designed for thirty years later. I wonder whether we'll ever see a 30-year-old iPhone anywhere but a museum? (To be fair, you don't see many BBC Micros nowadays). There are relatively few engineers building control systems and the like for a lifetime much longer than that.
Of course most systems of any age have been modified somewhat from their original purpose. You build a billing system, then the sales department come along and say "we could double our revenues if we could bill more flexibly," and the dance begins. You fix your programs to handle the new requirements, they come along with even newer requirements, you add more fixes, and so on. Unless you are very, very disciplined and are working with well-designed, well-written code (if someone else wrote the program badly, you may be out of luck), you can end up finding that the change you make, while it meets the new requirements, no longer meets earlier ones because of unintended consequences. In simple terms, your program has become so complex that fixing it in one place breaks it in another.
In the software world we can use regression tests to alleviate the worst of this pain (in addition to the unit tests we use to establish that basic functions operate correctly). Whenever you find an error in the program, you write a test that fails with the problematic release but passes once the system is fixed. This has the advantage that if your changes cause some unanticipated failure, there is a high probability that at least one existing test will fail. The presence of such regression tests sets a sort of "high water mark" for software performance: the system has to be at least good enough to pass all the tests, or something is broken. In the presence of the tests we can refactor our code (reorganize and re-structure it) with reasonable confidence that its behaviour has not changed.
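The mechanism can be sketched in a few lines of Python. The billing function and the bug it once had are invented for illustration; only the pattern matters: the regression test pins down a failure we have already seen, while the unit test establishes baseline behaviour.

```python
# A minimal sketch of a regression test. The billing function and
# its historical bug are hypothetical, purely for illustration.

def compute_bill(units: int, rate_pence: int) -> int:
    """Bill for metered usage, in pence.

    An earlier (hypothetical) release returned a negative total
    when the meter reading went backwards; the max() guard is
    the fix.
    """
    return max(0, units) * rate_pence

def test_regression_negative_reading():
    # Failed on the buggy release; must keep passing forever after.
    assert compute_bill(-5, 10) == 0

def test_basic_billing():
    # Ordinary unit test: establishes that basic function works.
    assert compute_bill(100, 10) == 1000

if __name__ == "__main__":
    test_regression_negative_reading()
    test_basic_billing()
    print("all tests pass")
```

Once such tests accumulate, any future change that reintroduces the old misbehaviour is caught immediately, which is exactly the "high water mark" property described above.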
Imagine now, if you can, a computer program two hundred years old**. Yes, I know, that pre-dates even Charles Babbage's Analytical Engine by decades. Never mind that. Just suppose that by some freak of probability some primitive computing technology had been developed by an unsung genius, and that its output is so valuable that it must be kept running†. The order of society itself depends on this program running, and yet it has never had a single test written for it. Don't blame the authors; when the Constitution was written there was no such thing as test-driven development.
And yes, I am talking about the law of the land as something in need of refactoring. In just the same way as software engineering has benefited from test-driven development, so the law would, in my immodest contention, benefit from principle-driven development. By this I mean that the lawyers, when proposing a law, should list some desirable outcomes (tests) which together capture the law's purpose. If we could at least get agreement on what such principles might be, and on the fact that they are desirable, then we might establish benchmarks for the operation of a law and be able to reject amendments that violate the principles (break the tests, in coding terms).
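To make the analogy concrete, here is a deliberately playful sketch, with every rule and number invented for illustration: a law is modelled as a function from a citizen's income to a tax bill, principles are predicates the law must satisfy, and a proposed amendment is rejected if it breaks any of them.

```python
# A playful sketch of "principle-driven development" for laws.
# The tax rules, figures, and principle names are all invented.

def tax_law(income: int) -> int:
    """Tax owed under the current (hypothetical) law: flat 20%."""
    return income * 20 // 100

def principle_progressive(law) -> bool:
    # "Higher earners must owe strictly more than middle earners."
    return law(200_000) > law(50_000)

def principle_nothing_owed_on_no_income(law) -> bool:
    return law(0) == 0

PRINCIPLES = [principle_progressive, principle_nothing_owed_on_no_income]

def amendment_is_acceptable(amended_law) -> bool:
    """Reject any amendment that violates an agreed principle."""
    return all(p(amended_law) for p in PRINCIPLES)

# A lobbyist's amendment: cap everyone's tax at a flat 1000.
def loophole(income: int) -> int:
    return min(tax_law(income), 1000)

print(amendment_is_acceptable(tax_law))   # True
print(amendment_is_acceptable(loophole))  # False: breaks progressivity
```

The point of the sketch is not that legislation can literally be executed, but that once the principles are written down, checking an amendment against them becomes a mechanical act rather than a matter of rhetoric.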
Just as we sometimes get our tests wrong and have to rewrite them, legislators will occasionally get the principles wrong and need to rewrite those. But a discussion of principle would be a matter for all-out debate, whereas it seems to me that modifications of the law would be less contentious as long as none of the underlying principles were violated (in other words, as long as the law introduces no regression errors).
The present laws are full of special cases, introduced because of the lobbying of those with special interests or to suit one particular constituency. It's time we stopped placing so much emphasis on passing new laws and decided instead to add principles to the existing law so that we could start to detect more easily when the law started to diverge from society’s desires about the way it operates. In time the law could be cleaned up in much the same way as a crufty old program can be re-engineered to bring it in line with more modern requirements.
At present the law is a big ugly ball of string, and there are many professionals making a good living finding and exploiting loopholes that operate to the advantage of their clients. We need more foresight, and we need a legal system that effectively says "this law cannot be amended, and is not intended to operate, to provide tax benefits to those who do not require them" or "this law cannot be used to the benefit of anyone with above-average income." While this isn't a perfect proposal, it would perhaps serve to focus people's interest on those who are specifically intended to benefit from the passage of particular laws, and the principles might over time become an accepted set of goals for new legislation.
When I think of how crufty code gets after just a few years I shudder to think what the law must look like from the inside. It's certainly obvious that the legislature has not been operating "of the people, by the people, for the people." It's time we changed that. Since I have no vote I'd appreciate it if my voter friends could execute this change at the first available opportunity.
* Merely one in a very long list of achievements, as any Internet search will reveal
** Or, perhaps, 236 years old
† Believe it or not, at the time of the "year 2k" panic some banks discovered they were running (in compatibility mode) some programs originally written in 1400-series autocode for which they no longer had the source. This would have been more surprising back in the days when banks were regarded as reliable and responsible. Happy days.
1 comment:
Also, if principles had to be stated before a law could be passed, the judiciary would have a better guide for how to interpret the law and whether it was unconstitutional. Remember when Kagan trotted out the congressional record on DOMA? It shows the real intent of that law was to express displeasure with gays.