Believing your unit tests of course.
Setting aside statements by some proponents of agile processes suggesting that commenting is not required in well-factored code, most folks I talk with place some value on comments in code and write them with varying degrees of religiousness. One of the real downsides of commenting code, and a point not lost on the extreme XP (XXP?) set, is that commentary in code has to be maintained. Why this is so becomes quickly and sometimes cruelly obvious if you, as a client of some chunk of code, assume the comments actually reflect reality and they don't. I liken this feeling to me playing basketball with an NBA star. I'm driving to the basket all set to score and this guy fakes left, fakes right, steals the ball and I'm left there with my finger in my nose muttering an eloquent "Huh?".
The issue, of course, is one of security premised on assumption. The assumption that the comments are up to date and accurately describe what the code is doing gives one a false sense of security when making use of the behavior described. It doesn't take too terribly long to realize that there are some comments you can trust, and some you probably shouldn't. The JDK's comments you can probably trust, ignoring bugs and other unintentional deviations. Meanwhile it's at least been my experience that comments in one's own shop should not be trusted (unless you work for Sun and you happen to work on the JDK, of course). Accurate commentary is just not part of the product, and vanishingly few development organizations are concerned with long-term efficiency to the point of investing in it.
If you've been in the business for long enough, say about a month, the above observation should strike you as nothing if not banal. You're either nodding your head knowingly, resigned to the realities of professional programming, or you're lucky and you've found a place where the fish always bite, metaphorically speaking. Best just to keep quiet and keep reeling them in. Anyway, for the nodders in the group, I'm actually here to talk about unit tests. Psych.
Ignoring several realities, writing code that implements well-stated functional requirements is pretty straightforward. You take stuff in, apply some transformations, and generate a result. Unquestionably some things are harder than others, and I'm not going to spend time discussing or defining "well stated". Suffice it to say that if someone can tell me what something is supposed to do, I can build something that will do it. But, as I said, I'm ignoring several realities in making that statement. One of those realities is that the thing has to finish in some reasonable period of time. Another is that the thing can't just crash or fail in the face of erroneous but reasonably anticipated input. In some environs these issues fall under the heading of non-functional requirements. Personally I just call this sort of stuff the fun bits. Anyway.
Unit tests are almost exclusively focused on what I'm calling the easy bits of a code chunk: given a set of inputs, does it generate the correct output. They seem to rarely, if ever, pay much attention to detecting race conditions, deadlocks, or combinatoric performance issues. And this can, probably does, lead to a false sense of security: an assumption that the unit tests define the totality of a system's behavior. Unfortunately or fortunately, depending on your perspective, this isn't the case.
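To make the gap concrete, here's a minimal sketch in Java using a hypothetical Counter class of my own invention (it's not from the post or the JDK). The single-threaded, input-to-output style check passes every time, while the concurrency hazard sails right past it:

```java
public class CounterDemo {
    static class Counter {
        private int count = 0;
        void increment() { count++; }   // not atomic: read, add, write
        int get() { return count; }
    }

    public static void main(String[] args) throws InterruptedException {
        // The "unit test": single-threaded, inputs in, output checked. Always passes.
        Counter tested = new Counter();
        for (int i = 0; i < 100; i++) tested.increment();
        System.out.println("single-threaded: " + tested.get());

        // The untested reality: concurrent increments can interleave and be lost.
        Counter shared = new Counter();
        Thread[] threads = new Thread[8];
        for (int t = 0; t < threads.length; t++) {
            threads[t] = new Thread(() -> {
                for (int i = 0; i < 100_000; i++) shared.increment();
            });
            threads[t].start();
        }
        for (Thread t : threads) t.join();
        // 8 * 100_000 = 800_000 expected; an unsynchronized counter will
        // typically come up short, and no functional unit test notices.
        System.out.println("concurrent: " + shared.get());
    }
}
```

The point isn't that the race is hard to fix (a `synchronized` keyword would do it here); it's that nothing in the green bar told you it existed.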
It's probably the case that writing unit tests that verify there are no deadlocks in a system is simply too expensive to do with the routine frequency of functional unit tests. Or maybe it's too hard, one begetting the other. I dunno. But, at least where I've looked, there aren't too many of these non-functional unit tests. So you can't really take unit tests at face value. Unless you're working on something small, or a single-user app, you probably need to reserve time in the schedule to test for these things and, more, time to fix them. Unless your crew happens to be filled with nearly omniscient demi-gods, the likelihood of a non-trivial system having performance issues and such approaches certainty.
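The performance blind spot works the same way. A sketch, again with a hypothetical helper (`dedup` is my own illustration, not anyone's production code): the functional check on a small input passes, while the quadratic runtime only shows up at sizes no unit test bothers with:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class DedupDemo {
    // Functionally correct, and the small-input check below passes,
    // but the nested linear scan makes this O(n^2).
    static List<String> dedup(List<String> in) {
        List<String> out = new ArrayList<>();
        for (String s : in) {
            if (!out.contains(s)) {  // linear scan per element
                out.add(s);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // The "unit test": correct output for a tiny input. Green bar.
        System.out.println(dedup(Arrays.asList("a", "b", "a")));

        // The untested reality: doubling the input roughly quadruples the time.
        for (int n : new int[] {5_000, 10_000}) {
            List<String> big = new ArrayList<>();
            for (int i = 0; i < n; i++) big.add("item" + (i % (n / 2)));
            long start = System.nanoTime();
            dedup(big);
            long ms = (System.nanoTime() - start) / 1_000_000;
            System.out.println(n + " elements: " + ms + " ms");
        }
    }
}
```

A `HashSet` would make this linear, but that's exactly the sort of thing you only discover by reserving time to go looking for it.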
So if you're looking at your unit test results and you're congratulating yourself on building a quality chunk of the world or, more likely, someone not quite so in touch with the system is doing the congratulating for you, be aware that it can pay large dividends to ask "Are you sure?"
Rick Kitts has been making a living writing software for a little while. He's started a company or two and worked at bigger companies, but mostly at startups. Constantly on the lookout for things to help him build better systems, he's a bit of a tool and process slut, though he can't bring himself to try C# or get serious about UML. Go figure. He's convinced being invited to have a weblog on Artima is the result of some glitch in the matrix. He's keeping quiet about it though.