The engineer with that harried, vaguely irritated yet highly competent demeanor we all wear so well ducks from the cameras like a common CEO or accountant as he enters the courthouse to defend his team's code from critical errors and a security hole that festered in the absence of proper state-required methodology.
Meanwhile a similar case is underway in the federal courthouse a block away, where a developer is skirting the blame for his arrogant and naïve new implementation of an otherwise useful design pattern by pointing the finger at even worse security holes in a major vendor's underlying platform. The vendor's platform plus his app produced an impressive pile of dung infested with a diverse society of bugs that cost customers millions of dollars.
I can just imagine.
Accountability made headlines across many professions this year. Yet despite mountains of buggy code and viruses, security holes so ginormous that no self-respecting MS Office macro-writing script kiddie could reasonably resist them, and poorly designed implementations of poorly understood, informally documented requirements, software engineers remain sheltered behind liberal EULAs and ill-defined ethical expectations, facing a public trained (thanks, Microsoft) to expect predominant failure. Will software companies, and perhaps even individual engineers, one day be held accountable for egregious errors and poor development and testing methodologies?
In a way, I hope not. I don't want creativity and innovation, and all their necessary bleeding edges, to be dulled for fear of attracting legal sharks. Yet at the same time, when tinkering becomes a profession and cool creation becomes a critical set of business processes driving the world's engines, I can imagine some sense of accountability enforced for the betterment of the craft and profession. I confess I don't have good answers as to how, exactly, but I shan't let that keep me from spouting off about it.
End User License Agreements (EULAs) really are unbelievable. The click-through experience is equivalent to driving across a bridge and being stopped at the entrance and asked to sign a document (or better yet, for the sake of the metaphor, using your EZ Pass) stating that the bridge may well break suddenly and for any reason, and that when this happens your surviving relatives won't be entitled to any compensation whatsoever, not even something to cover the cost of a small bouquet of something purple for your funeral.
When companies make investments, they are often also investing in the reduction of risk, buying the presence of someone else's buttocks to place on the line as a substitute for their own hindquarters. But aside from certain sterling consulting groups that really do command my highest respect (I am awed by the collective brainpower and professionalism of my friends at Thoughtworks, for example), it's buyer beware in IT.
I'd submit that a group of programmers who introduce a critical transactional error may cost an enterprise like Citibank more money than almost any virus writer, making run-of-the-mill development and platform dependency a much riskier prospect than most things threatened by whatever code scratches at the other side of the firewall. And I'd suggest that Microsoft's inability to enforce a security model has gone on so long, over so many product releases, that regardless of whether the individual virus writers are brought to justice, the mightiest vendor itself ought to feel, and be held, responsible in some way for this death of email and for the periodic attack-based lethargy of major business processes.
There should be a way to track who's accountable for which software errors, tracing this from in-house developers to consultants to vendors, and so on. Much easier said than done, of course, particularly if we really do wish to avoid stifling innovation and over-regulating.
Software is hard, often harder than building bridges. It is like building a billion bridges, none of which actually exists except at runtime. It is like building the concept of a bridge that can dynamically build itself beneath your car's wheels as it moves forward over an increasingly risky gap in space, all the while building similar bridges for others' cars in front, alongside, and behind yours, and then properly cleaning up the bridge when you've reached the other side. Software is hard, what with its intrinsic (and to me seductive) virtual nature, and keeping track of who's accountable for which moving pieces, particularly over time, is hard.
I was horrified by Gosling's story of the real-time engineer who introduced an error that apparently cost a test pilot's life when the system failed, and who was required to confess to the deceased's family. I don't want that level of accountability applied to my life as an individual engineer, and in the enterprise there are things called corporations to bear the brunt of accountability without sending large groups of individuals to the stocks, or to bankruptcy, or to face the unimaginable sorrow of a widow. I am not a fan of resolving issues by legislation (being generally irritated when anyone other than my daughter tells me what to do), but there are also things like the SEC to hold financial folks accountable, the FAA to hold aviators accountable, and so on, and I can't help but wonder if something like this is in our profession's future, for the betterment of developers, companies, and end users.
This issue seems to have come to a head earlier this week, after I'd started filling an emacs buffer with this directionless blogified rant: Microsoft now faces a class action suit in California as a result of security holes it has left lingering in its operating system. Interesting developments, these. Though their motivations certainly are not aligned with protecting the ethics and professionalism of the software development profession, lawyers are noticing that users have had enough, and have begun to try to point a few well-manicured fingers at some of those who might reasonably be held financially or even criminally accountable.
Management gets what it wants -- if management wanted programs to be free of errors, there are well-established ways of improving the development process to eliminate the cause of errors. Instead Microsoft's management wanted features, time-to-market, and "good enough" software.
For example, Microsoft is a compiler vendor as well as an OS vendor. They could have designed a language that makes buffer overruns almost impossible. They didn't look for the causes of their problems, blaming programmers rather than the system the programmers have to live in.
Whether you can make people responsible for the quality of the software they produce depends on what promises were made when the software was delivered. Responsibility rests with the person who made the promises. But you cannot force a person to make promises according to some pre-defined guidelines without taking that person's freedom away. And, by the same token, responsibility - and accountability - are possible only when one has the freedom to make promises.
That holds for many professions (and many human activities), not just software. I recently heard on NPR the case of a transatlantic flight where a passenger suffered a cardiac arrest above the ocean. The only doctor on the plane happened to be a heart surgeon. Upon examining the patient he determined that immediate surgery was necessary, or death would be almost certain. The problem was that an airliner, of course, was not equipped for open-heart surgeries. So the doctor had to improvise - they used whiskey for sterilization, and a coat hanger to cut the man's chest open with. The operation worked, and the man survived.
Of course, if the same doctor used that same equipment in the well-appointed operating room of a hospital, he would surely have been accused of gross negligence and irresponsibility. But under those circumstances, above the ocean, he was a maverick who saved a life.
I recently worked on a small project where the objective was to put together a database-driven Web site for a demo. The project's instigator specifically stated that possible data loss was just fine, and that the speed of delivering the application was by far and away the most important criterion. So no promises of reliability were made, and that was just fine with the customer. For a different customer, delivering that same piece of software would have been irresponsible and negligent.
With regard to personal productivity software or PC operating systems, most people are fine with the possibility of viruses invading their computers from time to time. I guess in return for accepting that, they get bells and whistles features, and a reasonable price. If that arrangement didn't work, they would certainly stop buying that software.
If companies could easily and accurately identify the programmers who cause these problems, they wouldn't be paying the top ten percent of producers only marginally more than the bottom ten percent.
Why, after all this time, we don't have software that can validate and grade software code -- at least to a first approximation -- is beyond me. Unless, of course, it's because we can't be held legally responsible for what we don't know -- even if that ignorance is of our own choosing.
> If companies could easily and accurately identify the
> programmers who cause these problems, they wouldn't be
> paying the top ten percent of producers only marginally
> more than the bottom ten percent.
Quite - and that's because it would cost them more dollars.
Economics works such that the tendency in most companies is always to push for more from less - this manifests itself in all sorts of ways:
(1) Demands for large numbers of features in impossibly short timeframes
(2) Demands for large numbers of programmers but no willingness to ensure that they are trained appropriately and mentored
(3) All programmers must be exceptional and there must be lots of them
(4) and so on...
Business isn't entirely to blame either - as a mentor I've seen many programmers cut corners with the justification that the work involved is "too hard" or "takes too long" - that's laziness pure and simple.
As an industry, we need to get our act together on all sides whilst accepting that people do, indeed, make mistakes. Once we've accepted that fact, we shouldn't then just ignore it - rather we should set about finding better ways of mitigating it and, in the worst cases, dealing with the fallout.
Whatever their goals were, their management probably made the right choice for the company, considering its dominant position in the market. And history is filled with other examples of superior products being beaten in the marketplace by "good enough" alternatives.
>> For example, Microsoft is a compiler-vendor as well as an OS vendor. They could have designed a language that makes buffer-overruns almost impossible. <<
They're trying, with .NET. It will just take time for their products to migrate over to it.
Give Microsoft a little credit, all the other mainstream operating systems out there are built on C/C++ as well. I bet there have been known buffer overrun exploits for every operating system in common use today.
> "I bet there have been known buffer overrun exploits for > every operating system in common use today" > > Didn't Niklaus Wirth create an operating system using his > semi-OO language Oberon? I bet that didn't have any > buffer-overruns. The Oberon language grew out of the Oberon operating system project that Wirth worked on. As far as I recall Oberon was not really object oriented. Maybe you are thinking of Modula-3? Oberon is still available at http://www.oberon.ethz.ch/. Genera, which ran on the Symbolics Lisp machines, is another example of an operating system written in something other than C or C++. Keep in mind that there are people who have written freely available applications in C who will pay you several hundred dollars if you can find any security exploits in them, including buffer overflows.
Perhaps we really are a young industry and profession, and not growing up all that quickly. Perhaps we simply do not yet have our act together.
Certainly we have no objective means of telling each other (much less managers and customers) with any certainty who is in the top 10% or the bottom 10%. Meanwhile, the average owner experience for a desktop computer remains a bit too close to the owner experience for an automobile in 1896, when you were qualified to own a car if you could rebuild a carburetor.
Perhaps pilots, accountants, and paramedics do not always regulate their organizations with fairness and accountability, but they do at least attempt to assert standards of certification and professionalism. They do a better job than we do. When they mess up badly, there are at least codes and standards to which to compare their behavior, and to consider revising and refining. They seem to learn collectively from their public debacles.
I don't see how we can avoid such codes and standards forever, or why we should try. As many others have said: we can try to establish these standards ourselves, or we can let lawyers make them for us. Their existence is inevitable.
Surely some meaningful horizontal expertise (as opposed to familiarity with vertical technologies du jour) can be measured objectively. Perhaps the Agile Methods movement has produced some of the first truly good material with which to attempt to measure levels and categories of programming craft.
For example, you are either familiar with version control systems and system integration, OOD principles, design patterns, refactoring, automated unit testing, and agile practices, or you are not. Standards for expertise in categories such as these will surely emerge for us all, one way or the other.
I thought your comments and story were very good, and enlightening. I question only your final paragraph:
> With regard to personal productivity software or PC
> operating systems, most people are fine with the
> possibility of viruses invading their computers from time
> to time. I guess in return for accepting that, they get
> bells and whistles features, and a reasonable price. If
> that arrangement didn't work, they would certainly stop
> buying that software.
I don't know that consumers generally feel empowered to make such a decision. I could be wrong, but I think that the presence of a monopoly forces them to accept this state of affairs. Regardless of whether the monopoly needs to be disrupted, I think it may need to be regulated and consumers protected, since they don't appear able to resolve the issue through purchasing decisions.
I also appreciate the difficulties involved in creating such a thing, as pointed out very lucidly by others who've responded. I dislike regulation myself, and we're bound not to get it right the first time, but I think it's probably inevitable as the profession matures, and I would rather have us in the community do it than have a vendor-specific group of engineers do it (a Microsoft or IBM can't regulate itself or define all the guidelines by which it should be judged, and career governmental policy makers can't define the regulations for all of us).
Really great comments, I appreciated reading them.