Artima Weblogs | Sean Neville's Weblog
The engineer with that harried, vaguely irritated yet highly competent demeanor we all wear so well ducks from the cameras like a common CEO or accountant as he enters the courthouse to defend his team's code against critical errors and a security hole that festered in the absence of proper state-required methodology.
Meanwhile, a similar case is underway in the federal courthouse a block away, where a developer is skirting the blame for his arrogant and naïve new implementation of an otherwise useful design pattern by pointing the finger at even worse security holes in a major vendor's underlying platform, where the vendor's platform plus his app produced an impressive pile of dung infested with a diverse society of bugs that cost customers millions of dollars.
I can just imagine.
Accountability made headlines across many professions this year. Yet despite mountains of buggy code and viruses, security holes so gi-normous that no self-respecting MS Office macro-writing script kiddie could reasonably resist them, and poorly designed implementations of poorly understood, informally documented requirements, software engineers remain sheltered behind liberal EULAs and the ill-defined ethical expectations of a public trained (thanks, Microsoft) to expect predominant failure. Will software companies, and perhaps even individual engineers, one day be held accountable for egregious errors and poor development and testing methodologies?
In a way, I hope not. I don't want creativity and innovation, and all their necessary bleeding edges, to be dulled for fear of attracting legal sharks. Yet at the same time, when tinkering becomes a profession and cool creation becomes a critical set of business processes driving the world's engines, I can imagine some sense of accountability enforced for the betterment of the craft and profession. I confess I don't have good answers as to how, exactly, but I shan't let that keep me from spouting off about it.
End User License Agreements (EULAs) really are unbelievable. The click-through experience is equivalent to driving across a bridge and being stopped at the beginning and asked to sign a document (or better yet, for the sake of the metaphor, using your EZ Pass) stating that the bridge may well break suddenly and for any reason, and that when this happens your surviving relatives won't be entitled to any compensation whatsoever, not even something to cover the cost of a small bouquet of something purple for your funeral.
When companies make investments, they are also often investing in the reduction of risk, buying the presence of someone else's buttocks to place on the line as a substitute for their own hindquarters. But aside from certain sterling consulting groups that really do command my highest respect (I am awed by the collective brainpower and professionalism of my friends at Thoughtworks, for example), it is buyer beware in IT.
I'd submit that a group of programmers who introduce a critical transactional error may cost an enterprise like Citibank more money than almost any virus writer, making run-of-the-mill development and platform dependency a much riskier prospect than most things threatened by whatever code scratches at the other side of the firewall. And I'd suggest that Microsoft's inability to enforce a security model has gone on so long, through so many product releases, that regardless of whether the individual virus writers are brought to justice, the mightiest vendor itself ought to feel, and be held, responsible in some way for this death of email and for the periodic attack-based lethargy of major business processes.
There should be a way to track who's accountable for which software errors, tracing responsibility from in-house developers through consultants to the vendors, and so on. Much easier said than done, of course, particularly if we really do wish to avoid stifling innovation and avoid over-regulating.
Software is hard, often harder than building bridges. It is like building a billion bridges, none of which actually exist except at runtime. It is like building the concept of a bridge that can dynamically build itself beneath your car's wheels as it moves forward over an increasingly risky gap in space, all the while building similar bridges for others' cars in front, alongside, and behind yours, and then properly cleaning up the bridge once you've reached the other side. Software is hard, what with its intrinsic (and to me seductive) virtual nature, and keeping track of who's accountable for which moving pieces, particularly over time, is hard.
I was horrified by Gosling's story of the real-time engineer who introduced an error that apparently cost a test pilot's life when the system failed, and who was required to confess to the deceased's family. I don't want this level of accountability applied to my life as an individual engineer, and in the enterprise there are things called corporations to bear the brunt of accountability without sending large groups of individuals to the stocks, or to bankruptcy, or to face the unimaginable sorrow of a widow. I am not a fan of resolving issues by legislation (being generally irritated when anyone other than my daughter tells me what to do), but there are also things like the SEC to hold financial folks accountable, the FAA to hold aviators accountable, and so on, and I can't help but wonder if something like this is in our profession's future, for the betterment of developers, companies, and end users.
This issue seems to have come to a head earlier this week, after I'd started filling an emacs buffer with this directionless, blogified rant: Microsoft now faces a class-action suit in California as a result of security holes it has left lingering in its operating system. Interesting developments, these. Though their motivations certainly are not aligned with protecting the ethics and professionalism of the software development profession, lawyers are noticing that users have had enough, and they have begun to point a few well-manicured fingers at some of those who might reasonably be held financially or even criminally accountable.
Sean Neville is a software architect at Macromedia, where he is focused on creating the Flex platform. His previous projects include the JRun application server and Flash-related products for J2EE and .NET. His experiences include membership in the JCP Executive Committee and numerous JSR expert groups; authoring articles, contributing to books, and speaking on enterprise topics; financial services app consulting; building doomed yet fun web startups; maintaining open source projects; and half-decent fiddling of Irish jigs and reels.