Among the more frequently cited truisms in software development is that software is never really "done": just as you complete one set of features or requirements, another feature request appears in your task manager. Indeed, some of the best software results from long-term dedication to a codebase, with code continuously refactored and cleaned up as new features are implemented. As a corollary, a codebase on which all work ceases tends to die out, and is often superseded by another project that solves the problems inherent in the abandoned code.
At the same time, business managers and customers like to have a definite deadline, or some other criterion, indicating that a project is complete. A clear notion of "doneness" is especially important in an enterprise, where a completed project allows a manager to re-allocate developers to new projects.
In a recent pair of blog posts, Steve Rowe and Earl Beede explore various notions of "doneness" from a developer's point of view. In Not Everyone Has the Same Definition of "Done", Rowe writes:
The problem stems from the fact that we rarely define the word. It is assumed that everyone shares a definition but it is rarely true. Is a feature done when it compiles? When it is checked in? When it can run successfully? When it shows up in a particular build? All of these are possible interpretations of the same phrase...
It is important to have a shared idea of where the finishing line is. Without that, some will claim victory and others defeat even when talking about the same events. It is not enough to have a shared vision of the product, it is also necessary to agree on the specifics of completion...
To establish a shared definition of done, it is necessary to talk about it. Flush the latent assumptions out into the open...
In Defining "Done", Earl Beede notes that frequently one part of a project maybe "under-done" while other parts are "over-done," resulting in wasted of effort, on the one hand, and unhappy users, on the other:
The decision of "doneness" has wide impacts as under-done creates creates defects, downstream rework, and lost opportunity costs while over-done wastes time and resource and incurs its own lost opportunities...
Committees or teams chartered with creating common process and practices occasionally find that the only place where they can garner agreement and claim success is in the trivial...
The issue is that the important stuff in software development, as in many parts of life, is contextual. What is going on in the project, the team, and the organization at the moment when the work artifact is completed all have an effect on the decision of done. You can't really spell out in advance what done looks like...
Contextuality demands that doneness can't be defined ahead of time but the costs of not being done are so high. The answer, I believe, is not in defining "done" but defining how to determine "doneness" within a context. The process I use I call my "good enough" criteria. That is, I have four criteria I use to help me decide if the work artifact is done to a level that is good enough for what the project needs to do.
The four criteria are:
Sufficient to Proceed. Is the work to a level that the next person who must take up the work has what is needed to do their job?
Appropriate for the Environment. Are the people who take up the work likely to understand it?
Sanity Checks. Has the work committed a classic mistake that can easily be detected by the review of a short checklist of critical attributes?
Feedback from Stakeholders. Do the critical stakeholders tell me that it is OK?
I find using the combination of these four criteria gives me insight into how done the work artifact is and is fully contextual. Process standardization zealots can take heart in the sanity checks and experience anarchists can rejoice in the feedback.
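To make the idea concrete, here is a minimal sketch of how a team might encode Beede's four criteria as a per-artifact checklist. This is not Beede's own tooling; the class name, fields, and example values are purely illustrative.

# Illustrative sketch only: Beede describes a judgment process, not a tool.
# The class name, fields, and example values below are hypothetical.
from dataclasses import dataclass

@dataclass
class GoodEnoughCheck:
    sufficient_to_proceed: bool        # can the next person pick up the work?
    appropriate_for_environment: bool  # will the people taking it up understand it?
    passes_sanity_checks: bool         # no classic mistakes on the short checklist?
    stakeholders_accept: bool          # do the critical stakeholders say it is OK?

    def is_done_enough(self) -> bool:
        # All four criteria must hold; any "no" means the artifact is not yet
        # good enough for what the project needs.
        return all((
            self.sufficient_to_proceed,
            self.appropriate_for_environment,
            self.passes_sanity_checks,
            self.stakeholders_accept,
        ))

# Example: a design document that reviewers accept but that trips a sanity check.
check = GoodEnoughCheck(True, True, False, True)
print(check.is_done_enough())  # False -- rework it before calling it done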
When is "done" truly done in your your experience?
Last year a group in San Francisco held seminars where noted authors were interviewed. "Authors" was defined broadly, including a political cartoonist. And the one I attended featured Aimee Mann, one of my favorite singer-songwriters.
(quotes are approximate)
She was asked "how do you decide that you are done with a song?".
"When I'm sick of working on it any longer".
For a lot of software modules (not the complete project, but libraries, etc...) this holds true as well. When you are sick of refactoring, fixing an obscure bug, adding one more feature request, it is done.
<blockquote>When is "done" truly done in <em>your</em> your experience?</blockquote>
When the project is canceled and all the finger pointing stops.
More seriously, it depends on the organization. If you're a vendor, software is your revenue stream so work doesn't stop on it until the revenue is less than the cost of working on it. (It's probably a more complex formula but you get the idea.)
For in-house development, the IT structure will determine it. If the structure is such that each software system has a dedicated team, then work will continue until the business unit associated with the system is shut down or sold.
If the structure instead puts developers into a pool of resources, then usually a committee determines which changes bring the company closer to its goals. In that case, a software system may never change after go-live.
i'm not sure who here is being serious and who isn't. :-)
at any rate, there are different phases of the project which could have different meanings of done. so when you've already shipped but are fixing things, that's a different done than when you are trying to finish a sprint.
focusing on the latter, i like done to mean something like "hey, we have good unit tests, good coverage, the code is there, everything is passing."
(that doesn't address things like "does it still meet what the customer wants"; those would be dealt with via the more meta scrum/sprint cycles.)
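as a concrete (and purely illustrative) version of that sprint-level bar, a team could gate "done" on the test suite passing and a coverage floor. the sketch below assumes pytest with the pytest-cov plugin installed; the package name and the 80% threshold are arbitrary examples, not a prescription.

# Illustrative "definition of done" gate, assuming pytest and pytest-cov are installed.
# The package name ("mypackage") and the 80% coverage floor are arbitrary examples.
import subprocess
import sys

result = subprocess.run(
    ["pytest", "--cov=mypackage", "--cov-fail-under=80"],
    check=False,
)

if result.returncode == 0:
    print("done by this definition: tests pass and coverage meets the floor")
else:
    print("not done yet: failing tests or coverage below the floor")
    sys.exit(result.returncode)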