How Agile Teams Should Use the Definition of Done

Summary:
The definition of done is an informal checklist that the team agrees applies to all pieces of work. But how does the definition compare to acceptance criteria? And should it apply to every task, or every story? How often should you review or change your definition? Allan Kelly helps you navigate your team's definition of done.

Hang around teams working on agile projects and you’ll frequently hear people talking about “done” and “done-done.” What they mean is that the work is not only completed but also complies with the common standard known as the definition of done. The work is both “done” and “done” to an agreed set of criteria.

The definition of done is an informal checklist that the team agrees applies to all pieces of work. The whole team is responsible for writing and approving the definition of done and for applying it to every story they work on.

When I say it’s an informal checklist, I mean there is no paperwork or formal sign-off process associated with a definition of done. It is an aide-mémoire, a reminder, and an agreement among team members that before anyone attempts to mark a story as done, it will pass all the points on the checklist.

One team I worked with had four items on the checklist:

  • JUnit tests written for code
  • Peer code review conducted
  • Product owner approved
  • Interface to third-party system double-checked

The team wrote this list on their team board, where it was clearly visible to everyone. Before any card moved to the done column, the team members would ask themselves: Have I done these four things?
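Because the checklist is just a conjunction of yes/no checks, its logic is easy to sketch in code. Here is a minimal, purely illustrative Java sketch (the enum names and the Story class are my own invention, not anything this team actually used) of the rule that a card moves to done only when every item is ticked:

    import java.util.EnumSet;
    import java.util.Set;

    // Illustrative only: the four checklist items as an enum.
    enum DoneItem {
        JUNIT_TESTS_WRITTEN,
        PEER_REVIEW_CONDUCTED,
        PRODUCT_OWNER_APPROVED,
        THIRD_PARTY_INTERFACE_CHECKED
    }

    final class Story {
        private final Set<DoneItem> ticked = EnumSet.noneOf(DoneItem.class);

        void tick(DoneItem item) {
            ticked.add(item);
        }

        // A card may move to the done column only when all four items are ticked.
        boolean meetsDefinitionOfDone() {
            return ticked.containsAll(EnumSet.allOf(DoneItem.class));
        }
    }

Nothing about the practice requires tooling like this, of course; the team in question needed only a board and a marker pen.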

Acceptance Criteria or Definition of Done?

Teams sometimes get confused between the definition of done and acceptance criteria, or they worry about the interplay between these two completion tests.

A definition of done is not an alternative to the acceptance criteria; it is a generic baseline for all stories. Each story brings its own special acceptance criteria. In effect, every set of acceptance criteria has an unwritten first item: “Conforms to the definition of done.”

Or, to put it the other way around, every definition of done has an implicit line item: “All acceptance criteria pass.”
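Either way you phrase it, the overall test is the same: a story is done when both lists pass in full. A minimal sketch of that symmetry, with names of my own choosing, might look like this:

    import java.util.List;
    import java.util.function.BooleanSupplier;

    // Illustrative sketch: "done" is the conjunction of the shared
    // definition of done and the story-specific acceptance criteria.
    final class Completion {
        static boolean isDone(List<BooleanSupplier> definitionOfDone,
                              List<BooleanSupplier> acceptanceCriteria) {
            return definitionOfDone.stream().allMatch(BooleanSupplier::getAsBoolean)
                && acceptanceCriteria.stream().allMatch(BooleanSupplier::getAsBoolean);
        }
    }

Neither list replaces the other; swapping which one you call the baseline changes nothing about the result.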

Perhaps surprisingly, I frequently meet team members who do not agree on what constitutes done! For example, one developer will only push a card to done after a code review, while another will not even ask for a code review. Without a general agreement on what done means, how can a team ever hope to be consistent?

This is where a definition of done helps. This isn’t something imposed from outside or above; it is important that the definition is the result of team involvement and agreement.

The aim of both acceptance criteria and the definition of done is to improve the quality of the code produced. Research—and programmer intuition—consistently shows that higher-quality code saves time and, therefore, money. When code is low quality, it must be repeatedly tested, fixed, retested, and fixed again. Time increases, costs escalate, and schedules disintegrate in the face of poor quality.

A Twist for Tasks

There is a twist on the definition of done for teams that break stories down into tasks. When teams break down work, the question arises: Does the definition of done apply to stories or to tasks?

As long as the team agrees, the definition of done can apply to either or both. Take the four-item bulleted definition I gave above. Applied at the task level, the JUnit tests and code reviews work fine. If there is something to show the product owner on the task, then sure, the owner can see it and approve it; but if it is an internal change (e.g., a database schema change), that check becomes trivial. Similarly, if the task doesn’t touch the third-party system, there is nothing to check.

As long as the team talks through the various scenarios and comes to an agreement on how they will use the definition, then it should work. Again, the important thing is to have consistent understanding within the team.

I once met a team that went so far as to keep two definitions of done: one for the task level and one for the story level. That might seem a bit over the top, but if the team thinks it helps and it doesn’t add to the administration burden, then why not?

Similar logic applies at the epic level, but because epics exhibit more variability, it seldom seems to be necessary to apply a definition of done at the highest level. Done for an epic is frequently a more subjective judgment. I certainly would avoid saying “All stories complete” for any definition of done that applies to an epic. Such criteria can lead to teams building stories that aren’t needed.

Working within Columns

A definition of done is normally, as the name implies, applied to work entering the final stage, namely, “done.” For teams using a visual board to track work, this means work entering into the final column of the board. But it is also possible to extend the idea of the definition of done across the board.

Another way of thinking about the definition of done is that it represents the preconditions for work entering the done state. Because nothing occurs between the previous work-in-progress state (board column) and the done state, the definition also forms the postcondition, or exit criteria, of the previous state and board column.

From here, it is an obvious step to think about the exit criteria for each state. Each column on the board can have its own definition of done. Few teams are so rigorous as to write such a definition for every state, but teams will frequently have “Acceptance criteria completed” as an exit condition on an analysis column.
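To make that concrete, here is a small illustrative sketch (the column names, the Card fields, and the checks are all hypothetical) in which every column carries its own exit criteria, and the classic definition of done is simply the exit criteria of the last in-progress column:

    import java.util.List;
    import java.util.Map;
    import java.util.function.Predicate;

    // Illustrative sketch: each board column has its own exit criteria.
    final class Board {
        record Card(String title, boolean acceptanceCriteriaWritten, boolean reviewed) {}

        private final Map<String, List<Predicate<Card>>> exitCriteria = Map.of(
            // "Acceptance criteria completed" as the analysis column's exit condition
            "Analysis", List.<Predicate<Card>>of(Card::acceptanceCriteriaWritten),
            // The team's definition of done as the final in-progress column's exit condition
            "In Progress", List.<Predicate<Card>>of(Card::reviewed)
        );

        // A card may leave a column only when that column's exit criteria all pass.
        boolean mayLeave(String column, Card card) {
            return exitCriteria.getOrDefault(column, List.of())
                               .stream()
                               .allMatch(check -> check.test(card));
        }
    }

The point of the sketch is only the shape of the idea: the done column has no special status; it is just the column whose entry condition happens to be the team’s definition of done.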

Reviewing and Updating the Definition

Finally, the definition is not set in stone. Teams should periodically—quarterly, perhaps—take their definition of done and review every item. Over time, tightening the definition should lead to higher-quality code. Conversely, an overly long definition might be self-defeating, as people will eventually consciously or subconsciously skip steps.

In fact, given long enough, I would hope that the items listed in the definition become so normal and ingrained that people comply with the definition without even thinking about it. At that point, the definition becomes redundant; teams remove it to create an even lighter process, or they rewrite it with new items to encourage even better quality.


User Comments

6 comments
Tim Thompson

There is and will ever only be one definition of done: release date. That date is often arbitrarily set by people who are not involved in any of the team's decisions and progress. That also means that features will ship even if acceptance criteria are not met and especially when testing either did not complete or did not even start.

A definition of done in the ideal sense comes into play only when continuous delivery is employed. Work on features and then merge them into production code once they meet acceptance criteria, testing is complete (less regression on the production branch), and the product owner approves. It does not matter if that falls on the end of an iteration; once a feature is complete, push it out, generating value for the customer.

And even with continuous delivery there is no such thing as 'done' or 'done-done'. Done is a rather subjective state because QA might not consider things done when there are many visual and UX issues, but sales might find it perfectly fine to demo and sell. Add to that the ugly nature of Agile fostering permanent non-committal. Nobody makes a decision because we can always "hit it in the next iteration". That means we constantly deliver half-baked, incomplete features that were rushed out just to meet some release date or sales requirement. All this Agile stuff sounds great on paper, but flops in reality because nothing ever gets done right the first time. The claim that Agile generates better quality software is just clever talk from Agile consultants. If things are not done right from the get-go the software is as bad as before, and neither Agile nor any other development approach will fix that.

In order to ever reach done we need to go back to proper requirements in sufficient detail and the necessary time to do the job well. Be prepared that this will not fit in your three-week sprint model, which really is more a hindrance than anything else. Lastly, by all means fix bugs, all of them that you know about, and do it now and before starting something new. It will generate more value and make things easier going forward.

September 19, 2015 - 8:05am
Allan Kelly

Tim,
It sounds like you have had a very bad experience of agile and have a lot of scars.

My guess is the root cause of the bad experience was a failure to respect quality. If you don’t expect quality then you are going to have a hard time getting any type of agile to work.

I agree with you that the ultimate definition of done is released and used; I’d also add evaluated in use.

You suggest we “go back to proper requirements in sufficient detail and the necessary time to do the job well” but I’m not sure we can ever “go back” to that position because I don’t think it ever existed.

Pre-agile requirements and specifications never contained sufficient detail. Even when people tried to write exhaustive documents they missed things, made mistakes, and could not foresee what would happen when things were built or how customers would see them.

As for time, the software industry has the time problem round the wrong way. How long something will take should not be the driving force; the question should be “How long have we got?” There is always more time, but usually value declines with time. The question we face is “What can we produce in the time we have?” or “What can we produce in time to capture some value?”

In other words: time is a constraint, as engineers we need to work within that constraint while doing the best job possible.

September 19, 2015 - 11:00am
Tim Thompson

Yes, indeed, I am scared a lot by Agile. For me it increased the amount of rework, increased the amount of stress, and significantly decreased the amount of time available to do the same work as before. The pace accelerated a lot, but not for developers, only for business analysts and QA. That is not specific to Agile; that is typical for all software companies. When time is short, analysis and quality are the first and often only things to get cut. Rather ironic, because those are the two areas that, if done well, save a lot of time during implementation and after release. I have yet to understand why exactly that is. I can only guess it is because BA/QA are considered second-class citizens. They often have no dedicated managerial representation, are always considered cheap workers, and are seen as easily backfilled and replaced. Developers are in a much better position: they are seen as not easily replaced, they were much more difficult to hire than others, and with the sometimes significantly higher pay grade they are seen as the ultimate experts that management blindly trusts.

As far as requirements are concerned, I never worked in a team where the product owner dropped the book of requirements on the desk and then checked back in when the team reported completion. I am not looking for a novel as a requirements document, but more for a tool that clearly defines the expectations of the outcome. If nobody tells devs and QA what the feature is supposed to do, we can only guess and assume. Typically, QA guesses and assumes something different than devs. If we detect such differences we take them back to the BA. That adds a lot of pressure because we need a decision before we can proceed with the work. It all adds up to stress and making quick and thus often not good decisions.

As far as respect for quality is concerned, yes, that is a problem. It has nothing to do with Agile; it is that customers pay for features and expect top-notch quality to be implied. They do not accept a late delivery even if that would get them a much better feature. Likewise, business does not want to spend more time on doing things right, because you cannot recognize revenue before delivery. In business finance-think it is apparently better to release a craptastic feature and fix it later than to do it right the first time and save a lot of expensive rework. Maybe my brain is not wired properly to understand the benefit of this insanity.

Time is in short supply and the supply is dwindling. Yet we keep using the same tools and processes and 'go Agile' with the expectation that the same work now magically takes less time. Properly analyzing, coding, and testing takes the same amount of time no matter what the approach is. I see time savings in lightening the process, but then we get Scrum, which comes along with a glut of meetings, artificial time boxing, and an incredible amount of overhead. So much so that it requires additional staff (a Scrum Master) to keep the process going. We switched to a very simple Kanban by now, which effectively cut a lot of procedural overhead. It still takes me the same amount of time to test a new UI as it did in past years. I wish I had the time to investigate better tools, but with available time getting less and less I have no means to do that. I already do all my continuous learning, analysis, and evaluations on my own time.

What Agile did to me is nothing else than generate more stress and give me far less opportunity to do a good job. Even after years of working under these circumstances I have an incredibly difficult time adjusting to that. The only idea I have is to lower my professional standards, as in caring less that obviously broken stuff goes out the door. That isn't what I am going for.

October 3, 2015 - 9:28am
Allan Kelly

Tim,

"The pace accelerated a lot, but not for developers, only for business analysts and QA."

Bingo, yes.
Think about it: if Agile does what it claims, you can expect the developers to become more productive, at which point, if you aren't paying attention to quality, the work for Test (which is actually different to QA) will increase, and you may well see more work for BAs too.

Sounds like your implementation was a little naive at best.

"I never worked in a team where the product owner dropped the book of requirements on the desk and then checked back in when the team reported completion."

Again, I'm not surprised; this is often the case. The problem is that under the old model they were supposed to do this, even though it was often impossible and they couldn't. So there was a mismatch between how things were managed and what was actually happening. If nothing else, the agile model describes what actually happens better than the old model did.

"As far as respect for quality is concerned, yes, that is a problem. It has nothing to do with Agile, it is that customers pay for features and expect top notch quality to be implied"

Actually, I disagree: quality has everything to do with Agile.
If you don't sort out quality Agile will not work.
If you attempt to run an Agile process with poor quality, sprints/iterations won't work as they should, work will continue to be disrupted by defects, and thus schedules will continue to be a mess and predictability goes out the window.

Agile, XP especially, has always said: improve technical quality. There is a reason for that: you need it.

It sounds like your experience of Scrum type Agile lacked an experienced guide.

It's good to hear you are having some success with Kanban. I wonder if this time you have an experienced guide? Many of the problems you mention are likely to appear with Kanban too if you use it naively.

October 5, 2015 - 2:26pm
Tim Thompson

It is now about a year later and Kanban is out the door as well. The team grew, was split into multiple teams with team leads, and quality is taking a nosedive. What leaders remember about Agile is only one thing: faster release cycles. Unfortunately, even that is implemented in an odd way. Rather than have small pieces of work that get completed and then released, we stick to fixed and arbitrarily set release dates (I suggested continuous releases; that got shot down) by which an ever-growing list of new features has to be done. Once the release date comes, the features are shipped no matter what state they are in. More and more stuff goes out the door entirely untested and without a single test case ever written or executed.

During the revamping of the team structure one other thing was changed: the quality process was thrown away and replaced with nothing. We now have projects that are supposedly very important that have no BA resources, nobody is keeping track of test creation and test execution, and QA is often no longer involved in any design discussions, to the extent that we are completely unaware that new features were added or existing features were changed. At the same time the known defects are growing in number and are left unfixed in favor of cramming in even more features. Issues that QA marks as critical are left in the queue for months; issues that numerous customers report as problems are left unfixed in favor of cumbersome and dangerous workarounds.

I cannot speak for other companies, but based on my experience I cringe every time I read about all the great improvements that Agile in its various forms is supposed to bring. It appears that the theory is awesome, but practice is dictated by wanting more stuff in ever less time, eliminating quality and control in favor of putting all power with the developers. If it compiles, it ships.

"If you don't sort out quality Agile will not work." - You are right. After the QA process was thrown away we instantly saw an increase in released bugs requring a large number of updates that had to be released under pressure.

"Its good to hear you are having some success with Kanban. I wonder if this time you have an experienced guide?" - I liked it a lot when we actually used the board as it clearly showed that things piled up in the QA column and a ridiculous amount of stories was in the development column, work stated and left unfinished because management changed its mind on an almost daily basis of what is most important. We had no experienced guide nor did we ever get to a state where we defined task limits per column. By now the board is not even used anymore and some stories get set to done once they ship, no matter if they are completed or not, other stories stay open and get abandoned although they are part of the product.

But what can I say? The company is having its best year ever and customers seem to like the products a lot. I think it is a general shift in perception by customers, driven by mobile. Customers have become used to apps crashing and looking like crap because that is the state of most mobile apps. Moving to cloud-based solutions and hosting provides the luxury of being able to push updates at will. There is no need to get things right the first, second, or third time. As long as we are "agile"...

September 9, 2016 - 7:20am
Allan Kelly

Doesn't sound good, I'm sorry you've had such an experience.

The term "Agile" gets used and abused, I think your experience is in the abused classification.
All I can say is "Thats not my Agile" - I wouldn't call what your team have been doing Agile but it seems there is a greater force which means it doesn't actually matter what is done.

 

Sorry about that.

 

September 9, 2016 - 11:37am
