Building a Definition of Done
Joe, the Developer, waltzed into work one sunny Tuesday morning and was approached by Kenny, the Project Manager, who asked if the feature Joe was working on was done. Joe had checked his code into their shared source code repository the previous afternoon and had unit tested it before doing so. With an emphatic “yes,” Joe confirmed the feature’s completion. Kenny sighed in relief and said, “Great, then we will go ahead and deploy it into the UAT environment for our customer advocates to view.” Joe quickly backtracked on his original answer and blurted out, “But it has not been fully tested by QA, the documentation is not updated, and I still need to pass a code review before it is finished.”
Has this ever happened to you? Were you Joe or Kenny? How did you react in this situation? Did it feel like development was not being honest? Did it seem that the Project Manager was assuming too much? We’ve got just the tool for you: the “Definition of Done”. Following is a list of steps that I use when coaching a team on their own Definition of Done:
- Brainstorm – write down, one artifact per post-it note, all artifacts essential for delivering on a feature, iteration/sprint, and release
- Identify Non-Iteration/Sprint Artifacts – identify artifacts that cannot currently be completed every iteration/sprint
- Capture Impediments – reflect on each artifact not currently done every iteration/sprint and identify the obstacles to its inclusion in an iteration/sprint deliverable
- Commitment – get consensus on the Definition of Done: those items that can be done for a feature and iteration/sprint
During the brainstorming portion of the exercise, it is important to discuss whether each artifact is needed to deliver features for release. Some examples are:
- Installation Build (golden bits)
- Pass All Automated Tests in Staging Environment
- Sign Off
- Pass Audit
- Installation Documentation Accepted by Operations
- Release Notes Updated
- Training Manuals Updated
It is important to note that these are not features of the application but rather the artifacts which are generated for a release. Some questions you may ask about each artifact are:
- Who is the target audience for this artifact?
- Is this a transitory artifact for the team or stakeholders?
- Who would pay for this?
- Is it practical to maintain this artifact?
When identifying non-iteration/sprint artifacts, I usually ask the team to draw a waterline mark below the brainstormed post-it notes. Look at each of the artifacts written on the post-it notes above the line and discuss whether it can be done every iteration/sprint for each feature, potentially incrementally. If it can, leave it above the waterline. If it cannot, move the artifact below the waterline.
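For teams that like to capture the workshop’s output digitally, the waterline sort amounts to a simple partition of the brainstormed artifacts. Here is a minimal sketch in Python; the artifact names and the true/false “each sprint” flags are hypothetical examples, not values prescribed by the exercise:

```python
# Hypothetical output of the brainstorming step: each artifact is marked
# True if the team can complete it every iteration/sprint, False otherwise.
artifacts = {
    "Pass code review": True,
    "Unit tests pass": True,
    "Release notes updated": True,
    "Pass audit": False,      # blocked by an impediment for now
    "Sign off": False,        # blocked by an impediment for now
}

# Partition the artifacts around the waterline.
above_waterline = [name for name, each_sprint in artifacts.items() if each_sprint]
below_waterline = [name for name, each_sprint in artifacts.items() if not each_sprint]

print("Definition of Done:", above_waterline)
print("Impediments to capture for:", below_waterline)
```

The items above the waterline become the candidate Definition of Done; the items below it feed the impediment discussion in the next step.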
In the next step, capturing impediments, the team looks at each of the artifacts below the waterline and discusses all of the obstacles that stop them from delivering it each iteration/sprint. This is a difficult task for some teams because we must not hold ourselves to the current status quo. I like to inform the team that answers such as “that is just the way it is” or “we can’t do anything about that” are not acceptable, since we cannot act on them. The obstacles, no matter how large the effort to remove them may seem, can inform management about how they can support the team in releasing with more predictability. I have found that many of the obstacles identified by teams in this step cause issues such as an unpredictable release period after the last feature is added. For example, the obstacle may be a requirement for independent verification by QA. There could be many reasons behind this, usually derived from audit guidelines and governance policies, but there may be creative ways to conduct the verification incrementally, which increases the predictability of the release stabilization period. Over time these obstacles can be removed, and the artifacts that were excluded from the per-iteration/sprint Definition of Done because of them can be promoted above the waterline.
Once you have your Definition of Done, identified the artifacts that cannot be delivered each iteration/sprint, and captured the obstacles for those artifacts, it is time to gain consensus from the team. You can use any consensus-building technique you like, but I tend to use the Fist of Five technique. If the team agrees with the Definition of Done, then we are finished. If there are people on the team who are not on board yet, it is time to discuss their issues and work towards a consensus. It is important that all members of the team agree to the Definition of Done, since they will all be accountable to each other for delivering on it for each feature. Once you have consensus, I like to have the Definition of Done posted in the team’s co-located area as an information radiator, reminding them of their accountability to each other.
The Definition of Done exercise can have many ramifications:
- Creation of an impediments list that management can work on to support the delivery of the team
- Organizational awareness of problems stemming from organizational structure
- Better team understanding of the expectations for their delivery objectives
- Team awareness of other team member roles and their input to the delivery
If you do not already have a Definition of Done or it has not been formally posted, try this exercise out. I hope that building a Definition of Done in this manner helps your team get even better at their delivery. Below is an example of a real team’s Definition of Done: