Definition of Done (DoD) | Agile Scrum Master

Definition of Done (DoD) is the Scrum Team's shared standard for what it means for work to be complete and for an Increment to be usable and releasable. It improves transparency, supports inspection in Sprint Review, and prevents hidden work by making the required quality explicit for every Product Backlog item. It also constrains Sprint Planning by clarifying what can realistically be finished as Done. Key elements: clear criteria, consistent application, integration, verification, releasability, and continuous improvement of the standard.

Purpose of Definition of Done (DoD)

Definition of Done (DoD) is a shared agreement that makes quality and completeness transparent. It answers a practical question: “What must be true for this work to be considered finished and usable?” By making the standard explicit, the Scrum Team reduces hidden work, avoids late surprises, and creates a trustworthy basis for inspection and adaptation.

Definition of Done (DoD) is most valuable when it protects fast learning from real increments. It exists so that inspection in Sprint Review is based on a usable Increment, not on partial work presented as progress. If a Product Backlog item cannot meet the DoD, it is unfinished work that should remain visible as work in progress so the team can adapt scope, slicing, and approach.

Definition of Done (DoD) as a Scrum commitment

In Scrum, Definition of Done (DoD) is the commitment for the Increment. That means an Increment is only usable and inspectable when it meets the DoD. This creates a clear quality bar that applies to every Product Backlog item selected for the Sprint and reduces ambiguity about what “done” means.

Definition of Done (DoD) is created and owned by the Scrum Team and applied consistently by the Developers. If multiple Scrum Teams work on the same product, a shared DoD is needed to keep integration and inspection meaningful. Different “done” standards create hidden integration work, inconsistent quality, and unreliable feedback.

What belongs in Definition of Done (DoD)

Definition of Done (DoD) includes the conditions required to produce a usable Increment, not just to “finish coding.” The right content depends on the product and risks, but it typically covers functional completeness and relevant quality attributes so the Increment can be trusted for decisions.

Common components of Definition of Done (DoD) include:

  • Integrated - The change is merged, built, and works with the product without broken dependencies.
  • Verified - Appropriate checks run and pass, with automation used where it improves speed and reliability.
  • Reviewed - The work meets engineering standards through peer review or an equivalent practice.
  • Acceptance met - The behavior matches the item’s acceptance criteria or examples.
  • Quality met - Relevant performance, security, reliability, accessibility, or compliance expectations are satisfied.
  • Operable - Monitoring, logging, rollout approach, and support information exist as needed for safe use.
  • Documentation updated - Documentation is updated when it affects usability, support, or maintainability.

Definition of Done (DoD) should remain practical. If it becomes a long, brittle checklist that cannot be applied consistently, it stops creating transparency. A useful DoD is clear, testable, and achievable within normal Sprint work.
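For teams that track per-item evidence in tooling, the components above can be sketched as a simple machine-checkable checklist. This is a minimal illustration, assuming the team records one boolean signal per DoD condition; the class and field names are hypothetical, not part of any Scrum standard or specific tool.

```python
from dataclasses import dataclass

# Hypothetical evidence flags for one Product Backlog item; the field names
# mirror the DoD components above and are illustrative only.
@dataclass
class ItemEvidence:
    integrated: bool        # merged, built, works with the product
    verified: bool          # appropriate checks ran and passed
    reviewed: bool          # peer review (or equivalent) completed
    acceptance_met: bool    # behavior matches acceptance criteria
    quality_met: bool       # relevant non-functional expectations satisfied
    operable: bool          # monitoring, rollout, and support info in place
    docs_updated: bool      # documentation updated where it matters

def meets_dod(item: ItemEvidence) -> bool:
    """An item is Done only when every DoD condition holds."""
    return all(vars(item).values())

item = ItemEvidence(True, True, True, True, True, True, False)
print(meets_dod(item))  # False: documentation not updated, so not Done
```

The point of the sketch is the `all(...)`: the DoD is a conjunction, so missing any single condition means the item is not Done, no matter how much else is finished.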

Using Definition of Done (DoD) across Scrum events

Definition of Done (DoD) shapes planning and execution. In Sprint Planning, it constrains selection by clarifying the real cost of producing a Done Increment. If the DoD includes integration, testing, and operational readiness, capacity assumptions must include that work instead of deferring it.

During Daily Scrum, Developers can use the DoD as a daily quality lens: if development progresses but verification or integration falls behind, the team is accumulating risk and reducing the chance of a usable Increment. In Sprint Review, the DoD protects the integrity of inspection by ensuring stakeholders are seeing a real, usable Increment rather than demonstrations of unfinished work.

In Sprint Retrospective, the Scrum Team can inspect whether the DoD is being met and whether it is sufficient for the product’s risks. Recurring defects, painful releases, and late integration surprises are evidence to adapt the DoD and/or invest in capability (for example automation, better slicing, or improved CI/CD) so “done” stays sustainable.

Key Characteristics of an Effective DoD

  • Clarity - The standard is explicit and testable, reducing interpretation gaps.
  • Transparency - The team and stakeholders can see what “done” means and what evidence supports it.
  • Consistency - The same standard applies to all work so quality does not depend on who did it.
  • Adaptability - The standard evolves as product risks, technology, and delivery capability change.

Definition of Done vs. Acceptance Criteria

Acceptance criteria describe the conditions for a specific Product Backlog item. Definition of Done (DoD) is the shared quality standard applied to all items and the Increment. Acceptance criteria may be satisfied, but if the DoD is not met, the work is not done and the Increment is not reliable for inspection.

Typical Criteria in a Definition of Done

While specifics vary by product and context, common DoD elements include:

  • Acceptance criteria met - The item’s agreed behaviors and constraints are satisfied.
  • Code integrated - The change is merged and builds cleanly with the product.
  • Tests passing - Appropriate automated and manual checks exist and pass.
  • Quality verified - Relevant non-functional expectations are checked where they matter.
  • Documentation updated - Documentation is updated when it changes usage or support needs.
  • Increment deployable - The Increment can be released safely when a release decision is made.
  • Critical issues addressed - Critical defects are resolved or explicitly handled before calling it done.

How the Definition of Done Supports Scrum Pillars

  • Transparency - Everyone understands the quality bar and what evidence indicates it is met.
  • Inspection - The team can objectively assess whether the Increment is usable and trustworthy.
  • Adaptation - When work cannot reach done, the team adapts scope, slicing, and capability so learning remains fast.

Improving Definition of Done (DoD) over time

Definition of Done (DoD) is expected to evolve. Teams often start with a minimal standard and improve it incrementally as capability grows. Raising the DoD is an investment decision that reduces future rework and shortens feedback loops by making “done” more reliable.

Practical ways to improve Definition of Done (DoD) include:

  • Raise the bar gradually - Add one meaningful improvement at a time so the team can adapt without destabilizing flow.
  • Automate where it pays - If a step matters but is expensive, reduce cost with automation and better tooling.
  • Align to product risk - Emphasize checks that reduce real user, business, and operational risk.
  • Make evidence visible - Use lightweight signals (for example build status and review completion) so “done” is inspectable.
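As a sketch of “make evidence visible,” lightweight signals such as build status and review completion can be aggregated into a single inspectable verdict per item. The signal names, the item identifiers, and the idea of collecting them into a dictionary are all assumptions for illustration; in practice these would come from the team's CI and review tooling.

```python
# Hypothetical signals per Product Backlog item, e.g. pulled from CI and code
# review tooling; keys and item names are illustrative assumptions.
def done_report(items: dict[str, dict[str, bool]]) -> dict[str, str]:
    """Summarize each item as 'Done' or list the missing evidence."""
    report = {}
    for name, signals in items.items():
        missing = [signal for signal, ok in signals.items() if not ok]
        report[name] = "Done" if not missing else "missing: " + ", ".join(missing)
    return report

board = {
    "PBI-101": {"build_green": True, "review_done": True, "tests_pass": True},
    "PBI-102": {"build_green": True, "review_done": False, "tests_pass": True},
}
for item, status in done_report(board).items():
    print(item, "->", status)
# PBI-101 -> Done
# PBI-102 -> missing: review_done
```

Naming the missing evidence, rather than just reporting “not done,” is what makes the signal useful for inspection: the team sees exactly which DoD condition is lagging.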

Improvements should be grounded in evidence such as defect trends, cycle time impacts, integration failures, delayed releases, or repeated rework. The goal is not perfection, but a DoD that makes each Increment reliably usable.

Common misuse of Definition of Done (DoD) and guardrails

Definition of Done (DoD) is often weakened by patterns that create the appearance of progress while reducing usability and slowing learning.

  • Code complete counted as done - Looks like treating untested or unintegrated work as progress; it hides remaining work and delays feedback; keep it visible as unfinished until it meets the DoD.
  • Hardening as a later phase - Looks like pushing integration and testing to “later”; it accumulates risk and creates end-of-iteration crunch; integrate and verify continuously within the Sprint.
  • Done varies by person - Looks like individual shortcuts or special cases; it breaks transparency and consistency; use one shared standard and inspect adherence.
  • Done as bureaucracy - Looks like ceremonial steps that do not reduce risk; it adds cost without improving quality; keep criteria meaningful, testable, and tied to product risks.
  • Done too ambitious to sustain - Looks like routinely missing the standard; it encourages pretending; reduce scope, improve slicing, or invest in capability to make the DoD achievable.
  • Checklist without evidence - Looks like ticking boxes without verifiable results; it invites gaming; focus on observable outcomes of quality.
  • Static DoD despite problems - Looks like repeating the same standard while defects and release pain persist; it blocks improvement; adapt the DoD using evidence from incidents and feedback.
  • DoD equals acceptance criteria - Looks like treating item-specific rules as universal; it creates gaps; keep acceptance criteria per item and DoD as the shared baseline.

A practical rule is: if the team routinely cannot meet Definition of Done (DoD), respond by reducing scope, improving slicing, or investing in capability, not by lowering the definition to protect a status narrative.

The best evidence that Definition of Done (DoD) is working is fewer late surprises: fewer defects escaping to production, less unfinished work carried over, smoother Sprint Reviews, and higher confidence that each Increment could be released when the business chooses to release. Those outcomes indicate the Done state is real and inspectable.

Definition of Done (DoD) is the shared quality standard for an Increment, defining what “done” means so work is usable, integrated, and inspectable in review.