Specification by example
Specification by Example (SBE) is a collaborative software development practice that uses concrete, realistic examples to define, validate, and automate requirements as executable specifications, ensuring alignment between business goals and technical implementation.[1] This approach emphasizes deriving scope from business objectives, illustrating requirements through examples, and maintaining living documentation that evolves with the software.[2] The practice, also known as Acceptance Test-Driven Development (ATDD), fosters teamwork among developers, testers, analysts, and stakeholders to create human-readable specifications that serve as both requirements and automated tests.[3]

Coined by Martin Fowler in 2002 at the XP/Agile Universe conference, the term gained prominence through Gojko Adzic's 2011 book Specification by Example, which distills patterns from over 50 projects worldwide for delivering defect-free software in iterative cycles.[3][4] The book won the 2012 Jolt Award for its impact on agile methodologies such as Scrum, Extreme Programming, and Kanban.[4]

At its core, SBE follows seven key patterns: deriving scope from goals to focus efforts; using examples for clarity and precision; refining specifications to ensure testability; automating validation without altering intent; executing tests frequently for feedback; evolving a shared documentation system with a ubiquitous language; and treating living documentation as a reliable, accessible product.[1] These patterns reduce miscommunication, minimize rework, and accelerate time to market (one team, for instance, reduced delivery from six months to four days), while providing self-checking tests that detect errors early.[1] Closely related to Behavior-Driven Development (BDD), SBE complements Test-Driven Development (TDD) by prioritizing business-oriented examples over abstract conditions.[3]

Fundamentals
Definition and Overview
Specification by Example (SBE) is a collaborative method for defining software requirements and tests through concrete, real-world examples that illustrate expected behaviors and outcomes.[5] In this approach, business stakeholders, developers, testers, and analysts work together to capture these examples, ensuring that requirements are unambiguous and directly tied to the software's intended functionality.[5] SBE emphasizes the creation of executable specifications that serve as both documentation and tests, allowing teams to validate software against shared expectations throughout the development process.[4]

SBE bridges the gap between business intent and technical implementation by treating examples as living documentation that evolves with the project, remaining relevant and up to date as the software changes.[5] This practice fosters a common understanding among diverse team members, minimizing misinterpretations that often lead to defects or rework in traditional requirements processes.[5] By focusing on illustrative scenarios rather than abstract descriptions, SBE reduces ambiguity in requirements, promoting clarity and alignment on what the software should do.[4]

The basic workflow in SBE typically begins with user stories that outline high-level requirements, followed by collaborative sessions to elicit specific examples that demonstrate desired behaviors under various conditions.[5] These examples are then refined and used to validate the software's implementation, ensuring it meets the defined criteria through automated or manual checks.[5] SBE integrates naturally with agile methodologies, supporting iterative development and continuous feedback.[5]
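This workflow can be illustrated with a minimal sketch: a single concrete example, agreed on collaboratively, is captured as an automated check against the code that implements it. The free-shipping rule, function names, and threshold below are hypothetical illustrations, not drawn from the cited sources.

```python
# A concrete example captured as an executable check (runnable with pytest or
# by calling the test functions directly). The business rule is an assumption
# made up for this sketch.

def free_shipping(order_total: float, is_member: bool) -> bool:
    """Hypothetical rule: members receive free shipping on orders of $50 or more."""
    return is_member and order_total >= 50.0


def test_member_with_qualifying_order_gets_free_shipping():
    # Given a member, When the order totals exactly $50, Then shipping is free
    assert free_shipping(order_total=50.0, is_member=True)


def test_non_member_pays_for_shipping_regardless_of_total():
    # Given a non-member, When the order totals $120, Then shipping is not free
    assert not free_shipping(order_total=120.0, is_member=False)
```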
Core Principles
Specification by Example (SBE) is grounded in a set of foundational principles that guide teams in creating clear, executable requirements through collaborative practices. These principles emphasize the use of concrete examples to bridge communication gaps, ensure alignment, and maintain evolving documentation that supports software development.[1]

A central tenet is the principle of examples as a single source of truth, where concrete examples serve as the authoritative reference for requirements, tests, and implementation details, eliminating the need for disparate documentation that often leads to inconsistencies. By treating these examples as the definitive artifact, teams avoid silos of information and foster a unified understanding of expected behavior, as exemplified in automated specifications that remain human-readable and accessible to all stakeholders.[1][4]

Collaboration forms another core principle, advocating for the early involvement of cross-functional teams, including business analysts, developers, testers, and product owners, in co-creating specifications. This approach promotes shared understanding by encouraging joint workshops and discussions, where diverse perspectives help refine examples and uncover ambiguities in requirements from the outset.[1]

The principle of living documentation underscores the idea that specifications should evolve alongside the software, remaining relevant through continuous iteration and automation. Unlike static documents that quickly become outdated, living specifications are executable and integrated into the development process, providing up-to-date insights into system behavior and facilitating ongoing validation.[1][4]

Finally, deliberate discovery is a guiding principle that uses examples to systematically explore requirements, revealing edge cases and hidden assumptions during elicitation. This involves iteratively deriving and refining examples to ensure comprehensive coverage, enabling teams to build software that truly meets business goals without over-engineering.[1] These principles underpin extensions like Behavior-Driven Development (BDD), which applies them to automated testing frameworks.[6]
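As a sketch of how living documentation can be wired to code, the snippet below assumes the behave BDD library for Python: a human-readable scenario (shown here as a comment, ordinarily kept in a .feature file) is bound to step definitions so the documentation is executed on every test run. The account rule and all names are illustrative assumptions, not taken from the cited sources.

```python
# Scenario text that would normally live in a .feature file:
#
#   Scenario: Withdrawal rejected when funds are insufficient
#     Given an account with a balance of 100 dollars
#     When the customer withdraws 110 dollars
#     Then the withdrawal is rejected and the balance is still 100 dollars

from behave import given, when, then


class Account:
    """Hypothetical domain object used only for this sketch."""

    def __init__(self, balance: int) -> None:
        self.balance = balance

    def withdraw(self, amount: int) -> bool:
        if amount > self.balance:
            return False  # rejected: insufficient funds
        self.balance -= amount
        return True


@given('an account with a balance of {balance:d} dollars')
def step_account(context, balance):
    context.account = Account(balance)


@when('the customer withdraws {amount:d} dollars')
def step_withdraw(context, amount):
    context.accepted = context.account.withdraw(amount)


@then('the withdrawal is rejected and the balance is still {balance:d} dollars')
def step_rejected(context, balance):
    assert not context.accepted
    assert context.account.balance == balance
```

Because the scenario text doubles as the test, a change to the rule forces a matching change in either the wording or the bound code, which is what keeps the documentation "living".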
Practices
Ubiquitous Language
Ubiquitous language refers to a shared vocabulary and terminology derived directly from the business domain, designed to foster clear communication among stakeholders while deliberately excluding technical jargon from specifications. This concept, adapted from domain-driven design principles, ensures that all team members, from business analysts to developers, use the same terms to describe requirements and behaviors, thereby creating a consistent foundation for collaboration in SBE.

Developing a ubiquitous language involves iterative discussions and refinement sessions in which domain experts and technical team members clarify ambiguous terms through real-world examples and scenarios. These sessions focus on identifying and resolving discrepancies in understanding, such as varying interpretations of business concepts, and gradually build consensus on precise definitions that reflect the domain's nuances. Over time, the language evolves as the team's comprehension of the domain deepens, with terms being tested and adjusted during collaborative workshops to eliminate translation gaps between business and technical perspectives.[1]

In SBE, the ubiquitous language plays a critical role by preventing misunderstandings and misalignments that could lead to incorrect implementations or documentation drift. By embedding this shared terminology directly into executable examples and specifications, such as in Given-When-Then formats, it keeps the living documentation aligned with business intent and facilitates symmetric changes, where updates to requirements are automatically reflected in tests and code. This integration promotes a single source of truth, reducing errors in communication and enabling faster validation of software behavior against domain rules.

For instance, an ambiguous term like "process payment" might initially evoke different meanings, such as immediate fund transfer for one stakeholder or deferred billing for another; through refinement, the team agrees on a precise definition, such as "deduct the specified amount from the customer's linked account and generate a confirmation receipt only if the balance is sufficient," which is then consistently used across all related specifications. This contrast highlights how ubiquitous language transforms vague phrases into unambiguous, domain-specific expressions that support reliable example-based testing. In practice, such as during example mapping sessions, this language provides the foundational terms for structuring discussions around user stories.[1]
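The sketch below shows how the agreed definition of "process payment" might be carried verbatim into an executable example. The class, function, and receipt structure are hypothetical and serve only to show the domain language reappearing unchanged in the specification.

```python
# Minimal sketch: the agreed domain definition of "process payment" encoded as
# an executable example (runnable with pytest or by calling the tests directly).
# All names here are assumptions made for illustration.

class CustomerAccount:
    def __init__(self, balance: int) -> None:
        self.balance = balance


def process_payment(account: CustomerAccount, amount: int):
    """Agreed definition: deduct the specified amount from the customer's linked
    account and generate a confirmation receipt only if the balance is sufficient."""
    if account.balance < amount:
        return None  # no receipt: insufficient balance
    account.balance -= amount
    return {"confirmation": True, "amount": amount}


def test_process_payment_with_sufficient_balance_issues_receipt():
    # Given a customer account with a sufficient balance
    account = CustomerAccount(balance=100)
    # When the payment is processed
    receipt = process_payment(account, 40)
    # Then the amount is deducted and a confirmation receipt is generated
    assert account.balance == 60
    assert receipt == {"confirmation": True, "amount": 40}


def test_process_payment_with_insufficient_balance_issues_no_receipt():
    account = CustomerAccount(balance=30)
    receipt = process_payment(account, 40)
    assert receipt is None
    assert account.balance == 30
```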
Example Mapping
Example Mapping is a collaborative workshop technique used in Specification by Example to break down user stories into actionable rules and concrete examples, typically facilitated in group sessions with physical index cards or digital equivalents.[7] It involves participants such as product owners, developers, and testers to foster shared understanding and identify requirements gaps through structured discussion.[8] The process uses colored cards, with yellow for the user story, blue for rules, green for examples, and red for questions, to visually organize and refine specifications.[7]

The technique follows a structured sequence of steps to ensure comprehensive coverage. First, the facilitator writes the user story on a yellow card and places it at the top of a shared workspace. Next, the group identifies and captures key rules or acceptance criteria on blue cards, arranging them below the story. For each rule, participants then generate concrete examples on green cards to illustrate positive and negative scenarios, placing them under the corresponding rule. Any unresolved questions, ambiguities, or assumptions are noted on red cards for later research or iteration. The session typically lasts 20-30 minutes per story, iterating until the scope is clarified or time expires, after which questions are addressed in follow-up discussions.[8][7]

Within Specification by Example, Example Mapping offers several benefits, including the rapid discovery of edge cases and inconsistencies in requirements, which helps prevent overlooked scenarios in development.[7] It promotes efficient collaboration by keeping discussions focused and productive, often filtering out overly broad stories before they enter sprints.[9] Additionally, the visual mapping ensures balanced coverage across rules, enhancing the quality of executable specifications derived from the examples.[8]

Variations of Example Mapping include real-time sessions for co-located teams using physical cards, which allow for immediate tactile interaction and feedback. Asynchronous adaptations enable distributed teams to contribute via shared digital tools such as virtual whiteboards or spreadsheets with color-coded cells, maintaining the collaborative essence without synchronous meetings. For remote teams, screen-shared sessions or pre-populated digital boards keep the practice inclusive across time zones.[7][9] The approach also reinforces the ubiquitous language by encouraging rules to be stated in domain-specific terms during the workshop.[8]
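For the digital and remote variations described above, an example map can be captured in a simple data structure that mirrors the four card colors. The sketch below is a hypothetical illustration, not a tool referenced by the cited sources.

```python
# Minimal sketch: representing an example map digitally with dataclasses that
# mirror the card colors (yellow story, blue rules, green examples, red
# questions). The story content is invented for illustration.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Rule:
    text: str                                            # blue card
    examples: List[str] = field(default_factory=list)    # green cards


@dataclass
class ExampleMap:
    story: str                                           # yellow card
    rules: List[Rule] = field(default_factory=list)
    questions: List[str] = field(default_factory=list)   # red cards


example_map = ExampleMap(
    story="As a shopper, I can apply a discount code at checkout",
    rules=[
        Rule(
            text="Expired codes are rejected",
            examples=["A code that expired yesterday produces an error at checkout"],
        ),
        Rule(
            text="Only one code per order",
            examples=["Applying a second code replaces the first"],
        ),
    ],
    questions=["Can a code be combined with a loyalty discount?"],
)

# A rough workshop heuristic: many red cards, or rules without examples,
# suggest the story is not yet ready for a sprint.
print(len(example_map.questions), "open questions")
```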
Deriving Examples
Deriving examples in Specification by Example (SBE) involves creating concrete, realistic scenarios that illustrate business rules without attempting to cover every possible input or outcome. Good examples are precise and verifiable, focusing on key business functionality, technical edge cases, and troublesome areas to ensure they effectively communicate requirements to all stakeholders. They should be representative rather than exhaustive, as a small set of well-chosen examples, such as three to five per rule, provides greater value than numerous poorly defined ones, emphasizing clarity over completeness.[10][11]

To derive high-quality examples, teams often begin with brainstorming sessions in which participants, including business analysts and developers, generate simple scenarios based on acceptance criteria or user stories. This process draws on user personas to ground examples in realistic user behaviors and needs, ensuring they reflect actual usage patterns rather than abstract assumptions. Techniques like the "five whys" are also applied to probe deeper into requirements, repeatedly questioning underlying motivations or conditions to uncover hidden scenarios and dependencies. Examples are then refined through iteration, incorporating feedback from stakeholders to validate assumptions and adjust for emerging insights during collaborative discussions.[12][13][14]

Common pitfalls in deriving examples include over-specification, where teams produce too many variations, such as enumerating all combinations of inputs instead of key ones, leading to bloated documentation and maintenance challenges. Conversely, under-specification occurs when examples overlook boundary conditions, like thresholds between acceptable and unacceptable values (e.g., $24.99 versus $25.00 for a transaction risk score), resulting in incomplete requirements and downstream defects. To achieve balance, guidelines recommend limiting examples to 5-10 per business rule or concept, prioritizing those that highlight critical boundaries and variations while using summarization tests to eliminate redundancy, ensuring no example can be simplified further without losing essential details.[11][15][16]

Examples in SBE are typically structured in a tabular format using the Given-When-Then template to enhance readability and executability. The "Given" clause establishes the initial context or preconditions, "When" describes the action or event, and "Then" specifies the expected outcomes, often presented in tables for multiple data variations. This format promotes clarity by separating setup, exercise, and verification, making it suitable for automation while remaining accessible to non-technical stakeholders; for instance:

| Given | When | Then |
|---|---|---|
| A user has a balance of $100 | The user attempts to withdraw $90 | The withdrawal succeeds and the balance is $10 |
| A user has a balance of $100 | The user attempts to withdraw $110 | The withdrawal fails with an insufficient funds error |
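A sketch of how such a table can be automated follows, assuming pytest and a hypothetical `withdraw` function: each row of the Given-When-Then table becomes one parameterized check.

```python
# Minimal sketch: the example table above expressed as parameterized checks.
# The withdraw function and its return convention are assumptions for this sketch.

import pytest


def withdraw(balance: int, amount: int):
    """Hypothetical domain function: returns (succeeded, new_balance_or_error)."""
    if amount > balance:
        return False, "insufficient funds"
    return True, balance - amount


@pytest.mark.parametrize(
    "starting_balance, amount, expected_success, expected_result",
    [
        (100, 90, True, 10),                      # withdrawal succeeds, balance is $10
        (100, 110, False, "insufficient funds"),  # withdrawal fails with an error
    ],
)
def test_withdrawal_examples(starting_balance, amount, expected_success, expected_result):
    # Given a user with the starting balance, When they attempt the withdrawal
    succeeded, result = withdraw(starting_balance, amount)
    # Then the outcome matches the corresponding row of the example table
    assert succeeded == expected_success
    assert result == expected_result
```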