This article describes "Utility Storming", a lightweight collaborative approach to brainstorming the non-functional requirements of technical recommendations, encouraging transparent and methodical system design within and across teams.
In an effort to de-risk projects and improve predictability of delivery, technical design activities are common throughout the software industry.
These activities attempt to minimise bad decision-making and improve resilience to it.
The approach I'm about to describe is an evolution of various lean design processes I've participated in over the last decade, inspired by the ATAM's Utility Tree and scenario brainstorming, and the SEI's Quality Attribute Workshop.
A Utility Storming session explores the requirements and trade-offs of a single technical recommendation, or a group of related ones. It is best done at a whiteboard with post-it notes: a collaborative session specifically for those responsible for the architecture of a system or application, usually the software engineers themselves. If a physical whiteboard is not possible, tools such as Miro also work well.
It's best to start the session by ensuring the team understands the context and problem. A team member will have done some preliminary investigation beforehand, and will have sufficient understanding to propose and explain a recommendation to the group.
The facilitator of the session calls out the primary quality attribute (usually Security, Performance or Modifiability), writing that attribute onto the whiteboard. The attribute can be randomly selected when there is no obvious starting place.
The group is asked to write on post-it notes the likely outcomes (of the given recommendation) that are pertinent to that quality attribute. These believed outcomes can be positive or negative and are grouped as such on the board. The post-its are placed horizontally next to the attribute. Duplicate outcomes are removed and similar outcomes grouped as one.
Some of the outcomes suggested may affect other quality attributes. The group is asked to call out those attributes. Each of the raised attributes is added to the board, and the related post-its are moved to align with those attributes.
The facilitator moves on to the next attribute on the board and repeats steps 2–4 until no more quality attributes are called. The quality attributes on the board represent the utility of the recommendation. It's useful at this point to remind the group of undiscussed quality attributes; if the group feels they have missed something, the newly discovered attribute should be added to the board and the process repeated.
From the top, for each of the outcomes, the facilitator asks how relevant the outcome is to the success of the recommendation and encourages the team to define acceptance criteria, where those criteria are demonstrable upon completion.
The whiteboard can then be digitised; if legible enough, a photo is sufficient. If the recommendation is ratified, the decision can be written up, perhaps as an Architecture Decision Record and/or backlog tickets, linking back to the digitised Utility Storm for future reference.
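If you prefer a more structured digitisation than a photo, the board maps naturally onto a small data model: attributes, each with positive or negative outcomes, each outcome carrying its acceptance criteria. The sketch below is one hypothetical way to capture that in Python; the names, the example recommendation, and the `unverified_outcomes` helper are all illustrative assumptions, not part of the technique itself.

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    positive: bool  # grouped on the board as a positive or negative outcome
    acceptance_criteria: list[str] = field(default_factory=list)

@dataclass
class QualityAttribute:
    name: str  # e.g. "Security", "Performance", "Modifiability"
    outcomes: list[Outcome] = field(default_factory=list)

@dataclass
class UtilityStorm:
    recommendation: str
    attributes: list[QualityAttribute] = field(default_factory=list)

    def unverified_outcomes(self) -> list["Outcome"]:
        """Outcomes still missing demonstrable acceptance criteria."""
        return [o for a in self.attributes for o in a.outcomes
                if not o.acceptance_criteria]

# Hypothetical example: a recommendation to introduce a read-through cache.
storm = UtilityStorm(
    recommendation="Introduce a read-through cache",
    attributes=[
        QualityAttribute("Performance", [
            Outcome("p95 read latency drops", True,
                    ["Load test shows p95 < 100ms at 2x current traffic"]),
        ]),
        QualityAttribute("Modifiability", [
            Outcome("cache invalidation adds coupling", False),
        ]),
    ],
)

# Outcomes without acceptance criteria are easy to spot before write-up.
print([o.description for o in storm.unverified_outcomes()])
```

A structure like this also makes it straightforward to turn each outcome into a backlog ticket linked back to the decision record.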
In my experience, this approach improves engagement with technical decision-making, increasing awareness and ownership of the system's non-functional requirements. Knowledge is shared earlier, making backlog refinement and sprint planning more productive and aiding the breakdown of tickets, which in turn improves the predictability of delivery. It helps democratise software architecture.
Your mileage with Utility Storming may vary depending on the team and the recommendation. Like anything, it requires practice, but I'd recommend you give it a try.