This quote is a truism because it’s how we work. Measurements help us identify areas to focus on or improve. Measurements let us know whether we succeeded. I love the idea of measuring BAs, and yet I have spent years balking at the idea of measuring requirements and BAs. It doesn’t work very well in practice.* I had cause to rethink how to measure BAs when one team I worked on recorded the following action item: “Decide (if,) how and what BA velocity to track.”
One of my favorite agile principles is this:
At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
Agile teams review the software they have developed at the end of every iteration or sprint. Teams discuss the big issues standing in the way of delivering more value. Teams make a plan and track their progress toward it. Teams consciously decide to make improvements so they become more effective and productive. The best teams become a small-scale learning organization.
Most measurements and discussions are about developers and the development process; the rest of the work does not get the same level of attention. After thinking about the issue for a while, I believe it does make sense to measure Business Analysts, or rather what BAs do. We should measure BA outputs the same way we measure Dev outputs.
We don’t measure individual developers (as a general rule), and I don’t think we should be trying to measure individual BAs. I do think it is appropriate to measure the analysis process.
- BAs should not be afraid to discover that it typically takes N days to complete a story.
- Teams should have insight into how many stories are In Analysis (WIP) at any given time, and what that number averages.
- We can track the analysis rate and predict whether we will meet our goals (keeping the dev pipeline fed, completing the project on the same timeline as development, and so on).
- We should be able to use numbers to point to recurring patterns (a personal pet peeve: cycling from a full backlog to an empty backlog & idle developers every other iteration).
- We should think about average story size and splitting stories as a function of analysis, not just how it impacts development estimates. (See posts by Paul James Hammant and me elsewhere.)
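As a minimal sketch of what tracking these numbers might look like, the metrics above (cycle time, WIP, throughput) can be computed from nothing more than per-story timestamps. The story IDs, dates, and function names here are hypothetical, not from any particular tool:

```python
from datetime import date

# Hypothetical story records: (story_id, analysis_start, analysis_done)
stories = [
    ("S-101", date(2024, 3, 1), date(2024, 3, 4)),
    ("S-102", date(2024, 3, 2), date(2024, 3, 8)),
    ("S-103", date(2024, 3, 5), date(2024, 3, 7)),
]

# Cycle time: calendar days each story spends In Analysis
cycle_times = [(done - start).days for _, start, done in stories]
avg_cycle_time = sum(cycle_times) / len(cycle_times)

def wip_on(day):
    """Count stories In Analysis on a given day (started, not yet done)."""
    return sum(1 for _, start, done in stories if start <= day < done)

print(f"Average analysis cycle time: {avg_cycle_time:.1f} days")
print(f"WIP on 2024-03-03: {wip_on(date(2024, 3, 3))} stories")
```

A team exporting start/done dates from its tracker each iteration could watch these numbers in retrospectives for exactly the trends and recurring patterns described above, such as the backlog swinging between full and empty.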
We should ask questions and, based on the project and client, decide when we need to raise our hand and ask for more information or suggest a change.
In other words, I think larger teams can evaluate the development of stories the way they evaluate the development of code, and we should watch the metrics around both processes in similar ways. We should watch for unexpected changes. We should make sure trend lines are steady or moving in the right direction. We should openly and honestly talk about problems in these areas during retrospectives. If we find value in measuring the development process, then we can uncover value in measuring the analysis process.
Caveat: Through no special planning, I typically consult on enterprise projects with a team size ranging from 25 – 50 team members with 4 – 6 business analysts providing user stories for the developers. All the projects are run using agile (scrum) techniques and treat the retrospective as valuable.
I do not believe metrics are worth the effort on every project. I don’t know when, or if, the same measures are worthwhile on a small team (7 +/- 2). Whatever the size, when your team is high functioning, I suggest measuring less of everything.
* I reserve the right to change my mind about how easy it is to measure BA performance after I read Adriana Beal’s ebook, Measuring the Performance of Business Analysts.