Craig Drayton

Escape velocity: Four signs your estimation may be hurting you

If you're measuring just one thing about your agile delivery, it's probably velocity - the number of story points delivered per sprint.


But there are some major pitfalls with this common agile metric and, used poorly, velocity can derail your delivery. Here are four signs that your approach to estimation may be hurting you:


1) You don't know if your estimates are a good predictor of reality

Velocity is based on story points, and story points are a method of estimating the size of work items relative to each other. In other words, a "4" should be twice as large as a "2".


The problem is that this foundational idea is rarely tested, and may not be true for your team. Is a "4" really taking twice as long to deliver as a "2"? How do you know?


Agile and Kanban coaches are increasingly comparing teams' story point estimates with cycle time (how long it took to actually complete the work) and finding little correlation between the two. If you don't know whether your own estimates correlate with reality, then your velocity-based forecasts could be giving you a lot of false confidence in your plans.
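One quick way to sanity-check this for your own team is to pull the story points and cycle times of recently completed items and look at how well they correlate. Here's a minimal sketch with made-up numbers, using scipy for the statistics:

```python
# A minimal sketch (hypothetical data) of checking whether story point
# estimates actually correlate with cycle times.
from scipy.stats import spearmanr

# Completed work items: (story points, cycle time in days) - illustrative only
items = [
    (1, 2), (2, 9), (2, 3), (3, 4), (5, 5),
    (3, 12), (8, 6), (5, 15), (2, 2), (8, 20),
]

points = [p for p, _ in items]
cycle_times = [c for _, c in items]

# Spearman rank correlation: values near 1.0 mean bigger estimates reliably
# take longer; values near 0 mean your estimates say little about elapsed time.
rho, p_value = spearmanr(points, cycle_times)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```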


Here are a couple of alternatives:

  • T-Shirt sizing is an estimation method where teams categorise work into buckets of similar size and give them a label (S, M, L, XL, etc.). The non-numeric labels make it harder to fall into the trap described above, since you can't simply add numbers together. Instead, if you measure cycle times for each category of work, you get a much better sense of how estimates are translating into real outcomes.

  • Throughput is the approach of simply counting the number of work items a team completes in a given period of time. Estimation becomes more about slicing work down - "is this story small enough to complete within our sprint or cadence?" This can provide just as good a sense of progress as velocity, without as many harmful side effects (both alternatives are sketched in code after this list). See this blog for an in-depth rundown on throughput.
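Here's a rough sketch of both alternatives using hypothetical completed-item data: median cycle time per T-shirt size, and throughput counted per sprint. The sizes, durations and sprint numbers are invented purely for illustration.

```python
# A hedged sketch (made-up data) of the two alternatives above.
from collections import defaultdict
from statistics import median

# (t-shirt size, cycle time in days, sprint completed in) - illustrative only
completed = [
    ("S", 2, 1), ("S", 3, 1), ("M", 5, 1), ("L", 11, 2),
    ("S", 2, 2), ("M", 6, 2), ("M", 4, 3), ("L", 9, 3), ("S", 1, 3),
]

# Cycle time per size bucket: do the buckets actually behave differently?
by_size = defaultdict(list)
for size, days, _ in completed:
    by_size[size].append(days)
for size, days in sorted(by_size.items()):
    print(f"{size}: median cycle time {median(days)} days over {len(days)} items")

# Throughput: simply count the items finished in each sprint.
per_sprint = defaultdict(int)
for _, _, sprint in completed:
    per_sprint[sprint] += 1
print("Throughput per sprint:", dict(sorted(per_sprint.items())))
```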


2) You're estimating epics and features in story points

Story point estimation encourages an overly simplistic forecasting technique: add all of the estimates together and divide by average velocity to get a finish date. The previous section alone should have you very wary of this method.
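To make the technique concrete, this is the arithmetic it reduces to (hypothetical numbers). The division itself is trivial; the problem is how much trust it invites in the inputs:

```python
# The naive velocity forecast described above - hypothetical numbers only.
remaining_points = 480        # sum of all story point estimates in the backlog
average_velocity = 32         # points completed per sprint, historically
sprint_length_weeks = 2

sprints_needed = remaining_points / average_velocity       # 15 sprints
weeks_to_finish = sprints_needed * sprint_length_weeks     # 30 weeks
print(f"Naive forecast: {sprints_needed:.0f} sprints (~{weeks_to_finish:.0f} weeks)")
```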


It's made even worse when large, unrefined work items are given estimates - a "500 point feature", a "2000 point epic". Really, this is just taking a rough calendar-based guesstimate ("about 3 months") and disguising it as a data-driven, agile approach to forecasting.


Because these work items have been given a number, they are often then, implicitly or explicitly, given an expected completion date based on the team's velocity. This leads to problems when the team breaks the work down. If they discover the work is larger than the initial "2000 point" guesstimate, they are already behind on a date commitment that was made prematurely.


Velocity is a measure of how much small, sprintable work a team has been able to complete in the past. The huge, unrefined ideas in your backlog are not sprintable work, so estimating them in terms of velocity lacks validity and confuses the use of the metric.


What you really want to know is how much sprintable work these big ideas will end up breaking down into. This often requires doing some of the work - discovery, prototyping, user story mapping, spikes. If you can't do this, then keep your estimates vague enough to match your current level of certainty, and describe them in different terms to your sprintable work - "about 3-5 months".


3) You have a target for velocity

One of the easiest places to spot dysfunction in agile delivery is to take a look at how a team does their planning. It's common to see teams with a fixed scope of work, fully estimated in story points, with a "target velocity" calculated to meet a fixed date deadline.


Product development is a process of progressive learning and discovery, and your team may find that the work is harder and taking longer than the original estimation assumed. If the team's performance has been pinned to a "target velocity" then they have two main options:

  • Speed up in the short term by producing lower quality work.

  • Start inflating estimates. What used to be a "2" is now a "5".

Both of these choices give you a false sense of progress and will end up hurting you.

People and teams respond to incentives, so judging them on their ability to meet a target velocity will likely cause the harmful behaviours described above. If the team isn't making as much progress as you'd like, work out what's getting in their way, help them remove blockers and support their continuous improvement.


Consider using throughput as your main output measure, as the incentives are much better aligned than with velocity. Teams looking to increase their throughput start slicing their work smaller, and teams with smaller work items are more productive, predictable and responsive.


4) You're equating velocity with performance

Teams that use velocity have a tendency to use it as their primary or only measure of delivery health and performance. This provides a very narrow, blinkered view of how the team is going.


We don't just care about how much "stuff" the team does. We also care about predictability, responsiveness, value, risk, cost and sustainability. Good delivery metrics provide a balanced view across these dimensions.


We use velocity now - what should we consider?

You don't have to stop using velocity, but try adding some additional metrics to supplement and sense-check it (throughput, work age and work in progress are a good start). At a minimum, you'll get a more rounded view of your delivery and the validity of your forecasts. You may find that you can eventually phase out velocity and put estimation time to better use.
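If your work items have start and finish dates, these supplementary metrics are cheap to compute. Below is a rough sketch with hypothetical issue records; the field names and the 14-day window are illustrative, not tied to any particular tool:

```python
# A rough sketch (hypothetical issue records) of the supplementary metrics
# mentioned above: throughput, work in progress and work item age.
from datetime import date

today = date(2020, 6, 15)

# (id, started, finished or None) - illustrative only
issues = [
    ("A-1", date(2020, 5, 20), date(2020, 6, 2)),
    ("A-2", date(2020, 5, 28), date(2020, 6, 10)),
    ("A-3", date(2020, 6, 1), None),
    ("A-4", date(2020, 6, 8), None),
]

# Throughput: items finished within the last 14 days.
throughput = sum(1 for _, _, done in issues if done and (today - done).days <= 14)

# WIP: items started but not yet finished.
in_progress = [(i, started) for i, started, done in issues if done is None]

# Work item age: how long each in-progress item has been open.
ages = {i: (today - started).days for i, started in in_progress}

print(f"Throughput (last 14 days): {throughput}")
print(f"WIP: {len(in_progress)}, ages (days): {ages}")
```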


If your team uses JIRA, then you're already collecting a lot of data on your delivery - but the standard JIRA reports make it hard to understand and communicate. Mazzlo's easy-to-read metrics and insights provide a better way to communicate delivery health and performance. If you're interested in a free trial of Mazzlo Delivery Analytics, click here or contact us.
