Flight Levels Blog
Six Dimensions of Performance
Troy Magennis
10.11.2020

Just increase team velocity. Just do ten times the work in half the time. These things alone won’t guarantee success. Organizations need more than just “more, faster” to succeed. They need the right things; they need these things to work; they need to know they can continue to do those things. In short, it isn’t as easy as measuring “more.” 

Many problems occur because we view “good” or “bad” using a single measure. The real world is more complex, requiring people to solve dilemmas. The problem with individual performance metrics is that they obscure the price paid by improving that one measure. We need to help people see these hidden costs. 

This article lists the six dimensions that compete with each other in flow-based systems that aim to deliver the highest value to customers. For each measurement dimension, challenge teams not only to improve one metric but also to watch for the warning signs of improving that metric too fast or too much. You can focus too much or too little on each measurement dimension; it's about balance. For each dimension, I've listed some starting points of over- and under-focus in a system when pushing to improve this metric.

My tip for getting a balanced performance dashboard is to find one measure in each dimension. Don't worry about the perfect metric; worry about getting one metric in each dimension, even if it isn't perfectly captured or clean.

“Do Lots” – How Much/Many?

Measure how much raw work product is flowing, ideally not just through development but all the way to customers. This measure isn't about customer value; it is evidence that the system is moving items (has flow). It is also useful for forecasting future delivery when in balance with the other five dimensions.

Too little focus or capability

  • Dissatisfied customers or stakeholders not getting what they need. 
  • Demand > supply, but you don’t know it.

Too much focus

  • Declining quality causing defects and re-dos (not “really” done yet)
  • Less valuable “easy” features delivered rather than most needed

Typical metrics in this category

  • Throughput – count of items or tickets per day/week/sprint
  • Velocity – sum story points per sprint
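As a sketch of how a throughput metric might be computed from ticket completion dates (the dates here are illustrative assumptions, not data from this article):

```python
from collections import Counter
from datetime import date

# Illustrative completion dates for finished items (assumed data)
completed = [
    date(2020, 11, 2), date(2020, 11, 3), date(2020, 11, 3),
    date(2020, 11, 9), date(2020, 11, 10), date(2020, 11, 12),
]

# Throughput: count of finished items per ISO calendar week
throughput = Counter(d.isocalendar()[1] for d in completed)
print(dict(throughput))  # {45: 3, 46: 3}
```

The same counting works per day or per sprint; only the grouping key changes.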

“Do it Fast” – How fast?

Respond to and deliver things quickly, given their complexity and novelty. The easiest way to improve this measure is to finish something in progress before starting something else.

Too little focus or capability

  • Customers frustrated in how long it takes to get changes

Too much focus

  • Declining quality causing defects and re-dos (not “really” done yet)
  • Less valuable “easy” features delivered rather than most needed

Typical metrics in this category

  • Time in State – the time an item was within a “state,” for example, “In Development.”
  • Cycle time – the time from start to finish at some boundaries in your system
  • Lead time – the time from some commitment to delivery (to the person the commitment was made to)
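A minimal sketch of computing cycle time from start and finish timestamps at chosen system boundaries (the item keys and timestamps are invented for illustration):

```python
from datetime import datetime

# Illustrative (key, started, finished) records at your chosen boundaries
items = [
    ("A-1", datetime(2020, 11, 2, 9), datetime(2020, 11, 5, 17)),
    ("A-2", datetime(2020, 11, 3, 9), datetime(2020, 11, 4, 12)),
    ("A-3", datetime(2020, 11, 4, 9), datetime(2020, 11, 11, 17)),
]

# Cycle time in (fractional) days between start and finish
cycle_times = {key: (done - start).total_seconds() / 86400
               for key, start, done in items}
for key, days in cycle_times.items():
    print(f"{key}: {days:.1f} days")
```

Time in State and Lead time follow the same subtraction, just with different boundary timestamps.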

“Do it Predictably” – How consistent is the delivery of value?

Delivery occurs at a consistent pace rather than in feasts and famines of delivered value to customers; for example, the variance of the pace measured in "Do Lots." This dimension helps surface shorter-term process instability (the sustainability metric, coming up soon, measures longer-term system stability).

Too little focus or capability

  • Periods of good progress alternating with periods of lower value delivered to customers.

Too much focus

  • Less risky “known” features delivered rather than most valuable or needed
  • Little incentive to push process improvements in case they cause a temporary decline

Typical metrics in this category

  • Variability of throughput or velocity
  • Variability of the delivered customer value
  • Net Process Flow: Things Delivered – Things Started. This measure shows balance through the system with variability represented as a higher or lower peak, with the desired state hovering around zero.
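The Net Process Flow measure above can be sketched as the running difference between items delivered and items started per period (the weekly counts are illustrative assumptions):

```python
from itertools import accumulate

# Illustrative per-week counts (assumed data): items started vs. delivered
started   = [5, 4, 6, 3, 5]
delivered = [4, 5, 5, 4, 5]

# Net flow per week and its running total; a balanced system hovers near zero
net = [d - s for d, s in zip(delivered, started)]
cumulative = list(accumulate(net))
print(net)         # [-1, 1, -1, 1, 0]
print(cumulative)  # [-1, 0, -1, 0, 0]
```

A cumulative total drifting well below zero signals work starting faster than it finishes.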

Tip: For a variability measure, consider using the coefficient of variation (Standard Deviation / Mean) rather than the Standard Deviation alone. Higher values naturally have a higher Standard Deviation for the same percentage change; dividing by the mean normalizes that.
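This tip is easy to demonstrate with two throughput series at different scales but the same relative variation (the numbers are invented for illustration):

```python
from statistics import mean, stdev

# Two illustrative throughput series with the same relative variation
low_scale  = [4, 5, 6, 5, 4]
high_scale = [40, 50, 60, 50, 40]

def cv(values):
    """Coefficient of variation: standard deviation / mean."""
    return stdev(values) / mean(values)

# Raw standard deviations differ tenfold, but the CV is the same
print(stdev(low_scale), stdev(high_scale))
print(cv(low_scale), cv(high_scale))
```

The standard deviations alone would suggest the second series is far less predictable; the normalized measure shows they vary identically in percentage terms.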

“Do it Well” – How good was the quality versus expectations?

A measure of how well the delivered things solve a problem or need. This measure is often called Quality and is one of the hardest to get a handle on. The goal isn't purely quality; it serves as an early warning sign that a system is being pushed to deliver beyond its capability.

Too little focus or capability

  • Rework. What is delivered needs to be corrected
  • Customer dissatisfaction. 
  • Production issues. 

Too much focus

  • Little or no delivery of value or flow of items due to “just a little more testing.”
  • Slow feedback if the wrong thing is built (albeit perfectly functioning)

Typical metrics in this category

  • Escaped defects. Defects found outside of the development and delivery team
  • Customer satisfaction. Customers don’t like what you built and tell you
  • Production rollbacks. Second and third releases to get a stable, working system
  • Unplanned downtime. Issues in production outside of planned change windows

“Do Valuable Stuff” – How valuable was it to the customer?

A measure of how much value customers derive from released features or projects. The goal isn’t purely customer value; it serves as an early warning sign that a system is being pushed to focus on work output rather than an outcome. 

Too little focus or capability

  • Rework. What is delivered needs to be revisited to deliver “more” of this feature
  • Customer dissatisfaction. Internal feeling that work is flowing well, but the customers aren’t feeling the value.

Too much focus

  • Increasing technical debt. Teams consistently skip technical debt reduction items for supposedly higher value items.
  • Lack of prioritization for strategic work that is mid to longer-term (current customers happy, but declining entry into new markets or targets).

Typical metrics in this category

  • Cost of delay. An economic view of the cost of NOT doing work to the customer and organization.
  • Alignment to strategy. Prioritized work allocation matches a planned strategic allocation
  • Customer satisfaction. Customer feedback confirms what was delivered solved a problem with high satisfaction.

“Keep Doing It” – How sustainable is the delivery system (and people)?

A measure of how likely it is that the current performance of the development and delivery system can continue in the future. It is often called the "happiness" metric, but it's more important than that label suggests. When teams push hard on improving the other metrics, it sometimes takes a toll that causes a decline in the future. The goal of this metric is to be an early warning indicator of that gloomy future performance.

Too little focus or capability

  • The current performance measures aren’t maintained.
  • The collapse of delivery.

Too much focus

  • Stagnant performance improvement over time. The other metrics stay flat.

Typical metrics in this category

  • Team health via survey or team retrospective (an honest answer to "are we able to continue at this pace?")
  • The aggregate of the other performance metrics (D1 to D5)

History and credit:

Larry Maccherone came up with the first four dimensions in his Software Development Performance Index (SDPI). He combined measures of Productivity, Predictability, Quality, and Performance to assess different pieces of Agile folklore (best team size, sprint length, and co-location, for example) against 10,000 projects for his employer, Rally Software, with help from Carnegie Mellon's SEI. He noted that it was possible to do well on all four and suggested the addition of the Happiness metric, which is coded here as Sustainability. I added the value metric based on the observation that we could deliver lots of un-useful stuff, and I wanted a balance against just increasing velocity or throughput with no real customer impact (I'm sure others also noted that omission). I also took liberties, renaming the dimensions after teaching them in training. I think the Agile community owes a massive debt of gratitude to Larry's work; I certainly do.

Troy Magennis is a Flight Levels Guide. He offers training and consulting on Forecasting and Metrics related to Agile Planning. Find out more at FocusedObjective.com

OKRs and Flight Levels @ Bayer | #FlightClub
Cliff Hazell
03.07.2020

In this week's episode of #FlightClub, we talked with Verena Fischer and Christian Putter from Bayer about how they turn Strategy into Reality with OKRs and Flight Levels.

They share their story from inside the company and how the journey evolved over time, including the simulator they used to test out their design before they started. They invite others to experience each of the different Flight Levels in action.

Importantly, they used this simulator to invite feedback from everyone, working together to improve how they operate. This is crucial because it shifts the focus from "installing change" to "designing collaborative improvement", which is so important if you want long-lasting change.

Check out the episode here…

Don’t forget to hit the Subscribe button on YouTube

Sign up on Meetup to join future events

As you know, the first rule of Flight Club is "Always talk about Flight Club." So tell your friends, colleagues, and family!

Fair Pricing for Online Workshops
Cliff Hazell
01.07.2020

(Updated July 9th: added the 0.4 region, including Thailand, Brazil, and South Africa)

From today, we’re offering adjusted fair pricing based on where you live, for all online workshops.

Our aim is to make our workshops more accessible to folks around the world, regardless of where they live or what their economic situation looks like.

Today, we're starting by removing some pricing obstacles that prevent fair and broad access to knowledge and learning.

Pricing adjustments will initially be grouped into three regions, based on PPP (purchasing power parity).

Standard Rates

Australia, Austria, Bahrain, Belgium, Brunei, Canada, Cyprus, Denmark, Finland, France, Germany, Ireland, Italy, Iceland, Japan, Kuwait, South Korea, Luxembourg, Macau, Malta, Netherlands, New Zealand, Norway, Oman, San Marino, Saudi Arabia, Singapore, Spain, Sweden, Switzerland, Taiwan, United Arab Emirates, United Kingdom, United States, and Qatar.

Adjusted Rates 40% (Standard price x 0.4 = PPP.4 price)

Brazil, Thailand, South Africa

Adjusted Rates 60% (Standard price x 0.6 = PPP.6 price)

All countries not listed above.
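The adjustment itself is simple multiplication; here is a minimal sketch, with abbreviated country lists and made-up region keys (not an official implementation):

```python
# Illustrative PPP price adjustment; region keys and country lists are
# assumptions for this sketch, with the full lists abbreviated
MULTIPLIERS = {
    "standard": 1.0,  # e.g. Germany, United States, ...
    "ppp_0_4": 0.4,   # Brazil, Thailand, South Africa
    "ppp_0_6": 0.6,   # all countries not otherwise listed
}

def adjusted_price(standard_price, region):
    """Apply the regional multiplier to the standard workshop price."""
    return round(standard_price * MULTIPLIERS[region], 2)

print(adjusted_price(500, "ppp_0_4"))  # 200.0
```

So a workshop priced at 500 in standard-rate countries would cost 200 in the 0.4 region and 300 in the 0.6 region.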

We value Fairness, and want to continue to improve how we promote fairness.

This is only the start.