Thursday, May 31, 2007

Metrics in the Lean and Agile World

I ran across an interesting list of metrics from this site. The author of the post gathered these in a brainstorming session with his organization to define potential metrics for how they were doing with Agile. The author also asks his readers what metrics a team could use to deceive management. It's an interesting read. Here's the list of metrics they came up with:

Short Iterations - Planning Game, Variable Scope:
* Relative velocity (measures accuracy of estimation and amount of developer time spent on other activities)
* Number of stories planned for the delivery vs number delivered
* Overflow of stories (measures accuracy of estimation)
* Degree of story change (measures activity in the business area and stability of business process)
* Number of problems fixed from previous iteration vs number of new stories implemented

Continuous Integration:
* Time to build
* Number of successful/failed builds

Available Customer:
* Hours spent with developer/customer per week
* Speed of feedback vs resolution
* Number of raised features vs number of implemented features

Refactoring:
* Lines refactored
* Time spent refactoring vs new code
* Most frequently changed files
* Number of files changed per feature

PM Tools:
* Effort/time spent per feature
* Actual effort vs expected effort

Frequent delivery into Production:
* Amount of defects not caught by test/review

Regular demos for feedback:
* Degree of incorporation of feedback into iterations (measures accuracy of story gathering and accuracy of implementation)

Lessons learned for each iteration:
* Traceability of lesson as applied to future iterations/practices

Test-Driven Development:
* Degree of test case coverage
* Test case failure rates
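Several of the metrics above reduce to simple ratios once the raw counts are in hand. As a minimal sketch (the story counts here are hypothetical; substitute your own tracking data), planned-vs-delivered and story overflow could be computed like this:

```python
# Hypothetical iteration numbers -- substitute your own tracking data.
planned = 12      # stories planned at the start of the iteration
delivered = 9     # stories actually delivered by the end

carried_over = planned - delivered          # overflow of stories
delivery_ratio = delivered / planned        # planned vs delivered

print(f"Delivered {delivered}/{planned} stories "
      f"({delivery_ratio:.0%}); {carried_over} overflowed")
```

The arithmetic is trivial; the hard part, as the rest of this post argues, is deciding whether the number tells you anything worth acting on.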

Many managers would look at this list and think it's pretty good. In fact, they would probably think of other measures to add. But what is the return on that investment? Some of these metrics might be easy to gather; others could require more tooling or manual effort. And even if you do gather the information, how are you going to use it? If you don't know how to interpret the data, you may find yourself managing the wrong things.

For these reasons, I have always struggled to find metrics that are useful, easy to gather, and promote positive behavior. Metrics seem to be used more to punish bad behavior than to encourage better behavior. Lean software development advocates push teams to "measure up." What does that mean? Metrics are typically managed at too low a level in the organization, so companies come up with dozens of metrics (similar to the list above) and ultimately get consumed with too many things to track. Why not find metrics that give you just enough information, measured in terms of customers and deliveries? What if those metrics encouraged positive behavior while showing you where to look in more detail for inefficiencies?

Following are the measures I am considering in our adoption of Lean and Agile. Note that I haven't tried these out yet; the focus right now has been on getting the practices and principles down before looking at efficiencies. But based on my reading and understanding of both Agile and Lean, here is my list:

Story Cycle Time (Measuring One-Piece Flow): Average time from customer request to customer implementation for each story. Encourages smaller pieces (stories) that bring value, and encourages finding ways throughout the organization to deliver those pieces faster.
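A quick sketch of how this average might be computed, assuming you record a request date and an implementation date per story (the dates below are made up for illustration):

```python
from datetime import date

# Hypothetical (requested, implemented) date pairs for three stories.
stories = [
    (date(2007, 5, 1), date(2007, 5, 10)),
    (date(2007, 5, 3), date(2007, 5, 17)),
    (date(2007, 5, 8), date(2007, 5, 15)),
]

# Elapsed days from customer request to customer implementation.
cycle_times = [(done - requested).days for requested, done in stories]
avg_cycle_time = sum(cycle_times) / len(cycle_times)

print(f"Average story cycle time: {avg_cycle_time:.1f} days")
```

Tracking the trend of this average across iterations matters more than any single value.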

Iteration Efficiency (Measuring Value vs. Waste): Percentage of time spent on value (customer features) vs. waste (defects, refactoring, and anything else the customer doesn't believe provides value). Encourages the activities the customer sees as valuable and discourages those the customer sees as waste.

Iteration Return on Investment (ROI) (Measuring Value vs. Cost): Ratio of total value (customer and business value) to the cost of implementing the stories. Encourages prioritizing things based not just on value but on value against the cost of implementation, and confirms to the team that they are working on the most important things.
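As a ratio, this could be sketched like so, assuming each story has an estimated value and an implementation cost in the same units (the figures below are invented for illustration):

```python
# Hypothetical per-story figures: (estimated value, cost to implement).
stories = [(5000, 2000), (3000, 1500), (8000, 2500)]

total_value = sum(value for value, _ in stories)
total_cost = sum(cost for _, cost in stories)

# Ratio of value delivered to cost of delivery for the iteration.
roi = total_value / total_cost

print(f"Iteration ROI: {roi:.2f}")
```

Putting a dollar figure on "customer and business value" is of course the hard part; the ratio is only as good as those estimates.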

Story Acceptance Measure (Measuring End-User Acceptance): Average acceptance (user rating) of each story after it has been implemented. Encourages making sure the customer's needs were really met, and encourages quality of solution from the customer's standpoint. This may be the most difficult to gather because it requires polling at least a good sample of customers.
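Once the ratings are collected, the measure itself is just an average. A minimal sketch, assuming a 1-5 rating scale (both the scale and the ratings below are hypothetical):

```python
# Hypothetical user ratings (1-5 scale) collected for one delivered story.
ratings = [4, 5, 3, 4, 4]

# Average acceptance rating for the story.
acceptance = sum(ratings) / len(ratings)

print(f"Story acceptance: {acceptance:.1f} / 5")
```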

What do you think? Did I miss any measures at the organizational level?
