
Your Best Metrics May Be Intangibly Quantum Entangled

“That underscores the limits of trying to come up with a single statistical measure of the nation’s economic activity. Government statisticians have to make some heroic assumptions and generalizations to incorporate intangibles like R&D and works of art. Not knowing their true worth, the BEA assigns value to them by estimating the cost to create them. It thus undervalues Hollywood blockbusters and overvalues big-budget flops, assuming that the errors will cancel out in the long run.” (The Rise of The Intangible Economy, Bloomberg Businessweek, Jul 22, 2013)

It didn’t need to be perfect. It just needed to roughly represent what was really going on. The engineering manager kept telling us that the defects would be fixed by Friday, and today was Monday. But before another day went by, we got in a new batch of defects from the test team, even more the day after that, and then we were told that everything would be fixed by next Monday. We were slipping our project delivery date week by week, no one seemed to see what was going on, and everyone kept saying we were just about done. I just needed something that put a hard, visible measurement on what we were clearly experiencing so we could manage this monstrosity.

By Thursday, we had a whole new batch of defects, and the issues from Monday seemed all but forgotten. Engineering kept saying it took them about three days to resolve a defect. Nothing we were actually seeing (the constant daily and weekly slips) matched up with what people were saying. It was complete confusion. There were so many incoming defects, and no one really seemed to be on top of when they would stop.

I decided to back up, jump out of the fire, and take a look at what was going on with the defects that had been reported Monday morning. Good: some actually had been resolved, but others were clearly not getting any traction. By the time the week was over, we had slipped out another week, received a new onslaught of issues, and still not resolved even half of those Monday defects.

So it was clearly not the case that we were resolving defects in three days, as everyone kept saying. It was also this “three days” belief that kept us claiming we would be done by next week, a constant mantra repeated by everyone from the engineering manager up to our senior managers. It drove me nuts that no one seemed to see the disconnect!

I decided to pull all the defect data from the last 30 days. I noticed that the average time to fix a defect varied widely, so I then pulled data from the previous 90 days and computed a moving average (later, I would compute a monthly moving average over the entire length of the project). I computed the mean and standard deviation of the fix times, but then noticed the distribution was skewed and did not even remotely resemble a normal distribution. Since that meant the computed standard deviation was probably useless, I instead enumerated the data to find the number of days it was taking us to fix 95% of the defects reported in any given period (i.e., in a given day, week, or month).
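The calculation itself is simple enough to script. Below is a minimal sketch of the idea rather than the exact tool I used; the CSV file name and column names (reported, resolved) are hypothetical stand-ins for whatever your defect tracker actually exports. It groups fix times by the month a defect was reported and prints the mean alongside a nearest-rank 95th percentile, which is a safer summary than the standard deviation when the distribution is skewed.

```python
# Sketch: summarize defect fix times from an exported tracker file.
# The file name and column names are hypothetical; adapt them to what
# your defect tracking system actually exports.
import csv
from datetime import datetime
from statistics import mean

def load_fix_times(path):
    """Return a list of (reported_date, days_to_fix) for resolved defects."""
    rows = []
    with open(path, newline="") as f:
        for rec in csv.DictReader(f):
            if not rec["resolved"]:
                continue  # still open; excluded from fix-time stats
            reported = datetime.strptime(rec["reported"], "%Y-%m-%d").date()
            resolved = datetime.strptime(rec["resolved"], "%Y-%m-%d").date()
            rows.append((reported, (resolved - reported).days))
    return rows

def percentile(values, pct):
    """Nearest-rank percentile; makes no normal-distribution assumption."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def monthly_stats(rows):
    """Group fix times by the month the defect was reported and summarize."""
    by_month = {}
    for reported, days in rows:
        by_month.setdefault((reported.year, reported.month), []).append(days)
    for year, month in sorted(by_month):
        days = by_month[(year, month)]
        print(f"{year}-{month:02d}: n={len(days):4d} "
              f"mean={mean(days):5.1f} p95={percentile(days, 95):4d} days")

if __name__ == "__main__":
    monthly_stats(load_fix_times("defects.csv"))
```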

What did I find? We were averaging 10 days to resolve a defect (i.e., know how to fix it), 20 days to actually fix it (i.e., get the fix into a product build), and 45 days to fix all the defects reported in any one week (this was a huge project, totaling about 15K defects over its full course). I later went back and pulled defect data from several recently completed projects. The numbers were consistent for the stage of the project we were in (the defect fix rate began slowly, accelerated to a peak, and then slowed near the end).

For more on using defect data, see Defect Reports Are Your Best Friend!

So the conclusion was this: even though our best people continued to say each week that everything would be fixed by Friday, even if we didn’t receive one more defect it would still take us 45 days just to clean up the backlog of defects we already knew about. Add to that the nice, emerging bell-shaped curve of daily newly reported defects, and it was clear we were nowhere near shipping by next week. What finally happened? The project was completed nine months late.
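One simple way to sanity-check a “done by Friday” claim against this kind of data is a back-of-the-envelope lower bound: divide the open backlog by the observed fix throughput and ignore all future arrivals. The numbers in this sketch are hypothetical, not the project’s actual figures; they just show the shape of the argument.

```python
# Back-of-the-envelope schedule floor (illustrative numbers only):
# even if zero new defects arrive, the open backlog divided by the
# observed fix throughput sets a lower bound on the remaining time.
open_backlog = 900        # defects currently open (hypothetical)
fixes_per_day = 20        # fixes actually landing in builds per working day
floor_days = open_backlog / fixes_per_day
print(f"Best case, with no new defects at all: {floor_days:.0f} working days")
```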

See more on Our Best People Are The Source Of Our Problems

The key to using the defect data was that it captured, and was a good proxy for, everything that was going on in the last half of the project. I dug for and found data (the defect data, in this case) that closely mirrored the state of the project. It represented the net effort of the testers, programmers, manufacturing, managers, and so on: the combined process. Certainly, defect data alone didn’t fully describe everything going on in the project (requirements changes, changes of leadership, customer feedback, and the like), but it was clear that it reflected very accurately where the project stood at any given point in time relative to being ready to deliver to the customer.

See also The Average Is Your Friend.

As accurate and predictive as these defect trends were, a lot of folks argued, for example, that averaging large and small defects, or the time to repair defects from different parts of the product, was like mixing apples and oranges. Well, I eventually concluded that as long as my fruit cocktail matched up well with where the project was in its lifecycle, those criticisms didn’t matter much. The results were further supported by their consistency with the historical data from past projects.

Compare with Managing By Theory Or By Facts

On later projects, I used the defect data trends as my primary gauge of where we really were. Yes, I still used all the classic tools (PERT, Gantt, earned value, milestone completion, burn down, backlog, etc.), but nothing came close to the accuracy and predictive power of the defect trends.

Our project management metrics just need to capture how our project is really going. They may not be the classic measures, and they might track something that appears only loosely related to the nuts and bolts of the project. However, as long as a metric effectively reflects what is going on, it is one of the best project management tools we have available. If the skeptics still don’t get it, just tell them with a straight face that the metrics work because they are intangibly quantum entangled.

Are your metrics effective at getting above the fuzzy assumptions and showing how your project is really doing?
