Project Management Or Anesthetized By Analytics?

We all look for objective project management tools. Too often, however, the push to become fully objective costs us insights while we are still figuring out how to be analytical. Here are a few “gut” insights that have helped us avoid this problem.

I am a great advocate of being objective when managing people, projects, or organizations. However, my central theme is always that “you are your best project management tool.” This is a reminder that we humans, as tool users, need to always keep control of the tool.

In “7 Steps To Better Decisions” (InformationWeek, May 3, 2010), Gary Smith says of analytic methodologies (emphasis added):

“Decisions made based solely on intuition, gut feelings, and years of experience, while valuable, tend to be less effective than scientific methods.”

I love analytics; I had been doing it for many years before the word “analytics” came into popularity. Yet I’ve seen many methods (not just analytics) – perfectly good ideas – that just didn’t seem to work well in practice. Gary goes on to give seven steps for success at implementing analytics:

1. Define the problem
2. Identify relevant factors
3. Focus on data collection and preparation
4. Model the solution
5. Report the results
6. Implement the decision
7. Follow up

This is where, while I still agree with Gary, I start to get that uncomfortable feeling. This list reminds me of very prescriptive approaches, such as Six Sigma, where the notion is that good decisions will simply pop out at the end of the process. Our experience is that it just doesn’t happen that way.

I’ve described the challenge as the “diet dilemma,” where we get so caught up in following the methodology that we lose track of what we are trying to achieve. In the diet dilemma, there are dozens if not hundreds of diet plans and approaches available, but the bottom line is that we need to take in fewer calories than we burn. If we do that, the net effect is that we will lose weight. No, it is not easy for many people, but we know it works if we do it right. We understand, at a “gut” level, the underlying approach.

In using analytics (or even simple averages, which I advocate as a place to start), what we should be doing is “training our gut.” Gary doesn’t put much stock in the “gut,” and in Project Management or Death by Detail I relate one example of how folks who lived through project after project still could not observe, nor admit to, the patterns that led them to late project deliveries. This same organization had implemented a detailed analysis process for new customer requirements, intended to achieve rigorous and accurate estimates of project costs and schedules. One of the biggest companies in management consulting had helped them implement their approach. They did not succeed in achieving better estimates or on-time delivery.

What finally broke the pattern of late projects was to fall back on simple averages and to use them. Here we noticed that it was typically taking nine months to get new features to customers. This was in contrast to the shorter durations from the analytic approach that came out of the development department. The key here was that we used simple averages to help train everyone to know how long projects had taken in the past. People did not always like the estimates, but they understood them. We used our analytics to fully understand our experience (i.e., “train our gut”), NOT to substitute for our judgment.
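The arithmetic behind this approach is deliberately trivial. A minimal sketch, with made-up durations, of the simple-averages baseline:

```python
# Hypothetical durations (in months) of recently completed feature projects.
# These numbers are invented for illustration.
past_durations = [8, 10, 9, 11, 9, 8, 10]

# The "train your gut" baseline: a plain average of what actually happened,
# used as the estimate for the next, similar project.
average_duration = sum(past_durations) / len(past_durations)

print(f"Historical average: {average_duration:.1f} months")
```

Anyone in the room can check this number by hand, which is exactly why it trains the gut: people may not like the estimate, but they understand where it came from.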

What this suggests is that if we don’t understand the results, because they are too analytic or too complex, then the probability that the results will be useful is much lower. It still takes people who understand the solution to implement it. So to Gary’s list above, I would add “4.5 Understand the solution – make sure we truly believe it is a solution that could work.”

We had a Six Sigma project that was intended to tell us when our complex electronics product was ready to ship to the customer. It analyzed existing defects in products about to ship, correlated them with product returns from past products, and attempted to predict how many product returns we would have if we released the product now. It had correlations and all sorts of impressive charts. It was an interesting idea, but it never got used, because no one could understand it or the numbers it produced. But it sure sounded like it should have told us what we wanted to know.

What worked instead was again something much simpler. We looked at previous products and noted the typical number of incoming defects that existed when customers accepted our products. It turned out that when the total defects being reported dropped to under about 100 a month (all error types from worldwide testing, many of them duplicate reports of the same issue), the product generally got accepted by the customer. This understandable average gave a clearer indicator than the more intricate and mathematically intensive Six Sigma-based analysis. It also emphasized what we needed to achieve before we could expect the customer to accept the product (and before we would want to allow a customer to accept a product).
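One way to picture that shipping signal (the ~100 defects/month threshold comes from the text; the monthly counts below are invented for illustration):

```python
# Hypothetical monthly incoming defect counts for a product nearing release
# (all error types, worldwide testing, duplicates included).
monthly_defects = [420, 310, 240, 180, 130, 95, 80]

# Level at which, historically, customers tended to accept the product.
ACCEPTANCE_THRESHOLD = 100

def looks_ready(counts, threshold=ACCEPTANCE_THRESHOLD):
    """Ready when the most recent month's defect arrivals fall below the
    level observed at past customer acceptances."""
    return counts[-1] < threshold

print(looks_ready(monthly_defects))  # True: latest month (80) is under 100
```

The indicator is crude, but everyone can see why it says what it says, which is what the fancier correlation model never achieved.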

Another example was the use of Monte Carlo simulations to estimate schedules. A great idea, but at a Fortune 50 company I worked for, it was clear that no one in management understood the methodology or how it worked. The pressure for the schedule marketing wanted, rather than a realistic schedule, worked its way into the simulation inputs. All products with schedules estimated using this new and improved Monte Carlo method were late to market. They averaged the same three- to four-month slips as the previous products. Our final solution, which worked, was simply to assign the average product schedule actually observed in the recent past to the current products (i.e., we used an average schedule). We then delivered on time, with good quality, and just about all follow-on products also achieved on-time delivery (in equal part due to increased quality from the initial products not being developed on compressed schedules).
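To make the contrast concrete, here is a sketch (all numbers invented) of how a Monte Carlo schedule estimate fed with optimistic phase inputs compares to a plain average of past schedules:

```python
import random

random.seed(0)  # reproducible illustration

# Schedules actually observed on recent, similar products (months; invented).
historical = [12, 13, 11, 14, 12]
average_estimate = sum(historical) / len(historical)

# Monte Carlo: sample each development phase from a triangular distribution.
# If the (low, mode, high) inputs reflect the schedule marketing wants rather
# than history, the simulation faithfully reproduces that optimism.
optimistic_phases = [(1, 2, 3), (2, 3, 4), (2, 3, 5)]  # (low, mode, high)

def simulate_total(phases, runs=10_000):
    totals = [
        sum(random.triangular(low, high, mode) for low, mode, high in phases)
        for _ in range(runs)
    ]
    return sum(totals) / len(totals)

mc_estimate = simulate_total(optimistic_phases)
print(f"Average of past schedules:      {average_estimate:.1f} months")
print(f"Monte Carlo, optimistic inputs: {mc_estimate:.1f} months")
```

The simulation machinery is sound; it is only as honest as its inputs. The simple average, by contrast, carries its honesty on its face.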

The key to this success was that the analytics were not overly complex and were easy to understand (see also Project Management Tools on Steroids). We knew the approach had a good chance of working for the simple reason that we understood where it came from. It also helped that during the project, since we understood how the estimates were made and what they were based upon, it was fairly easy to make decisions when things went off track.

To make any analytic or objective approach work well, we found it best to use the results to train our guts and to help us understand the difference between what was normal and what was not. Analytics were not used to replace our judgment. It took decades before a computer was able to master chess well enough to beat the best human players. It should not be hard to believe that it will be a very long time before an analytic approach to project or business management replaces human intuition, decisions, and judgment. If we as humans don’t understand the results or the underlying methodology, it will be hard for us to manage effectively using them.

Analytics and objective, data-based management are a great way to improve the management of our projects and organizations. However, we should use the analysis to train our “gut,” not to replace our judgment. If we don’t understand the results and where they come from, then we won’t be as successful at using the analysis in our decision making. Often, during the initial transition to analytic decision making, the results will challenge our conventional judgment. Our experience, now trained by the use of analytics, should tell us whether these results make sense. If not, we should ensure we truly understand what the analytics are saying, and why, before making a decision based upon them.


5 thoughts on “Project Management Or Anesthetized By Analytics?”

  1. Betty says:

You can’t always go by gut feelings and need to consider the trends shown by analytics. In the case of project management tools, it really depends on the individual’s ability to use them effectively and to gain insights through the information such tools provide. We’ve migrated to MS Project 2010 and it offers many advanced tools for better project management.

    1. Bruce Benson says:


Agreed. That is why I talk about “training the gut.” I’ve seen many PowerPoint slides of data that had no meaning to anyone. They may have been wonderful analytical results, but they had no impact on the audience. In many cases, we’ve also seen too many high-data-density charts that ended with conclusions such as “we’ll be on time” or “we’ll increase sales by 30%” and then did neither. Too often we are numb to data, because too much of it has been effectively noise and bore no relationship to what finally happened.

      My pitch is that we have to train ourselves to understand the data and the analytics. That generally means living with the data as we go through a project or product launch. We come to understand what the analytic data looks like – really looks like – as things go right and when things go wrong. This helps to provide the individual with the ability to use the data effectively and to gain insight. I could glance at a defect arrival chart for a product and I knew immediately where the product was in its life cycle and how long until it would be ready to ship, all without knowing a thing about the schedule or even much about the specifics of the product.

      Thanks for the comment. Good luck with MS Project 2010.


  2. Betsey Smith says:

I’m in the middle of reading “How We Decide” (I’m sorry, I’ve forgotten the author’s name) and I believe the author would agree with you. As do I. Basically, the book says you can’t make a decision without emotions (or “gut”). In fact, he cites examples of people who have had injury to the emotion-related parts of their brain and are incapable of making a decision. They agonize over every detail and assess every possible outcome. When presented with two possible dates for his next appointment, one patient took 30 minutes deciding!

Like most other things, it’s about balance and knowing when to go with your gut and when to rely on “logic.”

    1. Bruce Benson says:


I see the “gut” a little differently. When I tell my kids not to touch a hot burner, they have knowledge – memorization. When they then touch the hot stove, they’ve trained their “gut.” They know, at a deeper level than memorized knowledge, not to touch something that is hot. They also have significant emotion related to this learning. We can also think of it as “experience” – where we know something because we’ve done it or been through it, rather than only having heard about it or seen it. I think of the “gut” as something that gets developed by experience. We know immediately what to do or what something means – even though it might be hard to explain or articulate that understanding. It has dimensions of logic and emotion – melded together.

Starts to sound a bit metaphysical, but operationally – when we exposed people to the objective data as they lived through the project events – they more quickly and accurately recalled what happened during the project. They also more quickly recognized significant events when they happened again, and had better knowledge of what they meant and how to respond to them. Without the objective data, we see too many “business myths” arising that often don’t match any reality once we gather the data and compare it to the actions being taken.

      Thanks for the insight into emotions and decision making. I may need to look more into it.

