Five Reasons Managing Projects Objectively Is Hard And How To Succeed Anyway
I’m a strong advocate of managing with objective, data-driven project management tools. I often call it being “brutally honest,” with the insight that it is only “brutal” to organizations that have locked themselves into less-than-objective approaches. Here are five reasons I’ve observed why being objective is hard, and the ways we successfully overcame these barriers.
Good Data Is Hard To Find
Often, the data that is available has been “sanitized” so much that it no longer provides any real insight into how the project is going. In one case, we had hundreds and eventually thousands of defects to resolve in our product. Because no one could comprehend all of these defects, the decision was to talk to senior management only about the most important issues. This small percentage of all defects didn’t provide much of a true view of all the issues in the product. The somewhat humorous breakthrough came when we noticed that the movement of this 1% correlated well with the other 99%. It occurred to me that this would also be the case if the 1% of defects to be reported were just picked at random. I called it managing by the “tip of the iceberg.” The insight here was to confirm that the subset still provided useful information. Too often, however, this kind of “selective” data use obscures real trends and renders the reported data essentially useless for decision making.
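To see why a randomly chosen sliver can still track the whole, here is a toy simulation in Python. It is purely illustrative; every number in it is invented rather than taken from any project’s actual defect data:

```python
# Toy simulation: does a random 1% sample of defects track the full trend?
# All numbers here are invented for illustration.
import random

random.seed(42)

WEEKS = 60
N_DEFECTS = 5000

# Each defect opens in some week and closes a random number of weeks later.
defects = []
for _ in range(N_DEFECTS):
    opened = random.randint(0, WEEKS - 20)
    closed = opened + random.randint(1, 20)
    defects.append((opened, closed))

# The "tip of the iceberg": a 1% sample picked completely at random.
sample = random.sample(defects, N_DEFECTS // 100)

def open_counts(ds):
    """Number of defects still open in each week."""
    return [sum(1 for o, c in ds if o <= w < c) for w in range(WEEKS)]

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

full = open_counts(defects)
tip = open_counts(sample)
print(f"correlation(full trend, 1% sample): {pearson(full, tip):.3f}")
```

With these made-up numbers, the 1% sample’s weekly trend typically correlates with the full trend at well above 0.9: a representative sliver of the iceberg moves with the iceberg.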
Make sure our data shows the big picture and is not so filtered and sanitized that we miss the overall trends.
Good Data Is Locked Away
Too often, data is locked away in systems that we can’t get to. The most grievous approach, in my mind, is where a committee of some type controls the types of reports that can be created and the data that can go into them. Over time this generally results in useless reports. In one company, every time the controlling powers created a group of charts and reports to be used, within weeks the managers had come up with their own and used those instead. The “reports” group simply could not keep up with all the different interests of managers (often legitimate, as each project is different, even while being very similar). The best approach we found is to allow anyone to query the data and make their own use of it. It is OK to have an “approved” set of charts, slides, and reports, but we should always have the ability to pull the data ourselves to generate new insights, or often just to check the “official” data. The saddest approach I’ve ever seen was a project requirements tracking database where the “controllers” eventually removed all ad hoc reporting, so data could not be independently retrieved. It had been a great system, one I had helped get going at the beginning. It became another “data beast” to be fed, producing “required” reports that provided almost no insight into requirements management, so no one relied on it for actively managing projects.
In response, I spent a lot of my time creating mash-ups of data. I would come in early in the morning and run a collection of VBA programs embedded in Excel, Access, Outlook, and Internet Explorer. These would pull data from various databases, often by screen-scraping web pages. The run took about an hour, but that one comprehensive snapshot at the beginning of the day gave me great insight into how my own project was going (and many other related projects I was tracking). The bottom line was that I had to dust off my programming skills to get at the real data I needed, because it had been locked away.
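For concreteness, here is a minimal sketch of that kind of morning mash-up. The original was VBA driving Excel, Access, Outlook, and Internet Explorer; this sketch uses Python instead, and every URL, table, and field name in it is hypothetical:

```python
# A minimal sketch of the morning "mash-up" idea, in Python rather than the
# VBA the story describes. Every URL, query, and field name here is
# hypothetical; the point is the shape: pull from several sources, merge,
# and append one daily snapshot you can analyze yourself.
import csv
import sqlite3
import urllib.request
from datetime import date

def fetch_defect_page(url):
    """Screen-scrape a status page when no query interface is exposed."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def count_open_defects(html):
    # Crude scrape: count table rows flagged as open (hypothetical markup).
    return html.count('class="defect-open"')

def query_requirements(db_path):
    """Pull directly from a local copy of the requirements database."""
    with sqlite3.connect(db_path) as conn:
        (done,) = conn.execute(
            "SELECT COUNT(*) FROM requirements WHERE status = 'verified'"
        ).fetchone()
    return done

def write_snapshot(row, path):
    """Append today's numbers to a running CSV history."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(row)

if __name__ == "__main__":
    open_defects = count_open_defects(
        fetch_defect_page("http://defects.example.internal/project42/status")
    )
    reqs_verified = query_requirements("requirements.db")
    write_snapshot(
        [date.today().isoformat(), open_defects, reqs_verified],
        "daily_snapshot.csv",
    )
```

The appended CSV gives a day-by-day history you can chart independently of any “approved” reports, which was the whole point of the exercise.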
Make sure our data is available for wide use, so that it can be vetted by the masses and hence can be more accurate and useful for managing all projects.
Too Much Noise From Throwing Data Around
Just prior to a high-level project management meeting, I was sitting with one of the development VPs. My job was to help him manage his far-flung empire. While we sat and waited for a conference call to begin, he sketched a “data plot” on a piece of paper that showed near-perfect progress from today until the project delivery date. Quickly and with great efficiency, he typed this into a PowerPoint slide and built a great-looking chart. During his portion of the meeting, he popped it up to show how we had a plan to meet the deadline. Another VP asked him, “And your development managers agreed with this plan?” He said yes. This was a project that was already running very late and eventually completed nine months late. The point here is that too much data is “whipped together” to present the picture one wants to show, not the picture that exists based upon objective data.
The solution we found was to collect and present the actual data, early and often. I’ve seen “honest data” fall out of favor when it no longer showed we would deliver on time. Here, the exercise was to persist over time, sometimes showing the “approved charts” but pushing the envelope by including the “objective data” next to the approved view, to show a “range” of views (often framed as a risk analysis).
Work towards an environment where data is verifiable. For any information that is presented, the raw data behind it should be available to anyone for subsequent analysis.
Many People Just Do Not Believe Data
We had become the premier product project in the corporation. Now, however, the “rest” of the corporation was showing up to help us out. They were going to make it even bigger and better. I spent two straight days, from 3am to 9pm, getting all the data and the associated plan together. This was to show what we could realistically do and to demonstrate it was the best data and plan available. This Herculean effort was dismissed in a five-second statement: “Oh, the development team can do better than that. I’ve talked with them!” That was it. No one else in the meeting found this unusual, not even the development managers. The main reason, I would figure out later, was that many people came in with slides and data that showed all sorts of notions. None of that data ever worked, none of it was ever predictive of what would really happen, so why should mine be? It so happened that the product shipped, with mediocre quality that haunted later products for years, in the month I said it would ship. This was several months after the “new and improved plan” said it would ship.
The senior VP had the Project Management Office generate a chart that color-coded projects. Green meant on track: expected to deliver on time and meet volumes and margins. Yellow meant off track but recoverable, and red meant off track, not recoverable, and being re-planned. Over time, green came to mean a new project, yellow was associated with in-progress projects, and red marked all the projects struggling to completion. The color coding remained objective; it just no longer had meaning, as the organization found no effective way to deal with all of its projects continually being late.
Data is often just not seen as something that really works, so it is hard for folks to believe it when they see it. Because they’ve not seen good analysis before, they can’t recognize it when it shows up. Here, the best approach I’ve seen is to build a track record, often with smaller projects or ones off the critical path. When we finally, with great effort, got a “data-based” schedule put together and hit our ship deadline, the business manager who had been one of our greatest critics commented, “Hey Bruce, this stuff really works!”
Work towards accuracy, verifiability, and reliability in the data used. We often needed to start small to build confidence and to show that accurate data provided predictability that helped manage projects to successful completion.
Too Many Conflicting Efforts To “Improve” The Organization
We had gained some great insights into managing defects. Defect management, previously a horribly unknown and out-of-control monster, became predictable and boring (at least to me). Our Chief Quality Officer decided to help further “improve” our defect management process. It seems he had heard that some defects turned out to be non-software defects (usually configuration problems), which, if reported incorrectly, could inflate the software defect trends. He wanted to call them what they really were, so no one would group them in with the real defects. He decreed that they must be “recorded in the database correctly.”
The team that made the changes now required that these issues could only be entered one way: encoded as non-defects from the moment they were entered. But these were defects that only over time turned out to be non-issues, so they had to traverse the “real defect” path until late in the process. The change therefore required each of these issues to be “backed up in time” to make it look like we had always known they were not software defects. It rarely happened. So these non-software defects, which still had to be fixed, usually via a configuration change, often just disappeared from the system or were never shown as fixed. Because someone believed that the internal database coding had to show a certain value, we significantly degraded a great process that we had incrementally built up over time.
Work to ensure that our multiple improvement efforts don’t inadvertently cancel out the good improvements that have been made.
Objective project management is arguably one of the best ways to manage a project. However, the process of being “objective” is fraught with obstacles and requires patience, persistence, and practice to get the right data, get people to trust the data, and then manage projects to on-time delivery with good quality.
Comments
Good article. What I do is provide an explanation of exactly what the metric is and its parameters (i.e., what the stoplight colors mean) and ensure everyone is aware. If we change a criterion on a metric, it goes through team and stakeholder approval to avoid confusion. JGH
Jeff,
Sounds like a good process. I’ve seen some start out with good intentions but wither after a while. Sticking with it and making it work can turn a metric into a great project management tool.
Thanks
Bruce