Monday, February 12, 2007

Re-thinking what makes a project successful

In 1994, The Standish Group published a study called the CHAOS Report. The study focused on what causes projects to fail to be "successful," with success defined as on-time and on-budget. I have seen this report cited by many Agile practitioners as evidence for why traditional waterfall development projects do not work. Now that I have been using Agile practices for some time, I thought I would go back and review the report first-hand to get some insights.

I found the Introduction interesting; here is what it said:

In 1986, Alfred Spector, president of Transarc Corporation, co-authored a paper comparing bridge building to software development. The premise: Bridges are normally built on-time, on-budget, and do not fall down. On the other hand, software never comes in on-time or on-budget. In addition, it always breaks down. (Nevertheless, bridge building did not always have such a stellar record. Many bridge building projects overshot their estimates, time frames, and some even fell down.)

One of the biggest reasons bridges come in on-time, on-budget and do not fall down is because of the extreme detail of design. The design is frozen and the contractor has little flexibility in changing the specifications. However, in today's fast moving business environment, a frozen design does not accommodate changes in the business practices. Therefore a more flexible model must be used. This could be and has been used as a rationale for development failure.

But there is another difference between software failures and bridge failures, beside 3,000 years of experience. When a bridge falls down, it is investigated and a report is written on the cause of the failure. This is not so in the computer industry where failures are covered up, ignored, and/or rationalized. As a result, we keep making the same mistakes over and over again.


I find it interesting that in my many years of doing software projects, there have always been stakeholders who view projects this way, comparing them to building bridges or houses. However, for the reasons mentioned above, this view assumes that once completed, the software will last forever (or at least a very long time). This kind of thinking doesn't handle change well. It assumes that you have control over the future and can predict change, and that by restricting change you will have success. So, was that true? Here's what they found:

In the United States, we spend more than $250 billion each year on IT application development of approximately 175,000 projects. The average cost of a development project for a large company is $2,322,000; for a medium company, it is $1,331,000; and for a small company, it is $434,000. A great many of these projects will fail. Software development projects are in chaos, and we can no longer imitate the three monkeys -- hear no failures, see no failures, speak no failures.

The Standish Group research shows a staggering 31.1% of projects will be canceled before they ever get completed. Further results indicate 52.7% of projects will cost 189% of their original estimates. The cost of these failures and overruns are just the tip of the proverbial iceberg. The lost opportunity costs are not measurable, but could easily be in the trillions of dollars. One just has to look to the City of Denver to realize the extent of this problem. The failure to produce reliable software to handle baggage at the new Denver airport is costing the city $1.1 million per day.

Based on this research, The Standish Group estimates that in 1995 American companies and government agencies will spend $81 billion for canceled software projects. These same organizations will pay an additional $59 billion for software projects that will be completed, but will exceed their original time estimates. Risk is always a factor when pushing the technology envelope, but many of these projects were as mundane as a drivers license database, a new accounting package, or an order entry system.

On the success side, the average is only 16.2% for software projects that are completed on-time and on-budget. In the larger companies, the news is even worse: only 9% of their projects come in on-time and on-budget. And, even when these projects are completed, many are no more than a mere shadow of their original specification requirements. Projects completed by the largest American companies have only approximately 42% of the originally-proposed features and functions. Smaller companies do much better. A total of 78.4% of their software projects will get deployed with at least 74.2% of their original features and functions.

This data may seem disheartening, and in fact, 48% of the IT executives in our research sample feel that there are more failures currently than just five years ago. The good news is that over 50% feel there are fewer or the same number of failures today than there were five and ten years ago.


Would you call that success? Here we are 12 years later, and we still hear about or experience projects that take longer, cost more, and don't meet customers' needs. Those doing Agile have come up with a different definition of project success: continually producing increments of working software with the customer, prioritized by the highest value for the cost (ROI), while improving both the software and how you produce it over time through constant customer and team feedback. Success is defined by meeting or exceeding customer expectations early and often, not necessarily by the cost it takes to get there.
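
To make the "highest value for the cost" idea concrete, here is a minimal sketch in Python of how a team might order a backlog by a simple value-to-cost ratio. The story names and numbers are made-up illustrations of the idea, not anything from the report or a specific Agile tool:

    # Minimal sketch: order a backlog by a simple value-to-cost ratio.
    # The stories, values, and costs below are made-up example figures.

    backlog = [
        {"story": "Order entry screen",   "value": 80, "cost": 20},
        {"story": "Reporting dashboard",  "value": 50, "cost": 25},
        {"story": "Audit logging",        "value": 30, "cost": 5},
    ]

    # Highest value per unit of cost first -- a crude stand-in for ROI.
    prioritized = sorted(backlog, key=lambda s: s["value"] / s["cost"], reverse=True)

    for story in prioritized:
        ratio = story["value"] / story["cost"]
        print(f"{story['story']}: value/cost = {ratio:.1f}")

In practice the "value" and "cost" inputs come from customer and team estimates and get revisited every iteration, which is exactly where the constant feedback in the definition above comes in.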

Read more of the report for yourself here.
