The Collective Protective Inertia of Failed IT Projects

Summary:

Most projects do not fail because of bad business cases, poorly defined requirements, or inadequate testing. These are all symptoms that turn up at the project level. Project failures, along with failed software process improvement initiatives, reflect a fundamental failure of "tone at the top." Often, a Collective Protective Inertia Syndrome and other organizational pathologies control how decisions, including project governance decisions, are made.

Most companies blame project failures on poor planning and development practices. Ask the veterans of a major failed project what caused it, and you will usually hear any or all of the following:

  • The requirements kept changing
  • There were TOO MANY requirements
  • We didn't have enough time for testing
  • Client personnel did not work for us

About 43 years have passed since large-scale business information systems - and the projects that create them - first became part of our lives. One would think that the collective project experience of those four decades would have made successful IT projects the norm, and failures the sad rarity. One would think.

During a comparable 43 years in aviation, 1903 to 1946, aircraft design and production went from the Wright Flyer to the Lockheed Constellation airliner. But despite trillions of dollars - and euros, and francs, and Deutschmarks, and pounds - invested, thousands of advanced, easy-to-use development tools, customizable packaged software, and billions of hours worked by smart and dedicated professionals, we STILL don't get software right. Nowhere is this more apparent than in the huge number of IT projects that have no right to exist but, like Energizer Bunnies from Hell, keep going, and going, and going.

Thousands of articles and books have examined the problem of failed projects. The Capability Maturity Model for Software®, Version 1, and its successor, the Capability Maturity Model Integration® [1] (CMMI®), have both been attempts to reduce the likelihood, the impact, or both, of failed projects. So have the offshoots of these models, including the Team Software Process, the People Capability Maturity Model (P-CMM), and others. But none of these approaches addresses the root cause of project failures.

All of these analyses and improvement models treat symptoms of failed projects as the causes of failed projects. Among the symptoms mistaken for causes are:

  • Poor requirements definition and management
  • Poor project planning, tracking, and oversight
  • Poor communication between sponsors, users, and the development team
  • Dubious or poorly managed business cases
  • Inadequate testing

Those who focus on symptoms often believe that doomed or worthless projects result from insufficient or disorganized information reaching decision makers. After all, if CIOs or sponsors knew the true state of a project going off the rails, they would immediately apply the brakes. To the Auditor whose findings are ignored, or the maverick Exit Champion punished for not being a Stand-Up Guy, this sounds like the expression popular in pre-1917 Russia: "If only the Tsar knew!"

The Auditor and the Exit Champion understand that the Tsar DID know! To put it another way, they understand that the fish rots from the head.

The idea that bad software practices result from an overall pathological decision-making culture is not new. In their 1987 classic, Peopleware [2], Tom DeMarco and Tim Lister said that the problems of software development are not so much technological as sociological. Around the same time, Imperial College Professor Anthony Finkelstein developed a Software Process Immaturity Model to describe this concept, and Tom Schorsch later expanded on Finkelstein's idea to create the Capability IM-maturity Model [3].

The Capability IM-maturity Model starts with the premise that the typical designation of most software cultures as CMM "Level 1 - Initial" is a polite - and politically correct - fiction. Most "Level 1" organizations exhibit pathological behavior at increasing levels of depravity, from Level 0 down to Level -3. Here is an excerpt from Schorsch's description of CIMM Level -2, the Contemptuous Level:

All new people are expected to know their jobs already or to be trained by on-the-job training (by the person who left two months before they arrived for work). If trained software engineers are hired, they are criticized for having book learning but no real-world software development experience. If new hires have software development experience, they are criticized for having software "development" experience instead of software "maintenance" experience (or vice versa depending on the circumstances). If they have both types of experience, they are told that this system (the one being developed or maintained) is different, the organization is different, or the end user is different and that those software engineering ideas will not work in this environment. Any existing experienced software engineers are either disinvited from any forums that would enable them to air their views or are congratulated on their keen insight and then quietly reassigned to administrative positions far away from software development.

When I show or describe the CIMM to colleagues or clients, most cringe with recognition, then say something like "Yep, that's us." But Finkelstein and Schorsch, hundreds of other best practices experts, and 20 years of Dilbert only address the WHAT of pathological decision-making that results in failed or worthless projects. To cure, or at least treat, the disease, we need to address the WHY.

The "Why" is a group behavior pattern found in most organizations that I call Collective Protective Inertia Syndrome, or CPIS CPIS is the internal group logic that drives an organization to deliberately repeat either its own mistakes or those of its predecessors. Marilyn Monroe's character in "Some Like It Hot" is a good example. She knows that saxophone players will inevitably do her wrong, but she keeps falling for them. Wile E. Coyote knows that every Acme death gadget he buys in order to kill the Road Runner will blow up in his face, but he keeps buying them [4].

CPIS results from cultural defects that demand a change in "tone at the top." In many organizations, especially public companies, a general attitude prevails that avoiding the wrong thing is safer than doing the right thing. The corollary is that the nail that sticks up is the one that gets hammered down.

In companies that allow, even encourage, CPIS, corporate executives try harder to avoid punishment than to achieve successful results. While their words often exhort their subordinates to do the right thing, their actions say the opposite. We see CPIS in two highly visible behavior patterns:

  • Deaf-Effect Response
  • Non-Rational Escalation of Commitment

The Deaf-Effect Response is familiar to anyone who has been confronted by a putative adult with fingers in his or her ears, singing "La-la-la-la, I can't HEAR you!!!" In reality, the common indicator of the Deaf-Effect Response is that those who receive bad news simply refuse to acknowledge that it exists, and shun the person delivering it.

The Deaf-Effect Response and Non-Rational Escalation of Commitment often go together. A management team that follows Non-Rational Escalation of Commitment when deciding whether to keep a project alive invariably keeps it alive. It bases its decisions on beliefs, attitudes, and questions like:

  • If we just push a little harder this month, we can catch up
  • If this ship goes down, our jobs will go down with it
  • It's all [pick a name]'s fault
  • How are we going to justify what we have already spent?

The Money Pit, the Rat Hole, Doubling Down, and other metaphors all refer to the robotic, inertial response to a course of action that decision-makers already know is bound to fail or, if it succeeds, will not be worth the cost.

A significant percentage of Dilbert cartoons reference Non-Rational Escalation of Commitment. The Pointy-Haired Boss is a classic example of a decision maker controlled by CPIS. His reactions when subordinates object to continuing along the same doomed path point to the cultural norms within an organization that encourage his seemingly illogical behavior patterns.

Tragically, the Pointy-Haired Boss is illogical only in light of the welfare of the organization - NOT his own self-interest. He knows from experience that changing course when the evidence demands it could mark him in the eyes of his peers and superiors as:

  • Indecisive
  • Unable to stay the course
  • A flip-flopper
  • Less than a Stand-Up Guy
  • All of the above

In many organizations, these fears are well founded, so much so that W. Edwards Deming made "Drive Out Fear" one of the 14 management actions he considered essential to organizational improvement. Where fear encourages decision-making based on CPIS, senior management needs to work overtime. It needs to create disincentives for decision-making that is dysfunctional for the organization but logical in terms of the career needs of the individual. And it needs to balance those disincentives with even stronger rewards for decision-making that is logical for the organization.

This is the heart of the matter. Projects do not fail because of bad business cases, poorly defined requirements, or inadequate testing. These are all symptoms that turn up at the project level. So are failed software process improvement initiatives; in fact, so are ALL failed improvement initiatives. They reflect a fundamental failure of "tone at the top."

When software projects continually fail, senior management neglect of tone at the top, whether unintentional or deliberate, is the root cause. It is the source of Collective Protective Inertia Syndrome and the other organizational pathologies that control how decisions - including project governance decisions - are made. Wherever this syndrome defines the terrain for corporate decision-making, the CMMI, P-CMM, TSP, and every other model for achieving organizational excellence will remain bright but ultimately meaningless promises.


[1] "Capability Maturity Model Integrated"

[2] Tom DeMarco and Timothy Lister, Peopleware: Productive Projects and Teams, Second Edition (New York, NY: Dorset House, 1999).

[3] Capt. Tom Schorsch, "The Capability IM-maturity Model," CrossTalk, 1996.

[4] As an aside, the coyote finally wised up and sued Acme for product negligence. See Coyote v. Acme Products Corp., United States District Court, Southwestern District, Tempe, Arizona.
