If you want one big number that sums up a key conundrum IT leaders face today, it’s this: The Consortium for Information and Software Quality (CISQ) estimates that the annual cost of poor software quality in the US has grown to at least $2.41 trillion, or 9.4% of total GDP.
The big-picture implication is that, if CIOs were to ‘do IT right,’ we could save trillions of dollars on a macro basis. But here’s the rub: despite the CIO title having existed for 42 years, what CIOs should actually be doing remains the subject of heated debate. Can we — living in the digital age, working in an information economy — say unambiguously, “Company X is managing IT right and Organization Z is managing IT wrong”? Is there a measurable spectrum of “IT rightness” and “IT wrongness”?