I receive a weekly email containing links to tech business news articles from Bizjournals.com. This morning’s email contained a link to a story that I find just shocking.
The article’s thesis: as software becomes more complex, it contains more defects and is more difficult to test:

Despite ever-more sophisticated tools and procedures for spotting software problems before they imperil systems, more bugs than ever are fouling up computers. The Standish Group, a Yarmouth research firm, estimated software deficiencies cost the U.S. economy $59 billion in 2003 and says the total has been rising since.
Systems fail more often today due to the demand for “intelligent” programs that execute complicated tasks instantaneously. But new theories on software development are becoming mainstream, ending what some believe is a vicious cycle of escalating system failures — and perhaps creating a virtuous cycle for vendors who can anticipate bugs before they are ever born, rather than rooting them out after the fact.

And if that thesis alone isn’t enough to rock your world, the reporter dug deep to find new and relevant examples of such problems: the Y2K bug and the 2003 Northeast blackout.
Boy, this article took some first-class reporting.
