I just read What Happened to Software Engineering by Phil Japikse. It's a decent article as far as it goes, but on one point - a central point - it perpetuates an assumption that I believe is dead wrong. To quote:
"Yet a significant number of software projects were failing outright, and many more went significantly over budget and/or missed deadlines. This was due to several factors, but probably the most significant have been both the speed of change in software and hardware and the speed of change in business needs. These changes in the software industry would be similar to that of having brand new vehicles requiring a complete redesign of the roads they drive on about every 18 months."
Both of those assertions are incorrect. Speed of change in software and hardware? Who is forcing anyone to use the latest bleeding-edge hardware or software? Nobody - that's who. And in fact a large percentage of sizeable projects work with hardware and software that is changing slowly or not at all. Old or even ancient servers, databases, browsers, languages and libraries hold substantial or majority market share. In fact, it's frequently the case that problems - other kinds of problems - are caused by customers not adopting changes quickly enough.
Not once in 20+ years in the IT business have I, or any of my colleagues or professional acquaintances, encountered a situation where hardware and supporting software were a moving target during the course of a project...at least not to the degree that it caused a problem for planning and design. I'm not saying it never happens, and for a really long-duration project (5 or more years) I can see it being a factor, but usually this is a non-issue.
More importantly, the business needs themselves do not actually change much at all over the course of a typical project lasting less than five years. We just think they do. What really happens is that we start the average project with missing, incomplete or incorrect requirements. That's it in a nutshell.
If we then - a few months or a few years into the project - discover that a requirement was wrong, or poorly understood, or that some business analyst dropped the ball and never asked the right questions, please don't make out like a business need changed. It didn't - it was always there.
Again, in 20+ years of working in IT, not once - and this is intentionally a strong statement - have I ever finished a project, and been able to point to a significant requirement that would not have been known at the beginning...provided that requirements analysis had been properly done after professional business analysis. A tardy answer that is due to a tardy question is not a changing business need, it's a business need that you failed to discover at the appropriate time.
Most businesses have processes that change slowly, and many don't change at all. As much as we IT professionals might wish otherwise, a typical business IT project consists of re-implementing existing, established processes in newer (but not constantly changing) technology. I've worked on business applications where the core business processes and workflows are decades old - sometimes half a century or more. The business needs often do not change at all.
Let's be honest with ourselves. The reason waterfall so often fails, and the reason we invented agile, is because we suck at business and requirements analysis. Agile methodologies exist because neither we nor the customers are willing to do hard, upfront work, and because all of us are pretty poor at communication.
Blaming rapidly changing hardware and software, or rapidly changing business needs, is a serious cop-out. Next time you hear these arguments, take them apart. Ask some hard questions. Look at your own experiences. Don't get dazzled by the pretty but imperfect physical analogies.