The enemy of software is complexity. The more complex a program, the slower it will run, the more likely it is to crash or have security holes, the longer it takes to write and test, and the harder it is to modify.
I think that most software is much more complex than it really needs to be, and that a few basic causes account for most of the complexity we see in modern software. In no particular order:
- Premature optimization. Programmers making changes to eliminate a perceived inefficiency without checking that the inefficiency is real, that the change actually fixes it, and that it doesn't cause a slowdown in the system as a whole.
- Legacy optimization. Optimizations which used to be important, but are no longer necessary with modern hardware design. This can be a tricky one because it's often difficult to justify removing old, fast, working code and replacing it with something simpler but slower. I think MAME has struck a nice balance here.
- Political considerations. Management decreeing "this project must be written in Java as it is the latest and greatest thing" when Java might not be the best tool for the job.
- DRM. Adding snake oil to software to temporarily appease copyright holders. To be maximally difficult to break, DRM must be extremely invasive, so it tends to cause complexity throughout entire systems. Windows Vista is probably the best example of DRM run horribly amok.
- Hoarding. A company invents some spiffy new way of doing something and doesn't want that method to be available to its competitors, so it patents it or copyrights it to force competitors to solve the problem in a different way. The result is that there are multiple ways of doing any given task, rather than one canonical one. This has an even greater second-order effect: a lot of software needs to be compatible with many other pieces of software, and so has to know about n different ways of doing things.
- Refactoring debt. As code is added, removed and changed, the surrounding code should be modified so that the result is as simple as possible, or the resulting software will be an overly complex conglomeration of disjoint pieces and unnecessary duplication. Unfortunately this important step is often left undone (either because there isn't time, or because management doesn't realize what an important investment it is, or because there isn't sufficient test coverage to be sure that the refactorings don't cause additional problems).
- Excessively clever techniques. Sometimes somebody will happen upon a technique which seems to be particularly powerful, but actually ends up causing more problems than it solves. For example, the preprocessor in C allows you to do all sorts of things which the language does not directly support (like generic programming and domain-specific languages), but only in a way which does not integrate well with tools like source analyzers and debuggers, unless those tools are made much more complex. The sketch after this list shows the pattern.
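To make the preprocessor point concrete, here is a minimal sketch of macro-based generic programming in C. It is my own illustration, not from the original post, and the names (DEFINE_MAX, max_int, max_double) are hypothetical. Note that a debugger attributes each generated function to the single source line of its macro invocation, which is exactly the tool-integration problem described above.

```c
#include <stdio.h>

/* Hypothetical sketch: one macro stamps out a type-specific
   "max of array" function for each element type, a poor man's
   generic programming. A debugger stepping into max_int() sees
   the whole function collapsed onto the DEFINE_MAX(int) line. */
#define DEFINE_MAX(T)                        \
    static T max_##T(const T *xs, int n) {   \
        T best = xs[0];                      \
        for (int i = 1; i < n; i++)          \
            if (xs[i] > best)                \
                best = xs[i];                \
        return best;                         \
    }

DEFINE_MAX(int)    /* expands to max_int()    */
DEFINE_MAX(double) /* expands to max_double() */

int main(void) {
    int    a[] = { 3, 1, 4, 1, 5 };
    double b[] = { 2.7, 1.8, 2.8 };
    printf("%d\n", max_int(a, 5));    /* prints 5   */
    printf("%g\n", max_double(b, 3)); /* prints 2.8 */
    return 0;
}
```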
I sometimes wonder if there is a great opportunity for someone who would dare to blow up the universe and build a whole new computing infrastructure with a serious case of Not Invented Here (though reusing the best bits of today's software), creating something much simpler and more reliable.