Here’s a horror story from the BBC technology department:

The BBC’s infrastructure is shockingly outdated, having changed only fractionally over the past decade. Overpriced Sun Enterprise servers running Solaris and Apache provide the front-end layer. This layer is load balanced with simple round-robin; there’s no management of session state and no load-based connection pooling. The front-end servers proxy to the application layer, which is a handful of Solaris machines running Perl 5.6 – a language that was superseded by Perl 5.8 over five and a half years ago. Part of the reason for this is the bizarre insistence that any native modules, or anything that can call code of any kind, must be removed from the standard libraries and replaced by a Siemens engineer with a neutered version of that library.
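To make that front-end description concrete, here’s a minimal sketch of the pattern being described – an Apache reverse proxy doing plain round-robin across a pool of app servers with no session affinity. The directives are standard Apache `mod_proxy_balancer` ones; the host names are invented for illustration.

```apache
# Hypothetical sketch: Apache front end proxying to back-end app servers.
# Requires mod_proxy, mod_proxy_http, and mod_proxy_balancer to be loaded.
<Proxy "balancer://appcluster">
    BalancerMember "http://app1.internal:8080"
    BalancerMember "http://app2.internal:8080"
    BalancerMember "http://app3.internal:8080"
    # byrequests = round-robin. Note there is no stickysession parameter,
    # so successive requests from one user can land on any back end --
    # i.e., no management of session state at the balancer.
    ProxySet lbmethod=byrequests
</Proxy>

ProxyPass        "/" "balancer://appcluster/"
ProxyPassReverse "/" "balancer://appcluster/"
```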

A huge portion of the development disasters that I encounter and read about is the result of having to shoehorn applications into infrastructure that just doesn’t quite work and that was imposed by people outside the team that’s building the application.

I recently encountered a project where the developers were not allowed to use any open source libraries at all, even JavaScript libraries. They were also required to use a specific implementation of the JavaServer Faces standard. When they got a mandate to make their applications cross-browser compliant, they were looking at a huge development effort, mainly because they were handcuffed by technology choices that were externally imposed. (They wound up getting the policy on which libraries they were allowed to use loosened.)

Some analyst group should do a study estimating the amount of money spent each year coding around bad technology choices imposed by business partnerships and outsourcing. The resulting inefficiencies probably account for a third of IT jobs worldwide.