Waaaay back, in the UK in the early '80s, I used to watch some of the excellent TV programmes from The Open University on computers and data processing. One of these came to mind recently.
Thinking back, I can clearly visualise the punch cards flashing through a reader, the disk packs spinning wildly and a chain printer rattling away. Something like this state-of-the-art "1440 Data Processing System":
(Image lifted from The IBM Archives: http://www-03.ibm.co … ainframe_PP1440.html)
I can hear a narrator talking about how computer technology would allow up-to-the-minute access to data and how data duplication would be a thing of the past.
Fast forward…it's 2009. 25+ years later.
I keep getting letters from the Australian Tax Office addressed to 'TRTNSENTIA PTY LTD'. Every time I reply to a letter so addressed, I ask the Powers That Be to change the name on their records. I have sent a number of letters to the ATO specifically to get my records updated. No dice.
I recently had a small 'prang' with my car. Phoned up the insurance company and made a claim. Somehow, my telephone number was recorded incorrectly during my first interaction with the claims section. Never mind, on my second
interaction I corrected the number. I also corrected it on my third and fourth interactions…
Why is this?
We have had technology allowing us to live that '80s dream for decades…the technology has changed, sure. It has got more powerful and easier to use. It has become more scalable, easier to
develop with and easier to manage. It is orders of magnitude cheaper to implement solutions now. Our development teams have grown up with computers: most of them don't know a world without the devices; programming is
pretty much literally in their DNA. In contrast to the '80s, commercial data processing is no longer heading into unknown waters: our tools and componentry incorporate recognized, tried-and-trusted algorithms and
patterns. The art of programming has become better defined and better controlled and (like most other professions) we are building up a substantial body of evidence and experience that can guide our work.
One thing has remained constant: my old friend, "fear." We still have management with 'marginal' technological literacy. Molehills can quickly become mountains… Inaction can become the only conceivable way 'forward.'
I have come across the following sort of muddy, fear-driven reasoning: if my desktop PC can crash, any computer can crash. A cluster contains many computers…therefore clusters are much more 'dangerous' than a single computer and so their use shouldn't be countenanced. Since we are only able to use a single box, our ability to handle load is constrained. Given this, "live dips" into the database may put too much load on the system and so must be disallowed. Given that, we'll need multiple copies (maybe a unique copy per target application) of the same data. Periodic pushes of the 'master' database will allow us to work everything out…
Sounds OK in theory, but in practice reconciliations fail, oftentimes silently; the master goes down and data is lost; maintenance happens; the update timetable for application A starts to conflict with those of applications B and C and D and…leading to reduced periodicity of updates, etc., etc. The end result is what started this rant!
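And the "more boxes = more danger" premise doesn't even survive back-of-the-envelope arithmetic. Here's a toy sketch (assuming independent node failures and a made-up 99% per-node availability, purely for illustration) showing that a three-node cluster surviving on majority quorum is far *more* available than a single box, not less:

```python
# Toy availability model: assumes each node is independently up with
# probability `a`. Real clusters suffer correlated failures, but the
# direction of the result still holds. All numbers here are illustrative.
from math import comb

def cluster_availability(n: int, k: int, a: float) -> float:
    """Probability that at least k of n independent nodes are up."""
    return sum(comb(n, i) * a**i * (1 - a)**(n - i) for i in range(k, n + 1))

single = 0.99  # one box at 99% uptime: ~3.65 days of downtime a year
quorum3 = cluster_availability(3, 2, single)  # 3 nodes, majority quorum

print(f"single box:          {single:.6f}")
print(f"3-node quorum:       {quorum3:.6f}")
```

With these (hypothetical) numbers the three-node cluster comes out at roughly 99.97% available, cutting expected downtime by an order of magnitude compared with the lone box it was supposedly "more dangerous" than.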
Technologies for clustering application servers, and systems like Oracle RAC (and, to a lesser degree, MySQL Cluster), have been around for a long time now. They are mature and capable of really impressive uptimes. There really is no good reason not to use them (IMHO, "too difficult for us" is not a good reason!). I have seen organisations (facing externally imposed change) adopt Oracle Data Guard because it sounded like what they had already been doing (almost "by hand") over many years and was thus the smallest possible, least fear-laden step they could (reluctantly) take.
Upshot: we have technologies and techniques that predate many of the practitioners of the art, but which still don't get applied.
So there you go: another rant over and done with.