Monday, 15 September 2008

Memory, cheaper by the gigabyte or KB?

I was phoned by one of my customers* today, seeking my advice on a number of topics relating to one of his projects – one of them being the claimed under-sizing of a server by his supplier. Without going into the detailed contractual position, the supplier had sized the server, and for a couple of years it had performed well – until the supplier provided a new release of his software with a prerequisite of a new release of Microsoft system software (which had been acquired and installed by the customer's own IT department). Following the upgrade, performance had deteriorated, and extra memory was now required – the question was, who should pay for it?

My customer was willing to pay for a day's consultancy to help with this issue – until I asked him how much the memory would cost. A short delay, and he phoned me back to say the cost of the memory would be significantly less than the cost of the day's consultancy. Needless to say, I recommended the pragmatic approach: that he agree to purchase the memory (selling myself short again!) – but only after getting his supplier to agree that it would solve the problem, and that if anything else were necessary the supplier would cover the cost. (The "brownie points" earned in being seen to help the supplier out would also help the customer-supplier relationship.)

My customer is like me – he comes from an era when memory was expensive – and with many computer systems that mindset needs to change: software and services are now much more expensive than hardware, and it really is worth spending extra on hardware (and/or extra system software) to reduce the spend on software and services. But do watch the memory needs of individual PCs – the cost of an extra 512 MB may be low, but multiplied across a large PC population, and including the services cost of installation, it may prove to be a project-breaker.
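To put a number on that, here is a minimal back-of-the-envelope sketch in Python – all the figures in it (module price, engineer time, charge-out rate, estate size) are illustrative assumptions of mine, not quotes from any real project:

    # Fleet-wide memory upgrade cost - illustrative figures only
    module_cost = 25.00   # one 512 MB module per PC (assumed price)
    install_hours = 0.5   # engineer time to fit and test one PC (assumed)
    labour_rate = 60.00   # engineer charge-out rate per hour (assumed)
    pc_count = 2000       # size of the PC estate (assumed)

    per_pc = module_cost + install_hours * labour_rate
    total = per_pc * pc_count
    print(f"Per PC: {per_pc:.2f}  Fleet total: {total:,.2f}")
    # Per PC: 55.00  Fleet total: 110,000.00

The "cheap" 25.00 module more than doubles in cost once installation labour is added, and across 2,000 PCs it becomes a six-figure line item – exactly the kind of surprise that can break a project budget.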

I remember that my first automated taxi despatch system, which covered the whole of Mississauga (Ontario) and supported over 300 taxis (all with mobile data displays), was delivered on a 16 KB (yes – 16 kilobytes) Data General Nova computer with a single 2 MB disc – well, it was 1975. (For the first command & control system in New Scotland Yard we used a massive 48 KB Nova with a 10 MB disc.) In those days, hardware costs normally vastly exceeded the cost of software and services, so the pressure was on to use clever software to keep the hardware cost down.

Not now. Now we use inefficient software generators that typically consume massive amounts of memory (and processing capacity) in order to keep software costs down. So why is it that software development times and man-day estimates for major projects seem to have stayed around the same, or even increased?

Don't get me wrong – we should use the latest software technology – but let's not get stingy with the server hardware, and bear in mind that application software developers do not always know what hardware requirements Microsoft or Oracle (or others) will build into their next releases...

* I maintain complete confidentiality over my customers' (and their suppliers') names, and only release them with prior written approval. In this case my customer has allowed me to blog about this item, but without revealing names.
