Monday, 27 April 2009

Inefficiency in Local Authorities

Tony Travers’ presentation at Socitm09 set the likely financial scene for local authorities over the coming years:

* increased demand for services due to the recession
* the need for increased support for NGOs
* the need to assist more in economic development
* at best, 0% grant increases for all councils
* a tighter cap on Council Tax

Speaking to several attendees from the supplier community at the conference, I found a widespread belief that – perhaps – this much-increased pressure on LAs will at long last force many laggard councils into the significant process changes that can not only reduce costs substantially but also, potentially, provide a better service to citizens.

Anecdotal evidence from several suppliers of self-service solutions supported the view that a number of authorities are currently unprepared to adopt self-service and/or abandon traditional high-cost channels of service delivery, because they cannot stomach the implied loss of staff. Procurement decisions are, in many cases, still being left to the very departments and staff likely to be affected by the adoption of self-service – “Turkeys voting for Xmas” – unsurprisingly resulting in no, or slow, adoption of such cost-saving initiatives.

It’s disappointing that these laggard authorities give local government a bad name in the efficiency stakes, even though the top quartile of LAs have already moved to high levels of self-service, taking the pain of staff cuts in their drive to reduce costs. I just hope that this top quartile will not be penalised in the future: laggard authorities (who currently have plenty of slack) will find it comparatively easy to cut costs, whilst top-quartile authorities, having already cut deep, will find further savings much harder to achieve.

Looking forward, for suppliers, it’s clear that my forecast for 2010 onwards remains valid – indeed, I now firmly believe that, for public sector software suppliers, poor revenues will continue into 2012 and possibly beyond....

Operational in-Efficiency Report

Snuck out on the morning of last Wednesday’s Budget statement was the 92-page final report on the Treasury’s Operational Efficiency Programme. In theory, the 20-odd pages on Back Office Operations and IT should have made for interesting reading, as they seemingly identified some £7.2b of savings.

However, those readers who take the time to view the document will soon realise that it merely supports the setting of targets – savings of some £4b on back office operations and £3.2b on IT – with no detail or outline plans for how those savings will be made.

Even worse, the paper identifies that Central Government doesn’t know how much it spends on back office operations and IT, let alone how such expenditure compares with the private sector. Indeed, the findings summary states that “examination of back office operations and IT has focused on the need to improve the collection and integration of management information into departmental processes, and to introduce benchmarking and performance reviews. [The] work on IT has aimed particularly at better governance of IT-enabled projects, and greater standardisation and simplification of IT across the public sector.”

Estimates of expenditure on IT across the wider public sector vary from £12.5b to £18.5b – let’s say it’s £16b – against which we must be able to make some savings – let’s say 20% – giving savings of £3.2b – that sounds OK – let’s go with that. (I believe that the authors could have been far more scientific, but in practice they must have been stymied by the lack of any meaningful financial information on the real costs being incurred.)
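For what it’s worth, that back-of-the-envelope sum can be reconstructed in a few lines of Python. The 20% saving rate is purely my assumption, chosen because it reproduces the report’s £3.2b headline; none of this appears in the OEP document itself:

```python
# Back-of-the-envelope reconstruction of the apparent OEP logic.
# All figures are illustrative; the 20% rate is my assumption, not the report's.
low, high = 12.5, 18.5                    # estimated public sector IT spend, £bn
midpoint = (low + high) / 2               # £15.5bn -- "let's say it's £16b"
assumed_spend = 16.0                      # £bn, rounded up from the midpoint
assumed_saving_rate = 0.20                # the rate that yields the stated target
savings = assumed_spend * assumed_saving_rate

print(f"Midpoint of the estimates: £{midpoint}bn")
print(f"Implied IT savings target: £{savings:.1f}bn")   # £3.2bn
```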

The report hints at the types of savings that could be made through shared services and outsourcing, but makes no commitment to introducing either in any named area. As ever, decisions seem likely to be left with the departments and organisations themselves, with no clear plans other than to start measuring how much it all really costs now.....

Although the OEP report indicated that it had found it difficult to get information on costs within Local Government (“it is hard to conduct a detailed analysis of this expenditure as it lies in a very devolved landscape”), I believe that local authorities have a much better handle on their costs than central government bodies. Last Thursday at Socitm09, Tony Travers suggested that the OEP report effectively increases the 3% Gershon savings target to 4% for Local Government. Hardly challenging in the current environment, and I would not be surprised to see it raised further by a new government.

Friday, 24 April 2009

Gladstone produces sound interim results

Gladstone, the supplier of software solutions and services to the leisure and education markets, has today announced a sound set of interim results. Despite the current economic and financial environment, turnover has not fallen, and underlying operating profit was up 6% to £698k.

However, on the negative side, as predicted in my last post, Gladstone has incurred exceptional costs of £690k – primarily the costs of defending Constellation’s hostile bid – and has continued to capitalise the development costs of its new product (£352k in the six months).

I was fortunate to have a meeting with Gladstone’s Chairman and Chief Executive, Dr Said Ziai, last month and was impressed by his confidence in the business. Beyond the significant exceptional costs incurred in fighting off Constellation, the fight and EGM undoubtedly deflected management time away from driving the core business forward, but Said was confident that the business would succeed as the Constellation issues receded and the new developments came on stream.

I regard Gladstone as one of the “old style” of software houses – cash in the bank (available to fund development in a recession and/or acquire distressed competitors), a clear market leader – and yet with realistic plans for growth over the coming years, recognising the current financial situation and not chasing excessive growth.

A prudent company that will, in my estimation, survive the recession and, once we see the green shoots of recovery, will power ahead.

Thursday, 23 April 2009

Socitm – a conference of two halves....

I’ve just got back from today’s socitm09 National Conference at Stoneleigh, and whilst most of the content was informative, it was also quite depressing....

The morning sessions focussed on the future – Tony Travers gave a broad-brush view of the impact of the Credit Crunch on future funding (more about that in a later blog post), Richard Allan a refreshing presentation on unlocking the power of local information, and Rose Crozier an update on how Socitm is at long last listening to, and focussing more on the needs of, its members. Although the messages in some cases set strong challenges for the future, there is clearly an uphill struggle ahead for Local Authority IT departments and their staff over the coming years.

The afternoon session, however, was spent looking backwards, and did little more than emphasise how far the majority of local authorities have failed to embrace the Internet and electronic self-service sufficiently to reduce costs significantly. Martin Greenwood seems to have drawn the short straw of encouraging LAs to “use the concept of ‘avoidable contact’ to reinvigorate transformation”, and Dhanushka Madawala gave examples of the work Hillingdon had undertaken to reduce avoidable contact – all very basic stuff, but apparently necessary for many authorities....

(I will own up to having skipped the last session on “Digital Inclusion, LA’s and the third sector” in favour of attending IBM’s presentation on Business Process Optimisation).

Yet the exhibitors gave a far more encouraging picture of what (presumably the upper quartile of) Local Authorities are doing. Far from reinforcing basic messages about Internet usage, suppliers were extolling their LA customers’ use of Web 2.0, blogs, Twitter and the like to really communicate with their citizens. Very encouraging.

Can so many other LAs really be burying their heads in the sand, failing to embrace modern technology and, amongst other things, to adopt self-service to reduce their own costs?

Ordnance Survey’s new strategy....

Richard Allan’s socitm09 presentation (he’s the Chairman of the Power of Information Task Force) again brought up the serious data licensing problems that have been introduced by Ordnance Survey.

It would appear that OS’s highly restrictive licensing of data is a major obstacle to the public sector’s use of graphical presentations of data – something that is absolutely key to good public access. I’ve encountered these sorts of problems when trying to negotiate access to the NLPG, and the issue relates not to licensing per se but to payment – i.e. the licences are being used as an attempt to extract further funds, sometimes very significant sums, from OS customers.

This has surely been disjointed government at its best – one agency, OS, trying to cross-charge other government organisations (e.g. LAs) – charges that those organisations can’t afford, so the data doesn’t get displayed, and the citizen loses out. No wonder Google and Microsoft Maps are growing in use for displaying geographic data...

However, and perhaps in response to the criticism and competition, I note that the OS has today announced a New Business Strategy, which promises to focus on five key areas:

* Promoting innovation – with an enhanced free OS OpenSpace service to allow experimentation with digital information, and a clear path from this service to greater commercialisation
* Reforming Ordnance Survey’s licensing framework – so that it is much simpler to use Ordnance Survey data and services in other applications
* Reducing costs over time – to ensure that Ordnance Survey continues to offer value for money
* Supporting the sharing of information across the public sector – to enable better public policy and services
* Creating an innovative trading entity – to explore commercial opportunities around providing a better platform for consumers to access Ordnance Survey products

I hope that this will result in a complete about-turn by OS on the fees for the use of its data. Personally, I’m pessimistic – my experience is that strategies in the public sector can take years to implement (and some implementations never see the light of day). The strategy is currently light on detail, but results are promised within the next year – we’ll have to wait to see what transpires over the coming months.....

Wednesday, 22 April 2009

Contracting for developments using Agile methods

One of the questions I received following my article on Agile vs. waterfall methods was how customers could contract with suppliers for developments using Agile methods.

As I’ve stated in previous posts on contractual matters, the key to success is to ‘define the deliverable’.

One approach is to go for a heavily documented Statement of Requirements and/or Specification (i.e. a fairly well defined software deliverable), and then look for a fixed price development. But this negates much of the benefit of agile methods and, to my mind, the level of risk for the supplier could be too great (although new entrants to a market, who see the developed software as having some extra value – always assuming they retain the IPR – may see this as an investment in entering that market). Beware, too, that if a fixed price approach allows costed change requests for every change to requirements (inevitable if true agile methods are adopted), then the original fixed price will be purely illusory.
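A crude worked example of that last point, with every number invented purely for illustration:

```python
# Invented numbers: how costed change requests erode a "fixed" price.
fixed_price = 500_000            # £, the headline contract price
change_requests = 25             # inevitable under truly agile working
avg_change_cost = 8_000          # £ per costed change request

effective_price = fixed_price + change_requests * avg_change_cost
uplift = effective_price / fixed_price - 1
print(f"Effective price: £{effective_price:,} ({uplift:.0%} over the 'fixed' price)")
```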

On the basis that a complete SOR doesn’t exist, the development team will involve both customer staff (helping to define and refine requirements) and supplier staff, and there will be no clear definition of the deliverable (although I regard an outline scoping document as the minimum for starting any such agile development). In such an environment, the easiest form of contract is straight Time & Materials (T&M), where the deliverable from the contractor is a number of “warm” bodies (hopefully with particular application and/or technical skills and experience).

But such a T&M approach puts virtually all the risk back with the customer, and less trustworthy suppliers may exploit the agile methods to encourage requirements creep and rework, ensuring that their staff’s time on the contract extends well beyond original expectations.....

My discussions on these types of projects have shown the need to develop a trust relationship between customer and supplier – if a watertight contract is proposed so that companies who mistrust each other can work together, it will almost always fail.

As I noted in my earlier post, agile projects are much easier to run with in-house teams, where the staff skills and experience are (hopefully) known, and the team know that their performance will be measured by their success in getting a quality solution developed to schedule and on budget. If an external supplier is to be brought in, with management of the agile development retained by the customer, then the best contractual approach is a T&M one.

If the development project is to be managed externally by the supplier, then clearly, during the procurement cycle and prior to contract award, the customer needs to satisfy himself of the supplier’s track record in similar developments. No one-size-fits-all form of contract exists – the right choice depends on too many variables, ranging from the scope of the project to team size to outline budgets and schedules – but some of the recommendations I make include:

* ensure there is a broad goal for the project
* outline constraints such as budget and timetables up front, so the whole team is aware of them
* define standards for the project up front (UI, coding, data, documentation, testing, etc.)
* as the development progresses, set detailed goals every few weeks
* define in advance the level of customer involvement in the project (and ensure it is met)
* agree the priorities in order – budget vs. dates vs. functionality
* consider splitting the contract into multiple smaller contracts – some fixed price and some T&M – possibly using the “phased fixed price” approach I’ve discussed before (see the sketch after this list)
* consider a performance bonus for meeting defined targets (I know that the current trend is for penalties for missed targets, but I prefer the far more positive approach of bonuses – they’re much more motivational and, in my experience, deliver more successfully than the threat of penalties).
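To make the idea of multiple smaller contracts concrete, here is a minimal sketch in Python of how a mixed phased-fixed-price/T&M contract with performance bonuses might be priced. The phase names, day rates and all other figures are hypothetical, invented purely for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Phase:
    name: str
    pricing: str                 # "fixed" or "t_and_m"
    fixed_price: float = 0.0     # £, used when pricing == "fixed"
    est_days: int = 0            # estimated effort for T&M phases
    day_rate: float = 0.0        # £/day for T&M phases
    bonus: float = 0.0           # £ payable if the phase meets its agreed target

    def cost(self, actual_days: Optional[int] = None, target_met: bool = False) -> float:
        """Fixed phases ignore actual effort; T&M phases are billed on it."""
        if self.pricing == "fixed":
            base = self.fixed_price
        else:
            days = actual_days if actual_days is not None else self.est_days
            base = days * self.day_rate
        return base + (self.bonus if target_met else 0.0)

# Hypothetical contract: fixed-price scoping, T&M iterations, fixed-price acceptance.
phases = [
    Phase("Scoping & standards", "fixed", fixed_price=25_000, bonus=2_500),
    Phase("Iterative development", "t_and_m", est_days=120, day_rate=600),
    Phase("Acceptance & handover", "fixed", fixed_price=15_000, bonus=1_500),
]

# Suppose development overran by 20 days and the two fixed phases hit their targets.
total = (phases[0].cost(target_met=True)
         + phases[1].cost(actual_days=140)
         + phases[2].cost(target_met=True))
print(f"Total contract value: £{total:,.0f}")   # Total contract value: £128,000
```

Note how the customer’s exposure to the overrun is confined to the T&M phase, whilst the supplier carries the delivery risk on the fixed phases – making that allocation of risk explicit is exactly what the contract negotiation should achieve.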

As noted at the beginning of this post, the key is to be clear on what the deliverables are, and then to agree a pricing formula based on the level of risk to be accepted by each party.

P.S. As well as providing a “fire fighting” service for problem projects, I also provide consultancy to help new or existing projects proceed successfully and avoid becoming problem projects. If you would like to discuss ways of avoiding problem projects in the future, please contact me at Phil@systemsolveconsultancy.co.uk

Tuesday, 21 April 2009

Disney systems

Today’s advertisement of a tender for a new Command and Control system for Lothian and Borders Police reminds me of a visit I made many years ago to a Scottish police force to view one of its brand-new computer systems.

In discussion, one of the end users described it, in a broad Scottish accent, as a “Disney” system. Confused, we visitors from south of the border had to ask what a “Disney” system was.

“Very simple,” said the officer – “this function disney work, and that function disney work......”

Monday, 20 April 2009

Agile methods for package enhancements?

Firstly, it’s worth welcoming back a number of my readers after what I hope was a good Easter week off. Readers and page hits were down by around a third last week – but despite that, I received a good number of comments on my article Agile vs. waterfall methods.

Unsurprisingly to me, the majority were from readers who had previously believed that “waterfall” methods were the only way a software house could undertake developments safely, and their comments reflected some strong cynicism about the motives of developers and development teams who propose “Agile” methods.

Further discussions with a couple of those commenting on the post introduced an interesting problem – if a software product has been developed using traditional (waterfall) methods, is it possible to introduce Agile methods for new upgrades?

Unfortunately, as in my original article, I had to revert to using the phrase “horses for courses” in my discussions, as again there is no simple answer – it really depends on what you want to develop, who’s developing it, and the expectations of the management team sponsoring the development.

Surprisingly enough, a brand new module may be just the right project to trial agile methods on – standards for the product will already have been established and documented, and typically the functionality will be relatively small compared to the existing product. Agile methods should allow the product to be seen quickly, and possibly even installed and reviewed early at “warm” customer sites.

However, as in last week’s article, the key to the decision must be the composition and skills of the development team. Not only must they have the necessary technical skills and experience, the team must include someone who understands the requirements in depth – without this latter person, the risk for such an agile development increases dramatically.

It is easy for an agile development to become never-ending, with more functionality being identified and added as the weeks progress. Once the original objectives have been met, it is essential to decide on a first deliverable, stop development, and get the new module out to customers.

Once the new module is completed, as with waterfall development, it is also essential to ensure that it is adequately tested by staff external to the development, and supported by comprehensive documentation to allow for easy maintenance by separate support personnel.

However, as noted last week, perhaps the biggest problem is with senior management – are they prepared to take the risk of an agile development?

Amongst my contacts, it is almost invariably the smaller companies that have adopted agile methods – I know of very few larger application software package developers that have done so successfully. But perhaps that reflects my horses-for-courses view – smaller organisations may have a level of specific staff experience and knowledge that is missing in larger organisations, their projects tend to be smaller, and their management sit closer to the development teams. However, as the song goes, I detect that the times they are a-changing......

Wednesday, 15 April 2009

Agile vs. waterfall methods

I’m regularly asked by my customers to advise on the benefits of agile vs. waterfall methods of software development, and which method they should adopt (or continue to use). In my response I regularly use the phrase “horses for courses” in my discussions, as there is no simple answer – it really depends on what you want to develop, who’s developing it and the expectations of (or contractual relationship with) the person/organisation funding the development.

From a technician’s point of view, the agile approach lets him get down to the bits and bytes quickly, using the technology to help him work with end users to collect requirements and build solutions to satisfy those requirements. If you have skilled technicians, either with a high level of understanding of the requirements (unlikely) or with a very good working relationship with end users who not only know the requirements, but also have the ability to spend significant time with the developers, then you may have a project for the agile approach. If those end users are also the project sponsors and are funding the development, then the case for agile development gets stronger.

If, however, the development is to be built in a client-contractor environment, normally against a fixed price or firmly set estimates and timetables, then the level of risk in using agile methods increases. Add in larger contract sizes, leading to larger teams and more remote communication with end users, and a waterfall approach can quickly become the preferred route.

For application package developers the problems become even greater – they have no one customer, but look for a solution that can handle multiple sets of requirements. Then they have to not only maintain, but also enhance the package over many future years, in many cases without the designers and technicians who wrote the package initially. Also, and perhaps most importantly, senior management are unlikely to sign blank cheques for the development of new packages without some degree of certainty that the developed product will meet their target customers’ requirements, be marketable and delivered within an agreed budget against a realistic time schedule.

From my own experience, I’ve always believed that the coding stage is, surprisingly enough, one of the least important stages – as with any stage, if you get it wrong the project will suffer, but it will suffer less than if you get the requirements collection wrong, or adopt the wrong design. Coding is used to implement a design – no matter how good a coding team is, if it has a poor or incomplete design, it is highly likely to produce a poor system.

Ever since my early days of developing software, the 50-30-20 split of work (50% of the effort spent on requirements collection and system design, 30% on coding and unit testing, and 20% on system/acceptance testing) has remained curiously static for new software developments. (Before I get a lot of comments – I note that the development of new versions of, or replacements for, existing products means that a large proportion of the requirements capture has already been completed – however, the success of any new development still depends on the overall design being absolutely right.)
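As a back-of-the-envelope illustration of that split (the 100 person-day total is an assumed figure, not an estimate from any real project):

```python
# Apply the 50-30-20 effort split to a hypothetical 100 person-day project.
total_effort_days = 100          # assumed figure, purely illustrative

split = {
    "requirements collection & system design": 0.50,
    "coding & unit testing": 0.30,
    "system/acceptance testing": 0.20,
}
assert abs(sum(split.values()) - 1.0) < 1e-9   # sanity check: shares sum to 100%

for phase, share in split.items():
    print(f"{phase:<42} {share * total_effort_days:5.1f} person-days")
```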

Evangelists of agile methods will point to large projects developed using waterfall methods, where the requirements collection and analysis phase goes on forever in an attempt to capture every last requirement before development starts. Any such project then spends so long in development that, by the time it is complete, the requirements have changed and the solution no longer meets the users’ needs.

Exponents of waterfall methods will point to agile developments where the rework needed to build in missed functionality has cost many times the effort that would have been required had the requirement been identified before coding started; to projects where the quality assurance team couldn’t start generating their test plans until just before the development was complete; or to systems where the user and technical interfaces were inconsistent across code built by separate individuals.

The advice I give my customers is based on the size of the project, the size and skills of the team, the expectations of the funding organisation/person and, most importantly, the criteria for success. If I advise a waterfall approach, I encourage the use of modern software development tools to produce (normally disposable) prototypes during the analysis and design stages (one of the waterfall projects that I reviewed was using Microsoft Expression Blend for prototyping – it just blew my mind, and the prototypes could be used in the later stages of development). I also encourage high levels of automation for the documentation and testing processes.

If I advise an agile approach, I stress the importance of an overall scoping document, with clear standards for look-and-feel, navigation and interactions between modules. I also look for strong management and ownership, not only of the project but also of all common aspects of the development, be it the database or common classes. Also, no matter how good the developers and the methods adopted, I still recommend a separate testing team.

In all cases I emphasise the involvement of the main project sponsor to decide on scope, give clear direction to the team, and help to avoid requirements creep. All too often, early expectations are set far too high, and once reality sets in, when forecasts seem unattainable and budgets insufficient, there needs to be open communication between the team and the sponsor to arrive at an acceptable solution.

All too often problems get hidden – “we’re behind but we’ll recover” - and it is only later in the project that confidence crumbles and reality sets in. Unfortunately with agile developments it is difficult to regain management confidence in revised project plans due to a lack of measurable performance to date that can be extrapolated towards a final completion date for each phase. It is at this time that a good and involved project sponsor can be the saviour of a project – all too often I see sponsors returning to a ‘hands off’ position and failing to get involved – making a problem project into an even bigger problem......

P.S. As well as providing a “fire fighting” service for problem projects, I also provide consultancy to help new or existing projects proceed successfully and avoid becoming problem projects. If you would like to discuss ways of avoiding problem projects in the future, please contact me at Phil@systemsolveconsultancy.co.uk

Tuesday, 7 April 2009

Proactis back into profit

The spend control software house Proactis has today announced its interim results for the first half-year after its restructure last year.

The company appears to have brought its cost base back into line with its revenue, reducing costs by 24%, and as a result has returned a half-year profit of £362k (vs. a loss of £438k for the same six months last year) on revenue up 6.7% year-on-year. Reassuringly, the business is once again cash generative, despite capitalising £207k of R&D expenditure in the period.

As I’ve noted previously, whilst Proactis does sell directly (e.g. to the UK Public Sector), they have wisely focussed on building their network of accredited resellers, and have diversified their offerings beyond pure e-procurement and spend control. It would appear that Proactis has strengthened its relationship with Agresso, and I suspect Agresso will use the better functionality of the Proactis products to help it compete against the likes of Civica and others whose procurement offerings are stronger than its own.

As I noted last year, with its move into other markets and its international coverage, Proactis will, I believe, survive and, once we move into a more positive financial environment, should thrive as companies look to replace outdated back office systems. However, I believe there is a strong chance that Proactis will be acquired by a bigger player (see my post from last year for the names of some potential candidates).

Proactis’ share price is up 2p at 19.5p this morning, giving a market cap of £6M – in my view, in the current financial environment, a fair valuation for a company turning over c. £7M per annum with a half-year EPS of 1.3p.
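For anyone who wants to check the arithmetic, here is the rough valuation sum in Python. The share count is implied rather than reported, and doubling the half-year EPS is a naive annualisation of my own, not company guidance:

```python
# Rough valuation sanity check from the figures quoted above.
share_price_p = 19.5          # pence
market_cap = 6_000_000        # £
half_year_eps_p = 1.3         # pence

implied_shares = market_cap / (share_price_p / 100)   # ~30.8M shares in issue
annualised_eps_p = 2 * half_year_eps_p                # 2.6p -- naive annualisation
pe_ratio = share_price_p / annualised_eps_p           # ~7.5

print(f"Implied shares in issue: {implied_shares / 1e6:.1f}M")
print(f"Implied P/E (naive annualisation): {pe_ratio:.1f}")
```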

Monday, 6 April 2009

Don’t forget the 2012 Olympics

Based on the comments I've received, it would appear that most of you agree with my prediction that 2009-10 will be a tough, but manageable year for those public sector software suppliers who have order books and good recurring revenues to live off, whilst 2010-11 looks like it could be one of the worst years for public sector software suppliers.

However, my prediction of a recovery in 2012 has not received such a high level of agreement – particularly for suppliers to UK Police Forces.


Here, suppliers are concerned about the impact of the 2012 Olympics on Police Forces’ normal procurements. Yes, there will inevitably be some additional expenditure on software and services specific to the Olympics themselves, but the expectation appears to be that Forces will put off the procurement of other systems and services until after 2012 – both to conserve funds for the inevitable extra workload and costs of the Olympics (not just confined to the Met) and to wait and see what impact the Olympics will have on security issues beyond 2012.

I raised this with a supplier of software and services to larger local authorities, to see if he felt the Olympics could also affect procurements amongst London Boroughs and those authorities outside London that are hosting Olympic events. He felt not – and agreed with my prediction of a recovery in 2012 – but added the caveat that in the current environment, there could be no certainty on any date for a recovery, and that 2012 could be yet another reason for government generally to delay procurement decisions....

e-mail distribution

If you’re one of my many subscribers who receive my posts by e-mail, please accept my apologies for the late delivery of your posts. This seems to be a problem related to the change to BST – I’ve tweaked the time settings and hopefully you will now receive your e-mail updates on the same day new posts are made.....

Thursday, 2 April 2009

IT Forecasts – Holway’s Rant

If you are a Software or IT Services supplier, I strongly recommend that you read Richard Holway’s article today, where (unlike many other optimistic forecasters) he forecasts negative growth for the UK SITS sector in 2009 with no return to positive growth until 2010.

From the conversations that I’ve had with a number of software suppliers, I have to agree with his views for the general software and IT services (SITS) market, although I sense that the downwards curve for SITS companies working in the Public Sector market will be some 12-18 months behind the decline being suffered by suppliers to the private sector.

As I see it, many public sector software suppliers are currently living off orders won over previous years, with new business being hard to find. They are managing their cost bases to reflect their declining order backlog and, in many cases, do not appear to be unduly worried about the future – with the new financial year about to start, they believe that 2009-10 will produce as much business as 2008-09.

As I’ve said before, this may be true in some specific areas (e.g. Social Services, Housing and, probably, Education), but in other areas my discussions with potential customers have shown an increase in the “if it ain’t broke, don’t fix it” attitude. Also, with increased requirements for information and transactions to be made available electronically, I detect a much stronger move to bring such services work back in-house, rather than contracting it out.

What could be worse for public sector software suppliers is that 2010 will (almost certainly) be a General Election year, with June 2010 pencilled in for the election itself. If history is any guide, procurement processes and decisions, typically kicked off early in the new financial year, will be delayed until at least summer 2010 and, I believe, in many cases potentially through to the 2011-12 financial year – with organisations waiting to see what financial constraints the new government will impose.

So - 2009-10 will be a tough, but manageable year for those suppliers who have order books and good recurring revenues to live off, whilst 2010-11 looks like it could be one of the worst years for public sector software suppliers, with any recovery delayed until 2012 at the earliest......

P.S. For me, the companies I’ve been associated with have always suffered from a recession in a year ending in one – starting with 1971 which saw me made redundant (by Fraser Williams) before I even started with them. We survived 1981 in IAL Gemini on the back of our niche applications, and fortunately Systemsolve was acquired by Radius just before the 1991 recession hit. 2001 was our last bad year, and it looks as if 2011 could continue the trend .....

Wednesday, 1 April 2009

Capita's acquisition of IBS is ruled anti-competitive...

The Competition Commission (CC) has provisionally concluded that the completed acquisition by Capita of IBS could damage competition in the market for the supply of revenues and benefits (R&B) software to local authorities in the UK. This will have come as no surprise to regular readers of this blog, nor will the CC’s decision that there are no similar concerns in the Social Housing software market.

Christopher Clarke, Inquiry Group Chairman, commented:

“This merger combines two closely competing suppliers of revenues and benefits software to local authorities, leaving only one other supplier actively competing for business. In a stable market with little prospect of entry by new suppliers, our provisional conclusion is that the enlarged Capita revenue and benefits business will be able to take advantage of the lack of competition, for example by increasing prices or reducing levels of service to its customers.

We consider it likely that the adverse effects of the merger will have an impact on all customers, whether they are in the process of tendering for new revenues and benefits software or already have a contract for such software in place.”

Given the remit of the CC, this decision was expected, but I still regard it as disappointing, given the nature of the small and declining market for R&B systems. By not giving the CC a wider remit, Government has, I believe, missed a great opportunity to put in place protection for the interests of existing & future IBS users, and possibly even some safeguards for other Capita R&B customers.

But now the decision is made, and the discussion of remedies commences. It would appear that the CC is unlikely to agree to “behavioural remedies” such as price controls or Capita maintaining and developing two R&B systems in parallel. Rather, it is looking at the feasibility of splitting off just IBS’s R&B business (from the social housing part) and its viability as a stand-alone business unit, or whether divestment of the full IBS business is required.

The CC is also looking for the views of potential purchasers of the IBS business (full or R&B only) and constructive suggestions for other remedies, behavioural or structural – although I doubt that there will be any serious suggestions in this latter area.

All parties have been requested to provide their views in writing, including any practical alternative remedies they wish the CC to consider, by 20 April 2009. The CC states that its findings may alter in response to comments received on its provisional findings, in which case it may consider other possible remedies, if appropriate – but I’ll be surprised if they stop short of divestment.

Given the current position, it seems as if divestment is the preferred route, but of what, to whom, and for how much?


As I understand it, the CC will have considerable control over any divestiture, deciding on the form of the business to be divested, setting a timetable (typically 6 months), vetting/approving potential purchasers, and generally overseeing the divestment through to completion. I can think of at least two serious potential bidders who will no doubt be knocking on the CC’s door over the next few weeks......

NHS to award new NPfIT contract to HHI

One of my moles told me yesterday that the NHS will announce on Wednesday that it has decided to recommend that HHI be appointed as an alternative software supplier to iSoft and Cerner for NPfIT projects.

This is a tribute to the Rapid Application Development (RAD) technology used by HHI, a small UK software company employing only 8 staff, which has been able to complete the development of a full care records system using Microsoft Visual Studio tools in just 6 months, and to get the system live at the pilot site at Midshire PCT in only 8 weeks.

Andrew Gile, HHI’s Managing Director, praised his three technicians who, prior to the start of the development, had little or no experience of health applications. He explained that the team had been able to work with doctors and nurses from Midshire PCT to understand their requirements and build them into working prototypes, which had then been used to explain to NHS managers what the system was supposed to be used for.

Ursula Sit, HHI’s Implementation Manager, believes that the short 8-week implementation time was due to the end user interface being modelled on Facebook. She noted that “most end users were already heavy users of Facebook outside (and some inside) work, and were fully aware of the rich functionality available. The intuitive interface meant that the majority of end users required only a two hour introduction to the system functionality before they felt happy enough to use the system in practice”.

The selection of HHI should be confirmed at a meeting to be held on Wednesday morning with a formal announcement due around midday. Andrew notes that he hopes the announcement will stop the circulation of a great deal of the hot air that has been around the NPfIT project since its inception, and will allow his company, Hoof Hearted Inc, to get on with the planned 3-month roll out of the HHI system to the remaining 2,000 Trusts awaiting a new system......

I should have confirmation of HHI’s selection around lunchtime on Wednesday – check back in the afternoon.


Midday, Wednesday 1 April update: Yes, you guessed it, this was an April Fool post. But what about "agile" development processes - are they better than "waterfall" processes? Revisit this blog over the coming days for my views.....