Wednesday, 16 December 2009

2010: Another year of “if it ain’t broke….

…. don’t fix it”.

I’ve had the opportunity to talk to a number of software and service suppliers to the UK local authority market over the past few weeks and, with the exception of out-sourcers and suppliers to the social services, education and housing sectors, they are broadly pessimistic about the opportunities for new business in 2010.

As I predicted earlier this year, 2009 has not been too bad a year, with the April re-organisation throwing up some good new contracts and opportunities, whilst some existing customers have had the budget to buy additional functionality and services from existing suppliers. But now supplier order books seem to be depleted, and without significant new business in the offing, many suppliers continue to review their costs and staffing structures to batten down the hatches for a tough 2010.

As 2010 is an election year (both general and local), it will spread uncertainty across many purchasing areas and increase the scope for indecision. Even in normal times suppliers would expect an election year to be poor; in today’s troubled financial times, it promises to be even worse.

Not surprisingly, several companies are looking at acquisitions as a way of growing their customer base, and with poor sales forecasts for many smaller suppliers potentially driving down their valuations, I think 2010 will see further consolidation in the application software market. Indeed, a few suppliers are not expecting their competitors to survive the next couple of years, and I would agree that there are a number of smaller suppliers that will not survive what I believe will be an orders drought through to at least 2012.

As I’ve said before, existing suppliers will try to maximise revenue from existing customers, both through increased service offerings and new modules & functionality for existing systems. Larger suppliers will attempt to cross-sell between departments within existing customer sites (in 2010, I think, with limited success, although customers may be tempted by the lower cost of procurement).

However, the corollary to all this, particularly for the smaller or bolt-on applications, is that we may see the arrival of new, smaller players with offerings that are significantly cheaper than, and potentially technically superior to, those of existing suppliers who have not invested enough in their products.

So 2010 – an even tougher year for the software suppliers.

Tuesday, 15 December 2009

Agile vs waterfall – the debate continues

It’s been good to see the number of new readers that have found this blog by searching for this topic – and I welcome the many comments that I’ve received on my previous posts.

Over the past year I’ve met with a number of advocates of agile methods, and had the opportunity to review a number of new software products being developed using agile methods. Whilst my view remains that it’s “horses for courses” when deciding between agile and waterfall methods for development (see my original post - Agile vs. waterfall methods), I’ve yet to find anyone who is using agile development methods effectively for new application package products.

Yes – I’ve seen some very successful agile developments of bespoke systems for single customers, but agile methods seem unable to cope with the development of package products that need to be designed to meet multiple and differing customer needs. The key appears to be the need to understand the many different customer requirements in advance, so as to be able to decide upfront on the core parameters for the product.

As noted in my post Agile methods for package enhancements?, when there is an existing product, where the majority of core parameters have been defined already, agile methods can be used effectively to develop new modules – always provided that you have the right types of team members and a true agile methodology (rather than a “let’s give the techies control of this development” approach).

However, do any of my readers know of new application software products that have been built successfully using agile methods? If so, please let me know.....

OT - America’s Cup 2010

For those of you that don’t keep up-to-date on sailing, the next meeting of the giants of match-race sailing, the America’s Cup, has been totally changed by the courts following legal battles between Larry Ellison (CEO of Oracle) and the Swiss holders, Alinghi. Rather than a competition open to all entrants fighting for the right to challenge the holders, next year’s competition will be solely between the holders, Alinghi, and BMW Oracle Racing – racing in mighty multi-hulls rather than the usual monohulls.

We may not yet know for sure where the February competition will be held (my betting is on Valencia), but next year’s America’s Cup racing looks as if it will be more a technology race than a tactical one. Just look at the competitors....

Visit the BMW Oracle Racing and Alinghi web sites for more information, photos and videos.

I can’t wait to see who wins this competition next year – I just hope that the two boats are relatively evenly matched, otherwise the competition could be as boring as some Formula 1 GPs, where one car can be so dominant as to result in predictable, processional races. I don’t see multi-hulls as the ideal vehicles for close match racing; however, if we get some good breezes, just watching these monsters at speed could prove exciting on its own....

I hope that we will see a return to common sense rather than the courts after 2010, with the competition returning to open races in boats built to an agreed specification (ideally 12 metres, but if not, let’s have a tight standard that puts the emphasis back onto the skills of the teams).

Friday, 11 December 2009

After the downturn

Let me draw your attention to a joint paper by CIPFA and SOLACE on the Pre-Budget Report – a paper that tries to start the discussion on how local authorities will have to start planning for public spending cuts. A bit like the PBR, it identifies neither specific areas for change nor the levels of cuts, but discusses the strategies that LAs will have to adopt over the coming years.

The paper focuses on two scenarios, one envisaging a 7.5% cut in real terms over 2011-14, the other 15%. Whilst the 7.5% cut is possible, I believe that the 15% scenario is much more likely – and the final figure may even exceed 15%, particularly as I believe the ability of LAs to increase Council Tax will be substantially reduced over the same period.

I won’t repeat the contents of the paper here, other than to say that I strongly agree with the need to re-think the delivery of services, and the paper’s three options of:

  1. redefining the relationship between the state and the individual
  2. a significant de-layering of the public sector
  3. a major initiative to maximise economies by much more effective collaboration between public bodies

After the ‘ring fencing’ of some key services, I believe that many LAs will have no choice but to terminate, or all but remove, some other services (the library service, for example, could be under threat in some authorities). However, de-layering of the public sector, combined with more effective collaboration between public bodies, offers to my mind the greatest potential for savings.

Perhaps severe cutbacks in funding will force organisations into sharing services, and the government into more ‘vertical integration’ of services (e.g. between national, regional and local bodies). This will inevitably lead to more out-sourcing, but if the public sector were to think more out of the box, perhaps we would see more innovative use of out-sourcing to commercial operations where true synergy is possible – e.g. local supermarket chains, or even banks (or can some of them already be regarded as part of the public sector?).

Congratulations to CIPFA and SOLACE on their paper. Hopefully it will encourage the public sector to discuss the major shifts in service delivery that the current crisis in public sector finance demands.

Tuesday, 3 November 2009

Tories to reduce Government’s commitment to large IT vendors?

The Tory Shadow Minister for Science and Innovation, Adam Afriyie, gave a very interesting speech last Thursday, in which he outlined some of the Tories’ plans for major IT projects if/when they get into power.

Regular readers will know of my firm belief in the vital importance of true inter-operability, and it was a pleasure to hear Adam’s views on this....

"By using standard data formats, like XML, government can open up the procurement process to the widest possible base of suppliers. With inter-operability, large projects can be split into manageable, modular chunks. The outcome is a more flexible procurement process where it is easier to change suppliers and resolve problems as they emerge."

Then, as if he had been reading my post on How NHS NPfIT should have been procured, he announced that...

"One option we are considering is the use of multiple proof-of-concept pilot projects. If several suppliers are asked to come up with working solutions, they can then be piloted, and the most successful can be scaled up and rolled out nationally. The use of multiple early-stage pilot projects could reduce reliance on a handful of big vendors and increase the proportion of IT budgets spent with innovative young companies."

I only hope that the Civil Service allows this to happen – I remember that one of the objectives of the LA Pathfinder projects in 2001-02 was to involve smaller companies that were more innovative and faster to react than larger IT companies – yet 24 of the 25 Pathfinder projects went to the major service suppliers, some of whom had no track record of LA application software development at all.

But perhaps, given the spate of government IT disasters over the past few years, these sorts of initiatives will have a chance over the next couple of years.

Adam also gave what was, in my view, a very good summary of the current government’s e-initiatives:

".... some worthy objectives, such as joined-up government and personalised public services. But their approach has been deeply flawed. While the pace of technological change was breath-taking, the response from government was not.

Internet access empowers people. It improves productivity and opens the door to self-improvement. But while the internet was empowering individuals to take control over their lives Labour was attempting to maintain the old bureaucratic machinery.

Ministers were mesmerised by the transformative potential of technology but failed to integrate it seamlessly into everyday use."

Perhaps the next few years will see significant changes in the way government procures and develops new IT systems – let’s hope so.....

Capita divests IBS R&B unit to Civica

Over the summer, Civica acquired IBS's revenues & benefits unit from Capita, following the Competition Commission's decision forcing Capita to divest – see Capita to divest IBS Revenues & Benefits unit.

I was quite surprised that Capita divested the unit to Civica as, with Civica's existing base of R&B back office and Comino workflow/DIP customers, it makes Civica a serious competitor to Capita's own R&B business. But as I understand it, Civica were the only credible bidder with the cash to complete the deal.

In practice, I believe that Civica has done well and bought the unit at an apparently bargain price. I'm sure that the indecision brought about by the CC investigation has harmed both the IBS business and its staff, but Civica now has a complete, competitive R&B offering – and a good upgrade path for its existing R&B back office customers running Civica's very old, Pick-based back office software.

However, I still believe that Civica will have an uphill struggle to sell its now Progress-based solution to new customers – particularly the larger users such as the new unitaries. But perhaps they will win a sympathy vote from those customers unhappy with Northgate's decision to switch off support for the old Anite Pericles product (and unwilling to move into Capita's extensive grasp).

Over the past year I've spent some time with external companies looking at the UK local authority market and considering entering the R&B market by developing new back office products from scratch. All seem put off not just by the development cost, but also by LA prospects' desire to see three live reference sites, the resulting lengthy time to market, and the possibility of future central government changes in the way revenues are collected and benefits handed out.

Seeing no potential new entrants, we must now live with the three R&B suppliers – each of them with competent solutions, but none of them with clearly the best solution, and each of them with at least one major drawback.......

Sunday, 1 November 2009

Back to blogging

As regular readers will have recognised, I’ve not been posting to this blog during the summer, primarily due to pressure of work. But after a very busy summer, including a stint as interim Operations Director of an AIM-listed software company, my workload has reduced a bit, so I’ll be getting back to posting a few items each week.

Monday, 8 June 2009

Government IT projects – time for change

The debacle of the C-Nomis project (see Tony Collins’ blog for the background) highlights the need for fundamental changes in the way that government (primarily Central Government – local government seems to be far better) procures new IT systems.

As I have previously posted, I recommend breaking large projects down into smaller, more manageable chunks. But, perhaps more importantly, I recommend ensuring that the requirements for the project have been accurately and completely defined, prior to a specification stage that includes detailed walk-throughs with real-life end users. This is the most important phase of an IT project – yet it is typically rushed or overlooked, and frequently completed without adequate reference to the managers and end-users who will be using the system.

I’m a great believer in “phased fixed price” contracts for dealing with large projects that require the development of customised software – splitting each phase into a separate contract, where the current phase is on a fairly firm basis (ideally fixed price against an agreed definition), with budgeted prices for the subsequent phases (typically based on some broad brush assumptions about what will come out of each phase). Such an approach allows the requirements collection phase to be contracted separately and carried out by the potential eventual developer (if you want to know how to do this in a way that allows for subsequent changes of contractor, please contact me).

All too often, major government IT contracts are awarded in their entirety to big service suppliers, rather than splitting off the development stages (which frequently generate lower revenues than the roll-out and related infrastructure stages) to specialist software developers and leaving the other stages to the service suppliers. The culture of specialist software development businesses is different to that of the major service suppliers, and it is a culture more likely to deliver a better software solution (whilst the service suppliers would be better at the other stages of a large new IT project).

As the C-Nomis project proved, a lack of focus on the core requirements, system design and expected benefits resulted in a system that met neither the business objectives nor the project budget and timescale. I suspect that it’s been a great business success for the service supplier, which has benefited from the budget increase from £234M to £513M – no doubt far outweighing any bad PR from this obvious failure.

How many more project failures will there have to be before we see Government recognising the need for splitting up large projects into smaller, more manageable chunks, ideally allocated to different specialists for the different types of contract?

Thursday, 4 June 2009

Capita to divest IBS Revenues & Benefits unit

The Competition Commission has published its final report on Capita’s acquisition of IBS and, unsurprisingly, it has announced that it is requiring Capita to divest the IBS Revenues & Benefits unit.

The report points to several problems with the partial divestment of the R&B unit (Capita would be allowed to keep the Social Housing unit if it can achieve the partial divestment of the R&B unit), but has apparently obtained assurances from Capita that it will pick up any additional customer costs, such as additional licensing costs and/or the separation of an integrated IBS database into separate R&B and SH databases.

I would imagine that existing joint R&B/SH customers of IBS will not be too pleased about losing their single contractor, single point of contact and integrated database. But if Capita had been allowed to keep the IBS R&B unit, perhaps in the long run it would have moved the customers off the IBS R&B product at any rate, so the customers would have lost many of these benefits even without the forced partial divestment.

There seems little doubt that the value of the R&B unit to purchasers will be reduced by a partial divestment, rather than a full divestment of the whole IBS business including the SH unit, but I suspect that Capita will much prefer the partial route. Although the value realised will be less, and the partial divestment will be more time-consuming and messy, the likely smaller price tag means there are likely to be more potential purchasers – and, given the unknowns, I suspect that it will be some time before the divested unit becomes a forceful competitor again in the R&B market.

There is the risk that if a partial divestment is not achieved, the CC will require Capita to go down the full divestment route – a route that I’m sure Capita will wish to avoid at all reasonable costs. Although Capita has announced that it is “in early discussions with interested parties”, no doubt the due diligence and purchasing process will take several months, but it will be very interesting to see who the successful purchaser of the IBS unit turns out to be .....

Wednesday, 3 June 2009

FOI to apply to suppliers?

UKauthorITy.com has pointed out that there is an impending extension of Freedom of Information powers to cover suppliers (see here for the article).

However, I believe that software and associated service suppliers will not be affected – any extension is likely to be limited to BPO contractors – and not software suppliers operating on either a conventional licensing or SaaS approach.

It will be interesting to see where any extension stops as far as IT services and managed services are concerned. Where a managed service is limited to supplying a working computer system for use by public sector employees and/or their contractors, then I believe that any FOI extension will not affect this sort of service any more than current contracts/legislation require.

Where a service supplier provides a fully-outsourced IT service, any extension is likely to be less well defined. My personal view is that FOI should not apply to such contracts, but from the comments flying around at the moment, I believe that there is pressure to bring these types of contracts into the FOI arena. This will no doubt provide lots of work for the legal profession over the coming months/years....

Monday, 1 June 2009

Google Waves hello to Microsoft

Possibly to try to counter Microsoft’s launch of its new search engine Bing, Google last week pre-announced its new Google Wave product - a “new-age communication and collaboration tool” that seems to combine e-mail, instant messaging, and bulletin board functionality, with strong support for multi-media, into a single product.

I’ve only seen the developer preview at last week’s Google I/O 2009 conference (see here for the 80-minute video of the presentation), and whilst it was clearly still buggy in its development form, and will not be on general release before the end of this year, it seems to be a serious future competitor to the Microsoft tools that currently dominate corporate communications.

Particularly impressive to me were the collaboration aspects of Google Wave – potentially very effective for project teams to communicate ideas and make decisions – together with the ability to replay the messages/discussions in order to see how the discussion went from initiation through to current time. Given the potential benefits for improved collaboration, I can see that small technology operations (like software houses) – particularly those using a high level of remote working, be they home-working or distributed offices – will be the early adopters; the question is, will the major corporates follow?
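The replay function is worth dwelling on. I have no sight of Google’s implementation, but conceptually a wave behaves like an append-only log of operations that can be replayed up to any point in time – a minimal sketch (all names my own, purely illustrative):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Op:
    """One edit to a wave: who said what, and when."""
    timestamp: int
    author: str
    text: str

@dataclass
class Wave:
    """A wave modelled as an append-only log of operations."""
    ops: List[Op] = field(default_factory=list)

    def append(self, op: Op) -> None:
        self.ops.append(op)

    def replay(self, until: int) -> List[str]:
        """Reconstruct the discussion as it stood at time 'until'."""
        return [op.text for op in self.ops if op.timestamp <= until]

wave = Wave()
wave.append(Op(1, "alice", "Shall we ship in March?"))
wave.append(Op(2, "bob", "June is more realistic."))
wave.append(Op(3, "alice", "Agreed - June it is."))

print(wave.replay(until=2))  # the discussion as it looked at time 2
# ['Shall we ship in March?', 'June is more realistic.']
```

Because nothing is ever overwritten, “how did we get to this decision?” is just a query over the log.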

However, whilst such tools would be great for in-house use, I struggle to see how one might manage the security implications of opening up such ‘waves’ to users external to one’s own organisation – or even across departments in large corporates or organisations.

I also wonder about the level of server computing power (and network capacity) necessary to support just internal heavy usage – let alone the power/bandwidth necessary for public utilisation.

Google is planning for Google Wave to be open source and, through its pre-launch in San Francisco last week, is trying to encourage the developer community to embrace the product and use the APIs to produce bolt-ons (‘gadgets’ and ‘robots’). The plan appears to promote open networks: anyone will be able to become a Wave operator, with ‘Wave’ running on a distributed, peer-to-peer network model.

Whilst I suspect that Google Wave will need to change before its open launch - to address both potential security and performance issues - its innovative functionality will undoubtedly drive significant changes in the way we all use e-mail and instant messaging in the future. I’m intrigued to see how Microsoft will respond....

Bing launches this week

Wednesday 3 June sees Microsoft launch Bing, its new search engine – although Microsoft describes Bing more as a “decision engine”.

If you want to know more then there are a number of write-ups around, or try looking at Microsoft’s press kits or view a short (less than 3 minutes) video.

However, the initial launch will focus on the American market, with only a Beta version available in the UK – where we’ll have to wait around another 6 months before we see a version of Bing optimised for UK searches.

I had the opportunity to have a demo of Bing when it was known as Kumo, and I was impressed by its potential to make better sense of search results. But the demo had a very strong American bias, and worked best when it stuck to the demo script – ad hoc departures from that script showed up the early stage of the development.

Only in practice will we be able to see if Microsoft has come up with a Google-beater. It will be interesting to try out Bing in anger – even if the results will be focussed towards users on the other side of the Atlantic....

Tuesday, 19 May 2009

Oracle’s acquisition of SUN – what about the software?

Surprisingly, it was the discussion about Oracle/SUN and my postscript about Business Objects that raised the most comments on my post Microsoft to acquire SAP? – in particular, what will happen to the many software products that Oracle acquires with SUN?

Top of the list of products at risk must be the MySQL database. Personally I can’t see Oracle putting much investment in here – fortunately, however, there is a whole community of developers out there that will, I suspect, ensure that the product is not killed off entirely.

Top of the list of products to survive must be Java – Oracle remains a big supporter of Java, and I believe we will see continued development and support and that, despite rumours to the contrary, Oracle will keep Java entirely free.

What about SUN’s operating system, Solaris? Oracle has always run best on Solaris, but with Linux seeming to have greater volume and presence in the market, I can see Oracle finding a way to retire Solaris as the years pass by....

Then there’s OpenOffice.org. Larry has always been anti-Microsoft and looking for a product with which to attack the mighty MS – so perhaps this is a weapon in the great Oracle-vs-Microsoft war. But does OpenOffice really fit into Oracle’s toolbox – particularly for the types of corporate customers Oracle aims at? I think OpenOffice could be at risk but – like MySQL – I believe that its long-term future will be secured by the existing user community, and that it’s not in Oracle’s interest to kill the product off, but to limit its exposure to ongoing costs. No, I don’t see StarOffice re-appearing as an Oracle product....

P.S. Every comment I received agreed that Microsoft would aim to take over SAP, although several felt that the timing would not be until next year....

Tuesday, 12 May 2009

Microsoft to acquire SAP?

Rumours have abounded for the past few years about Microsoft bidding to acquire SAP, and now Microsoft has raised $3.75bn in the first bond issue in its financial history, refuelling those rumours. (Microsoft already has a formidable cash mountain of around $25bn, even after its special dividend of some $32bn back in 2004.)

Would SAP make a good acquisition for Microsoft?

In the past, Microsoft has focussed on selling volume products to smaller customers; in the business area, typically focussing on the SME market for ERP-type applications. Through several acquisitions it has built its Dynamics business primarily around the Great Plains, Navision and Axapta product lines. I believe that the AX ERP product is extremely strong in the business mid-market, but Microsoft has continued to suffer from its “small company” image, and has found it difficult to make significant inroads into larger companies.

Microsoft has also suffered from the same problem with its database, SQL Server – in my mind easily the best technical product in the marketplace, and far superior to Oracle. Although SQL Server is at long last making some inroads into larger organisations, it has been a long and steep climb that has taken many more years than such a good product deserves.

So one can see why Microsoft is considering SAP. Its acquisition would immediately put Microsoft onto the same level as Oracle in the battle for ERP market share, and open up the top end of the market – although one has to wonder what Microsoft would offer to SMEs: Dynamics or SAP Business One? I believe that the Dynamics AX product set is the superior offering in the middle market, although SAP may have more market share. In the short term I would expect Microsoft to support all the products, although it must aim to converge towards one, or possibly two, products in the long term.

With Oracle continuing down its acquisition route, SUN Microsystems being its latest major buy, and Microsoft failing in its bid to acquire Yahoo, Microsoft needs a big acquisition to move forward, and SAP seems perfect.

However, I can see problems ahead if the acquisition is made – my experience of the cultures of the two companies is that they are significantly different. In addition, despite its protestations to the contrary, Microsoft remains a technology company, and has never fully understood the business applications software market – if Microsoft tries to impose its own culture and processes onto SAP, then I hate to think what the result may be. But perhaps SAP will be allowed the upper hand in the business applications area, and Microsoft will benefit from their different approach....

P.S. What about Business Objects, SAP’s most recent large acquisition? Although it comes with Crystal Reports, the reporting engine of choice for many Visual Studio programmers, the Business Intelligence (BI) software would compete directly with Microsoft’s own BI tools. Again, I see the two BI products co-existing in the short term, but I suspect that Microsoft would aim for convergence towards a single BI product set – in this case I’d put my money on the Microsoft tools winning in the long run....

Monday, 11 May 2009

What about IRACIS?

Following on from my post on bringing back Systems Analysts, and continuing a theme related to grandmothers and eggs, I received a comment about development teams not understanding the true objectives of a new system – a comment I agree with. Indeed, in my experience, there are situations where many directors/managers can neither explain nor quantify why a development is taking place (although I’ve found that this latter symptom is much more prevalent in the public sector than the private sector).

Back in my analyst days (admittedly a long time ago) we had the acronym IRACIS drummed into us to define the benefits of new systems:

* Improve Revenue
* Avoid Cost
* Improve Service

Personally, as well as bringing back Systems Analyst roles, I believe it’s time to re-emphasise the importance of IRACIS, to ensure that the benefits of new systems are better defined and understood across the whole organisation – from the sponsoring Director, through the managers, down to the most junior member of the development team.

(Fortunately, many developments do have their expected benefits well defined, but it’s amazing how many don’t – and how many projects that do define their expected benefits never try to quantify them, nor measure them post-implementation to see if the benefits have been realised.)
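For what it’s worth, a benefits register needn’t be complicated. A minimal sketch (the field names and figures are my own invention) of the sort of record I’d expect each IRACIS benefit to carry – a quantified target and, crucially, a slot for the post-implementation measurement:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Benefit:
    """One entry in a benefits register for a proposed system."""
    category: str           # 'IR' Improve Revenue, 'AC' Avoid Cost, 'IS' Improve Service
    description: str
    target: float           # quantified expected benefit, e.g. GBP per year
    measured: Optional[float] = None  # filled in at post-implementation review

register = [
    Benefit("AC", "Staff time saved by self-service channel", target=250_000),
    Benefit("IS", "Reduction in avoidable contact (calls/year)", target=40_000),
]

# Post-implementation review: record the actuals and compare against targets.
register[0].measured = 180_000
for b in register:
    status = ("not yet measured" if b.measured is None
              else f"{b.measured / b.target:.0%} of target achieved")
    print(f"[{b.category}] {b.description}: {status}")
```

The discipline is in the last loop: if no one ever fills in `measured`, the register was just decoration.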

Friday, 8 May 2009

Time to bring back Systems Analysts?

I’ve had a number of discussions with directors and project managers since my posts on Agile vs. Waterfall methods of development (see Agile vs. waterfall methods and Agile methods for package enhancements?) and there have been two common themes to those discussions – the capture of requirements and the production of an overall design.

For developments using Agile methods, it would appear that where these go wrong is, typically, in the collection of an overall set (not necessarily 100% complete) of requirements. Where development teams include senior end users (full- or nearly full-time) who understand the overall requirements fully, the risk of problems diminishes (but definitely doesn’t disappear); but when the end user involvement is limited (either in time or in level of seniority/experience), the risk of problems increases greatly.

Meanwhile, in waterfall developments, whilst the capture of an overall set (again, not necessarily 100% complete) of requirements usually seems fairly good, it is the next, design, stage that seems to be skimped on – and in some cases not carried out at all. (If necessary, see my post Requirements vs. specifications for an explanation of the difference.)

Several of my contacts have rued the passing of the old “Systems Analyst” role, pointing to the current rise of Business Analysts, who focus more on requirements than on system design, and of “Analyst/programmer” roles, which really focus on the programming and technical aspects rather than overall system design – leaving a systems design gap between the requirements and coding stages.

Perhaps it’s time to re-emphasise the systems analysis and systems design role, and bring back good old-fashioned Systems Analysts......

Wednesday, 6 May 2009

NPfIT moving in the right direction?

Last week, the Department of Health announced an initiative that seems to be moving the NPfIT forwards in the right direction.

The DoH has announced the development of a new toolkit which will “allow new products to be developed locally, accredited centrally and linked to existing deployments of information systems such as Cerner and Lorenzo.” However, this toolkit – “a pioneering initiative to take advantage of the latest technological developments” – is not expected to be available until March 2010.

The Department of Health's Director General for Informatics, Christine Connelly, said:

"We now want to open up the healthcare IT market to new suppliers and new technological developments, to inject more pace into this programme. Working together we can help Trusts configure systems to best meet their local needs as well as take advantage of market developments to make more use of the information held in the core systems.”

I just hope that this initiative is used to truly open up the market for new and existing software developers, and not used to restrict access and constrain developments to protect incumbent system suppliers. Early indications seem to be positive – I just hope that this initiative is not stifled by a bout of protectionism similar to that encountered by several of the Government’s e-government interoperability initiatives of the past few years.

Tuesday, 5 May 2009

Capita IBS decision delayed

Decision day for Capita was due to be no later than 5 May, but the Competition Commission “now considers that the completion of its investigation, including the remedies process, and the publication of its final report, will not be possible within the original reference period and has concluded that an extension is necessary because of delays in the provision of information necessary to carry out the inquiry and the need to consider the effectiveness of both a full and a partial divestiture of the IBS business.”

Accordingly, the CC has announced that the “reference period” has been extended by 8 weeks, and the CC report is now expected by 30 June 2009.

Unfortunately, for reasons of confidentiality, I can currently only comment on information in the public domain, but I would not advise anyone to hold their breath whilst awaiting a final decision on the remedies – this could be a long drawn-out process.....

For my previous comments see:
Capita/IBS – Competition Commission progress (February)
Capita/IBS – Northgate notes published (March)
Capita's acquisition of IBS is ruled anti-competitive... (April)

Monday, 27 April 2009

Inefficiency in Local Authorities

Tony Travers’ presentation at Socitm09 set the likely financial scene for local authorities over the coming years:

* increased demand for services due to the recession
* the need for increased support for NGOs
* the need to assist more in economic development
* at best, 0% grant increases for all councils
* a tighter cap on Council Tax

Speaking to several attendees from the supplier community at the conference, there was a widespread belief that this much-increased pressure on LAs will, at long last, force many laggard councils into the process changes that can not only reduce costs significantly but also, potentially, provide a better service to citizens.

Anecdotal evidence from several suppliers of self-service solutions supported the view that many authorities are currently unprepared to adopt self-service and/or abandon traditional high-cost channels of service delivery, as they cannot stomach the implied loss of staff. Procurement decisions are, in many cases, still being left to the departments and staff likely to be affected by the adoption of self-service – i.e. “turkeys voting for Xmas” – unsurprisingly resulting in no, or slow, adoption of such cost-saving initiatives.

It’s disappointing to note that these laggard authorities give local government a bad name in the efficiency stakes, even though the top quartile of LAs have already moved to high levels of self-service, taking the pain of staff cuts in their drive to reduce costs. I just hope that this top quartile will not be penalised in the future, as it becomes easier for laggard authorities (who currently have plenty of slack) to reduce costs, whilst top-quartile authorities who have already cut deeply find it more difficult to reduce costs even further.

Looking forward, for suppliers, it’s clear that my forecast for 2010 onwards remains valid – indeed, I now firmly believe that, for public sector software suppliers, poor revenues will continue into 2012 and possibly beyond....

Operational in-Efficiency Report

Snuck out on the morning of last Wednesday’s budget statement was the 92-page final report of the Treasury’s Operational Efficiency Programme. In theory, the 20-odd pages on Back Office Operations and IT should have made for interesting reading, as they seemingly identified some £7.2b of savings.

However, those readers who take the time to view the document will soon realise that it merely supports the setting of targets for savings of some £4b on back office operations and £3.2b on IT, with no detail or outline plans for how those savings will be made.

Even worse, the paper identifies that Central Government doesn’t know how much it spends on back office operations and IT, let alone how such expenditure compares with the private sector. Indeed, the findings summary states that “examination of back office operations and IT has focused on the need to improve the collection and integration of management information into departmental processes, and to introduce benchmarking and performance reviews. [The] work on IT has aimed particularly at better governance of IT-enabled projects, and greater standardisation and simplification of IT across the public sector.”

Estimates of expenditure on IT across the wider public sector vary from £12.5b to £18.5b – let’s say it’s £16b – against which we must be able to make some savings – let’s say 20% – giving savings of £3.2b – that sounds OK – let’s go with that. (I believe that the authors could have been far more scientific, but in practice they must have been stymied by the lack of any meaningful financial information on the real costs being incurred.)

The report hints at the types of savings that can be made by the use of shared services and outsourcing, but makes no commitment to introducing either in any named area. As ever, decisions seem likely to be left with the departments and organisations themselves, with no clear plans other than to start measuring how much it all really costs now.....

Although the OEP report indicated that it had found it difficult to get information on costs within Local Government (“it is hard to conduct a detailed analysis of this expenditure as it lies in a very devolved landscape”), I believe that local authorities have a much better handle on their costs than central government bodies. Last Thursday at Socitm09, Tony Travers suggested that the OEP report effectively increases the 3% Gershon target for savings to 4% for Local Government. Hardly challenging in the current environment, and I would not be surprised to see this raised further by a new government.

Friday, 24 April 2009

Gladstone produces sound interim results

Gladstone, the supplier of software solutions and services to the leisure and education markets, has today announced a sound set of interim results. Despite the current economic and financial environment, turnover has not fallen, and underlying operating profit was up 6% to £698k.

However, on the negative side, as predicted in my last post, Gladstone has incurred exceptional costs of £690k – primarily the costs associated with defending Constellation’s hostile bid – and has continued to capitalise the development costs of its new product (£352k in the six months).

I was fortunate to have a meeting with Gladstone’s Chairman and Chief Exec, Dr Said Ziai, last month, and was impressed by his confidence in the business. As well as incurring the significant exceptional costs of fighting off Constellation, the fight and the EGM undoubtedly deflected management time away from driving the core business forward, but Said was confident that the business would be successful as the Constellation issues receded and the new developments came on stream.

I regard Gladstone as one of the “old style” of software houses – cash in the bank (available for funding development in a recession and/or acquiring distressed competitors), a clear market leader, and yet with realistic plans for growth over the coming years, recognising the current financial situation and not looking for excessive growth.

A prudent company that will, in my estimation, survive the recession and, once we see the green shoots of recovery, will power ahead.

Thursday, 23 April 2009

Socitm – a conference of two halves....

I’ve just got back from today’s socitm09 National Conference at Stoneleigh, and whilst most of the content was informative, it was also quite depressing....

The morning sessions focussed on the future – Tony Travers giving a broad brush view of the impact of the Credit Crunch on future funding (more about that in a later blog post), Richard Allan giving a refreshing presentation on unlocking the power of local information, and Rose Crozier on how socitm is at long last listening to, and focussing more on the needs of, its members. Although the messages in some cases set strong challenges for the future, there is clearly an uphill struggle ahead for Local Authority IT departments and their staff over the coming years.

The afternoon session, however, was spent looking backwards and did little more than emphasise how the majority of local authorities have not embraced the Internet and the use of electronic self-service sufficiently to reduce costs significantly. Martin Greenwood seems to have drawn the short straw of encouraging LAs to “use the concept of ‘avoidable contact’ to reinvigorate transformation”, and Dhanushka Madawala gave examples of the work Hillingdon had undertaken to reduce avoidable contact – all very basic stuff, but apparently necessary for many authorities....

(I will own up to having skipped the last session on “Digital Inclusion, LA’s and the third sector” in favour of attending IBM’s presentation on Business Process Optimisation).

Yet the exhibitors gave a far more encouraging message about what (presumably the upper quartile of) Local Authorities are doing. Far from reinforcing basic messages on Internet usage, suppliers were extolling their LA customers’ use of Web 2.0, blogs, Twitter and the like to really communicate with their citizens. Very encouraging.

Can so many other LAs really be burying their heads in the sand and failing to embrace modern technology and, amongst other uses, adopt self-service to reduce their own costs?

Ordnance Survey’s new strategy....

Richard Allan’s socitm09 presentation (he’s the Chairman of the Power of Information Task Force) again brought up the serious data licensing problems introduced by Ordnance Survey.

It would appear that OS’s highly restrictive licensing of data is a major obstacle to the public sector’s use of graphical presentations of data – something that is absolutely key to good public access. I’ve encountered these sorts of problems when trying to negotiate access to the NLPG, and the issue relates not to licensing as such but to payment – i.e. the licences are being used as an attempt to extract further funds from OS customers, sometimes very significant sums.

This has surely been disjointed government at its best – one agency, OS, trying to cross-charge other government organisations (e.g. LAs) – charges that the other organisations can’t afford, so the data doesn’t get displayed, and the citizen loses out. No wonder Google and Microsoft Maps are growing in use for displaying geographic data...

However, and perhaps in response to the criticism and competition, I note that the OS has today announced a New Business Strategy, which promises to focus on five key areas:

* Promoting innovation – with an enhanced free OS OpenSpace service to allow experimentation with digital information and a clear path from this service to greater commercialisation;
* Reforming Ordnance Survey’s licensing framework – so that it is much simpler to use Ordnance Survey data and services in other applications;
* Reducing costs over time – to ensure that Ordnance Survey continues to offer value-for-money;
* Supporting the sharing of information across the public sector – to enable better public policy and services;
* Creating an innovative trading entity – to explore commercial opportunities around providing a better platform for consumers to access Ordnance Survey products.

I hope that this will result in a complete about-turn by OS on the fees for use of its data – personally I’m pessimistic – my experience is that strategies in the public sector can take years to implement (and sometimes implementations never see the light of day). The strategy is currently light on detail, but results are promised in the next year – we’ll have to wait to see what transpires over the coming months.....

Wednesday, 22 April 2009

Contracting for developments using Agile methods

One of the questions I had following my article on Agile vs. waterfall methods, was how could customers contract with suppliers for developments using Agile methods?

As I’ve stated in previous posts on contractual matters, the key to success is to ‘define the deliverable’.

One approach is to go for a heavily documented Statement of Requirements and/or Specification (i.e. a fairly well-defined software deliverable), then look for a fixed price development. But this negates much of the benefit of using agile methods and, in my mind, the level of risk for the supplier could be too great (although new entrants to a market, who see the developed software as having some extra value – always assuming they retain the IPR – may see this as an investment to get into a new market). Beware too that if a fixed price approach allows for costed change requests against every change to requirements (inevitable if true agile methods are adopted), then the original fixed price will be purely illusory.

On the basis that a complete SOR doesn’t exist, the development team will involve both customer staff (helping to define/refine requirements) and supplier staff, and will have no clear definition of the deliverable (although I regard an outline scoping document as the minimum for starting any such agile development). In such an environment, the easiest form of contract is a straight Time & Materials contract, where the deliverable from the contractor is a number of “warm” bodies (hopefully with particular application and/or technical skills and experience).

But such a T&M approach puts virtually all the risk back with the customer, and less-trustworthy suppliers may use the agile methods to encourage requirements creep and rework to ensure that their staff’s time on the contract is extended beyond original expectations.....

My discussions on these types of projects have shown the need to develop a trust relationship between customer and supplier – if a watertight contract is proposed so that companies who mistrust each other can work together, it will almost always fail.

As I noted in my earlier post, agile projects are much easier to run with in-house teams where the staff skills and experience are known (hopefully), and the team know that their performance will be measured by their success in getting a quality solution developed to schedule and on budget. If an external supplier is to be brought in, and management of the agile development retained by the customer, then the best contractual approach is one of a T&M form.

If the development project is to be managed externally by the supplier, then clearly during the procurement cycle, the customer needs to satisfy himself of the track record of the supplier in similar developments, prior to contract award. No one-size-fits-all solution exists for the form of contract – they all depend on too many variables ranging from the scope of the project to team size to outline budgets and schedules – but some of the recommendations I make include:

* ensure there is a broad goal for the project
* outline constraints such as budget and timetables upfront so the whole team is aware of them
* define standards for the project up front (UI, coding, data, documentation, testing, etc ...)
* as the development progresses, set detailed goals every few weeks
* define in advance the level of customer involvement in the project (and ensure it is met)
* agree the priorities in order – budget vs. dates vs. functionality
* consider splitting the contract into multiple smaller contracts – some fixed price and some T&M – possibly using the “phased fixed price” approach I’ve discussed before
* consider a performance bonus for meeting defined targets (I know that the current trend is for negative penalties for not achieving targets – but I prefer the much more positive approach of bonuses – they’re much more motivational and, in my experience, deliver more successfully than the threat of penalties).

As noted at the beginning of this post, the key is to be clear on what the deliverables are, and then to agree a pricing formula based on the level of risk to be accepted by each party – the sketch below shows one such mixed structure.
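A minimal sketch (all names and figures invented) of the “phased fixed price” shape, recording the pricing basis per phase – only the current phase is firm:

```python
from dataclasses import dataclass

@dataclass
class Phase:
    name: str
    basis: str       # 'fixed' (firm price against an agreed definition),
                     # 'budget' (broad-brush estimate) or 'T&M'
    price_gbp: int

# Only the current phase is firm; each later phase is re-priced
# (or re-let) as its definition firms up at the end of the phase before.
contract = [
    Phase("Requirements capture", basis="fixed", price_gbp=40_000),
    Phase("Specification & design", basis="budget", price_gbp=60_000),
    Phase("Development", basis="budget", price_gbp=200_000),
    Phase("Roll-out support", basis="T&M", price_gbp=50_000),
]

committed = sum(p.price_gbp for p in contract if p.basis == "fixed")
print(f"Firm commitment today: £{committed:,}")  # the rest remains budgeted
```

The point of the structure is that risk transfers phase by phase, rather than one party carrying it all from day one.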

P.S. As well as providing a “fire fighting” service for problem projects, I also provide consultancy to help new or existing projects proceed successfully and avoid becoming problem projects. If you would like to discuss ways of avoiding problem projects in the future, please contact me at Phil@systemsolveconsultancy.co.uk

Tuesday, 21 April 2009

Disney systems

Today’s advertisement of a tender for a new Command and Control system for Lothian and Borders Police reminds me of a visit I made many years ago to a Scottish police force to view one of its brand new computer systems.

In discussion, one of the end users described the system, in his broad Scottish accent, as a “Disney” system. Confused, we visitors from south of the border had to ask what a “Disney” system was.

“Very simple,” said the officer – “this function disney work, and that function disney work......”

Monday, 20 April 2009

Agile methods for package enhancements?

Firstly, it’s worth welcoming back a number of my readers after what I hope was a good Easter week off. The number of readers and page hits was down by around a third last week – but despite that I received a good number of comments on my article Agile vs. waterfall methods.

Not surprisingly to me, the majority of them were from readers who had previously believed that “waterfall” methods were the only way that a software house could undertake developments safely, and their comments reflected some strong cynicism about the motives of developers and development teams who propose “Agile” methods.

Further discussions with a couple of those commenting on the post introduced an interesting problem – if a software product has been developed using traditional (waterfall) methods, is it possible to introduce Agile methods for new upgrades?

Unfortunately, as in my original article, I had to revert to using the phrase “horses for courses” in my discussions, as again there is no simple answer – it really depends on what you want to develop, who’s developing it, and the expectations of the management team sponsoring the development.

Surprisingly enough, a brand new module may be just the right project on which to trial agile methods – standards for the product will already have been established and documented, and typically the functionality will be relatively small compared to the existing product. Agile methods should allow the product to be seen quickly, and possibly even installed and reviewed early at “warm” customer sites.

However, as in last week’s article, the key to the decision must be the composition and skills of the development team. Not only must they have the necessary technical skills and experience, the team must include someone who understands the requirements in depth – without this latter person, the risk for such an agile development increases dramatically.

It is easy for an agile development to become never-ending, with more functionality being identified and added as the weeks progress. Once the original objectives have been met, it is essential to decide on a first deliverable, stop development, and get the new module out to customers.

Once the new module is completed, as with waterfall development, it is also essential to ensure that it is adequately tested by staff external to the development, and supported by comprehensive documentation to allow for easy maintenance by separate support personnel.

However, as noted last week, perhaps the biggest problem is with senior management – are they prepared to take the risk of an agile development?

Amongst my contacts, it is almost invariably the smaller companies that have adopted agile methods – I know of very few larger application software package developers that have done so successfully. But perhaps that reflects my horses-for-courses view – smaller organisations may have a level of specific staff experience and knowledge that is missing in larger organisations, their projects tend to be smaller, and their management closer to the development teams. However, as the song goes, I detect that the times they are a-changin’......

Wednesday, 15 April 2009

Agile vs. waterfall methods

I’m regularly asked by my customers to advise on the benefits of agile vs. waterfall methods of software development, and which method they should adopt (or continue to use). In my response I regularly use the phrase “horses for courses” in my discussions, as there is no simple answer – it really depends on what you want to develop, who’s developing it and the expectations of (or contractual relationship with) the person/organisation funding the development.

From a technician’s point of view, the agile approach lets him get down to the bits and bytes quickly, using the technology to help him work with end users to collect requirements and build solutions to satisfy those requirements. If you have skilled technicians, either with a high level of understanding of the requirements (unlikely) or with a very good working relationship with end users who not only know the requirements, but also have the ability to spend significant time with the developers, then you may have a project for the agile approach. If those end users are also the project sponsors and are funding the development, then the case for agile development gets stronger.

If, however, the development is to be built in a client-contractor environment, normally against a fixed price or firmly set estimates and timetables, then the level of risk in using agile methods increases. Add in larger contract size, leading to larger teams, and more remote communication with end users, then going down a waterfall approach can quickly become the preferred route.

For application package developers the problems become even greater – they have no one customer, but look for a solution that can handle multiple sets of requirements. Then they have to not only maintain, but also enhance the package over many future years, in many cases without the designers and technicians who wrote the package initially. Also, and perhaps most importantly, senior management are unlikely to sign blank cheques for the development of new packages without some degree of certainty that the developed product will meet their target customers’ requirements, be marketable and delivered within an agreed budget against a realistic time schedule.

From my own experience, I’ve always believed that the coding stage is, surprisingly enough, one of the least important stages – as with any stage, if you get it wrong the project will suffer, but it will suffer less than if you get the requirements collection wrong, or adopt the wrong design. Coding is used to implement a design – no matter how good a coding team is, if it is given a poor or incomplete design, it is highly likely to produce a poor system.

Ever since my early days of developing software, the 50-30-20 split of work (50% of the effort spent on requirements collection and system design, 30% on coding and unit testing, and 20% on system/acceptance testing) has remained curiously static for new software developments. (Before I get a lot of comments: I recognise that the development of new versions of, or replacements for, existing products means that a large proportion of the requirements capture has already been completed – however, the success of any new development still depends on the overall design being absolutely right.)
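In code the rule of thumb is trivial, but I find it a useful sanity check against any new project plan (a sketch – the ratios, not the function, are the point):

```python
def effort_split(total_person_days: float) -> dict:
    """The 50-30-20 rule of thumb for new software developments."""
    return {
        "requirements collection & system design": 0.5 * total_person_days,
        "coding & unit testing": 0.3 * total_person_days,
        "system/acceptance testing": 0.2 * total_person_days,
    }

# e.g. a 200 person-day development:
for phase, days in effort_split(200).items():
    print(f"{phase}: {days:.0f} person-days")
# 100 / 60 / 40 - if a plan strays far from these ratios, ask why.
```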

Evangelists of agile methods will point to large projects developed using waterfall methods, where the requirements collection and analysis phase goes on forever in an attempt to capture every last requirement before the development starts. Then any new project spends so long in development, that by the time it is complete, the requirements have changed and the solution no longer meets the users’ needs.

Exponents of waterfall methods will point to agile developments where the rework needed to build in missed functionality was many times the effort that would have been required had the requirement been identified before coding started; to projects where the quality assurance team could not start generating their test plans until just before the development was complete; and to systems where the user and technical interfaces were inconsistent across the code built by separate individuals.

The advice I give my customers is based on the size of the project, the size and skills of the team, the expectations of the funding organisation or person and, most importantly, the criteria for success. If I advise a waterfall approach, I encourage the use of modern software development tools to produce (normally disposable) prototypes during the analysis and design stages – one of the waterfall projects I reviewed was using Microsoft Expression Blend for prototyping; it just blew my mind, and the prototypes could be reused in the later stages of development. I also encourage high levels of automation for documentation and testing processes.

If I advise an agile approach, I stress the importance of an overall scoping document, with clear standards for look-and-feel, navigation and interactions between modules. I also look for strong management and ownership, not only of the project but also of all common aspects of the development, be they the database or common classes. And no matter how good the developers and the methods adopted, I still recommend a separate testing team.

In all cases I emphasise the involvement of the main project sponsor to decide on scope, give clear direction to the team, and help avoid requirements creep. Too often early expectations are set far too high, and once reality sets in – forecasts seeming unattainable and budgets insufficient – there needs to be open communication between the team and the sponsor to arrive at an acceptable solution.

All too often problems get hidden – “we’re behind but we’ll recover” – and it is only later in the project that confidence crumbles and reality sets in. Unfortunately, with agile developments it is difficult to regain management confidence in revised project plans, due to the lack of measurable performance to date that can be extrapolated towards a final completion date for each phase. It is at this point that a good, involved project sponsor can be the saviour of a project – yet all too often I see sponsors retreating to a ‘hands off’ position and failing to get involved, turning a problem project into an even bigger problem......

P.S. As well as providing a “fire fighting” service for problem projects, I also provide consultancy to help new or existing projects proceed successfully and avoid running into trouble in the first place. If you would like to discuss ways of avoiding problem projects, please contact me at Phil@systemsolveconsultancy.co.uk

Tuesday, 7 April 2009

Proactis back into profit

The spend control software house Proactis has today announced its interim results for the first half-year after its restructure last year.

The company appears to have brought its cost base back into line with its revenue, reducing costs by 24%, and as a result has reported a half-year profit of £362k (vs. a loss of £438k for the same six months last year) on revenue up 6.7% year-on-year. Reassuringly, the business has returned to being cash generative, despite the capitalisation of £207k of R&D expenditure in the period (which flatters the reported profit rather than the cash position).

As I’ve noted previously, whilst Proactis does sell directly (e.g. to the UK Public Sector), it has wisely focussed on building its network of accredited resellers, and has diversified its offerings away from pure e-procurement and spend control. It would also appear that Proactis has strengthened its relationship with Agresso, where I suspect Agresso will use the better functionality of the Proactis products to help it compete against the likes of Civica and others who have stronger offerings than Agresso’s in the procurement area.

As I noted last year, with its move into other markets, and international coverage, Proactis will, I believe, survive and, once we move into a more positive financial environment, should thrive as companies look to replace outdated back office systems. However, I believe that there is a strong chance that Proactis will be acquired by a bigger player (see my post from last year for the names of some potential candidates).

Proactis’ share price is up 2p at 19.5p this morning, giving a market cap of £6M – in my view, in the current financial environment, a fair valuation for a company turning over c £7M per annum and with a half-year EPS of 1.3p.
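
For anyone who wants to check that arithmetic, here is a rough sketch (the figures come from the results above; the calculations and rounding are mine):

    # Back-of-envelope check on the Proactis valuation (my own arithmetic,
    # using the figures quoted above).
    share_price_p = 19.5            # pence
    market_cap = 6_000_000          # £6M
    half_year_eps_p = 1.3           # pence

    shares_in_issue = market_cap / (share_price_p / 100)   # ~30.8M shares
    annualised_eps_p = 2 * half_year_eps_p                 # naive doubling of H1
    implied_pe = share_price_p / annualised_eps_p          # ~7.5

    print(f"Implied shares in issue: {shares_in_issue / 1e6:.1f}M")
    print(f"Annualised EPS: {annualised_eps_p:.1f}p -> P/E ~{implied_pe:.1f}")

A single-figure P/E on naively annualised earnings supports my view that the current price is a fair one for these times.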

Monday, 6 April 2009

Don’t forget the 2012 Olympics

Based on the comments I've received, it would appear that most of you agree with my prediction that 2009-10 will be a tough, but manageable year for those public sector software suppliers who have order books and good recurring revenues to live off, whilst 2010-11 looks like it could be one of the worst years for public sector software suppliers.

However, my prediction of a recovery in 2012 has not received such a high level of agreement – particularly for suppliers to UK Police Forces.


Here, suppliers are concerned about the impact of the 2012 Olympics on Police Forces’ normal procurements – yes, there will inevitably be some additional expenditure on software and services specific to the Olympics themselves, but the expectation appears to be that Forces will put off the procurement of other systems and services until after 2012, both to conserve funds for the inevitable extra workload and costs of the Olympics (not just confined to the Met) and to wait and see what impact the Olympics will have on security issues beyond 2012.

I raised this with a supplier of software and services to larger local authorities, to see if he felt the Olympics could also affect procurements amongst London Boroughs and those authorities outside London that are hosting Olympic events. He felt not – and agreed with my prediction of a recovery in 2012 – but added the caveat that in the current environment, there could be no certainty on any date for a recovery, and that 2012 could be yet another reason for government generally to delay procurement decisions....

e-mail distribution

If you’re one of my many subscribers who receive my posts by e-mail, please accept my apologies for the late delivery of your posts. This seems to be a problem related to the change to BST – I’ve tweaked the time settings and hopefully you will now receive your e-mail updates on the same day new posts are made.....

Thursday, 2 April 2009

IT Forecasts – Holway’s Rant

If you are a Software or IT Services supplier, I strongly recommend that you read Richard Holway’s article today, where (unlike many other optimistic forecasters) he forecasts negative growth for the UK SITS sector in 2009 with no return to positive growth until 2010.

From the conversations that I’ve had with a number of software suppliers, I have to agree with his views for the general software and IT services (SITS) market, although I sense that the downwards curve for SITS companies working in the Public Sector market will be some 12-18 months behind the decline being suffered by suppliers to the private sector.

As I see it, many public sector software suppliers are currently living off orders won over previous years, with new business being hard to find. They are managing their cost bases to reflect their declining order backlog and, in many cases, do not appear to be unduly worried about the future – with the new financial year about to start, they believe that 2009-10 will produce as much business as 2008-09.

As I’ve said before, this may be true in some specific areas (e.g. in Social Services, Housing and, probably, Education), but in other areas my discussions with potential customers have shown an increase in the “if it ain’t broken, don’t fix it” attitude. Also, with increased requirements for information and transactions to be made available electronically, I detect a much stronger move to bringing such services work back in-house, rather than contracting it out.

What could be worse for public sector software suppliers is that 2010 will (almost certainly) be a General Election year, with June 2010 being pencilled in for the actual election. Historically this has meant that procurement processes and decisions, typically kicked off early in the new financial year, are delayed – this time until at least summer 2010 and, I believe, in many cases potentially through to the following 2011-12 financial year – with organisations waiting to see what financial constraints the new government will impose.

So - 2009-10 will be a tough, but manageable year for those suppliers who have order books and good recurring revenues to live off, whilst 2010-11 looks like it could be one of the worst years for public sector software suppliers, with any recovery delayed until 2012 at the earliest......

P.S. For me, the companies I’ve been associated with have always suffered from a recession in years ending in a ‘1’ – starting with 1971, which saw me made redundant (by Fraser Williams) before I had even started with them. We survived 1981 at IAL Gemini on the back of our niche applications, and fortunately Systemsolve was acquired by Radius just before the 1991 recession hit. 2001 was our last bad year, and it looks as if 2011 could continue the trend .....

Wednesday, 1 April 2009

Capita's acquisition of IBS is ruled anti-competitive...

The Competition Commission (CC) has provisionally concluded that the completed acquisition by Capita of IBS could damage competition in the market for the supply of revenues and benefits (R&B) software to local authorities in the UK. This will have come as no surprise to regular readers of this blog, nor will the CC’s decision that there are no similar concerns in the Social Housing software market.

Christopher Clarke, Inquiry Group Chairman, commented:

“This merger combines two closely competing suppliers of revenues and benefits software to local authorities, leaving only one other supplier actively competing for business. In a stable market with little prospect of entry by new suppliers, our provisional conclusion is that the enlarged Capita revenue and benefits business will be able to take advantage of the lack of competition, for example by increasing prices or reducing levels of service to its customers.

We consider it likely that the adverse effects of the merger will have an impact on all customers, whether they are in the process of tendering for new revenues and benefits software or already have a contract for such software in place.”

Given the remit of the CC, this decision was expected, but I still regard it as disappointing, given the nature of the small and declining market for R&B systems. By not giving the CC a wider remit, Government has, I believe, missed a great opportunity to put in place protection for the interests of existing & future IBS users, and possibly even some safeguards for other Capita R&B customers.

But now the decision is made, and the discussion of remedies commences. It would appear that the CC is unlikely to agree to “behavioural remedies” such as price controls or Capita’s maintenance and ongoing development of two R&B systems in parallel. Rather it is looking at the feasibility of splitting off just IBS’s R&B business (from the social housing part) and its viability as a stand-alone business unit, or whether a divestment of the full IBS business is required.

The CC is also looking for the views of potential purchasers of the IBS business (full or R&B only) and constructive suggestions for other remedies, behavioural or structural – although I doubt that there will be any serious suggestions in this latter area.

All parties have been requested to provide their views in writing, including any practical alternative remedies they wish the CC to consider, by 20 April 2009. The CC states that its findings may alter in response to comments received on its provisional findings, in which case it may consider other possible remedies, if appropriate – but I’ll be surprised if they stop short of divestment.

Given the current position, it seems as if divestment is the preferred route, but of what, to whom, and for how much?


As I understand it, the CC will have considerable control over any divestment, deciding on the form of the business to be divested, setting a timetable (typically 6 months), vetting/approving potential purchasers, and generally overseeing the divestment through to completion. I can think of at least two potential, serious bidders who will no doubt be knocking on CC doors over the next few weeks......

NHS to award new NPfIT contract to HHI

One of my moles told me yesterday that the NHS will announce on Wednesday that it has decided to recommend that HHI be appointed as an alternative software supplier to iSoft and Cerner for NPfIT projects.

This is a tribute to the Rapid Application Development (RAD) technology used by HHI, a small UK software company employing only 8 staff, which completed the development of a full care records system using Microsoft Visual Studio tools in just 6 months, and got the system live at the pilot site at Midshire PCT in only 8 weeks.

Andrew Gile, HHI’s Managing Director, praised his three technicians who, prior to the start of the development, had little or no experience of health applications. He explained that the team had been able to work with doctors and nurses from Midshire PCT to understand their requirements and build them into working prototypes, which were then used to explain to NHS managers what the system was supposed to be used for.

Ursula Sit, HHI’s Implementation Manager, believes that the short 8-week implementation time was due to the end user interface being modelled on Facebook. She noted that “most end users were already heavy users of Facebook outside (and some inside) work, and were fully aware of the rich functionality available. The intuitive interface meant that the majority of end users required only a two hour introduction to the system functionality before they felt happy enough to use the system in practice”.

The selection of HHI should be confirmed at a meeting to be held on Wednesday morning with a formal announcement due around midday. Andrew notes that he hopes the announcement will stop the circulation of a great deal of the hot air that has been around the NPfIT project since its inception, and will allow his company, Hoof Hearted Inc, to get on with the planned 3-month roll out of the HHI system to the remaining 2,000 Trusts awaiting a new system......

I should have confirmation of the selection of HHI around lunchtime on Wednesday – check back in the afternoon for an update.


Midday, Wednesday 1 April update: Yes, you guessed it, this was an April Fool post. But what about "agile" development processes - are they better than "waterfall" processes? Revisit this blog over the coming days for my views.....

Monday, 30 March 2009

The importance of touch....

Just by coincidence, one of the companies I met last week was fired up by the use of touch screens using an “iPhone”-like user interface, and over the next week I hope to be looking at a product developed using such technology alongside very large screens in some highly innovative application areas. Then, at a dinner, I raised the topic of such a “Touch” PC being installed in kitchens, primarily for use with cooking and recipes – and was surprised by the remarkable interest it received from the cooks around the table.

Windows Touch will be included in the next version of Microsoft Windows – Windows 7 – and provides a very usable touch interface (see this BBC article for a short demonstration). The interface is very similar to the one built into the iPhone, and similar touch support is apparently being incorporated into Apple’s Snow Leopard OS update. (Indeed, Windows 7 has been described as slick to the (Apple?) core.......)

Microsoft had a false start with this technology a couple of years ago with its Microsoft Surface (TM) technology, based on tabletop computing. This technology was superb to use on a high-resolution, large screen laid flat as a tabletop – but for some reason the product doesn’t appear to have caught on and seems to be limited to the giving of very exciting demos. However, Microsoft Surface has now been rolled out to countries outside the USA, and appears to have gained some take-up in non-hostile user environments – although I’ve yet to see the technology in live operation myself; perhaps it’s still early days in the UK, as its official launch here was only last week.

Personally, I believe that on the back of Windows 7, touch technology will really start to take off. No, it’s unlikely to be adopted by “power” or “professional” users who sit at a PC for hours on end each day – but for the ad hoc user (e.g. the “kitchen” PC) it could become the ‘norm’ for a good user interface experience. If you are not yet looking at such an interface, I would encourage all product managers to investigate the touch facilities of Windows 7 and to start planning now for their use.

Wednesday, 25 March 2009

Maxima and Touchstone

Following on from yesterday’s post on listed software company valuations and private equity interest, I now turn to two more software houses that have grown rapidly through acquisition, but, following recent setbacks, have seen their share prices plummet (to a level where they have become acquisition targets themselves). Neither is heavily involved in the supply of software and services to the UK public sector, although both have completed projects for Government organisations.

Through my involvement with Microsoft, I have been a close follower of both Maxima and Touchstone, who have sought to grow by acquiring many smaller software companies over the past few years, primarily focussed on the supply of software and services from Microsoft’s Dynamics range, although both also supply software and services based on other suppliers’ accounting systems and infrastructure products.

Both companies have issued profit warnings this month – Touchstone’s was expected, as its share price was already down 90% from its 2007 peak, and only moved down a little – whilst Maxima’s came as something of a surprise, and saw its share price plunge to under 60p (down from a peak of 330p in 2007).

There is obvious concern about both companies’ performances deteriorating further, but looking at the revised expectations for the two companies:

                      Touchstone      Maxima
Revenue               c. £30M         £50-56M
Forecast profit       £0.3M           £5M
Net debt              None/minimal    c. £15M?
Forecast eps          1.9p            19.1p
Current share price   18p             57p
Market Cap            £2.3M           £14M
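
Using the figures in the table, a quick sketch (the arithmetic is mine, not the companies’) shows how differently the market is pricing the two:

    # Forward P/E comparison from the table above (my own arithmetic).
    companies = {
        # name: (current share price in pence, forecast EPS in pence)
        "Touchstone": (18.0, 1.9),
        "Maxima": (57.0, 19.1),
    }

    for name, (price_p, eps_p) in companies.items():
        print(f"{name}: forward P/E ~{price_p / eps_p:.1f}")

That gives Touchstone a forward P/E of around 9.5 and Maxima around 3.0 – the market clearly fears that Maxima’s earnings have further to fall.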

Looking at these revised estimates, Maxima seems to be managing the downturn better than Touchstone – perhaps due to over 50% of its revenue being recurring, giving the company some time to cope with any contract terminations – whilst Touchstone seems more dependent on new sales, which have been much more difficult to make in the current financial environment.

The concern must be that Maxima’s downturn, partially cushioned by its contracted recurring revenue, might be just as great as Touchstone’s, with the downwards curve perhaps 6-12 months behind. If its share price continues to fall in expectation of this, perhaps Maxima will decide on a “kitchen sink job” when reporting for the year to 31 May 2009, so as to make the future prospects more palatable.

As with all shares, the key is to call the bottom – I think (hope?) Touchstone’s was the 15p it touched last week – meanwhile I fear that Maxima’s lowest price is a few months away. However, both are acquisition targets as soon as the worst news is out of the way - and then I wouldn’t be surprised to see them both bought by their existing management once the credit markets have recovered.

Of course, a merger of the two would build a very strong player in the Microsoft Dynamics market, a market that I believe will grow very rapidly once the financial climate changes for the better......

P.S. I have been a short-term holder of Maxima shares twice over the past 2 years, both times exiting at the right time – which is good as I’ve been a long-term holder of Touchstone, with my profits on Maxima only just off-setting my losses on Touchstone. I still hold a small number of Touchstone shares.

Tuesday, 24 March 2009

Private equity starts to target software companies again?

With many listed software companies’ share prices hammered over the past months, it is no surprise that private equity has started to target these companies. In the Public Sector market, these depressed share prices and company valuations are likely to lead to more consolidation of suppliers (e.g. by the likes of 3i, behind Civica, or KKR, behind Northgate).

I’ll cover other companies in later posts, but looking at today’s news....

The Innovation Group receives bid

Ok – TIG is not really into the supply of public sector software, but it is a well-known, listed software supplier with a chequered history, and one that has had some very well-known names on its board (the most recent being Geoff Squire, who resigned as Chairman recently).

Its share price had dropped from 55p in 2005 to 20p last year on the back of reduced profits, only to be hammered down to just over 4p recently following December’s announcement that it was being sued by one of its Canadian customers for £42M.

Now it is on the receiving end of a potential offer of 15p from the private equity group Carlyle, valuing the company at c. £100M. Carlyle has offered to complete due diligence in the next two weeks, but has placed a pre-condition that TIG defeat the legal case....

The interesting hook is the timing of any court cases ... “Carlyle said that it could complete the first round of due diligence in two weeks and that its cash offer was subject to Innovation defeating a C$75 million (£42 million) lawsuit from Allstate, of Canada.” Clearly, if this case is to go to court (and possibly appeal) it will take years to get a court decision. So what are the options?

Firstly, I guess that the purchase price is discounted by all or part of the potential £42M claimed – leading to a bid price of around 8p (perhaps explaining why the current share price is only around the 7.5p mark).

Next, the Board could be encouraged to agree a quick out-of-court settlement with the customer – say £10-20M. Personally I think this is unlikely: having been in such a situation before, I know that senior management will be firmly entrenched in a “we will win at all costs” mode rather than a pragmatic “let’s get out of this as quickly and as cheaply as possible” mode.

The most probable scenario is that the case remains open, and the £100M offer is discounted by, say, 20% to cover a potential settlement (which the new owners are more likely to look for, to enable them to clean up their acquisition). So the current 15p offer drops to, say, 12p. (Or there is the risk that Carlyle do not bid at all).
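
To make the scenario arithmetic explicit, here is a quick sketch (the per-share workings are mine, derived from the figures quoted above):

    # Scenario arithmetic for the TIG bid (my own illustration, using the
    # figures above: a 15p offer valuing the company at c. £100M, and a
    # £42M lawsuit hanging over it).
    offer_p = 15.0                          # pence per share
    offer_value = 100_000_000               # £100M
    claim = 42_000_000                      # £42M Allstate claim

    shares = offer_value / (offer_p / 100)  # ~667M shares implied

    # Scenario 1: price discounted by the full claim -> ~8.7p ("around 8p")
    full_discount_p = offer_p - (claim / shares) * 100

    # Scenario 3: offer discounted by ~20% to cover a likely settlement -> 12p
    settlement_p = offer_p * (1 - 0.20)

    print(f"Implied shares in issue: {shares / 1e6:.0f}M")
    print(f"Full-claim discount: ~{full_discount_p:.1f}p; 20% discount: {settlement_p:.1f}p")

The full-claim scenario lands close to the current share price, which suggests the market is already pricing in something near the worst case.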

The other option is that another bidder appears who is prepared to make an offer without any conditions on the legal action (except that his price will reflect whatever discount the current bidders have applied against their 15p bid). Then a bidding war might break out, without any pre-conditions.

If you are a shareholder like me, then I think it’s time to sit tight and see what happens – the bidding war would result in the best price, but without it, I’d bet on at least 12p from the current bidder.

Monday, 23 March 2009

Apologies for the typo

Please accept my apologies for the typo in my previous blog - I've corrected the post, but for those of you who received the post by e-mail or RSS feed, the sentence in question should read:

(However, projects like NPfIT where the contracts appear to have limited this ability for suppliers to exploit changes to fund overruns, have shown that in such confrontational approaches no-one ‘wins’, both client and contractor lose – and the end user gets an IT system that does not meet his needs).

Sunday, 22 March 2009

Civil Servants to lose their large pensions?

The IT disaster known as C-Nomis – an initiative begun in 2004 by the National Offender Management Service to build a single offender management IT system for the prison and probation services – has been well documented over the past week, following a National Audit Office investigation that found the project had been hampered by poor management, leading to a three-year delay, a doubling in project costs and reductions in both scope and benefits.

Computer Weekly used the project to illustrate one of its top tips for project managers, in its editorial:

“Number one in the list of Computer Weekly's top tips for project managers is advice that's supposed to be humorous, even slightly cynical. It says that projects with realistic budgets and timetables don't get approved.

But reality trumps our satire: big projects keep being approved on the basis of unrealistic estimates of their cost and time to completion.

One government project executive has told Computer Weekly that budgeting in government is a game: if the Treasury and the department in question want the scheme approved, they turn a blind eye to irrationally low initial estimates of the cost and the timescales.”


C-Nomis joins the many Central Government projects that have been unmitigated financial disasters – it has cost taxpayers over half a billion pounds – but it is small in comparison with other failed or failing projects like NPfIT that have cost taxpayers billions of pounds.

In any commercial environment, there would have been an internal search for the guilty, inevitable unemployment, and (if the taxpayer had had to bail out the company) potentially the loss of agreed pension rights. It has been suggested that the same fate should befall the civil servants who bear responsibility for these IT disasters – not only unemployment, but also the loss of their valuable, taxpayer-funded pensions. Will it happen?

No chance.....

Firstly, we’ll never find out who was truly guilty – as I noted in my post NHS NPfIT – a successful Government project?, the definition of a successful Central Government project is one that lasts more than two years. As civil servants typically only stay in their positions for around two years, this ensures that the person that started the project doesn’t finish it, and the person that finishes it didn’t start it. If the project is a success, both can claim the credit; whilst if it fails, both can blame each other.

Next, it will always be the fault of the supplier – “underestimated the project”, “Government had no choice but to fund the overrun”, .... etc .... To a certain extent, and in some cases, I might agree, but in most cases the supplier has no choice but to work on the incomplete brief given for the project, bidding low to win the business, on the basis that the gaps in the requirements can be exploited to increase the contract value greatly. (However, projects like NPfIT where the contracts appear to have limited this ability for suppliers to exploit changes to fund overruns, have shown that in such confrontational approaches no-one ‘wins’, both client and contractor lose – and the end user gets an IT system that does not meet his needs).

Finally – and with this I would agree – through no fault of any one individual, the current method of Central Government procurement of major IT projects remains seriously flawed: contracts are insufficiently scoped, requirements are incomplete, end-users are inadequately consulted, and contracts are let prematurely, before either client or contractor knows what is really required (e.g. see my post How NHS NPfIT should have been procured).

We must move back to a procurement process that allows major projects to be properly analysed and designed before final contracts are let – ideally a phased approach that, whilst giving us less certainty on final costs, is more likely to wind up with a properly designed system meeting real end-user requirements, at lower cost and on shorter timescales than the current process delivers......