Wednesday 11 August 2010

Fraud – if only Government knew what Government knows…

The Government has announced yet another attack on fraud in the benefit system. But, like all such pledges made over the past 15 or so years, will it succeed?

The use of external credit checking organisations is a worthwhile step forward – these organisations have the sorts of sophisticated databases and search engines that, despite continual recommendations from the IT industry, Government has consistently failed to put in place over the past two decades. (Within Radius we submitted numerous proposals to central government, yet not one was taken up – the common reason given being that “there’s no money for fraud detection”.)

At the time our strap-line was:

“If only local authorities knew what local authorities know”

It’s still the same today – within local authorities (and some Government departments), data is held in individual departmental silos, inaccessible to their own internal fraud teams, let alone the front-line staff dealing with benefit claims. Each local authority is also an island of information, separated from its neighbouring (and all other) authorities – how many housing benefit claimants claim benefit in one local authority area whilst holding a taxi driver licence in another (or even working for another authority)?
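
To make the cross-silo point concrete, here is a minimal sketch of the kind of data match being described – comparing a housing benefit extract from one authority against a taxi licence extract from another. It is purely illustrative: the file names, field names and matching key are my own assumptions, not any real authority’s schema.

```python
# Illustrative sketch only: matching a housing benefit extract from one
# authority against a taxi licence extract from another. Field names and
# the (NINO, surname) matching key are hypothetical.
import csv

def load_records(path, key_fields):
    """Load a CSV extract and index it by a normalised key."""
    index = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = tuple(row[k].strip().upper() for k in key_fields)
            index.setdefault(key, []).append(row)
    return index

def cross_match(benefit_path, licence_path):
    """Return pairs where a benefit claimant also holds a taxi licence elsewhere."""
    claims = load_records(benefit_path, ["nino", "surname"])
    licences = load_records(licence_path, ["nino", "surname"])
    return [
        (claim, licence)
        for key, claim_rows in claims.items()
        for claim in claim_rows
        for licence in licences.get(key, [])
    ]

if __name__ == "__main__":
    for claim, licence in cross_match("authority_a_benefits.csv",
                                      "authority_b_taxi_licences.csv"):
        print(claim["claim_ref"], "->", licence["licence_ref"])
```

In practice matching is messier than an exact key lookup (names change, NINOs are mistyped), which is exactly why the specialist credit checking organisations, with their fuzzy matching engines, are a worthwhile step forward.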

Meanwhile, central government has numerous lists of relevant names (e.g. the names of tens of thousands of immigration/asylum offenders and absconders who have exhausted the appeals process, and are not entitled to public funds), that, even if they were made available to local authorities and other departments, could not be used because of the lack of investment in counter-fraud computer systems.

The National Fraud Initiative (NFI) operates a data matching service for participating organisations, comparing records across a large number of databases to identify potentially fraudulent activity. In its last report, the NFI claims to have found £215M p.a. of fraud – perhaps a reasonable result in absolute terms, but a very small percentage (around 3%) of the National Fraud Authority’s estimate of £7 billion p.a. of public sector fraud in the system. The biggest gap in this initiative is participation: only one government agency took part, and not a single central government department.

The next gap is within the local authorities and other organisations using NFI – in 2008/09 only 269 prosecutions resulted from the NFI, and although 16,535 blue badges and 21,534 concessionary travel passes were cancelled, this is hardly tackling serious benefit fraud. NFI sends authorities lists of ‘potentially fraudulent activity’, but many (most?) authorities lack the funds, staff and time to investigate the people identified. As one councillor told me, “there’s no money in fraud detection”, whilst another said, “I don’t want to catch housing benefit fraudsters amongst my electorate – they clearly need the money to live, and if government is prepared to pay them, why should the council seek to stop them?”.

Also, as most Officers will say, it is a lot easier to detect fraud at the point it tries to enter the system than after it is already in it. Carrying out an annual data matching exercise is too little, too late – such data checking should be available at the time a claim is submitted, not up to 12 months later. And where authorities’ fraud teams are investigating individuals, or have a known fraudster, there are very few ways that the information on that individual can be shared with other authorities or organisations (worthwhile regional initiatives such as LTAF – London Team Against Fraud – have been starved of cash and doomed from birth).
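
To illustrate the “stop it at the front door” point, here is a minimal sketch of what screening a claim at the point of submission might look like, assuming a hypothetical shared watchlist service. The watchlist names, fields and flags are invented for the example; they are not a description of any existing system.

```python
# Illustrative sketch: screening a claim when it is submitted, rather than
# waiting for an annual batch matching exercise. All names and rules invented.
from dataclasses import dataclass, field

@dataclass
class Claim:
    nino: str
    surname: str

@dataclass
class WatchlistService:
    # e.g. people with no recourse to public funds, or fraudsters already
    # confirmed by another authority (the shared database proposed below)
    no_public_funds: set = field(default_factory=set)
    known_fraudsters: set = field(default_factory=set)

    def screen(self, claim: Claim) -> list[str]:
        """Return flags to be resolved before any payment is made."""
        flags = []
        key = (claim.nino.upper(), claim.surname.upper())
        if key in self.no_public_funds:
            flags.append("not entitled to public funds")
        if key in self.known_fraudsters:
            flags.append("confirmed fraudster at another authority")
        return flags

service = WatchlistService(no_public_funds={("QQ123456A", "SMITH")})
print(service.screen(Claim("QQ123456A", "Smith")))
# ['not entitled to public funds']
```

The point is not the code, but the timing: the check happens before the claim enters payment, not up to 12 months afterwards.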

So how to move forward?

The best way may be to have a unified benefits system under the control of a single body. If this were assisted by each claimant having a unique ‘entitlement’ number or card, then so much the better (NINOs would have helped, had there not been, reputedly, over half a million extra NINOs in existence). This may be the new government's aspiration, but I doubt that it will happen quickly, so in the meantime key initiatives should include:

  • local authorities must be encouraged (via much more generous financial subsidies/payments) to find and stop fraud both as it enters the system and once it is in the system
  • the NFI must be expanded to include all government departments
  • the NFI’s systems should be re-engineered to allow much more frequent data checks (to help stop fraud entering the system), and
  • a secure, centralised database/network must be created to allow the fraud departments within local authorities and other public sector (and perhaps private sector) organisations to share information on confirmed and suspected fraudsters.

P.S. Many Councillors and Officers try to hide their lack of support for counter-fraud data matching behind the Data Protection and other Acts. I won’t try to examine the detail of the complex legal framework, nor add the caveats about informing citizens, but suffice it to say that data matching exercises are OK provided they are solely ‘for the purpose of assisting in the prevention and detection of fraud’.

Thursday 5 August 2010

Wave goodbye to Google Wave

Only just over a year after its launch, Google has announced the end of its Google Wave initiative as a stand-alone product.

As I noted in my Google Waves hello to Microsoft post last year, the initiative introduced an integrated set of tools around email and instant messaging that would make it a whole lot easier to manage internal collaboration than the other tools then on the market. It has driven some changes in other suppliers’ offerings, with Microsoft’s Outlook 2010 introducing a small part of the same functionality (and I think more will come over the coming years).

However, Google apparently never overcame the security implications of Google Wave – one of the factors that contributed to a very poor take-up of the technology by corporate customers. To quote from the Google announcement:

“ …. Wave has not seen the user adoption we would have liked. We don’t plan to continue developing Wave as a standalone product, but we will maintain the site at least through the end of the year and extend the technology for use in other Google projects.”

So perhaps some of the technology will find its way into other Google products….

Monday 2 August 2010

SaaS is the answer – not G-Cloud

Over the past couple of weeks, I’ve posed an interesting conundrum to a number of decision makers involved in purchasing software applications for local authorities. All discussions started off on the premise that ‘cloud’ computing is the way forward, and the question was:

If you were looking for a new back office application, assuming the 5-year cost of ownership is the same, which of the following would have the greatest impact on your decision:

  1. the application runs in G-Cloud?
  2. the application runs in a supplier-managed cloud?
  3. the application runs on the LA-managed cloud/data centre?
  4. the licence agreement is on a SaaS basis?

In all but one case, the answer was pricing based on SaaS, where the LA pays based on usage – there was little concern about where the application was run; the focus was on price and matching that price to usage.

To me this was a surprise, firstly because LAs have historically disliked the ‘blank cheque’ approach of true SaaS agreements, where if usage increases so does the cost – they have typically preferred to budget for a cost and know that the budgeted cost will not be exceeded, come what may. In discussions, most did not expect transaction volumes to increase, and wanted the flexibility of SaaS agreements to allow them to manage and move their transaction volumes as the years progressed.

This led to discussions on the types of SaaS agreements being offered. Most said that they did not regard SaaS agreements that stipulated a minimum level of commitment as true SaaS contracts. They regarded volume pricing as acceptable under SaaS (i.e. if volumes decline below certain levels, then per-unit pricing increases), but only if the increases were not punitive. (Perversely, however, one person believed that suppliers should accept a maximum cap on total annual cost, whilst the LA accepted no minimum level of commitment.)
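
To show how much the commercial terms matter, here is a rough sketch of a banded, usage-based charge with an optional minimum commitment and annual cap. All bands, prices and volumes are invented purely for illustration; no real supplier’s terms are implied.

```python
# Illustrative SaaS pricing sketch: banded per-transaction pricing with an
# optional minimum annual commitment and an optional annual cap.
def annual_charge(transactions, bands, minimum=0.0, cap=None):
    """
    bands: list of (min_volume, unit_price) pairs, e.g.
           [(0, 1.20), (50_000, 1.00), (100_000, 0.85)]
           - lower volumes attract a higher per-unit price.
    minimum: minimum annual commitment (0 for a 'true' SaaS deal).
    cap: optional maximum total annual charge.
    """
    # pick the band with the highest qualifying volume threshold
    _, unit_price = max(
        (band for band in bands if transactions >= band[0]),
        key=lambda band: band[0],
    )
    charge = max(transactions * unit_price, minimum)
    return min(charge, cap) if cap is not None else charge

bands = [(0, 1.20), (50_000, 1.00), (100_000, 0.85)]
print(annual_charge(80_000, bands))                  # 80000.0 - pure pay-per-use
print(annual_charge(30_000, bands, minimum=60_000))  # 60000   - minimum commitment bites
print(annual_charge(150_000, bands, cap=110_000))    # 110000  - cap protects the LA
```

The same transaction volumes produce very different bills depending on whether a minimum or a cap applies – which is exactly why the respondents cared more about these terms than about where the application is hosted.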

I was also surprised by the lack of emphasis on, or questioning about, security levels. Either the messages about overcoming the security hurdles are getting through, or there is a growing cynicism that IT departments over-state the potential problems to support their own preferred solution. Not surprisingly, the one dissenting voice came from Social Services, who put security at the top of his list and would insist on the applications only running under the management of the LA.

Although my sample was not large enough to draw firm conclusions, it would appear that suppliers will need to look harder at their own commercial terms, rather than just the technical hosting solution, if they are to win over new customers in the future.

Thursday 15 July 2010

G-Cloud – data centre or true cloud?

My latest project has given me considerable exposure to the design and development of new applications to be hosted in the ‘cloud’ – albeit for a commercial company rather than a public sector customer. However, it has led to a number of contacts asking for my opinion on the Government’s G-Cloud initiative.

On paper, G-Cloud is potentially a money-saving initiative even if, as I suspect, it is more a shared data centre than a true cloud computing initiative. The question is whether the project can be made attractive to the many, largely autonomous, organisations that form the UK Public Sector. Whilst Central Government departments can perhaps be expected to sign up to this initiative (although I doubt that there will be universal acceptance without the use of a stick or a very large carrot), past experience has shown that other organisations like local authorities, police and others will be far more reluctant.

As I’ve discovered in my cloud projects, once the security issue has been overcome, the next most important factors are cost and the ease with which additional computing power can be brought on stream to deal with peaks in demand. Can G-Cloud match commercial cloud providers such as Microsoft, Amazon or others?

My view is that it is unlikely to. Running a cloud purely for the UK Public Sector means that it will have to be sized to cope with the peak demands of its customers – peaks that will in many cases all occur at the same time. If a flexible pricing policy is adopted, then I suspect it cannot be competitive with commercial cloud suppliers, who manage a wide variety of peak demands and can therefore spread their costs better. And which comes first – the computing power or the demand? In a public sector heavily constrained by budget restrictions, can the computing power be put in place before the demand is contracted?
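
A back-of-an-envelope illustration of the coincident-peaks point, with entirely invented numbers: if every public sector customer peaks in the same window (year-end processing, benefits runs, and so on), the cloud has to be sized for the sum of the peaks, whereas a commercial provider with unrelated customers can size much closer to the sum of the averages.

```python
# Back-of-an-envelope sketch with invented numbers: why coincident peaks
# leave a public-sector-only cloud with poorer utilisation (and hence a
# higher cost per unit of useful work) than a commercial cloud.
customers = [
    # (average load, peak load) in arbitrary capacity units
    (100, 400),
    (80, 350),
    (120, 500),
]

average_total = sum(avg for avg, _ in customers)

# G-Cloud style: the peaks coincide, so capacity must cover their sum.
coincident_capacity = sum(peak for _, peak in customers)

# Commercial style: peaks are spread across many unrelated customers, so
# (simplistically) capacity need only cover the largest single burst on top
# of everyone else's average load.
spread_capacity = average_total + max(peak - avg for avg, peak in customers)

print("Utilisation, coincident peaks:", round(average_total / coincident_capacity, 2))  # 0.24
print("Utilisation, spread peaks    :", round(average_total / spread_capacity, 2))      # 0.44
```

On these made-up figures the dedicated public sector cloud runs at roughly half the utilisation of the commercial one – a gap that has to be paid for somewhere, either in higher on-demand prices or in minimum commitments.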

No – I suspect that G-Cloud will be more a shared data centre, with organisations committing to take up a dedicated level of computing power, with the level of pay-for-what-you-use computing limited by relatively high on-demand pricing (but still likely to be cheaper than running in-house options).

Also, it will be interesting to see if the concept of the Government App Store succeeds or not. In theory it should, but in practice I fear it will be limited by the current architecture of many of the existing applications used within the UK Public Sector. In the short-term, what I expect instead is a few suppliers cleverly offering a SaaS pricing approach on existing ‘legacy’ applications without embracing a true on-demand use of hardware – most probably against a ‘minimum commitment’ that will limit the cost savings for users.

But given a software generation or two, I believe new applications will be developed that have been designed to make optimum use of true on-demand, cloud computing systems. Only then will the real cost benefits of cloud computing possibly be realised by the UK Public Sector.

Monday 12 July 2010

No surprises with cancellation of BSF

I guess the biggest announcement whilst I was in North America was the cancellation of the Building Schools for the Future (BSF) programme.

The cancellation should not come as a surprise to suppliers – the Conservatives’ opposition to such programmes was very clear, and even Alistair Darling’s announcements on capital spending (in the 2009 Budget Report he signalled a halving of the government’s capital programme from £44 billion to £22 billion per annum) meant that, even had Labour been re-elected, the BSF programme was unlikely to continue. Over the past 18 months I’ve worked on a number of business plans in the Education sector, and each one identified the potential cancellation of BSF as a major external risk factor.

Leaving aside the political arguments, and speaking as someone working in the software supplier space, I’m pleased to see the back of BSF. The programme was vastly complicated: bid processes involved crowds of unnecessary people, there were far too many tiers of contractors and sub-contractors, and the resulting main contracts seemed over-priced and poor value for the customer.

The impact on the ICT main contractors will be immense. They have incurred massive bid costs on the basis of gaining volume and recouping those costs by rolling out solutions to multiple schools across the regions – roll-outs that are now unlikely to happen. However, I believe that the canny suppliers will stay in there. The BSF programme may be dead, but the need for new build and modernisation of secondary schools remains; the building programme will be cut back, but the need for ICT will remain. And if procurement is devolved further, then perhaps that ICT procurement will become less complex, it may again be possible to deal with the customer staff who matter, and perhaps, just perhaps, customers will focus more on real educational computing needs rather than the suits’ view of life…

However, in the post-BSF (and post-BECTA) era, if we are to have devolved procurement of new ICT for schools, I believe it will be essential for central government to retain a central advisory and supervisory role to help agree and implement open standards across all areas of software in use in schools. Allowing an unmanaged explosion of small-system developments and implementations could result in a bigger long-term waste of taxpayers’ money than the planned ICT expenditure in the expensive BSF programme.

Seeing off the grizzly in my OWL…

I was fortunate to spend the last week of my time in North America in Yellowstone, where I was able to knock another item off my OWL (Outrageous Wish List) when I met up (at a safe distance) with a grizzly bear and its three cubs.

In talking to my hosts about my OWL, I was once again made aware of the almost exclusive American focus on work, and their general lack of planning of their own personal lives. At this stage I can plug the services of Richard Maybury who, in addition to helping me manage my work activities better, taught me the need to use the same techniques in managing my personal life – one of the ideas being to build an OWL of 10-20 things that you want to do in your life, and plan to knock one or two items off the list each year.

OK – so I’ve only managed to knock four items off my OWL in the past five years (getting to Yellowstone and meeting a bear was one) – but having the OWL (as well as using several other time-management techniques in my personal, as well as my work, life) has made a big difference to my life.

In the USA they talk of bucket lists instead of OWLs, but surprisingly none of the people I met out there have one – do you?

Thursday 20 May 2010

Is the FiReControl project at risk?

In today’s document outlining the new coalition government’s programme for government there is a short comment about the fire services, namely that the government will “stop plans to force regionalisation of the fire service.”

Whether this refers to the FiReControl project or Prescott’s wishes for a merging of the existing 46 English fire services into 9 regions is unclear. Personally I hope that this does not spell the end of FiReControl.

As I’ve said before, the case for FiReControl is very strong – as a nation we would have a far better response to major emergencies were an effective FiReControl system implemented. As ever, the problem is that the procurement of the technology for this project was fatally flawed from day one (see FiReControl – a catalogue of poor judgement and mismanagement).

Whilst the incoming government will inevitably focus on the costs of the project, and the potential savings from cancelling the project, I hope that it doesn’t throw the baby out with the bath water. The current project may be flawed, but the underlying ideas and vision aren’t. If the current project is to be cancelled, let’s hope that the vision remains, and that it is taken forward more effectively and efficiently in a well-specified project with strong user engagement.

HIPs to go – at last….

OK – I predicted the demise of HIPs prematurely (see HIPS will go - but when...) – but the new coalition government has today suspended the use of Home Information Packs (HIPs) by home sellers.

Introduced in 2007, the aim was to speed up the house selling process by obliging sellers to provide much of the required conveyancing information when properties are first put up for sale.

The packs were paid for by sellers and contained property information, title deeds, and local searches. But in practice many prospective purchasers ignored the HIP whilst making their decision, and actual purchasers resorted to getting their own local authority searches.

"Today the new government is ensuring that home information packs are history," said Housing Minister Grant Shapps.

"By suspending home information packs today, it means that home sellers will be able to get on with marketing their home without having to shell out hundreds of pounds upfront. We are committed to greener housing so from now on all that will be required will be a simple energy performance certificate" he added.

Thursday 1 April 2010

FiReControl – a catalogue of poor judgement and mismanagement

No – that title isn’t mine – it’s from Communities and Local Government (CLG) Committee Chair Dr Phyllis Starkey when launching the report of an enquiry into the FiReControl project (a programme to replace 46 local fire and rescue service control rooms with nine purpose-built regional control centres).

This will come as no surprise to regular readers who may remember my December 2008 post on this project:

“the project itself smacks of Government’s usual inability to follow best practices when procuring new IT systems… it has failed to involve key users in its design early enough, initially imposed a massively optimistic timescale for implementation, and seemingly failed to allow any contingencies in its plans and budgets.”

Yet again Central Government is giving us a lesson on how not to procure and implement new IT projects. Quotes from the evidence presented to the committee include:

“The problem stems again from a lack of user engagement at the early stages of the project.”

“the rush to procurement meant the level of detail in the specification did not reflect what the professional people were saying. That has plagued the project ever since, both in terms of delays and being over-optimistic about how quickly it could be delivered, how much it was going to cost, and why certain things that were absolutely necessary were never specified and other things were put in that were not needed.”

As I have posted so many times, this is yet another project that went wrong before the initial contract was signed. The matter appears to have been compounded by (yet again, one of my pet topics) the “adversarial relationship between the customer and contractor”. Central Government must move away from its current way of procuring these innovative systems:

  • Government under-defines the requirements
  • Suppliers bid knowing that the requirements will change
  • Government awards the contract on the basis of price rather than value
  • Government then involves end users, who identify substantial changes to the requirements
  • (in many cases like this, the initial software solution is found not to meet the new requirements)
  • Suppliers use change control procedures to delay the schedule and increase the price of the contract to reflect the additional work required to meet the changes
  • Customer and supplier fall out – to the overall detriment of the project

I have great sympathy with both the supplier’s and the CLG’s management staff on this project. They appear to have done the best they can given the framework under which Government procures these types of projects. Although I do wonder what the unsuccessful bidders for the original project said in their proposals – did they point out the likely problems and allow for them in their bids (and get ruled out because of the resulting higher price and/or longer schedule)?

As I noted in my 2008 posts, we need this project to work – once implemented it should give us one of the best operational systems in the world. The good news is that main contractor EADS has entered into a new subcontract with Intergraph for its well-respected I/CAD product. Intergraph already appears to have stamped its authority and experience on this project and, whilst the lack of fully defined requirements so late in the project gives cause for concern, I have more confidence that they will be able to deliver a working central system than before their appointment.

P.S. You may be interested in some of my other posts on these topics:

Thursday 18 March 2010

Microsoft Mix 10 – designing Modern Web Apps

My current project is based around an interactive application delivered over the Internet to both full screen browsers and mobile devices with small form factors. Of key importance to the business plan is that the application be easy to use, and be capable of use by citizens who are not necessarily computer literate. (It’s also going to be delivered via Microsoft’s Azure cloud computing platform – but that’s another story).

As the product is being built with Microsoft tools, I’ve been up early this week watching the videos from the sessions given at Microsoft’s Mix 10 event in Las Vegas (you need to have downloaded or streamed the videos before the east coast of the US wakes up – from about midday onwards the response is very, very slow).

There is a lot about technologies not strictly relevant to my current project, but of the Azure and Web Apps presentations I’ve seen to date, the best has been from Luke Wroblewski (not an MS employee) on the topic of Modern Web Form Design. In summary, Luke describes how to use modern web technologies to deliver better end-user experiences, and illustrates his talk with research results on end-user acceptance and use of techniques such as in-line validation, AJAX accordions and other approaches aimed at providing a better end-user experience.

If, like me, you have an interest in this area (and would like to learn more about the methods to adopt in building web apps for small form factor mobile devices), then I thoroughly recommend the video (although be advised that it is just over an hour long).

P.S. For those who want to know more about the Azure cloud computing platform at a fairly high level, then I recommend a Lap around the Windows Azure platform (although, yet again, this is about an hour long). This demonstrates the ease with which apps can be deployed to the cloud – although I can’t believe it’s as easy as the demo……

Monday 15 March 2010

How to split large Government IT projects

I’ve been intrigued by the debate on large Government projects and the use of the larger service suppliers that has been prompted by the Conservative Technology manifesto. Some rush to the defence of the larger suppliers, whilst others, typically coming from the SME sector like me, point to the way the current procurement process fails to include SMEs adequately.

My experience of working as a potential subcontractor to the big service suppliers is that, even though you may have a market-leading software solution, they will try to find a way to prove that the end customer will be better off with a customised solution built, typically from scratch, with lots of chargeable days from the main contractor, rather than making use of an SME solution. And how many software package selections do main contractors make on the basis of the amount of services they themselves will provide to implement the solution (rather than on a possibly better/cheaper solution that doesn’t involve oodles of services from the main contractor)?

These large services companies are in these large projects to generate services revenue, maximise their margins and make money for themselves – not for their subcontractors, whom they will use only when they really need to, and typically then only with loads of chargeable time from the main contractor to oversee the subcontract procurement and the subsequent management of the implementation project.

But why would we expect otherwise? It costs a great deal to bid for Government work, and once it’s won, who would expect the supplier to do anything else? No – the problem lies in the way Government structures, procures and manages these projects, not in the way the big services companies work.

Even with the Conservatives’ proposed £100M limit on IT projects, the projects are likely to fall outside the types of project that SMEs can bid for directly. Central government needs to change the way it structures larger deals and uses the larger services suppliers to oversee them. Yes, use a main contractor in a management role or with responsibility for integration, but make it clear how far that role goes – and in particular that whoever manages the procurement and oversees the project cannot fulfil any of the other roles. Why not set a minimum percentage of the project value that must be spent with SME subcontractors?

More importantly, split the application software development into projects separate from the implementation and roll-out (and have separate infrastructure supply and support projects). For major new developments, fund two or three SMEs to develop software in competition, keeping the best solution but being prepared to throw away one or more of the developed solutions before the cost of implementation and roll-out is incurred – even though the developments have been paid for. Get experienced software developers involved sooner, and in touch with the end users, to develop software that really meets their needs.

Let’s get a contractual framework where the main contractors are focused less on where their own services revenue will come from, and more on how to provide the best solution for the customer.

P.S. The Conservative plan for a small in-house ‘skunkworks’ team, to develop low-cost applications and advise on the procurement of larger projects, seems like a step in the right direction. But will Government be able to recruit the appropriate resources? With all due respect to the IT civil servants I’ve met, in most cases they are not the types of staff who will be best suited to this new role. As noted above, why not make use of the staff in SMEs, calling on a much wider pool of experience, and in many cases experience relevant to the specific project in mind…

Thursday 4 February 2010

Fraudulent misrepresentation – what now?

There has been a lot of press comment over the past week about the recent Court ruling that HP/EDS must pay damages (“in excess of £200M”) to BSkyB for a failed CRM system. Surprisingly, much of this comment seems to suggest that this case will result in significant changes to the ways that IT suppliers will sell and contract in the future.

Yet the basis of the Court decision is that HP/EDS was guilty of fraudulent misrepresentation, and that HP/EDS could not rely on its ‘limit of liability’ clause to limit the amount of damages it had to pay to BSkyB. But this is neither a change to contract law nor a new interpretation – under the Unfair Contract Terms Act 1977 (UCTA) suppliers have always been unable to exclude liability for fraudulent misrepresentation, nor can they limit liability for such fraud.

What is surprising is the size of the likely damages (several times the value of the original contract), the apparently blatant misrepresentation carried out, and that the case ever came to court (in most similar cases there is an out-of-court settlement – note that in this case HP/EDS is rumoured to have spent over £40M in legal fees to date, suggesting that, yet again, the real winners in such cases are the legal eagles).

Hopefully, this case will serve as a wake-up call to Directors and senior managers to revisit their own internal procedures, training and guidance to all their customer-facing staff – and not just their sales staff (although they are the main concern), as it is just as likely that pre-sales staff, consultants and/or other staff could misrepresent the capabilities of a system being proposed to a prospect.

Most importantly, in addition to the proposal/tender vetting process, the contractual negotiation phase must be used by a supplier to fully vet its own proposal, collecting together any documentation and/or ‘side letters’, to try to prevent the prospect from relying on any statements that could be false. When I used to negotiate larger contracts I always openly asked the customer if there were any statements, emails or documents that they were relying upon – and if so, I insisted that they were referenced in or included in the contract.

As I have said many times before, from my own experience gained in trying to turn around problem projects – as clearly happened in this BSkyB project - most failed projects have gone wrong before the contract is signed….

Thursday 28 January 2010

iPad – not yet for me

After all the hype, the iPad has been released – and it looks as if it’s just a larger version of an iPhone or iPod touch. Yes, it looks great with its sharp colour display, oleophobic screen and touch interface, but will it perform? (Many thanks to gizmodo for the picture – they also have a good overview of the iPad.)

If one steps back from the hype (Steve – great launch, by the way) and compares this to, say, a tablet PC running Windows 7, then the iPad comes up lacking many key features that existing laptop, PC and Mac users would require.

The biggest failing I’ve seen is the iPad’s lack of any multi-tasking (already a major failing on the iPhone) – except, of course, for Apple’s own applications. So forget writing an e-mail whilst working on a document or keeping an eye on Twitter.

Then of course there is Apple’s continuing refusal to support Flash – apparently to be continued with the iPad – which rules out many of the best web sites, which will appear on the iPad with large blank holes.

Will the iPad run any application you choose? Almost certainly not – Apple will retain its control over the apps it allows, all of which can only come through the Apple App Store, effectively letting Apple control, and censor, the applications that individual users can access. Want to run Google Voice or a browser other than Safari? Not allowed.

Cameras? – not yet – so no video conferencing.

So my view is that the iPad will be solely a consumer device – and one bought primarily by individuals who don’t make heavy use of computers at the moment – functionally, it really is like an iPhone with a larger screen.

What is more interesting is what the impact on the market will be. Microsoft, HP and others have pushed tablet PCs in the past without much success, and the touch-screen user interface of Windows 7 has made little headway to date. Will the launch of the iPad revitalise these initiatives and lead to a rise in keyboard-less, touch-screen netbooks and tablets? I certainly think so. Like many initiatives started by Apple, in the longer term I think the iPad will result in significant changes to the way we view and interact with mobile computers and PCs.