Wednesday, 11 August 2010

Fraud – if only Government knew what Government knows…

The Government has announced yet another crackdown aimed at driving fraud out of the benefit system. But will it succeed where similar pledges made over the past 15 or so years have failed?

The use of external credit checking organisations is a worthwhile step forward – these organisations have the sorts of sophisticated databases and search engines that, despite continual recommendations by the IT industry, Government has consistently failed to put in place over the past two decades. (Within Radius we submitted numerous proposals to central government, yet not one was picked up – the common reason given being that “there’s no money for fraud detection”.)

At the time our strap-line was:

“If only local authorities knew what local authorities know”

It’s still the same today – within local authorities (and some Government departments), data is held in individual departmental silos – inaccessible to their own internal fraud teams, let alone the front-line staff dealing with benefit claims. Each local authority is also an island of information, separated from its neighbouring (and all other) authorities – how many housing benefit claimants claim benefit in one local authority area whilst holding a taxi driver licence in another (or even working for another authority)?
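To make this concrete, here is a minimal sketch of the sort of cross-silo check involved. The record layouts and field names are invented for illustration, and a real exercise would need proper identifiers and fuzzy matching rather than a simple name and date-of-birth join:

```python
# Toy illustration only: flag housing benefit claimants in one authority
# who also hold a taxi driver licence in another. Record layouts and
# field names are invented for this sketch.

def normalise(name):
    """Crude normalisation so 'J. Smith' and 'j smith' compare equal."""
    return " ".join(name.lower().replace(".", "").split())

def cross_match(benefit_claims, taxi_licences):
    """Return (claim, licence) pairs sharing a name and date of birth."""
    licence_index = {
        (normalise(rec["name"]), rec["dob"]): rec for rec in taxi_licences
    }
    matches = []
    for claim in benefit_claims:
        key = (normalise(claim["name"]), claim["dob"])
        if key in licence_index:
            matches.append((claim, licence_index[key]))
    return matches

claims = [{"name": "J. Smith", "dob": "1970-01-01", "authority": "Borough A"}]
licences = [{"name": "j smith", "dob": "1970-01-01", "authority": "Borough B"}]
for claim, licence in cross_match(claims, licences):
    print(f"{claim['name']} claims benefit in {claim['authority']} "
          f"and holds a taxi licence in {licence['authority']}")
```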

Meanwhile, central government has numerous lists of relevant names (e.g. the names of tens of thousands of immigration/asylum offenders and absconders who have exhausted the appeals process and are not entitled to public funds) that, even if they were made available to local authorities and other departments, could not be used because of the lack of investment in counter-fraud computer systems.

The National Fraud Initiative (NFI) operates a data matching service for participating organisations, matching data across a large number of databases to identify potentially fraudulent activity. In its last report, the NFI claims to have found £215M p.a. of fraud – perhaps a reasonable result in absolute terms, but only around 3% of the National Fraud Authority’s estimate of £7 billion p.a. of fraud in the public sector. The biggest gap in the initiative is at the centre: only one government agency took part, and not a single central government department participated.

The next gap is within the local authorities and other organisations using NFI – in 2008/09 only 269 prosecutions resulted from the NFI, and although 16,535 blue badges and 21,534 concessionary travel passes were cancelled, this is hardly tackling serious benefit fraud. NFI sends authorities lists of ‘potentially fraudulent activity’, but many (most?) authorities lack the funds, staff and time to investigate the people identified. As one councillor told me, “there’s no money in fraud detection”, whilst another said, “I don’t want to catch housing benefit fraudsters amongst my electorate – they clearly need the money to live, and if government is prepared to pay them, why should the council seek to stop them?”.

Also, as most Officers will say, it is a lot easier to detect fraud at the point it tries to enter the system than after it is already in the system. An annual data matching exercise is too little, too late – such data checking should be available at the time a claim is submitted, not up to 12 months later. And where authorities’ fraud teams are investigating individuals, or have identified a known fraudster, there are very few ways that information on that individual can be shared with other authorities or organisations (worthwhile regional initiatives such as LTAF – London Team Against Fraud – have been starved of cash and doomed from birth).
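The point-of-entry argument is simple enough to sketch. Nothing below reflects any real system – the ‘matching service’ is a hypothetical stand-in for whatever shared lookup might be made available – but it shows the shape of a check run at submission time rather than in an annual batch:

```python
# Hypothetical sketch: check a claim against shared data the moment it
# is submitted, instead of waiting for an annual matching exercise.

class MatchingService:
    """Stand-in for a shared, queryable counter-fraud dataset."""

    def __init__(self, flagged):
        self.flagged = flagged  # e.g. {nino: "reason the record is flagged"}

    def check(self, claim):
        reason = self.flagged.get(claim["nino"])
        return [reason] if reason else []

def submit_claim(claim, service):
    """Run the data match before the claim enters the payment system."""
    hits = service.check(claim)
    if hits:
        return {"status": "referred to fraud team", "reasons": hits}
    return {"status": "accepted"}

service = MatchingService({"QQ123456C": "not entitled to public funds"})
print(submit_claim({"nino": "QQ123456C"}, service))  # referred
print(submit_claim({"nino": "QQ654321A"}, service))  # accepted
```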

So how to move forward?

The best way may be to have a unified benefits system under the control of a single body. If this were assisted by each claimant having a unique ‘entitlement’ number or card, then so much the better (NINOs would have helped, had there not been, reputedly, over half a million extra NINOs in existence). This may be the new government’s aspiration, but I doubt that it will happen quickly, so in the meantime key initiatives should include:

  • local authorities must be encouraged (via much more generous financial subsidies/payments) to find and stop fraud, both as it enters the system and once it is within it
  • the NFI must be expanded to include all government departments
  • the NFI’s systems should be re-engineered to allow much more frequent data checks (to help stop fraud entering the system), and
  • a secure, centralised database/network must be created to allow the fraud departments within local authorities and other public sector (and perhaps private sector) organisations to share information on confirmed and suspected fraudsters.

P.S. Many Councillors and Officers try to hide their lack of support for counter-fraud data matching behind the Data Protection Act and other legislation. I won’t try to examine the detail of the complex legal framework, nor add the caveats about informing citizens, but suffice it to say that data matching exercises are OK provided they are solely ‘for the purpose of assisting in the prevention and detection of fraud’.

Thursday, 5 August 2010

Wave goodbye to Google Wave

Only just over a year after its launch, Google has announced the end of its Google Wave initiative as a stand-alone product.

As I noted in my Google Waves hello to Microsoft post last year, the initiative introduced an integrated set of tools around email and instant messaging that promised to make it a whole lot easier to manage internal collaboration than the other tools then on the market. It has driven some changes in other suppliers’ offerings, with Microsoft’s Outlook 2010 introducing a small part of the same functionality (and I think more will come over the coming years).

However, Google apparently never overcame the security implications of Google Wave – one of the factors that contributed to a very poor take-up of the technology by corporate customers. To quote from the Google announcement:

“… Wave has not seen the user adoption we would have liked. We don’t plan to continue developing Wave as a standalone product, but we will maintain the site at least through the end of the year and extend the technology for use in other Google projects.”

So perhaps some of the technology will find its way into other Google products….

Monday, 2 August 2010

SaaS is the answer – not G-Cloud

Over the past couple of weeks, I’ve posed an interesting conundrum to a number of decision makers involved in purchasing software applications for local authorities. All discussions started off on the premise that ‘cloud’ computing is the way forward, and the question was:

If you were looking for a new back office application, assuming the 5-year cost of ownership is the same, which of the following would have the greatest impact on your decision:

  1. the application runs in G-Cloud?
  2. the application runs in a supplier-managed cloud?
  3. the application runs on the LA-managed cloud/data centre?
  4. the licence agreement is on a SaaS basis?

In all but one case, the answer was pricing based on SaaS, where the LA pays based on usage – there was little concern about where the application ran; the focus was on price and matching that price to usage.

To me this was a surprise, because LAs have historically disliked the ‘blank cheque’ approach of true SaaS agreements, where if usage increases so does the cost – they have typically preferred to budget for a cost and know that the budgeted cost will not increase, come what may. In discussions, most did not expect transaction volumes to increase, and wanted the flexibility of SaaS agreements to allow them to manage and move their transaction volumes as the years progressed.

This led to discussions on the types of SaaS agreements being offered. Most said that they did not regard SaaS agreements stipulating a minimum level of commitment as true SaaS contracts. They regarded volume pricing as acceptable under SaaS (i.e. if volumes decline below certain levels, then per-unit pricing increases), but only if the rates were not punitive. (Perversely, one person believed that suppliers should accept a maximum cap on total annual cost, whilst the LA accepted no minimum level of commitment.)
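The differences between these contract styles are easiest to see in a toy cost model. All the tier boundaries and prices below are invented numbers, not anything quoted by a supplier:

```python
# Invented figures throughout - this just contrasts the contract styles
# discussed above: pure pay-per-use, volume tiers, minimum commitment
# and a capped total.

def unit_price(volume):
    """Volume pricing: the per-unit price rises as volumes fall."""
    if volume >= 100_000:
        return 0.50
    if volume >= 50_000:
        return 0.60
    return 0.75

def annual_cost(volume, minimum_commitment=0, cap=None):
    """Annual cost with an optional billing floor and total-cost ceiling."""
    billable = max(volume, minimum_commitment)   # the floor LAs dislike
    cost = billable * unit_price(billable)
    if cap is not None:                          # the cap one respondent wanted
        cost = min(cost, cap)
    return cost

print(annual_cost(40_000))                             # true SaaS: 30,000.0
print(annual_cost(40_000, minimum_commitment=60_000))  # floor bites: 36,000.0
print(annual_cost(200_000, cap=90_000))                # cap bites: 90,000
```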

I was also surprised by the lack of emphasis on, or questioning about, security levels. Either the messages about overcoming the security hurdles are getting through, or there is a growing cynicism that IT departments over-state the potential problems to support their own preferred solution. Not surprisingly, the one dissenting voice came from Social Services, who put security at the top of his list and would insist on the applications running only under the management of the LA.

Although my survey size was not large enough to draw firm conclusions, it would appear that suppliers will need to look harder at their own commercial terms, rather than just the technical hosting solution, if they are to win over new customers in the future.

Thursday, 15 July 2010

G-Cloud – data centre or true cloud?

My latest project has given me considerable exposure to the design and development of new applications to be hosted in the ‘cloud’ – albeit for a commercial company rather than a public sector customer. It has also led to a number of contacts asking for my opinion on the Government’s G-Cloud initiative.

On paper, G-Cloud is potentially a money-saving initiative even if, as I suspect, it is more a shared data centre than a true cloud computing initiative. The question is whether the project can be made attractive to the many, largely autonomous, organisations that form the UK Public Sector. Whilst Central Government departments can perhaps be expected to sign up to this initiative (although I doubt that there will be universal acceptance without the use of a stick or a very large carrot), past experience has shown that other organisations, such as local authorities and police forces, will be far more reluctant.

As I’ve discovered in my cloud projects, once the security issue has been overcome, the next most important factors are cost and the ease with which additional computing power can be brought on stream to deal with peaks in demand. Can G-Cloud match commercial cloud providers such as Microsoft, Amazon or others?

My view is that it is unlikely to. Running a cloud purely for the UK Public Sector means that it will have to be sized to cope with the peak demands of its customers – peaks that will, in many cases, all occur at the same time. If a flexible pricing policy is adopted, then I suspect that it cannot be competitive with commercial cloud suppliers, who manage a wide variety of peak demands and can therefore spread their costs better. And which comes first – the computing power or the demand? In a public sector heavily constrained by budget restrictions, can the computing power be put in place before the demand is contracted?
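A back-of-envelope sum illustrates the correlated-peaks problem. The capacity figures are invented and real utilisation profiles are far messier, but the gap is the point:

```python
# Invented capacity units. If all customers peak together, you size for
# the sum of the peaks; if peaks never coincide, you size for the base
# load plus the single largest burst.

customers = {
    "authority_a": {"base": 10, "peak": 40},
    "authority_b": {"base": 12, "peak": 45},
    "authority_c": {"base": 8,  "peak": 35},
}

peaks_together = sum(c["peak"] for c in customers.values())
base_load = sum(c["base"] for c in customers.values())
peaks_staggered = base_load + max(
    c["peak"] - c["base"] for c in customers.values()
)

print(f"capacity needed if peaks coincide:  {peaks_together}")   # 120
print(f"capacity needed if peaks staggered: {peaks_staggered}")  # 63
```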

No – I suspect that G-Cloud will be more a shared data centre, with organisations committing to a dedicated level of computing power, and with pay-for-what-you-use computing limited by relatively high on-demand pricing (though still likely to be cheaper than running in-house options).

Also, it will be interesting to see whether the concept of the Government App Store succeeds. In theory it should, but in practice I fear it will be limited by the current architecture of many of the existing applications used within the UK Public Sector. In the short term, what I expect instead is a few suppliers cleverly offering SaaS pricing on existing ‘legacy’ applications without embracing true on-demand use of hardware – most probably against a ‘minimum commitment’ that will limit the cost savings for users.

But given a software generation or two, I believe new applications will be developed that are designed to make optimum use of true on-demand cloud computing. Only then are the real cost benefits of cloud computing likely to be realised by the UK Public Sector.

Monday, 12 July 2010

No surprises with cancellation of BSF

I guess the largest announcement whilst I was in North America was the cancellation of the Building Schools for the Future (BSF) programme.

The cancellation should not come as a surprise to suppliers – the Conservatives’ opposition to such programmes was very clear, and even Alistair Darling’s announcements on capital spending (in the 2009 Budget Report he signalled a halving of the government’s capital programme from £44 billion to £22 billion per annum) meant that, even had Labour been re-elected, the BSF programme was unlikely to continue. Over the past 18 months I’ve worked on a number of business plans in the Education sector, and each one identified the potential cancellation of BSF as a major external risk factor.

Leaving aside the political arguments, and speaking from the software supplier space, I’m personally pleased to see the back of BSF. The programme was vastly over-complicated: bid processes involved crowds of unnecessary people, far too many tiers of contractors and sub-contractors were involved, and the resulting main contracts seemed over-priced and poor value for the customer.

The impact on the ICT main contractors will be immense. They have incurred massive bid costs on the basis of gaining volume and recouping those costs by rolling out solutions to multiple schools across the regions – roll-outs that are now unlikely to happen. However, I believe that the canny suppliers will stay in there. The BSF programme may be dead, but the need for new build and modernisation of secondary schools remains; the building programme will be cut back, but the need for ICT will not. And if procurement is devolved further, then perhaps ICT procurement will become less complex, it may again be possible to deal with the customer staff that matter, and perhaps, just perhaps, customers will focus more on real educational computing needs rather than the suits’ view of life…

However, in the post-BSF (and post-BECTA) era, if we are to have devolved procurement of new ICT for schools, I believe it will be essential for central government to retain a central advisory and supervisory role to drive the agreement and implementation of open standards across all areas of software in use in schools. Allowing an unmanaged explosion of small-system developments and implementations could result in a bigger long-term waste of taxpayers’ money than the planned ICT expenditure in the expensive BSF programme.

Seeing off the grizzly in my OWL…

I was fortunate to spend the last week of my time in North America in Yellowstone, where I was able to knock another item off my OWL (Outrageous Wish List) when I met up (at a safe distance) with a grizzly bear and her three cubs.

In talking to my hosts about my OWL, I was once again made aware of the almost exclusive American focus on work, and a general lack of planning of their own personal lives. At this stage I should plug the services of Richard Maybury who, in addition to helping me manage my work activities better, taught me to apply the same techniques to managing my personal life – one of the ideas being to build an OWL of 10–20 things that you want to do in your life, and to plan to knock one or two items off the list each year.

OK – so I’ve only managed to knock four items off my OWL in the past five years (getting to Yellowstone and meeting a bear was one) – but having the OWL (as well as using several other time-management techniques in my personal, as well as my work, life) has made a big difference to my life.

In the USA they talk of bucket lists instead of OWLs, but surprisingly none of the people I met out there had one – do you?

Thursday, 20 May 2010

Is the FiReControl project at risk?

In today’s document outlining the new coalition government’s programme for government, there is a short comment about the fire services, namely that the government will “stop plans to force regionalisation of the fire service.”

Whether this refers to the FiReControl project or to Prescott’s wish to merge the existing 46 English fire services into 9 regions is unclear. Personally, I hope it does not spell the end of FiReControl.

As I’ve said before, the case for FiReControl is very strong – as a nation we would have a far better response to major emergencies were an effective FiReControl system implemented. As ever, the problem is that the procurement of the technology for this project was fatally flawed from day one (see FiReControl – a catalogue of poor judgement and mismanagement).

Whilst the incoming government will inevitably focus on the costs of the project, and the potential savings from cancelling it, I hope that it doesn’t throw the baby out with the bathwater. The current project may be flawed, but the underlying ideas and vision aren’t. If the current project is cancelled, let’s hope that the vision remains, and that it is taken forward more effectively and efficiently in a well-specified project with strong user engagement.