More than 200,000 people have wrongly faced investigation for housing benefit fraud and error after the performance of a government algorithm fell far short of expectations, the Guardian can reveal.
Two-thirds of claims flagged as potentially high risk by a Department for Work and Pensions (DWP) automated system over the last three years were in fact legitimate, official figures released under freedom of information laws show.
It means thousands of UK households every month have had their housing benefit claims needlessly investigated because the algorithm wrongly identified them as high risk.
It also means about £4.4m has been spent on officials carrying out checks that did not save any money.
The figures were first obtained by Big Brother Watch, a civil liberties and privacy campaign group, which said: “DWP’s overreliance on new technologies puts the rights of people who are often already disadvantaged, marginalised and vulnerable in the backseat.”
The DWP said it was unable to comment in the pre-election period. Labour, which could be in charge of the system in less than two weeks' time, has been approached for comment.
An Information Commissioner's Office inquiry into algorithms and similar systems used by a sample of 11 local authorities last year reported: "We have not found any evidence to suggest that claimants are subjected to any harms or financial detriment as a result of the use of algorithms or similar technologies in the welfare and social care sector."
But Turn2us, a charity that supports people who rely on benefits, said the figures showed it was time for the government to “work closely with actual users so that automation works for people rather than against them”.
To determine the risk that a claim could be wrong or fraudulent, the technology weighs claimants’ personal characteristics including age, gender, number of children and the kind of tenancy agreement they have.
Once the automated system flags a housing benefit claim as potentially fraudulent or erroneous, council staff are tasked with checking whether the claim details are correct, which involves seeking evidence from claimants over the phone or digitally. They must identify any changes of circumstances and, where necessary, recalculate claimants' housing benefit awards.
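To illustrate the kind of rules-based scoring described above: the DWP has not published its model, so while the input fields below are those named in this article, every weight, band and threshold is invented purely for illustration.

# A minimal, purely illustrative sketch of a rules-based risk score.
# The claimant characteristics (age, gender, number of children, tenancy
# type) come from the article; all weights and the threshold are hypothetical.

def risk_score(claim: dict) -> float:
    """Combine weighted claimant characteristics into a single score."""
    score = 0.0
    # Hypothetical age band weighting.
    if claim["age"] < 25:
        score += 0.3
    # Hypothetical tenancy-type weighting.
    if claim["tenancy_type"] == "private_rented":
        score += 0.4
    # Hypothetical weighting per child on the claim.
    score += 0.1 * claim["num_children"]
    # Gender is also reportedly weighed; omitted here as no detail is public.
    return score

def flag_for_review(claim: dict, threshold: float = 0.6) -> bool:
    """Claims scoring above the threshold are referred for a full case review."""
    return risk_score(claim) >= threshold

A system of this shape simply totals fixed weights and compares the result with a cut-off, which is consistent with the DWP's statement that the tool involves no machine learning.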
The DWP decided to deploy the automated tool, which does not use artificial intelligence or machine learning, after a pilot showed that 64% of cases flagged as high risk by the DWP model were indeed receiving the wrong benefit entitlement.
But the outcomes of the case reviews that claimants went on to face revealed far less fraud and error. Only 37% of flagged cases were wrong in 2020-21, 34% in 2021-22 and 37% in 2022-23, roughly half the hit rate the pilot had predicted.
Nevertheless, the system did save the taxpayer money, with every pound spent undertaking full case reviews of suspect claims returning £2.71 in savings, according to figures for 2021-22 released by the DWP.
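The arithmetic behind those claims can be checked directly from the published rates. The short script below simply restates the figures reported above; no other data is assumed.

# Restating the published figures: how "two-thirds legitimate" and
# "roughly half the predicted hit rate" follow from the data.

pilot_hit_rate = 0.64                    # share of flagged claims wrong in the pilot
actual_hit_rates = {"2020-21": 0.37,
                    "2021-22": 0.34,
                    "2022-23": 0.37}     # share of flagged claims actually wrong

avg_hit_rate = sum(actual_hit_rates.values()) / len(actual_hit_rates)
print(f"average hit rate: {avg_hit_rate:.0%}")                          # ~36%
print(f"flagged claims that were legitimate: {1 - avg_hit_rate:.0%}")   # ~64%, about two-thirds
print(f"performance relative to pilot: {avg_hit_rate / pilot_hit_rate:.0%}")  # ~56%

# DWP's own 2021-22 figure: each £1 spent on full case reviews
# of flagged claims returned £2.71 in savings.
savings_per_pound = 2.71
print(f"net return per £1 of review spend: £{savings_per_pound - 1:.2f}")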
Last year the DWP widened its deployment of artificial intelligence to uncover fraud and error in the universal credit system, where fraud and error cost £6.5bn in the last financial year, despite warnings of algorithmic bias against groups of vulnerable claimants. It has been criticised for a lack of transparency about how it is using machine learning tools. In January it emerged that the DWP had stopped routinely suspending benefit claims flagged by its AI-powered fraud detector, a move made in response to feedback from claimants and elected representatives.
Susannah Copson, a legal and policy officer at Big Brother Watch, said: “This is yet another example of DWP focusing on the prospect of algorithm-led fraud detection that seriously underperforms in practice. In reality, DWP’s overreliance on new technologies puts the rights of people who are often already disadvantaged, marginalised and vulnerable in the backseat.”
She warned of “a real danger that DWP repeats this pattern of bold claims and poor performance with future data-grabbing tools”.
“It was only recently that the government tried – and failed – to push through intrusive measures to force banks to conduct mass algorithmic monitoring of all customer accounts under the premise of tackling social security fraud and error. Although the powers failed to make it through legislative wash-up, concerns about DWP’s relentless pursuit of privacy-invading tech remain.”
One version of the system is for sale to local authorities on the government’s digital marketplace website from a company called D4S DigiStaff.
It tells councils: “Our innovative HBAA intelligent automation solution will allow you to process all of your reviews with minimal impact on your staff.” It cites benefits including freeing staff to undertake higher-value tasks and making savings on councils’ DWP funding.