
Algorithms & Benefits

Big Brother Watch (@BigBrotherWatch), a group dedicated to defending civil liberties and protecting privacy, has just released a report exposing the algorithms local councils have been using to screen benefit applicants. It is a deeply disturbing exposé of how the population of this country is now being ‘policed’ by computer algorithms. And don’t think that it is just the poorest elements of society who are being treated this way. You are probably, even at this moment, the subject of some computer program making decisions that affect your health, wealth and wellbeing. The more this is tolerated, the more prevalent it will become, to the degree that it becomes irreversible.

 

The following is an extract from this excellently researched report. The full report is available at the bottom of this introduction. Please take the time to read it; awareness means you can decide whether to fight it. Never mind Stockholm Syndrome: the entire British Isles seem to be suffering from a collective Tory Syndrome.

 

“Our report on Poverty and Digital Suspicion reflects many months of investigative research into the secretive emergence of a digital welfare state that risks perpetuating deeper data collection, processing and algorithmic bias, further disadvantaging some of our country’s most vulnerable people. This digital reshaping of the country’s century-old welfare system has happened behind closed doors, with minimal scrutiny and little public or parliamentary awareness. This report, which focuses on the digitisation of suspicion within local authorities, aims to shine a light on this drastic change and calls for transparency and reform.

In the course of our investigation, we have sent Freedom of Information requests (FOIs) to more than 400 local authorities about their use of algorithms, data analytics and automation in welfare systems; we have used Contracts Finder, Spend Network’s Insights and a data scraper to identify contracts from councils who claimed not to use algorithms in their decision making; and we have questioned both councils and contracted companies. We have been able to advance research into algorithms and data analytics in the welfare system both in breadth, across all local authorities, and in depth, taking deep dives into specific local authorities’ systems. In summary, we have found:

 

  • Approximately 1 in 3 local authorities risk-score people who receive housing benefit and council tax support when they apply, using opaque, privately-developed algorithms, covering more than 540,000 people.
  • Approximately 1 in 3 local authorities and more than 1 in 3 housing associations run predictive analytics to assess whether social housing occupants will keep up with rent payments, adding up to 1.6 million tenancies.

Some large local authorities use bigger predictive systems that can model who is at risk of homelessness (Newcastle, Maidstone, Cornwall, Croydon, Haringey); others use similar systems to model children at risk of harm (Hillingdon, Bristol); whilst others model general financial vulnerability (Barking and Dagenham), with at least 250,000 people’s data being processed by these huge predictive tools.
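The report does not reveal how any individual council’s scoring model actually works; the systems are privately developed and opaque. Purely as a hypothetical illustration of the general pattern described above, where weighted features are combined into a risk score that decides who faces extra checks, a minimal sketch in Python might look like the following. Every field name, weight and threshold here is invented for illustration and is not taken from any real system.

# Purely hypothetical sketch: the proprietary scoring models described above
# are not public, so every field, weight and threshold below is invented.

from dataclasses import dataclass


@dataclass
class Applicant:
    months_at_address: int        # length of current tenancy
    previous_claims: int          # number of prior benefit claims
    reported_income_changes: int  # changes of circumstance in the last year


def risk_score(applicant: Applicant) -> float:
    """Combine weighted features into a single 0-1 'fraud/error risk' score."""
    score = 0.0
    score += 0.4 if applicant.months_at_address < 12 else 0.0
    score += 0.1 * min(applicant.previous_claims, 3)
    score += 0.1 * min(applicant.reported_income_changes, 3)
    return min(score, 1.0)


def triage(applicant: Applicant) -> str:
    """Route the claim: high scores trigger more intrusive checks."""
    return "enhanced checks" if risk_score(applicant) >= 0.5 else "standard processing"


print(triage(Applicant(months_at_address=6, previous_claims=2, reported_income_changes=1)))

The point of the sketch is not the arithmetic but the shape of the decision: a handful of weighted proxies, chosen by the system’s designer, silently determine which applicants are treated with suspicion.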

 

The Department for Work and Pensions conducts risk modelling of housing benefit recipients on a regular basis to predict who poses the highest fraud/error risk due to change of circumstance, and passes this data to local authorities.

 

Surveillance of the poor and vulnerable is becoming more deeply integrated in Britain’s welfare state. Algorithms decide who to subject to the most intrusive questions over their claim for welfare payments. Vast numbers of people who live in social housing are profiled every month to predict who will miss their rent payments. Complex predictive tools model the risk of homelessness, financial vulnerability and even the chance of harm from the pandemic. All of this happens without the knowledge or consent of the people whose data is secretively entered into these hidden systems.

 

Years of austerity and increasing pressure on local services have led to private companies advertising their supposedly high-tech solutions as the answer to diminishing council budgets, offering a way to do more with less.

 

Software firms are pitching technology as the answer to catching out fraudsters, making staff time more efficient and allowing for cost-saving early interventions.

 

However, automation and algorithms are not all they claim to be when it comes to public services. Whilst the claimed “benefits” are rarely evidenced, these opaque, invasive and often unfair systems rely on masses of personal data, generalised rules and stereotypes. Many of the predictive systems used by councils are, in practice, governed by private sector designers rather than publicly accountable officials. Furthermore, proxification of characteristics and algorithmic bias make discrimination in automated public services a real and threatening prospect.
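To see why “proxification” matters, consider a minimal, entirely synthetic illustration: a scoring rule that never sees a protected characteristic can still reproduce a group disparity if it relies on a correlated proxy, such as a postcode band. Nothing below comes from any council’s actual system; the population, the groups and the rule are invented.

# Hypothetical illustration of proxy discrimination: the rule never sees the
# protected attribute, yet a correlated proxy reproduces the disparity.
# All data below is synthetic.

import random

random.seed(0)

# Synthetic population: group membership correlates with which (invented)
# postcode band a person lives in, as residential segregation often implies.
population = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    in_deprived_band = random.random() < (0.7 if group == "B" else 0.3)
    population.append({"group": group, "deprived_band": in_deprived_band})

# A flagging rule that uses only the proxy, never the protected attribute.
def flagged(person) -> bool:
    return person["deprived_band"]

for g in ("A", "B"):
    members = [p for p in population if p["group"] == g]
    rate = sum(flagged(p) for p in members) / len(members)
    print(f"group {g}: flagged rate {rate:.0%}")

# Prints roughly 30% for group A and 70% for group B, even though "group"
# was never an input to the rule.

Removing a protected characteristic from a model’s inputs therefore does not, by itself, remove the discrimination; the bias simply travels through whichever correlated variables remain.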

 

Uncovering the real impact of AI and algorithmic decision-making in welfare is challenging, owing to low transparency in the welfare system, proprietary systems and the influence of private tech firms. This means that risks to people’s data rights go unchallenged. We are still unaware of a single case where an individual has been informed that they have been subjected to a purely automated decision, as per their legal rights under Article 22 of the General Data Protection Regulation (GDPR), indicating that vague legal definitions and a lack of oversight are facilitating a dangerous grey area between automated and human decisions.

 

Freedom of Information (FOI) requests have formed the basis of this report. Ever more refined queries and a ruthless approach to appeals yielded important disclosures and gave a detailed, if incomplete, insight into the surveillance of society’s most vulnerable.

 

Nevertheless, the influence of private suppliers in the FOI process is evident. There has been a pattern of identical responses from different authorities on the same issues that suggests coordination, possibly from a third-party supplier. At the same time, one data company even attempted to attend our meeting with FOI officers about a request.

With high-tech systems often needing support from software suppliers, there is a worrying fusion of public and private sector interests that has harmed transparency.

 

In addition, extensive corporate and desk research, together with conversations with colleagues in the sector and across the wider welfare state, has further informed this work and has been vital in shaping some critical avenues of enquiry.

 

The overall picture is alarming. Algorithms and predictive models that treat the poor with suspicion and prejudice risk automating stigmas. Local authorities using these tools rarely seem to undertake thorough assessments of the impact on individuals’ data rights, including the risk of embedding discrimination in the wider welfare state.

 

Local authorities’ ignorance of the effects of digital surveillance and automation is mute complicity in the harm they can cause, and risks a failure to comply with the Data Protection Act 2018, the Public Sector Equality Duty, and even the Human Rights Act. Radical transparency measures and greater clarity on how data protection laws apply to public sector algorithms are needed to stop the United Kingdom from becoming a country where the poor’s existence is conditional on constant surveillance by intrusive and secretive digital tools.

 

This is the United Kingdom’s digital welfare state. The harm it can cause has risked being unchecked and unchallenged for too long.”

The full report is shown below.
