Councils’ “hidden algorithms” profile millions on benefits, Big Brother Watch investigation finds

Big Brother Watch’s new report “Poverty Panopticon: the hidden algorithms shaping Britain’s welfare state”, published today, reveals:

  • 540,000 benefits applicants are secretly assigned fraud risk scores by councils’ algorithms before they can access housing benefit or council tax support.[1]

  • Personal data from 1.6 million people living in social housing is processed by commercial algorithms to predict rent non-payers.[2]

  • 250,000+ people’s data is processed by a range of secretive automated tools to predict the likelihood they’ll be abused, become homeless or out of work.[3]

  • Campaigners file complaint with data watchdog

Big Brother Watch claims that most of the algorithms it uncovered are “secretive, unevidenced, incredibly invasive and likely discriminatory”.

The privacy campaign group’s long-term investigation has found councils across the UK conducting “mass profiling” and “citizen scoring” of welfare and social care recipients to predict fraud, rent non-payments and major life events. The campaigners complain that councils are using “tools of automated suspicion” without residents’ knowledge and that the risk scoring algorithms could be “disadvantaging and discriminating against Britain’s poor”.

“Invasive”

An algorithm by tech company Xantura, used by two London councils, claimed to predict residents’ risks of negative impacts arising from the coronavirus pandemic and even whether they were likely to break self-isolation rules. The ‘Covid OneView’ system is built on thousands of pieces of data held by councils, including seemingly irrelevant personal information such as details of people’s sex lives, anger management issues or whether they possess a dangerous dog.

“Unevidenced”

Algorithms that assign fraud risk scores to benefits claimants, used by over 50 councils, are set targets by the Department for Work and Pensions to flag 25% of claims as medium risk and 20% as high risk. However, in documents obtained by Big Brother Watch, some councils found the “risk based verification” algorithm was “yet to realise most of its predicted financial efficiencies”, and approximately 30 authorities have dropped the tool in the past four years.
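
The DWP targets above describe proportions, not a method. As a purely hypothetical illustration (this is not the actual risk based verification model, whose workings councils have not published), the short Python sketch below shows how a batch of made-up claim risk scores could be cut into ‘high’, ‘medium’ and ‘low’ bands so that roughly 20% land in the high band and 25% in the medium band. Every name, score and threshold in it is an assumption made for illustration only.

```python
# Purely illustrative sketch - NOT the actual "risk based verification" model,
# whose internals are not public. It only shows how percentage targets of the
# kind described above (roughly 20% high risk, 25% medium risk) could translate
# into score thresholds over a batch of hypothetical claim risk scores.

from typing import List, Tuple


def band_claims(scores: List[float],
                high_share: float = 0.20,
                medium_share: float = 0.25) -> List[Tuple[float, str]]:
    """Label each score 'high', 'medium' or 'low' so that roughly the top
    high_share of scores fall in the high band and the next medium_share
    in the medium band. The scores here are hypothetical; a real system
    would derive them from personal data, which is the campaigners'
    central concern."""
    if not scores:
        return []
    ranked = sorted(scores, reverse=True)
    n = len(ranked)
    # Cut-off values taken from the ranked list at the target proportions.
    high_cutoff = ranked[max(int(n * high_share) - 1, 0)]
    medium_cutoff = ranked[max(int(n * (high_share + medium_share)) - 1, 0)]

    bands = []
    for score in scores:
        if score >= high_cutoff:
            bands.append((score, "high"))
        elif score >= medium_cutoff:
            bands.append((score, "medium"))
        else:
            bands.append((score, "low"))
    return bands


if __name__ == "__main__":
    # Ten made-up claim scores; a real council batch would be far larger.
    example_scores = [0.12, 0.55, 0.31, 0.87, 0.44, 0.05, 0.69, 0.23, 0.91, 0.38]
    for score, band in band_claims(example_scores):
        print(f"score={score:.2f} -> {band} risk")
```

In practice, the report’s concern is less the banding arithmetic than the personal data feeding the scores and the lack of transparency around how they affect claimants.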

One woman in receipt of housing benefit, who sent a formal request for her data to Brent council, said she was “stunned” to find she had been flagged as “medium risk” of fraud. The woman, who wished to remain anonymous, said:

“I wasn’t aware that my council was risk-scoring me and it is disgusting they use so much of my personal data in the model, something I had no idea about.

“I’ve noticed the amount of evidence I’ve been asked for has changed over the years, which makes it really stressful. I’ve been made to go through all my bank statements line by line with an assessor, which made me feel like a criminal. Now I wonder if it’s because a machine decided, for reasons unknown, I could be a fraudster.

“It feels very unjust for people like me in genuine need to know I’m being scrutinised and not believed over evidence I provide.”

“Discriminatory”

Big Brother Watch’s report details how the London Borough of Hillingdon’s ‘Project AXIS’, aimed at assessing children’s risk of future criminality, gathers data from police, schools, social care, missing persons, care homes, and even social media, without residents’ knowledge. The council claims “no piece of information is too small” for the database. Campaigners warn of similarities to the Metropolitan Police’s controversial Gangs Matrix database, which the Information Commissioner found was operating unlawfully by holding data on people who were unconnected to gang activity and disproportionately profiling young black men.

Big Brother Watch’s long-term investigation involved over 2,000 Freedom of Information requests, covering over 400 local authorities in the UK.

“Secretive systems”

The campaign group says this information should be publicly available, and “secretive systems of digital suspicion should not be used behind closed doors”. With private companies contracted to supply many public sector algorithms, there is still little detail known about how most of these so-called ‘black box’ algorithms work. Commercial confidentiality can also mean that individuals rarely know how automated systems could be influencing decisions about their lives.

Calls for transparency

The group is calling for a public register of algorithms that inform decision-making in the public sector, and for authorities to conduct privacy and equality assessments before using predictive tools to mitigate the risks of discrimination. Such assessments, the group found, were rarely conducted.

Big Brother Watch is encouraging people in receipt of welfare or social care to send Data Subject Access Requests to their council to request their risk scores and has published draft request forms.

The campaigners have also lodged a complaint with the Information Commissioner, calling for an “urgent inquiry to uncover and regulate the Wild West of algorithms impacting our country’s poorest people.”

QUOTES

Jake Hurfurt, Head of Research and Investigations at Big Brother Watch, said:

“Our welfare data investigation has uncovered councils using hidden algorithms that are secretive, unevidenced, incredibly invasive and likely discriminatory.

“The scale of the profiling, mass data gathering and digital surveillance that millions of people are unwittingly subjected to is truly shocking. We are deeply concerned that these risk scoring algorithms could be disadvantaging and discriminating against Britain’s poor.

“Unless authorities are transparent and better respect privacy and equality rights, their adoption of these technologies is on a one way ticket to a poverty panopticon.”

Lord Clement-Jones, Chair of the Select Committee on Artificial Intelligence, said:

“The evidence presented by Big Brother Watch in this report of the widespread secretive collection by public authorities of the personal data of the most vulnerable in society and its use in opaque algorithmic decision-making systems is deeply alarming.

“It increasingly adds up to a surveillance state where data protection and equality rights are ignored.

“The Government, as Big Brother Watch recommends, need to urgently strengthen regulation over these algorithmic systems, introduce a public register and assert the ethical values of transparency, privacy, freedom from bias and accountability that should govern their use, alongside a duty to openly publish impact assessments before their deployment and regular audit thereafter.”

Sara Willcocks, Head of External Affairs at the charity Turn2Us, which helps people living in poverty in the UK, said:

“This new research by Big Brother Watch has highlighted an incredibly concerning trend where those of us on low incomes are treated with suspicion, bias and discrimination.

“A decade of cuts, caps and freezes to our social security system has left cash-strapped councils relying on outsourced algorithms. We urge both the DWP and local authorities to review the findings of this report and ask themselves whether this is an ethical or even practical way to go about their work and to develop a fairer and more compassionate approach to this moving forward.”

NOTES

Spokespeople are available – enquiries can be directed to [email protected] or 07730439257

Read the full report here

[1] Figures based on the number of housing benefit recipients living in areas operating risk-based verification. As limited data on council tax support is available, the real number is likely to be larger.

[2] Figures taken from Mobysoft RentSense marketing materials – available on request.

[3] Figure based on the total number of people known to be profiled by Bristol Council in addition to the number of housing benefit claimants in the six areas using Policy in Practice’s LIFT. The true figure will be higher as it is not known how many people are on the Xantura OneView systems or the Project AXIS database.

Source: https://bigbrotherwatch.org.uk/2021/07/councils-hidden-algorithms-profile-millions-on-benefits-big-brother-watch-investigation-finds/

