Palantir

Openly authoritarian US surveillance company Palantir is reportedly in talks with the Metropolitan Police to supply AI tools for use in criminal investigations. The talks have sparked concerns within the force itself over allowing the tech company access to sensitive data.

The news comes during the ongoing scandal surrounding the Labour government’s willingness to deal with the shady tech firm. Palantir has also supplied AI tools to the Israeli military and Trump’s murderous anti-immigrant militia, ICE.

The Palantir outrage is also enmeshed with the ongoing scandal of ex-US ambassador Peter Mandelson’s ties to Jeffrey Epstein. Global Counsel, a lobbying firm which works for Palantir, just happened to be co-owned by Mandelson.

Palantir currently holds £600m in contracts with UK public bodies. These extend from the Ministry of Defence to law enforcement, and even to the NHS. As such, it’s completely unsurprising that two public petitions against Palantir in the UK recently attracted almost 230,000 signatures.

AI, at all costs

Back in January, Home Secretary Shabana Mahmood ordered police to initiate the use of AI “at pace and scale”. The initiative includes the creation of a police data centre which will cost £115m of public money. The national police chief in charge of AI use has already admitted that the centre will produce discriminatory results.

Some UK forces have already used elements of Palantir’s AI. For example, Leicestershire Police signed its contract with Palantir back in October 2024, for an eye-watering £818,750. It was the first UK police force to sign a deal with the US-based firm. The Good Law Project also reported that it:

has been sending freedom of information requests to all 45 police forces in the UK to ask if they also had deals with this dangerous firm. Thirty-five of them refused to answer, citing national security and law enforcement exemptions.

As part of the push for AI adoption, Palantir demonstrated some of its tech to senior Met intelligence division officers in March. The Guardian reported that the Palantir systems could be used to automate intelligence analysis for criminal investigations.

Sensitivity, overspending and bias

Whilst other forces have already struck contracts with Palantir, the Met would be by far the largest and most prominent in the UK to do so. However, this potentially multimillion-pound deal has met opposition within the force itself.

In particular, the issues relate to allowing the deeply unethical company access to highly sensitive data. This could include otherwise-confidential intelligence on criminal activities, and even victims’ personal information. Still other concerns relate to the potential waste of public money, with one anonymous source stating that:

We don’t need £100m AI. We would like the more basic systems we already have to work properly.

Beyond this, there are also massive issues with the use of AI for policing in general. Whilst AI decision-making is sometimes perceived as unbiased and emotionless, this couldn’t be further from the truth. Rather, it simply hides the — very human — biases in its training dataset behind a veneer of cold ‘fairness’.

In her report on AI biases in policing, the UN’s Ashwini K.P. — Special Rapporteur on contemporary forms of racism — specifically called out predictive policing. Back in 2024, Ashwini explained that:

When officers in overpoliced neighbourhoods record new offences, a feedback loop is created, whereby the algorithm generates increasingly biased predictions targeting these neighbourhoods. In short, bias from the past leads to bias in the future.

Palantir’s ‘opaque and untested tools’

Oddly enough, police have already proven themselves perfectly capable of recognising the problems with AI. However, that only seems to apply when the AI is used against them.

You see, Palantir has already provided experimental AI to Scotland Yard, with the aim of catching ‘rogue’ officers. A Met spokesperson explained that:

There is evidence to suggest a correlation between significant levels of sickness, increased absences or unusually high overtime, and failings in standards, culture and behaviour.

However, the Police Federation (the cops’ quasi-union organisation) called the AI “automated suspicion”. It stated that:

Officers must not be subjected to opaque or untested tools that risk misinterpreting unsustainable workload pressures, sickness or overtime as indicators of wrongdoing.

The Federation spokesperson also added:

Any system that profiles officers using algorithmic patterns must be treated with extreme caution. Policing already operates under some of the broadest and deepest scrutiny of any profession … If forces are serious about raising standards and public confidence, the focus must remain on proper supervision, fair processes and human judgment, not the automation of suspicion.

Funny how they can notice the problems with AI when it’s cops in the hot seat, isn’t it? And, of course, that “broadest and deepest scrutiny” bit is pure bullshit, too.

Petitions against Palantir

Given the many grave concerns surrounding the US tech firm — not least of which relate to its recent publication of an authoritarian political manifesto — public anger against Palantir’s involvement in UK institutions has only continued to grow.

In particular, campaigning organisation 38 Degrees has recently promoted two anti-Palantir petitions. One of them called on Labour to end all public contracts with the firm. The other urged health secretary Wes Streeting to scrap the £330m Palantir-NHS patient data contract. It stated that:

right now we’ve got a major chance to get their NHS deal scrapped – and kick start a domino effect that sees ALL their contracts ended.

How? Ministers actually have the option to cancel the NHS deal early. But with these contracts flying under the radar, it’ll take all of us speaking up to make that happen.

Matthew McGregor, 38 Degrees’ chief executive, said:

Almost a quarter of a million people have said loud and clear: they don’t want a company like Palantir, whose technology is used by ICE and the Israeli army, to have access to their most sensitive data.

Palantir is a corporate embodiment of the dire risks of handing over our data and public services to amoral tech barons. The public have spoken, ministers have spoken, and the anger at this scandal only continues to rise. Both the Met and the charlatans in Labour must listen to the people — we don’t want Palantir anywhere near us.

Featured image via 38 Degrees

By Alex/Rose Cocker


From Canary via This RSS Feed.