Palantir

The sea remains blue(ish), the sky remains grey, and Palantir is embroiled in yet another nascent big-data privacy scandal. We’re beginning to wonder if the tech firm that chose to name itself after the dark wizard Saruman’s seeing-stone might not be entirely on the level, you know.

The Guardian has uncovered the Financial Conduct Authority’s (FCA) plans to hand Palantir access to financial regulation figures. The US defence contractor recently won the FCA contract, beating one unnamed competitor and gaining still-greater access to highly sensitive UK data.

Palantir: serious security concerns

The FCA ostensibly regulates the UK’s financial services firms. However, it will now pay over £30,000 a week for Palantir to root through its data on UK firms. The Denver-based tech giant will be looking for evidence of irregularities like money laundering, insider trading, and fraud.

If the FCA is pleased with Palantir’s work, the company may be tasked with producing an AI system for the financial watchdog.

The Guardian explained that:

Palantir is expected to apply its AI system, known as Foundry, to huge quantities of information held by the watchdog, including case intelligence files marked highly sensitive; information on so-called problem firms; reports from lenders about proven and suspected frauds; and data about the public, including consumer complaints to the financial ombudsman.

The data includes recordings of phone calls, emails and trawls of social media posts, the Guardian understands. The FCA is one of several UK agencies which aim to stop financial crimes that underpin harms such as the drug trade and human trafficking.

The article also reported concerns from within the FCA itself. One anonymous staff member at the watchdog asked:

Once Palantir understands how we detect money-laundering threats, how do we know that they are ethically reliable enough not to go on to share that information?

‘Vast quantities of data’

Financial-crime specialist Christopher Houssemayne du Boulay, a partner at law firm Hickman & Rose, also raised serious security worries. He explained that:

When the FCA carries out an enforcement investigation, it has powers to compel firms to hand over vast quantities of data. We could be talking about hundreds of whole email accounts and full financial records. Many innocent people will be caught up in that and the data may contain bank account details, email addresses, telephone numbers and other personal information.

If you ingest that data and use it to train an AI system, there are very significant privacy concerns. There should be serious confidentiality requirements regarding what Palantir does with the data.

However, the FCA has insisted that it hasn’t given Palantir permission to copy the data it examines. Which is all fine then, because surely the data company wouldn’t do anything it isn’t allowed to do.

The watchdog insists that Palantir would be a “data processor” rather than a “data controller”. This means that the FCA would retain sole access to the encryption keys for sensitive files. Likewise, the data would only ever be stored in the UK.

However, the FCA reportedly stopped short of scrambling company and individual names, or using dummy data as a test. That’s in spite of the fact that its own guidance suggests using synthetic data in pilot runs.

Palantir: Somehow worse than it sounds

Palantir was co-founded by billionaire Peter Thiel, a major donor to Donald Trump and the 2022 Republican election campaigns – to the tune of over $32m. Unsurprisingly, Thiel was also a friend of Jeffrey Epstein. The tech magnate once famously claimed that:

I no longer believe that freedom and democracy are compatible.

Given his openly authoritarian views, it’s both utterly outrageous and depressingly predictable that the UK government simply won’t stop handing his company public deals. To date, Palantir has raked in over £670m of UK public funds. These include contracts with the military, police, and even the NHS.

Amnesty International’s AI and human rights researcher Matt Mahmoudi said the firm:

has a track record of flagrantly disregarding international law and standards, both in violations of the human rights of migrants in the United States, which it risks contributing to, and its ongoing supply of artificial intelligence products and services to the Israeli military and intelligence services.

Anybody with even a passing interest in data security, privacy, and basic human freedoms might recognise that Palantir shouldn’t be allowed within 50 metres of a fucking Excel spreadsheet, never mind the FCA’s database.

However, it’s glaringly clear that the heads of our public institutions no longer share these concerns.

Featured image via the Canary

By Alex/Rose Cocker

