Palantir will be granted access to troves of highly sensitive UK financial regulation data, the Guardian can reveal, sparking new concerns about the US AI company’s deep reach into the British state.
The Financial Conduct Authority (FCA) has awarded Palantir a contract to examine the watchdog’s internal intelligence data in an effort to help tackle financial crime, including investigations of fraud, money laundering and insider trading.
The Denver-based company co-founded by billionaire Donald Trump donor Peter Thiel has been appointed for a three-month trial that will pay it more than £30,000 a week to analyse the FCA’s vast “data lake”, and which could lead to a full purchase of the AI system.
The deal is part of the FCA’s campaign to use digital intelligence to better focus resources on rule-breaking among 42,000 financial services firms, from major banks to crypto exchanges.
There was only one other, unnamed competitor for the contract. Palantir already has more than £500m in UK public deals, including for the NHS, military and police.
The contract warns of “very significant privacy concerns”. Palantir is expected to apply its AI system, known as Foundry, to the vast amounts of information held by the watchdog, including case intelligence files marked as highly sensitive; information on so-called problem firms; reports from lenders about proven and suspected fraud; and data about the public, including consumer complaints to the Financial Ombudsman.
The Guardian understands the data includes recordings of phone calls, emails and social media posts. The FCA is one of several UK agencies tasked with preventing financial crimes linked to harms such as drug trafficking and human trafficking.
This deal has raised concerns within the FCA. One source said: “Once Palantir understands how we detect money-laundering threats, how will we know they are ethically trustworthy enough not to share that information?”
Palantir’s technology is used by the Israeli military and in the Trump administration’s ICE immigration crackdown, leading left-wing lawmakers in the House of Commons last month to call it a “highly suspect” and “terrible” company. In 2023 it signed a £330m deal with the NHS, which sparked protests from doctors, and in December 2025 it signed a £240m contract with the Ministry of Defence, prompting MPs to highlight “serious allegations of complicity in human rights violations and undermining of democratic processes against Palantir”.
Palantir has previously defended its work, saying it has led to almost 99,000 extra operations being scheduled in the NHS, has helped UK police tackle domestic violence, and that it “takes a rigorous approach to respecting human rights”.
Professor Michael Levy, an internationally recognised expert on money laundering at Cardiff University, said that data held by financial regulators is “severely under-exploited”, making AI a potentially valuable tool for combating financial crime. But he said it was “a relevant question whether Palantir’s owners can tell their friends about the methodology”.
“What protocols have been agreed between the FCA and Palantir regarding the further use of what is learned in that process?” he said.
The FCA said the terms of the contract meant Palantir would be a “data processor” and not a “data controller” – meaning it could only act on the regulator’s instruction, adding that it would retain exclusive control over encryption keys for the most sensitive files and that the data would only be hosted and stored in the UK. Palantir must destroy the data after the contract is completed and any intellectual property obtained from data trawling must be retained by the FCA.
The FCA considered using dummy data or anonymised company and individual names, but decided that using real data was the only worthwhile test, even though guidelines encourage the use of synthetic data in pilots.
“When the FCA carries out an enforcement investigation, it has powers to force companies to hand over large amounts of data,” said Christopher Housemayne du Boulay, a partner and barrister at law firm Hickman & Rose, which specialises in defending serious and complex financial crime cases. “We are talking about hundreds of complete email accounts and complete financial records. Many innocent people will be caught up in this and the data may include bank account details, email addresses, telephone numbers and other personal information.
“If you take that data and use it to train an AI system, there are very significant privacy concerns. There should be serious privacy requirements about what Palantir does with the data.”
The FCA said Palantir could not copy data to train its products. Palantir referred a request for comment to the FCA.
An FCA spokesperson said: “The effective use of technology is vital in the fight against financial crime and it helps us identify risks to the consumers we serve and the markets we oversee. We ran a competitive procurement process and have strict controls in place to ensure the security of data.”