Perplexity Enterprise Pro
Fails UK GDPR
Perplexity Enterprise Pro routes your prompts through four US companies: OpenAI, Anthropic, Google, and xAI. One of them has been fined €15 million for GDPR violations. None offer UK data residency. All are subject to the US CLOUD Act.
For UK solicitors and accountants handling client data, this creates an unacceptable, compounding GDPR risk.
Perplexity Is Not One AI – It's a Data Relay to Four US Companies
Perplexity Enterprise Pro is a multi-model routing platform, not a single AI. When you type a client query, it is transmitted to whichever underlying LLM provider you selected – OpenAI, Anthropic, Google, or xAI – each operating its own US-based infrastructure, each with a different GDPR compliance profile. You don't get one data agreement. You get four data risks.
- ✅ SOC 2 Type II certification (security processes)
- ✅ Zero data retention by Perplexity itself
- ✅ No AI training on Enterprise data (Perplexity)
- ✅ DPF certified – Perplexity only
- ✅ Data Processing Agreement available
- ⚠️ Data flows to OpenAI, Anthropic, Google, xAI
- ⚠️ US/EU servers only – no UK option
- ❌ UK data residency – not available on any plan
- ❌ OpenAI is NOT DPF certified (€15M GDPR fine)
- ❌ No protection from the US CLOUD Act across all 4 providers
- ❌ Claude defaults to US storage for Article 9 data
- ❌ Google Gemini has human review (3-year retention)
- ❌ 4 separate Transfer Risk Assessments required
- ❌ SRA "no raw client data" rule – violated by design
5 Reasons Perplexity Enterprise Pro Fails UK GDPR Compliance
Each point is backed by official sources, specialist data protection law firm analysis, and regulatory guidance. Click any item to jump to the full evidence section with clickable source links.
Perplexity Enterprise Pro vs. AI Guard
Every claim in this table is verifiable against the sources linked throughout this page.
| Compliance Requirement | Perplexity Enterprise Pro | AI Guard |
|---|---|---|
| UK Data Residency | ❌ Not Available. US/EU only; no UK option on any plan | ✅ UK Servers. All data stays in the UK, contractually guaranteed |
| Client PII Reaches the LLM | ❌ Yes, Raw Prompts. Full prompt sent to OpenAI / Anthropic / Google | ✅ Never. PII masked before any model processes the query |
| Number of US Companies Handling Data | ❌ Up to Four. Perplexity + OpenAI + Anthropic + Google | ✅ Zero. No US company involvement; UK provider only |
| Data Privacy Framework (DPF) Status | ❌ Mixed. Perplexity: YES · OpenAI: NO · Anthropic: YES · Google: YES | ✅ Not Needed. No international transfer occurs |
| US CLOUD Act Exposure | ❌ Fully Exposed. All four providers are US companies subject to the CLOUD Act | ✅ Zero Exposure. UK provider, UK jurisdiction, UK servers |
| Transfer Risk Assessments Required | ❌ Four Required. One per sub-processor; all likely to fail | ✅ None Required. No restricted transfer means no TRA needed |
| GDPR Article 9 Special Category Data | ❌ At Risk. Claude defaults to US storage; misaligned with Art. 9 | ✅ Protected. Special category data masked before any transfer |
| Human Review of Conversations | ❌ Possible (Gemini). Google reviewers may read data; retained up to 3 years | ✅ No Human Review. No third-party humans access your queries |
| SRA "No Raw Client Data" Rule | ❌ Violated by Design. Unmasked prompts go to US providers automatically | ✅ Compliant by Design. PII masked automatically, every query, every time |
| Audit Trail (Which Client → Which Model) | ❌ Unclear. Perplexity logs may not record which LLM processed each prompt | ✅ Complete. Full masking/unmasking log per client matter |
| Zero Data Retention | ❌ Perplexity Only. Sub-processors may retain for security/abuse purposes | ✅ Guaranteed by Design. No data stored beyond session; UK jurisdiction |
Your Prompts Route Through Four Different US Companies
Perplexity Enterprise Pro is marketed as a single platform, but technically it is a multi-model routing layer that transmits your queries to different LLM providers depending on which model you select. This is confirmed directly in Perplexity's own Help Center and verified by independent UK business analysis.
According to TopTenAIAgents UK's February 2026 analysis, when an Enterprise user queries GPT-4 via Perplexity, "data flows through an encrypted pipe, is processed for the answer, and is then discarded." The critical word here is flows: it leaves Perplexity's servers and enters OpenAI's US infrastructure. The same applies to every other external model: Claude routes to Anthropic, Gemini routes to Google, Grok routes to xAI.
"No, Perplexity's agreements with third-party model providers like OpenAI and Anthropic prohibit using Perplexity data for training their models." 🔗 Verify this source – perplexity.ai Help Center
Notice what this statement does and does not say. It says providers won't train on your data. It does not say your data won't be transmitted to their servers. It does not say they won't retain it temporarily. It does not say the US CLOUD Act doesn't apply. All of those risks remain fully intact.
"Perplexity Enterprise Pro provides access to GPT-4.1 and GPT-5 (OpenAI), Claude 4 Sonnet and Claude 4.5 Opus (Anthropic), Gemini 3 Pro (Google), Grok 4.1 (xAI), and proprietary Sonar / Sonar Pro models. When an external model is selected, the query is transmitted via API to that provider's infrastructure." 🔗 Verify this source – datastudios.org
What this means for UK solicitors: On any given working day, your firm may unknowingly transmit client data to four different US companies. A trainee using GPT-4 on Monday. A partner using Claude on Tuesday. An associate using Gemini on Wednesday. Each transmission is a separate restricted transfer under UK GDPR Chapter V, requiring its own legal basis, its own Transfer Risk Assessment, and its own sub-processor DPA. Can your compliance team manage four separate international transfer risk frameworks for one AI subscription?
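The routing behaviour described above can be made concrete in a few lines. The following is an illustrative Python sketch, not Perplexity's actual code: the endpoint URLs, dictionary, and function name are all hypothetical placeholders. The point it demonstrates is architectural: one subscription, four destination companies, selected per query.

```python
import json

# Hypothetical placeholder endpoints -- NOT real provider URLs.
# Each selectable model maps to a different US company's infrastructure.
PROVIDER_ENDPOINTS = {
    "gpt-5":  "https://api.openai.example/v1/responses",   # OpenAI (US)
    "claude": "https://api.anthropic.example/v1/messages", # Anthropic (US)
    "gemini": "https://api.google.example/v1/generate",    # Google (US)
    "grok":   "https://api.xai.example/v1/chat",           # xAI (US)
}

def route_query(model: str, prompt: str) -> str:
    """Show which provider a raw prompt would be transmitted to."""
    endpoint = PROVIDER_ENDPOINTS[model]
    # In a real routing layer, the FULL, unmasked prompt is serialised
    # and POSTed to the selected provider's servers at this point:
    _payload = json.dumps({"model": model, "prompt": prompt})
    return endpoint

# The same client matter reaches a different US company per model choice:
monday = route_query("gpt-5", "Advise on Client X's tax position")
tuesday = route_query("claude", "Advise on Client X's tax position")
assert monday != tuesday  # two separate restricted transfers under UK GDPR
```

Each distinct destination in that mapping is what triggers a separate restricted transfer, and therefore a separate Transfer Risk Assessment, under UK GDPR Chapter V.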
OpenAI Is NOT Data Privacy Framework Certified – and Has Been Fined €15M for GDPR Violations
The UK-US Data Bridge (based on the EU-US Data Privacy Framework) is the most straightforward legal mechanism for UK-to-US data transfers. It became operational on 12 October 2023. When you use GPT-4 or GPT-5 via Perplexity Enterprise Pro, your client data is routed to OpenAI, a company that has not obtained DPF certification and has been fined €15 million by Italy's data protection authority for GDPR violations.
"Since OpenAI is not certified under the EU-U.S. Data Privacy Framework, the U.S. company is forced to use alternative methods to ensure an adequate level of data protection when transferring personal data to the United States. OpenAI relies on the Standard Contractual Clauses... However, it must be critically considered that the risk of data transfers to third countries and possible data protection violations still exists." 🔗 Verify this source – activemind.legal
SimpleAnalytics' independent GDPR compliance analysis of OpenAI, updated November 2025, confirms: "Partially, but with serious caveats. OpenAI provides GDPR-aligned structures... but it has been fined €15 million by Italy's regulator" for lack of transparency, insufficient age verification, and inadequate legal basis for data processing. The fine was upheld through 2025, meaning you are routing client data through a company with a documented, penalised compliance record under the very regulation you must satisfy.
"Partially, but with serious caveats. OpenAI provides GDPR-aligned structures for enterprise users (DPA, SCCs, no training on API data), but it has been fined €15 million by Italy's regulator and faces ongoing complaints. The lack of DPF certification means EU/UK transfers rely on SCCs, which require Transfer Risk Assessments that may be difficult to satisfy given US surveillance law." 🔗 Verify this source – simpleanalytics.com
The US CLOUD Act compound problem: Even if OpenAI were DPF-certified, the US CLOUD Act allows US law enforcement to compel any US company to disclose data stored anywhere in the world, regardless of DPF status. activeMind.legal explains: "The CLOUD Act determines that U.S. law enforcement authorities may request personal data from US-based technology companies when there is a suspicion of a crime by issuing warrants or court orders, regardless of the data's location." This directly conflicts with GDPR Article 48, which prohibits transfers based on foreign court orders unless they rest on an international agreement such as a mutual legal assistance treaty (MLAT).
"The CLOUD Act determines that U.S. law enforcement authorities may request personal data from US-based technology companies when there is a suspicion of a crime by issuing warrants or court orders, regardless of the data's location. Accordingly, a service provider shall disclose any information related to a customer within the provider's possession regardless of whether such communication, record, or other information is located within or outside of the US." 🔗 Verify this source – activemind.legal / US CLOUD Act Guide
No UK Data Residency – On Any Plan, For Any Model
Perplexity's official privacy policy and public documentation confirm that data is hosted in US and EU data centres only. There is no option, on any Perplexity plan including Enterprise Pro, to specify that your data must stay within the United Kingdom. Post-Brexit, UK data is subject to its own adequacy framework, and "EU hosting" is not the same as "UK hosting."
"We comply with the EU-U.S. Data Privacy Framework and the UK Extension to the EU-U.S. DPF as set forth by the U.S. Department of Commerce... We currently host data in the United States and the European Union. If you are located outside the United States, your data may be transferred to and processed in the United States and European Union." 🔗 Verify this source – perplexity.ai/hub/legal/privacy-policy
heyData's independent technical analysis confirms the deeper infrastructure problem: Perplexity's entire compute infrastructure runs on AWS. According to AWS's own published case study, Perplexity uses Amazon EC2 P4de instances (US-based GPU clusters) for inference workloads. The AWS case study records Perplexity CTO Denis Yarats stating: "Our infrastructure for model training and inference is all powered by Amazon SageMaker HyperPod." There is no AWS UK-specific deployment mentioned, and no contractual commitment to route UK users exclusively to UK or EU infrastructure.
"Many servers run on cloud service providers such as AWS, whose infrastructure is distributed worldwide. Storage or short-term processing outside the EU cannot be ruled out technically. There are no contractually guaranteed data flow obligations (standard contractual clauses, binding corporate rules, etc.). Without these, all data transfers outside the EU are considered questionable under data protection law." 🔗 Verify this source – heydata.eu
The post-Brexit distinction matters enormously: even if Perplexity successfully routes your data to an EU data centre (e.g. Frankfurt or Dublin), this remains a restricted international transfer under UK GDPR. The UK has granted the EU an adequacy decision, but that covers recipients established in the EU, not Perplexity's US-headquartered infrastructure. A transfer from a UK law firm to a US company's EU server is still a UK-to-US transfer under UK GDPR, because the entity receiving your data (Perplexity, OpenAI, Google) is domiciled in the US.
Claude Defaults to US Storage for Article 9 Data – Gemini Has Human Reviewers Reading Your Conversations
UK solicitors and accountants routinely handle GDPR Article 9 special category data: health information, criminal offence data, trade union membership, and biometric data. This category of data carries the highest level of protection under UK GDPR. Two of Perplexity's most popular models create direct Article 9 compliance risks: Claude defaults to US-based storage, and Gemini conversations may be reviewed by human employees and retained for up to three years.
"By default, Claude data is stored in the US. While the UK–US Data Bridge exists, it does not align cleanly with GDPR Article 9 special category data, particularly where legal matters involve criminal offence data, health data, or other highly sensitive information... The retention period jumped from 30 days to 5 years for training-enabled accounts... A tax attorney might use personal Claude Pro for research. Client tax strategies then train future models without anyone realising." 🔗 Verify this source – amstlegal.com
The LinkedIn analysis by UK legal technology specialist Mark Barrett, published December 2025, adds a critical compliance action requirement: firms using Claude must "complete Transfer Risk Assessments where personal data is involved" and "update AI acceptable-use policies so they reflect reality, not assumptions." This is not optional guidance โ it is a legal obligation under UK GDPR Article 46 when transferring special category data to a third country.
"Shadow IT and Compliance Nightmares: Employees had signed up independently for Claude accounts. They accepted new terms without corporate oversight. Sensitive data potentially entered training pipelines without authorization... Action required: completing Transfer Risk Assessments where personal data is involved." 🔗 Verify this source – linkedin.com / UK Law Firms Analysis
The Gemini human review problem is equally serious. Ascot London's August 2025 analysis of Google's data protection practices reveals that human reviewers, including Google employees and third-party contractors, may read, annotate, and process Gemini conversations. Critically, this data can be retained for up to three years even after a user has deleted their Gemini activity. For a solicitor who discussed a client's mental health assessment or criminal charge through Gemini via Perplexity, this means a stranger may have read it, and the record persists for years.
"Human reviewers, including Google employees and third-party service providers, may 'read, annotate, and process' user conversations with Gemini apps... retention of reviewed conversations for up to three years, even after users delete their Gemini activity." 🔗 Verify this source – ascot.london
You Are Personally Liable: The SRA February 2026 Warning and the ICO Accountability Principle
The Solicitors Regulation Authority does not speak in ambiguous terms on this point. In their February 2026 regulatory webinar, recently analysed by the Society of Asian Lawyers, the SRA stated explicitly: "Data Protection – firms must not put any identifiable client data into AI tools without informed consent. No raw client data should ever be put into public AI tools."
"Data Protection – firms must not put any identifiable client data into AI tools without informed consent. No raw client data should ever be put into public AI tools." 🔗 Verify this source – societyofasianlawyers.co.uk
Perplexity Enterprise Pro sends full, unmasked prompts to OpenAI, Anthropic, Google, and xAI by design. There is no automatic PII masking. Unless a solicitor manually redacts every client identifier, case reference, financial figure, and personal detail from every single query (which defeats the productivity purpose of the tool), this is a direct violation of SRA guidance. The SRA's own compliance tips reinforce this under Paragraph 6.3: solicitors must keep client affairs confidential, and this obligation cannot be contracted away or delegated to a US technology provider.
"keeping the affairs of the current (and former) clients confidential [paragraph 6.3]... having effective governance structures, arrangements, and systems and controls in place... maintaining client information securely and safely." 🔗 Verify this source – sra.org.uk
Under GDPR Article 5(2) (the accountability principle) and Article 24 (responsibility of the controller), you, as the data controller, are personally responsible for demonstrating compliance. Not Perplexity. Not OpenAI. Not Google. If the ICO investigates a breach originating from a Perplexity session that routed to OpenAI's US servers, you must demonstrate that the transfer was lawful, that a Transfer Risk Assessment was completed, that essentially equivalent protection was confirmed, and that client consent was obtained. Each step requires documented evidence.
"You must not transfer personal data to a third country unless that country, territory, or one or more specified sectors within that country ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data." 🔗 Verify this source – ico.org.uk
The consequences of getting this wrong: ICO enforcement action and fines of up to £17.5M or 4% of global annual turnover. SRA disciplinary proceedings and potential strike-off for serious breaches. Professional indemnity claims from clients whose confidential data was exposed to US government access. Reputational damage that cannot be undone. And critically, your firm's competitive position destroyed the moment a breach becomes public knowledge.
AI Guard: UK Data Sovereignty by Architecture
AI Guard does not solve GDPR compliance through contracts, certifications, or configuration options. It eliminates the problem at the architectural level – before any data leaves UK jurisdiction.
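To make the architectural contrast concrete, here is a minimal sketch of the mask-before-transfer pattern. The regex patterns, placeholder format, and function names are illustrative assumptions for this page, not AI Guard's actual implementation; a production masker would detect far more identifier types (names, NI numbers, case references, financial figures) and handle edge cases this sketch ignores.

```python
import re

# Illustrative detection patterns only -- a real system covers
# many more categories of personal and special category data.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "UK_PHONE": re.compile(r"\b0\d{10}\b"),
}

def mask(prompt):
    """Replace PII with placeholders before any model sees the text.
    Returns the masked prompt plus a reversal map that never leaves
    the local system (and doubles as an audit-trail entry)."""
    mapping = {}
    for label, pattern in PII_PATTERNS.items():
        for i, value in enumerate(pattern.findall(prompt)):
            token = f"[{label}_{i}]"
            mapping[token] = value
            prompt = prompt.replace(value, token)
    return prompt, mapping

def unmask(text, mapping):
    """Restore the original values once the model response is back."""
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

masked, audit_log = mask("Email jane@client.co.uk about the appeal")
# Only `masked` would ever be transmitted; `audit_log` stays local.
```

Because the identifier never leaves the local system, there is no restricted transfer of personal data to assess in the first place, which is the "compliant by design" claim in the comparison table above.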