But OpenAI Says They're GDPR Compliant…
They're right — about themselves. OpenAI as a company has data policies in place. But that does not make you, the UK solicitor feeding real client data into their US-hosted servers, GDPR compliant. Not by a long way.
These are two completely different legal positions. And the difference could end your practice.
OpenAI's Compliance ≠ Your Compliance
When OpenAI says "compliance with GDPR and CCPA," they are describing their own internal data handling processes — their DPA, their certifications, their security architecture. Under GDPR, OpenAI acts as a Data Processor. You — the solicitor — are the Data Controller. These are entirely separate legal roles with entirely separate obligations. A processor being compliant does not make the controller compliant.
What OpenAI's compliance covers:

- Their internal data handling policies
- A Data Processing Addendum (DPA) available to sign
- Security certifications (SOC 2 Type 2)
- Not training on business customer data by default
- Encryption at rest (AES-256) and in transit (TLS 1.2+)

What it does not cover:

- Your client's data being transferred to US servers
- Your lawful basis for that international transfer
- Your Transfer Risk Assessment (TRA) obligations
- Your SRA Code of Conduct duties (Para 6.3)
- Your client's legal professional privilege
- The US CLOUD Act — which OpenAI cannot override
7 Reasons ChatGPT Business Fails UK GDPR Compliance
Each point below is backed by official regulatory guidance, OpenAI's own documentation, or established legal authority; the full evidence for each follows in the sections below.
ChatGPT Business vs AI Guard
Every claim in this table is verifiable against the sources linked throughout this page.
| Compliance Requirement | ChatGPT Business | AI Guard |
|---|---|---|
| Data Residency in the UK | ✗ Not available (only Enterprise, Edu, Healthcare) | ✓ UK servers; all data stays in the UK |
| Client PII Reaches the LLM | ✗ Yes; raw prompts sent in full to US servers | ✓ Never; PII masked before any model sees it |
| UK-US Data Bridge (DPF) | ✗ Unavailable; OpenAI is not DPF-certified | ✓ Not needed; no international transfer occurs |
| CLOUD Act Exposure | ✗ Fully exposed; US govt can compel disclosure | ✓ Zero exposure; non-US provider, UK jurisdiction |
| SRA "No Raw Client Data" Rule | ✗ Violated unless every prompt is manually redacted | ✓ Automated; PII masked automatically, every time |
| Transfer Risk Assessment | ⚠ Required and likely fails; the CLOUD Act makes it near-impossible | ✓ Not required; no restricted transfer to assess |
| Data Used for Model Training | ✓ Off by default; Business plan excluded from training | ✓ Never; self-hosted, no data shared externally |
| Legal Professional Privilege | ✗ At risk; sharing with a US third party may break it | ✓ Preserved; no identifiable data leaves your control |
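The "PII masked before any model sees it" pattern is straightforward to sketch. The snippet below is a minimal, hypothetical illustration — not AI Guard's actual implementation — of replacing common identifier patterns with placeholder tokens before a prompt ever leaves your infrastructure, keeping the token-to-original mapping local so responses can be un-masked afterwards:

```python
import re

# Hypothetical sketch of prompt-side PII masking -- not AI Guard's real
# implementation. Production systems pair regexes like these with NER models.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "UK_PHONE": re.compile(r"\b(?:\+44\s?\d{4}|0\d{4})\s?\d{3}\s?\d{3}\b"),
    "UK_POSTCODE": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s?\d[A-Z]{2}\b"),
}

def mask_pii(prompt: str) -> tuple[str, dict]:
    """Replace each PII match with a placeholder token and return the
    token -> original mapping, so responses can be un-masked locally."""
    mapping: dict[str, str] = {}
    for label, pattern in PATTERNS.items():
        for i, match in enumerate(pattern.findall(prompt)):
            token = f"<{label}_{i}>"
            mapping[token] = match
            prompt = prompt.replace(match, token)
    return prompt, mapping

masked, mapping = mask_pii(
    "Email jane.doe@example.co.uk or call 01234 567 890 re SW1A 1AA."
)
# Only `masked` would be sent to the model; `mapping` never leaves your control.
```

The key design point is that the un-masking table stays on your own servers: even if the model provider is compelled to disclose the prompt, it contains only placeholder tokens.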
ChatGPT Business Has No UK Data Residency
OpenAI offers data residency — the ability to choose where your data is stored at rest — but not for the ChatGPT Business plan. OpenAI's own business data page lists the eligible products explicitly. ChatGPT Business is not on that list. The plans that do qualify are: ChatGPT Enterprise, ChatGPT Edu, ChatGPT for Healthcare, and API platform customers.
"Who can use this feature? ChatGPT Enterprise and Education customers."🔗 Verify this source → help.openai.com
"Eligible ChatGPT Enterprise, ChatGPT Edu, ChatGPT for Healthcare, and API platform customers can store sensitive customer content at rest in the U.S., Europe, UK, Japan, Canada, South Korea, Singapore, Australia, India, and the UAE."🔗 Verify this source → openai.com
Even for Enterprise customers who do have data residency, OpenAI acknowledges that certain data — including workspace metadata, billing information, user authentication, and data processed by third-party integrations — may still be stored outside the chosen region. And the GPU inference step — the actual AI processing of your prompt — still routes through the United States.
OpenAI Is Not Certified Under the Data Privacy Framework
The UK-US Data Bridge became operational on 12 October 2023. It allows UK organisations to lawfully transfer personal data to US organisations that have self-certified to the EU-US Data Privacy Framework (DPF). This is the simplest and most robust legal mechanism available for UK-to-US data transfers. OpenAI has not obtained this certification.
"Since OpenAI is not certified under the EU-U.S. Data Privacy Framework, the U.S. company is forced to use alternative methods."🔗 Verify this source → activemind.legal
"UK organisations cannot simply transfer personal data to any data importer/recipient in the US — for the data to flow freely, the relevant recipient must be certified to the UK Extension and appear on the DPF List."🔗 Verify this source → gov.uk
"You can search the Data Privacy Framework List directly. Search for 'OpenAI' — it does not appear."🔗 Verify this source → dataprivacyframework.gov
The US CLOUD Act Can Access Your Client Data at Any Time
The Clarifying Lawful Overseas Use of Data Act (CLOUD Act), enacted in March 2018, grants US law enforcement the power to compel US-based technology companies to disclose data in their possession — regardless of where that data is physically stored. OpenAI is headquartered in San Francisco, California. This law applies to them fully.
"A service provider shall disclose any information related to a customer within the provider's possession 'regardless of whether such communication, record, or other information is located within or outside of the US.'"🔗 Verify this source → activemind.legal
"The European Data Protection Board (EDPB) concluded that service providers subject to EU law cannot legally base the disclosure and transfer of personal data to the US on such requests. This creates a direct conflict between the CLOUD Act and GDPR Article 48."🔗 Verify this source → activemind.legal
"No US-headquartered cloud provider can guarantee that EU/UK personal data will not be accessible to US authorities. Microsoft's chief legal officer in France stated under oath before the French Senate that the company cannot guarantee EU data is safe from US access requests."🔗 Verify this source → exoscale.com
The SRA Has Explicitly Told You: No Raw Client Data in AI Tools
The Solicitors Regulation Authority — your professional regulator — updated its compliance guidance on 9 February 2026. The SRA's position is that all existing professional obligations continue to apply in full regardless of whether AI or other technologies are used. Its guidance on AI is unambiguous.
"Keeping the affairs of the current and former clients confidential [paragraph 6.3] … maintaining client information securely and in line with any timeframes specified in relevant data protection legislation [Paragraph 2.1(a)]. Your client's best interests must remain at the centre of your decisions about the use of technology."🔗 Verify this source → sra.org.uk
"At an SRA-hosted webinar on 4 February 2026, the SRA outlined that firms must not put any identifiable client data into AI tools without informed consent, and no raw client data should ever be put into public AI tools."🔗 Verify this source → societyofasianlawyers.co.uk
You Are Personally Liable — Confidentiality Cannot Be Outsourced
Under the SRA Code of Conduct and established legal principle, the duty of confidentiality belongs to the solicitor — not to the technology provider. If an AI system exposes, compromises, or enables unauthorised access to client data, you bear the legal and regulatory liability. Not OpenAI. Not Microsoft Azure. You.
"The SRA has issued guidance emphasising that lawyers remain personally responsible for confidentiality, even when outsourcing or using technology. If a solicitor uses an AI system and that system compromises client data, the solicitor — and not the AI provider — is liable. The bottom line is that confidentiality cannot be outsourced."🔗 Verify this source → leedsbeckett.ac.uk
"Information shared with an AI platform can be interpreted as sharing it with a third party, thereby breaking privilege. These interactions are not protected by legal professional privilege."🔗 Verify this source → leedsbeckett.ac.uk
Your Transfer Risk Assessment Would Almost Certainly Fail
Because OpenAI is not DPF-certified, any use of ChatGPT Business with client data requires your firm to rely on Standard Contractual Clauses and conduct a Transfer Risk Assessment (TRA). The ICO's 2026 updated guidance requires you to demonstrate that the level of protection for personal data after transfer is "not materially lower than in the UK." Given CLOUD Act exposure, this standard is nearly impossible to meet honestly.
"The ICO's 2026 updated international transfer guidance requires organisations to confirm that the level of protection for personal data after transfer is 'not materially lower than in the UK' and to implement additional technical, organisational, or contractual measures identified by the TRA."🔗 Verify this source → kennedyslaw.com
"The ICO states that international transfers require appropriate safeguards to ensure that the standard of protection for individuals guaranteed by the UK GDPR is not undermined. Without such safeguards, the transfer is unlawful."🔗 Verify this source → ico.org.uk
The CCBE and Law Society Explicitly Warn Against This Exact Practice
The Council of Bars and Law Societies of Europe (CCBE) — representing over one million European lawyers — and The Law Society of England and Wales have both issued formal guidance warning lawyers against entering client data into generative AI platforms. This is the consensus position of the legal profession's own representative bodies.
"Lawyers should refrain from entering any personal, confidential, or other data related to the client into the user interface of the GenAI."🔗 Verify this source → ccbe.eu
"As a general rule, you should not permit the use of personal data or client confidential information in any testing, templating or similar context on generative AI. Always create and use fictional data. Many generative AI companies are located outside the UK, meaning data may be transferred outside UK borders."🔗 Verify this source → lawsociety.org.uk
"In January 2026, the Law Society asked the UK government to clarify rules regarding the anonymisation of client data added to AI platforms and data security, storage and sharing — signalling that current rules are insufficient and more protection is needed."🔗 Verify this source → todayswillsandprobate.co.uk
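The Law Society's "always create and use fictional data" advice is easy to operationalise. The sketch below is an illustrative, stdlib-only approach — the names and fields are assumptions for the example, not a real dataset:

```python
import random

# Illustrative name pools -- any resemblance to real clients is coincidental.
FIRST_NAMES = ["Amelia", "Oliver", "Priya", "Tomasz"]
SURNAMES = ["Hughes", "Okafor", "Patel", "Nowak"]

def fictional_client(rng: random.Random) -> dict:
    """Build an entirely fictional client record that is safe to use
    when testing or templating prompts in a generative AI tool."""
    first, last = rng.choice(FIRST_NAMES), rng.choice(SURNAMES)
    return {
        "name": f"{first} {last}",
        # example.com is reserved for documentation (RFC 2606), so this
        # address can never belong to a real person.
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "matter_ref": f"MAT-{rng.randint(10000, 99999)}",
    }

client = fictional_client(random.Random(2026))
```

Seeding the generator makes test fixtures reproducible, so templates can be exercised repeatedly without a single real client detail ever entering the AI tool.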