Did you know it only takes one well-intentioned click to create a data breach?
Maybe you export a list of customers from your store, copy a few client emails into ChatGPT to “tidy up” a reply, or paste a set of survey answers to get a summary. In less than five minutes, you’ve mixed ChatGPT and client data — and handed someone else’s personal information to a third-party processor whose servers sit outside the UK.
That one small shortcut could turn into a GDPR headache before you’ve even finished your coffee.
What happens when you put client data into ChatGPT?
When you use ChatGPT, your prompts and uploads don’t just vanish into the ether. They’re stored and processed on OpenAI’s servers, which may be in other countries (mostly the US). According to the OpenAI Privacy Policy, your “content” — anything you type or upload — can be collected and processed to provide the service and, unless you’ve switched off data sharing, may be used to improve the model.
Have you even turned off the setting that lets OpenAI use your chats to improve the model? That doesn’t stop all data sharing, but at least it switches off one aspect of it.
Free and Plus users don’t get the same privacy features that Enterprise customers do. There’s no option to keep data physically in the UK or EU, and no guarantee of “zero data retention.” Even if you delete the chat, OpenAI may still store metadata or backups for a limited period. You can read their own explanation on data controls and retention.
ChatGPT is a tool for brainstorming and drafting — but it’s not designed for securely processing identifiable client information or any form of personal data.
Why ChatGPT and client data can be a GDPR risk
If you’re a Virtual Assistant, coach, trainer, or small-business service provider, you’re always the data controller for your own business (and responsible for not randomly sharing personal data around the world), but you’re almost always a data processor when handling personal information your client gives you access to.
Your client — the data controller — is legally responsible for deciding what can happen to that data and must issue a Data Processing Agreement (DPA) to you. This is not the same thing as a data privacy policy – see our earlier blog on that.
In reality, most clients never do. That’s why every KoffeeKlatch contract pack includes the right structure to fix this. It gives you and your client a way to create the missing elements and have a real conversation about who is entitled to do what.
If you’re using ChatGPT and client data in your workflow, you need to understand that, under UK data privacy law, if you upload third-party data (personal data your client has collected) into ChatGPT (or any AI tool) without the controller’s written authorisation, you’ve done so without consent.
Even if the controller has given you a DPA, if it doesn’t say you can upload to ChatGPT (and most don’t), then uploading is a breach.
Some more sophisticated clients may give you a login to their Enterprise system, but if you are using your own account (free or paid), this is where things go wrong.
The “instant breach” trap
It’s frighteningly easy to export data from your store, CRM, or course platform and drop it into ChatGPT for analysis or marketing help.
That exported CSV may include names, emails, addresses, or notes — all personal data. The moment you upload it, that information leaves the UK.
If any of it includes details about children, health, ethnicity, or anything else classed as special category data, the stakes rise sharply.
Even if your client knows you’re using ChatGPT and says “that’s fine,” and the DPA is updated, the original lawful reason for processing their customers’ data in the UK doesn’t automatically carry over when it’s sent abroad.
The ICO explains why international transfers need extra safeguards here: ICO international transfers guidance.
If you’re transferring special category data outside the UK as a data controller, you’ll normally need explicit consent or another recognised condition. Without it, the transfer itself could be unlawful — even if the UK processing was perfectly legal. If you’re doing it absent-mindedly as a data processor, you are putting yourself and your client into a difficult situation.
Who’s responsible for the DPA (and why most people get this wrong)
It’s your client — the data controller — who should provide the DPA to you as their processor, not the other way round. But because so few small businesses actually do, KoffeeKlatch builds the DPA structure into your contract.
It means your clients can comply and you’re both protected.
If you’re unsure how this works with AI tools, come and ask in the KoffeeKlatch Customer Group — we’ll talk you through how you and your client should complete the KoffeeKlatch Data Processing Form to include ChatGPT, if that’s what you both really want to do.
Practical steps to stay safe
- Don’t paste exported data (from stores, CRMs, spreadsheets) into ChatGPT.
- Use dummy data or examples when you want to test prompts.
- Switch off “Improve the model” in your ChatGPT settings.
- Avoid special category data — if you handle health, children’s, or equality information, never upload it.
- Record what tools you use in your RoPA (you have a spreadsheet for this in your KoffeeKlatch online GDPR programme) and note any international transfers.
- Get explicit consent from the data controller (and make sure they have it from their data subjects) if you ever must process special category data with AI.
- Review your KoffeeKlatch DPA clauses, your data privacy policy and your client’s data privacy policy and make sure they all align with what is actually going on.
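If you do want to test prompts on something shaped like a real export without exposing anyone’s details, one option is to strip the personal columns before the file ever leaves your machine. Here’s a minimal sketch in plain Python — the column names (`name`, `email`, and so on) are illustrative, so match them to the headers in your own export:

```python
import csv
import io

# Columns that commonly hold personal data in a store or CRM export.
# Illustrative names only — edit this set to match your own file's headers.
PERSONAL_COLUMNS = {"name", "email", "address", "phone", "notes"}

def redact_csv(csv_text, personal_columns=PERSONAL_COLUMNS):
    """Return a copy of the CSV with personal-data columns replaced by dummy tokens."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    for i, row in enumerate(reader, start=1):
        for col in row:
            if col.lower() in personal_columns:
                # Swap the real value for a numbered placeholder, so rows
                # stay distinguishable without being identifiable.
                row[col] = f"{col.upper()}_{i}"
        writer.writerow(row)
    return out.getvalue()

export = "name,email,order_total\nJane Doe,jane@example.com,42.50\n"
print(redact_csv(export))
```

This keeps the non-personal figures (order totals, dates) intact for your summary or analysis prompt while the identifying fields never reach a third party. It’s a starting point, not a guarantee: free-text columns like “notes” can still leak details, which is why the safest default remains not uploading real client exports at all.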
Worried you’ve already shared client data with ChatGPT?
Don’t panic — but do act quickly.
Check what you uploaded, who it belonged to, and whether it included personal or special category data. Then follow the steps in our guide on what to do if you have a data breach.
It walks you through how to assess the risk, when you might need to notify your data subjects or the ICO, and how to document the incident properly.
Final thought
You don’t have to swear off ChatGPT — you just have to know where the lines are.
AI can make your work faster and smarter, but it doesn’t replace the rules on confidentiality, data handling, and informed consent.
Before you paste, pause.
Because good data practice isn’t about paperwork — it’s about respect: for your clients, their customers, and the trust they place in you.