Microsoft Copilot is often described as Microsoft’s answer to ChatGPT, but it works very differently behind the scenes. While ChatGPT stores and processes information through OpenAI’s own systems, Copilot runs inside your existing Microsoft 365 environment.
That means your data stays within the same trusted framework that already holds your emails, Teams chats and OneDrive files.
Where your information goes when you use Copilot depends on where your Microsoft 365 data is stored. For most UK businesses, that’s within Microsoft’s UK data centres. If you have a more advanced or international subscription, your Copilot data will follow the same regional rules that apply to the rest of your Microsoft 365 content.
How to Check Where Copilot Stores Your Data
If you use Microsoft Copilot inside Microsoft 365 (Word, Excel, Outlook, Teams and so on), your chats with Copilot are stored in Microsoft’s cloud — not on your own computer. The question is where that storage actually happens.
Step 1: What “PDL” Means
Microsoft uses something called a Preferred Data Location (PDL) to decide which country or region your Copilot data lives in.
- If you don’t set one, Microsoft uses your business’s default region — usually based on where your Microsoft 365 subscription was first registered.
- If you signed up for Microsoft 365 in the UK, your data almost always stays in the UK data centres (UK South / UK West).
For most UK-based sole traders and small businesses using Microsoft 365 Business Standard or Business Premium, that means your Copilot data is already stored in the UK.
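If you're comfortable with a little scripting, a tenant administrator can also look up a user's Preferred Data Location through the Microsoft Graph API, which exposes it as the `preferredDataLocation` property on the user object. The sketch below (Python, standard library only) builds the request; the user ID and access token are placeholders you'd supply yourself, and note that the property is typically only populated in multi-geo tenants — an empty value means the tenant's default region applies.

```python
import json
import os
import urllib.request

# Microsoft Graph exposes a user's Preferred Data Location (PDL) as the
# `preferredDataLocation` property on the user object.
GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_pdl_request(user_id: str, access_token: str) -> urllib.request.Request:
    """Build (but don't send) a Graph request for one user's PDL."""
    url = f"{GRAPH_BASE}/users/{user_id}?$select=displayName,preferredDataLocation"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {access_token}"}
    )

if __name__ == "__main__":
    # Placeholder values -- replace with a real user ID and an admin
    # access token obtained via your usual Entra ID sign-in flow.
    token = os.environ.get("GRAPH_TOKEN", "<token>")
    req = build_pdl_request("someone@example.com", token)
    print(req.full_url)
    # Only attempt the live call if a real token was supplied.
    if os.environ.get("GRAPH_TOKEN"):
        with urllib.request.urlopen(req) as resp:
            user = json.load(resp)
            # An empty PDL means the tenant default region applies.
            print(user.get("preferredDataLocation") or "tenant default region")
```

This is optional — the aka.ms/WhereIsMyData page in Step 2 shows the same information without any scripting.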
Step 2: How to Check Where Your Data Actually Lives
You can easily check where Microsoft stores your business data:
- Sign in to Microsoft 365 using your business account.
- Go to https://aka.ms/WhereIsMyData.
Microsoft will show you the data-centre region where your emails, OneDrive files and Teams chats are stored.
If it says United Kingdom – UK South or United Kingdom – UK West, don’t panic — those are simply the names of Microsoft’s two UK data-centre regions (one near London, the other near Cardiff).
Microsoft spreads data between them for backup and reliability, so your business information – including anything Copilot stores – stays inside the UK even if one site goes offline.
Step 3: How to Check Which Version You’re On
If you’re not sure which Microsoft 365 version or subscription you have, there’s an easy way to find out:
- Log in to Microsoft 365 at https://www.office.com.
- Click your profile picture or initials in the top right corner.
- Choose View account.
- Look for Subscriptions or My products – this will tell you which plan you’re using (for example, Business Standard, Business Premium, or Microsoft 365 Family).
If it says “Microsoft 365 Family” or “Microsoft 365 Personal,” that’s a consumer account rather than a business one.
Step 4: How to Handle Personal Data Responsibly
When you use Microsoft 365 and Copilot in your own business, you’re the data controller for the personal information you hold about your clients, suppliers, and team members. Your data privacy policy should explain how that business data is stored and protected, and make clear whether any of it leaves the UK.
If you also process data on behalf of a client (for example, accessing their customer information or inbox as part of your work), that’s a separate role — you’re acting as a data processor, and that activity should be covered by a Data Processing Agreement with the client. It’s best practice to keep that processing within the client’s own systems wherever possible, rather than importing third-party data into your own. They should give you a unique login to their Microsoft 365 system so your access can be controlled and monitored.
A Word of Caution About Free or Personal Microsoft Accounts
Free Microsoft accounts (for example Outlook.com, Hotmail, or Microsoft 365 Personal/Family) are not suitable for storing or processing any kind of client personal data.
These versions don’t give you control over where data is stored or which legal terms apply. They’re designed for personal use — not for business compliance or data protection.
If you use Copilot or any other Microsoft tool for client work, you need a business subscription (such as Microsoft 365 Business Standard or Business Premium).
That’s what gives you the contractual protection, data-residency controls, and GDPR-compliant processing terms you rely on when handling personal data responsibly.
Insurance note: If you carry cyber or professional indemnity cover, check your policy wording. Many brokers emphasise the need for appropriate cyber cover and professional-grade setups when handling personal data; details and eligibility vary by policy and insurer. Don’t assume incidents involving personal/free accounts are covered — verify with your broker.
Quick Recap
| Microsoft 365 plan | Can you set PDL? | Where your data lives | What to do |
|---|---|---|---|
| Business Standard / Business Premium (most sole traders) | No | UK (if registered in UK) | Sign in then check aka.ms/WhereIsMyData to confirm |
| Free / Personal Microsoft account (Outlook.com, Hotmail, Microsoft 365 Family) | No | Often EU-wide | Not suitable for client or team personal data — upgrade to a business plan |
How Microsoft Copilot Uses Your Microsoft 365 Data (and How That Differs from ChatGPT)
ChatGPT sends prompts and responses to OpenAI’s servers in the USA, where — depending on your plan and settings — data may be used to train or improve the underlying models. Microsoft Copilot, on the other hand, runs inside your existing Microsoft 365 environment. That means it uses the same privacy, security and compliance framework that already governs your emails, Teams chats, OneDrive files and SharePoint documents.
When you type a question or request into Microsoft Copilot, it draws on the information you already have permission to access within Microsoft 365 — for example, your own documents, spreadsheets, meeting notes or emails. Copilot does not give you access to data you wouldn’t normally see. Its results are shaped entirely by your existing permissions, and your data never leaves Microsoft’s trusted cloud for processing elsewhere.
This design is what makes Microsoft Copilot and data privacy a better fit for regulated or professional environments. Your prompts and responses are encrypted in transit and at rest, and they’re stored in the same region as your Microsoft 365 content.
Microsoft also states that the information processed by Copilot is not used to train its foundation models. In other words, what you enter stays private within your tenant. You can read more in Microsoft’s official data privacy and security guidance for Copilot.
For most small businesses, this means that Microsoft Copilot is as secure as the rest of your Microsoft 365 setup. If your organisation already meets UK GDPR standards for Microsoft 365, using Copilot does not change your compliance position — provided you continue to follow good practice about what data you input and who has access.
Where many users go wrong is in assuming Copilot is a sealed box that hides all traces of their activity. In reality, Microsoft Copilot logs prompts and responses in the same way Teams or Outlook logs activity. That helps with accountability and auditing but also means you need to be thoughtful about the type of personal data you include. Avoid pasting sensitive details, health information or confidential third-party data into prompts unless you have a clear lawful basis and proper security controls.
To sum up: Microsoft Copilot and data privacy are closely tied to your existing Microsoft 365 settings. The tool isn’t a separate system — it inherits your own access permissions, data-residency settings and security standards. If those are configured correctly, Copilot can be a safe and compliant way to benefit from generative AI inside your business.
Understanding how Microsoft Copilot handles data privacy helps you stay compliant, confident and credible. You don’t need to be a tech expert — just make sure your Microsoft 365 setup and privacy documents reflect how you actually work.