
Copilot: what do I need for your secure operation?

What steps does your organisation need to take to securely and efficiently use Copilot?

Technology adoption


Josh Bates

Technologist - Microsoft Security

Microsoft’s most recent announcement regarding Copilot for Microsoft 365 has removed a major barrier to entry for customers, scrapping the 300-seat minimum on an Enterprise Agreement. This has heightened interest, with many more customers now keen to place their orders and try it out.

“Every day, we change the world, but to change the world in a way that means anything, that takes more time than most people have. It never happens all at once. It’s slow. It’s methodical.” This quote from the hit cyber series Mr Robot resonates perfectly with an organisation’s journey when deploying Copilot. The natural impulse is to start using Copilot right away, but it’s important first to assess your organisation’s readiness, complete due diligence, and create a rollout strategy. The good news is that Softcat is positioned to support you with all of these steps.

This blog post will explore and discuss the following topics:

1. What problems can Copilot cause?

2. What data governance, compliance and security requirements need to be in place before deploying Copilot?

3. What are the risks of failing to address the above?

Let’s begin with the problems Copilot for M365 can cause. The main reason Copilot can automate workflows and produce fully fledged documents, emails, PowerPoint presentations and more in a matter of seconds is that it’s built on the Microsoft Graph, which is where it pulls its data from. In essence, whatever a user has access to, Copilot can reference and use. You can probably see the dangers associated with oversharing data.

A second point that needs to be flagged is one common to many AI tools: hallucinations. Microsoft has been very clear that Copilot can sometimes produce false information; these outputs are called hallucinations. There’s a frankly shocking case of a lawyer using ChatGPT and citing fake cases in court.

Additionally, if the correct compliance procedures are not in place, Copilot can reference out-of-date information (stale content) when producing responses. This can mislead users, as the information seems legitimate, which is why it is good practice to review everything Copilot produces.

The final point to consider is the joiners-movers-leavers (JML) process. What happens with Copilot when an employee moves roles internally and now requires different access packages? Copilot doesn’t have this context; all it’s concerned with is what resources it can reach and use, so failing to remove old access when granting new access could cause data leakage.

How can we mitigate these issues?

Now that we’ve addressed some of the concerns with Copilot, let’s look at how we can mitigate them by implementing suitable standards of data governance and compliance policies. Let’s begin by tackling the permission structure of an organisation. To prevent the oversharing of documents, it’s critical to lock down the permissions within your organisation’s SharePoint and OneDrive. Copilot understands and respects an organisation’s permission structure. By only allowing access to the data that employees require, such as SharePoint sites designed for HR or Finance, you prevent Copilot from surfacing those documents to non-permitted users and mitigate the risk of data leakage.
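To make the oversharing point concrete, here is a minimal sketch of a permissions audit: flagging any item shared with org-wide groups such as “Everyone”. The item data and field names are purely illustrative; in practice this information would come from a SharePoint permissions report or a Graph API export.

```python
# Hypothetical sketch: flag items whose permissions include broad,
# org-wide groups. Field names and sample data are illustrative only.
BROAD_GROUPS = {"Everyone", "Everyone except external users"}

def find_overshared(items):
    """Return names of items shared with any org-wide group."""
    return [
        item["name"]
        for item in items
        if BROAD_GROUPS & set(item["granted_to"])
    ]

docs = [
    {"name": "HR-salaries.xlsx", "granted_to": ["Everyone", "HR Team"]},
    {"name": "Q3-roadmap.pptx", "granted_to": ["Product Team"]},
]
print(find_overshared(docs))  # ['HR-salaries.xlsx']
```

Anything this kind of check flags is a document Copilot could surface to every user in the tenant, so it is a sensible first pass before enabling Copilot.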

Additionally, you need to be able to identify and classify data within your tenant. Effective classification of your organisation’s data allows you to apply labels for encryption on emails and documents and to identify stale content for deletion. If you use M365 E5 you’ll be at an advantage, as you’ll be able to automate this process using your Sensitive Information Types (SITs) and trainable classifiers. Ultimately, by applying labels to documents and files, you can apply encryption to highly confidential documents, preventing Copilot from referencing them. It’s important to note that, as standard, Copilot will honour the highest-priority label, but there are some key scenarios where it doesn’t; you can find them here.
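The core idea behind Sensitive Information Types is pattern matching over document content. The toy classifier below shows the shape of that idea with two simple regexes; real Purview SITs are far more sophisticated, combining patterns with keyword proximity and confidence levels, and these patterns are illustrative only.

```python
import re

# Illustrative stand-ins for Sensitive Information Types (SITs).
# Real Purview SITs add keyword evidence and confidence scoring.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
    "uk_nino": re.compile(r"\b[A-CEGHJ-PR-TW-Z]{2}\d{6}[A-D]\b"),
}

def classify(text):
    """Return the names of SIT-like patterns found in the text."""
    return sorted(name for name, pat in PATTERNS.items() if pat.search(text))

print(classify("Card: 4111 1111 1111 1111"))  # ['credit_card']
print(classify("Nothing sensitive here"))     # []
```

Once a document matches a classifier like this, an auto-labelling policy can apply the corresponding sensitivity label and encryption without user intervention.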

To identify stale data and prevent Copilot pulling old information, you can use two tools within Microsoft Purview: Data Lifecycle Management and Content Search. Let’s start with Content Search. With it, you can set parameters such as the locations to scan, date ranges, and the keywords you want to find, identifying any documents that are past your legal obligation for data retention. Data Lifecycle Management lets you retain or delete content based on policies created for emails, documents, Teams and even Viva Engage messages. A practical example is creating a policy to delete content older than six years: legal firms, for instance, are required to retain documents for six years, after which they can delete them.
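The six-year retention rule boils down to a simple date comparison. The sketch below illustrates that logic on a toy document list; the file names, dates, and the 365-day year approximation are all assumptions for illustration, not how Purview implements retention internally.

```python
from datetime import date, timedelta

# Approximate six-year retention window (illustrative; Purview policies
# are configured in the portal, not computed like this).
RETENTION = timedelta(days=6 * 365)

def stale_documents(docs, today):
    """Return names of documents past the retention window."""
    return [name for name, modified in docs if today - modified > RETENTION]

docs = [
    ("contract-2015.pdf", date(2015, 3, 1)),
    ("invoice-2023.pdf", date(2023, 9, 1)),
]
print(stale_documents(docs, today=date(2024, 6, 1)))  # ['contract-2015.pdf']
```

Anything this logic flags is exactly the stale content that, left in place, Copilot could pull into a response as if it were current.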

Finally, regarding the JML process, implementing a solution such as Entra ID Governance can help automate access packages by using machine learning. It does this by detecting a user’s affiliation with other users in a group, based on the organisation’s structure. Using the reporting structure, users who are statistically distant from a group’s existing members can be automatically denied certain access.
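The “statistically distant” idea can be pictured as measuring hops in the reporting graph between a requester and a group’s current members. The toy example below uses a breadth-first search over a made-up org chart; the names, graph, and two-hop threshold are all assumptions for illustration, not Entra ID Governance’s actual model.

```python
from collections import deque

def distance(graph, start, goal):
    """Shortest number of reporting hops between two users (BFS)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        user, hops = queue.popleft()
        if user == goal:
            return hops
        for peer in graph.get(user, []):
            if peer not in seen:
                seen.add(peer)
                queue.append((peer, hops + 1))
    return float("inf")

def is_affiliated(graph, requester, members, max_hops=2):
    """Approve only if the requester sits close to an existing member."""
    return any(distance(graph, requester, m) <= max_hops for m in members)

# Hypothetical org chart as undirected manager/report edges.
org = {
    "ceo": ["cfo", "cto"],
    "cfo": ["ceo", "fin1"],
    "cto": ["ceo", "dev1"],
    "fin1": ["cfo"],
    "dev1": ["cto"],
}
print(is_affiliated(org, "fin1", ["cfo"]))   # True: direct report
print(is_affiliated(org, "dev1", ["fin1"]))  # False: four hops away
```

A mover whose new position puts them far from a group’s members would fail this kind of check, prompting the old access to be removed rather than silently carried over, which is exactly the JML gap described above.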

Ultimately, with Copilot, security, governance, and compliance are not and cannot be an afterthought; they’re the foundations. Failing to put them in place will cause more problems than Copilot will ‘fix’. These security, governance and compliance points are not mere recommendations; they are necessities.

You can visit our dedicated Copilot page to find out more. And no, Copilot didn’t write this. Not all of it, anyway.