A Smart Copilot With a Blind Spot

A large number of organizations are evaluating or adopting Microsoft 365 Copilot. The assistant is integrated into Word, Excel, Outlook, and the other applications we use every day. Copilot can generate meeting notes, analyze sales data, draft presentations, and more, helping users work more productively and produce deliverables faster.

Microsoft designed Microsoft 365 Copilot to pull context from the documents, presentations, spreadsheets, and other files a user can access, which makes its responses more accurate and relevant to the user’s prompt. That same capability introduces a new risk when sensitive or confidential data is involved: Copilot can search an organization’s content in seconds and expose that data to any employee with access.

How Copilot Works: It Sees Access, Not Intent

Microsoft 365 Copilot taps into Microsoft Graph to determine which files, folders, emails, and chats the requesting user is permitted to access. Below are the steps Microsoft 365 Copilot follows, as described by Microsoft (a sketch of what the retrieval step looks like in practice follows the list):

  1. In a Microsoft 365 app, a user enters a prompt in Copilot.
  2. Copilot preprocesses the input prompt using grounding and accesses Microsoft Graph in the user’s tenant.
    • Grounding improves the specificity of the prompt and helps the user get answers that are relevant and actionable for their specific task. The prompt can include text from input files or other content Copilot discovers.
    • The data Copilot uses to generate responses is encrypted in transit.
  3. Copilot sends the grounded prompt to the LLM. The LLM uses the prompt to generate a response that is contextually relevant to the user’s task.
  4. Copilot returns the response to the app and the user.
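
The important detail is in step 2: grounding retrieves whatever Microsoft Graph says the user can read, with no judgment about why they are asking. As a rough illustration, here is a minimal sketch of a permission-trimmed search against the Microsoft Graph search API using a user’s delegated token. The GRAPH_TOKEN environment variable and the query string are assumptions for the example; this is not the literal call Copilot makes internally.

```python
# Illustrative sketch: a permission-trimmed search against Microsoft Graph,
# similar in spirit to Copilot's grounding step. GRAPH_TOKEN and the query
# string are assumptions for this example.
import os

import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"
token = os.environ["GRAPH_TOKEN"]  # delegated token for the signed-in user

payload = {
    "requests": [
        {
            "entityTypes": ["driveItem"],  # files in OneDrive / SharePoint
            "query": {"queryString": "AI strategy roadmap"},
        }
    ]
}

resp = requests.post(
    GRAPH_SEARCH_URL,
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Graph trims results to whatever THIS user can read; it has no concept of
# whether the user *should* be asking for the content.
for response in resp.json().get("value", []):
    for container in response.get("hitsContainers", []):
        for hit in container.get("hits", []):
            resource = hit.get("resource", {})
            print(resource.get("name"), "-", resource.get("webUrl"))
```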

Copilot checks only permissions when deciding whether data should be used to generate a response; it has no concept of intent. This reminds me of red team exercises in which penetration testers leveraged open SMB shares to perform reconnaissance or steal data. Somewhere in the network, an employee had left a file share open to any domain user. The same pattern is very likely to repeat itself in Microsoft 365.

A Real-World Scenario: The Senior Software Engineer Leaves for a Competitor

John, a Senior Software Engineer, recently accepted a new opportunity to advance his career at a competing technology company. Two months ago, John’s manager placed him on a Performance Improvement Plan despite John having helped complete two major projects. He felt frustrated, underappreciated, and overlooked, which led him to leave for a competitor.

Before leaving, John decides to create documents he can use at his new company. He opens Microsoft Word and enters the following prompt:

“Generate a detailed document of our AI strategy with architectural designs and our five-year roadmap.”

As designed, Microsoft 365 Copilot begins generating a response by leveraging Microsoft Graph. As a reminder, Microsoft Graph aggregates and indexes content from across the company that John has access to: documents, emails, chats, OneDrive folders, and more.

One of the documents John has access to is called AI_Strategy_Leadership_Only.pptx. The presentation contains the product strategy, the five-year roadmap, and technical architecture diagrams. The folder containing this presentation accidentally had its sharing permissions set to “Everyone in the organization.” Because Copilot sees that John has access to the presentation, it uses the file to help generate a detailed response.
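
That kind of oversharing is also detectable before Copilot ever touches the file. Below is a minimal sketch that flags items in a OneDrive or SharePoint drive whose sharing links are scoped to the whole organization; the GRAPH_TOKEN and DRIVE_ID environment variables are assumptions, and a real audit would page through results and walk the full folder tree.

```python
# Illustrative sketch: flag drive items shared with "Everyone in the
# organization" via an organization-scoped sharing link. GRAPH_TOKEN and
# DRIVE_ID are assumptions for this example.
import os

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
drive_id = os.environ["DRIVE_ID"]

# List items at the drive root (a real audit would recurse into folders
# and follow @odata.nextLink paging).
items = requests.get(
    f"{GRAPH}/drives/{drive_id}/root/children", headers=headers, timeout=30
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
        headers=headers,
        timeout=30,
    ).json().get("value", [])
    for perm in perms:
        link = perm.get("link") or {}
        # link.scope == "organization" means anyone in the tenant can open it.
        if link.get("scope") == "organization":
            print(f"Overshared: {item.get('name')} ({item.get('webUrl')})")
```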

In John’s case, no one caught the misconfiguration. He now has everything he asked for and more, all from a single prompt.

Data leakage, whether intentional or not, carries several implications:

  • Legal Compliance: Disclosing compensation details without proper authorization can trigger obligations under GDPR, CCPA, and other data privacy statutes. Reporting the breach may be legally mandatory if it involves Personally Identifiable Information (PII).
  • Employee Trust: Trust is hard to earn and easy to lose. If employees find that their compensation data was accidentally disclosed, whether inside or outside the company, the resulting loss of morale can be difficult to reverse.
  • Reputational Risk: When intellectual property is exfiltrated by a departing employee heading to a competitor, the result can be a lasting competitive disadvantage. What started as a low-profile departure can easily become a headline in the public spotlight.

How to Enable Safe & Effective Use of Microsoft 365 Copilot

In cybersecurity, it is important to enable the business to leverage new technology. However, Microsoft 365 Copilot carries inherent risks that need to be managed.

Organizations should take a methodical approach to adopting this technology:

  • Shadow AI: Organizations should understand their employees’ overall AI usage. Employees are already using public AI services that have not been approved by the business or the cybersecurity team.
  • Observe & Understand Intentions: The cybersecurity team needs visibility into the prompts and responses flowing between employees and AI. Most importantly, it needs to understand the intent behind those conversations. Unlike an email, which has a subject line, an AI conversation is time-consuming to review without an automated way to classify the intent of each prompt.
  • Reduce Risk & Save Cost: Microsoft 365 Copilot requires a significant investment because it is licensed per user, per month. By observing and understanding intent, organizations can see how their users actually use the tool: does a user who runs two prompts per month need a license? In addition, an intent-based policy lets users utilize this powerful tool within the parameters defined in an AI Acceptable Use Policy.
  • Control: Based on intent, an organization needs the ability to create policies around specific topics; for example, HR is the only group approved to ask an AI about compensation data (see the sketch after this list).
  • Protect: Use a solution that helps protect both the AI and the employees who use it. In some cases, an AI may hallucinate and return a harmful response to an employee; who is responsible for that response?
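
To make the “Control” idea concrete, here is a deliberately simplified sketch of a topic-based policy check. The keyword matching stands in for a real intent classifier, and the group names and topics are assumptions for illustration; this is not any vendor’s actual policy engine.

```python
# Deliberately simplified sketch of a topic-based policy check. The keyword
# match stands in for a real intent classifier; groups and topics are
# illustrative assumptions only.
from dataclasses import dataclass
from typing import Optional, Set

# Topics to gate, with naive keyword triggers.
TOPIC_KEYWORDS = {
    "compensation": ["salary", "compensation", "pay band", "bonus"],
    "product_strategy": ["roadmap", "ai strategy", "architecture diagram"],
}

# Which groups are allowed to raise which topics with the AI assistant.
ALLOWED_GROUPS = {
    "compensation": {"hr"},
    "product_strategy": {"product", "engineering-leadership"},
}

@dataclass
class Decision:
    allowed: bool
    topic: Optional[str]
    reason: str

def evaluate(prompt: str, user_groups: Set[str]) -> Decision:
    text = prompt.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            if user_groups & ALLOWED_GROUPS.get(topic, set()):
                return Decision(True, topic, "group approved for this topic")
            return Decision(False, topic, "topic restricted to approved groups")
    return Decision(True, None, "no restricted topic detected")

# Example: an engineer asking about compensation data is blocked,
# while the same question from HR is allowed.
print(evaluate("Summarize our compensation bands for 2024", {"engineering"}))
print(evaluate("Summarize our compensation bands for 2024", {"hr"}))
```

In practice the classification step would be a model rather than a keyword list, but the gating logic, mapping topics to approved groups, stays the same.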

Final Thoughts: Smarter Tools Need Smarter Governance

AI enables businesses to innovate faster and operate significantly more efficiently. But tools like Microsoft 365 Copilot introduce risks that many organizations are not aware of or ready to govern.

WitnessAI enables businesses to securely deploy Microsoft 365 Copilot and other generative AI technologies in the enterprise, with real-time visibility and policy-level control over every prompt. Because WitnessAI understands intent, organizations can reduce the risk exposure introduced by Copilot, whose automated reconnaissance of accessible information is a feature, not a bug.

Related content: Unlocking the Full Potential of Microsoft Copilot with WitnessAI