Steering the Microsoft Copilot Fleet: What Every Enterprise Needs to Know
April 30, 2025
Summary: The possibilities in the Microsoft Copilot universe continue to expand. Discover the latest Copilot features and related risks you should know about, along with guidance on how to manage them.
As part of our ongoing Microsoft 365 Academy series, we gathered with eDiscovery and information governance leaders to discuss the current state of Microsoft 365 Copilot and what it takes to govern it. We explored how Copilot and its related risks differ depending on the M365 application and the data it is working with. We also provided guidance on how Microsoft Purview can help enterprises maintain control as they scale AI across the organization.
Here are seven key takeaways to help you better manage Copilot and AI at your organization.
1. Folks are already using Copilot—even without the license
You might be surprised to learn that users can access Copilot Chat—a free, browser-based tool—without an enterprise Copilot license, creating potential discovery headaches. If enterprise data protection is turned on, those prompts and responses are stored in Exchange mailboxes, making them discoverable and subject to retention.
Enterprise risk: Even without a license, your users may already be creating discoverable Copilot content.
Opportunity: Use this as a catalyst to implement retention policies and audit how employees access Copilot.
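To see who is already using Copilot, you can query the unified audit log, which records Copilot activity. Below is a minimal sketch using the Microsoft Graph Audit Log Query API (currently in beta); the endpoint path and the copilotInteraction record type reflect Microsoft's documentation at the time of writing and may change, and GRAPH_TOKEN is a placeholder for a token from an app registration with audit log query permissions.

```python
# Sketch: search the unified audit log for Copilot activity via Microsoft Graph.
# Assumes a valid bearer token in GRAPH_TOKEN. The beta endpoint path and the
# "copilotInteraction" record type are assumptions based on current docs.
import os
import requests

GRAPH = "https://graph.microsoft.com/beta"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}

query = {
    "displayName": "Copilot usage review",
    "filterStartDateTime": "2025-04-01T00:00:00Z",
    "filterEndDateTime": "2025-04-30T00:00:00Z",
    "recordTypeFilters": ["copilotInteraction"],  # assumed enum value
}

# Audit log queries run asynchronously: create the query, then poll for records.
resp = requests.post(f"{GRAPH}/security/auditLog/queries", json=query, headers=headers)
resp.raise_for_status()
query_id = resp.json()["id"]
print(f"Submitted audit query {query_id}; poll .../queries/{query_id}/records for results.")
```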
2. Copilot creates multiple discoverable artifacts
Every Copilot interaction, whether it's generating a document or summarizing a file, creates at least two discoverable artifacts:
- The user prompt
- The Copilot response
- Optionally, any referenced files (e.g., from SharePoint or OneDrive)
Referenced files may be treated as cloud attachments, making them subject to discovery, retention, and labeling requirements.
Enterprise risk: Your data volumes for discovery will significantly increase without an information governance strategy to manage these artifacts.
Opportunity: Reduce eDiscovery costs and overcollection of data by understanding these artifacts and fine-tuning Purview search, labeling, and retention strategies.
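As a concrete illustration, the sketch below uses the Microsoft Graph eDiscovery (Premium) API to search for Copilot interactions and pull them, along with their cloud attachments, into a review set. The ItemClass query and the linkedFiles option reflect our reading of the current documentation; the case and review set IDs are placeholders, and custodian scoping is omitted for brevity.

```python
# Sketch: collect Copilot interactions plus their cloud attachments with the
# Microsoft Graph eDiscovery (Premium) API. Assumes an existing case and review
# set; CASE_ID and REVIEW_SET_ID are placeholders. The ItemClass prefix and the
# "linkedFiles" option (which pulls cloud attachments into the review set) are
# based on Microsoft documentation at the time of writing.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
CASE_ID, REVIEW_SET_ID = "<case-id>", "<review-set-id>"

# 1. Scope the search to Copilot interaction messages (assumed ItemClass prefix).
#    Custodian/data-source scoping is omitted for brevity.
search = {
    "displayName": "Copilot interactions",
    "contentQuery": "ItemClass:IPM.SkypeTeams.Message.Copilot*",
}
resp = requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases/{CASE_ID}/searches",
    json=search, headers=headers,
)
resp.raise_for_status()
search_id = resp.json()["id"]

# 2. Commit results to a review set, requesting linked (cloud) attachments too.
requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases/{CASE_ID}/reviewSets/{REVIEW_SET_ID}/addToReviewSet",
    json={"search": {"id": search_id}, "additionalDataOptions": "linkedFiles"},
    headers=headers,
).raise_for_status()
```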
3. Retention policies must account for new Copilot content types
Admins can now use Purview controls to set retention for Copilot interactions separately from Teams chats. However, organizations cannot set separate retention for Copilot prompts generated in different applications (like Word vs. Teams).
Enterprise risk: Without dedicated retention settings, Copilot-generated content from M365 apps may be retained longer than intended or deleted prematurely, creating legal and compliance problems.
Opportunity: Use metadata in Microsoft Purview to filter Copilot artifacts by application and content type for eDiscovery.
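For example, Copilot interaction messages are stamped with an ItemClass value that identifies the host application, which you can use in a Purview keyword (KQL) query. The sketch below shows the idea; the exact ItemClass suffixes are illustrative and should be verified against current Microsoft documentation.

```python
# Sketch: build per-application KQL filters for Copilot artifacts using the
# ItemClass metadata stamped on Copilot interaction messages. The suffixes
# below are illustrative; verify current values in Microsoft's documentation
# before relying on them.
COPILOT_ITEM_CLASS = {
    "word": "IPM.SkypeTeams.Message.Copilot.Word",
    "teams": "IPM.SkypeTeams.Message.Copilot.Teams",
    "excel": "IPM.SkypeTeams.Message.Copilot.Excel",
    "powerpoint": "IPM.SkypeTeams.Message.Copilot.PowerPoint",
}

def copilot_kql(app: str) -> str:
    """Return a KQL clause narrowing a Purview search to one app's Copilot prompts."""
    return f"ItemClass:{COPILOT_ITEM_CLASS[app]}"

print(copilot_kql("word"))  # -> ItemClass:IPM.SkypeTeams.Message.Copilot.Word
```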
4. Security by obscurity no longer works
Security by obscurity is the mistaken assumption that data is secure because users don't know it exists. That assumption breaks down with Copilot: the Microsoft Graph drives Copilot's intelligence and draws on everything a user is permitted to access, creating a real risk of oversharing. If an employee has access to content, even content they've never viewed, Copilot may use it to generate responses.
Enterprise risk: Copilot can expose stale, outdated, or unprotected sensitive content, increasing the likelihood of using inaccurate data or unintentionally disclosing sensitive information.
Opportunity: Improve access hygiene and content quality by mapping enterprise data, performing information governance assessments like site access reviews, and implementing defensible data deletion processes. SharePoint Advanced Management and Microsoft Purview tools make this scalable.
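As a small first pass at a site access review, the sketch below walks a SharePoint document library via Microsoft Graph and flags files shared through anonymous or organization-wide links, the classic oversharing vectors that widen what Copilot can draw on. The site ID is a placeholder, and pagination is omitted for brevity.

```python
# Sketch: flag broadly shared files in a SharePoint document library. Assumes
# an app registration with site read permissions and a token in GRAPH_TOKEN;
# SITE_ID is a placeholder. Pagination of results is omitted.
import os
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
SITE_ID = "<site-id>"

items = requests.get(
    f"{GRAPH}/sites/{SITE_ID}/drive/root/children", headers=headers
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/sites/{SITE_ID}/drive/items/{item['id']}/permissions",
        headers=headers,
    ).json().get("value", [])
    for perm in perms:
        # Sharing links carry a "link" facet with a scope; anonymous and
        # organization-wide links are the broadest grants.
        scope = perm.get("link", {}).get("scope")
        if scope in ("anonymous", "organization"):
            print(f"Review: '{item['name']}' has a {scope}-scoped sharing link")
```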
5. Sensitivity labels shape what Copilot can and cannot use
Copilot works with Microsoft Purview sensitivity labels: it respects the access restrictions a label enforces, and content it generates inherits the highest-priority label among the files it references. For example, a response that draws on a document labeled "Top Secret" produces new content labeled "Top Secret."
Enterprise risk: Without sensitivity labels in place, Copilot may generate content using sensitive or restricted data and apply inconsistent labels to the new files it creates, increasing the risk of exposure.
Opportunity: Roll out a labeling strategy to ensure Copilot respects content boundaries and automatically applies appropriate protections, strengthening proactive risk management and compliance.
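The inheritance rule itself is simple enough to show in a few lines. The label names and priority order below are hypothetical stand-ins for whatever taxonomy your Purview tenant defines.

```python
# Sketch: the label-inheritance rule described above, reduced to plain logic.
# Label names and their priority order are hypothetical; in practice the
# ordering comes from your Purview label taxonomy.
LABEL_PRIORITY = ["Public", "General", "Confidential", "Top Secret"]  # low -> high

def inherited_label(reference_labels: list[str]) -> str:
    """Copilot output inherits the highest-priority label among referenced files."""
    return max(reference_labels, key=LABEL_PRIORITY.index)

# A response drawing on one 'General' and one 'Top Secret' file is labeled 'Top Secret'.
print(inherited_label(["General", "Top Secret"]))  # -> Top Secret
```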
6. Purview tools are evolving to support AI-specific risk management
We emphasized how organizations can use existing Microsoft Purview features to govern Copilot:
- Data Lifecycle Management to control retention and deletion
- Data Loss Prevention to detect sensitive information in Copilot prompts, block risky actions, and generate alerts for review
- Communication Compliance to flag unethical or inappropriate prompts
- Insider Risk Management to correlate behavior across file access and Copilot interactions
- Data Security Posture Management (DSPM) for AI for centralized visibility into generative AI usage
Enterprise risk: Without configuring these tools for Copilot, AI usage can quickly outpace your ability to monitor, retain, or protect sensitive content, introducing compliance gaps and other risk exposure.
Opportunity: Most Copilot governance controls are available in your existing Purview suite. Proactively configure them for Copilot to scale AI use safely and defensibly.
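To make the DLP item concrete: Purview evaluates Copilot prompts against sensitive information types natively, but the shape of that check is easy to see in code. The sketch below is a conceptual stand-in only, using a toy U.S. Social Security number pattern rather than Purview's actual classifiers.

```python
# Conceptual sketch only: Purview DLP does this natively for Copilot prompts,
# but the shape of the check is worth seeing. A policy matches sensitive info
# types (here, a toy U.S. SSN pattern) in prompt text and raises an alert
# instead of letting the interaction proceed silently.
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def screen_prompt(prompt: str) -> tuple[bool, str]:
    """Return (allowed, reason) for a Copilot prompt under a toy DLP rule."""
    if SSN_PATTERN.search(prompt):
        return False, "Blocked: prompt appears to contain a Social Security number"
    return True, "Allowed"

print(screen_prompt("Summarize the claim for SSN 123-45-6789"))
```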
7. Start small, then scale with confidence
With Copilot’s reach extending into every corner of the M365 ecosystem, it’s easy to feel overwhelmed. One practical approach we recommend is restricting SharePoint search to a curated set of sites and expanding that as you clean up access controls and site ownership.
Enterprise risk: A full-scale rollout without restricting Copilot’s data access can surface ROT (redundant, obsolete, or trivial) or sensitive content, potentially undermining user trust or violating internal and external policies.
Opportunity: Start with a limited, well-governed set of SharePoint locations to deliver early Copilot value while IT, legal, and compliance teams solidify access controls and adopt information governance best practices.
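The mechanism behind this approach is Restricted SharePoint Search, configured through the SharePoint Online Management Shell. The sketch below keeps the curated allow-list in code and emits the commands that apply it; the cmdlet and parameter names reflect Microsoft's documentation at the time of writing, so verify them before running, and the site URLs are placeholders.

```python
# Sketch: keep the curated allow-list for Restricted SharePoint Search in
# version control and emit the SharePoint Online Management Shell commands
# that apply it. Cmdlet and parameter names (Set-SPOTenantRestrictedSearchMode,
# Add-SPOTenantRestrictedSearchAllowedList, -SitesList) are assumptions based
# on current documentation; verify before running. Site URLs are placeholders.
CURATED_SITES = [
    "https://contoso.sharepoint.com/sites/HR-Policies",
    "https://contoso.sharepoint.com/sites/Legal-Playbooks",
]

print("Set-SPOTenantRestrictedSearchMode -Mode Enabled")
print(f'Add-SPOTenantRestrictedSearchAllowedList -SitesList "{",".join(CURATED_SITES)}"')
```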
Final thought: education comes first
The hardest part of these Copilot implementation projects isn’t enabling the technology. It’s getting the right stakeholders aligned. Legal, compliance, privacy, and IT all need a shared understanding of how Copilot works and what controls are available.
The policies can be turned on in three minutes. The challenge is getting everyone to agree that they need to be.
Learn more about how Lighthouse can help you configure Microsoft Purview to enhance data security and privacy on our website.
