Microsoft Copilot for M365: 4 Readiness Considerations for Legal and eDiscovery Teams

February 21, 2024



Jamie Brown

Enterprise adoption of new technology has always impacted legal and eDiscovery teams. Microsoft Copilot for Microsoft 365, Microsoft’s flagship generative AI tool, is no different. While it promises to increase workplace productivity and creativity for employees using Microsoft 365 applications, it also presents some challenges that should be taken into account.

During a recent webinar, An Expert View on Microsoft Copilot for Microsoft 365, we examined this new solution in depth. Part of that conversation touched on organizational and legal team readiness, which we’ll summarize in this article.

What is Microsoft Copilot?

Microsoft Copilot for Microsoft 365 (“Microsoft Copilot”) is an enterprise-grade generative AI product available within many Microsoft 365 applications, including Teams, Outlook, SharePoint, and OneNote. The tool requires a per-user subscription on top of the enterprise license for Microsoft 365, so it’s not universally available.

Touting itself as an AI assistant, Microsoft Copilot draws on a large language model and all of your Microsoft 365 data to enhance user productivity and creativity. Examples of its capabilities include creating meeting notes (from Microsoft Teams meeting transcriptions), responding to emails (within Outlook), generating ideas from drawings or notes (in Whiteboard or OneNote), kickstarting collaborative tasks with teammates (using Loop), and drafting presentations (in PowerPoint).

How could a tool that promises to transform productivity and creativity—and has generated so much excitement and fervor—create issues for legal and eDiscovery teams? After all, it appears somewhat contained, as it sits inside a company’s M365 tenant and uses only content owned by the enterprise. It doesn’t, for example, pass data to a third-party source, like ChatGPT.  

Let’s take a look at four areas for legal and eDiscovery teams to consider when implementing Microsoft Copilot.

1. Understanding eDiscovery implications

Microsoft Copilot creates new content, including prompts and responses. This data is stored in various locations depending on the specific Microsoft Copilot application, and its retention and deletion depend on the policies currently in place. If retained and relevant to actual or anticipated litigation, that data will need to be preserved and, later, potentially collected, searched, reviewed, and produced.

Thankfully, this data is “discoverable” using the Microsoft 365 Purview tools for eDiscovery. But for each Microsoft Copilot application in use, discovery teams need to define processes for these eDiscovery tasks and maintain them. The teams also need a basic understanding of how Microsoft Copilot works—what data is being generated, where it is stored, how the data appears, how to validate data creation, and which use cases are most common for various business units. That knowledge helps legal stakeholders know when and what data to preserve and/or collect.
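To make the workflow above concrete, here is a minimal, hypothetical sketch of how a discovery team might scope a Purview eDiscovery (Premium) search to Copilot interactions programmatically via the Microsoft Graph security API. The endpoint path, the KQL item-class filter, and the request-body property names below are assumptions for illustration only—verify them against current Microsoft Graph and Purview documentation (and your own tenant) before relying on them.

```python
# Hypothetical sketch: building the request for an eDiscovery search that
# targets Copilot prompt/response items, which are stored as messages in
# users' Exchange Online mailboxes. No network call is made here; this only
# constructs the (assumed) endpoint URL and JSON body.
import json

GRAPH_BASE = "https://graph.microsoft.com/v1.0"


def build_copilot_search(case_id: str, custodian_upns: list[str]) -> tuple[str, dict]:
    """Return the assumed endpoint and JSON body for creating a search
    scoped to Copilot interactions for the given custodians."""
    endpoint = f"{GRAPH_BASE}/security/cases/ediscoveryCases/{case_id}/searches"
    body = {
        "displayName": "Copilot interactions",
        # KQL filter on the item class used for Copilot interaction items
        # (assumed value -- confirm in your tenant before relying on it).
        "contentQuery": 'itemclass:"IPM.SkypeTeams.Message.Copilot*"',
        # Custodian mailboxes to scope the search to (assumed property name).
        "custodianSources": [{"email": upn} for upn in custodian_upns],
    }
    return endpoint, body


endpoint, body = build_copilot_search("case-123", ["jdoe@contoso.com"])
print(endpoint)
print(json.dumps(body, indent=2))
```

In practice, the request would be sent with an app-registration token holding the appropriate eDiscovery Graph permissions; the point of the sketch is simply that Copilot content can be treated as one more scoped source within an existing Purview case workflow.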

Finally, Microsoft Copilot will have a significant impact on how certain internal investigations are conducted, such as those involving the access and/or disclosure of confidential and sensitive information—information that might not have been adequately protected using other M365 tools.

2. Assuring data security

Permissions and data access are a significant aspect of securing your enterprise’s data. As part of Microsoft Copilot adoption, you should understand the data security risks given its potential deployment and use. At a minimum, this involves a review of current access and permissions for sensitive, confidential, and proprietary data, an assessment of risks, and consideration of risk mitigation using M365 tools for information protection. Most organizations would benefit from improving their data governance in this area, and Microsoft Copilot deployment presents a good opportunity to do so.  

3. Addressing privacy concerns

Closely related to security, consumer and user privacy should remain top of mind. Answering the following questions can help uncover privacy gaps that may otherwise have gone overlooked:

  • Does the implementation (and potential use cases) comply with privacy regulations and guidance, including laws such as GDPR and HIPAA, where geographically appropriate?
  • Is the data processing compliant with applicable privacy regulations and guidance?
  • How is the data flowing in response to prompts submitted to the AI? Are there any cross-border concerns to be aware of?
  • How is consent for data usage by Microsoft Copilot being captured and maintained?
  • Do internal privacy policies need to be updated to account for the inclusion of Microsoft Copilot in the technology ecosystem?

4. Empowering teams with knowledge

With any new technology, it’s key to identify and engage in best practices to get the most out of your investment. The same can be said for approaching legal processes, including eDiscovery, related to those tools.

There is fear of the unknown with any new technology, and in no arena has this been truer than with AI solutions. Companies should implement a change management strategy to educate all employees (including those in legal) on the basics of Microsoft Copilot and train them on prompt creation and safeguards for managing sensitive, confidential, and proprietary data. It should also address resistance and any lingering questions about your Microsoft Copilot adoption. Finally, this strategy should educate teams on the impact Microsoft Copilot has on eDiscovery and what steps the organization has taken to prepare, including updates to processes.

Foreseeing a future built on generative AI

From conversations with clients and other organizations, it’s clear that we are embarking on a truly transformative era. Setting aside Microsoft Copilot, companies are also developing proprietary tools for internal use, perhaps to address concerns (actual or perceived) about the protection of confidential data, or to meet a specific use case that could benefit from a different large language model (one that is trained, for example, on a curated and controlled data set). It will be interesting to observe the longevity and viability of proprietary tools deployed for widespread user consumption.  

As Microsoft adjusts its licensing model for Microsoft Copilot with feedback from the market and releases updates for the tool, it’s very likely we’ll see broader adoption among enterprises already using Microsoft 365. From an eDiscovery perspective, this adoption will also reflect a continued shift away from custodian-centric data toward collaborative, content-centric data, necessitating different methods for scoping preservation and collection requests. Finally, the authenticity of information—defined by ownership, access, and use—will take on greater prominence where AI-generated content is at stake. Legal and eDiscovery teams will be called upon to support these efforts.

If your organization is on the road toward adopting Microsoft Copilot or if you want to learn more about how it might affect your legal function, visit our Gen AI assessment page.

About the Author

Jamie Brown

As the Vice President of Information Governance, Jamie is responsible for leading and managing Lighthouse’s global consulting practice. She also advises clients, particularly those in heavily regulated industries, on legal and regulatory risk-mitigation strategies in connection with digital transformation, data remediation, and litigation/investigation readiness and response. A former regulator, law firm partner, and in-house counsel, Jamie brings more than twenty years of experience leading and litigating complex matters involving technology and information in federal and state courts, and responding to investigatory demands brought by the U.S. Department of Justice and U.S. and foreign regulatory bodies. She specializes in information law, pre-trial strategies, cross-border investigations, eDiscovery, and data protection.