Microsoft Copilot for M365: What You Might Not Know

March 1, 2024 | By John Collins

Artificial intelligence tools like ChatGPT and Google Bard are here to stay. Microsoft Copilot for Microsoft 365 has emerged as another player in this field, one that larger organizations making extensive use of Microsoft 365 will likely take a serious look at.

Microsoft uses ‘Copilot’ as a brand name across many of its generative AI offerings, which can certainly lead to some confusion. This article is concerned primarily with Microsoft Copilot for Microsoft 365 (Microsoft Copilot for short).

To further explore Microsoft Copilot for the enterprise, we recently hosted a webinar. Drawing from that conversation, this article covers what the tool is and some lesser-known aspects that are valuable to understand if your organization is considering implementing this powerful generative AI tool.

The basics of Microsoft Copilot for M365

Microsoft Copilot is an AI-powered productivity tool that operates across the Microsoft 365 suite of applications. It applies a large language model (LLM) to content stored in an organization’s M365 tenant, leveraging the Microsoft Graph to determine the data and files a user has access to, such as emails, documents, and chat messages. This content constitutes a rich dataset from which Microsoft Copilot can draw to create new content, or to summarize existing content and repackage it in useful ways, all within the familiar Microsoft Office applications.
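
To make the permission model concrete, here is a minimal sketch of how an application can query tenant content through the Microsoft Graph search API, which returns only items the signed-in user can access; this is the same permission-trimmed view of the Graph that Copilot grounds its responses in. Token acquisition is elided, and the query string and function name are illustrative.

```python
import requests

# Microsoft Graph search endpoint (v1.0). Auth/token acquisition is elided.
GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def search_tenant_files(access_token: str, query: str) -> list:
    """Search OneDrive/SharePoint files as the signed-in user.

    Graph trims results to items the caller is permitted to see: the same
    permission model Copilot relies on when grounding a response.
    """
    body = {
        "requests": [
            {
                # Files; other types (e.g., message) are searched separately.
                "entityTypes": ["driveItem"],
                "query": {"queryString": query},
                "from": 0,
                "size": 10,
            }
        ]
    }
    resp = requests.post(
        GRAPH_SEARCH_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    # Each element contains hitsContainers with the permission-trimmed hits.
    return resp.json()["value"]
```

Because results are trimmed per user, two colleagues issuing the same query (or the same Copilot prompt) can see different results.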

We detailed more Microsoft Copilot capabilities and readiness considerations before deployment in a recent blog post.

Beyond these fundamentals, there are some facts about Microsoft Copilot you may be unaware of that are important to understand, since the tool falls into the broader, and still somewhat misunderstood, realm of AI.

There are requirements for access

As of January 15, 2024, Microsoft has made Microsoft Copilot available to essentially everyone; this is a sharp turn from the previous requirement that organizations purchase a minimum of 300 seats. However, Microsoft still requires that target Microsoft Copilot users be licensed at the E3 level at minimum.

Then there is the not-so-trivial matter of cost. Microsoft Copilot licenses for enterprise and education customers run $30 per user, per month, almost as much as an M365 E3 license itself. This constitutes a significant expenditure that organizations must weigh when considering adoption.
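
To illustrate how quickly that figure compounds, here is a back-of-the-envelope sketch; the 1,000-seat headcount is hypothetical, and the $30 seat price is the enterprise figure cited above.

```python
COPILOT_SEAT_PRICE_USD = 30  # per user, per month (enterprise/education pricing)

def annual_copilot_cost(licensed_users: int) -> int:
    """Yearly Copilot spend, excluding the required E3/E5 base licenses."""
    return licensed_users * COPILOT_SEAT_PRICE_USD * 12

# A hypothetical 1,000-seat rollout adds $360,000 per year on top of base licensing.
print(annual_copilot_cost(1_000))  # 360000
```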

This is not ChatGPT

Microsoft Copilot is not the same thing as ChatGPT. The two solutions share certain similarities, especially in their generative capabilities. The difference lies mainly in what Microsoft Copilot references when generating its responses to user queries.

The large language models behind both ChatGPT and Microsoft Copilot were trained on massive, publicly available datasets of articles, eBooks, and other open internet sources. However, the model that powers Microsoft Copilot can also reference whatever data and content the individual user has access to within your M365 tenant. Because all Microsoft Copilot activity happens within a private tenant and with your defined user permissions intact, responses to prompts are tailored to your organization while surfacing only information individual users have permission to access.

In contrast, ChatGPT does not have access to any organization’s M365 tenant, nor can it use M365 data to train or retrain the large language model. Responses to prompts are based only on the publicly available dataset, making ChatGPT (even ChatGPT Enterprise) unable to customize responses based on your company’s data.

Microsoft Copilot isn’t immune to hallucinations

"Hallucination" is the term used to refer to inaccurate responses created by generative AI solutions. These take on many interesting, sometimes humorous forms—just do a quick search online and you’ll find some brilliant AI-generated art that defies all logic.

Humor aside, AI hallucinations are very real, and Microsoft Copilot isn’t immune. AI systems simply aren’t able to fully understand context at this point in their development, and they are constrained by the limitations of their source datasets and by possible errors or biases in their training algorithms. Practically speaking, this leads to replies that a careful human reviewer can determine aren’t factually true.

When using any generative AI solution, it’s important to closely review the output for accuracy.

Legal requirements vary by region

Data privacy regulations, including the European Union (EU) General Data Protection Regulation (GDPR), contain strict provisions about how companies handle the data they create, process, and use, including the transfer of data across geographic borders. Microsoft Copilot users in the EU have some guarantees that their data remains within the EU Data Boundary when sending prompts and receiving replies. Non-EU users have no such guarantees; traffic is routed to data centers based on capacity.

Most notable, however, are new AI-specific laws that affect the use of this technology and impose requirements on how it is governed. Of particular significance is the EU’s AI Act, which is in the final stages of approval and promises to be the world’s first comprehensive AI regulation. In the U.S., more than a dozen states have enacted or proposed legislation regulating AI use. These laws are a moving target and should inform the deployment and ongoing maintenance of any AI program.

Copilot content is discoverable

As the name suggests, generative AI tools generate new data, which can be discoverable as part of the legal process. For example, the Word document and follow-up emails you create using Microsoft Copilot are, of course, discoverable—but so are your prompts for creating those items, as well as any answers Microsoft Copilot provides even outside of generated documents, messages, and presentations.
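
For teams that need to collect this material, Copilot prompts and responses are stored in user mailboxes and can be targeted like other mailbox content through Microsoft Purview eDiscovery (Premium). The sketch below uses the Microsoft Graph eDiscovery API to create a case and a draft search; the case name is invented, and the contentQuery shown is a placeholder, since the exact query syntax for isolating Copilot interactions should be confirmed against current Purview documentation.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"

def create_copilot_search(access_token: str, custodian_upn: str) -> dict:
    """Create an eDiscovery (Premium) case and a draft search via Microsoft Graph.

    Copilot prompts and responses live in user mailboxes, so they fall
    within the scope of a normal mailbox collection.
    """
    headers = {"Authorization": f"Bearer {access_token}"}

    # 1. Create a case (the display name is illustrative).
    case = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases",
        headers=headers,
        json={"displayName": "Copilot interactions - Matter 1234"},
        timeout=30,
    )
    case.raise_for_status()
    case_id = case.json()["id"]

    # 2. Add a draft search. The contentQuery below is a placeholder;
    # verify the correct KQL for isolating Copilot interactions against
    # current Microsoft Purview documentation before relying on it.
    search = requests.post(
        f"{GRAPH}/security/cases/ediscoveryCases/{case_id}/searches",
        headers=headers,
        json={
            "displayName": f"Copilot items for {custodian_upn}",
            "contentQuery": '"Copilot"',
        },
        timeout=30,
    )
    search.raise_for_status()
    return search.json()
```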

Conclusion

The nuances of Microsoft Copilot highlight the real readiness and discovery concerns generative AI raises, including the veracity of responses, requirements for privacy and compliance, and the breadth of what is discoverable and how to handle that data.

To learn more about preparing for a generative AI implementation, visit our Gen AI Assessment page.

About the Author

John Collins

John is an Executive Director within Lighthouse’s Information Governance team and lead of the Microsoft 365 practice. John brings more than twenty years of legal, regulatory, and consulting experience to this role, having advised organizations on an array of information governance, litigation readiness, and records and information management challenges, including those that involve the use and implementation of cloud technologies. A lawyer by training, John employs a methodical yet creative approach to problem-solving, taking time to understand unique aspects of the client’s landscape and account for each and every variable as part of a future solution.

For the past seven years, John has developed a niche practice in Microsoft 365 consulting and, specifically, advising clients on how to adopt and adapt M365 technologies to meet legal and compliance requirements.