Examine how Microsoft 365 Copilot uses your proprietary business data

Microsoft provides powerful security tools within its Microsoft 365 and Azure ecosystems to help organizations tighten permissions and implement "just enough access." The policies and settings that administrators define in these tools are used not only by Microsoft 365 and Azure to help protect privacy and prevent data oversharing, but also by Microsoft 365 Copilot.

Microsoft 365 Copilot provides value by connecting Large Language Models (LLMs) to your organizational data. Microsoft 365 Copilot accesses content and context through Microsoft Graph with respect to each specific user’s access permissions. It can generate responses anchored in your organizational data, such as user documents, emails, calendar, chats, meetings, and contacts. Microsoft 365 Copilot combines this content with the user’s working context, such as the meeting a user is in now, the email exchanges the user had on a subject, or the chat conversations the user had last week. Microsoft 365 Copilot uses this combination of content and context to help provide accurate, relevant, and contextual responses.
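
The permission trimming described here is a property of Microsoft Graph itself. As an illustration only (not a depiction of Copilot's internal orchestration), the sketch below calls the standalone Microsoft Graph search API, which returns only items the signed-in user can access; the access token and query string are hypothetical placeholders.

```python
import requests

# Hypothetical delegated access token obtained through Microsoft Entra ID (for
# example, with MSAL). The query runs in the signed-in user's context, so the
# results are trimmed to content that user is permitted to view.
ACCESS_TOKEN = "<delegated-access-token>"

response = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "requests": [
            {
                "entityTypes": ["driveItem"],
                "query": {"queryString": "quarterly planning"},  # example query
            }
        ]
    },
)
response.raise_for_status()

# Only items the user has at least View access to come back in the hits.
for container in response.json()["value"][0]["hitsContainers"]:
    for hit in container.get("hits", []):
        print(hit["resource"].get("name"), "-", hit.get("summary", ""))
```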

Your prompts (inputs), Copilot's responses (outputs or results), and data accessed through Microsoft Graph:

  • Are NOT available to other customers.
  • Are NOT used to train foundation LLMs, including the LLMs used by Microsoft 365 Copilot.
  • Are NOT used to train or improve Microsoft AI models, unless your tenant administrator opts in to sharing data with Microsoft. Microsoft AI models refer to a broader category of AI models that Microsoft uses, which can include various types of models beyond just the foundation LLMs.
  • Are NOT used to train or improve any non-Microsoft products or services (such as OpenAI models).

Microsoft 365 Copilot can generate responses anchored in the customer’s business content, such as:

  • User documents
  • Emails
  • Calendar
  • Chats
  • Meetings
  • Contacts
  • Other business data

Microsoft 365 Copilot also uses your organization's proprietary business data from Microsoft 365 services to generate responses. In doing so, it only accesses data the user has permission to view within their tenant. It then personalizes each response to a user's business context.

Data access

Microsoft 365 Copilot only displays organizational data to which individual users have at least View permissions. It's important that organizations use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content.
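
To support that kind of permission review, the sketch below shows one hypothetical way an admin or auditor might list the sharing permissions on a single SharePoint or OneDrive item through Microsoft Graph. The drive ID, item ID, and access token are placeholders, and a real oversharing review would enumerate sites and items at scale.

```python
import requests

# Hypothetical values; a real review would enumerate sites, drives, and items.
ACCESS_TOKEN = "<token-with-Files.Read.All-or-Sites.Read.All>"
DRIVE_ID = "<drive-id>"
ITEM_ID = "<item-id>"

# List the sharing permissions granted on a single drive item.
resp = requests.get(
    f"https://graph.microsoft.com/v1.0/drives/{DRIVE_ID}/items/{ITEM_ID}/permissions",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()

# Each permission records the roles granted (for example, "read" or "write")
# and who holds them; overly broad grants are what an oversharing review looks for.
for perm in resp.json()["value"]:
    holder = (
        (perm.get("grantedToV2", {}).get("user") or {}).get("displayName")
        or perm.get("link", {}).get("scope")
        or "unknown"
    )
    print(holder, "->", ", ".join(perm.get("roles", [])))
```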

Microsoft 365 Copilot only surfaces cloud content from the current user's tenant. It doesn't search other tenants where the user may be a B2B guest, or tenants that are set up with cross-tenant access or cross-tenant sync.

When you enter prompts using Microsoft 365 Copilot, the information contained within your prompts, the data they retrieve, and the generated responses remain within the Microsoft 365 service boundary. This design is in keeping with Microsoft's current privacy, security, and compliance commitments.

Note

Administrators can choose to allow data to leave the compliance boundary; for example, to query public web content using Microsoft Bing.

The only input to Microsoft 365 Copilot is the user's prompt. The prompt can also include input files or content discovered and presented to the user (by the tenant's Microsoft 365 Copilot orchestrator instance) that the user selected for input to the LLM.

Note

Organizations can optionally provide customer feedback to Microsoft to improve Microsoft 365 Copilot, similar to how Microsoft uses customer feedback to improve other Microsoft 365 services and Microsoft 365 apps. However, Microsoft doesn't use this feedback to train any Large Language Models (LLMs). Customers can manage feedback through administrative controls. For more information, see Manage Microsoft feedback for your organization.

Security, privacy, and data residency

Microsoft 365 Copilot follows these foundational principles:

  • Built on Microsoft’s comprehensive approach to security, compliance, and privacy.
  • Architected to protect tenant, group, and individual data.
  • Committed to responsible AI.

Microsoft 365 Copilot uses a company's organizational content in its Microsoft 365 tenant. This data includes users’ calendars, emails, chats, documents, meetings, contacts, and more. While Microsoft 365 Copilot is already able to use the apps and data within the Microsoft 365 ecosystem, many users also depend on various external tools and services for work management and collaboration. Customers can address this gap by extending Microsoft 365 Copilot to enable users to work with their third-party tools and services. You can extend Microsoft 365 Copilot by building a plug-in or by connecting to an external data source. For more information, see Extend Microsoft 365 Copilot.
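
As a minimal sketch of the Graph connector extensibility path mentioned above, the example below creates an external connection through the Microsoft Graph connectors API. The connection ID, name, and token are hypothetical, and registering a schema and ingesting items (with access control lists) would follow as separate steps.

```python
import requests

# Hypothetical app-only token; creating connections requires the
# ExternalConnection.ReadWrite.OwnedBy application permission.
ACCESS_TOKEN = "<application-access-token>"

# Create an external connection that content from a third-party tool can be
# ingested into. Items pushed later carry their own access control lists, so
# permission trimming still applies when Microsoft 365 surfaces them.
resp = requests.post(
    "https://graph.microsoft.com/v1.0/external/connections",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "id": "projecttracker",  # example connection ID
        "name": "Project Tracker",
        "description": "Issues and tasks from an external work-tracking tool.",
    },
)
resp.raise_for_status()
print("Created connection:", resp.json()["id"])

# Next steps (not shown): register a schema for the connection, then ingest items.
```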

Microsoft 365 Copilot doesn't use OpenAI’s publicly available services. Instead, all processing is performed using Azure OpenAI services with their own separate instances of the Large Language Models. Microsoft 365 Copilot doesn't change the existing data processing and residency commitments that apply to Microsoft 365 tenants. Calls made by Microsoft 365 Copilot to the LLM are routed to the closest data centers in the region, but they can also be routed to other regions where capacity is available during high-utilization periods. For European Union (EU) users, Microsoft 365 includes additional safeguards to comply with the EU Data Boundary. EU traffic stays within the EU Data Boundary, while worldwide traffic can be sent to the EU and other regions for LLM processing.

The richness of the Microsoft 365 Copilot experience depends on the data that can be retrieved from Microsoft 365 and connected external data sources. Tenants with the most abundant data in Microsoft 365 (Exchange, OneDrive, SharePoint, Teams) get the best results from Microsoft 365 Copilot. With access to comprehensive organizational data, Microsoft 365 Copilot can suggest more relevant and personalized content based on the user’s work context and preferences.

Microsoft 365 Copilot responses

The responses that generative AI produces aren't guaranteed to be 100% factual. While Microsoft continues to improve responses, users should still use their judgment when reviewing outputs before sending them to others. Microsoft 365 Copilot provides useful drafts and summaries to help users achieve more while giving them a chance to review the AI-generated content rather than fully automating these tasks.

Microsoft continues to improve its algorithms to proactively address issues such as misinformation and disinformation, content blocking, data safety, and the prevention of harmful or discriminatory content, in line with the company's responsible AI principles.

Microsoft doesn't claim ownership of the output of the service. That said, it doesn't make a determination on whether a customer’s output is copyright protected or enforceable against other users. Why? Because generative AI systems may produce similar responses to similar prompts or queries from multiple customers. As a result, multiple customers may have or claim rights in content that is the same or substantially similar.

If a third party sues a commercial customer for copyright infringement for using Microsoft Copilots or the output they generate, Microsoft will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, as long as the customer used the guardrails and content filters Microsoft has built into its products. For more information, see Microsoft announces new Copilot Copyright Commitment for customers.

Data stored about user interactions with Microsoft 365 Copilot

When a user interacts with Microsoft 365 Copilot (using apps such as Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard), Microsoft stores data about these interactions. The stored data includes the user's prompt and Copilot's response, including citations to any information used to ground Copilot's response. Microsoft refers to the user’s prompt and Copilot’s response to that prompt as the “content of interactions.” The record of those interactions is the user’s Copilot interaction history.

For example, this stored data provides users with their Copilot interaction history in Microsoft Copilot with Graph-grounded chat and in meetings in Microsoft Teams. This data is processed and stored in alignment with the contractual commitments that apply to your organization’s other content in Microsoft 365. The data is encrypted while it's stored. It isn't used to train foundation LLMs, including those used by Microsoft 365 Copilot.

To view and manage this stored data, admins can use Content search or Microsoft Purview. Admins can also use Microsoft Purview to set retention policies for the data related to chat interactions with Copilot. For more information, see the Content search and Microsoft Purview retention documentation.

For Microsoft Teams chats with Copilot, admins can also use Microsoft Teams Export APIs to view the stored data.
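
A minimal sketch of that Teams Export API path is shown below. It assumes an application (app-only) access token, and note that these are protected APIs that require approval from Microsoft before a tenant can call them; the user ID and token are hypothetical placeholders.

```python
import requests

# Hypothetical values; the Teams Export APIs are protected APIs that require
# approval from Microsoft and an application (app-only) access token.
ACCESS_TOKEN = "<application-access-token>"
USER_ID = "<user-object-id>"

url = f"https://graph.microsoft.com/v1.0/users/{USER_ID}/chats/getAllMessages"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Page through the user's chat messages; each response includes an
# @odata.nextLink while more pages remain.
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    data = resp.json()
    for message in data.get("value", []):
        sender = ((message.get("from") or {}).get("user") or {}).get("displayName", "system")
        print(message.get("createdDateTime"), sender)
    url = data.get("@odata.nextLink")
```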

Note

Users can delete their Copilot interaction history, which includes their prompts and the responses Copilot returns. Instructions on how to delete user interactions are included in a later training module titled Manage Microsoft Copilot.

Knowledge check

Choose the best response for the following question.

1. What organizational data does Microsoft 365 Copilot display?