Examine how Microsoft 365 Copilot protects sensitive business data
The permissions model within your Microsoft 365 tenant helps ensure that data doesn't unintentionally leak between users, groups, and tenants. Microsoft 365 Copilot presents only data that each individual user can access, using the same underlying access controls as other Microsoft 365 services. Because Microsoft Graph and the Semantic Index for Copilot are also data sources, they honor the same user identity-based access boundary. As a result, the grounding process only accesses content the current user is authorized to access. For more information, see Microsoft's privacy policy and service documentation.
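To make the identity-based access boundary concrete, the following sketch builds a Microsoft Graph search request body of the kind a grounding step might issue. The endpoint and payload shape (`POST /v1.0/search/query`) are real Graph API conventions; the query string is a made-up example. The key point is that the request is sent with the *user's* delegated access token, so the service security-trims results server-side — the caller never implements the trimming itself.

```python
import json

# Real Microsoft Graph search endpoint; results are security-trimmed to
# the signed-in user identified by the bearer token on the request.
GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"

def build_search_request(query_string: str) -> dict:
    """Build a Microsoft Graph search request body over tenant content."""
    return {
        "requests": [
            {
                "entityTypes": ["driveItem", "listItem", "site"],
                "query": {"queryString": query_string},
            }
        ]
    }

# Sent as: POST {GRAPH_SEARCH_URL}
#          Authorization: Bearer <user-delegated-token>
# Graph returns only items that user can already access.
body = build_search_request("quarterly sales deck")
print(json.dumps(body, indent=2))
```

Because trimming happens in the service against the token's identity, a Copilot-style consumer of these results cannot see more than the user could find by searching manually.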
When Microsoft Purview Information Protection encrypts data, Microsoft 365 Copilot doesn't return it unless the user has at least the View usage right. You can apply encryption by using sensitivity labels or by applying restricted permissions in Microsoft 365 apps through Information Rights Management (IRM). While Microsoft 365 Copilot-generated content can inherit the most restrictive permissions or label from the source, Microsoft 365 Copilot cites the original source, which retains its protection.
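The View usage-right gate described above can be sketched as a simple filter. Everything here is a hypothetical model (the `ProtectedDocument` type and `"VIEW"` right name are illustrative, not a real SDK); it shows only the rule that encrypted content without the View right never reaches a response.

```python
from dataclasses import dataclass

@dataclass
class ProtectedDocument:
    """Hypothetical model of an IRM-protected item."""
    name: str
    usage_rights: set[str]  # rights the current user holds, e.g. {"VIEW", "EDIT"}
    content: str

def retrievable_by_copilot(doc: ProtectedDocument) -> bool:
    # Encrypted content is returned only when the user holds
    # at least the View usage right on the item.
    return "VIEW" in doc.usage_rights

docs = [
    ProtectedDocument("roadmap.docx", {"VIEW", "EDIT"}, "..."),
    ProtectedDocument("board-minutes.docx", set(), "..."),  # no rights granted
]
grounding = [d.name for d in docs if retrievable_by_copilot(d)]
print(grounding)  # only the document with the View right passes
```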
Security teams fight an asymmetric battle against well-resourced, organized, and sophisticated attackers. To protect their organizations, security practitioners must respond to threats that system noise often hides. System noise typically includes the large amounts of data and alerts generated by security systems that can make it difficult to detect real threats. Compounding this challenge is a global shortage of skilled security professionals.
Microsoft addresses these security battles on multiple fronts. In Microsoft 365, Microsoft implements multiple forms of protection to help safeguard organizations' sensitive business data and to help prevent compromise of Microsoft 365 services and applications. These protections also prevent customers from gaining unauthorized access to other tenants or the Microsoft 365 system itself. These forms of protection include:
Access controls. First, by implementing strong multifactor authentication, Microsoft 365 accounts become more resistant to credential compromise and phishing attacks. Then, to prevent lateral discovery and movement, a "Just enough access" strategy is recommended to avoid overpermissioning user access to data. Microsoft 365 Copilot only accesses data to which individual users have at least View permissions within Microsoft 365 services such as SharePoint, OneDrive, and Teams. For example, if a user doesn't have access to a confidential project folder in SharePoint, Microsoft 365 Copilot can't view that content to generate responses. For content accessed through Microsoft 365 Copilot plug-ins, encryption can exclude programmatic access, thus preventing the plug-in from accessing the content. For more information, see Configure usage rights for Azure Information Protection.
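The SharePoint example above amounts to ACL-based trimming before any content reaches a grounding step. The sketch below uses a made-up ACL model (item paths and group names are illustrative): an item is visible only when its view ACL intersects the requesting user's identity or group memberships.

```python
# Hypothetical ACL model: each item lists the principals (users or groups)
# granted at least View access. Grounding retrieves only items whose ACL
# intersects the requesting user's identity and group memberships.

def visible_items(items, user, user_groups):
    principals = {user} | set(user_groups)
    return [i["path"] for i in items if principals & set(i["view_acl"])]

sharepoint_items = [
    {"path": "/sites/Sales/Shared Documents/forecast.xlsx",
     "view_acl": {"sales-team"}},
    {"path": "/sites/Legal/Confidential/merger-draft.docx",
     "view_acl": {"legal-team"}},
]

# A sales user sees only the sales document; the confidential legal
# folder never reaches the grounding step at all.
print(visible_items(sharepoint_items, "megan@contoso.com", ["sales-team"]))
```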
Isolated instances. One of the primary benefits of cloud computing is the concept of a shared, common infrastructure across numerous customers simultaneously, leading to economies of scale. Microsoft ensures that data is logically isolated by tenant and encrypted in motion and at rest, per its commitments to enterprise customers. For example, Microsoft 365 Copilot running in Contoso's tenant can't see data from Fabrikam's tenant. Logical isolation of an organization's content within each tenant for Microsoft 365 services is achieved through Microsoft Entra authorization and role-based access control. For more information, see Microsoft 365 isolation controls.
Note
Azure Active Directory (Azure AD) is now Microsoft Entra ID. Learn more.
Encryption. Microsoft 365 uses service-side technologies that encrypt an organization's data at rest and in transit, including BitLocker, Transport Layer Security (TLS), and Internet Protocol Security (IPsec). All data transmitted between Microsoft 365 Copilot cloud components is encrypted using the latest cryptography standards like TLS 1.2. Data stored at rest is also encrypted. This design prevents unauthorized access to data in transit or storage. If a user has access to encrypted data in Dynamics 365 and Power Platform, and the user provides it to Copilot, then Copilot can access it. For specific details about encryption in Microsoft 365, see Encryption in the Microsoft Cloud.
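As an illustration of the "TLS 1.2 or later" requirement mentioned above, the standard-library snippet below configures a client-side TLS policy in Python that refuses the older TLS 1.0/1.1 protocol versions. This is a minimal sketch of the same policy idea, not Microsoft's service-side implementation.

```python
import ssl

# Create a client context with certificate verification enabled,
# then raise the floor to TLS 1.2 so TLS 1.0/1.1 handshakes are refused.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

print(ctx.minimum_version.name)  # the enforced protocol floor
```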
Compliance boundary. When you enter a prompt using Microsoft 365 Copilot, the information within your prompt, the data it retrieves, and the generated response remains within the Microsoft 365 service boundary. This design is in keeping with Microsoft's current privacy, security, and compliance commitments. However, administrators can choose to let data out of the compliance boundary; for example, to query the public web using Microsoft Bing.
No training data used. Microsoft 365 Copilot doesn't use any customer data, including prompts entered by users, to train or improve its underlying AI models. Microsoft 365 Copilot only uses the user's current context to shape responses.
Responsible AI. Microsoft adheres to principles of responsible AI development, including privacy, security, inclusiveness, and transparency. Doing so promotes the ethical treatment of sensitive data. Microsoft's commitment to comply with broadly applicable privacy laws and privacy standards, such as ISO/IEC 27018, the world’s first international code of practice for cloud privacy, reinforces an organization's control over its data.
Security measures. Microsoft 365 Copilot uses Microsoft's comprehensive security measures like threat monitoring, vulnerability assessments, and data protection controls. This design helps secure the Microsoft 365 Copilot service and infrastructure.
How Copilot protects customer data
Microsoft is uniquely positioned to deliver enterprise-ready AI. Microsoft 365 Copilot is powered by Azure OpenAI Service. It complies with Microsoft's existing privacy, security, and regulatory commitments to its customers.
Built on Microsoft's comprehensive approach to security, privacy, and compliance. Copilot is integrated in Microsoft services like Microsoft 365, Dynamics 365, and Power Platform. It inherits their security, privacy, and compliance policies and processes, such as multifactor authentication and compliance boundaries.
Multiple forms of protection safeguard organizational data. Service-side technologies encrypt organizational content at rest and in transit for robust security. Connections are safeguarded with Transport Layer Security (TLS), and data transfers between Microsoft 365, Dynamics 365, Power Platform, and Azure OpenAI occur over the Microsoft backbone network. This design ensures both reliability and safety. For more information, see Encryption in the Microsoft Cloud.
Architected to protect your data at both the tenant and the environment level. Data leakage is a significant concern for customers. Microsoft AI models aren't trained on and don't learn from your tenant data or your prompts. The only exception to this rule is when an organization's tenant administrator opts to share data with Microsoft. Within your environments, you can control access through permissions that you set up. Authentication and authorization mechanisms segregate requests to the shared model among tenants.
Important
Microsoft 365 Copilot only uses data that the user can access. In doing so, it relies on the same technology that Microsoft has used for years to secure customer data.