Examine how Microsoft 365 Copilot meets regulatory compliance mandates
As regulation in the AI space evolves, Microsoft continues to adapt and respond to meet future regulatory requirements. Microsoft 365 Copilot aligns with Microsoft's current commitments to data security and privacy in the enterprise, and those commitments remain unchanged. Microsoft 365 Copilot is integrated into Microsoft 365 and adheres to all existing privacy, security, and compliance commitments to Microsoft 365 commercial customers. For more information, see Microsoft Compliance.
Beyond adhering to regulations, Microsoft prioritizes an open dialogue with its customers, partners, and regulatory authorities to better understand and address concerns. This dialogue helps foster an environment of trust and cooperation. Microsoft recognizes that privacy, security, and transparency aren't just features but prerequisites in an AI-driven landscape.
Regulatory compliance mandates
As an AI system, Microsoft 365 Copilot must adhere to relevant laws and regulations governing artificial intelligence. Microsoft ensures Microsoft 365 Copilot complies with the following key regulatory mandates:
Data Privacy
- Microsoft reinforces customers' control over their data through its commitment to comply with broadly applicable privacy laws and privacy standards, such as ISO/IEC 27018, the world's first international code of practice for cloud privacy.
- Microsoft uses rigorous physical security, background screening, and a multi-layered encryption strategy to protect the confidentiality and integrity of customer data.
- For content accessed through Microsoft 365 Copilot plug-ins, encryption can exclude programmatic access, preventing the plug-in from reading the content. For more information, see Configure usage rights for Azure Information Protection.
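To make that last point concrete, the following sketch models how a usage-rights check might gate plug-in access to protected content. The types and function names are illustrative only and aren't a real Azure Information Protection or Microsoft 365 API; the sketch simply shows that when a label's usage rights exclude programmatic extraction, the plug-in never sees the content.

```typescript
// Hypothetical sketch only: these names model the concept of usage rights,
// not an actual Microsoft 365 or Azure Information Protection API.

type UsageRight = "VIEW" | "EDIT" | "EXTRACT" | "EXPORT" | "OWNER";

interface ProtectedDocument {
  id: string;
  usageRights: UsageRight[]; // rights granted by the protection template or label
  body: string;
}

// Return the document body only if the rights allow programmatic extraction;
// otherwise the plug-in gets nothing, mirroring how encryption can exclude
// programmatic access to labeled content.
function readForPlugin(doc: ProtectedDocument): string | null {
  return doc.usageRights.includes("EXTRACT") ? doc.body : null;
}

// Example: a document whose label grants VIEW but not EXTRACT is never
// surfaced to the plug-in.
const confidentialDoc: ProtectedDocument = {
  id: "doc-001",
  usageRights: ["VIEW"],
  body: "Quarterly forecast...",
};

console.log(readForPlugin(confidentialDoc)); // null – programmatic access excluded
```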
Transparency
- Microsoft provides detailed documentation on GitHub that explains how Microsoft 365 Copilot was designed, what its capabilities are, and its limitations in generating suggestions. For more information, see Microsoft 365 Copilot overview.
- Microsoft also provides information on the extensibility options for Copilot, such as sample code for a TypeScript-based Microsoft Graph connector on GitHub (see the sketch after this list). For more information, see Extensibility options for Microsoft 365 Copilot.
- Microsoft ensures transparency by giving users full control over Microsoft 365 Copilot suggestions: users decide whether to accept or reject each contribution Copilot makes.
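As a rough illustration of the extensibility path mentioned above, the following TypeScript sketch creates a Microsoft Graph external connection, the starting point for a Graph connector that brings external content into Microsoft 365. It assumes an app registration granted the ExternalConnection.ReadWrite.OwnedBy permission and an already-acquired access token; the connection id, name, and description are placeholder values. Refer to the linked extensibility documentation for the full, supported samples.

```typescript
// Minimal sketch of the kind of call the Microsoft Graph connector samples make:
// creating an external connection that a Graph connector can ingest content into.
// Token acquisition is omitted; the id/name/description values are placeholders.

const GRAPH_BASE = "https://graph.microsoft.com/v1.0";

async function createExternalConnection(accessToken: string): Promise<void> {
  const response = await fetch(`${GRAPH_BASE}/external/connections`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      id: "samplehr",                    // illustrative connection id
      name: "Sample HR knowledge base",  // illustrative display name
      description: "Example connection used to bring external content into Microsoft 365",
    }),
  });

  if (!response.ok) {
    throw new Error(`Failed to create connection: ${response.status}`);
  }
  console.log("External connection created:", await response.json());
}
```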
Fairness
- During its development, Microsoft evaluated Microsoft 365 Copilot using tests designed to detect demographic biases, unfair outputs, or harm. This testing helps prevent unfair treatment.
- Microsoft has an ongoing monitoring process to continuously evaluate Microsoft 365 Copilot suggestions for fairness issues. This process allows detection and correction of problems.
- Users can reject biased or unfair suggestions and report them to Microsoft.
Accountability
- Microsoft provides channels like GitHub discussions and Microsoft 365 Copilot support for giving feedback, lodging complaints, or getting assistance related to Microsoft 365 Copilot.
- Microsoft conducts regular internal reviews of Microsoft 365 Copilot with respect to compliance with regulations and ethical AI principles.
- Microsoft makes a commitment to its customers that they can use Copilot services and the output they generate without worrying about copyright claims. If customers are challenged on copyright grounds, Microsoft assumes responsibility for the potential legal risks involved. For more information, see Microsoft announces new Copilot Copyright Commitment for customers.