
Responsible AI FAQ for Find facts with Copilot (preview)

Important

Some or all of this functionality is available as part of a preview release. The content and the functionality are subject to change.

This FAQ provides answers to frequently asked questions about the AI technology used in Find facts (preview) in Microsoft Sustainability Manager. It includes key considerations and details about how the AI is used, how it was tested and evaluated, and any specific limitations.

What does it mean to generate facts with Copilot?

Organizations can upload multiple ESG-related documents to generate drafts for facts, including references to support the findings. These facts can then be linked to assessments to fulfill reporting requirements.

What can Find facts with Copilot (preview) do?

Copilot extracts information relevant to environmental, social, and governance (ESG) disclosure requirements from long-form ESG documents to draft ESG report responses. You can select up to five documents at a time from which to generate ESG disclosure responses, in the form of facts in Microsoft Sustainability Manager.

What is the intended use of this Copilot?

The intended use is to help sustainability practitioners and controllers, who are tasked with creating ESG reports, aggregate ESG information across high volumes of data and draft relevant responses that address ESG requirements.

How was the Copilot evaluated? What metrics are used to measure performance?

Copilot was evaluated through a thorough red teaming process, testing for common risks such as violence, self-harm, sexual content, and hate and fairness. In addition, Copilot's performance was measured through extensive grounded response testing, which covers the intended uses to assess accuracy and the extent to which responses are grounded in reference data.

What are the limitations of the Copilot? How can users minimize the impact of such limitations when using the system?

Finding facts in documents is currently limited to five documents at a time. Additionally, only three file types are currently supported: PDF, DOCX, and TXT. Finally, Copilot can only recognize and search text at this time; any images contained within a document aren't supported.

What operational factors and settings allow for effective and responsible use of Copilot?

Users with permissions to read and write to External Reporting Settings can enable and disable Copilot. Copilot also includes the following disclosure messaging:

  • Any fact found with Copilot might be incorrect and should be reviewed for accuracy.

  • This fact has been generated using AI. This feature is in preview. Always review for accuracy before submitting for approval.

  • Use Copilot to start finding facts from your documents. By using this AI feature, which is available through the Azure OpenAI Service, you agree that data may be stored and processed outside of your environment's geographic region, unless service specific terms or product documentation states otherwise. Previews of generative AI features, including Copilot, within Microsoft Sustainability Manager are subject to the Microsoft Generative AI Services section of the Universal License Terms for Online Services in the Product Terms.

How do I provide feedback on Copilot in external reporting?

Users can provide feedback or ask questions on the Microsoft Cloud for Sustainability Community portal.