Microsoft Security Copilot Frequently Asked Questions

General information

What is Microsoft Security Copilot?

Microsoft Security Copilot is an AI cybersecurity product that enables security professionals to respond to threats quickly, process signals at machine speed, and assess risk exposure in minutes. 

How is Microsoft Security Copilot different from other AI security products?

Security Copilot combines advanced GPT-4 models from OpenAI with everything that Microsoft brings to the table, including hyperscale infrastructure, cyber-specific orchestration, Microsoft Security's unique expertise, global threat intelligence, and comprehensive security products.

It's currently the only security AI solution built using Microsoft’s unique relationship with OpenAI, giving customers access to the latest and most advanced large language models (LLMs) and Microsoft’s hyperscale AI infrastructure.

Microsoft is in a unique position to transform security for our customers, not only because of our investments in AI, but also because we offer end-to-end security, identity, compliance, and much more across our portfolio. We're optimized to cover more threat vectors and deliver value with a coordinated experience.

What are the use cases and capabilities that Security Copilot unlocks for customers?

Security Copilot can be used for investigating and remediating security threats, building and reverse-engineering scripts, exploring risks and posture management, troubleshooting IT issues, creating and managing policies, building lifecycle workflows, and reporting to stakeholders. These use cases are applicable to various tasks for various roles in the security team. Read Security Copilot use cases for security and IT roles to delve deeper into how these use cases can benefit CISOs, threat intelligence analysts, IT admins, and the like.

Does Microsoft Security Copilot work with other Microsoft products?

Yes. Security Copilot works with other Microsoft Security products. These products include, but aren't limited to:

  • Azure Firewall
  • Microsoft Defender Attack Surface Management
  • Microsoft Defender for Cloud
  • Microsoft Defender Threat Intelligence
  • Microsoft Defender XDR
  • Microsoft Intune
  • Microsoft Purview
  • Microsoft Sentinel

Security Copilot can access data from these products and provide an assistive Copilot experience to increase the effectiveness and efficiency of security professionals using those solutions. For example, capabilities such as script analysis enable customers to analyze hundreds of lines of code and interpret them via natural language in minutes. This capability drastically surpasses even advanced analyst skills in terms of both speed and expertise. Security Copilot helps security professionals discover risks earlier, respond to them with greater guidance, and remain on top of vulnerabilities in the evolving threat landscape.

How does Security Copilot make Microsoft Defender XDR and Microsoft Sentinel better?

Microsoft Defender XDR and Microsoft Sentinel become even more powerful when security professionals use Security Copilot. Security Copilot delivers an experience that enriches and builds on the security data, signals, and existing incidents and insights sourced from Microsoft Defender XDR and Microsoft Sentinel. The new, embedded experience in Microsoft Defender XDR supercharges security teams with generative AI capabilities to take their efficiency to a new level for the following set of powerful use cases:

  • Respond to threats at the speed of AI with assisted incident investigation and response. With the embedded experience in Defender, Security Copilot provides summaries for active incidents, actionable step-by-step guidance for incident response, and complete post-response activity reports, all in seconds and at the click of a button.

  • Scale advanced tasks to all skill levels. Security Copilot enables defenders at all skill levels to discover threats and vulnerabilities across multiple threat vectors with ease. The solution reasons in real time across security data and delivers an accessible way to perform advanced tasks using natural language.

  • Perform malicious code analysis in real time. Previously, malware analysis and reverse-engineering were limited to advanced responders. With Security Copilot, customers can analyze complex command line scripts and translate them into easily comprehensible natural language to help analysts understand actions and motivations of attackers.

  • Apply threat intelligence into your investigation workflows with ease. With Security Copilot, users can gain structured and contextualized insights into emerging threats, attack techniques, and whether an organization is exposed to a specific threat. Security Copilot helps prevent exposure to activity group campaigns and respond to incidents with greater guidance.

Does Security Copilot replace Microsoft Defender XDR and Microsoft Sentinel?

No, Security Copilot doesn't replace Microsoft Defender XDR or Microsoft Sentinel. Security Copilot assists security professionals in their day-to-day work, providing an upskilling experience and increased efficiency. It adds value on top of Microsoft Defender XDR and Microsoft Sentinel.

Does Security Copilot include access to Microsoft Defender Threat Intelligence (Defender TI)?

Yes*. When prompted, Security Copilot reasons over all content and data in Microsoft Defender Threat Intelligence (Defender TI) to return crucial context around activity groups, tooling, and vulnerabilities. Customers also have tenant-level Defender TI premium workbench access, enabling them to access Defender TI's full breadth of intelligence (intel profiles, threat analysis, internet data sets, and more) to do a deeper dive into the content surfaced in Security Copilot.

*This access doesn't include the Defender TI API, which remains separately licensed.

When is Security Copilot generally available?

Security Copilot will be generally available for purchase on April 1, 2024.

Who are the intended users of Security Copilot?

SOC analysts, compliance analysts, and IT admins are the intended users of Security Copilot.

What languages are supported?

Security Copilot supports multiple languages. The model is available in eight languages* and the user experience is available in 25 languages.** 

*Model: English, German, Japanese, Spanish, French, Italian, Portuguese, and Chinese

**UX: The above languages plus Korean, Dutch, Swedish, Polish, Norwegian, Turkish, Danish, Finnish, and more.

For more information, see Supported languages.

Will Early Access Program (EAP) customers receive GA features on April 1, and any other features that are added before their EAP agreement ends?

Yes, EAP customers will receive the features that are in the GA product and any other feature updates that occur during their EAP agreement time.

What happens to customers that are participating in the Early Access Program at GA?

Customer access under the Early Access Program will end six months after the purchase date. The contractual terms applicable to customers' use and consumption of the product under the Early Access Program apply during this period. After this period and after GA, Security Copilot will be available for purchase and consumption under Microsoft's standard contracting channels. A migration plan is in place to support customers migrating from EAP to GA and to ensure that all their information is carried over to the GA product.

When an EAP customer decides to purchase the GA product within 90 days of EAP expiry, Customer Data will remain available.

If an EAP customer decides not to purchase the GA product within 90 days of EAP expiry, their EAP Customer Data is deleted in accordance with our data retention policies.

Is Customer Data used to train Azure OpenAI Service foundation models?

No, Customer Data isn't used to train Azure OpenAI Service foundation models, and this commitment is documented in our Product Terms. For more information on data sharing in the context of Security Copilot, see Privacy and data security.

What is the GDPR Guidance for EU Markets?

Microsoft complies with all laws and regulations applicable to its provision of the Products and Services, including security breach notification laws and Data Protection Requirements (as defined in the Microsoft DPA). However, Microsoft isn't responsible for compliance with any laws or regulations applicable to Customer or Customer's industry that aren't generally applicable to information technology service providers. Microsoft doesn't determine whether Customer's data includes information subject to any specific law or regulation. For more information, see Microsoft Products and Services Data Protection Addendum (DPA).

Are US Government Cloud (GCC) customers eligible?

GCC isn't available at GA. At this time, Security Copilot isn't designed for customer usage with US Government clouds, including, but not limited to, GCC, GCC High, DoD, and Microsoft Azure Government. While the technical path for the Microsoft Sentinel connector works, the tenant is unable to access 75% of the product features because the Defender interfaces and data live within Microsoft Azure Government, with which Security Copilot isn't integrated.

Are US and Canada health care customers eligible?

US and Canada HLS customers are eligible to purchase Security Copilot. Microsoft Security Copilot is now listed and covered by a Business Associate Agreement ("BAA"), which is important to healthcare providers who are subject to regulations under HIPAA. Additional information on compliance offerings currently covered for Microsoft Security Copilot can be found in the Service Trust Portal.

How do I export or delete data from Security Copilot?

You will need to contact support. For more information, see Contact support.

What is the difference between ChatGPT and Security Copilot?

ChatGPT and Security Copilot are both artificial intelligence (AI) technologies that were developed with the intent of helping users accomplish tasks and activities faster and more efficiently. While they might seem similar, there are significant differences between the two.

ChatGPT is a natural language processing technology. It uses machine learning, deep learning, natural language understanding, and natural language generation to answer questions or respond to conversations. ChatGPT is trained on data from the Internet, uses prompts from users to aid in prompt engineering and model adjustments, and is limited to three concurrent plugins.

Security Copilot is a natural language, AI-powered security analysis tool designed to help organizations defend against threats at machine speed and scale. Security Copilot is built on OpenAI technology and is designed and engineered as an enterprise cyber AI from the ground up. The platform works off customer-connected plugins and Microsoft's global threat intelligence as grounding data. Entered prompts don't inform the model or prompt engineering unless submitted by the customer for review.

A key difference between ChatGPT and Security Copilot is what the systems are designed to accomplish. Microsoft Security Copilot is designed for posture management, incident response, and reporting. The solution draws insights from security signals aggregated from plugins, while ChatGPT works like a chatbot designed to hold a conversation with a user.

Security Copilot has access to up-to-date information from threat intelligence and draws insights from plugins so that security professionals are better equipped to defend against threats. Microsoft Security Copilot doesn't always get everything right, and as with all AI tools, responses can contain mistakes. The built-in feedback mechanism gives users control in helping improve the system.

Purchase information

How can customers purchase at GA?

Security Copilot is available across all channels: EA, MCA-E, CSP, Buy Online, and legacy Web Direct.

Are there any prerequisites to purchase?

An Azure subscription and Microsoft Entra ID (formerly known as Azure Active Directory) are prerequisites for using Security Copilot; there are no other product prerequisites. For more information, see Get started with Security Copilot.

Technical and product questions

Do customers receive onboarding support?

Upon purchase, customers receive easy access to documentation, videos, and blogs.

Where can I find more information on Data Protection and Privacy?

You can learn more at the Microsoft Trust Center.

What are the Compliance Offerings for Microsoft Security Copilot?

Microsoft Security Copilot is dedicated to upholding the highest standards of security, privacy, and operational excellence, as demonstrated by its extensive array of industry certifications. These include ISO 27001 for information security management, ISO 27018 for the protection of personal data in the cloud, ISO 27017 for cloud-specific security controls, and ISO 27701 for privacy information management.

Additionally, Security Copilot holds certifications for ISO 20000-1 in IT service management, ISO 9001 in quality management, and ISO 22301 in business continuity management. It also complies with SOC 2 requirements for security, availability, and confidentiality, underscoring our commitment to delivering secure and reliable services. For healthcare-related services, Security Copilot is certified under the HITRUST CSF framework, further enhancing its security and compliance stance, and is covered by HIPAA Business Associate Agreements (BAA), ensuring adherence to healthcare regulations and the protection of sensitive health information.

Additional information on compliance offerings currently covered for Microsoft Security Copilot can be found in the Service Trust Portal.

Is deployed Microsoft Entra ID (formerly known as Azure Active Directory) a requirement for Security Copilot?

Yes. Security Copilot is a SaaS application and requires Microsoft Entra ID to authenticate the users who have access.

Does the Microsoft Defender XDR and Microsoft Sentinel integration cover stored data or only alerts and incidents (notable events)?

Security Copilot covers both alerts/incidents and stored data. The product retrieves data from advanced hunting tables in Microsoft Defender XDR and top data tables in Microsoft Sentinel.

Does Security Copilot support tenant or subscription transfers?

No, at this time Security Copilot doesn't support moving Security Copilot resources across Microsoft Entra tenants or between subscriptions.

What partner tools are integrated with Security Copilot?

Customers can use ISV-developed third-party plugins such as Cyware, Netskope, SGNL, Tanium, and Valence Security in a Public Preview capacity. Microsoft-developed third-party plugins such as CIRC.lu, CrowdSec, GreyNoise, and URLScan are also available. More plugins will be added in the future.

Note

Products that integrate with Security Copilot need to be purchased separately.

Can Security Copilot isolate machines using Microsoft Defender for Endpoint and Microsoft Intune? Can you customize and/or block individual IOCs?

Security Copilot can't isolate machines. It can recommend to security admins that certain machines be isolated, but Security Copilot doesn't take that action itself. You can use Security Copilot to evaluate whether individual IOCs are present in the environment; however, Security Copilot doesn't automatically block them. Automation may come at a later date, but for now Security Copilot doesn't take remediation action on its own.

Is Security Copilot IPv6 aware?

There's a capability called Get Web Components by IP Address that currently supports only IPv4.

Does Security Copilot make recommendations for IoT/OT scenarios?

No, Security Copilot doesn't currently support IoT/OT.

Does Security Copilot offer dashboarding, or can you only investigate single events?

Security Copilot doesn't provide dashboarding; however, you can query multiple incidents across Microsoft Sentinel. As a baseline, it can provide a visualization of an attack path.

Can Security Copilot execute workflows, from triaging, to using pinned messages, to governing how the customer should label an incident and whether it should be closed?

No, workflows are currently not supported in Security Copilot.

What role-based access control or delegation features does Security Copilot have? How are user permissions kept in Security Copilot aligned to user permission configurations in other solutions?

Security Copilot uses "admin on behalf of" (AOBO) rights for the user that is logged in. For more information, see Understand authentication.

Why does Security Copilot transfer data to a Microsoft tenant?

Security Copilot is a SaaS (Software as a Service) offering that runs in the Azure production tenant. Users enter prompts, and Security Copilot provides responses based on insights sourced from other products such as Microsoft Defender XDR, Microsoft Sentinel, and Microsoft Intune. Security Copilot stores a user's past prompts and responses, which the user can access through the in-product experience. Data from a customer is logically isolated from the data of other customers. This data doesn't leave the Azure production tenant and is stored until customers ask to delete it or offboard from the product.

How is the transferred data secured in transit and at rest?

The data is encrypted both in transit and at rest as described in the Microsoft Products and Services Data Protection Addendum.

How is the transferred data protected from unauthorized access and what testing was done for this scenario?

By default, no human users have access to the database, and network access is restricted to the private network where the Security Copilot application is deployed. If a human needs access to respond to an incident, the on-call engineer needs elevated access and network access approved by authorized Microsoft employees.

Apart from regular feature testing, Microsoft also completed penetration testing. Microsoft Security Copilot complies with all Microsoft privacy, security, and compliance requirements.

In "My Sessions" when an individual session is deleted, what happens to the session data?

Session data is stored for runtime purposes (to operate the service) and also in logs. In the runtime database, when a session is deleted via the in-product UX, all data associated with that session is marked as deleted and the time to live (TTL) is set to 30 days. After that TTL expires, queries can't access that data, and a background process physically deletes it. In addition to the live runtime database, there are periodic database backups. The backups age out on a short retention period (currently set to four days).

Logs, which contain session data, aren't affected when a session is deleted via the in-product UX. These logs have a retention period of up to 90 days.
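The soft-delete lifecycle described above (mark as deleted, 30-day TTL, background purge) can be sketched roughly as follows. This is an illustrative sketch, not the actual Security Copilot implementation; the class, field, and method names are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

TTL_DAYS = 30  # TTL applied when a session is deleted via the in-product UX


@dataclass
class SessionRecord:
    # Hypothetical runtime-database record; names are illustrative only.
    session_id: str
    data: dict
    deleted_at: Optional[datetime] = None

    def mark_deleted(self, now: datetime) -> None:
        # Soft delete: the record is marked, not physically removed.
        self.deleted_at = now

    def visible_to_queries(self) -> bool:
        # Assumption: queries stop seeing a session once it's marked deleted.
        return self.deleted_at is None

    def purgeable(self, now: datetime) -> bool:
        # A background process physically deletes the data after the TTL expires.
        return (
            self.deleted_at is not None
            and now >= self.deleted_at + timedelta(days=TTL_DAYS)
        )
```

A background job would periodically scan for records where `purgeable()` is true and physically delete them; database backups age out on their own, shorter retention window.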

What Product Terms apply to Security Copilot? Is Security Copilot a "Microsoft Generative AI Service" within the meaning of Microsoft's Product Terms?

The following Product Terms govern Security Copilot customers:

  • The Universal License Terms for Online Services in the Product Terms, which include the Microsoft Generative AI Services terms and the Customer Copyright Commitment.

  • Privacy & Security Terms in the Microsoft Product Terms, which include the Data Protection Addendum.

Security Copilot is a Generative AI Service within the definition of the Product Terms. Additionally, Security Copilot is a "Covered Product" for purposes of the Customer Copyright Commitment. At this time, in the Product Terms there are no product-specific terms unique to Security Copilot.

In addition to the Product Terms, customers' agreements with Microsoft (for example, MBSA/EA and MCA agreements) govern the parties' relationship. If a customer has specific questions about its agreements with Microsoft, engage the CE, the deal manager, or the local CELA supporting the deal.

The Microsoft Customer Copyright Commitment is a new commitment that extends Microsoft's existing intellectual property indemnity support to certain commercial Copilot services. The Customer Copyright Commitment applies to Security Copilot. If a third party sues a commercial customer for copyright infringement for using Microsoft's Copilots or the output they generate, Microsoft will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, provided that the customer used the guardrails and content filters built into our products.

Can Security Copilot customers opt out of Azure OpenAI Service abuse monitoring? Does Security Copilot engage in any content filtering or abuse monitoring?

Azure OpenAI abuse monitoring is currently disabled service-wide for all customers.

Does Security Copilot make any location of data processing or data residency commitments?

  1. Location of data processing: At this time, Security Copilot doesn't make any contractual location of data processing commitments. Under the Product Terms and by using a Microsoft Generative AI Service, Security Copilot customers agree that their data may be processed outside of their tenant's geographic region. However, customer administrators can select the location for prompt evaluation. While Microsoft recommends allowing prompt evaluation anywhere with available GPU capacity for optimal results, customers may select from four regions to have their prompts evaluated solely in those regions.

    Currently Available Regions for prompt evaluation:

    • Australia (ANZ)
    • Europe (EU)
    • United Kingdom (UK)
    • United States (US)
  2. Location of data storage (data residency): At this time, Security Copilot doesn't make contractual data storage/residency commitments. If a customer isn't opted into data sharing*, Security Copilot stores data at rest in the home Geo of the tenant.

    For example, for a customer tenant whose home is in Germany, Security Copilot stores Customer Data in “Europe” as the designated Geo for Germany.

    *If a customer opts into data sharing, Customer Data such as prompts and responses are shared with Microsoft to enhance product performance, improve accuracy, and address response latency. When this event occurs, Customer Data such as prompts can be stored outside of the tenant Geo.

Is Security Copilot a Microsoft EU Data Boundary service?

At the time of GA, all Microsoft Security Services are out of scope for EU data residency requirements and Security Copilot won't be listed as an EUDB service.

Where is EU customer data stored?

Security Copilot stores Customer Data and Personal Data such as user prompts and Microsoft Entra Object IDs in the tenant Geo. If a customer provisions their tenant in the EU and isn't opted in to data sharing, all Customer Data and pseudonymized personal data are stored at rest within the EU. Processing of Customer Data and Personal Data prompts can occur in the designated Security GPU Geo. For more information on Security GPU geography selection, see Get Started with Security Copilot. If a customer is opted in to data sharing, prompts can be stored outside of the EU Data Boundary. For more information on data sharing, see Privacy and data security in Microsoft Security Copilot.

Are customer prompts (such as input content from the customer) considered Customer Data within the terms of the DPA and the Product Terms?

Yes, customer prompts are considered Customer Data. Under the Product Terms, customer prompts are considered Inputs. Inputs are defined as "all Customer Data that Customer provides, designates, selects, or inputs for use by a generative artificial intelligence technology to generate or customize an output".

Is "Output Content" considered Customer Data within the terms of the DPA and the Product Terms?

Yes, Output Content is Customer Data under the Product Terms.

Is there a transparency note or transparency documentation for Security Copilot?

Yes, the Responsible AI transparency document can be found here: Responsible AI FAQ.

How is Security Copilot dealing with a "token limit"?

Large language models (LLMs) including GPT have limits on how much information they can process at once. This limit is known as a "token limit" and roughly correlates to 1.2 words per token. Security Copilot uses the latest GPT models from Azure OpenAI to ensure it can process as much information as possible in a single session. In some cases, large prompts, long sessions, or verbose plugin output may overflow the token space. When this happens, Security Copilot attempts to apply mitigations to ensure an output is always available, even if the content in that output isn't optimal. Those mitigations aren't always effective, and it might be necessary to stop processing the request and direct the user to try a different prompt or plugin.
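As a rough illustration of the budget check described above, the sketch below estimates token usage with the FAQ's ~1.2 words-per-token heuristic. The budget value and function names are assumptions for illustration, not actual Security Copilot limits.

```python
TOKEN_BUDGET = 8_000  # hypothetical context-window budget, not a real product limit


def estimate_tokens(text: str) -> int:
    # Heuristic from the FAQ: roughly 1.2 words per token, so tokens ~ words / 1.2.
    words = len(text.split())
    return round(words / 1.2)


def fits_budget(prompt: str, plugin_output: str, budget: int = TOKEN_BUDGET) -> bool:
    # A request overflows the token space when the prompt plus verbose plugin
    # output exceed the model's budget; a product would then apply mitigations
    # (for example, trimming plugin output) before failing the request.
    return estimate_tokens(prompt) + estimate_tokens(plugin_output) <= budget
```

A real tokenizer (such as the one matching the deployed GPT model) would give exact counts; the word-count heuristic is only a quick approximation.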

The Azure OpenAI Service code of conduct includes "Responsible AI Mitigation Requirements". How do those requirements apply to Security Copilot customers?

These requirements don't apply to Security Copilot customers because Security Copilot implements these mitigations.

Partner information

What are the use cases for Partners?  

Partners can provide signals or build complementary solutions around Security Copilot scenarios.

I work with a managed security service provider (MSSP). Can they use and manage Security Copilot on my behalf?

Yes, MSSPs that provide SOC services for customers can access the customer's Security Copilot environment if the customer elects to provide access ("Bring Your Own MSSP").

Note

Available in the standalone portal only with limited capability.

Onboarding and managing MSSP access to your (customer) tenant is offered via Guest Access (B2B) and GDAP. There currently isn't a CSP or reseller multitenant model for MSSPs. Each customer is responsible for purchasing their own SCUs and setting up their MSSPs with the necessary access.

Can MSSPs use a single instance of Security Copilot to manage multiple tenants?

Security Copilot doesn't support prompting across multiple tenants. Instead, MSSPs can use Tenant Switching and target one customer tenant, using supported delegated access options.

MSSPs can manage multiple customer tenants within Security Copilot by using Tenant Switching from their MSSP tenant. Tenant Switching allows partners to switch between tenants and secure one tenant at a time with prompts and queries by selecting the customer tenant in a dropdown. Customers can also add the TenantID (GUID) to the Security Copilot session URL as a query string parameter. Customer tenants can be selected based on delegated access to the customer tenant. Security Copilot runs under the user context, so the partner can only access what the delegated account has been given access to in the customer tenant.
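Appending the TenantID to a session URL as a query string parameter might look like the sketch below. The base URL and the parameter name (`tenantId`) are assumptions for illustration; check the product documentation for the exact format.

```python
from urllib.parse import urlencode, urlparse


def with_tenant(session_url: str, tenant_id: str) -> str:
    # Append the customer TenantID (GUID) as a query string parameter,
    # preserving any query string already present on the session URL.
    separator = "&" if urlparse(session_url).query else "?"
    return f"{session_url}{separator}{urlencode({'tenantId': tenant_id})}"
```

For example, `with_tenant("https://securitycopilot.microsoft.com/sessions/abc", "<customer-tenant-guid>")` would target that customer tenant, assuming the delegated account has access to it.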

Are there third-party integrations available today?

We're working with ISVs like Cyware, Netskope, SGNL, Tanium, and Valence Security to release their plugins in Public Preview and continue to build more integrations with the rest of the Partner ecosystem.

Is there a marketplace for the plugins or services? 

There isn't a plugin marketplace at GA. ISVs can publish their solutions to GitHub. At GA, all partners are required to publish their solutions or managed services to the Microsoft Commercial Marketplace. More information on publishing to the marketplace can be found here:

Publish solution to the Microsoft Commercial Marketplace:

MSSP Specific: Must have a security designation in the Microsoft AI Cloud Partner Program.

SaaS Specific:

What about MSSPs who are also participating in Security Copilot EAP as a customer? 

MSSPs who are part of EAP will continue to have access to Security Copilot until the EAP six-month agreement ends. If an MSSP uses the same tenant for internal and customer-managed SOC services, they need to purchase a capacity plan and ensure that supported delegated access models are enabled through the managed SOC tenant. If an MSSP uses a tenant only for customer-managed SOC services, and that tenant isn't used for internal SOC services, the MSSP needs to have the tenant manually provisioned for Security Copilot by Microsoft; in that case, no capacity plan purchase is needed for provisioning of Security Copilot.

What if MSSPs aren't using Microsoft Defender XDR or Microsoft Sentinel? 

Microsoft Security Copilot doesn't have any specific Microsoft security product requirement for provisioning or use, since the solution is built on aggregating data sources from both Microsoft and third-party services. That said, there's significant value in having Microsoft Defender XDR and Microsoft Sentinel enabled as supported plugins for enriching investigations. Security Copilot only uses skills and accesses data from enabled plugins.

Does an MSSP's SOC solution need to be hosted on Azure?

It's recommended that the solution be hosted on Azure, but it's not required.

Is there a product roadmap that can be shared with Partners? 

Not at this time.