Security Copilot makes it possible to enrich and triage alerts automatically using generative AI. Microsoft recently developed new SOC automation playbooks that accelerate AI-assisted triage, built on Security Copilot and Microsoft Sentinel.

Since the launch of Microsoft Sentinel, automation has primarily been based on Azure Logic Apps as part of the SOAR solution, built on top of Microsoft Defender XDR and Sentinel. Microsoft has recently improved the integration between Azure Logic Apps and Security Copilot, enabling automatic submission of prompts or promptbooks to enrich the alerting process with incident data.

For background, an earlier blog covered Security Copilot onboarding: How to onboard and getting started with Copilot for Security

SOAR in Defender/ Sentinel

Many of you have probably heard the term SOAR, but for those who are unfamiliar with it, SOAR stands for Security Orchestration, Automation, and Response. Sentinel delivers the SOAR framework through two components:

  • Automation Rules
  • Playbooks

Automation rules define a trigger and a set of conditions, and then perform actions when those conditions are met. A simple action could be changing the incident status to Active and adding tags. Automation rules can also run actions outside Sentinel itself by triggering playbooks.

As already mentioned, the automation logic lives in the Logic App playbook. A playbook is a Logic App that typically starts from a Sentinel trigger (incident, alert, or entity). The Logic App then runs automated actions, such as enriching data from third-party source providers, or custom automation to auto-enrich or auto-close the alert based on decisions or data matching (like GenAI analysis). In this blog, we will explore enriching Defender XDR/ Sentinel alerts with the Security Copilot playbooks to bring the incident data together with the AI analysis.

Good to know: when you use SOAR within Microsoft Sentinel, automation rules and playbooks are both still needed. Automation rules link analytic rules to playbooks and act as the mechanism that triggers the playbooks automatically.
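To make the linkage concrete, the sketch below creates such an automation rule through the Azure REST API, configured to trigger a playbook on every new incident. This is a minimal illustration, not part of Microsoft's playbook; the subscription, workspace, tenant, and playbook names are placeholders, and the schema/api-version should be checked against the current Microsoft.SecurityInsights documentation.

```python
# Minimal sketch: automation rule that triggers a playbook on incident creation.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, WS = "<subscription-id>", "<resource-group>", "<sentinel-workspace>"
RULE = "copilot-triage-rule"  # hypothetical rule name

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.OperationalInsights/workspaces/{WS}"
    f"/providers/Microsoft.SecurityInsights/automationRules/{RULE}"
)

body = {
    "properties": {
        "displayName": "Run Security Copilot triage playbook",
        "order": 1,
        "triggeringLogic": {
            "isEnabled": True,
            "triggersOn": "Incidents",   # fire on incidents...
            "triggersWhen": "Created",   # ...when they are created
        },
        "actions": [{
            "order": 1,
            "actionType": "RunPlaybook",
            "actionConfiguration": {
                # Resource ID of the Logic App (playbook) to run
                "logicAppResourceId": (
                    f"/subscriptions/{SUB}/resourceGroups/{RG}"
                    "/providers/Microsoft.Logic/workflows/Copilot-Triage"
                ),
                "tenantId": "<tenant-id>",  # placeholder
            },
        }],
    }
}

resp = requests.put(url, params={"api-version": "2023-02-01"},
                    headers={"Authorization": f"Bearer {token.token}"}, json=body)
resp.raise_for_status()
```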

Security Copilot example

Microsoft developed a Security Copilot playbook that automates the following enrichments:

  • Rapid analysis of the process and associated command-line elements, presenting the output
  • Retrieves Intune data to bring in a summary of OS information, compliance status, and hardware information
  • Retrieves Entra ID user details to reduce manual triage effort
  • Receives external input from AbuseIPDB to enrich the alert with IP reputation data
  • Streamlines incident management with auto-commenting and real-time updates of the automated AI-driven investigation
  • Create an incident summary including the following:
    • Incident overview
    • Incident description
    • Analysis of incident entities
    • Possible mitigation steps
    • Conclusion of the incident

The playbook starts from the Sentinel incident and loops through all entities, running Security Copilot prompts to analyze and enrich the incident. All Security Copilot-generated results are added as comments on the incident (and automatically synced to Defender XDR).
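Conceptually, the playbook flow looks like the following Python sketch. The real implementation is a Logic App; submit_copilot_prompt() below is a hypothetical stand-in for the Security Copilot connector's prompt action, while the entity and comment calls map to documented Microsoft Sentinel REST endpoints (placeholders throughout):

```python
# Conceptual sketch of the playbook loop: fetch entities, prompt Copilot,
# write results back as incident comments.
import uuid
import requests
from azure.identity import DefaultAzureCredential

INCIDENT = (
    "https://management.azure.com/subscriptions/<sub>/resourceGroups/<rg>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
    "/providers/Microsoft.SecurityInsights/incidents/<incident-id>"
)
token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
headers = {"Authorization": f"Bearer {token.token}"}
params = {"api-version": "2023-02-01"}

def submit_copilot_prompt(prompt: str) -> str:
    # Hypothetical stand-in: in the Logic App this is the Security Copilot
    # connector's prompt-submission action.
    return f"[Copilot analysis placeholder for prompt: {prompt}]"

# List the entities attached to the incident
entities = requests.post(f"{INCIDENT}/entities", headers=headers,
                         params=params).json().get("entities", [])

for entity in entities:
    kind = entity.get("kind")  # e.g. Ip, Account, Host, Process
    analysis = submit_copilot_prompt(f"Analyze this {kind} entity: {entity}")

    # Post the Copilot output as an incident comment (synced to Defender XDR)
    requests.put(f"{INCIDENT}/comments/{uuid.uuid4()}", headers=headers,
                 params=params,
                 json={"properties": {"message": analysis}}).raise_for_status()
```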

More information about the Microsoft-created Logic App: Boost SOC automation with AI: Speed up incident triage with Security Copilot and Microsoft Sentinel


Playbook requirements

Prerequisites

Before deploying the Logic App, the following prerequisites need to be in place:

  • The user or service principal deploying this Logic App should have the Contributor role on the Azure Resource Group where the Logic App will be hosted
  • Microsoft Security Copilot should be enabled in the tenant
  • The user should have access to Microsoft Security Copilot to submit prompts by authenticating to the Microsoft Copilot for Security connector within the Logic App
  • Microsoft Sentinel is configured and generates incidents automatically
  • An AbuseIPDB API key to perform IP address reputation analysis

Security Copilot prerequisites

The following Security Copilot skills are used as part of the playbook flow and need to be enabled in Security Copilot:

(The AbuseIPDB lookup is performed via the AbuseIPDB skill.) This means Security Copilot initiates the AbuseIPDB API call, rather than the Logic App calling the API directly.

  • ProcessAnalyzer – Scrutinizes process names and command lines, providing detailed insights into potentially malicious activities
  • GetEntraUserDetails – Retrieves comprehensive user information from Microsoft Entra ID
  • GetIntuneDevices – Facilitates the extraction of device details from Intune, ensuring that all devices associated with an incident are thoroughly examined
  • AbuseIPDB – Performs IP address reputation checks, helping to identify and mitigate threats from suspicious IP addresses

Setting up Microsoft Security Copilot

If Security Copilot is already configured, you can skip this part and go directly to the Logic App configuration.

Microsoft Copilot for Security has been generally available since April 1, 2024. It can be used from the standalone experience or embedded in Defender XDR as part of the integration with security.microsoft.com.

Prerequisites

First of all, the following prerequisites need to be in place before we can enable Microsoft Copilot for Security:

  • Azure subscription
  • Azure Owner or Contributor role (at the resource group level)

Cost

As mentioned, Microsoft Copilot for Security is not free; its cost model is based on the resources used. Good to know: it is not part of Copilot for Microsoft 365, which is a different service. Copilot for Microsoft 365 charges a fixed monthly fee, whereas Copilot for Security uses consumption-based, pay-as-you-go pricing based on Security Compute Units (SCUs).

Security Compute Units are the resources required for consistent performance of Microsoft Copilot for Security. Each SCU is billed per hour, and you can increase or decrease the count at any time. Billing is calculated hourly with a minimum of one hour, meaning you pay for at least one hour after enabling Copilot for Security with one SCU.

Based on evaluation activities, a good starting number is at least 2 units. However, this depends on the environment—using performance insights, it’s perfectly fine to start with just 1 unit and review the performance data to determine the optimal number of units. In my experience, 1 unit is too low for more advanced prompts, though it’s fine for evaluation purposes. For production environments, the recommendation is to use a minimum of 3 units for simple prompts. When necessary, you can configure flows via Logic Apps to auto-size based on demand and reduce capacity when Security Copilot is not in use.

SCUs are provisioned on an hourly basis, and the estimated monthly cost is displayed during setup. This means you can always scale down or turn off Security Copilot; billing is calculated per hour.
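As a quick back-of-the-envelope calculation of why right-sizing matters (assuming the $4 per SCU per hour list price communicated at GA; verify current pricing with Microsoft):

```python
# Rough SCU cost estimate; the hourly price is an assumed list price.
SCU_PRICE_PER_HOUR = 4.00   # USD, assumption: GA list price
HOURS_PER_MONTH = 730       # average hours in a month

def monthly_cost(scu_count: int, hours: int = HOURS_PER_MONTH) -> float:
    """Estimated monthly cost of running scu_count SCUs continuously."""
    return scu_count * hours * SCU_PRICE_PER_HOUR

print(monthly_cost(1))  # 1 SCU always on:  2920.0 USD/month
print(monthly_cost(3))  # 3 SCUs always on: 8760.0 USD/month
```

This is also why the auto-sizing flow mentioned above (scaling SCUs down when Security Copilot is not in use) can meaningfully reduce cost.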

Onboarding of Copilot for Security

The onboarding of Copilot for Security is not difficult and is natively integrated into the securitycopilot.microsoft.com portal. In general, the following steps need to be taken at a high level:

  1. Log in to the portal securitycopilot.microsoft.com 
  2. Choose an Azure subscription and Azure Resource Group
  3. Select a geographical location for prompt evaluation to ensure data remains in the home tenant’s GEO-location
  4. Configure the number of SCUs (a deployment sketch follows after this list)
  5. Agree with the Terms and Conditions
  6. Assign roles to users and start usage of Copilot for Security
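For repeatable setups, the same capacity can also be created programmatically instead of through the portal wizard. Below is a rough sketch; the Microsoft.SecurityCopilot/capacities resource type, the preview api-version, and all names are assumptions based on publicly shared ARM templates, so verify them against current documentation before use.

```python
# Sketch: creating a Security Copilot capacity via the ARM REST API.
# Resource type and api-version are assumptions; verify before use.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, CAPACITY = "<subscription-id>", "<resource-group>", "<capacity-name>"

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

resp = requests.put(
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.SecurityCopilot/capacities/{CAPACITY}",
    params={"api-version": "2023-12-01-preview"},  # assumed preview version
    headers={"Authorization": f"Bearer {token.token}"},
    json={
        "location": "westeurope",            # example region
        "properties": {
            "numberOfUnits": 1,              # SCU count; scale as needed
            "geo": "EU",                     # prompt evaluation GEO
            "crossGeoCompute": "NotAllowed", # keep prompt evaluation in-GEO
        },
    },
)
resp.raise_for_status()
```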

Deploy Copilot for Security

First, we need to onboard Copilot for Security. To create the capacity resources, you need to be an Azure subscription Owner or Contributor. To finalize the onboarding, the Global Administrator or Security Administrator role is sufficient.

Go to https://securitycopilot.microsoft.com/ and click on Get Started to set up Microsoft Copilot for Security for the first time:

Now we need to set up the security capacity by configuring the Azure subscription, resource group, capacity name, and GEO. It is recommended to use a dedicated resource group for Microsoft Copilot for Security.

The prompt evaluation location is important – make sure this matches the geolocation of the environment and existing products to avoid privacy concerns related to generative AI.

It will take a few minutes to deploy the resources on the backend; 5-10 minutes, in my experience.

Security Copilot default environment

The second step is the configuration of the environment. We need a Global Administrator or Security Administrator role to complete this step. The permissions on the Azure resource are also important: make sure you are an Azure Owner or Contributor on the resource group where the capacity is located (step 1).

Via https://securitycopilot.microsoft.com/, we can start the capacity usage. The next step is the configuration of the data-sharing options; configure data sharing based on your environment and the privacy policy related to AI.

You’ll be informed of the default roles that can access Copilot for Security. By default, two roles are generated:

  • Contributors (everyone)
  • Owners (Global Administrators and Security Administrators)

After the initial setup, you can define the access model from the role assignment page to assign specific roles.

With that, Copilot is configured; from now on, Security Copilot is billed per hour via Azure billing.

On first start, you see the Copilot for Security portal with sample promptbooks and a prompt window.

Permissions

Good to know: Authentication in Copilot for Security is based on user credentials. Copilot operates on behalf of the authenticated user. Once authenticated, it accesses data available through the signed-in account. The level of data access you have determines which plugins are available in the prompts and which data is visible.

By default, all users in the Microsoft Entra tenant are given Copilot contributor access. Copilot for Security introduces two new roles. Good to know: these roles are not part of Entra ID; they are part of the IAM within Copilot for Security itself.

The following Entra roles automatically receive Copilot owner access:

  • Security Administrator
  • Global Administrator

A good example of how the flow works:

  1. As an analyst, the Copilot contributor role is assigned; this role gives access to the Copilot platform with the option to create new sessions.
  2. To use plugins and reach security data, the analyst also needs the relevant security roles.
  3. For the Microsoft Sentinel plugin, you still need the Sentinel Reader role.
  4. For Defender XDR, you need access via Unified RBAC to the built-in security roles.

Via role assignment, it is possible to add more members or groups to the owner or contributor role.

When needed, it is possible to remove the Everyone group from the contributor role and define custom groups to limit the usage of Copilot for Security (recommended, from my point of view).

An important configuration step is the Owner settings. In this view, it is possible to restrict file uploads, plugin usage, and the upload of custom plugins. It is recommended to restrict the use of custom plugins to owners.

For more instructions related to plugins and policies, read the following blog: How to onboard and getting started with Copilot for Security


Deploy the SOAR playbook

Now it’s time to deploy the Security Copilot incident investigation playbook. Before running it, ensure the Microsoft Sentinel workspace is set up and Microsoft Copilot for Security is enabled in the Azure tenant, as the playbook uses both resources. To deploy the playbook, visit the GitHub repository.

Before running the playbook, the permissions of the Logic App should be configured: authenticate the Microsoft Sentinel and Copilot for Security connectors within the Logic App. Ideally, the authentication is connected to a managed identity, as sketched below.
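As an illustration of the managed identity pattern (a sketch, not part of the deployed playbook), this is how a system-assigned identity can acquire a token for the Azure management API when the code runs inside Azure; the Sentinel connector does the equivalent under the hood:

```python
# Sketch: acquiring an ARM token with a system-assigned managed identity.
# Only works when running inside Azure on a resource that has the identity.
from azure.identity import ManagedIdentityCredential

credential = ManagedIdentityCredential()
token = credential.get_token("https://management.azure.com/.default")
print(token.expires_on)  # epoch timestamp at which the token expires
```

The managed identity also needs an appropriate role on the Sentinel workspace (for example, Microsoft Sentinel Responder) to read incidents and write comments.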

Part of the Logic App is a switch for each entity type. The automation runs per entity, in this case for IP address, Account, Host, and Process; the IP address branch uses the AbuseIPDB API. A simplified sketch of this dispatch follows below.
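The per-entity switch can be pictured as a simple lookup from entity kind to prompt, as in this illustrative Python sketch (the real playbook implements this as a Logic App Switch action; the prompt texts are made up for illustration):

```python
# Simplified sketch of the per-entity switch; prompts are illustrative only.
ENTITY_PROMPTS = {
    "Ip":      "Check the reputation of IP {value} using AbuseIPDB",
    "Account": "Summarize Entra ID details for user {value}",
    "Host":    "Summarize Intune device details for host {value}",
    "Process": "Analyze this process and command line: {value}",
}

def build_prompt(entity_kind: str, entity_value: str) -> str | None:
    """Return the Copilot prompt for a supported entity kind, else None."""
    template = ENTITY_PROMPTS.get(entity_kind)
    return template.format(value=entity_value) if template else None

print(build_prompt("Ip", "203.0.113.10"))
```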

Connect sources

In Microsoft Security Copilot, we need to configure the sources for AbuseIPDB and Microsoft Defender XDR. For Defender XDR, this means connecting Microsoft Defender XDR as a source.

AbuseIPDB is part of the non-Microsoft plugins and can be enabled. Via the settings, the API key can be added; a key can be requested from AbuseIPDB itself (a free limited plan is available).
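In the playbook the lookup is initiated by Security Copilot through the AbuseIPDB skill, but the underlying call is equivalent to AbuseIPDB's public v2 check endpoint. A minimal sketch, with a placeholder key and a documentation-range IP:

```python
# Minimal sketch of the AbuseIPDB v2 "check" lookup performed by the skill.
import requests

API_KEY = "<abuseipdb-api-key>"  # placeholder: request a key at abuseipdb.com

resp = requests.get(
    "https://api.abuseipdb.com/api/v2/check",
    headers={"Key": API_KEY, "Accept": "application/json"},
    params={"ipAddress": "203.0.113.10", "maxAgeInDays": 90},
)
resp.raise_for_status()
data = resp.json()["data"]
print(data["abuseConfidenceScore"], data["totalReports"])
```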


Result

As a result, the incident investigation summary is automatically created and structured in the following order:

  • Incident overview
  • Incident description
  • Analysis of incident entities
  • Possible mitigation steps
  • Conclusion

When Microsoft Sentinel is connected to the unified Defender portal, the activity log is synced and visible in Defender XDR. As output, the automated analysis is visible; in this example, the investigation is created as a response to the “Impacket toolkit detection”.

Incident summary created in Sentinel and Defender XDR Activity log:


Conclusion

This blog discussed how Microsoft Security Copilot enhances automated alert triage using GenAI. New SOC automation playbooks integrate Security Copilot with Microsoft Sentinel, improving AI-driven triage. Automation within Sentinel relies primarily on Azure Logic Apps, which have been enhanced to allow automatic prompt submission for alert enrichment.

Hopefully, this blog gives you a starting point for using Security Copilot in combination with Defender XDR/ Microsoft Sentinel and external sources. Of course, this playbook idea can be adapted and customized. Check https://github.com/Azure/Security-Copilot/tree/main/Logic%20Apps for more Logic Apps based on Security Copilot.


Sources

GitHub: SecurityCopilot-Sentinel-Incident-Investigation

Microsoft: AbuseIPDB Security Copilot