What is Azure Sentinel?
Azure Sentinel is Microsoft’s cloud-native security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solution all in one!
It brings together the latest in security innovation and advanced AI to provide near real-time intelligent security analytics for a bird’s-eye view over your entire enterprise’s IT estate.
With Sentinel, you can consume security-related data from almost any source – not just sources inside your Microsoft tenant! This removes the need to manage multiple complex and costly infrastructure components – whilst providing a cloud platform that can easily scale to your needs.
Sentinel uses machine learning and AI models to surface important insights based on data consumed through a wide catalog of data connectors. This includes native connections to all key Microsoft sources, together with a range of native 3rd party connectors covering technologies from AWS, Symantec, Barracuda, Cisco and many others.
The solution analyzes in excess of 6.5 trillion signals daily to provide unparalleled threat intelligence. This, coupled with the ability to filter millions of signals into meaningful dashboard alerts, provides comprehensive hunting and investigative capabilities – enabling you to expedite your response to potential attacks.
Sentinel also integrates with a wide range of systems – providing the option to automate your incident response activities, thereby allowing you to orchestrate your activities in an efficient and effective manner.
Azure Sentinel Core Features
Microsoft’s objective in re-engineering the SIEM tool was to let organizations focus and invest in security itself rather than in infrastructure setup and maintenance. Sentinel comes with the following distinct and prominent features.
- Collect data at cloud scale: Azure Sentinel is purely cloud-based. Built on Log Analytics, Azure Sentinel comes with impressive scaling capabilities that allow connectivity to a wide variety of data sources for data collection. This can be data from Office 365, different applications, all users, multiple subscriptions, as well as other clouds. There are connectors available that can be leveraged to connect to these different data sources.
- Detect previously undetected threats: Azure Sentinel detects previously undetected threats and also minimizes false positives using analytics and threat intelligence from Microsoft. It thereby greatly reduces the effort security teams spend investigating alerts that are raised but are not real incidents.
- Investigate threats with artificial intelligence: Azure Sentinel uses artificial intelligence for threat investigation and looks for suspicious activities at scale. Microsoft brings its own cybersecurity experience to Azure Sentinel.
- Respond to incidents and events rapidly: Artificial intelligence (AI) helps Azure Sentinel respond to threat incidents and events rapidly. There are many possibilities to hunt for threats and orchestrate the responses accordingly. Open-source tools such as Jupyter notebooks can also be used.
Additional Key Features of Azure Sentinel
Apart from the above core features, there are certain other features, which are equally important and are worth mentioning.
- Intelligent built-in queries: Azure Sentinel has numerous built-in queries that can be leveraged by non-technical users for easily reviewing common attacks.
- Built-in artificial intelligence: As already mentioned above, Sentinel has built-in artificial intelligence to proactively detect real threats, investigate, analyze, and respond in order to mitigate the issues quickly.
- Threat hunting using bookmarks: Threat hunting allows you to proactively look for security threats before an alert is triggered and an incident is created. You can create custom detection rules to surface the insights and send notifications to the security teams. Sentinel also provides the ability to bookmark suspicious events so you can easily refer back to and investigate them later. These hunting bookmarks can be visualized directly from the Bookmarks tab and promoted to incidents when needed.
- Easy installation: Azure Sentinel is a very easy SIEM tool to stand up. There is no infrastructure to build and no complex installation is required.
- Monitor Data using Azure Monitor Workbooks: Azure Sentinel integrates with Azure Monitor Workbooks that can be used to monitor data. Sentinel also allows the creation of custom workbooks across your data along with the available default templates, thereby allowing you to quickly gain insight as soon as the data sources are connected.
Analytics
As already mentioned, Azure Sentinel has built-in artificial intelligence that provides machine learning rules to detect and report anomalies across all the configured data sources. It is also possible to create your own rules alongside the built-in ones. Analytics helps in connecting the dots, i.e., it can combine small alerts into a potentially high-severity security incident and proactively report it to the security responders.
Security Automation & Orchestration
Azure Sentinel has the concept of playbooks. These playbooks are built on the foundation of Azure logic apps and help simplify security orchestration by automating the recurring common tasks. As with the machine learning analytics rules, there are prebuilt playbooks with 200+ connectors that also allow you to apply custom logic.
One common example you will find across Microsoft documentation is that of ServiceNow, where you can use Logic Apps to open a ticket in ServiceNow every time a new threat is detected within the services and other workloads in the organization.
How to enable Sentinel in your environment
Enabling Sentinel in your environment is simple; all you need is the following:
- An active Azure subscription.
- A Log Analytics workspace.
- Contributor or reader permissions on the resource group that the workspace belongs to.
Once you have that, you can browse to Sentinel within the Azure portal to deploy – then you are ready to begin adding your data connectors.
One thing to consider is that during the preview period Azure Sentinel is free to use; however, the underlying Log Analytics workspace will incur costs for data ingested from your data connectors beyond the first free 5 GB.
Currently, there are several Microsoft data connectors that are available out-of-the-box and provide near real-time integration, including Office 365, Azure AD, Azure ATP, and Cloud App Security (CAS).
Sentinel also provides out-of-the-box data connectors for non-Microsoft solutions, including AWS, Barracuda, Cisco, and Symantec. Sentinel additionally supports generic connectors that let you send data via Windows Firewall, Syslog, REST API, or the Common Event Format (CEF), enabling you to send information from almost any data source – so it is very flexible for your infrastructure.
Once your data connectors are enabled, Sentinel will begin analyzing and reporting on potential threats within your environment using the built-in alert rules.
However, the real power of Azure Sentinel is the ability to write custom alert rules and automated playbooks to help detect and remediate threats in real-time. These custom alert rules and playbooks allow you to tailor Azure Sentinel to help you protect your organization against any specific threats it faces.
Azure Sentinel in action – A typical scenario…
In this example, an organization’s Azure AD Connect instance has been compromised and their credentials have been exfiltrated. We will investigate this attack and highlight how Azure Sentinel could have been used to alert and mitigate this attack at different points of the cyber kill chain.
The cyber kill chain is a series of 8 steps that trace an attack from reconnaissance to data exfiltration – enhancing our understanding of the timeline of a cyber-attack.
We will be focusing on the alerting and remediation response against reconnaissance, intrusion and exfiltration.
Why target Azure AD Connect?
For those unaware of Azure AD Connect (AAD Connect), it is a tool that allows organizations to connect their on-premises Active Directory with their Azure Active Directory environment. The most common authentication configurations for AAD Connect are via Password Hash Sync (PHS) or Pass-Through Authentication (PTA).
Password Hash Sync operates by synchronizing the hashed passwords that sit on Active Directory with Azure Active Directory, allowing users to sign into cloud services using their on-premises credentials. Whereas Pass-Through Authentication allows users to sign into cloud services using their on-premises credentials by forwarding authentication requests to an on-premises Active Directory server.
Both of these configurations deal with the management of an organization’s credentials; as such, AAD Connect is often a valuable target for attackers. Hence, it is vital that the AAD Connect service and the server it sits on are protected to prevent the compromise of credentials.
Reconnaissance
The first step of the cyber kill chain is reconnaissance. Research shows that up to 60% of an attacker’s time is spent investigating an organization and its infrastructure before they begin their attack. So, while reconnaissance is neither a threat nor an exploit in itself, it is the first step on the path to a cyber-attack, and it is vital to respond to it when it occurs.
The most common form of reconnaissance is port scanning, used to fingerprint servers and identify what OS is in use and potentially what services are running. With this information, attackers will exploit known vulnerabilities or use a password spray attack to attempt to gain a foothold in the system.
Using Azure Sentinel, we can create a custom alert rule that will react when it detects potential port scanning and trigger a playbook to remediate the threat.
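The rule itself isn't reproduced here, but as a rough sketch of the detection logic – assuming firewall logs are being ingested via CEF into the CommonSecurityLog table, and with an arbitrary threshold and lookback window – the KQL behind such an analytics rule might flag a single source probing an unusually high number of distinct ports:
# Sketch only: a KQL detection query for a possible port scan, held in a variable.
# The table name and threshold are assumptions, not a production detection.
$portScanQuery = @"
// Possible port scan: one source hitting many distinct ports on the
// same destination within the last hour (threshold is illustrative)
CommonSecurityLog
| where TimeGenerated > ago(1h)
| summarize DistinctPorts = dcount(DestinationPort) by SourceIP, DestinationIP
| where DistinctPorts > 50
"@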
To respond to this alert, we can create an automated playbook that is built using the Logic Apps framework available in Azure. Logic Apps uses a simple drag and drop interface to build a series of tasks to execute.
The advantage of Logic Apps is they can be used to build complex workflows that would normally take up valuable time of an organization’s IT personnel – thus reducing the amount of time spent on trivial, repetitive tasks.
An example playbook might update the firewall rules on every server within our example organization to block the attacker from gaining any more information about our systems.
Intrusion
An ever-growing form of intrusion that many organizations face is the password spray attack. This is a type of attack where an attacker will attempt to gain access to a system using default or commonly used credentials.
Attackers are also increasingly using lists of the most commonly used passwords to gain access to systems. According to the NCSC, over 75% of organizations had passwords that feature in the top 1,000 most commonly used passwords. So, it’s no surprise that password spray attacks are becoming commonplace!
Attackers are unlikely to attempt to sign into an account manually from their own IP address; instead, they’ll automate the task using botnets. Hence, when an alert is raised for an unusual sign-in, we can look up the IP address behind the alert, check whether it came from a known botnet, block the user from signing in, and raise a ticket in ServiceNow to notify IT personnel of a potential account breach.
While most workflows can be created using the basic building blocks provided in Logic Apps, a more complex workflow is sometimes required. In this case, we can’t easily create a Logic App on its own to compare the IP address of the alert against a list of known botnets. However, Logic Apps can integrate with Function Apps – small blocks of custom code that can be run on demand – so we can build a Logic App that calls out to a function to perform these more complex checks.
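As a minimal sketch of the kind of custom logic such a function might run (the blocklist file name and format are hypothetical – a real implementation would more likely query a threat-intelligence feed or API), the check could look something like this in PowerShell:
param (
    # IP address taken from the Sentinel sign-in alert
    [Parameter(Mandatory)]
    [string] $IpAddress
)

# Hypothetical local cache of known botnet IP addresses, one per line
$blocklist = Get-Content -Path ".\known-botnet-ips.txt"

# Return an object the calling Logic App can branch on
[pscustomobject]@{
    IpAddress     = $IpAddress
    IsKnownBotnet = ($blocklist -contains $IpAddress)
}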
Exfiltration
Once an attacker has gained initial access in a network, they will be looking for ways to extract data from a system. In our fictitious example, the attacker has gained access to a local administrator account and is now looking to export all the user credentials stored in the Active Directory.
As the attacker has breached the server that hosts the AAD Connect service, they can compromise the built-in service account that AAD Connect uses to perform its synchronization and use it in an attack commonly referred to as DCSync – impersonating a Domain Controller and requesting password data from the target Domain Controller.
Within the Microsoft security stack, Azure Advanced Threat Protection has out-of-the-box detection for DCSync attacks. However, many security teams face the problem of having to navigate the different dashboards for each Microsoft security solution they have deployed, such as Microsoft Defender ATP, Azure ATP, and CAS.
In the past this has meant that time was wasted navigating between different dashboards and consoles with slower response times and potentially missed threats and correlations.
With the introduction of Azure Sentinel, an organization can now view threats and alerts across their entire IT estate. They can also take advantage of incidents within Sentinel to correlate alerts and entities across all data sources to add contextual information that is meaningful to the investigation process.
In this example, Azure Advanced Threat Protection has detected a DCSync attack on the AAD Connect server, which in turn has raised an alert in Sentinel. Taking advantage of automated playbooks, we can create a Logic App that sends an approval email to the IT security team asking whether this is a genuine threat. If confirmed, the Logic App will use Microsoft Defender ATP to isolate the server from the network, begin a virus scan, and raise a ticket within ServiceNow.
Methods to automate Azure Sentinel content deployment
Azure Sentinel is a scalable, cloud-native, security information and event management (SIEM) and security orchestration, automation, and response (SOAR) solution.
Azure Sentinel delivers intelligent security analytics and threat intelligence across the enterprise, providing a single solution for alert detection, threat visibility, proactive hunting, and threat response.
Like any other cloud service, you can automate most of the Azure Sentinel deployment and configuration.
And in this post, you will learn how to automate the core components of Azure Sentinel.
Prerequisites
Before we start, there are a few global prerequisites that you need to meet:
- An active Azure subscription; if you don't have one, create a free account before you begin.
- Contributor permissions to the subscription.
- PowerShell V7; if you don't have it installed, install it from the GitHub repository.
Azure Sentinel Automation tools
Bringing the right set of tools to the mission allows you to provide the best solution in the shortest time.
Before you begin your journey, spend some time getting familiar with the following tools:
PowerShell V7
PowerShell V7 is a cross-platform task automation and configuration management framework, consisting of a command-line shell and scripting language.
Make sure you installed it on your system.
Az Module
Azure PowerShell Az module is a PowerShell module for interacting with Azure.
Az offers shorter commands, improved stability, and cross-platform support.
To install it, you run the following command:
Install-Module az -AllowClobber -Scope CurrentUser
AzSentinel Module
AzSentinel is a module built by Wortell, and it will help us automate a few of the processes.
You install the AzSentinel Module with the following command:
Install-Module AzSentinel -Scope CurrentUser -Force
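Once the module is installed, listing its exported commands is a quick way to see which Sentinel tasks it can help automate:
Get-Command -Module AzSentinel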
Splatting
In most of the code examples, I use "splatting" to pass the parameters.
Splatting makes your commands shorter and easier to read.
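As a quick illustration of the pattern (the path and filter here are only placeholders), the same call written with and without splatting:
# Without splatting: all parameters inline
Get-ChildItem -Path ".\Sentinel Automation" -Filter "*.json" -Recurse

# With splatting: parameters collected in a hashtable and passed with @
$Parms = @{
    Path    = ".\Sentinel Automation"
    Filter  = "*.json"
    Recurse = $true
}
Get-ChildItem @Parms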
Connect to Azure with PowerShell
You also need to set up a session to Azure from PowerShell, and you can create one with the Az module.
You need to get your Tenant ID and Subscription ID from the Azure Portal.
With this information, you can use Connect-AzAccount to create a session with Azure:
$TenantID = 'XXXX-XXXX-XXXX-XXXX-XXXX'
$SubscriptionID = 'XXXX-XXXX-XXXX-XXXX'
Connect-AzAccount -TenantId $TenantID -SubscriptionId $SubscriptionID
You can now interact with Azure from PowerShell and start your journey to automate Azure Sentinel.
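If you want to confirm that the session is bound to the right tenant and subscription before provisioning anything, you can check the active context:
Get-AzContext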
A Step by Step To a Fully Automated Deployment
Every automation process starts with multiple small automated processes.
In this post, you will learn how to provision the following components with PowerShell:
- Resource Group
- Log Analytics
- Azure Sentinel
- Saved Queries
- Hunting Queries
- Alert Rules
- Playbooks
- Workbooks
Azure Log Analytics, Azure Sentinel, and Logic Apps are all paid services.
Each component is a piece in the puzzle that builds a fully up and running Azure Sentinel, ready to monitor every environment.
Resource Group
The resource group is a container that holds related resources for an Azure solution.
In Azure, you logically group related resources to deploy, manage and maintain them as a single entity.
With New-AzResourceGroup, you can create a new resource group.
Every resource in Azure requires a deployment location.
The location refers to the datacenter region.
In this guide, I will use the West Europe region.
$Parms = @{
Name = "Sentinel-RG"
Location = "WestEurope"
}
New-AzResourceGroup @Parms
Log Analytics
Log Analytics is a service that helps you collect and analyze data generated by resources in your cloud and on-premises environments.
It gives you real-time insights using integrated search and custom dashboards to readily analyze millions of records across all of your workloads and servers regardless of their physical location.
Azure Sentinel runs on a Log Analytics workspace and uses it to store all security-related data.
With that said, Log Analytics is the first resource we need to provision.
To create a new Log Analytics workspace, you can use New-AzOperationalInsightsWorkspace.
$Parms = @{
ResourceGroupName = "Sentinel-RG"
Name = "Saggiehaim-Sentinel-WS"
Location = "WestEurope"
}
New-AzOperationalInsightsWorkspace @Parms
Azure Sentinel
After provisioning Log Analytics, you can continue and on-board Azure Sentinel.
Use Set-AzSentinel to enable Azure Sentinel on the Log Analytics workspace:
$Parms = @{
SubscriptionId = $SubscriptionID
WorkspaceName = "Saggiehaim-Sentinel-WS"
}
Set-AzSentinel @Parms
Azure Sentinel Saved Queries
Up to this point, you have only provisioned the "infrastructure."
With Azure Sentinel enabled, you can now start the "configuration" part and add content to it.
When we talk about SIEM and monitoring big data, an essential skill to have is the ability to extract the relevant information from the sea of data.
In Sentinel, you use the Kusto Query Language (KQL).
With KQL, you can run queries inside Log Analytics, and write Sentinel Alerts rules, Hunting rules, Workbooks, and more.
Some queries can be long and complex, and you don't want to write them again and again.
You can save time by keeping your queries inside Log Analytics and using them on demand.
You can organize your saved queries into folders by using the Category switch.
You can push saved queries with the New-AzOperationalInsightsSavedSearch command:
$query = @"
// Number of requests
// Count the total number of calls across all APIs in the last 24 hours.
//Total number of calls per resource
ApiManagementGatewayLogs
| where TimeGenerated > ago(1d)
| summarize count(CorrelationId) by _ResourceId
"@
$param = @{
ResourceGroupName = "sentinel-rg"
WorkspaceName = "Saggiehaim-Sentinel-WS"
SavedSearchId = "NumberofAPICallsPerResource"
## The name of the saved query
DisplayName = "Number of API calls per resource"
## The name of the folder you want to store the saved query in
Category = "API Management"
Query = $query
Version = 1
Force = $true
}
New-AzOperationalInsightsSavedSearch @param
Another method is to use JSON or YAML files to hold the information.
This method is the recommended approach.
It allows you to manage your content inside a git repository, manage versions, and use it in your automated process.
Here is an example of a JSON file:
{
"SavedSearchId": "NumberofAPICallsPerResource",
"DisplayName": "Number of API calls per resource",
"Category": "API Managment",
"Query": "
// Number of requests
// Count the total number of calls across all APIs in the last 24 hours.
//Total number of calls per resource
ApiManagementGatewayLogs
| where TimeGenerated > ago(1d)
| summarize count(CorrelationId) by _ResourceId",
"Version": "1"
}
Now you need to adjust the script accordingly:
$SavedQuery = Get-Content .\NumberofAPICallsPerResource.json | ConvertFrom-Json
$param = @{
ResourceGroupName = "sentinel-rg"
WorkspaceName = "Saggiehaim-Sentinel-WS"
SavedSearchId = $SavedQuery.SavedSearchId
DisplayName = $SavedQuery.DisplayName
Category = $SavedQuery.Category
Query = $SavedQuery.Query
Version = $SavedQuery.Version
Force = $true
}
New-AzOperationalInsightsSavedSearch @param
Hunting Queries
Hunting queries help you find suspicious activity in your environment.
While many of them may return legitimate activity alongside potentially malicious activity, they can guide your hunting.
If you are confident in the results after running these queries, you can consider turning some or all of them into Azure Sentinel analytics rules to alert on.
To create hunting rules, you can use the Import-AzSentinelHuntingRule cmdlet.
First, you create a JSON file containing your hunting rule, based on this schema:
{
"analytics": [
{
"DisplayName": "Example of Hunting Rule",
"Description": "This the description of the query.",
"Query": "
// sample query
Syslog
| limit 10 ",
"Tactics": [
"Persistence",
"Execution"
]
}
]
}
Now, you can import the Hunting Query into your Azure Sentinel:
$Parms = @{
WorkspaceName = "Saggiehaim-Sentinel-WS"
SettingsFile = ".\exampleHuntingRule.json"
}
Import-AzSentinelHuntingRule @Parms
Alert Rules
Alert rules are scheduled queries that are defined to trigger incidents.
You use them to raise incidents when security events of interest happen in your environment.
Just like Hunting queries, you store your alerts rules in a JSON file.
{
"analytics": [
{
"displayName": "Suspicios activities in Office365",
"description":
"Rare office operations executed on one or more mailboxes.",
"severity": "High",
"enabled": true,
"query": "let timeframe = 1d; OfficeActivity",
"queryFrequency": "5H",
"queryPeriod": "5H",
"triggerOperator": "GreaterThan",
"triggerThreshold": 5,
"suppressionDuration": "6H",
"suppressionEnabled": false,
"tactics": [
"Persistence",
"LateralMovement",
"Collection"
],
"playbookName": "string",
"aggregationKind": "string",
"createIncident": true,
"groupingConfiguration": {
"GroupingConfigurationEnabled": true,
"reopenClosedIncident": true,
"lookbackDuration": "PT6H",
"entitiesMatchingMethod": "string",
"groupByEntities": [
"Account",
"Ip",
"Host",
"Url"
]
}
}
]
}
You can use the Import-AzSentinelAlertRule to import your Alert Rules:
$Parms = @{
WorkspaceName = "Saggiehaim-Sentinel-WS"
SettingsFile = ".\exampleAlertRule.json"
}
Import-AzSentinelAlertRule @Parms
Playbooks and Workbooks
Playbooks use Azure Logic Apps to respond to incidents automatically.
Logic Apps are a native ARM resource, and therefore we can automate their deployment with ARM templates.
Azure Sentinel allows you to create custom workbooks across your data.
Workbooks visualize and monitor the data and provide versatility in creating custom dashboards.
Like playbooks, workbooks are native resources in Azure and are deployed with ARM templates.
Because this is an ARM template deployment, you deploy it to the Resource group and not to the Log Analytics Workspace.
Use the New-AzResourceGroupDeployment cmdlet to deploy either a workbook or a playbook:
$Parms = @{
ResourceGroupName = "Sentinel-RG"
TemplateFile = ".\exampleTemplate.json"
}
New-AzResourceGroupDeployment @Parms
Take into account that the deployment will fail if a workbook with the same name already exists.
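One way to guard against that is to check for an existing workbook resource before deploying – a sketch that reuses the $Parms splat from above and assumes the template deploys the workbook under a fixed resource name (the name used here is hypothetical):
# Skip the deployment if a workbook resource with the same (hypothetical) name already exists
$existing = Get-AzResource -ResourceGroupName "Sentinel-RG" `
    -ResourceType "microsoft.insights/workbooks" |
    Where-Object { $_.Name -eq "exampleWorkbook" }

if (-not $existing) {
    New-AzResourceGroupDeployment @Parms
}
else {
    Write-Warning "A workbook named 'exampleWorkbook' already exists, skipping deployment."
}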
Plan first, Succeed later
You've learned how to provision each component and how to deploy your content; now it's time to prepare that content and learn how to connect securely to Azure, so that the deployment can be automated correctly from start to end.
Folder Structure
First, I want to explain the folder structure.
When you have many different files, I like to organize them into folders, so it's easy to manage them and use them in the automation process.
In this case, we have five different content types, so I recommend the following structure:
Sentinel Automation
├───AlertsRules
├───HuntingRules
├───Playbooks
├───SavedQuery
└───Workbooks
The example above allows us to match the right files to the right cmdlets.
For example, to import all your AlertRules, you can do the following:
$AlertRules = Get-Item ".\AlertsRules\*" -Filter '*.json'
foreach ($rule in $AlertRules) {
try {
$Parms = @{
WorkspaceName = "Saggiehaim-Sentinel-WS"
SettingsFile = $rule.FullName
SubscriptionId = $SubscriptionId
Confirm = $false
}
Import-AzSentinelAlertRule @Parms
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Error "Unable to import Alert Rule: $($ErrorMessage)"
}
}
Connecting Securely To Azure
Another important topic is how we authenticate to Azure securely.
If you paid attention when you created a session with Azure for the first time using your credentials, it asked you to sign in with a one-time password in the Microsoft portal.
One-time passwords are not the behavior we want when we automate things, as they require human intervention.
But this is also the expected behavior from a security point of view, right?
To overcome this, you need to use an app registration (service principal) account.
If you don't know how to create one, you can follow this guide.
A little tip: Keeping the password in plain text in scripts is not so safe, so it’s better to secure it.
The best approach is to use a certificate (in the guide, you will learn how to do it).
But if you still want to go without a certificate, you can always protect the password.
You convert the password to a secure string and save it to a file (I recommend changing the ACL on the file).
$CredsFile = "<Path>\PasswordFile.txt"
Read-Host -AsSecureString | ConvertFrom-SecureString | Out-File $CredsFile
Now you can connect to Azure more securely.
$TenantID = 'XXXX-XXXX-XXXX-XXXX-XXXX'
$SubscriptionID = 'XXXX-XXXX-XXXX-XXXX'
$appId = 'XXXX-XXXX-XXXX-XXXX'
$securePassword = Get-Content $CredsFile | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential (
$AppId, $securePassword
)
$connectAzParams = @{
ServicePrincipal = $true
SubscriptionId = $SubscriptionId
Tenant = $TenantId
Credential = $credential
}
try {
Connect-AzAccount @connectAzParams
}
catch {
$ErrorMessage = $_.Exception.Message
Write-Error "Unable to connect to Azure: $($ErrorMessage)"
exit
}
Use Cases and Deployment Scope
Azure Sentinel has been used as our SIEM solution. It is easy to learn, set up, and use. Because it is highly scalable and cloud-based, it is ideal for managing events and providing security automation, with automated SOAR responses to incidents of different levels – from simple to more complex, including previously undetected ones. It has contributed a great deal to business decision-making and to providing more security for the team and the organization.
Pros
- Easy to deploy and learn to use.
- Artificial intelligence.
- Analysis of any type of threat, including those that have not yet been discovered.
- Automation to respond to security incidents.
- Reduction of false positives.
- Easy to edit log analysis rules.
- Great for checking attempted violations.
Cons
- The reporting feature can be improved; there are occasional problems with exports, instability, and compatibility.
- Dependence on Microsoft Azure software.
- Automation around security indicators could be better.
- Visualization of threats and their resolution could be better.
Usability
The Microsoft Azure Sentinel solution is very good, and even better if you already use Azure. It's easy to implement and to learn how to use the tool, with an intuitive and simple interface. New updates keep arriving, bringing new features and improving the experience and usability. The solution is reliable, as it comes from a very dependable manufacturer.