The Art of Tuning Correlation Searches within Enterprise Security

One major question I get when tuning searches onsite with customers is: “How do we make sure that we are not overrun with notables within our Splunk Enterprise Security (ES) SIEM?” This is an important question, because if left unchecked, notifications become white noise to the security analysts overwhelmed by information overload.

In this blog, we will discuss the art of tuning correlation searches within ES. We are going to look at three out-of-the-box ES correlation searches and how I would suggest tuning them to lower the frequency at which their notables fire and, as a result, direct attention to more critical incidents.

It’s important to note that having rock-solid Assets and Identities tables defined is essential to getting the real benefit of ES. Criticality is key in ensuring analysts direct their attention to what’s most mission-critical or sensitive first. In most instances, when Splunk ES is implemented, a nightly search will be scheduled that populates your assets and identities via ldapsearch (a command included in the Splunk Supporting Add-on for Active Directory) or some combination of AD data and your CMDB. If you are unaware of these searches, make it a priority to find out how and when they are running.
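As a rough sketch, such a nightly identity-population search might look like the following, assuming the Supporting Add-on for Active Directory is installed; the LDAP filter, attribute list, and lookup name here are illustrative, not your environment’s actual values:

```spl
| ldapsearch search="(&(objectClass=user)(!(objectClass=computer)))" attrs="sAMAccountName,givenName,sn,mail"
| rename sAMAccountName as identity, givenName as first, sn as last, mail as email
| outputlookup my_identities.csv
```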

A great tool that I have used to help expedite the creation of solid Assets and Identities tables can be found here. We won’t be going over this tool in this article, as having solid Assets and Identities is a prerequisite for any well-functioning ES environment, but please take the time to check it out!


Prerequisites

Splunk ES installed (properly set up)
Assets and Identities set up

OOB Correlation Searches

Vulnerability Scanner Detected (by targets)
Brute Force Access Behavior Detected
Default Account Activity Detected

The first search, “Vulnerability Scanner Detected (by targets),” is targeted at finding unknown internal or external vulnerability scanners. Most organizations that take security seriously will periodically run vulnerability scans against their internal machines to make sure they are being patched appropriately. This is very important to do, especially for public-facing servers. Public-facing servers are prey to those who wish to do your company harm, or even to bad actors who are simply bored.

Let’s take a look at the search

Taking a Look
| from datamodel:"Intrusion_Detection"."IDS_Attacks"
| stats values(tag) as "tag", dc(dest) as "count" by "src"
| where 'count'>25

Looks simple, but it can also generate a ton of notables on days when vulnerability scans are planned. We do have the ability to throttle this search, but I would prefer that our internal and external known scanners be whitelisted. In our Assets table we can set categories for each asset, and a category Splunk has used for years is “known_scanner.” Adding that category to those assets, then adding a filter to the search, can do wonders in ensuring that when this notable fires, it’s from something that should actually be investigated. One way to add the category is to update the lookup via ES -> Configure -> Data Enrichment -> Identity Management -> something_assets, then select Source: lookup://something_asset_lookup. Once selected, you should see something like below.

In our instance we know that this ip is a known scanner, so we add known_scanner alongside its existing category (test), with a pipe (|) between the two entries.

Like so. Now we have the ability to exclude this scanner from our future searches.
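To confirm the edit took, you can inspect the lookup directly; the lookup name here follows the placeholder used above and is illustrative:

```spl
| inputlookup something_asset_lookup
| search category="*known_scanner*"
| table ip, nt_host, category
```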

Exclude the Scanner
| from datamodel:"Intrusion_Detection"."IDS_Attacks"
| search src_category!=known_scanner
| stats values(tag) as "tag", dc(dest) as "count" by "src"
| where 'count'>25

In the search below, where the known_scanner filter is not used, we see that the scanner is still listed.


Now, in the search below, the src is no longer included.

It’s important to note how your Assets and Identities are created: if a search runs every morning to create your lookup, it will overwrite this manual change every day. A better route may be creating a separate lookup for items like known scanners, Digital Crown Jewels (DCJ), or honeypot accounts. This allows you to update those individual lookups and feed the main search by doing a lookup to add, say, categories and criticality as needed.
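As a sketch of that route, a scheduled search could merge a small, manually maintained known-scanners lookup into the nightly asset rebuild so the category survives each refresh; every lookup and field name here is hypothetical:

```spl
| inputlookup cmdb_assets.csv
| lookup known_scanners_lookup ip OUTPUTNEW category as scanner_category
| eval category=if(isnotnull(scanner_category), category."|".scanner_category, category)
| fields - scanner_category
| outputlookup my_assets.csv
```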

The next search we are going to look at, “Brute Force Access Behavior Detected”, can use the same known_scanner to remove a large portion of false positives.

Let’s take a look at the OOB search

OOB Search
| from datamodel:"Authentication"."Authentication"
| stats values(tag) as tag, values(app) as app, count(eval('action'=="failure")) as failure, count(eval('action'=="success")) as success by src
| search success>0
| xswhere failure from failures_by_src_count_1h in authentication is above medium

Here we are using Extreme Search (XS) to check whether the failure count is above the “medium” threshold defined in the failures_by_src_count_1h context. During scanning, this number will most likely spike to “high” for most machines, especially if scanning is done irregularly. So, let’s use the same method from our first search to remove those known scanners.

Extreme Search
| from datamodel:"Authentication"."Authentication"
| search src_category!=known_scanner
| stats values(tag) as tag, values(app) as app, count(eval('action'=="failure")) as failure, count(eval('action'=="success")) as success by src
| search success>0
| xswhere failure from failures_by_src_count_1h in authentication is above medium

In the search below, we find another scanner that is triggering our Brute Force correlation search.

Now we implement the src_category!=known_scanner fix and see how our results differ.

Just like that, we have removed more false positives! Again, the goal is to remove the noise so that analysts can target the real bad actors.

The third and final correlation search we are going to review is “Default Account Activity Detected.” This search is the opposite of the previous two: instead of tuning notables out, we need to add Identities to our default category to ensure that when the search fires it’s catching ALL default accounts for your organization. Maybe you have a machine imaging account that is used ONLY for imaging, and you want to alert if it’s being used for anything else. This is a great account to monitor for use by bad actors.

The lookup we will be updating this time should be static and manually updated only when needed, via ES -> Configure -> Data Enrichment -> Identity Management -> administrative_identities, then select Source: lookup://administrative_identities. Once selected, you should see the default accounts already known to Splunk; they are set with a category of default and privileged. Most of the time these are the default root accounts for devices.

In this instance we want to add our “setup” user as a default and privileged user because it is our machine imaging account.

Now we have updated our list and saved it to include our default account. So now, when the following search fires, it will know about our user “setup” and set user_category=default.
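A quick way to sanity-check the change is to read the lookup back (lookup name as shown in the Identity Management path above; the table fields are illustrative):

```spl
| inputlookup administrative_identities where identity="setup"
| table identity, category, priority
```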

Add 'setup' User
| from datamodel:"Authentication"."Successful_Default_Authentication"
| stats max(_time) as "lastTime", values(tag) as "tag", count by "dest", "user", "app"

These changes can seem so simple – and they really are! But sometimes during a quick three-week Professional Services engagement, not all of the tuning can be done. These items can definitely be done by the ES Admin once PS leaves. I have seen instances where changes like these have removed hundreds, if not thousands, of notables each day… YES, that many. Say goodbye to white noise and make sure each notable is addressed.

About Aditum

Aditum’s Splunk Professional Services consultants can assist your team with best practices to optimize your Splunk deployment and get more from Splunk.

Our certified Splunk Architects and Splunk Consultants manage successful Splunk deployments, environment upgrades and scaling, dashboard, search, and report creation, and Splunk Health Checks. Aditum also has a team of accomplished Splunk Developers that focus on building Splunk apps and technical add-ons.

Contact us directly to learn more.