Security / September 27, 2018

Intelligent Detections — The Kobayashi Maru of Security

While alert fatigue may sound like a trendy catchphrase, it is a real issue that security analysts face every day. The ever-increasing number of security alerts from a growing number of devices often results in more noise than useful detections.

The unfortunate truth is that many of these alerts are not real security issues, nor do they necessarily align with the organization’s strategy to reduce risk. Furthermore, the sheer volume of this noise is costly, leading to complacency, lost productivity and ultimately increased vulnerability to attacks. This puts the organization’s security team in a Kobayashi Maru no-win situation. Course-correcting out of this situation, or avoiding it altogether, requires a new approach with a focus on data quality fundamentals and more intelligent detections.

At Gigamon, we believe that effective detection capabilities incorporate the following core components:

  • High-quality data collection — get the right data to enable detections
  • Context-enriched data sets — enhance data to allow for more intelligent detections

This blog post explores our philosophy on two of the foundational components that enable more intelligent detections and ultimately alleviate alert fatigue.

High-Quality Data Collection

Visibility matters when it comes to detections. In a prior blog post, we talked through how maximizing visibility into your environment is the first step to securing it. This is especially true in detecting malicious activity, as analysts can only detect activity they can see.

Unfortunately, many organizations discover they are missing critically important data only in the midst of a security incident. The gaps often involve comprehensive network visibility, but extend to infrastructure logs and endpoint coverage as well. In other cases, even when these data sources are collected, they lack the level of detail needed to support an investigation.

This complexity of data collection and engineering within the enterprise only increases at scale. For example, consider ingesting events from several data sources distributed across a worldwide enterprise where timestamp format varies across devices, log types and time zones. Normalizing these timestamps is essential, not only for analysts supporting their triage and investigation workflows, but also for an automated detection capability where temporal analysis and correlation are critical.
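To make the timestamp problem concrete, here is a minimal sketch of source-aware normalization. The source names, formats and time zones in the lookup table are illustrative assumptions, not a real device inventory; the point is that every event ends up in a single canonical form (UTC, ISO 8601) before any correlation runs.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical per-source format table; a real deployment would map many more
# device / log-type combinations, and the entries below are examples only.
SOURCE_FORMATS = {
    "fw-emea":  ("%d/%m/%Y %H:%M:%S", "Europe/London"),
    "proxy-us": ("%m/%d/%Y %I:%M:%S %p", "America/New_York"),
    "dns-apac": ("%Y-%m-%d %H:%M:%S", "Asia/Tokyo"),
}

def normalize_timestamp(source: str, raw: str) -> str:
    """Parse a device-local timestamp and return it as UTC ISO 8601."""
    fmt, tz = SOURCE_FORMATS[source]
    local = datetime.strptime(raw, fmt).replace(tzinfo=ZoneInfo(tz))
    return local.astimezone(timezone.utc).isoformat()
```

Once everything is normalized this way, temporal correlation across regions becomes a simple comparison rather than a per-source special case.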

The secret to succeeding at the detection game starts with high-quality data collection, which builds the foundation for a scalable and robust detection capability that ultimately reduces alert fatigue.

Context-Enriched Data Sets

So, you have the proper visibility and you’re looking to expand into the next phase of protecting your environment — enter context-enriched data. This can be divided into two parts:

  • Event enrichment
  • Curated intelligence

Event enrichment adds applicable context to existing data sets, providing supplementary information that is useful for both detections and investigations. Today’s status quo has analysts manually pulling in additional information from other security tools, services or systems.

At Gigamon, we believe in putting useful contextual details in the analyst’s hands from the beginning, empowering them to move quickly, understand the situation and take action. These additional contextual details can range from static properties, such as Autonomous System Number (ASN) and geo-location, to dynamic features like reputation and prevalence. This provides defenders with the ability to write custom detections based on unique combinations of indicators and metadata.
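As a rough illustration of what this enrichment might look like, the sketch below attaches static properties (ASN, geo-location) and a dynamic reputation score to a raw event. The lookup tables are hypothetical stand-ins for whatever GeoIP, ASN and reputation services a deployment actually uses, and the field names are not a real Gigamon schema.

```python
# Illustrative stand-ins for real enrichment sources (GeoIP, ASN, reputation).
# 203.0.113.0/24 is a documentation range, used here purely as example data.
STATIC_CONTEXT = {
    "203.0.113.10": {"asn": "AS64496", "geo": "NL"},
}
REPUTATION = {"203.0.113.10": "suspicious"}

def enrich_event(event: dict) -> dict:
    """Attach static (ASN, geo) and dynamic (reputation) context to an event."""
    ip = event.get("dst_ip")
    enriched = dict(event)                      # leave the original event intact
    enriched.update(STATIC_CONTEXT.get(ip, {}))
    enriched["reputation"] = REPUTATION.get(ip, "unknown")
    return enriched
```

With context attached up front, a custom detection can match on combinations like “suspicious reputation AND unexpected ASN” instead of a bare IP address.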

Curated intelligence matching provides the second aspect to context-enriched data. Gigamon further enriches streaming network data with network intelligence that goes beyond malicious atomic indicators such as known bad IP addresses or domains. This curated intelligence provides analysts additional context on suspicious entities, such as disreputable Virtual Private Servers (VPS), Tor nodes and sinkholes. This is where analysts can start to model how an attacker would operate in an environment to develop refined detections or start a threat hunting program.

When event enrichment is combined with curated intelligence, it opens up new detection opportunities. For example, attackers often download malicious files that are hosted on a disreputable VPS. Leveraging enriched network data, we could search for, or create an automated detection for, file downloads that originate from such a VPS. The enriched IP address metadata, coupled with a basic understanding of attacker methodologies, empowers organizations by expanding their detection capabilities.
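The VPS-download example above can be sketched as a simple predicate over enriched events. The intel set, event fields and tag names here are hypothetical examples chosen for illustration, not a description of Gigamon’s actual detection logic.

```python
# Example entry standing in for a curated "disreputable VPS" intelligence feed.
DISREPUTABLE_VPS = {"198.51.100.7"}

def detect_vps_download(event: dict) -> bool:
    """Flag file downloads whose source IP appears in the curated VPS intel set."""
    return (
        event.get("event_type") == "file_download"
        and event.get("src_ip") in DISREPUTABLE_VPS
    )
```

The detection itself stays trivial; the leverage comes entirely from the quality of the enrichment and the curated feed behind it.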

Final Thoughts

Alert fatigue is real and painful. It causes wasted cycles for organizations and can increase the risk to the business through missed alerts or insufficient incident triage.

The right level of visibility, paired with context-enriched data sets, grants defenders an edge by allowing them to efficiently identify and quickly respond to threats. This reduces business risk, saves time and cost by streamlining analysis, and, most importantly, gives analysts the tools they need to rewrite the Kobayashi Maru scenario and win.
