Security / April 28, 2021

Considering TLS Traffic Inspection After Pulse Secure Vulnerability?

Four Decryption Risks and Responsibilities You Should Know About

Executive Summary

  • This recent exploit has further highlighted the need to inspect encrypted traffic, and organizations are accelerating their TLS inspection initiatives.
  • There is value in inspecting traffic header metadata, even for encrypted traffic. Make sure you have that covered.
  • According to the U.S. National Security Agency’s guidance, decrypting and inspecting TLS traffic is no longer a nice-to-have.
  • While decryption comes with its own set of risks, we discuss four approaches to mitigate those risks and effectively add TLS inspection to your arsenal to combat adversaries:
    • Centralized decryption
    • Limited access to plaintext
    • Ongoing maintenance of the implementation as networks change over time
    • Selective decryption

Pulse Secure recently issued an advisory on the vulnerability (CVE-2021-22893), describing it as “an authentication bypass vulnerability that can allow an unauthenticated user to perform remote arbitrary file execution on the Pulse Connect Secure (PCS) gateway. This vulnerability has a critical CVSS score and poses a significant risk to your deployment.” Depending on the level of privilege gained or the authentications bypassed, successful exploitation allows a threat actor to remotely execute code, install malware, create new accounts, move laterally, and view and change data.

If your organization is running PCS and hasn’t yet taken remediation steps, we strongly urge you to take immediate action as recommended by Pulse Secure and CIS, including upgrading to the latest PCS server software, disabling share/collaboration services, blocking external access, and managing privileges.

Encrypted Traffic

Encrypted traffic dominates the internet and is growing fast for East-West traffic. Encryption protects enterprise data and intellectual property and helps with privacy. However, adversaries also use encryption for payloads, command-and-control (C2) channels, exfiltration, and so forth, and such traffic often bypasses Suricata/Snort rules.

This breach highlights the importance of decrypting and inspecting TLS traffic and has catalyzed organizations’ long-planned TLS inspection initiatives.

Traffic Header Data — Plenty to Analyze

The header portion of an IP packet precedes its body and has addressing and other data needed for the packet to reach its intended destination. This header metadata, available for non-encrypted and encrypted traffic, is extremely useful for detecting post-exploitation activities using network detection and response tools.

Traffic header data can identify beaconing even when the malware or malicious domains are not yet known, and encrypted-traffic metadata enables further hunting by baselining network traffic to and from Pulse Secure devices. It also allows identification of past compromise if indicators are revealed in the future.
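
For illustration, here is a minimal Python sketch of one such hunting technique: flagging candidate beaconing from flow metadata alone, with no decryption required. The flow-log file, column names, and thresholds are hypothetical, not a specific product format.

    # Minimal beaconing-detection sketch over flow metadata (no decryption needed).
    # Assumes a hypothetical CSV flow log with columns: timestamp,src_ip,dst_ip.
    import csv
    from collections import defaultdict
    from statistics import mean, pstdev

    flows = defaultdict(list)  # (src_ip, dst_ip) -> connection timestamps
    with open("flows.csv", newline="") as f:
        for row in csv.DictReader(f):
            flows[(row["src_ip"], row["dst_ip"])].append(float(row["timestamp"]))

    for pair, times in flows.items():
        if len(times) < 10:          # need enough samples to judge regularity
            continue
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        # Low jitter relative to the mean interval suggests automated check-ins.
        if avg > 0 and pstdev(gaps) / avg < 0.1:
            print(f"possible beacon: {pair} every ~{avg:.0f}s")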

While there is much value in header inspection, the most effective detection and hunting combines data from both encrypted and decrypted streams. Also note that TLS 1.3 encrypts more of the handshake than earlier versions (the server certificate, for example), and emerging extensions such as Encrypted Client Hello hide the SNI, which degrades some metadata analysis.

TLS Inspection — for Complete Visibility

Decrypting TLS traffic provides metadata for HTTP and TLS and payload-level visibility for inspection and detection, greatly enhancing post-exploitation detections and threat hunting, as well as providing better resiliency against adversary changes. TLS inspection is particularly important since all traffic in and out of Pulse Secure’s SSL VPN is encrypted.

Several of the recommendations below are from the U.S. National Security Agency (NSA) advisory, which provides the rationale for inspecting TLS traffic and enumerates added risks from decryption.

Risks and Responsibilities

According to the advisory, decryption matters because adversaries use encryption to hide their activities, such as command and control, loading malware onto a network, and exfiltration. The NSA advisory provides actionable guidelines for countering the risks that decryption itself introduces. In short, IT teams must be cognizant of the nuances of decryption.

1. Decryption — “Do It Well, Do It Once”

Per the NSA advisory, breaking and inspecting TLS traffic should only be conducted once within the enterprise network. Redundant TLS inspection, wherein a client-server traffic flow is decrypted, inspected, and re-encrypted by one forward proxy, then forwarded to a second forward proxy for more of the same, should not be performed. Redundant decryption/encryption increases the risk surface, provides additional opportunities for adversaries to gain unauthorized access to decrypted traffic, and offers no additional benefits.

Inspecting multiple times can greatly complicate diagnosing network issues with TLS traffic. Also, per the NSA’s advisory, “Multi-inspection further obscures certificates when trying to ascertain whether a server should be trusted. In this case, the ‘outermost’ proxy makes the decisions on what server certificates or CAs should be trusted and is the only location where certificate pinning can be performed.”
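
As a concrete illustration of the pinning check that belongs at that single, outermost inspection point, here is a minimal Python sketch that computes and compares a server’s SPKI fingerprint. It uses the third-party cryptography package; the host name and expected pin value are placeholders, not values from the advisory.

    # Sketch of SPKI certificate pinning, the check the NSA notes should live
    # only at the outermost proxy. Requires the third-party "cryptography" package.
    import base64
    import hashlib
    import ssl
    from cryptography import x509
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    EXPECTED_PIN = "placeholder-base64-spki-sha256="  # known-good pin (placeholder)

    def spki_pin(host: str, port: int = 443) -> str:
        """Fetch the server certificate and return the base64 SHA-256 of its SPKI."""
        pem = ssl.get_server_certificate((host, port))
        cert = x509.load_pem_x509_certificate(pem.encode())
        spki = cert.public_key().public_bytes(
            Encoding.DER, PublicFormat.SubjectPublicKeyInfo
        )
        return base64.b64encode(hashlib.sha256(spki).digest()).decode()

    if spki_pin("vpn.example.com") != EXPECTED_PIN:
        raise SystemExit("pin mismatch: do not trust this server")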

2. Plaintext — Limit Access

While decryption is a step forward, it introduces new security concerns around PII exposure and privacy. An adversary can target the devices where traffic of interest is decrypted: rather than attacking data at rest, which is likely protected by strong access controls and encryption, the attacker goes after the inspection tools themselves, making them a new target of interest for exploitation.

Confidentiality and compliance are crucial for a broad range of government and private organizations alike. Regulations in the financial, insurance, and healthcare industries require that sensitive data be protected. Penalties for noncompliance with such regulations can be severe, ranging from fines to imprisonment.

In this matter, the NSA recommends “Setting a policy to enforce that traffic is decrypted and inspected only as authorized, and ensuring that decrypted traffic is contained in an out-of-band, isolated segment of the network prevents unauthorized access to the decrypted traffic.”

3. TLS Inspection — Important Post-implementation Work

Once TLS inspection is deployed, security admins should focus on the upkeep of the implementation. Networks constantly change shape, blind spots can appear and disappear, and legitimate traffic can be disrupted unintentionally.

There may be cases where previously accessible, high-risk websites are now blocked. According to the NSA advisory, “Administrators must balance usability with security. Some TLS vendor solutions provide additional features for enhancing usability, such as bypassing traffic for known incompatible applications. Enterprises should enable these usability features when needed.”

4. Selective Decryption — Attention to High-Risk Traffic

Most organizations don’t see an ROI in decrypting (nearly) everything, but are then caught out when they do need the capability. For this reason, Gigamon recommends selective decryption, using the iSSL profile and flow mapping, so that high-risk flows are decrypted while everything else passes through encrypted by default.

The benefit here is that when you do face a high-risk situation, decryption for risky flows (statically or dynamically selected) can be enabled so that your network detection and response tool, among others, has access to the plaintext.
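
The decision logic behind selective decryption can be simple. Below is a toy Python sketch of a decrypt/bypass policy; the lists, categories, and field names are illustrative only and are not Gigamon’s flow-mapping syntax.

    # Toy decrypt/bypass decision for selective decryption. The categories and
    # field names are illustrative, not Gigamon flow-mapping syntax.
    from ipaddress import ip_address, ip_network

    NEVER_DECRYPT_DOMAINS = {"bank.example.com", "healthportal.example.org"}  # compliance
    ALWAYS_DECRYPT_NETS = [ip_network("203.0.113.0/24")]  # e.g., a VPN gateway segment
    HIGH_RISK_CATEGORIES = {"uncategorized", "newly-registered", "dynamic-dns"}

    def should_decrypt(dst_ip: str, sni: str, url_category: str) -> bool:
        if sni in NEVER_DECRYPT_DOMAINS:                    # privacy/compliance bypass
            return False
        if any(ip_address(dst_ip) in net for net in ALWAYS_DECRYPT_NETS):
            return True                                     # statically selected flows
        return url_category in HIGH_RISK_CATEGORIES         # dynamically selected flows

    print(should_decrypt("203.0.113.7", "unknown.example.net", "uncategorized"))  # True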

How Gigamon Can Help

Decryption is not an end in itself but a means to an end. The Gigamon Visibility and Analytics platform offers a range of capabilities to a) gain visibility into all data in motion and eliminate blind spots; b) decrypt the traffic, including TLS 1.3; and c) properly manage traffic flow to the security tool stack while taking precautions in handling plaintext. Allow/deny lists can be defined based on IP address, certificate status and metadata, URL categorization, domain name, and more. For compliance and legal officers, this selective, fine-grained decryption capability can ease “all or nothing” concerns and let them choose what’s best based on company policies.

The Gigamon masking capability provides customizable data protection by overwriting specific packet fields with a set pattern, safeguarding sensitive information during network analysis. Masking permanently obscures the data before it is sent to security and monitoring tools, which eases regulatory and privacy compliance because the sensitive data is never seen, processed, or stored by those tools.
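
Conceptually, masking overwrites matched bytes with a fixed fill before the payload ever reaches a downstream tool. The simplified Python sketch below illustrates the idea; production masking operates on packet fields at fixed offsets, and the regex patterns here are illustrative only.

    # Simplified illustration of masking: overwrite sensitive fields with a set
    # pattern before payloads reach downstream tools. The regexes are only
    # illustrative; real packet masking targets fields at fixed offsets.
    import re

    PATTERNS = [
        re.compile(rb"\b\d{16}\b"),             # e.g., card-number-like digits
        re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),  # e.g., SSN-like pattern
    ]

    def mask(payload: bytes, fill: bytes = b"X") -> bytes:
        for pat in PATTERNS:
            payload = pat.sub(lambda m: fill * len(m.group()), payload)
        return payload

    print(mask(b"card=4111111111111111&ssn=123-45-6789"))
    # b'card=XXXXXXXXXXXXXXXX&ssn=XXXXXXXXXXX'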
