NetFlow is used by IT professionals to analyze network traffic: to determine its point of origin, destination, volume and paths on the network. The NetFlow network protocol collects IP network traffic as it flows in or out of an interface. The flow data is then analyzed to create a picture of network traffic flow and volume.
Before NetFlow, network engineers and administrators used Simple Network Management Protocol (SNMP) to monitor network traffic. While SNMP was effective for network monitoring and capacity planning, it didn't provide detailed insight into bandwidth usage. NetFlow, originally developed by Cisco, later served as the basis for the Internet Engineering Task Force (IETF) standard Internet Protocol Flow Information eXport (IPFIX), and flow export is now widely implemented by network equipment vendors.
NetFlow follows a simple process: data is collected, sorted and analyzed. Its main components are the IP flow, the NetFlow cache and two methods of accessing the flow data.
An IP flow consists of a group of packets that contain the same IP packet attributes. As a packet is forwarded within a router or switch, it is examined for a set of attributes, including IP source address, IP destination address, source port, destination port, Layer-3 protocol type, class of service and router or switch interface.
The NetFlow cache is a database of condensed information where data is stored once the packets have been examined.
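The two steps above can be sketched in a few lines of Python. This is a minimal illustration, not an actual NetFlow implementation: the packet records and field names are hypothetical, but the key is built from the same seven attributes the text lists, and packets sharing a key are condensed into one cache entry.

```python
from collections import defaultdict

# Hypothetical packet records; field names are illustrative only.
packets = [
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2", "src_port": 51514,
     "dst_port": 443, "proto": 6, "tos": 0, "iface": 1, "bytes": 1500},
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2", "src_port": 51514,
     "dst_port": 443, "proto": 6, "tos": 0, "iface": 1, "bytes": 900},
    {"src_ip": "10.0.0.3", "dst_ip": "10.0.0.2", "src_port": 40000,
     "dst_port": 80, "proto": 6, "tos": 0, "iface": 2, "bytes": 600},
]

# The flow cache: packets that share all seven key attributes belong to
# the same flow, so they collapse into one entry with aggregate counters.
cache = defaultdict(lambda: {"packets": 0, "bytes": 0})
for pkt in packets:
    key = (pkt["src_ip"], pkt["dst_ip"], pkt["src_port"], pkt["dst_port"],
           pkt["proto"], pkt["tos"], pkt["iface"])
    cache[key]["packets"] += 1
    cache[key]["bytes"] += pkt["bytes"]
```

Here the three packets condense into two cache entries, because the first two share every key attribute and differ only in byte count.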
The Command Line Interface (CLI) is one of two methods to access NetFlow data. It provides an immediate view of your network traffic and is useful for troubleshooting.
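On Cisco IOS devices, for example, the cache and export state can be inspected directly from the CLI:

```
Router# show ip cache flow     ! display the current NetFlow cache
Router# show ip flow export    ! display export configuration and statistics
```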
The second option to access NetFlow data is to export the data to a NetFlow collector, which is a reporting server that collects and processes traffic information so that it is easy to analyze.
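To make the export path concrete, the sketch below builds a single NetFlow v5 datagram (a 24-byte header followed by one 48-byte flow record) and sends it over UDP. The flow values and the collector address are assumptions for illustration; real exporters batch up to 30 records per packet and fill the uptime and sequence fields from device state. Port 2055 is a common (but not mandated) collector port.

```python
import socket
import struct

# NetFlow v5 header: version, count, SysUptime, unix secs/nsecs,
# flow sequence, engine type, engine id, sampling interval.
V5_HEADER = "!HHIIIIBBH"
# NetFlow v5 flow record: src/dst/nexthop IPs, interfaces, counters,
# timestamps, ports, flags, protocol, ToS, AS numbers, masks, padding.
V5_RECORD = "!4s4s4sHHIIIIHHBBBBHHBBH"

header = struct.pack(V5_HEADER, 5, 1, 0, 0, 0, 0, 0, 0, 0)
record = struct.pack(
    V5_RECORD,
    socket.inet_aton("10.0.0.1"),   # source IP
    socket.inet_aton("10.0.0.2"),   # destination IP
    socket.inet_aton("0.0.0.0"),    # next hop
    1, 2,                           # input/output interface (SNMP index)
    10, 8000,                       # packets, bytes in this flow
    0, 5000,                        # flow start/end (SysUptime, ms)
    51514, 443,                     # source/destination port
    0, 0x10, 6, 0,                  # pad, TCP flags (ACK), protocol (TCP), ToS
    0, 0, 24, 24, 0)                # src/dst AS, src/dst masks, pad
datagram = header + record          # 24-byte header + 48-byte record

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(datagram, ("127.0.0.1", 2055))  # hypothetical local collector
```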
NetFlow statistics are useful for several applications, including network monitoring, capacity planning, usage-based accounting and security analysis.
NetFlow has a performance impact on the devices where it is implemented. To reduce that impact, networking devices often rely on sampling packets to generate NetFlow statistics. Low sampling rates, sometimes as low as one in 1,000 packets, dramatically reduce network visibility and could prevent teams from uncovering critical security threats or performance issues.
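The visibility loss is easy to quantify. Assuming each packet is sampled independently with probability 1/N (a simplification of how real samplers behave), a flow of k packets is observed at all with probability 1 − (1 − 1/N)^k:

```python
def p_flow_seen(packets_in_flow: int, n: int) -> float:
    """Probability that at least one packet of a flow is sampled
    under independent 1-in-n random packet sampling."""
    return 1.0 - (1.0 - 1.0 / n) ** packets_in_flow

# At 1-in-1,000 sampling, a short 10-packet flow (a scan probe, say)
# is almost always invisible, while a large transfer is usually seen.
for k in (1, 10, 1000, 100000):
    print(f"{k:>6}-packet flow seen with probability {p_flow_seen(k, 1000):.4f}")
```

At this rate a 10-packet flow is seen less than 1 percent of the time, which is why sampled NetFlow is a poor fit for detecting the short, low-volume flows that often characterize reconnaissance traffic.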
Additionally, NetFlow records can only be forwarded to a select number of collectors or monitoring tools. Often, this number can be far fewer than required to properly manage and troubleshoot the network. As businesses face a growing volume of both data and security threats, seeing only a portion of what is happening in the network puts businesses at risk of having insufficient information to combat security threats.
Gigamon eliminates the risks associated with data sampling by running NetFlow statistics in parallel with the raw packet streams. With these processing capabilities, Gigamon users can generate NetFlow statistics either at a much higher sampling rate or even at line rate.
NetFlow generation is typically undertaken by the routers and switches of the production network. However, as mentioned above, NetFlow does have a performance impact on the devices where it is implemented. Keeping up with growing data volumes and network speeds is a mounting concern for most enterprises, which are straining to provision enough compute resources to match demand.
How does Gigamon address this challenge? Through metadata. While NetFlow provides Layer-4 flow data, organizations also need access to Layer-7, or application-level, metadata. The Gigamon Metadata Generation capability, which includes NetFlow, generates both Layer-4 and Layer-7 metadata that is unsampled and produced without impacting the performance of production devices.
When we implemented this solution here at Gigamon, we saw fewer false positives, faster threat detection and more effective use of our security team. By integrating Gigamon Metadata Generation with our Security Information and Event Management (SIEM) solution, we were able to identify unusual patterns in Hypertext Transfer Protocol (HTTP) response codes, specific domains indicating a possible security breach and users attempting to reach sites signed by WoSign Secure Sockets Layer (SSL) certs.
NetFlow has been and will continue to be a powerful tool for gaining greater network visibility. By offloading NetFlow to Gigamon Metadata Generation, both SecOps and NetOps teams can keep pace with growing data volume and speed without sacrificing the important insights that network monitoring and security analysis provide.