Discover More: Maximising the Value of Flow Data for the SOC

Security analytics and network visibility tools used by defenders collect as much useful data as possible to improve anomaly detection accuracy. This gives defenders proactive network security monitoring and alerting for attempted cyber-attacks or incidents that are in progress.

Written by Team Nucleus

6th March, 2020



Achieving the level of granularity needed at carrier scale, where networks are more complex and distributed than ever before, is difficult. Some of that complexity and the resulting overhead can be removed by pre-processing the massive volume of collected data: unifying information from different sources and normalising its formatting to simplify the next stages of collection, processing and analysis. Importantly, this does not mean removing data or dropping anything of value; rather, it streamlines the dataset into more manageable, easily digestible chunks.
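
As a minimal sketch of that normalisation step, the Python below maps two hypothetical record formats, a NetFlow-style export and a firewall log, onto one common flow schema. The field names on both sides are illustrative assumptions, not any specific vendor's schema.

```python
# A minimal sketch, assuming two hypothetical input formats: normalise
# NetFlow-style and firewall-log records onto one common flow schema.
# Field names on both sides are illustrative, not a specific vendor's.

def normalise_netflow(rec: dict) -> dict:
    """Map NetFlow-style field names onto the common schema."""
    return {
        "src_ip": rec["srcaddr"],
        "dst_ip": rec["dstaddr"],
        "src_port": rec["srcport"],
        "dst_port": rec["dstport"],
        "protocol": rec["prot"],
        "bytes": rec["dOctets"],
        "source": "netflow",
    }

def normalise_firewall(rec: dict) -> dict:
    """Map firewall-log field names onto the same schema."""
    return {
        "src_ip": rec["src"],
        "dst_ip": rec["dst"],
        "src_port": rec["spt"],
        "dst_port": rec["dpt"],
        "protocol": rec["proto"],
        "bytes": rec.get("bytes", 0),
        "source": "firewall",
    }
```

Once every source speaks the same schema, downstream enrichment and analysis only have to handle one record shape.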


Bringing together data from disparate network infrastructure is key to building a complete picture of network activity. This should include syslog, firewall logs and metrics from NetFlow, IPFIX and other infrastructure and application platforms, often collected via probes and sensors. Once consolidated, the data can be enriched and tagged with metadata based on infrastructure type, endpoint classification, service type, application type, threat intelligence, IoT classification, GeoIP and ASN.
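
A hedged illustration of that enrichment stage: the sketch below tags a normalised flow record against a stand-in threat-intelligence feed and a tiny GeoIP/ASN table. Both lookup tables (THREAT_FEED, GEO_DB) are invented examples, not real data sources.

```python
import ipaddress

# Stand-ins for a real threat-intelligence feed and GeoIP/ASN database.
THREAT_FEED = {"203.0.113.7"}                    # example known-bad IPs
GEO_DB = {"198.51.100.0/24": ("DE", "AS64500")}  # prefix -> (country, ASN)

def enrich(flow: dict) -> dict:
    """Tag a normalised flow with threat, geo and ASN metadata."""
    dst = ipaddress.ip_address(flow["dst_ip"])
    flow["threat_match"] = flow["dst_ip"] in THREAT_FEED
    for prefix, (country, asn) in GEO_DB.items():
        if dst in ipaddress.ip_network(prefix):
            flow["geo_country"], flow["asn"] = country, asn
            break
    return flow
```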


Data tagging allows downstream analysis tools to operate rapidly on managed objects such as logical subnets, application types or other classifications. This reduces processing overhead and lets users navigate quickly by, for example, geographical location, region, or other useful attributes such as organisation or ISP derived from IP addresses. Enrichment is an important part of pre-processing, as the added data can be used for both immediate threat response and forensic investigations.
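
Subnet tagging, for instance, might look like the following sketch, where flows are mapped onto named managed objects so later tools can filter on a single tag instead of re-evaluating raw IP ranges. The subnet names and ranges are assumptions for illustration only.

```python
import ipaddress

# Example managed objects: logical subnets and the tag each maps to.
SUBNET_TAGS = {
    ipaddress.ip_network("10.1.0.0/16"): "corp-lan",
    ipaddress.ip_network("10.2.0.0/16"): "guest-wifi",
    ipaddress.ip_network("192.0.2.0/24"): "dmz",
}

def tag_subnet(flow: dict) -> dict:
    """Label the flow with its logical subnet, if any."""
    src = ipaddress.ip_address(flow["src_ip"])
    flow["subnet"] = next(
        (tag for net, tag in SUBNET_TAGS.items() if src in net),
        "unclassified",
    )
    return flow
```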





[Figure: Example flow data enrichment process]

Once unified and enriched, the data is ready to be exported to one or more collection and analysis tools. Outputs are designated, streams are aggregated to ease bottlenecks and congestion, and data is routed where and when it is needed.
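
One way to picture that export stage, purely as an assumed sketch: enriched records fan out to designated sinks, with a routing rule deciding which tools receive which streams. The sink callables here are placeholders for real SIEM or data-lake exporters.

```python
# Placeholder sinks standing in for real SIEM / data-lake exporters.
OUTPUTS = {
    "siem":      lambda rec: print("-> SIEM:", rec),
    "data_lake": lambda rec: print("-> data lake:", rec),
}

def route(flow: dict) -> None:
    """Send threat matches everywhere; bulk flows only to the data lake."""
    names = OUTPUTS if flow.get("threat_match") else ("data_lake",)
    for name in names:
        OUTPUTS[name](flow)
```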


A common point of normalisation, aggregation and enrichment can reduce costs and pressure on downstream tools such as SIEMs, data lakes and analysis platforms. Enrichment improves network visibility, security and response across all tools, and helps security operations teams gain real-time situational awareness of all traffic on the network, in the datacentre and in the cloud, so they can respond quickly and effectively to anomalies and threats.
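
To make the cost reduction concrete, the sketch below shows one common aggregation technique (not necessarily the one any given product uses): many individual flow records collapse into a single summary per src/dst/port/protocol key, so downstream tools ingest far fewer events. Time-window handling is omitted for brevity.

```python
def aggregate(flows: list) -> list:
    """Collapse many flow records into one summary per conversation key."""
    buckets = {}
    for f in flows:
        key = (f["src_ip"], f["dst_ip"], f["dst_port"], f["protocol"])
        # First record for a key seeds the summary; counters start at zero.
        b = buckets.setdefault(key, {**f, "bytes": 0, "flow_count": 0})
        b["bytes"] += f["bytes"]
        b["flow_count"] += 1
    return list(buckets.values())
```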


Why is this important? Attackers are more sophisticated, agile and organised than ever before, so maximising the value of the data you already own across all tools increases visibility into the hallmarks of suspicious behaviour on the network. That visibility is critical for defending against attacks.


Talk to us about Telesoft FlowStash and TDAC – Discover more from data you already own.
