For more than a hundred years now, the Pareto Principle has reminded us that the relationship between inputs and outputs is rarely in balance. Though the 80/20 rule began as an observation about how wealth was distributed across Italy's population, it has since been applied in countless other disciplines.
Pareto’s thinking may even help us contextualize innovation in the cyber security industry. While technologies like artificial intelligence, machine learning, and automation have garnered a sizable share of our attention, the reality is that most of the work in cyber security is still accomplished by human analysts. For all their promise, machines just aren’t as well equipped for work that is high risk and irregular in nature, such as the work performed by threat hunters or even incident response teams.
While artificial intelligence, machine learning, and automation have a place in augmenting the work of cyber security analysts in these roles, what helps them accomplish far more is something foundational: network visibility.
If you can’t see it, you can’t protect it, which is why analysts are blind without network visibility. Teams need visibility across the entire enterprise, from their multi-cloud environments to everything that is still on-prem. They need visibility into all traffic that enters and exits the network, plus the traffic that moves laterally. And thanks to increasing loads from existing systems and new devices like IoT, the span of what must be seen has never been wider, and it shows no signs of shrinking.
Visibility gives analysts the context-rich picture needed to get the job done. What devices are on the network, and who is using them? Where does their traffic originate, and what is being shared? These details and many others are vital for understanding exposure and planning remediation. If only it were that easy.
With the growth in network traffic, the amount of data available to analysts is staggering. So while visibility is foundational to cyber security, teams must take a balanced approach to what is actually collected and stored.
Having too much data may actually hinder an analyst’s work. Consider PCAP data as an example. While full PCAPs might provide an unmatched view into what’s happening on the network, the storage requirements could easily reach into the petabytes and drive increased costs for related technologies such as data lakes and SIEMs.
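To see how quickly full-packet capture can balloon, a back-of-the-envelope estimate helps. The sketch below uses hypothetical numbers (a single 10 Gbps link at 50% average utilization) purely for illustration; real traffic rates, compression, and retention policies will vary widely.

```python
# Illustrative estimate of how fast full PCAP capture fills storage.
# The link rate and utilization figures are assumptions, not measurements.

def pcap_storage_bytes(link_gbps: float, utilization: float, days: float) -> float:
    """Raw capture size for a link recorded at the given average utilization."""
    bytes_per_second = link_gbps * 1e9 / 8 * utilization  # bits -> bytes
    return bytes_per_second * days * 86_400               # 86,400 seconds/day

# A single 10 Gbps link at 50% average utilization:
per_day = pcap_storage_bytes(10, 0.5, 1)
print(f"{per_day / 1e12:.1f} TB per day")        # 54.0 TB per day
print(f"~{1e15 / per_day:.0f} days to reach 1 PB")
```

Even at half utilization, one such link generates roughly 54 TB of raw capture per day, crossing the petabyte mark in under three weeks, before counting replication, indexing, or the downstream storage in data lakes and SIEMs.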
The problems don’t stop with increased hardware and license costs. Full PCAPs may also limit an analyst’s ability to work quickly when seconds matter, because the data is dense and demands a major commitment of time to parse and analyze.
All this brings another principle to mind, one named after Goldilocks. What cyber security teams really need is network visibility that’s not too much or too little, but just right. We’ll explore this concept more in our next post. Until then, see how Bricata provides organizations with end-to-end visibility and full context for direct answers and powerful insight to take immediate action.