This multi-part blog series explores how advanced network-traffic analytics changed the Department of Defense's approach to cyber security operations, creating a far more effective methodology for protecting many of our nation's most sensitive networks. In this first post we'll cover the limitations of today's best, mostly automated tools: perimeter defenses, SIEMs, and security analytics. The problems the DoD faced before adopting advanced network-traffic analytics are the same problems enterprises encounter today in securing their own networks.

As most everyone in cyber security realizes by now, a determined attacker will almost always find a way around or through perimeter defenses. Perimeter defense tools use a signature-based approach that leaves them vulnerable to even the slightest malware modification, while a legitimate username and password obtained through, say, spear phishing must simply be let through. Even some of the most advanced perimeter tools, which use whitelist sandboxing techniques, can be bypassed or fooled. That's not to say these tools are useless; they do initiate plenty of legitimate alerts.

To help correlate and prioritize the thousands of alerts coming from these systems, the industry has turned to SIEMs. Even though SIEMs help, they still usually overwhelm analysts, and worse, the logs and events used to create alerts can be modified or subverted by a sophisticated attacker. Unfortunately, a SIEM can never be "tuned" perfectly: too tight, and you might miss an intrusion; too loose, and you overwhelm your analysts.

Network traffic-based tools, by contrast, capture the ground truth of what's happening on a network and almost by definition cannot be subverted. NetFlow-based analytics tools, for example, provide good behavioral information, but they capture only a top-level summary of network traffic and can't provide enough detail for a complete, rapid investigation.
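The fragility of signature-based detection is easy to demonstrate. The sketch below models a signature database as a set of file hashes (a deliberately simplified stand-in for real signature engines, which use richer pattern matching): a single appended byte is enough for a "known" sample to stop matching. The payload strings and database here are hypothetical, purely for illustration.

```python
import hashlib

# Hypothetical "signature database": hashes of known-bad samples.
KNOWN_MALWARE_HASHES = {
    hashlib.sha256(b"malicious payload v1").hexdigest(),
}

def signature_match(sample: bytes) -> bool:
    """Return True if the sample's hash appears in the signature database."""
    return hashlib.sha256(sample).hexdigest() in KNOWN_MALWARE_HASHES

original = b"malicious payload v1"
modified = b"malicious payload v1 "  # one trailing byte added

print(signature_match(original))  # True  -- the known sample is caught
print(signature_match(modified))  # False -- the trivially modified one slips through
```

Real signature engines are more sophisticated than a hash lookup, but the underlying limitation is the same: detection depends on having seen (roughly) that exact artifact before.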
Security analytics and forensics tools work well for real-time deep packet inspection and for forensics and other historical analysis, but because they unravel all content, even their metadata databases are very large and distributed. And as anyone in tech knows, large, distributed databases mean slow query response times, often measured in hours at very large scale. That lack of responsiveness is debilitating to an analyst in the middle of a real-time attack.

Now that we've explored some of the primary limitations of the tools cyber security analysts use today, in our next post we'll sit down with a former DoD analyst as he describes a typical "day in the life" of an analyst attempting to work with these tools.
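The trade-off described above, flow summaries that are compact but shallow versus full-content stores that are complete but slow to query, comes down to what a flow record actually retains. The sketch below shows the handful of summary fields a NetFlow-style record keeps per flow; the field names are simplified for illustration, not the exact NetFlow v5/v9 layout.

```python
from dataclasses import dataclass

# Simplified flow record: roughly the 5-tuple plus counters and timestamps.
@dataclass
class FlowRecord:
    src_ip: str
    dst_ip: str
    src_port: int
    dst_port: int
    protocol: str
    packets: int
    octets: int      # total byte count for the flow
    start_ts: float
    end_ts: float

# An entire multi-megabyte session collapses into one small record
# (addresses and counts here are made up):
flow = FlowRecord("10.0.0.5", "203.0.113.9", 49152, 443, "TCP",
                  packets=1824, octets=2411532,
                  start_ts=1700000000.0, end_ts=1700000042.0)

# The payload itself -- the content an analyst needs for a full
# investigation -- is not retained anywhere in the record.
print(flow.octets)  # prints 2411532: a byte count, not the bytes themselves
```

This is why flow data is excellent for spotting behavioral anomalies (who talked to whom, how much, when) but can't, on its own, answer what was actually said.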