In an ideal world, threat intelligence would prevent IT security incidents from occurring in the first place; in reality, incidents are inevitable, often with associated data breaches.

Post-event clean-up requires intelligence gathering too, and the quicker it can be done the better. As incident response capabilities speed up, the ability to use intelligence in real time increases.

As Cisco’s Sourcefire puts it, the need for security intelligence is “before, during and after.” The more quickly intelligence can be gathered, the more likely it is to be put to use for proactive defence rather than post-event clean-up; this is the area of real-time security analytics.

Threat intelligence gathered “before” is still the lifeblood of the IT security industry. It includes blacklists of common spam emails, malware signatures and dodgy URLs, as well as whitelists of known-good files and senders.

All this is still a key part of protecting IT users and relies on the vast threat intelligence gathering networks that sit at the core of most IT security companies. The power of these networks is that they are kept up to date by gathering intelligence from, and sharing it with, large customer bases. However, many now accept that intelligence gathered “before” is never going to stop the most insidious threats.
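To make the “before” model concrete, here is a minimal sketch of how such lists are applied, assuming hypothetical feed files with one indicator per line; no real vendor’s feed format or API is implied.

```python
# Minimal sketch of "before" intelligence: checking an indicator
# (URL, domain or file hash) against a blacklist and a whitelist.
# The feed file names and one-indicator-per-line format are
# hypothetical placeholders, not any vendor's actual format.
import hashlib

def load_feed(path):
    """Load one indicator per line into a set for fast lookup."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

blacklist = load_feed("blacklisted_indicators.txt")  # known-bad
whitelist = load_feed("whitelisted_indicators.txt")  # known-good

def verdict(indicator: str) -> str:
    """Whitelist wins, then blacklist; everything else is unknown."""
    i = indicator.lower()
    if i in whitelist:
        return "allow"
    if i in blacklist:
        return "block"
    return "unknown"  # the gap where "before" intelligence cannot help

# e.g. check a downloaded file by its SHA-256 hash
with open("suspicious.bin", "rb") as f:
    print(verdict(hashlib.sha256(f.read()).hexdigest()))
```

The “unknown” branch is the point: list-based defences can only ever classify what has been seen before.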

So, in the worst-case scenario, when an incident has occurred, what can be done “after”? The requirement is to understand the extent of the damage. This is the world of IT forensics: preparing reports for internal investigations, responding to regulators and, in some cases, communicating with criminal investigators.

Examples of relevant incidents include the discovery of unknown malware, evidence of hacking and, in some cases, the suspicious behaviour of employees.

Well-established vendors in forensics include Guidance Software, Access Data, Stroz Friedberg and Dell Forensics. In 2013, Guidance released a new version of its EnCase product called EnCase Analytics.

Ultimately, many of the clues to what has happened lie on servers, storage systems and end-user devices, so whilst EnCase Analytics is a network-based tool, these endpoints are its focus. The volumes of data involved can be huge; as Guidance puts it, this is where “big data meets digital investigations.”

Access Data’s Cyber Intelligence and Response Technology (CIRT) collects endpoint and network data to provide comprehensive insight into incidents. Access Data has re-packaged this as a platform it calls Insight to provide continuous automated incident resolution (CAIR).

New capabilities include improved malware analysis (what might this software have already done?), more automated responses and real-time alerts. This all goes well beyond forensics, moving Access Data from “after” to “during”, and even gives it some “before” capability. Like Guidance and other forensics vendors, Access Data relies on SIEM vendors for some of its intelligence.

In the past, SIEM has typically been an “after” technology too. Most SIEM vendors come from a log management background: the collection and storage of data from network and security system log files for later analysis.
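A minimal sketch of that log-management core follows, assuming an illustrative syslog-like line format and a local SQLite store; real SIEM products use their own collectors and schemas.

```python
# Sketch of SIEM's log-management core: collect log lines, normalise
# them into records and store them for later ("after") analysis.
# The log format and field names are illustrative assumptions.
import re
import sqlite3

# Assumed line format: "<date> <time> <host> <event text>"
LINE = re.compile(r"^(?P<ts>\S+ \S+) (?P<host>\S+) (?P<event>.+)$")

db = sqlite3.connect("events.db")
db.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, host TEXT, event TEXT)")

def ingest(path):
    """Parse a log file and store each recognised line as a record."""
    with open(path) as f:
        for line in f:
            m = LINE.match(line.strip())
            if m:
                db.execute("INSERT INTO events VALUES (?, ?, ?)",
                           (m["ts"], m["host"], m["event"]))
    db.commit()

# Later, "after" an incident: which hosts recorded failed logins?
ingest("firewall.log")
for host, count in db.execute(
        "SELECT host, COUNT(*) FROM events "
        "WHERE event LIKE '%failed login%' GROUP BY host"):
    print(host, count)
```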

Many of the major IT security vendors have entered the SIEM market via acquisition: HP bought ArcSight in 2010, IBM bought Q1 Labs in 2011, McAfee bought NitroSecurity in 2011, EMC’s RSA bought NetWitness in 2011 and KEYW’s Hexis bought Sensage in 2012.

Other vendors include LogRhythm, Red Lambda and Trustwave. Splunk is often included in lists of SIEM vendors, but its focus is even broader, using IT operational intelligence to provide commercial as well as security insight; this is the subject of a new Quocirca report, Masters of machines, sponsored by Splunk.

As with forensics, the volumes of data are so big that SIEM is increasingly referred to as a ‘big data problem.’ It fits the definition well if you go by the five Vs of big data: volume, variety, velocity, value and veracity. There is certainly lots of data involved (volume) and it comes from a range of sources (variety), often being enriched with data from other sources.

However, it is the increasing capability to use SIEM data in real time that ticks the velocity box, and this is turning SIEM into a “during” technology. Quocirca covered this in its 2012 report Advanced cyber-security intelligence, sponsored by LogRhythm. Anything that minimises the impact of security incidents clearly has value, and veracity comes from the truth exposed through deep insight.
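The shift from “after” to “during” is essentially the shift from batch queries over stored logs to evaluating each event as it arrives. A minimal sketch, with an illustrative threshold and a hypothetical event shape:

```python
# Sketch of "during" analytics: evaluate each event on arrival and
# alert when a threshold is crossed within a sliding time window.
# The event fields, window and threshold are illustrative assumptions.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 5  # failed logins per source within the window

recent = defaultdict(deque)  # source IP -> timestamps of recent failures

def on_event(source_ip, event, now=None):
    """Called for every event on the stream, not in a later batch job."""
    if "failed login" not in event:
        return
    now = now if now is not None else time.time()
    q = recent[source_ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:  # drop events outside window
        q.popleft()
    if len(q) >= THRESHOLD:
        print(f"ALERT: {len(q)} failed logins from {source_ip} "
              f"in {WINDOW_SECONDS}s")

# Simulated stream: six failures from one source in six seconds
for i in range(6):
    on_event("203.0.113.7", "failed login for admin", now=1000.0 + i)
```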

To use intelligence from a range of sources in real time, identifying and mitigating threats as they occur, is the Holy Grail of IT security. Of course, there are plenty of measures that can be taken: running suspicious files in sandboxes (witness the rapid growth of FireEye); only allowing known-good files to run (for example, with whitelisting technology from Bit9, another vendor that has upped its ante for the “during” phase with its recent merger with Carbon Black); blocking access to dangerous areas of the web, a constantly moving target (URL filtering from Websense, Proofpoint and others); or judicious checking of content in use (content inspection and redaction from Clearswift and other vendors in the data loss prevention/DLP space).
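As one concrete illustration of that last measure, content inspection and redaction can be as simple as pattern-matching content in use. The naive card-number pattern below is purely illustrative; real DLP engines are far more sophisticated.

```python
# Sketch of DLP-style content inspection and redaction. The pattern
# catches only simple 13-16 digit card-like numbers and is an
# illustrative assumption, not any vendor's detection logic.
import re

CARD = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def redact(text: str) -> str:
    """Replace anything that looks like a payment card number."""
    return CARD.sub("[REDACTED]", text)

print(redact("Customer card 4111 1111 1111 1111 expires 12/26."))
# -> Customer card [REDACTED] expires 12/26.
```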

So, in general terms, the news is good: the vendors that aim to protect IT infrastructure are upping the ante in the arms race with attackers (and it always will be an arms race).

More and more are making use of their ability to process and analyse large volumes of data in real time to better protect IT systems.

The bad news is that there is no silver bullet and there never will be; a range of security technologies will be required to provide state-of-the-art defences, and there will be no standing still. Those who would steal your data are moving the goalposts all the time, and they will be doing so “before,” “during” and “after” their attacks.
