RSA 2014 Recap: The Year of Pervasive Security and Analytics

by: Neal Allen, Sr. Worldwide Training Engineer, Gigamon

According to ESG research and Jon Oltsik, Sr. Principal Analyst at ESG, 44% of organizations believe that their current level of security data collection and analysis could be classified as “big data,” while another 44% believe that their security data collection and analysis will be classified as “big data” within the next two years. (Note: in this case, big data security analytics is defined as “security data sets that grow so large that they become awkward to work with using on-hand security analytics tools.”)

This trend was highlighted at the RSA Conference the week before last, with many organizations, including Gigamon, talking about ways security professionals can sift through the noise to find “the needle in the haystack.” Large amounts of security-related data are driving the need for Big Data security analytics tools that can make sense of all this information to uncover and identify malicious and anomalous behavior.

Until a few years ago, threats came largely from script kiddies and other unsophisticated hackers looking to disrupt communications. Organized crime then discovered it could make a lot of money selling access into corporate networks – so it started hiring very smart people to hack in. Around the same time, some governments created formal, but unofficial, departments whose job it was to steal third-party intellectual property in order to advance their nations.

Between organized crime and state-sponsored industrial espionage, the interior of the network is at as much risk as the perimeter. This is particularly true with the growth in BYOD and mobility in general. If security analytics and security tool vendors are having problems keeping up with newly upgraded 10Gb edge links, how will they deal with core networks containing many 10Gb, 40Gb or faster links? Not to mention that user edge traffic often is not even tapped or spanned, because of the potentially high cost of monitoring copious amounts of data across expansive networks.

The nature of security is evolving quickly, and no single technique or approach to securing the network suffices anymore. Companies focused on security are now embracing multiple approaches in parallel to address security effectively. These include solutions that are inline and out-of-band, as well as solutions that do packet-level analysis and flow-level analysis. Gigamon, together with its Ecosystem Partners, presented at RSA and highlighted the critical role Gigamon’s Visibility Fabric™ plays in enabling pervasive security for best-in-breed solutions from Sourcefire/Cisco, ForeScout, FireEye, Websense, TrendMicro, Riverbed, Narus, LogRhythm and nPulse.

An effective solution that enables pervasive security should support a multitude of approaches. The Gigamon Visibility Fabric does exactly that, with highly scalable and intelligent solutions that address inline, out-of-band, packet-based and now flow-based security tools and approaches. In addition, the Visibility Fabric can combine approaches effectively, including packet-based pre-filtering prior to generating NetFlow. Gigamon’s Visibility Fabric accelerates post-analysis – through granular filtering and forwarding of packets, as well as pervasive flow-level visibility – to find that “needle in the haystack.”
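To make the idea of packet-based pre-filtering prior to flow generation concrete, here is a minimal sketch in Python. This is not Gigamon code – the packet fields and the port filter are invented for illustration – but it shows the principle: packets are first filtered down to the traffic of interest, and only the survivors are aggregated into NetFlow-style records keyed by the 5-tuple.

```python
from collections import defaultdict

# Hypothetical illustration of pre-filtering before flow generation.
# Each packet is a dict; the field names here are invented for the sketch.

def prefilter(packets, allowed_ports):
    """Keep only packets whose destination port is of interest."""
    return (p for p in packets if p["dst_port"] in allowed_ports)

def to_flows(packets):
    """Aggregate packets into NetFlow-style records keyed by the 5-tuple."""
    flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for p in packets:
        key = (p["src_ip"], p["dst_ip"], p["src_port"], p["dst_port"], p["proto"])
        flows[key]["packets"] += 1
        flows[key]["bytes"] += p["length"]
    return dict(flows)

packets = [
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2", "src_port": 50000,
     "dst_port": 443, "proto": "tcp", "length": 1200},
    {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.2", "src_port": 50000,
     "dst_port": 443, "proto": "tcp", "length": 800},
    {"src_ip": "10.0.0.3", "dst_ip": "10.0.0.4", "src_port": 40000,
     "dst_port": 25, "proto": "tcp", "length": 600},
]

# Only the port-443 traffic survives the pre-filter, so the flow generator
# does less work and produces records only for the traffic of interest.
flows = to_flows(prefilter(packets, allowed_ports={443}))
```

Because filtering happens before aggregation, the flow-generation stage never spends cycles on traffic that no downstream tool cares about.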

We’ve entered a new world of network security, where providing insightful security analytics can be just as important as the ability to detect threats from across the network in real time. Walking around the booths at RSA, it was clear that without pervasive visibility, most networks will be left with limited or delayed situational awareness, security intelligence and operational responsiveness. In a rapidly moving world, that delay may prove too costly.

Big Data and Intelligent Visibility: Just Give Me the Information That’s Important to Me

by: Paul Hooper, Gigamon CEO
Thirty-one thousand text messages in one month. One can only describe that as startling. Coming from the generation that preceded the texting-era, this seems like an incredible volume of communications that my two daughters managed to accomplish between them in a 30 day period. Downloading the full detail records from the service provider resulted in 96 pages of mobile numbers that really provided little value or context to understand how or why they achieved this milestone. As a father, all I really wanted was a list of any text messages that originated or were destined for a mobile device owned by a boy that contained the word “love” in the message.   
Ironically, this small and personal example represents one of the larger challenges facing businesses today. The volume of information that is created and required by most enterprises is spiraling beyond any expectations we may have had in years gone by. With end-user devices continuing to increase in capacity, with enterprise networks accelerating up the performance curve, and with the rapid growth in the reach and raw speed of the mobile communications infrastructure, businesses, employees and, in my case, family members have an ever-increasing demand for, and ability to consume and share, information.
With this scale and growth in the demand for information, the ability to separate the material details from the immaterial is one of the hallmarks of an agile business. As the volume of reports, information and detail around and within the business grows, the smart money is on the organization that can identify the material details within a mountain of data – enabling faster reaction to changes within its own four walls or the larger market, and recognizing how to monetize new opportunities or inflections within the broader market.
And so, as information continues to scale in volume and performance, visibility into that information needs to become smarter and more intelligent. The more intelligence an organization applies to how it looks at information, the more responsive, more capable and, potentially, more successful it will become. We have seen this proven out many times in the world around us, and in many ways we see it in our personal lives. Watching live TV through channel-surfing is a relic of a bygone era; today we specifically identify and select what we believe is relevant and focus our few entertainment hours on the programs most relevant to our viewing requirements. Although some strides have been made within the residential market, intelligent visibility into information, traffic and data remains an aspirational vision for many organizations.
We live in exciting times. We live in a very connected, very communication-oriented world. The accelerating growth curve ahead for information creation and dissemination is clear. The need for intelligent visibility into that information has never been more obvious.

Mobile World Congress 2013 Recap: Big Visibility for Big Data & Turning Big Data into Manageable Data

by: Andy Huckridge, Director of Service Provider Solutions & SME

It was quite a week at Mobile World Congress. With a record attendance of around 72,000 people, this show continues to grow and grow, which made it the perfect place to showcase Gigamon’s technology aimed at solving the issue of big data for mobile service providers.

Subscribers continue to embrace mobile lifestyles and conduct work outside of the office, while applications become increasingly mobile. At the same time, more and more video is generated and consumed, which takes up orders of magnitude more bandwidth than legacy voice traffic.

In fact, in advance of the show, Cisco released its 6th annual Visual Networking Index (VNI) Global Mobile Data Traffic Forecast, indicating that mobile data traffic will increase 13-fold by 2017. Whether the growth lives up to this estimate remains to be seen, but it will probably come close. That’s a potentially scary statistic for mobile carriers.

We’ve heard of the problem of “Big Data” most often applied to enterprise storage and analytics, but it is clear that this is a major issue for these carriers as well, as analyst Zeus Kerravala writes in Network World. Big Data applications are increasing the volume of data in carriers’ pipes, posing a unique, but not insurmountable challenge.

Operators need a solution that won’t significantly increase tool expenses as the sizes of the pipes and the amount of data in those pipes increase. Carriers are looking for ways to keep their business costs realistically in line with what their subscribers are willing to pay for a service, and to provide subscribers with the quality, uptime and reliability they expect. In order to do this, carriers need to understand the nature of the traffic flowing through the pipes, its ingress and egress points, and where resources need to be placed on the network to ensure that service-level agreements are met.

The answer is to change the way Big Data is monitored. First, carriers require a solution that combines volume, port-density and scale to connect the right analytical tools to the appropriate large or bonded pipes. Second, the data must be conditioned through advanced filtering and packet manipulation, which reduces the amount of data arriving at each tool, while ensuring that the data is formatted precisely for the tool’s consumption. This way, each tool is able to process more data without needing to parse the incoming stream and steal processor cycles from the more important task of data analysis. Gigamon currently offers all of these features and announced a combined solution before the start of the show.
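As an illustration of this kind of conditioning – a simplified sketch, not Gigamon's implementation; the field names and the 128-byte slice length are assumptions – the following Python fragment drops packets from uninteresting VLANs and trims each remaining packet down to its headers, so a tool receives far fewer bytes per packet:

```python
# Hypothetical sketch of "conditioning" traffic before it reaches a tool:
# filter out uninteresting packets, then slice each survivor down to its
# headers so the tool processes less data. Byte counts are illustrative.

HEADER_BYTES = 128  # keep only the first 128 bytes (headers) of each packet

def condition(packets, interesting_vlans):
    out = []
    for pkt in packets:
        if pkt["vlan"] not in interesting_vlans:
            continue                                  # drop traffic the tool doesn't need
        sliced = dict(pkt)
        sliced["data"] = pkt["data"][:HEADER_BYTES]   # trim the payload away
        out.append(sliced)
    return out

packets = [
    {"vlan": 10, "data": bytes(1500)},   # full-size frame on a monitored VLAN
    {"vlan": 99, "data": bytes(1500)},   # VLAN the tool doesn't care about
]

# One packet remains, and only its first 128 bytes are forwarded to the tool.
reduced = condition(packets, interesting_vlans={10})
```

The payoff is exactly the one described above: the tool no longer spends cycles parsing and discarding traffic it was never going to analyze.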

However, volume, port density and scale won’t be enough for mobile carriers in the future. Effective monitoring of Big Data calls for reducing the amount of traffic in a large pipe to make it suitable for connection to existing lower-speed tools at 1G or 10G. Gigamon announced the development of this concept during the opening days of the show. Using this method, the connected tools continue to see a representative view of the traffic in the larger pipe, in a session-aware and stateful manner. The solution does more than merely filter traffic: it reduces the volume while keeping data flows intact, delivering a lower-speed feed within a smaller pipe. The carrier can then concentrate on specific types of data, or look at the entire range of traffic in the larger pipe.
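One common way to achieve this kind of session-aware reduction is hash-based flow sampling: decide whether to keep a packet by hashing its flow's 5-tuple, so that every packet of a selected flow is forwarded and no flow is ever split. The sketch below illustrates the generic technique in Python – it is not necessarily how Gigamon implements it, and the 25% keep ratio is an arbitrary assumption.

```python
import hashlib

# Hypothetical sketch of session-aware traffic reduction: instead of sampling
# individual packets (which tears flows apart), hash each packet's 5-tuple and
# keep the packet only when its flow hashes into the selected range. Every
# packet of a selected flow is kept, so the tool still sees complete sessions.

KEEP_RATIO = 0.25  # forward roughly a quarter of the flows (assumed value)

def flow_selected(src_ip, dst_ip, src_port, dst_port, proto):
    """Deterministic keep/drop decision based only on the flow key."""
    key = f"{src_ip}|{dst_ip}|{src_port}|{dst_port}|{proto}".encode()
    digest = int.from_bytes(hashlib.sha256(key).digest()[:4], "big")
    return digest / 2**32 < KEEP_RATIO

def reduce_pipe(packets):
    """Pass through only the packets belonging to selected flows."""
    return [p for p in packets
            if flow_selected(p["src_ip"], p["dst_ip"],
                             p["src_port"], p["dst_port"], p["proto"])]
```

Because the keep/drop decision is a pure function of the flow key, all packets of a given session receive the same decision – which is what keeps the reduced, lower-speed feed stateful and session-aware rather than a random slice of packets.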

This holistic network visibility solution from Gigamon will enable mobile service providers to handle the Big Data issue and maintain current business models – and, more importantly, to maintain existing expense structures while running the big data services of tomorrow.