SC14 is here in New Orleans – the Big Easy – this week, and it is time to think about the best ways to leverage the network in support of HPC, cloud, and the emerging area of big data analytics. How can we make Big Data the Big Easy Data? OK, bad pun.
Three huge trends are happening that affect Big Data and Supercomputing.
- Massive connectivity and data generated via the Internet of Things (IoT)
- Cloud connected everything
- Big Data with Hadoop
IoT: Millions of laptops, tablets, phablets and phones, RFID sensors, cameras and wearables stay connected all the time and generate massive amounts of data. But only IF these literally billions of data points can be stored, indexed and analyzed can actionable information be gleaned. Forbes estimated that “Through 2015, 85% of Fortune 500 companies will fail to exploit big data for competitive advantage” [Forbes.com: Going Beyond Big Data to Knowledge, 3/11/2014]. And this affects a wide variety of industries, including energy exploration, education, government, biotech, climate research, and even targeted advertising. Organizations in all of these fields are actively harvesting the information that can be derived from the data to stay competitive and accomplish research breakthroughs.
Cloud: The cloud offers many possibilities to garner the benefits of Big Data. Cloud-based applications, the centralization of data storage and the ability to perform analytics operations in the cloud increase the accessibility of Big Data Analytics. It is now even possible to “rent” Hadoop clusters from cloud providers such as Amazon Web Services and others.
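As a concrete illustration of “renting” a Hadoop cluster, here is a hypothetical sketch using Amazon EMR (AWS’s managed Hadoop service) via the AWS CLI. The cluster name, release label, instance type and count below are placeholders chosen for illustration, not recommendations from this post, and the command assumes the AWS CLI is installed and configured with default EMR roles:

```shell
# Hypothetical sketch: provision a small managed Hadoop cluster with
# Amazon EMR. All names and sizes are illustrative placeholders.
aws emr create-cluster \
    --name "big-data-sandbox" \
    --release-label emr-5.36.0 \
    --applications Name=Hadoop \
    --use-default-roles \
    --instance-type m5.xlarge \
    --instance-count 3
```

A command like this returns a cluster ID; compute is then billed only while the cluster runs, which is exactly the pay-as-you-go accessibility the paragraph above describes.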
Hadoop: An open-source framework that scales big data analytics to support research, market analysis, trend prediction from finance to weather, and many other use cases, using commodity off-the-shelf hardware and open-source software. The notion is that all of this collected, stored and indexed data is useful only when it is transformed into information, and that is the task of Big Data Analytics. With the decreasing costs of storage and compute, coupled with a plethora of open-source Big Data analytics tools, Big Data is no longer just for the Fortune 100 or even 500; it is now readily accessible to mid-tier enterprises.
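To make the “commodity hardware plus open-source software” model concrete, here is a minimal sketch (not from the original post) of the classic word-count job written in the Hadoop Streaming style, where any executable that reads lines and emits key/value pairs can serve as a mapper or reducer:

```python
# Illustrative sketch: word count as a Hadoop Streaming-style
# mapper/reducer pair. Hadoop distributes the map work across commodity
# nodes, sorts the intermediate pairs by key (the "shuffle"), then runs
# the reducer over each key's group.
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Emit a (word, 1) pair for every word in the input lines."""
    for line in lines:
        for word in line.strip().split():
            yield word.lower(), 1

def reducer(pairs):
    """Sum the counts for each word. Sorting here stands in for the
    shuffle phase, which guarantees all pairs for a key arrive together."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data big easy", "easy data"]
    for word, count in reducer(mapper(sample)):
        print(f"{word}\t{count}")
```

In a real deployment the mapper and reducer would run as separate processes over HDFS blocks on many cheap nodes; the point of the sketch is that the analytics logic itself is a few lines of ordinary code.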
Storing and processing this data requires a high-performance interconnect to move it fast and reliably between data stores, database nodes, application processing nodes, and the systems that deliver the client-facing “apps.” The beauty is that one network, based on Ethernet, can meet all of these needs.
Extreme Networks has an exciting development – we now partner with Lenovo to promote a complete line-up of Big Data and HPC solutions called Intelligent Cluster. Intelligent Cluster is Lenovo’s overarching brand for certified Big Data and HPC compute, storage and interconnect solutions. These solutions reduce the complexity of system deployments with pre-integrated, fully supported offerings that match best-in-industry components with optimized solution design.
So what do we bring to the table for Lenovo?
Extreme Networks delivers Ethernet switches with intelligence, programmability, scalability, high resilience and low latency. Certified and tested by Lenovo, they are some of the fastest, lowest-latency and most scalable Ethernet switches on the market, all part of a combination of hardware and software we call the Software Defined Architecture.
Our most recent additions, the Summit X460-G2 and Summit X670-G2, offer leading Ethernet port densities for 1GbE and 10GbE respectively, and astoundingly low latency – under 600 ns for the Summit X670-G2. All of these products are purpose-built to drive Big Data, HPC, the Cloud and the connections of billions of devices as the IoT world expands.