Earlier this month, I was down at the Gartner Data Center Conference in Las Vegas at the same time as the National Finals Rodeo, which brought a lot of cowboys and cowgirls to town. After attending sessions every day on big data, data center operations, storage optimization, infrastructure management, and private cloud, among other topics, I'll share some of the highlights from the conference.
Once servers and hardware reach the inflection point of commoditization, the higher-value offerings will reside in software and in services for customizing big data implementations.
Startups like SimpliVity, which has raised over $101m and spent over 3.5 years in development, are addressing the limitations of current storage approaches by converging all data sources and systems into a globally federated architecture.
The rules are changing for enabling software-defined networking capabilities: applications need to be architected differently so they can interact directly with the infrastructure.
One of the myths IBM called out is that big data means you are using Hadoop or SAP's HANA. That is not the case; in fact, around 87% of big data implementations are homegrown. Myth #2 is that everyone is doing big data.
There are a lot of emerging companies in the DCIM [Data Center Infrastructure Management] space, but customers are hesitant to commit to full-scale implementations because none of these vendors has a large-scale presence yet. A full-suite vendor has yet to emerge that provides the full range of critical components companies need from DCIM.
The prediction for the DCIM space in 2014 is the emergence of survivors from among the numerous point solutions and the beginning of M&A activity to build out a full-suite solution.
Private cloud vendors are in the midst of an M&A consolidation wave, with larger vendors filling out their portfolios; don't expect small private cloud vendors to stay independent for long, as companies hitting $20m in revenue get taken off the market.
Datacenter implementations are on the rise, with over half the participants in the Gartner survey building a new data center or relocating to a new leased facility in 2014.
Hyperscale datacenter operators address their challenges in creative ways: Google focuses on containing heat with sea water cooling, Yahoo relies on air cooling, eBay tracks the beginning-to-end cost of each user transaction, Facebook designs its own proprietary hardware and shares its findings through the Open Compute Project, and Microsoft builds freestanding, fully assembled server racks in modular containers.
We are still in the early days of big data and datacenter optimization, with the feats to date coming mostly on the hardware side. The next wave of software, refining and integrating these systems, will create huge advances in optimizing data flow, reducing cost, and automating the process from development through operations support.