It seems that the love-hate relationship that most enterprises have built up with server virtualization is set to continue into the new year.
On the one hand, we love the way it allows us to consolidate hardware and incorporate previously unimaginable levels of flexibility in our data environments. On the other, we can’t seem to overcome the trust issue when it comes to mission-critical data and applications. There’s also the small matter of managing these complex environments and ensuring that related infrastructure, namely storage and networking, can cope with the newly virtualized server resources.
That’s part of the reason why the market data on virtualization is so confusing. By now, we’re used to numbers like the latest from IDC, which estimates that fully 70 percent of server workloads will reside on virtual or logical machines by 2014. And yet, only about a quarter of the nearly 10 million servers expected to ship in the next year will be virtualized, typically hosting about 8.5 VMs each.
That last number could be driven even higher by the proliferation of multicore technology. According to VKernel, the average VMware user is already running 12.5 VMs per server, with smaller organizations (10 servers or fewer) pushing it to 20. As the prevalence of dual-socket, quad-core servers increases, putting at least two VMs on each core has become a no-brainer.
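A quick back-of-the-envelope check shows why those density numbers keep climbing. Using only the figures quoted above (dual-socket, quad-core servers at two VMs per core — a rule of thumb, not measured data), the implied floor is well above the 12.5-VM average:

```python
# Rough VM-capacity math for the server configuration described above.
# All inputs come from the paragraph's rule of thumb, not from benchmarks.
sockets = 2            # dual-socket server
cores_per_socket = 4   # quad-core CPUs
vms_per_core = 2       # the "at least two VMs per core" floor

total_cores = sockets * cores_per_socket
vm_floor = total_cores * vms_per_core
print(f"{total_cores} cores -> at least {vm_floor} VMs per server")
# -> 8 cores -> at least 16 VMs per server
```

In other words, even the conservative two-per-core guideline yields 16 VMs on a commodity two-socket box, comfortably past today's 12.5-VM average.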
This is part of the reason why some enterprises are reporting longer refresh cycles for their server hardware. According to attendees at the recent Gartner Data Center Conference in Las Vegas, many organizations are stretching server life cycles to five years and beyond. Clearly, the recession has a lot to do with these decisions, but they are certainly made easier by the fact that virtual servers can support many applications just as effectively as physical ones.
And yet, there are signs that this state of affairs cannot continue indefinitely. One of the lesser known facts of server virtualization is that few organizations can virtualize more than 40 percent of their server infrastructure. This “virtual stall,” as described by CA’s Andi Mann, is the result of limitations in surrounding infrastructure, licensing constraints, staffing issues, security and a host of other factors. Unless and until the IT industry works its way through these problems, virtualization will continue to provide only a partial solution to steadily increasing data loads.
Clearly, then, virtualization continues to offer a mixed bag of solutions and problems for the IT community. In that way, I guess, it’s just like every other solution that’s come down the enterprise pike.