Tapping the Data Center’s Hidden Resources

For those looking to increase data center resources and meet growing data needs, the cloud is a welcome development. Sure, there are security and reliability issues to deal with, but at least there is a ready supply of server and storage capacity when you need it.

Too bad there isn’t a ready supply of extra resources within your own data center, or is there?

The fact is that the typical data center already has the raw power to meet just about any data load you care to throw at it. It’s just that we’ve had a hard time adequately distributing those resources to meet data requirements.

Virtualization solves part of this problem, and in theory it should do much more to increase resource utilization. But as VKernel recently found in its Virtualization Management Index, compiled from data on 550,000 virtual machines, average density is an abysmal 2 VMs per core, and more than 40 percent of VMs are allocated more CPU power and memory than their applications and data actually require. A key stumbling block is available memory: many organizations scrimp on denser memory configurations in favor of more cores, even though that extra CPU power largely goes to waste.

This disconnect isn’t lost on server vendors. IBM, for one, has quadrupled the memory on its X5 architecture through standard memory DIMMs in an effort to boost VM densities. As Dr. Jai Menon, CTO and VP for technical strategy at IBM’s Systems and Technology Group, pointed out to Bank Systems & Technology recently, the typical VM needs at least 8 GB of memory, which means that a server running even 30 VMs at once needs at least 240 GB.
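
To put those numbers in perspective, here is a quick back-of-the-envelope sketch in Python. The 8 GB-per-VM figure and the 30-VM example come straight from the article; the host configurations and the four-VMs-per-core ceiling are illustrative assumptions, not any vendor’s actual specs.

```python
# Back-of-the-envelope sizing: how memory, not CPU, caps VM density.
# The 8 GB-per-VM figure and the 30-VM example come from the article;
# the host configurations and 4-VMs-per-core ceiling are assumptions.

GB_PER_VM = 8  # typical memory footprint cited for a single VM


def max_vms(cores: int, memory_gb: int, vms_per_core: int = 4) -> int:
    """Return how many VMs a host can support: the lower of the
    CPU-side ceiling and the memory-side ceiling."""
    cpu_limit = cores * vms_per_core
    memory_limit = memory_gb // GB_PER_VM
    return min(cpu_limit, memory_limit)


# A core-heavy, memory-light box: 32 cores but only 128 GB of RAM.
print(max_vms(cores=32, memory_gb=128))   # 16 VMs, memory-bound

# Quadrupling the memory (as IBM did with X5) lifts the cap fourfold.
print(max_vms(cores=32, memory_gb=512))   # 64 VMs, still memory-bound

# The article's arithmetic: 30 VMs at 8 GB apiece needs 240 GB.
print(30 * GB_PER_VM)                     # 240
```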

The push for high utilization must be tempered by the realization that physical resources are not infinite, and that VM density cannot be allowed to jeopardize availability and readiness. What’s needed is the ability to scale up density when workloads are light and dial it back down as they grow. That’s the idea behind Veloxum’s Active Continuous Optimization (ACO) engine in the Veloxum for VMware system. The package collects and analyzes millions of metrics across virtual environments and then automates the reconfiguration of memory, CPU, network and storage settings to optimize performance and efficiency. New York’s Maimonides Medical Center saw a 30 percent utilization jump after deployment.
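
Veloxum doesn’t publish the internals of its ACO engine, but the general pattern described above, sampling per-VM utilization and then moving resource settings toward actual demand, can be sketched in a few lines. The helpers below (VMMetrics, rightsize, optimization_pass) are hypothetical stand-ins for illustration only, not Veloxum or VMware APIs.

```python
# Sketch of a continuous-optimization loop of the kind described above:
# sample per-VM utilization, then move resource limits toward actual
# demand plus a safety margin. VMMetrics, rightsize and optimization_pass
# are hypothetical stand-ins, not Veloxum or VMware APIs.

from dataclasses import dataclass


@dataclass
class VMMetrics:
    name: str
    memory_limit_gb: float   # currently configured limit
    memory_used_gb: float    # observed working set


def rightsize(vm: VMMetrics, headroom: float = 0.25) -> float:
    """New memory limit: observed usage plus a 25 percent margin, so
    density rises when workloads are light but capacity can be handed
    back as demand grows."""
    return round(vm.memory_used_gb * (1 + headroom), 1)


def optimization_pass(fleet: list[VMMetrics]) -> dict[str, float]:
    """One pass over the fleet, producing per-VM target limits."""
    return {vm.name: rightsize(vm) for vm in fleet}


if __name__ == "__main__":
    fleet = [
        VMMetrics("app-01", memory_limit_gb=8.0, memory_used_gb=3.2),
        VMMetrics("db-01", memory_limit_gb=8.0, memory_used_gb=6.9),
    ]
    print(optimization_pass(fleet))   # {'app-01': 4.0, 'db-01': 8.6}
```

A production tool would run this kind of loop continuously across CPU, network and storage settings as well, and would respect reservation floors so that right-sizing never jeopardizes availability.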

The utilization problem is compounded in multi-vendor virtual environments. Akorri’s BalancePoint software boosts utilization rates by keeping track of the changing relationships between VMs and their corresponding server and storage environments. The package supports VMware vSphere and Microsoft Hyper-V, as well as top storage vendors like EMC, NetApp, HP/3PAR, Dell/EqualLogic and IBM.

To be sure, the cloud will likely be the cheapest way to add necessary resources, but improving VM density is the best way to leverage existing data center infrastructure — unless, of course, you enjoy paying for systems and technology that you don’t need.
