It’s a pretty safe bet that the majority of enterprises these days have either already moved to the cloud or are planning to do so in the near future. Of course, the increasing variety of cloud configurations almost guarantees that no two clouds will be the same, which leads to the vexing question: How do I know I’m getting the most out of my investment?
If recent studies are to be believed, most enterprises are intent on pursuing a hybrid cloud model, that is, a mixture of public and private resources that data and applications can traverse with relative ease. Ipanema Technologies’ recent survey of 150 IT leaders had 66 percent looking to build a hybrid platform within the next four years, while only 17 percent favor an all-public model. Even that minority, however, is enough to more than triple spending on public cloud services by 2015, according to IDC. The company predicts a $72.9 billion cloud market by then, up from $21.5 billion today.
Of course, those plans could be tempered if the cloud industry fails to settle some thorny issues regarding security and availability. A potentially serious fly in the cloud ointment surfaced earlier this week when the FBI carted off a number of servers from provider DigitalOne’s Reston facility as part of a criminal investigation. The raid apparently took down service for a number of legitimate enterprises that happened to be sharing resources with the bad guys. Unless and until law enforcement and the cloud/virtualization industry can come to terms regarding seized property, cloud users across the globe could find themselves without service on a moment’s notice.
This brings us back to our original question of how best to optimize the cloud. A major part of that equation is determining which types of data and applications are suitable for public, private and hybrid infrastructure. When it comes to public services, one of the biggest drawbacks will be network latency, according to F5’s Don MacVittie. Optimization and other technologies will help, but at the moment the WAN is generally slower than the LAN. That means the cloud is probably best suited for backup/archiving and other low-priority storage applications. Time-sensitive and business-critical data is best kept closer to home.
Fortunately, these lessons are not being lost on the industry. Computing magazine detected a 10 percent drop in IT professionals willing to commit mission-critical apps to the cloud in its latest survey. Those looking to move basic tasks saw a 2 percent gain.
Despite the buzz over the past two years or so, the cloud is still a new technology, which means most enterprises are still testing the waters. It will probably take some time to figure out what works and what doesn’t on the cloud, and even then the correct mixture will vary from enterprise to enterprise.
If there is any hard and fast rule for the cloud, then, it’s to set specific goals before making a commitment, but be ready to adjust those goals if reality does not conform to theory.