The Cloud and Public Internet: Keep the Data Flowing
Proponents of the cloud describe its possibilities as truly magical. Unlimited resources, on-demand self-service provisioning, no more hassles over infrastructure…
But while this may seem magical, it is not.
Despite the tales of unlimited scalability and everything/everywhere functionality, the cloud does depend upon a finite resource: Internet bandwidth. Whether you use the cloud for storage, processing, applications, or networking, all data that traverses the path between provider and client has to go through the same infrastructure as everyone else’s. And as the cloud and web-capable mobile devices grow more popular, that bandwidth will come under increasing pressure.
According to TeleGeography, worldwide demand for bandwidth grew nearly forty percent in 2012, and growth over the previous five years averaged more than fifty percent annually. The rapid wiring of the developing world has helped meet some of the new demand, but developed regions are putting increasingly large volumes of data on the cloud. Fortunately, bandwidth is an eminently renewable resource, and telecommunications companies all over the world are laying new cable, to the tune of about nine Tbps of capacity per year.
Thanks to this, the cost per GB is actually decreasing, despite the explosion in demand. As a commodity, then, bandwidth is pretty hard to beat. There is little doubt that the capacity being added today will prove valuable in the near future, a fact that has drawn the attention of the investment community. So even though the cost per GB is about a third of what it was a few years ago, bandwidth remains a more solid investment than infrastructure in highly profitable industries like oil.
Still, it serves no one to treat bandwidth cavalierly, least of all the user. That is why we are certain to see a growing pool of bandwidth management systems and other solutions designed to lessen the impact of cloud-related data on public networks. Ilesfay Technology Group, for example, is out with the MatchMaking platform, said to boost replication and transfer speeds up to thirtyfold, in part through a series of newly patented algorithms that identify and move only data that has been recently updated. The company has targeted the system at data delivery applications ranging from multicloud and regional replication services to business continuity and disaster recovery.
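The general idea behind moving "only data that has been recently updated" is delta synchronization: compare what the receiver already has against the source, and transfer just the differences. The sketch below is a toy illustration of that principle, not Ilesfay's patented algorithm; the function and file names are hypothetical, and it uses whole-file content hashes (real systems typically diff at the block level, as rsync does).

```python
import hashlib
import shutil
from pathlib import Path


def file_digest(path: Path) -> str:
    """Return a SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def delta_sync(source: Path, target: Path) -> list[str]:
    """Copy only files that are missing or changed on the target.

    Returns the relative paths actually transferred; every file NOT in
    that list is bandwidth saved compared to a full re-transfer.
    """
    transferred = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        rel = src_file.relative_to(source)
        dst_file = target / rel
        # Skip the transfer entirely when the target copy is identical.
        if dst_file.exists() and file_digest(dst_file) == file_digest(src_file):
            continue
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)
        transferred.append(str(rel))
    return sorted(transferred)
```

On the first run everything is transferred; on every run after that, only files whose contents changed cross the wire, which is where the bandwidth savings for replication and disaster-recovery workloads come from.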
These and other types of bandwidth reduction services may become increasingly important moving forward. Telecom companies are not likely to keep the bandwidth floodgates open should data volumes grow too large. We’ve already seen wireless firms dial back their unlimited plans, so it isn’t hard to imagine wired bandwidth coming under the same restrictions as Big Data and media-rich services gain in popularity. An unlimited service is your best bet now, but it’s never too early to start conserving. The free ride may not be around forever.
Considering the IT industry’s penchant for maxing out available resources and then calling for more, top telecommunications providers have done a good job of keeping ahead of the curve. But while bandwidth issues have certainly not hampered either cloud or mobile services so far, loads could conceivably grow so large, so quickly, that scarcity, cost increases, and even rationing become a reality.
It would be much wiser to implement effective bandwidth management techniques now, rather than do so in crisis mode later.