Networking in a Changing Data Environment

Just as networks take their first steps into virtual infrastructure, the data environment and its requirements are changing again.

By Arthur Cole | Posted Sep 5, 2014

Networking is by far the most complicated component in the modern data center. Servers and storage are a challenge, to be sure, but maintaining adequate bandwidth, throughput and connectivity, not to mention security, is what really gives enterprise executives fits.

This is part of the reason why networking has struggled to keep up with the virtualization and solid-state revolutions that have hit IT. Developments like SDN and NFV can only be implemented once the upgrade paths in the server and storage farms are fairly established.

Unfortunately, everything changes, including change itself. So even as most enterprise networks take their first tentative steps into virtual infrastructure, it seems things are about to change all over again.

Server virtualization, for example, is heading for a makeover with new container-based solutions built on Docker and Linux. The idea is to enable a single operating system to simultaneously handle multiple applications, as opposed to creating multiple virtual machines, each with its own OS. Supporters say this provides a more efficient and effective means of distributing workloads, but it also affects network infrastructure, as Cisco is finding out. The company is said to be working with Red Hat on ways to alleviate some of the problems that containers present, such as the need to provision VLAN resources, open ports and establish security. Most likely this will be done through some form of "application-optimized container" that embeds these functions at the application layer to allow self-provisioning in the new virtual environment.
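To make the contrast concrete, here is a minimal sketch using the Docker SDK for Python (the tooling choice and image names are assumptions for illustration, not anything named in the article). It shows two applications sharing one host OS, each declaring its own network exposure at launch time rather than requiring a separate virtual machine and its own network provisioning.

```python
# Illustrative sketch (assumption: the "docker" Python package is installed and a
# local Docker daemon is running). Two app containers share the host's kernel;
# each declares its own port mapping when it starts.
import docker

client = docker.from_env()

web = client.containers.run(
    "nginx:alpine",              # hypothetical example image
    name="web-frontend",
    ports={"80/tcp": 8080},      # host port 8080 -> container port 80
    detach=True,
)
api = client.containers.run(
    "nginx:alpine",
    name="api-backend",
    ports={"80/tcp": 8081},
    detach=True,
)

for c in (web, api):
    c.reload()                   # refresh attributes from the daemon
    print(c.name, c.attrs["NetworkSettings"]["Ports"])
```

The per-container port and network declarations above are exactly the kind of provisioning an "application-optimized container" would aim to push down to the application layer.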

Others are looking to emerging hyperscale infrastructure for more efficient ways to configure networks on a broad scale. One possibility is the "core and pod" architecture that top web-facing companies like Google and Facebook are using, says Gigaom's Jonathan Vanian. In this approach, IT executives determine the optimal hardware configuration for key applications, which can then be pooled into multiple pods tied to a core network for distribution to users. This is said to make data management in scaled environments easier, because it becomes a matter of linking appropriate pods rather than individual server and storage components.
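A rough way to picture the idea is as an inventory model in which the pod, not the individual server or array, is the unit of scale. The sketch below is purely illustrative; the class and field names are hypothetical and the article does not describe any concrete format.

```python
# Toy model of a "core and pod" layout. All names here are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Pod:
    """A self-contained block of compute and storage tuned for one workload type."""
    name: str
    workload: str          # e.g. "web", "analytics"
    servers: int
    storage_tb: int


@dataclass
class CoreNetwork:
    """The core fabric that pods attach to; scaling means linking more pods."""
    pods: list = field(default_factory=list)

    def attach(self, pod: Pod) -> None:
        self.pods.append(pod)

    def capacity(self) -> dict:
        return {
            "servers": sum(p.servers for p in self.pods),
            "storage_tb": sum(p.storage_tb for p in self.pods),
        }


core = CoreNetwork()
core.attach(Pod("pod-web-01", workload="web", servers=48, storage_tb=200))
core.attach(Pod("pod-analytics-01", workload="analytics", servers=32, storage_tb=800))
print(core.capacity())   # capacity is managed at the pod level, not per device
```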

Networks are also under pressure to be "hyperconverged" to allow for easier integration between cloud architectures and software-defined environments. Supporters of the concept claim that by converging resources inside modular, commodity systems, they can be integrated to a higher degree, providing increased levels of control and flexibility. For example, SimpliVity's OmniStack Data Virtualization Platform allows multiple core data functions to be stacked within a Cisco UCS C240 rack-mounted server, so not only do you get the hypervisor, compute, storage and networking components bundled into a single entity, but backup, replication, cloud gateway services and caching as well. And this can all be pre-engineered at the factory so it is ready for deployment right out of the box.

Part of the reason networks, and data infrastructure in general, are changing so dramatically is that the way we use data is changing as well. The U.S. military has probably the most sophisticated network environment in the world, and it is rapidly upgrading its capabilities to enable broad collaboration between users and teams across the globe. The Department of Defense has instituted the Joint Information Environment, which calls for a centralized, integrated network that provides advanced sharing of services and applications under a common operating framework without compromising security. The program is a work in progress, with many of the ultimate details still undecided, but as with many military endeavors, failure is not an option.

As the link between resources, data sets and users themselves, the network plays a crucial role in the emerging data ecosystem. But its complexity often makes it the laggard when it comes to supporting new services and applications.

Software-defined architectures will help with this problem, but changes to the underpinnings of virtual and even physical infrastructure will continue to throw curveballs at IT in the coming decades.

Preparedness is key, but so is flexibility. And in the end, some advances will only come about through significant time, effort and expense.

