The Brave New Application Delivery Environment
Improved application delivery is the ultimate goal of technologies like SDN and virtualization and should be the ultimate goal of the enterprise data center.
With all the talk lately about software defined networking (SDN) and scale up/out infrastructure, it’s easy to forget that the purpose of networking is to make it easier for people to do their jobs.
In this regard, it isn’t so much the network or the networking technologies that are important, but the way in which data and applications can be delivered and leveraged to produce higher productivity and thus greater profitability.
This is why F5 Networks’ recent State of Application Delivery report makes for an interesting read: rather than focusing on deployments of SDN or even its own application-centric solutions, it looks instead at the results enterprise users are getting from their current data and infrastructure configurations. As can be expected, these are mixed.
For one thing, application environments themselves are growing, with 20 percent of organizations reporting between 200 and 500 applications under management. Meanwhile, interest is high across the board for leveraging cloud and mobile infrastructure for the next-generation application environment. At the same time, however, companies are starting to look to key technologies like network programmability to support a richer application lifecycle, from development to deployment to utilization to decommissioning. In this regard, it isn’t so much the adoption of SDN or other advanced technologies that is important, but the automation and orchestration of network services that they enable.
This is why many vendors are starting to de-emphasize the technologies they have developed and are talking more about the capabilities they can provide. Citrix, for example, recently acquired software defined storage vendor Sanbolic in order to provide the virtual storage component of its WorkspacePod appliance and XenDesktop and XenApp platforms. In this way, the company hopes to lower the cost of key applications like VDI and application delivery while at the same time providing a more flexible, functional means for knowledge workers to engage and manipulate their data.
Systems developers are also stressing the service-level capabilities of their latest platforms in an effort to show they can not only do what they say, but do it well. Radware’s new Alteon NG 5208 application delivery controller, for example, is said to provide “full SLA assurance” by virtue of the 24 dedicated virtual ADC instances it can provide for each application or service. To enable top performance, even in multitenant situations, the platform offers full isolation between instances along with advanced acceleration. And by individualizing ISP links, internal applications and load balancing functions via a dedicated virtual instance, users can more easily identify and correct performance-related issues.
All of this is happening at a time when the very notion of what an application should be is coming under fire. As acceleration developer Nginx’s Owen Garrett points out, organizations are starting to see the limitations of the traditional, monolithic approach to development and deployment and are opening up to the idea of microservices. These are smaller pieces of code, each developed by a single team, that can then be added to the overall services pool. Users compose various microservices based on their needs and, voilà, an application is born. This cuts down on development cost and complexity and enhances the ability to repurpose code that would otherwise be tossed when an application is retired. But it also requires an application delivery environment that can support services on a more granular level. Today’s open-source collaboration platforms are already enabling this level of functionality through advanced API management, application-centric load balancing and other delivery functions.
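The composition model Garrett describes can be sketched in a few lines of Python. This is purely illustrative, not Nginx’s API or any vendor’s implementation: every name below is hypothetical, and the “services” are stand-ins for what would, in practice, be independently deployed processes reached over the network.

```python
# A minimal sketch of the microservices idea described above: small,
# single-purpose services registered in a shared pool, then composed
# on demand into an "application." All names here are illustrative.

from typing import Callable, Dict

# The shared services pool: each entry is one small, independently
# developed service (here modeled as a function on a request dict).
SERVICE_POOL: Dict[str, Callable[[dict], dict]] = {}

def service(name: str):
    """Decorator that registers a function in the services pool."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        SERVICE_POOL[name] = fn
        return fn
    return register

@service("auth")
def authenticate(request: dict) -> dict:
    # Toy stand-in for an authentication service.
    request["user"] = request.get("token", "anonymous")
    return request

@service("catalog")
def lookup_item(request: dict) -> dict:
    # Toy stand-in for a catalog-lookup service.
    request["item"] = {"sku": request.get("sku"), "price": 9.99}
    return request

def compose(*names: str) -> Callable[[dict], dict]:
    """Assemble an 'application' as a pipeline of pooled services."""
    def app(request: dict) -> dict:
        for name in names:
            request = SERVICE_POOL[name](request)
        return request
    return app

# An application is born by composing services from the pool; when a
# service is retired, the others remain reusable.
storefront = compose("auth", "catalog")
result = storefront({"token": "alice", "sku": "A-100"})
```

The point of the sketch is the granularity: each service is small enough to be owned by one team and reused across applications, which is exactly what pushes delivery concerns like API management and load balancing down to the per-service level.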
“It’s the applications, stupid,” is the latest enterprise iteration of the popular meme, but it isn’t entirely clear that many organizations truly comprehend the significance of the change it represents. As infrastructure becomes more virtualized, the ability to tailor data environments becomes less a function of what hardware will allow and more about what the application requires.
It’s a bold new way of looking at the enterprise data environment, but it also means we need to rethink many of the foundational principles that have guided infrastructure, application and data management for much of the information age.
Arthur Cole covers networking and the data center for IT Business Edge. He has served as editor of numerous publications covering everything from audio/video production and distribution, multimedia and the Internet to video gaming.