That the enterprise edge will have to up its game for Big Data and the Internet of Things is practically common knowledge by now. What is still unclear is how this new edge should function and what sort of infrastructure will be needed to support it.
In a way, the enterprise has already begun the transition to the new, improved edge merely by embracing software-defined networking. According to BizTech Magazine's Michael Sheehan, the network abstraction that is part and parcel of both SDN and NFV provides for greater centralized control. This, in turn, allows governance and policy to be pushed out across the network more easily, regardless of where the router or switch is located, including at the edge. And as data loads increase, the enterprise will have no choice but to move processing, storage, analytics and other capabilities to the edge; otherwise, it risks overloading core infrastructure.
A recent IDC study estimates that 90 percent of enterprises will have swapped out current edge devices for new intelligent ones by the end of 2017, and that 40 percent of IoT data will remain on the edge by 2018. Not only does this preserve the sanctity of core data infrastructure, it enables faster service to users and, increasingly, to the devices that drive rising levels of Machine-to-Machine (M2M) traffic. For IT, this will represent a massive shift from mere hardware manager to integral player in the products and services development process, says ThinkStrategies' Jeffrey Kaplan.
It would be nice if simply switching out one edge router for another were all it took to spruce up the edge. There is more to it than that, though – at least if you care about things like security and reliability. One key issue is power. As Emerson Network Power's Ling Chee Hoe told Asia Outlook recently, increased processing at the edge increases its importance in the overall data chain, fueling the need for more resilient and easily deployable power supplies. Failure in this aspect of edge infrastructure could cripple an organization's competitive advantage in an age in which speed is crucial.
Programming this array of edge devices will also be a challenge, particularly as these devices will likely have to communicate with a broad array of sensors and other data points across distributed and increasingly dynamic data infrastructure. Arrow Electronics is hoping to address this problem with its new Arrow Intelligent Services framework, which enables device manufacturers to communicate via open APIs rather than proprietary interfaces. Arrow is already collaborating on a new series of intelligent gateways that will feature common setup, configuration, and management processes while enabling key data ingestion, filtering, and other services needed for advanced data analytics.
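To make the ingestion-and-filtering idea concrete, here is a minimal sketch in Python of what an edge gateway might do: accept raw sensor readings, keep routine values local, and forward only anomalies to core infrastructure. The threshold, the reading schema, and the function name are all hypothetical illustrations, not part of Arrow's framework or any real API.

```python
# Hypothetical edge-gateway filter: forward only anomalous readings to
# the core, keeping routine data at the edge to reduce central load.

ANOMALY_THRESHOLD = 75.0  # illustrative temperature limit (hypothetical)

def ingest_and_filter(readings):
    """Split readings into those forwarded to core analytics and a
    count of those retained locally at the edge."""
    forwarded = [r for r in readings if r["temp_c"] > ANOMALY_THRESHOLD]
    retained = len(readings) - len(forwarded)
    return forwarded, retained

raw = [
    {"sensor": "s1", "temp_c": 21.5},
    {"sensor": "s2", "temp_c": 80.2},  # anomaly: gets forwarded
    {"sensor": "s3", "temp_c": 22.0},
]

to_core, kept_at_edge = ingest_and_filter(raw)
print(len(to_core), kept_at_edge)  # 1 reading forwarded, 2 kept at the edge
```

The point of the sketch is simply that the filtering decision happens at the gateway, so the core only ever sees the one reading it needs to act on.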
If the intelligent edge is to become a common facet of the modern enterprise, network managers will have to get used to a more hands-off approach to management and operations. This shouldn’t be much of a stretch when it comes to ephemeral Big Data functions like sending coupons to the cell phones of shoppers who happen to be in a given store at a given time. It could, however, prove problematic for larger workloads that must be brought into pooled infrastructure for more advanced processing. Automation cannot accommodate every contingency, after all, so at some point a human operator has to make the call as to when and how to get involved.
It may be tempting to simply push intelligence to the edge first and then leave these kinds of fine-tuning details for later, but the IoT is arriving much more quickly than many people realize. Nimble startups like Uber are already showing how it can be used to disrupt long-standing industries.
A more intelligent edge will be key to future competitiveness, but only if it has the support structure to ensure its viability in fast-moving data environments.