One of the key elements of intent-based networking (IBN) is analytics, specifically the kind of analytics that relies on deep visibility into network conditions and then informs the automation engine how best to accommodate user requirements.
But this level of functionality is not easy to obtain, and it will likely take more than a little trial and error to create an IBN deployment that is functional, let alone optimal.
For leading IBN developers like Cisco, analytics is crucial to providing “network assurance,” the ability to establish and verify that networking intentions are indeed being met. As the company’s Prashanth Shenoy told ENP a few weeks ago, assurance can be delivered through a combination of contextual awareness and continuous verification of network operations that shift change management and other functions from a reactive to a proactive footing. Using tools like the new Network Assurance Engine and DNA Center Assurance platform, network managers will be able to build precise models of how their environments are actually working and head off any problems long before they produce a noticeable effect on performance.
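At its core, this kind of assurance amounts to a loop that continuously compares declared intent against observed network state and flags drift before users notice it. The sketch below is a hypothetical, greatly simplified illustration of that idea, not any vendor's actual engine; the `Intent` class, segment names, and latency figures are all invented for the example.

```python
# Hypothetical sketch of a continuous-verification loop: declared intent is
# repeatedly checked against observed conditions, and any drift is flagged
# proactively, before it produces a noticeable effect on performance.

from dataclasses import dataclass

@dataclass(frozen=True)
class Intent:
    """A declared networking intention, e.g. a latency ceiling for a segment."""
    segment: str
    max_latency_ms: float

def verify(intents, observed_latency_ms):
    """Return the intents currently violated by observed conditions."""
    return [
        i for i in intents
        if observed_latency_ms.get(i.segment, float("inf")) > i.max_latency_ms
    ]

intents = [Intent("campus-core", 5.0), Intent("branch-wan", 30.0)]
observed = {"campus-core": 3.2, "branch-wan": 41.7}

violations = verify(intents, observed)
# branch-wan exceeds its 30 ms ceiling, so it is surfaced for remediation
```

In a real assurance platform this check runs continuously against telemetry-fed models of the environment rather than a static dictionary, but the intent-versus-observed comparison is the essential step.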
Many third-party developers are implementing advanced analytics as well, stressing their ability to maintain IBN across multi-cloud, multi-platform environments. Apstra recently released AOS 2.1 with a new Intent-Based Analytics (IBA) module that provides real-time continuous validation of networking conditions. The company has gotten backing from numerous tech firms, such as Dell, Mellanox and Cumulus, allowing managers of mixed environments to detect and prevent service-level violations related to security, performance and other characteristics. The platform also enables customized device configuration and a fabric-wide MAC and IP locator to improve troubleshooting.
Meanwhile, Anuta Networks is out with the ATOM (Assurance, Telemetry and Orchestration for Multi-vendor networks) platform, which offers a modular, scalable and cloud-native approach to network design and provisioning. The system utilizes deep network analytics to ensure compliance with user specifications while also improving service delivery, security profiles, availability and other metrics. The platform has garnered support from more than 40 tech vendors and can scale to more than 1 million devices using streaming telemetry and Google Protocol Buffers to facilitate open connectivity. A real-time analytics engine provides current and historical reports for rapid alterations of network parameters and improved QoS fulfillment.
Analytics is also helping to bridge the gaps among the data silos that are still prevalent in enterprise networks and are already showing up in the cloud and the IoT, says Naim Falandino, chief scientist of Nokia Deepfield. With tools like telnet and the Simple Network Management Protocol (SNMP) no longer able to keep pace with today's performance demands, asynchronous, event-driven updates on the state of end-to-end infrastructure are becoming an operational requirement. Through streaming telemetry, organizations gain a full view of the network, rather than the limited data available from SNMP management information bases. It also provides a means to implement more complex encryption protocols that enhance security without blinding admins to traffic conditions.
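The practical difference between the two models is easy to see in miniature. An SNMP-style poller samples a counter on a timer and can miss whatever happened between samples, while a push model emits an event the moment state changes. The toy sketch below is illustrative only; the device name, counter, and `on_counter_change` hook are invented for the example.

```python
# Toy contrast between polling and streaming telemetry: a push model
# delivers every state change as it happens, so transients between
# polling intervals are not lost.

import queue

events: "queue.Queue[dict]" = queue.Queue()

def on_counter_change(device: str, counter: str, value: int) -> None:
    """Device-side hook: push an update the instant state changes."""
    events.put({"device": device, "counter": counter, "value": value})

# Simulate three rapid state changes occurring between two polling intervals.
on_counter_change("leaf-1", "if_errors", 1)
on_counter_change("leaf-1", "if_errors", 7)
on_counter_change("leaf-1", "if_errors", 2)

streamed = []
while not events.empty():
    streamed.append(events.get())

# A poller sampling after the fact would report only the final value (2);
# the stream preserves the transient spike to 7.
```

Production telemetry stacks stream these updates over the wire in compact encodings such as Protocol Buffers rather than an in-process queue, but the event-driven principle is the same.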
Networking is not the only area that will be aided by analytics, but it will likely derive the greatest benefit. As environments scale, networking becomes increasingly complex, particularly now that mesh-style fabrics are all the rage.
Simply collecting more data will not be enough to implement a truly responsive, agile network, however. It will also require a means to convert data into actionable intelligence on a continual basis. Only then will the enterprise come through on the promise of a real-time, self-governing data environment.
Arthur Cole is a freelance journalist with more than 25 years’ experience covering enterprise IT, telecommunications and other high-tech industries.