It’s more than an academic question now that the age of the cloud is upon us. For one thing, it would seem to be a moot point for most data users, and data managers for that matter, considering resources, infrastructure and architectures will soon be available in multiple forms at the touch of a button (and the payment of fees). And secondly, the data center industry is quickly gravitating toward open platforms, open fabrics and dynamic data infrastructures that aim to suit all manner of requirements at any given time.
So the question remains: With unlimited scalability, flexibility and operability at our disposal, do we really need to worry about how data environments are designed and built anymore?
To some data experts, like Ofcom’s Adrian Grigoriu, the more appropriate question is: did any of this ever matter in the first place? Enterprise architecture (EA) has always been more of an art form than a science anyway. With no real EA framework in place, most data architectures remain a loose collection of components, so the architect is free to define his own goals at the start and then determine for himself whether they have been achieved. Tools like TOGAF provide a process template, but there is not, and never has been, a formalized EA framework in which discrete parts are integrated into a cohesive whole.
Too often, says MIT’s Jeanne Ross, what usually passes for architecture is simply a few key systems or managers who maintain responsibility for the most essential business processes. As data centers become tasked with meeting an increasingly diverse set of requirements, however, this approach starts to break down. The end result is that without a proper grasp of architecture, organizations will find their data environments to be less responsive and more expansive than their properly designed peers.
A key concept in future architectures will be agility, according to Todd Drake, VP of technology at digital marketing firm Organic. As enterprises are hit with everything from virtualization to mobile applications, the ability to adapt and respond to changing environments will be crucial. The problem is that there is no set way to measure architectural agility, with different organizations factoring in various mixtures of speed, flexibility, adaptability and other hard-to-quantify parameters. His own approach is built largely on costs, where agility becomes a ratio of the complexity of a given change over the effort required to implement it.
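A cost-based agility ratio of that kind can be sketched in a few lines. This is only an illustration of the general idea, not Drake's actual model; the scoring inputs and function name here are hypothetical.

```python
# Hypothetical sketch of a ratio-style agility metric: a higher score
# means the architecture absorbs complex changes with little effort.
# The 1-10 scoring scales are assumptions for illustration only.

def agility_ratio(change_complexity: float, implementation_effort: float) -> float:
    """Return change complexity divided by the effort needed to implement it."""
    if implementation_effort <= 0:
        raise ValueError("implementation effort must be positive")
    return change_complexity / implementation_effort

# A demanding change (complexity 8) that the architecture handles
# cheaply (effort 2) suggests high agility:
print(agility_ratio(8, 2))  # 4.0
```

By this logic, two organizations facing the same change can compare architectures simply by comparing the effort each must expend to absorb it.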
The term “holistic” is also getting more buzz around the architectural water cooler. As the University of North Texas’ Dr. Leon Kappelman describes it, the tendency to compartmentalize data center resources yields ever-diminishing returns — like trying to understand a living being by analyzing its component chemicals. Much better to embrace a “counter-reductionist approach” by viewing the interactions and interconnections of various components, rather than the raw capabilities of the components themselves. In that way, you get a fully optimized data center, as opposed to merely optimized networks, storage systems or operating platforms.
So in the end, what are we left with? Should we concern ourselves with architecture or not? It would seem that in the old days of static data silos and one-to-one user-desktop-server relationships, the answer would have been no, at least not to any significant degree.
Going forward, however, as the dynamic infrastructure of the cloud starts to take hold and former rules governing data, systems, resources and infrastructure break down, enterprise architecture will start to matter a great deal.
Arthur Cole covers networking and the data center for IT Business Edge. He has served as editor of numerous publications covering everything from audio/video production and distribution, multimedia and the Internet to video gaming.