Securing the Cloud in the Digital Age

Cloud computing has topped Gartner’s “Top 10 Strategic Technologies for 2010.” Gartner defines a strategic technology as “one with the potential for significant impact on the enterprise in the next three years.” Gartner is somewhat right here. The fundamental problem I have is that the industry has bucketed anything that can be loosely described as cloud, virtual, consolidated, or simply on the network under the same term: cloud. We all interchange public, private and cloud services on a whim, which quite frankly confuses the general public.

To be fair, Gartner does predict that through 2012, “IT organisations will spend more money on private cloud computing investments than on offerings from public cloud providers.” This is great, but I long for the day when this nebulous, opaque term is segmented into public clouds, private clouds and, more importantly, ITaaS. This is not only a trend for 2010; it has been worked on feverishly over the last 24 months, wrapped up in a pretty bow and proclaimed ‘cloud’ for the convenience of propping up the ‘invisible dog leash’ fad-based start-ups that infest the wannabe public cloud offerings (or so they think).

There are two primary reasons (amongst many) why the enterprise will not make major strides towards the public cloud: lack of visibility and multi-tenancy issues, both of which cloak the real concern over critical data security.

Lack of visibility

The public cloud is opaque and lacks the level of true accountability needed to persuade any enterprise to release its prized data assets to a set of unknown entities. Look at the value proposition: no one consuming the service has visibility into the infrastructure, and the providers themselves aren’t watching it either. Are SLAs relevant? And if so, who can enforce or even monitor them?

The public cloud has received so much buzz in large part because it professes to offer significant cost savings over buying, deploying and maintaining an in-house IT infrastructure. While this is massively appealing, it doesn’t answer the fundamental questions of quality of service and network and data security, to name a few. Imagine the concern of opening up your internal systems with a direct pipe into the ‘cloud’. This is the equivalent of leaving your data centre door open while your data centre adjoins a ‘how to hack systems’ symposium.

Multi-tenancy Issues

The second reason why businesses of any real size will not make the leap to the public cloud is multi-tenancy. Wikipedia (the font of all knowledge) defines multi-tenancy as “a principle in software architecture where a single instance of the software runs on a server, serving multiple client organisations (tenants).” In other words, many people using the same IT assets and infrastructure.
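To make the definition concrete, here is a minimal sketch of the multi-tenancy principle: one application instance, one shared data store, with every query scoped by a tenant identifier. The names (`RecordStore`, the tenant IDs) are illustrative assumptions, not any vendor’s actual design.

```python
class RecordStore:
    """A single shared store serving multiple client organisations (tenants)."""

    def __init__(self):
        self._rows = []  # all tenants' data lives in the same structure

    def add(self, tenant_id, record):
        self._rows.append((tenant_id, record))

    def query(self, tenant_id):
        # Isolation here is purely logical: drop the tenant_id filter
        # and one tenant's data leaks to another -- exactly the risk
        # of sharing infrastructure with unknown co-tenants.
        return [rec for tid, rec in self._rows if tid == tenant_id]


store = RecordStore()
store.add("acme", {"patient": "A-1001"})
store.add("rival", {"patient": "R-2002"})
print(store.query("acme"))  # only Acme's records come back
```

The point of the sketch is that the only thing separating tenants is a filter in software, which is why the questions of isolation below matter so much.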

So here’s the rub: EC2, Google, et al. provide true multi-tenancy, but at what cost to compliance and security? What about hot topics such as PCI or forensics? How safe are the tenants on a system? Who is on the same system as you: a hacker, or perhaps your dearest competition? How secure is the isolation between clients? What data have you trusted to this cloud? If you buy the argument, it will be your patient records, payroll, client list, and so on – essentially your most important data assets. I have to think this would be a good test of data asset Darwinism.

Cloud computing needs to cover its assets

Until the public cloud can provide visibility all the way down to the IT infrastructure’s most basic asset – logs – enterprises simply won’t risk it. To be deployed properly, a public cloud needs to understand logs and log management for purposes such as security, business intelligence, IT optimisation, PCI forensics and parsing out billing information – and the list goes on.

Until then, in the grand scheme of risk mitigation, enterprises will fear the cloud and, per my recommendation, segment the public cloud from ITaaS in a private cloud. It’s a shame, because by clubbing all the terms into a single bucket we turn all the lights red, when in fact there’s tremendous value in cloud computing. But public clouds and enterprise computing are worlds apart and should be treated as such, and there are whole rafts of risks to be considered along the way.

In terms of the role log management plays, our architectural premise is to handle the ingestion of logs from unknown sources and to stay flexible as to the kinds of devices, logs or target locations. We even offer a unique feature for automatic identification of log sources, whereby the system matches a stream to a type of log for agile reporting and normalisation.
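As an illustration of the general idea, matching a stream to a log type can be sketched as testing sample lines against known format signatures. The signature patterns and source names below are my own assumptions for the sketch, not LogLogic’s actual implementation.

```python
import re

# Illustrative format signatures: each maps a log-source name to a pattern
# that recognises the start of a line in that format.
SIGNATURES = {
    "apache_access": re.compile(r'^\S+ \S+ \S+ \[[^\]]+\] "[A-Z]+ '),
    "syslog":        re.compile(r'^<\d+>'),           # RFC 3164/5424 priority tag
    "windows_csv":   re.compile(r'^"?\d{2}/\d{2}/\d{4},'),
}


def identify_source(line):
    """Return the name of the first signature matching a sample log line,
    or "unknown" if nothing matches -- the trigger for manual mapping."""
    for source, pattern in SIGNATURES.items():
        if pattern.search(line):
            return source
    return "unknown"


sample = '203.0.113.7 - - [27/Apr/2010:10:02:11 +0100] "GET /index.html HTTP/1.1" 200 512'
print(identify_source(sample))  # -> apache_access
```

Once a stream is tagged with a source type like this, the right parser can be applied downstream for normalisation and reporting.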

We’ve also designed our licensing model to embrace such agile, fluid computing models rather than being tightly licensed to a specific target, device or log source. In this way we’re not only the leader in log management; we’re also enabling many ESSP, MSP and cloud-enabling telco clients to stay flexible in their logging demands, all while tracking data that moves dynamically around their asset pool.

With LogLogic, we leave no log left behind, and there’s no cloud too opaque.

LogLogic is exhibiting at Infosecurity Europe, the No. 1 industry event in Europe, held on 27th – 29th April at Earl’s Court, London. The event provides an unrivalled free education programme, with exhibitors showcasing new and emerging technologies and offering practical and professional expertise. For further information please visit
