Web Security: Rein In Dangerous Web Apps

With tools like goolag out there looking for vulnerable Web apps, it's more important than ever to consider centralized deployment of the popular Web applications that are potential security liabilities.

By Charlie Schluting | Posted Mar 5, 2008

Part Two: Web security always comes down to a single common denominator: application security. Because of that, this follow-up to last week's article, "Learn Best Practices for Web Server Security," will focus wholly on the Achilles heel of any Web server: the applications it hosts.

Without the applications that run on a Web server, be they horribly insecure or not, there would not be much point to hosting Web sites. The hosting provider's hierarchy of needs goes something like this:

  1. Obtain customers, so we can eat.
  2. Ensure job security and secure the servers against total compromise when a bad application gives too much access.
  3. Focus on making customers happy through improved service, automation of tasks, and so forth.
  4. Consider our self-esteem.

Vulnerable Web applications can wreak havoc on a hosting provider's sanity and its image. Botnet clients are not only for Windows anymore: They can also be small Perl scripts run via exploitable Web sites. That's right, all the evilness associated with botnet zombies can also apply to Linux-based Web servers. DDoS attacks and spam floods increasingly originate from Web servers. An Internet site's reputation will quickly become tarnished when such evilness emanates from its IP address space. Remote sites will block e-mail, upstream service providers may degrade service levels, and generally, the Internet will prove about as hostile to the Web host as it is to dynamic IP addresses.

To achieve a measure of self-esteem, Web hosts must somehow figure out a way to save face. The proactive answer is to prevent the initial attack from happening in the first place. Many providers deploy vast (read: expensive) technologies designed to detect misbehaving servers, and the misbehavior can eventually be traced to a specific Web site. Those tools often reveal the culprit to be a vulnerable PHP application that someone downloaded from the Internet. Not all PHP applications are inherently insecure, but even the most popular ones, with their diverse casts of contributors, frequently have security issues.

What, then, can we do about all of the popular applications that nobody updates? In a perfect world, each user would subscribe to the security announcements mailing list for every application he uses. Upon receiving a vulnerability announcement, our dutiful user would promptly update his installation.

This never happens.

Dead Cows Bearing Dorks

Fresh out of the Cult of the Dead Cow is goolag, a Web site scanner that uses Google to find vulnerable applications. Give it a try against your own sites; that is, after all, what these tools are for in the hands of a good guy. Conversely, it's very clear that with tools such as goolag, malicious people are going to be looking for well-known vulnerabilities. There are just too many exploitable Web sites in the world for crackers to bother manually attacking an application they don't already have an exploit for.

An understanding develops right around the time a Web service provider realizes that it's getting compromised way too frequently: Most, if not all, of these security incidents gain their foothold through a very small handful of applications! Wikis, WordPress, Mambo, Drupal, Gallery and others are all extremely common applications that have had their share of vulnerabilities. Sure, somebody's custom PHP app will get hacked eventually, but the vast majority of exploits seen in the wild result from automated scans of well-known vulnerabilities.

Users demand a very limited set of applications, so managing that small set on their behalf makes sense. It is not difficult to provide a scripted way for a user to install phpBB, for example. Installing most Web applications simply consists of copying files into place and updating a configuration file with details such as the site's URL.
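To make that concrete, here is a minimal sketch of what such an install script might look like. Everything in it is an assumption for illustration, not any real provider's tooling: a store of pristine application trees under /opt/webapps, a config.php.dist template with @PLACEHOLDER@ tokens, and the install_app() helper itself.

    #!/usr/bin/env python3
    """Minimal sketch of a provider-side 'one-click' install routine."""
    import shutil
    from pathlib import Path

    MASTER_COPIES = Path("/opt/webapps")  # hypothetical store of pristine app trees

    def install_app(app: str, docroot: Path, site_url: str,
                    db_user: str, db_password: str) -> Path:
        source = MASTER_COPIES / app
        target = docroot / app

        # Step 1: copy the application's files into the customer's docroot.
        shutil.copytree(source, target)

        # Step 2: fill in the config template with site-specific values.
        template = (target / "config.php.dist").read_text()
        config = (template.replace("@SITE_URL@", site_url)
                          .replace("@DB_USER@", db_user)
                          .replace("@DB_PASSWORD@", db_password))
        (target / "config.php").write_text(config)
        return target

    # Example: a one-click phpBB install for one customer.
    # install_app("phpbb", Path("/home/alice/public_html"),
    #             "http://example.com/forum", "alice_db", "s3cret")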

Dreamhost, for example, offers one-click installs. In reality it's more of an enter-your-information, maybe-create-a-database, then-click-once install, but it's extremely simple for the average user to comprehend. Follow-up configuration is sometimes necessary, but it is generally done via the Web page itself, so users can complete the task when they visit their new site for the first time.

By making it easy for users to install the most popular Web applications, Dreamhost and other providers that follow a managed-application model come closer to having a definitive list of which applications are installed at which sites.

There is also less chance that the install will be left in an insecure state if providers manage applications for their users. In the worst case this means adding a quick 'chmod 755' at the end of a configuration routine, but it's well worth verifying that every supported application is properly secured. Providers only need to do it once, yet thousands of customers will reap the benefits.
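A permission sweep like the following sketch could serve as that final verification step. It assumes the 755/644 convention implied above (directories traversable, files readable but writable only by the owner) and a hypothetical config.php holding database credentials.

    #!/usr/bin/env python3
    """Sketch of a post-install permission sweep."""
    from pathlib import Path

    def lock_down(install_dir: Path) -> None:
        install_dir.chmod(0o755)
        for path in install_dir.rglob("*"):
            if path.is_dir():
                path.chmod(0o755)  # rwxr-xr-x: the Web server can traverse
            else:
                path.chmod(0o644)  # rw-r--r--: nothing group- or world-writable

        # Config files usually hold database credentials; tighten them,
        # assuming the Web server can still read them (it runs as the
        # owner under suexec, say, or shares the file's group).
        config = install_dir / "config.php"  # hypothetical filename
        if config.exists():
            config.chmod(0o640)

    # Example: run as the last step of the configuration routine.
    # lock_down(Path("/home/alice/public_html/phpbb"))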

Furthermore, and more to the point, a managed approach allows users to automatically update their applications. Updating user-installable Web applications is not as difficult as it sounds. More often than not, a quick security update simply involves replacing a single file within an installation. Major version upgrades generally take more work, however. It's certainly a worthwhile endeavor: imagine knowing that thousands of phpBB installs (sorry to pick on the easy target again) are 100 percent up to date. By making updates as easy to perform as installations, users become far more likely to update their applications. With an unmanaged application, we can safely assume, fewer users will update regularly or keep up with security announcements.
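For the common single-file case, the update step can be as small as this sketch. The apply_hotfix() helper, the .bak backup convention, and the example paths are illustrative assumptions, not taken from any real phpBB advisory.

    #!/usr/bin/env python3
    """Sketch of a single-file security update."""
    import shutil
    from pathlib import Path

    def apply_hotfix(install_dir: Path, relative_path: str,
                     patched_copy: Path) -> None:
        target = install_dir / relative_path

        # Keep the old file around so a bad patch can be rolled back.
        shutil.copy2(target, target.with_name(target.name + ".bak"))

        # Drop in the patched file and re-apply the standard file mode.
        shutil.copy2(patched_copy, target)
        target.chmod(0o644)

    # Example: push a (hypothetical) fixed viewtopic.php to one install.
    # apply_hotfix(Path("/home/alice/public_html/phpbb"),
    #              "viewtopic.php", Path("/opt/patches/phpbb/viewtopic.php"))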

A centrally managed approach also gives providers a lever for tough security situations: when they know many sites are vulnerable to something that is being actively scanned for, they can make the call to force an automatic upgrade of every instance.
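As a sketch of how that call might be carried out: assume the provider's definitive list takes the form of a JSON manifest recording each install's application, version, and path, and reuse the hypothetical apply_hotfix() helper from above. Both the manifest format and the version strings here are made up for illustration.

    #!/usr/bin/env python3
    """Sketch of a forced, fleet-wide upgrade driven by an install manifest."""
    import json
    from pathlib import Path
    from typing import Callable

    MANIFEST = Path("/var/lib/webapps/installs.json")  # hypothetical inventory

    def force_upgrade(app: str, fixed_version: str,
                      upgrade: Callable[[Path], None]) -> None:
        installs = json.loads(MANIFEST.read_text())
        for entry in installs:
            # Naive string comparison for brevity; real code would need a
            # version-aware compare.
            if entry["app"] == app and entry["version"] < fixed_version:
                upgrade(Path(entry["path"]))  # e.g., apply_hotfix from above
                entry["version"] = fixed_version
        MANIFEST.write_text(json.dumps(installs, indent=2))

    # Example: a (hypothetical) phpBB hole below version 2.0.23 is being
    # actively scanned for, so every affected instance is upgraded in one pass.
    # force_upgrade("phpbb", "2.0.23",
    #               lambda p: apply_hotfix(p, "viewtopic.php",
    #                                      Path("/opt/patches/phpbb/viewtopic.php")))

Because the provider, not the customer, owns the manifest and the upgrade hook, the whole fleet gets patched without waiting on thousands of individual users.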

Many would say that this crosses the line between "service provider" and "application maintainer." That is true, but as things evolve we generally must move toward more control, even into territory we've avoided entering in the past. Remember when people used to cite "end to end!!" all the time? Or when spam filtering was considered harmful? Thanks to the rise of spam as a major drain on resources and a security threat, we simply cannot live without spam filtering now. Change is the nature of the business.
