Network Neutrality: Either Way, Get Ready to Pay

Feelings run pretty high among Internet users when discussing Net neutrality. The idea that network operators such as AT&T and Verizon could become Internet gatekeepers, charging tolls to certain content providers to carry certain types of data, goes against the whole ethos of a neutral network which simply transports packets regardless of their content.

This strength of feeling is demonstrated by one coalition whose “preserve net neutrality” petition gathered 250,000 signatures in less than a week, including those of organizations as diverse as Gun Owners of America and Craigslist. Sir Tim Berners-Lee, the web’s inventor, made his views clear at a web conference in Britain at the end of May. “It’s better and more efficient for us all if we have a separate market where we get our connectivity, and a separate market where we get our content. Information is what I use to make all my decisions. Not just what to buy, but how to vote,” Berners-Lee told journalists.

The issue has become highly politicized, with at least six bills either in committee or awaiting House floor votes, and in late May the House Judiciary Committee approved a bill that could oblige broadband providers to adopt neutrality principles and operate their networks in a “non-discriminatory” way.

One of the arguments in favor of net neutrality is that it fosters innovation. If search engines are forced to pay extra so that network providers will carry or prioritize their data, this favors those that can afford to pay rather than letting the market decide which ones are best. And if these tolls are high enough, they clearly favor established companies over startups, which may not be able to afford them, or which would otherwise spend their startup cash on product development rather than on the right to compete on an equal footing with the incumbents.

It’s also clearly in the interests of telcos around the world – which along with cable companies operate most of the network infrastructure – to try to charge companies injecting voice traffic onto the network, as it is these companies which are competing directly with the telcos’ circuit switched voice businesses. If VoIP specialist Vonage or anyone else is forced to pay the telcos to send voice traffic over the Internet, then these companies are no longer such a competitive threat to the carriers – they end up becoming new customers providing new income streams. The losers are the users of Internet telephony, consumer or business, who almost certainly end up bearing the cost of these charges.

The network operators’ argument is that the Internet was never designed to carry the huge amounts of traffic that video, for example, generates. If network operators are to carry data individually to millions of endpoints, then, they argue, video content providers should be forced to contribute to the cost of upgrading the network infrastructure. And time-sensitive applications like video are likely to give a better end-user experience if they are prioritized, which can only be done for a fee. But it’s also true that the Internet was never designed to carry Web traffic either, and that content providers already pay for a connection to the Internet, so why should they have to pay again to use it? Users of that content pay for a connection too, so why should they be saddled with extra charges to receive certain types of data when those charges are inevitably passed on to them? If businesses or consumers want to receive video content, they already have to pay to ensure they have a big enough pipe.

But what about enterprises that may wish to use data-intensive Web-based applications or specific types of content? Obviously they don’t want to pay new charges for services they are already getting, but could a two-tiered Internet actually be a benefit? Would it be worth paying more for better performance for some types of applications, and would a two-tiered Internet actually provide that?

Absolutely not, according to Daniel Berninger, a senior analyst at Minneapolis-based research house Tier1. “It’s important not to confuse Quality of Service (QoS) with a two-tier Internet,” he says. “QoS over the Internet never got anywhere because it is too difficult to do interconnections with hundreds or thousands of network operators. The result is that the general level of quality of service on the Internet got better and keeps getting better.” Berninger says that in 1996 the threshold was reached for voice over IP, and now video over the Internet is possible. “Internet users have moved from dial-up to broadband because they want more, and service providers provide it because they can sell them more.”
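Berninger’s interconnection point is worth unpacking: applications can already request priority by setting the Differentiated Services (DSCP) field in each outgoing IP packet’s header, but the marking only matters if every operator along the path agrees to honor it, which is exactly the multi-carrier coordination problem he describes. As a minimal illustration (not drawn from the article), the sketch below marks a UDP socket with the “Expedited Forwarding” class conventionally used for voice traffic:

```python
import socket

# DSCP "Expedited Forwarding" (EF, value 46) is the class commonly
# used for voice. DSCP occupies the upper six bits of the IP TOS
# byte, so shift left by two to get the TOS value.
DSCP_EF = 46
TOS_EF = DSCP_EF << 2  # 184

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Datagrams sent on this socket now carry the EF marking in their
# IP headers -- but whether any router along the path prioritizes
# them is entirely a matter of each operator's policy.
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
sock.close()
```

Setting the bits is trivial; the hard part, as Berninger notes, is persuading hundreds of independently run networks to treat them consistently.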

He believes the idea that extra charges are needed to pay for new infrastructure is bogus, and that the only winners if net neutrality is lost will be the carriers. “There is no upside for the ecosystem as a whole. All that happens is the phone companies will survive for a bit longer as they get a new income stream, but innovation will screech to a halt,” he says.

But his downbeat view is not shared by Jan Dawson, research director at UK-based analyst Ovum. Dawson believes that a two-tier Internet could be of significant benefit to organizations which require high-speed, high-quality data links – for example those in the media industry or those involved with video-intensive applications like telemedicine. “A two-tier Internet would be of interest to companies which at the moment are forced to use private networks to get the quality of service they need, because this sort of connection is very expensive. If there were a premium version of the Internet that offered similar quality at lower cost it would be very attractive.”

He envisions a voluntary system where it’s possible to use a given Web application or receive data over the “free” Internet, or choose to pay to receive the same data over a higher quality Internet connection. Rather than see the existing infrastructure divided to carry two classes of data, he expects that the premium class would be carried over new infrastructure, so the quality of service of free traffic would not suffer – if anything it would improve since less data would actually be traveling over it.

But skeptics doubt that this would, in fact, come to pass. A more likely scenario, they believe, is that some of the existing infrastructure would be used for premium traffic, with all the rest of the traffic crammed into whatever bandwidth is left. While premium traffic would travel faster, the rest would travel slower.

At the time of writing, the House of Representatives has just passed the Communications Opportunity, Promotion and Enhancement Act (COPE) without the network neutrality provision that many campaigners had wanted. Rep. Edward Markey’s amendment ensuring network neutrality was defeated by 269 votes to 151.

So what is the likely outcome? The quick answer is that some sort of two-tier Internet appears to be inevitable. For enterprises, this may not be too significant in the short term, but in the longer term most will probably end up paying more for Internet connectivity than they otherwise would.

But on the positive side, a two-tier Internet could reduce overall bandwidth costs if leased line traffic could be switched to lower cost premium Internet connections without loss of quality. It could also make the free Internet perform better for those who don’t wish to pay.

Overall, enterprises face either higher or lower bandwidth costs, and either faster or slower network performance on the “free” Internet. What will actually happen is really anyone’s guess.

Paul Rubens
Paul Rubens is a technology journalist specializing in enterprise networking, security, storage, and virtualization. He has worked for international publications including The Financial Times, BBC, and The Economist, and is now based near Oxford, U.K. When not writing about technology Paul can usually be found playing or restoring pinball machines.
