Prediction 2009: No Net Neutrality Regulation

by Marc Scribner on January 2, 2009 · 8 comments

in Regulation, Tech & Telecom, Zeitgeist

Perhaps this is just wishful thinking, but I think that 2009 may see the death of calls for net neutrality regulation, and may even see some of the most ardent supporters of neutrality soften their stances, as it becomes painfully obvious that non-neutral arrangements for distributing content—especially large files like movies and digitally distributed software—are the best way for the maturing Internet to cope with the ever-growing volume of content online.

But before I address why I think more proponents of neutrality regulation will be jumping ship this year, we should break down the neutrality debate into its constituent parts.  I think it's most useful to think of net neutrality as three separate policy questions—one dealing with censorship, one with prioritizing traffic, and one with the physical architecture of the Internet.


Censorship

I’m of course sympathetic to the idea that ISPs (Internet Service Providers) and other network companies, like wireless phone carriers, shouldn’t be in the business of making judgments about content.  But history has shown fairly decisively that no law or regulatory body is needed to ensure network providers stay neutral on content.

As Adam Thierer of the Progress & Freedom Foundation, a fellow blogger at the Tech Liberation Front, points out in his post “Net Neutrality & the White Hot Spotlight of Public Attention,” even the appearance of non-neutrality toward content has been met with such an incredible public backlash that network providers have already been forced by the court of public opinion to walk a fine line when it comes to content neutrality.

One example occurred in September 2007, when Verizon denied a request from NARAL (the National Abortion Rights Action League) to send a “call to action” SMS message to members who had signed up to receive them.  Verizon wasn’t picking sides; it was trying to avoid taking any side in the abortion debate, based on a company policy against such favorite-choosing.  But the denial of NARAL’s request wasn’t seen as Verizon staying out of the debate, but rather as an attack on the pro-choice movement.  The public outcry was met with a change in policy at Verizon (along with the claim that the existing policy had been misapplied), which quickly granted NARAL the ability to send its pro-choice messages via SMS and extended the same access to pro-life groups.

Other famous (or perhaps infamous) examples of backlash against even accidental violations of neutrality have popped up over the last few years.  Cox Interactive customers found in early 2006 that they were unable to access Craigslist.  Users cried foul, while pundits claimed Cox was blocking the online classifieds site to boost its own classifieds business among its Internet users.  Cox pointed the finger at Authentium—a company working with Cox to provide security software—and claimed the software was mistakenly blocking Craigslist.  The problem was corrected, but Cox took a serious beating in the press.  The negative publicity did more damage to the company than any fine from the FCC ever could.

Prioritizing Traffic

Perhaps the most contentious issue in the neutrality regulation debate is network management.  Since the Internet first became a commercial enterprise, networks have been much more than cables and routers.  A sophisticated process of prioritization of data has accompanied the Internet’s growth, allowing for dynamic responses to demand and helping to curtail abuses like spam email.
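To make the idea of prioritization concrete, here is a toy sketch of priority queuing, the kind of management described above. The traffic classes and priority values are my own invention for illustration; real routers use far more sophisticated queuing disciplines.

```python
import heapq

# Invented traffic classes: lower number = higher priority.
PRIORITY = {"voip": 0, "web": 1, "bulk": 2}

class Scheduler:
    """Dequeue latency-sensitive packets before bulk transfers."""

    def __init__(self):
        self.queue = []
        self.counter = 0  # tie-breaker keeps FIFO order within a class

    def enqueue(self, kind, packet):
        heapq.heappush(self.queue, (PRIORITY[kind], self.counter, packet))
        self.counter += 1

    def dequeue(self):
        return heapq.heappop(self.queue)[2]

s = Scheduler()
s.enqueue("bulk", "file-chunk-1")
s.enqueue("voip", "audio-frame-1")
s.enqueue("web", "page-request")
print([s.dequeue() for _ in range(3)])
# → ['audio-frame-1', 'page-request', 'file-chunk-1']
```

The VoIP frame jumps ahead of the bulk packet that arrived first—exactly the sort of non-neutral treatment that keeps a phone call usable while a large download continues in the background.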

Proponents of network neutrality regulation believe that some network management techniques are unacceptable politically, claiming that they endanger free speech.  However, unlike the public square, the commercial Internet has always been a pay-to-play system with prioritization and other optimization built into its architecture.  It’s designed with functionality in mind, rather than political correctness.

That said, private enterprises must still abide by their own contracts and maintain an open and honest relationship with their customers.

In mid-2007 it became evident that Comcast was using prioritization methods it hadn’t made known to its customers.  Users working in conjunction with groups like the Electronic Frontier Foundation discovered that Comcast was blocking outgoing traffic from the popular file-sharing protocol BitTorrent.  BitTorrent is designed to let people rapidly download files, not from a central server, but from other BitTorrent users dispersed around the Internet.  To do this, BitTorrent users not only download files, but also upload, or “seed,” files to other users.  Comcast blocked this seeding during several hours of the day.
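The economics of seeding can be seen in a toy model (purely illustrative, not the actual BitTorrent protocol): when every downloader also re-shares what it has, the origin server sends each piece of a file only once, no matter how many peers want it.

```python
def distribute(pieces, peers):
    """Count how many transfers the origin server performs when
    peers re-share ('seed') pieces among themselves."""
    origin_uploads = 0
    have = {peer: set() for peer in peers}
    for piece in pieces:
        # The origin sends each piece to a single peer...
        first = peers[0]
        have[first].add(piece)
        origin_uploads += 1
        # ...and the swarm propagates it peer-to-peer from there.
        for peer in peers[1:]:
            have[peer].add(piece)  # served by another peer, not the origin
    return origin_uploads

peers = ["alice", "bob", "carol", "dave"]
print(distribute(range(10), peers))  # → 10, not 10 * 4
```

Without seeding, the origin would perform one upload per piece per peer; with it, the load on any single link drops, which is also why the upstream seeding traffic was what Comcast chose to throttle.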

This “traffic-shaping” policy, just like Cox’s and Verizon’s blocking of services, caused an uproar among opinion leaders in the Internet community and pundits in the tech press.  Comcast responded that BitTorrent traffic was choking its network during its busiest hours.  Comcast was also quick to point out that BitTorrent users were a very small minority of its customers, and that the vast majority benefited from Comcast curtailing this bandwidth-gobbling service during times of peak usage.

Despite all of these reasonable arguments for why its network management was necessary and even desirable, Comcast eventually stopped the practice and later partnered with BitTorrent to address the problem of network management.  Public pressure had been so great that the company was forced to change its policy, instead adopting a 250-gigabyte monthly cap on traffic for all users.

Some cited this as a justification for neutrality regulations, but this is a clear example of neutrality being preserved by the free market and forces of public opinion.

Regulators claim that, because of the esoteric nature of high-technology services, regulation and government control are needed to protect consumers.  But history shows that consumers turn to trusted experts such as consumer product reviewers, technology journalists, and their tech-savvy friends and colleagues for advice on making purchases.  It’s because Comcast lost the favor of these opinion leaders that it had to change its policy.  Had it done otherwise, it risked losing a significant portion of its customer base.

Architecture of the Net

The Internet is physically constructed of many smaller, distinct networks.  These networks exchange data between each other to form the Internet—the network of networks.  Traffic traveling from one coast of the United States to another usually passes over a handful of networks including the Internet’s “core”—the interchanges between the largest networks such as AT&T, Verizon, Level3, Global Crossing, or Sprint.

This network-to-network model, however, has been augmented through what are commonly referred to as “short lanes” and “fast lanes.”

Short lanes are used by companies like Akamai and Google to provide better service by placing data closer to users.  By placing servers in the facilities of ISPs, content providers only have to transfer a given file across the country once.  Once that file—for example, an HD movie or software title—reaches a server at an ISP, a copy is stored there.  Later requests can then be fulfilled by these nearby servers.  This means the core of the Internet is burdened by these large files only once, rather than thousands or millions of times.
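A minimal sketch of this caching logic (the class and URL below are invented for illustration, not any CDN's actual design): the first request for a file crosses the Internet's core, and every later request is served from the local copy.

```python
class EdgeCache:
    """Toy edge cache sitting inside an ISP's facility."""

    def __init__(self):
        self.store = {}
        self.core_transfers = 0  # trips across the Internet's core

    def fetch(self, url):
        if url not in self.store:
            self.core_transfers += 1           # one costly long-haul fetch
            self.store[url] = f"<bytes of {url}>"
        return self.store[url]                 # every later hit is local

cache = EdgeCache()
for _ in range(1000):                          # a thousand viewers
    cache.fetch("example.com/hd-movie")
print(cache.core_transfers)  # → 1
```

A thousand viewers produce a single transfer across the core—the win-win the short-lane arrangement delivers.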

Fast lanes skip much of the Internet’s core altogether, further relieving congestion at the interchanges most likely to become bottlenecks for data.  One such fast lane, LimeLight, provided the online video for the 2008 Beijing Olympics.  Its servers in Beijing sent data directly over its own global fiber-optic network to servers at ISPs.  This represents not only a less traffic-intensive means of distribution, but a much faster one, allowing users to watch the games as they happened.

Proponents of network neutrality regulations believe that such arrangements represent unfair advantages available only to the wealthiest companies.  While it’s true that it takes a significant amount of capital to afford such distribution, these distribution methods also benefit small companies and consumers.  By either reducing or removing traffic from the core of the public Internet, fast lanes and short lanes free up bandwidth, making Internet service faster for small businesses and consumers at no cost to them.

A Market for Neutrality, Not Neutrality by Regulation

Ultimately, the death knell of neutrality regulation will be the win-win scenarios that non-neutral architectures represent, and the acknowledgment that the court of public opinion is a much swifter and fairer judge than any hearing called by the FCC.

It’s already the case that some supporters of neutrality are falling away from the cause. Though a recent Wall Street Journal piece claiming that Google was building its own fast lane online was debunked (it’s really a short lane), there was some truth to the rest of the story.  Aside from the technical inaccuracies about Google’s current project, the Journal was very astute in noting that many of the billion-dollar supporters of neutrality—Microsoft, Yahoo!, and Amazon—have changed their tune.  Many of these firms are forming partnerships or making other arrangements with ISPs to deliver their content more efficiently.

Moreover, even intellectual leaders like Harvard professor Lawrence Lessig have said that “there are good reasons to be able to prioritize traffic.” Lessig endorsed Mr. Obama, and many in his former home of California urged him to run for Congress.  He’s a person with much political clout who may prove a bellwether in this debate.

Mr. Lessig’s statement also brings up another important point.  Lessig’s definition of “good reasons” might be very different from mine.  Some might say this is a reason to have a definition enshrined in law, but I think it means quite the opposite.

That’s because regulation won’t be as flexible, and therefore won’t allow for as robust an Internet, as letting the pro-neutrality market continue along with a less rigid definition of neutrality.

Any regulation would have to be defined in legal terms (inevitably misinterpreted and distorted for political gain), rather than by the ever-shifting ideas of what acceptable network management looks like.  As new technologies emerge, technologists, tech reporters, pundits, think-tankers, and others will judge them as either good or bad for the Net.  Consumers will vote with their feet, either staying with the providers they have or switching to a more neutral provider.

Some Good Policy Changes

If anything, public policy needs to focus on ensuring that there are more places for consumers’ feet to go.  Increasing competition in broadband markets by loosening restrictions on wireless spectrum would be a good start.  By either expanding the commons or (preferably) creating private ownership of large swaths of the spectrum, we could see robust, viable alternatives to land-line broadband emerge.  Taking a federal wrecking ball to state and municipal franchising agreements that block outside competition would also increase competition tremendously.

Hopefully the Obama administration and Congress will do the opposite of what history shows politicians usually do.  Most often, politicians address concerns about corporate power caused by regulation (such as ISP duopolies) by creating a whole new set of regulations with a new set of problems.  If this temptation is resisted, we may be able to roll back the regulations that limited competition in the first place and see the Internet grow as a consequence.

But again, this is probably just wishful thinking.

Alan January 2, 2009 at 6:22 pm

This is a good article! Just as a general anti-regulatory stance may ask us to give up utilitarian advantages in the name of an hallowed principle, so too a pro-regulatory bias may be asking the same. The idea that net-neutrality regulation may have less than desirable consequences and should be moderated is not unreasonable. That is probably true of a completely hands-off anti-regulatory approach also. One thought comes to mind: in many past instances of general utilities that provide distribution services for content providers, for instance, radio, television and movie theater networks, anti-trust regulations have prohibited the content providers and the content distributors from being the same economic entities. For instance, movie theater chains can not (could not?) produce movies. At least one example provided, the case of Cox Interactive being accused of blocking access to Craigslist to boost its own classifieds business amongst its Internet users, falls into this category. Keeping distributors strictly separate from providers would provide a structural protection against at least some forms of unreasonable favoritism.

Avitar January 6, 2009 at 10:04 am

Wrong! Wrong! Wrong! This article by Cord Blomquist could not be more off the mark. As a computer engineer, my computer network experience goes back before President Reagan released the Internet to commercial use in the 1980's. As a student of history, I have to endorse the view expressed in “Nature abhors a vacuum.” Anything that is not explicitly placed outside of control will inevitably be seized for influence, status and profits. Whether by governments, corporations, or NGOs, somebody is trying to steal everything not nailed down. It is also important to remember, “If it can be pried up, then it is not nailed down.” One only has to consider the right to free speech and all of the exceptions created by the Political Correctness police to see that this is true. The Chinese government will censor the American Internet if cooperating with the censors is not a crime. Local politicians (think Governor Blagojevich) will offer to allow Internet services to be made a metered, franchised monopoly in exchange for campaign contributions. Corporations like Comcast will enter into agreements (secret and otherwise) to limit the availability of content to preferred providers in exchange for consideration, unless upper management can look forward to being dragged into court every week. The proposal that was offered in Congress, that any company with right-of-way be allowed to run fiber to the house, is probably the best one. Technically, the most expensive part of the Internet is securing right-of-way, followed by the cost of installing fiber. The bandwidth of fiber can reach terabits per second if the electronics at each end are upgraded, and that is a Moore's law function. Bandwidth should currently be in the 600 megabit per second range for home service. However, businessmen benefit by making the commodity they sell scarce and costly. Builders support zoning boards, BP advertises environmentalism, and Comcast and the other ISPs support “bandwidth management.”

