Prof. Justin Hughes, Cardozo School of Law, Yeshiva University
October 2, 2009

Last week, Federal Communications Commission Chairman Julius Genachowski gave an address at The Brookings Institution in which he proposed that the FCC’s position on “net neutrality” be strengthened and deepened – through converting four existing principles into rules and adding two new net neutrality rules, one concerning non-discrimination “against particular Internet content or applications” and one concerning transparency – “that providers of broadband Internet access must be transparent about their network management practices.”1

Chairman Genachowski’s speech touches off what will probably be a long, drawn-out political process to craft the final FCC position on net neutrality.  Copyright law could be a wildcard in that process, so it’s worthwhile to consider the role of Internet service providers (ISPs) in the copyright equation.

This is not the first time that regulation of ISPs has crossed paths with copyright.  More than a decade ago, “cyberlaw” started with the problem of how to handle information torts on the Internet – spam, invasion of privacy, pornography, copyright infringement – and what responsibility ISPs should have for such wrongs.  The first of these cases to reach the courts – in the United States, the UK, and Japan – were defamation claims.  With early dial-up speeds, it was easier to defame than to infringe.  The question of ISP liability for copyright infringement by their users was debated for many years, but it became increasingly settled that ISPs should be shielded from such liability unless they knew about the infringement and continued to permit it.

That conclusion was based on a few key assumptions.  One was that if ISPs faced financial liability for copyright infringements, [a] they would become private censors, and [b] Internet access could become much more expensive (as ISPs would have to expend resources on monitoring and insurance).  Both of these could have profoundly undermined the Internet’s potential as a communications medium.  And while a web-hosting ISP could be advised of ongoing infringement on a website it was hosting – or notice that infringement during its regular operations – it was also assumed that transmission ISPs could not know about copyright infringements in “wire time.”

For this reason, the “notice and take-down” provisions of the 1998 Digital Millennium Copyright Act (DMCA) were crafted to grant safe harbors to host ISPs [as well as search engines] only when they did not know about an ongoing infringement [17 USC 512(c) and (d)], while the DMCA imposes no such knowledge limitation on transmission ISPs [17 USC 512(a)].

At the time – as now – it was also implicitly understood that ISPs and copyright owners had conflicting interests.  While ISPs might publicly say they were against copyright infringement, they were not ignorant of the fact that – generally speaking – unauthorized distribution and/or performance of copyrighted music and audiovisual works was driving much of the demand for Internet services, particularly broadband.  Sheltered by the safe harbors of the DMCA, the EU Electronic Commerce Directive, and counterpart laws in Japan, China, Australia, and Singapore, mainstream ISPs seemed to have limited responsibility for – and even less interest in – copyright enforcement on the Internet.

But technology changes, and the filtering capacity that seemed improbable a few years ago has become more and more a reality.  For web hosts like YouTube, filtering technology was developed not just to appease copyright owners but to serve their own business model (you cannot build a targeted advertising model unless you know what people are watching).  Prompted by technological change, courts from California to Belgium have chipped away at the assumption that ISPs cannot do anything because they do not know anything.  Instead, there are strains of a new equation – if an ISP can easily do something to stop copyright infringement, then it should.

This is an extremely complex and shifting story, but nowhere has the realignment in capacities and interests been more dramatic than with broadband ISPs and audiovisual works.  In 2007, Comcast, one of the largest providers of Internet access in the United States, was found to be slowing down BitTorrent packets.  While some BitTorrent packets carry authorized distributions of materials, the vast majority of BitTorrent traffic distributes large audiovisual files without authorization.  It quickly became apparent that slowing down BitTorrent packets was a common practice among ISPs in the United States, Canada, Singapore, and Australia.

The widespread decision to slow BitTorrent and other P2P applications provoked a few rumblings of a conspiracy between ISPs and copyright owners, but the alignment of the two camps’ interests seems coincidental.  The problem ISPs are addressing is one of their own creation.  By offering unlimited Internet access for a flat monthly fee – a business model credited to AOL or AT&T Worldnet (depending on whom you read) – ISPs invite some people to use disproportionate amounts of network resources.  And as P2P applications have gained popularity, they have increasingly sucked up broadband capacity and strained the system.

Slowing P2P packets is, from an ISP’s business perspective, a sensible way of managing its network.  And there are several good reasons to target the BitTorrent application.  First, BitTorrent packets may account for anywhere from 25 percent to half of all Internet traffic.2  Second, we can reasonably assume that the ISPs concluded that since the vast bulk of BitTorrent usage is the unauthorized distribution of films, BitTorrent users would have a weaker case to complain publicly – and could have their ISP service terminated anyway under most terms of service.

But there may be two additional, important reasons for the choice of BitTorrent.  First, there were widespread reports that BitTorrent by design seeks out faster connections whenever it is in use – meaning that when a large ISP adds capacity, the BitTorrent protocol will shift its usage to that ISP.  In the words of an executive at one of Canada’s larger ISPs, “[y]ou can’t spend your way out of this problem.  [P2P] has a behaviour that swamps all other behaviours.”3
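To make that reported behavior concrete, here is a toy sketch in Python – not the actual BitTorrent choking or peer-selection algorithm, and the peer names and rates are invented for illustration.  The point is simply that rate-based peer preference means new capacity anywhere in the network is quickly absorbed:

```python
# Toy illustration of rate-based peer preference (NOT the real
# BitTorrent protocol logic). Peers pull pieces from whoever is
# delivering fastest, so added capacity attracts traffic.
peers = {"isp_a": 2.0, "isp_b": 5.0, "isp_c": 1.0}  # measured Mbit/s per peer

def pick_peers(rates: dict[str, float], k: int = 2) -> list[str]:
    """Request pieces from the k fastest peers observed so far."""
    return sorted(rates, key=rates.get, reverse=True)[:k]

print(pick_peers(peers))  # ['isp_b', 'isp_a']

# If isp_c's measured rate jumps to 10.0 after a capacity upgrade,
# traffic shifts there on the very next selection round:
peers["isp_c"] = 10.0
print(pick_peers(peers))  # ['isp_c', 'isp_b']
```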

Second – and despite popular misconceptions – identifying BitTorrent packets probably does not require deep packet inspection.  The particular characteristics of BitTorrent make it possible to identify these packets by their traffic pattern, not their content.  Because that traffic flow is basic to the BitTorrent architecture, efforts to camouflage BitTorrent packets against ISP detection seem to have had limited success – so far.
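As an illustration of the idea – the field names and thresholds below are invented assumptions, not any ISP’s actual classifier – a flow-based heuristic can flag likely BitTorrent traffic from connection statistics alone: a swarm of many simultaneous peers with sustained, roughly symmetric upload and download, no payload inspection required.

```python
# A minimal, hypothetical sketch of traffic-pattern (flow-based)
# classification. Thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class FlowStats:
    peer_count: int        # distinct remote hosts the subscriber exchanges data with
    upload_bytes: int      # bytes sent over the sampling window
    download_bytes: int    # bytes received over the sampling window
    duration_s: float      # length of the sampling window in seconds

def looks_like_bittorrent(f: FlowStats,
                          min_peers: int = 20,
                          min_rate_bps: float = 100_000,
                          symmetry_floor: float = 0.25) -> bool:
    """Heuristic: many peers + sustained transfer + near-symmetric traffic."""
    if f.duration_s <= 0:
        return False
    total = f.upload_bytes + f.download_bytes
    rate = total * 8 / f.duration_s                      # bits per second
    symmetry = min(f.upload_bytes, f.download_bytes) / total if total else 0.0
    return (f.peer_count >= min_peers
            and rate >= min_rate_bps
            and symmetry >= symmetry_floor)

# Example: 60 peers, ~1 GB exchanged in 10 minutes, roughly symmetric.
print(looks_like_bittorrent(FlowStats(60, 500_000_000, 600_000_000, 600.0)))  # True
```

Note that nothing in this sketch reads a single byte of packet payload – which is why, on this view, camouflaging the content does little to hide the swarm.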

This discussion is meant neither to condone nor to condemn any particular ISP’s activities.  Indeed, while the FCC criticized Comcast for throttling BitTorrent (in a 3-to-2 vote), its Canadian counterpart did not condemn Bell Canada for similar actions.  (Some important differences between Comcast’s and Bell Canada’s activities can explain the different rulings.)  My goal is to explain why it makes sense for ISPs to “discriminate” – in Chairman Genachowski’s parlance – against this particular application.

How else could transmission ISPs keep BitTorrent (or whatever popular P2P application usurps its place) from sucking up too much bandwidth?   Again, copyright law enters the equation.

One approach to the problem of P2P devotees chewing up inordinate amounts of bandwidth would be to charge each user for the true amount of bandwidth she is consuming – and in 2008 there were reports that at least one company, Time Warner, was experimenting with differential, capacity-usage pricing models.  The problem is that such a business model would likely expose ISPs to vicarious liability under American copyright law.
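To see how such a model ties revenue to usage, here is a deliberately toy sketch; the allowance, base fee, and overage rate are invented for illustration and bear no relation to Time Warner’s actual trial terms:

```python
# A toy sketch of usage-based (metered) pricing. All figures are
# hypothetical; the point is the shape of the revenue curve.
def monthly_bill(gb_used: float,
                 included_gb: float = 40.0,
                 base_fee: float = 30.0,
                 overage_per_gb: float = 1.0) -> float:
    """Flat base fee covers an included allowance; overage billed per GB."""
    overage = max(0.0, gb_used - included_gb)
    return base_fee + overage * overage_per_gb

print(monthly_bill(25))   # 30.0  - a light user stays at the flat fee
print(monthly_bill(400))  # 390.0 - a heavy P2P user pays for capacity consumed
```

Under any schedule of this shape, the ISP’s revenue rises directly with consumption – and, to the extent that consumption is driven by unauthorized downloading, so does the “direct financial interest” discussed below.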

As a 1963 decision by the U.S. Court of Appeals for the Second Circuit described it, vicarious liability arises “when the right and ability to supervise [the infringer] coalesce with an obvious and direct financial interest in the exploitation of copyrighted materials.”4  ISPs already fulfill the first requirement for vicarious liability: They can control the activity at issue – that, after all, is what Chairman Genachowski wants to address.

Case law concerning both Internet and meatspace businesses makes it likely that the second requirement for vicarious liability is fulfilled – in the words of the Napster decision – “where the availability of infringing material acts as a draw for customers” or where the ISP’s “future revenue is directly dependent upon ‘increases in user-base,’” i.e., driven by the availability of infringing material.  In short, if an ISP “meters” its usage and the amount of usage increases with the unauthorized downloading of copyrighted materials, that ISP makes itself a plump target for a vicarious liability claim.

Another option for ISPs is to offer “tiered” service, giving customers who pay a premium price greater bandwidth or priority for their packets.  Such tiered service makes a lot of sense for customers who need their packets uninterrupted, such as businesses using VoIP services or anyone engaged in telemedicine.  Most advocates of net neutrality say they are willing to accept this kind of non-neutrality.  But, again, we can conjecture that among the first generation of “premium service” purchasers, those interested in uninterrupted bandwidth will want it more for movie downloads than for remote surgery.  Depending on how the premium service is offered and who its customers are, an ISP could again find itself in a murky zone on the question of vicarious liability under copyright law.
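Conceptually, tiered service amounts to priority scheduling of packets.  The sketch below is a minimal application-level illustration – the tier names are invented, and real carrier equipment implements quality-of-service in hardware (e.g., DiffServ marking), not in Python:

```python
# Minimal sketch of "tiered" service as strict-priority packet scheduling.
# Tier names and the two-tier scheme are illustrative assumptions.
import heapq
from itertools import count

PRIORITY = {"premium": 0, "standard": 1}  # lower number = dequeued first
_seq = count()
queue = []

def enqueue(packet: bytes, tier: str) -> None:
    # The sequence number keeps FIFO ordering within a tier.
    heapq.heappush(queue, (PRIORITY[tier], next(_seq), packet))

def dequeue() -> bytes:
    # Premium packets always leave the queue before standard ones.
    return heapq.heappop(queue)[2]

enqueue(b"movie-chunk", "standard")
enqueue(b"telemedicine-frame", "premium")
print(dequeue())  # b'telemedicine-frame' jumps the queue
```

The copyright problem arises not from the scheduling mechanism itself but from who buys the premium tier and why.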

The politics of copyright could also exert their own gravitational pull on the net neutrality debate.  Copyright advocates recognize that traffic management technology, if widely used, would be a crippling blow to file-sharing networks, but it is not politically wise for the motion picture studios to defend the transmission ISPs too loudly.  On the other side, for Public Knowledge and the Electronic Frontier Foundation, fighting ISPs’ actions against P2P serves both their network neutrality agenda and their general struggle against Internet copyright enforcement.

In contrast, companies like eBay, Amazon, and Google have an interest in separating copyright enforcement via traffic management and packet inspection from the more general network neutrality concerns that affect them.  The recording and film industries have already stepped into the debate, prodding the different congressional sponsors of network neutrality to craft their proposals to protect only “lawful” traffic – and perhaps pushing Chairman Genachowski to toe that line in both his confirmation testimony and his remarks at The Brookings Institution.5

In those remarks, Chairman Genachowski said that “[t]he enforcement of copyright … and the obligations of network openness can and must co-exist,” but the devil will be in the details.  Genachowski also promised us a “fact based” and “data driven” inquiry in the development of net neutrality rules.  So, here’s a question: What happens if the facts show that today 90 percent of BitTorrent use – completely visible to transmission ISPs – is for unauthorized downloads of audiovisual works and video games?  95 percent?  97 percent?6

In any other regulatory regime, such statistics – and our new technological capacities – would suggest placing some responsibility on the ISP as the party able to stop the tort at the least cost.  But is the Internet a special case?  Maybe, but it’s definitely not as special as it used to be.  In the words of the Ninth Circuit in 2008, the Internet “is no longer a fragile new means of communication that could easily be smothered in the cradle by overzealous enforcement of laws.”7

Finding the right answers here won’t be easy, but having started down the road of regulating the net, the FCC needs to ensure that its own zeal for network neutrality does not smother meaningful enforcement of copyright law.