The Future: It’s Not What It Used to Be

Adam Thierer pointed me to a posting by Maria Popova at Brain Pickings, in which Popova has pieced together a somewhat cheesy 1972 documentary based on Alvin Toffler’s classic “Future Shock.”  The over-the-top narration by Orson Welles is well worth the price of your time.  I read Future Shock as a kid (I was 11 when it was published) and didn’t really understand it, but I loved its urgent tone and its sense of revealing a secret the “establishment” couldn’t handle.

Toffler was really the second person to point out the gap between the capacity of technology to change our lives and the ability of human beings to absorb that change.  The first observation (or at least an earlier version of it) comes from Henry Adams, in his autobiographical “Education.”

Adams called the phenomenon, which he claimed had shipwrecked his life, “The Law of Acceleration,” a revelation that struck him in the Hall of Dynamos at the 1893 World’s Columbian Exposition in Chicago, where he seems to have suffered a serious mental breakdown.  Working with pseudo-math and pseudo-physics, Adams calculated that the accelerating pace of change would end civilization by around 1920.

The world didn’t end in 1920, nor did it end in 1970.  Although perhaps from the standpoint of a Henry Adams (who died in 1918), the world of today might represent something so different and unpleasant that it could be described as post-apocalyptic.

Taking a more economic view of the process, I referred to it as The Law of Disruption.  Some technological breakthroughs cause considerable chaos and revolutionary change in business and society, it’s true, but what results on the other side, for those who survive the change, is always something better, or at least more efficient.

The period of adjustment is, to me, the most interesting.  Lucky for me, then, that we seem to move from one period of adjustment to the next, or even to overlapping periods coming faster and closer together as “The Law of Acceleration” proceeds.

As excited as we get by the latest innovations in information, medical, materials, and transportation science, we can never exploit those changes as fast as the technology itself would make possible.  We need time to adjust, and to trade our inadequate metaphors (horseless carriage, radio with pictures, iron rope, etc.) for a new reality in which disruptive technologies are recognized as something new and not simply an incremental improvement.

From a business standpoint, the gap is an opportunity—companies can focus on helping customers and other business partners move up the curve just a little faster than they might on their own.  In some ways it’s like the old joke about the two campers who hear a bear outside their tent.  One of them starts to put on his shoes and the other says, “What are you doing?  You can’t outrun a bear.”  “I don’t have to outrun a bear,” the first one says.  “I just have to outrun you.”

But outside the business context, anxiety about the future—future shock—can look like a threat rather than an opportunity.  In my experience, in fact, you can divide the world into two kinds of people:  those who look forward to the sometimes uneven and messy process of adaptation and those who fear it.  Will the future be utopian (the United Federation of Planets) or dystopian (The Borg)?  The answer is of course neither—some things change for the better, some don’t, but mostly the course is unpredictable.  How someone views the future says much more about who they are now than anything else.

Toffler, as best I remember the book, didn’t really take a position, but his talent and the phenomenal success of the book rested on his ability to make the anxiety seem as unbearable as possible.   The transformation of society was inevitable, Toffler agreed, but the process was creating social chaos and leaving people to suffer from “shattering stress and disorientation.”

The changes Toffler wrote about (as a journalist, he largely collected example after example and lined them up as overwhelming evidence) seem modest and even quaint by comparison to those we are now experiencing forty years later.  Clearly, humanity survived, more-or-less intact.

Which suggests that we’ll survive the next wave, and the one after that, as well.  (Further still, well, who knows?)  Transformation is an inevitable feature of modern life, and isn’t likely to go away.  When change is constant, the only thing you can predict is unpredictability.  But at least you can predict that!

Watching the documentary, you get a sense of perspective from the inability of a previous generation to imagine its own survival.  The process Toffler described was real, but the result wasn’t nearly as disastrous as imagined.  So maybe we’ll cope just fine, too, with our dangerous inventions.

It’s not as if we have any choice.  As the Firesign Theatre famously albeit cryptically said, “Live in the future.  It’s happening now.”  (To which a character in the background retorts: “The future?  The future’s not here yet, man.”)

I met Alvin Toffler in 1999.  We were teamed together for a one-day program, incongruously, in Buenos Aires.  Toffler spoke in the morning, and I spoke in the afternoon.  We weren’t asked to coordinate our messages, but it more-or-less worked out that he was giving the pessimist view of the future and I was giving the optimist view.  Most of all, I was interested in what new examples Toffler had found in his research on future shock—examples I hoped to use in my own speeches.

He spoke eloquently and powerfully, but all the examples he gave were the same old ones he had been using all along.  His research—at least—was stuck in 1970, and he seemed largely unaware of the digital revolution already well in progress by 1999.   (I have not read his most recent book, published in 2006.)

But in some sense, Toffler found—perhaps unintentionally—a cure for future shock.  Just keep living in the past, even a past in which you are fearing a future that has already arrived and kept going.

He is of course not alone.  The hell we’re familiar with is always easier to manage than the heaven that might be achieved.

“Preserving the Internet,” But Which One?: Reading the FCC’s Net Neutrality Order (Part IV)

This is Part IV of a five-part commentary on the FCC’s Dec. 23, 2010 “Open Internet” Report and Order.

Part I looked at the remarkably weak justification the majority gave for issuing the new rules.

Part II explored the likely costs of the rules, particularly the undiscussed costs of enforcement that will be borne by the agency and accused broadband access providers, regardless of the merits.  (See Adam Thierer’s post on the first attenuated claim of violation, raised before the rules even take effect.)

Part III compared the final text of the rules to earlier drafts and alternative proposals, tracing the Commission’s changing and sometimes contradictory reasoning over the last year.

Part IV (this part) looks at the many exceptions and carve-outs from the rules, and what, taken together, they say about the majority’s dogged determination to see the Internet as it was and not as it is or will become.

Part V will review the legal basis on which the majority rests its authority for the rules, likely to be challenged in court.

What does an Open Internet mean?

The idea of the “open Internet” is relatively simple:  consumers of broadband Internet access should have the ability to surf the web as they please and enjoy the content of their choice, without interference by access providers who may have financial or other anti-competitive reasons to shape or limit that access.

In the act of trying to translate that idea into enforceable rules—enforceable, inexplicably, by a federal regulatory agency with no legislative authority over any substantial feature of the Internet economy and no real justification for creating rules of any kind for a system that is working nearly flawlessly so far—the FCC has found itself tied in unholy knots.

The rules as enacted carved out exceptions and caveats that, taken together, render the final regulations not meaningless but certainly incoherent.

In exempting from the rules a host of important innovations in network management and infrastructure optimization developed over the last decade, the FCC has stepped back from the brink of its original plan, which would have returned the Internet to the days of unreliable dial-up access and static websites.

But it has also revealed the danger of trying to regulate a rapidly-evolving life form, and risked the unintended consequence of denying it future forms of nutrition and good health.  If these rules stand and are vigorously enforced, the Internet’s further growth and development may be stunted.

The Mythical Neutrality Principle

Back in the stone age of 1998, I wrote in “Unleashing the Killer App” that one of the fundamental bases on which the Internet became an engine of innovation and even social change was that its basic protocols are non-proprietary.  Anyone can make use of them, any device can support them, and every node is a peer—without paying royalties or other tribute to anyone.  As the “lowest common denominator” standard, TCP/IP benefited from network effects to overtake several popular proprietary standards, including IBM’s SNA.

The technical and legal openness of TCP/IP has been romanticized over the years, particularly by legal scholars and journalists who know less about technology than they think they do, into a view of the Internet as a meritocracy and a vehicle for true collaboration and consciousness-raising; the fruition, as Tim O’Reilly put it, of “what we were talking about at Esalen in the ’70s—except we didn’t know it would be technology-mediated.”

The ideal of neutrality—of a level playing field in which no website, application, or device is more prominent than any other—is a persistent and compelling myth.  It evokes the heroism of the entrepreneur in the garage, developing the next Yahoo or Google or YouTube or Facebook or Twitter or Groupon, with little more than a great idea, technical skills, and the willingness to sacrifice sleep and social life for the promise of a future liquidity event.  Optimally, that means the great IPO, or the chance to change the world and make it a better place by connecting people and information in new and unexpected ways.  Wikipedia, for example.

Whatever the motivation, after a grueling race against the clock, the app is released.  If all goes well, it reaps the benefit of Metcalfe’s Law, goes viral, and becomes the next Big Thing, all in the span of time between one SXSW conference and the next Web 2.0 Summit.

No large corporation can stop the plucky inventor, or ransom a part of her invention.  No access provider can hold its invaluable user base hostage.  No competing content provider, no matter how giant, can buy up all the available market channels and freeze out the upstart start-up.  No government regulator need approve or license the invention before human testing and general use can begin.

When Worlds Collide

A considerably more mundane version of that ideal world did exist in the last half of the 1990’s.  It still exists today.  But it has become much more complex and nuanced in the last decade.

The Internet, the Web, the Cloud and the app-based economy of wireless computing devices, TVs and increasingly other things (including cars and other non-traditional computing platforms such as consumer electronics and home appliances) have evolved in interesting and productive ways, often “under the covers” of the network infrastructure.

Few consumers know or would care to know about the existence, let alone the details, of network optimization algorithms, content delivery networks, complex peering arrangements, caching and edge servers, file torrenting, mirror sites, specialized services, virtual private networks, packet prioritization based on media type, spam and other malware filters, dynamic IP addresses or domain name redirection.

All of these (and more) are mechanisms for speeding up the delivery of the most popular or the most bandwidth-intensive content.  Many have been developed by entrepreneurs or by the large access and hosting services, often working in concert with the voluntary protocol and technical committees of the Internet Society.

ISOC keeps the standards alive, flexible, and responsive to new opportunities for expansion and reinvention made possible through the agency of Moore’s Law, which continues to drive the basic technological components of digital life into the uncharted realm of the faster, cheaper, and smaller.

Strictly speaking, of course, all of these innovations violate the neutrality principle.  They recognize that some packets, either because of file size or popularity or media characteristics or importance to the recipient, require special treatment in the transport from host to client.

Video (YouTube, Hulu, Netflix), for example, can consist of very large files, and the  component packets must arrive at their destination with relatively short delays in order to maintain the integrity of streaming display.
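
To make the point concrete, here is a minimal sketch, purely illustrative and not drawn from the Order or from any provider’s actual practice, of how a scheduler might give latency-sensitive traffic classes priority over bulk transfers; the class names and priority values are assumptions invented for the example.

    import heapq
    from itertools import count

    # Illustrative traffic classes only; names and priority values are
    # invented for this example, not taken from any provider's practice.
    PRIORITY = {"voice": 0, "video": 1, "web": 2, "bulk": 3}

    class Scheduler:
        def __init__(self):
            self._queue = []
            self._arrival = count()  # breaks ties by arrival order

        def enqueue(self, packet_id, traffic_class):
            heapq.heappush(self._queue,
                           (PRIORITY[traffic_class], next(self._arrival), packet_id))

        def dequeue(self):
            # The lowest priority number is transmitted first.
            _, _, packet_id = heapq.heappop(self._queue)
            return packet_id

    sched = Scheduler()
    sched.enqueue("p1", "bulk")
    sched.enqueue("p2", "voice")
    sched.enqueue("p3", "web")
    print([sched.dequeue() for _ in range(3)])  # ['p2', 'p3', 'p1']

Even this toy scheduler “discriminates” among packets; that is exactly the kind of class-based treatment of latency-sensitive traffic that the Order’s network management discussion has to accommodate.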

Hosted services, such as medical monitoring, use parts of the same infrastructure as the public Internet, but cannot safely be left to the normal ebb and flow of Internet traffic patterns.  Limitations of the 3G wireless infrastructure—in large part a result of regulatory restrictions on cell siting and spectrum mismanagement—make it difficult to satisfy exploding customer demand for ever-more of the most bandwidth-intensive apps.

When all is said and done, the core problem with the FCC’s Open Internet Report and Order comes down to a clash of the idealized view of the neutral Internet with the reality of an always-evolving, always-improving technology infrastructure.

Chairman Genachowski, himself a former venture capitalist, is clinging to the myth of the Internet as virtual frontier, an understandable but highly dangerous indulgence in nostalgia, a remembrance of Internets past.  He’s not alone.  The romance of the American west has persisted more than a hundred years since historian Frederick Jackson Turner famously declared the frontier closed.

As he said in introducing the Open Internet proceeding in September, 2009, shortly after taking office:

“The Internet’s creators didn’t want the network architecture — or any single entity — to pick winners and losers. Because it might pick the wrong ones. Instead, the Internet’s open architecture pushes decision-making and intelligence to the edge of the network — to end users, to the cloud, to businesses of every size and in every sector of the economy, to creators and speakers across the country and around the globe. In the words of Tim Berners-Lee, the Internet is a ‘blank canvas’ — allowing anyone to contribute and to innovate without permission.”

Many of us fortunate enough to have been there at the moment the Internet reached its tipping point and became an unstoppable force, a kind of network gravity, share this nostalgia.  It was a moment that changed the trajectory of computing, upended giants, and unleashed tremendous creativity.  For me, it utterly transformed my career, much as my first FORTRAN course as an undergraduate had unintentionally started it.

But the effort to translate nostalgia into federal law—assuming, but only for the moment, that the FCC is the appropriate agency to preserve an Internet that has long since passed even if it was ever the way we old-timers remember it—has already fallen down more than its fair share of abandoned mine shafts.

The Exceptions that Expose the Rule

Even the original Notice of Proposed Rulemaking and draft order released for comment in October, 2009 included many (necessary) exceptions from strict adherence to the neutrality principle.

The proposed rules, most important, made all six neutrality rules (§§ 8.5-8.15) subject to an exception for “reasonable network management.”  Reasonable network management was defined as all “reasonable practices” broadband Internet access providers undertook to, among other things, “reduce or mitigate the effects of congestion on the network or to address quality-of-service concerns.”  (§ 8.3).  And bowing to legal limits on neutrality, reasonable network management did not apply to efforts by broadband access providers to “address unlawful conduct on the Internet,” including unlicensed sharing of copyrighted content. (¶ 139)

In explaining “reasonable network management” (¶¶ 135-141), the FCC acknowledged that the technology by which a user accessed the Internet could play a significant role in determining when a provider could act “inconsistently” with the neutrality principle but still not violate the rules.  Access over coaxial cable follows a different architecture—with different constraints—than fiber, copper, satellite, or cellular access.  For purposes of “quality of service,” the agency acknowledged that it might be appropriate for an access provider to implement a “network management practice of prioritizing classes of latency-sensitive traffic,” such as VoIP, gaming, and streaming media traffic.  (¶ 137)

Since the FCC has up until now had little role to play in the regulation of the Internet, it’s not surprising that the agency began this process with a highly outdated view of how the Internet “worked.”  So the NPRM, here and in eighty other sections, sought comment on the current state of the Internet ecosystem, the technologies of broadband access, network management principles in place, and the nature of the broadband access market throughout the U.S.—the latter a subject the agency took up again in the National Broadband Plan.

Not surprisingly, the FCC heard plenty.  The final report lists over 450 sources of comments and replies to the NPRM, many of which addressed themselves to educating the FCC on the technologies it had undertaken to regulate.

As a result of this formal (and no doubt a great deal of informal) feedback, the final rules added numerous additional exceptions, authorizing a wide range of ways a provider of broadband Internet access could act “inconsistently” with the neutrality principle but still not be thought to have violated them.

The new exceptions include:

  • Exemption from many of the rules for all providers of mobile broadband Internet access, including the “no unreasonable discrimination” rule and some of the “no blocking” rule.  (§ 8.5, 8.7)
  • Explicit exemption from the “no blocking” rule for app stores and other control mechanisms used by mobile broadband providers.  (¶ 102)
  • A change from a strict “nondiscrimination” rule for wireline providers to a rule prohibiting only “unreasonable discrimination.” (§ 8.7)  (See Part III for a discussion of the difference between those two formulations.)
  • A limited definition of “broadband Internet access service” that applies the rules only to providers of a “mass market retail service” providing “the capability to transmit data to and receive data from all or substantially all Internet endpoints.”  (§ 8.11(a))  That change leaves out a range of relatively new Internet devices and services—including the Amazon Kindle, game consoles, cars, TVs and refrigerators—that offer some form of web access incidental to their main purpose in connecting to the network.  (See ¶ 47)
  • A broader definition of “reasonable network management” that includes any practice that is “appropriate and tailored to achieving a legitimate network management purpose.”  (§ 8.11(d) and see ¶ 82)
  • Exemption for virtual private networks, which use much of the same infrastructure as the public Internet. (¶ 47)
  • Exemption for Content Delivery Networks and co-located servers that put particular content in closer proximity to important network nodes and therefore speed its transmission to requesting users. (see ¶ 47 and ¶ 76 note 235)
  • Exemption for multichannel video programming services (e.g., U-verse) that use TCP/IP protocols and existing Internet infrastructure.  (¶ 47)
  • Exemption for Internet backbone services.  (¶ 47)
  • Exemption for hosting or data storage services. (¶ 47)
  • Exemptions for “coffee shops, bookstores, airlines and other entities when they acquire Internet service from a broadband provider to enable their patrons to access the Internet from their establishments.” (¶ 52)
  • Exemption from the discrimination rule for “existing arrangements for network interconnection, including existing peering arrangements.”  (¶ 67 n. 209)
  • Exemption (for now) for “specialized services,” including multichannel video programming (see above) or facilities-based VoIP, that “share capacity with broadband Internet access services over providers’ last-mile facilities.”  (¶¶ 112-114)
  • A hedge on whether “paid priority” of some content, either of the access provider or a third party, would necessarily violate the “unreasonable discrimination” rule (¶ 76), and an explicit rejection of the argument that CDNs constitute illegal “pay for priority” though they have the same effect on consumer experience as prohibited prioritization schemes.  (¶ 77)
  • Recognition that end-users may elect to acquire Internet access that limits their choice of content, including services that support parental controls or which “allow end users to choose a service that provides access to the Internet but not to pornographic websites.”  (¶ 89).  Further, “[b]roadband providers are also free under this Order to offer a wide range of ‘edited’ services,” including a “service limited to ‘family friendly’ materials.”  (¶ 143, cf. ¶ 141)
  • Recognition that existing federal law allows all Internet Service Providers to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”   (¶ 89 n. 279)

Finding the Forest Amid the Exemptions

Of course these exceptions, particularly the measured approach to mobile broadband access and the provisional reprieve for specialized services, generated howls of indignation from advocacy groups hoping for pure neutrality, and led many of the Chairman’s initial supporters to abandon him over the course of the year the NPRM was publicly and privately debated.

My concern is quite different.  I think each of these exceptions makes good sense, and will keep the new rules, at least in the short-term, from causing life-threatening damage to the Internet ecosystem.

Rather, what the laundry list of exceptions demonstrates is that the majority just isn’t seeing the forest for the trees.  What the exceptions have in common is that each of them represents a change to the Internet’s architecture and service models that has emerged in the last decade and a half.  They are all new services, technologies, or service providers that, in these and other ways, violate the neutrality principle.

But these innovations have been developed for beneficial, not evil, purposes.  The network is better in every sense imaginable, and will continue to improve in speed, efficiency, and usability so long as future innovations don’t run afoul of the rules and their enforcement.  The Internet is not “open” in the way it may have been in 1995 (it was never as open as the idealists imagine).  But in order for the Internet we have today—faster, cheaper, better—to exist, each of these changes had to be made.

The genius of a virtual infrastructure is that it can absorb redesign without any interruption in service.  One unfortunate side-effect of that ease of transformation is that users don’t see the construction cones and highway workers.  Consumers—and the FCC—don’t realize that we’re now traveling on a multi-lane highway rather than the old dirt road.  The technology is utterly changed, and the rules of the road have changed with it.  For better or worse, but largely for the better.

The final rules, with all their exceptions, suggest a majority clinging to the idealized past, and a stubborn refusal in the end to admit that the Internet has changed and continues to change—that it needs to change.

The exceptions for  the “inconsistent” behavior of CDNs, specialized services, peering arrangements, e-readers and game consoles, and app stores have no logical rationale, other than that the FCC has now learned that they are part of the current status quo.  But they are being exempted because they are in place, and they work.

For example, paying a CDN to replicate your content and co-locate servers at key network access points is surely “paying for priority.”  It puts a start-up offering similar content but without the funds for similar services at a competitive disadvantage.  The cached content will arrive faster when requested by a consumer.  But for consumers, that feature is a good thing—an improvement—even though it is not “neutral.”
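
As a rough illustration of why cached content arrives faster, here is a minimal sketch, with hypothetical server names, latencies, and paths rather than any real CDN’s logic, of choosing the nearest edge server that already holds a copy of the requested object and falling back to the distant origin otherwise.

    # Hypothetical edge servers: one-way latency (ms) to the requesting user
    # and the objects each has cached. All names and numbers are invented.
    EDGES = [
        {"name": "edge-west", "latency_ms": 12, "cached": {"/video/launch.mp4"}},
        {"name": "edge-east", "latency_ms": 48, "cached": set()},
    ]
    ORIGIN = {"name": "origin", "latency_ms": 140}

    def pick_server(path):
        # Prefer the lowest-latency edge that already caches the object;
        # otherwise the request travels all the way back to the origin.
        candidates = [edge for edge in EDGES if path in edge["cached"]]
        return min(candidates, key=lambda e: e["latency_ms"]) if candidates else ORIGIN

    print(pick_server("/video/launch.mp4")["name"])         # edge-west
    print(pick_server("/blog/startup-post.html")["name"])   # origin

The well-funded site’s cached video takes the 12-millisecond path; the start-up’s uncached page takes the 140-millisecond trip to the origin. Nothing is blocked or degraded, yet the content of the party who paid for caching plainly arrives first.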

Likewise, the mobile Internet is given special treatment because it is “evolving rapidly.”  But the fixed Internet is evolving rapidly as well, as many of these exemptions implicitly recognize.

The majority is fixated on maintaining a neutral Internet even though it now understands that neutrality is a virtue more honored in the breach.  The final report uses the word “traditionally” 25 times, the word “historically” 9 times, and the word “typically” 21 times.  These are the only justifications for the exceptions, and they undermine the purpose of the rules that remain.  There is no neutral Internet to preserve.  There’s only one that works.

The reality is that we’re moving away from websites and toward the mobile, app-based economy, specialized services, and high-bandwidth applications such as video, which shouldn’t all be treated the same.  A “level playing field” doesn’t mean everyone gets a trophy.

The good news is that the final rules grandfather in many existing technologies that violate the neutrality principle.  That’s essential, even if each of the exceptions is granted in isolation and begrudgingly at that.  But the open Internet regulations could have the effect of freezing that status quo, for no good (and lots of bad) reasons.

Nostalgia can be fun.  I enjoy sitting around with my fellow veterans of the pre-bubble dot com boom talking about the good old days, toasting to our irrational exuberance. But translating that wistfulness into federal law, even as here with rules pockmarked by the blemishes of a reality that looks far different than our idealized view of the past, is a dangerous way to celebrate it.

Next:  Not to worry.  The FCC has no authority, either.

"The Next Digital Decade" book and event – January 19, 2011

I have two essays, one on privacy and the other on information economics, appearing in a new book called “The Next Digital Decade.”

The book, edited by Berin Szoka and Adam Marcus, is a unique collaboration of 26 thought leaders on Internet law, philosophy, policy and economics. The book is available as a free PDF, and the hardcover can be purchased online. A free eBook will be available soon.

A free book launch symposium will be held in Washington, DC on the afternoon of January 19, immediately following the State of the Net Conference. Space is limited, so RSVP here.

Spectrum Crisis Amnesia: What Happened in Vegas Stayed in Vegas, Unfortunately

For CNET, I posted a long piece describing a full day at CES’s Tech Policy Summit largely devoted to spectrum issues. Conference attendees in several packed sessions heard from FCC Chairman Julius Genachowski and three of the four other FCC Commissioners (Commissioner Copps was absent due to illness), as well as former Congressman Rick Boucher and industry representatives.

The message was as clear as it is worrisome. The tremendous popularity of wireless broadband, on view in a remarkable range of new devices and gizmos on display at the Vegas Convention Center, is rapidly outpacing the radio frequencies available to handle the data.

The mobile Internet needs more spectrum, and there isn’t any to give it. The app revolution is in danger of hitting a hard stop, perhaps as soon as 2015.

As the exclusive manager of America’s radio waves, only the FCC can reallocate spectrum. And the good news is that the agency recognizes the crisis as well as its role in solving it. Chairman Genachowski told the audience that spectrum reform will be the agency’s top priority for 2011.

Reading the Chairman’s prepared comments, however, I was struck by the sense that I’d heard something similar before. Perhaps in the very same room. Perhaps by the very same speaker.

Consensus: The Net Neutrality Fight is Just Getting Started

I published an article for CNET late last night on a spirited debate at CES yesterday over the FCC’s recently-enacted “open Internet” rules, aka net neutrality.  Panelists from the FCC, Congress, AT&T, Verizon, Google and the Center for Democracy and Technology actually agreed on one point, which is that the neutrality saga has only completed its first chapter.

(The session was the most popular of the day.  Several people were turned away from the packed room, and former Congressman Rick Boucher and FCC Commissioner Mignon Clyburn almost didn’t get in!)

While some panelists believe the next step is more regulation, others promised Congressional and perhaps court challenges aimed at undoing the Commission’s “Christmas Surprise.”  As I note in the piece, the new Congress, with its Republican majority in the House, has already taken up reversing the rulemaking as a priority.  Rep. Marsha Blackburn has introduced legislation, signed by 60 other members including at least one Democrat, that would make clear the FCC’s lack of authority over broadband access.

And Neil Fried, senior counsel to the House Energy and Commerce Committee, promised the overfull audience that the Committee would take up the FCC’s “overreaching” as its first tech agenda item.

At the Tech Policy Summit at last year’s CES, the neutrality panel featured current and former White House staffers Susan Crawford and Andrew McLaughlin, as well as more outspoken neutrality advocates from public interest groups.

Yesterday’s panel, by contrast, had industry representatives from Verizon, AT&T and Google, along with David Sohn of CDT, whose rhetoric was far less fiery than that of his counterparts last year.

So it seems the net neutrality fight is still on, and drawing even bigger audiences.  But at least at CES the White House and the most vocal public interest groups have both gone quiet, at least for now.

Today’s sessions include an interview with Chairman Julius Genachowski and a panel featuring the other FCC Commissioners.  Stay tuned for more news.

“Fake Neutrality” or Government Takeover?: Reading the FCC’s Net Neutrality Report (Part III)

In Part I of this analysis of the FCC’s Report and Order on “Preserving the Open Internet,” I reviewed the Commission’s justification for regulating broadband providers.   In Part II, I looked at the likely costs of the order, in particular the hidden costs of enforcement.  In this part, I compare the text of the final rules with earlier versions.  Next, I’ll look at some of the exceptions and caveats to the rules—and what they say about the true purpose of the regulations.

In the end, the FCC voted to approve three new rules that apply to broadband Internet providers.  The first (§8.3) requires broadband access providers to disclose their network management practices to consumers.  The second (§8.5) prohibits blocking of lawful content, applications, services, and non-harmful devices.  The third (§8.7) forbids fixed broadband providers (cable and telephone, e.g.) from “unreasonable” discrimination in transmitting lawful network traffic to a consumer.

There has of course been a great deal of commentary and criticism of the final rules, much of it reaching fevered pitch before the text was even made public.  At one extreme, advocates for stronger rules have rejected the new rules as meaningless, as “fake net neutrality,” “non neutrality,” or the latest evidence that the FCC has been captured by the industries it regulates.  On the other end, critics decry the new rules as a government takeover of the Internet, censorship, and a dangerous and unnecessary interference with a healthy digital economy.  (I agree with that last one.)

One thing that has not been seriously discussed, however, is just how little the final text differs from the rules originally proposed by the FCC in October, 2009.  Indeed, many of those critical of the weakness of the final rules seem to forget their enthusiasm for the initial draft, which in key respects has not changed at all in the intervening year of comments, conferences, hearings, and litigation.

The differences—significant and trivial—that have been made can largely be traced to comments the FCC received on the original draft, as well as interim proposals made by industry and Congress, particularly the framework offered by Verizon and Google in August and a bill circulated by Rep. Henry Waxman just before the mid-term elections.

1.      Transparency

Compare, for example, the final text of the transparency rule with the version first proposed by the FCC.

Subject to reasonable network management, a provider of broadband Internet access service must disclose such information as is reasonably required for users and content, application and service providers to enjoy the protections specified in this part. (Proposed)

A person engaged in the provision of broadband Internet access service shall publicly disclose accurate information regarding the network management practices, performance and commercial terms of its broadband Internet access service sufficient for consumers to make informed choices regarding use of such services and for content, application, service and device providers to develop, market and maintain Internet offerings. (Final)

The final rule is much stronger, and makes clearer what it is that must be disclosed.  It is also not subject to the limits of reasonable network management.  Rather than the vague requirement of the draft for disclosures sufficient to “enjoy the protections” of the open Internet rules, the final rule requires disclosures sufficient for consumers to make “informed choices” about the services they pay for, a standard more easily enforced.

By comparison, the final rule comes close to the version that appeared in draft legislation circulated but never introduced by Rep. Henry Waxman in October of 2010. It likewise reflects the key concepts in the Verizon-Google Legislative Framework Proposal from earlier in the year.

As the Report makes clear (¶¶ 53-61), the transparency rule has teeth.  Though the agency declines for now to make specific decisions about the contents of the disclosure and how it must be posted, the Report lays out a non-exhaustive list of nine major categories of disclosure, including network practices, performance characteristics, and commercial terms, that must be included.  It’s hard to imagine a complying disclosure that will not run to several pages of very small text.

That generosity, of course, may be the rule’s undoing.  As anyone who has ever thrown away a required disclosure from a service provider (mortgage, bank, drug, electronic device, financial statement, privacy, etc.) knows full well, information “sufficient” to make an informed choice is far more information than any non-expert consumer could possibly absorb and evaluate, even if they wanted to.   The more information consumers are given, the less likely they’ll pay attention to any of it, including what may be important.

The FCC recognizes that risk, but believes it has an answer.  “A key purpose of the transparency rule,” the Commission notes (¶ 60), “is to enable third-party experts such as independent engineers and consumer watchdogs to monitor and evaluate network management practices, in order to surface concerns regarding potential open Internet violations.”

Perhaps the agency has in mind here organizations like BITAG, which has been established by a wide coalition of participants in the Internet ecosystem to develop “consensus on broadband network management practices or other related technical issues.”  As for consumer watchdogs, perhaps the agency imagines that some of the public interest groups that have most strenuously rallied for the rules will become responsible stewards of their implementation, trading the acid pens of political rhetoric for responsible analysis and advocacy to their members and other consumers.

We’ll see.  I wish I shared the Commission’s confidence that, “for a number of reasons” (none cited), “the costs of the disclosure rule we adopt today are outweighed by the benefits of empowering end users and edge providers to make informed choices….”  (¶ 59).  But I don’t. Onward.

2.       Blocking

The final version of the blocking rule (§8.5) consolidated the separate content, applications and services, and devices rules of the original draft.  The final rule states:

A person engaged in the provision of fixed broadband Internet access services, insofar as such person is so engaged, shall not block lawful content, applications, services or non-harmful devices, subject to reasonable network management.

A more limited rule applies to mobile broadband providers, who

[S]hall not block consumers from accessing lawful websites, subject to reasonable network management, nor shall such person block applications that compete with the providers’ voice or video telephony services, subject to reasonable network management

Much of the anguish over the final rules that has been published so far relates to a few of the limitations built into the blocking rule.  First, copyright-reform activists object to the word “lawful” appearing in the rule.  “Lawful” content, applications, and services do not include activities that constitute copyright and trademark infringement.  Therefore, the rule allows broadband providers to use whatever mechanisms they want (or may be required to) to reduce or eliminate traffic that involves illegal file-sharing, spam, viruses and other malware, and the like.

A provider who blocks access to a site selling unlicensed products, in other words, is not violating the rules.  And as the agency finds it is “generally preferable to neither require nor encourage broadband providers to examine Internet traffic in order to discern which traffic is subject to the rules” (¶ 48), there will be a considerable margin of error given to providers who block sites, services, or applications that may include some legal components.

On this view, though the FCC otherwise contradicts it—see footnote 245 and elsewhere—a complete ban on the BitTorrent protocol, for better or worse, might not be a violation of the blocking rule.  Academic studies have shown that over 99% of BitTorrent traffic constitutes unlicensed file sharing of protected content.  Other than inspecting individual torrents, which the agency disfavors, how else can an access provider determine what tiny minority of BitTorrent traffic is in fact lawful?

A second concern is the repeated caveat for “reasonable network management,” which gives access providers leeway to balance traffic during peak times, limit users whose activity may be harming other users, and other “legitimate network management” purposes.

Finally, disappointed advocates object to the special treatment for mobile broadband, which may, for example, block applications, services or devices without violating the rule.  There is an exception to the exception for applications, such as VoIP and web video, that compete with the provider’s own offerings, but that special treatment doesn’t keep mobile providers from using “app stores” to exclude services they don’t approve.  (See ¶ 102)

Of course even the original draft of the rules included the limitation for “reasonable network management,” and refused to apply any of the rules to unlawful activities.  The definition of “reasonable network management” in the original draft is different from, but functionally equivalent to, the final version.

The carve-out for mobile broadband, however, is indeed a departure from the original rules.  Though the Oct. 2009 Notice of Proposed Rulemaking expressed concern about applying the same rule to fixed and mobile broadband (see ¶¶ 13, 154-174), the draft blocking rule did not distinguish between fixed and mobile Internet access.  The FCC did note, however, that different technologies “may require differences in how, to what extent, and when the principles apply.”  The agency sought comment on these differences (and asked for further comment in a later Notice of Inquiry).  Needless to say, it heard plenty.

Wireless broadband is, of course, a newer technology, and one still very much in development.  Spectrum is limited, and capacity cannot easily be added.  Those are not so much market failures as they are regulatory failures.  The FCC is itself responsible for managing the limited radio spectrum, and has struggled by its own admission to allocate spectrum for its most efficient and productive uses—indeed, even to develop a complete inventory of who has which frequencies of licensed spectrum today.

Adding capacity runs into another regulatory obstacle.  Though mobile users rail against their providers for inadequate or unreliable coverage, no one, it seems, wants to have cellular towers and other equipment near where they live.  Local regulators, who must approve new infrastructure investments, take such concerns very much to heart.  (There is also rampant corruption and waste in the application, franchising, and oversight processes at the state and local levels, a not-very-secret secret.)

The FCC, it seems, has taken these concerns into account in the final rule.  Its original open Internet policy statements—from which the rules derive—applied only to fixed broadband access, and the October, 2009 draft’s inclusion of mobile broadband came as a surprise to many.

The first indication that the agency was considering a return to the original open Internet policy came with the Verizon-Google proposal, where the former net neutrality adversaries jointly released a legislative framework (that is, something they hoped Congress, not the FCC, would take seriously) that gave different treatment to mobile.  As the V-G proposal noted, “Because of the unique technical and operational characteristics of wireless networks, and the competitive and still-developing nature of wireless broadband services, only the transparency principle would apply to wireless at this time.”

The Waxman proposal didn’t go as far as V-G, however, instead adding a provision that closely tracks the final rule.  Under the Waxman bill, mobile providers would have been prohibited from blocking “lawful Internet websites” and applications “that compete with the providers’ voice or video communications services.”

So the trajectory of the specialized treatment for mobile broadband is at least clear and, for those following the drama, entirely predictable.  Yet the strongest objections to the final rule and the loudest cries of betrayal from neutrality advocates came from the decision to burden mobile providers less than their fixed counterparts.  (Many providers offer both, of course, so will be subject to different rules for different parts of their service.)

At the very least, the advocates should have seen it coming.  Many did.  A number of “advocacy” groups demonized Google for its cooperation with Verizon, and refused to support Waxman’s bill.  (It should also be noted that none of the groups objecting to the final rules or any interim version ever actually proposed their own version—that is, what they actually wanted as opposed to what they didn’t want.)

3.      Unreasonable discrimination

The final rule, applicable only to fixed broadband providers, demands that a provider not “unreasonably discriminate in transmitting lawful network traffic over a consumer’s broadband Internet access service.”  (§ 8.7, and see ¶¶ 68-79 of the Report).

Though subtle, the differences in language between the NPRM and the final rule are significant, as the FCC acknowledges.  The NPRM draft rule stated plainly that “a provider of broadband Internet access service must treat lawful content, applications, and services in a nondiscriminatory manner.”

The difference here is between “nondiscrimination,” which prohibits all forms of differential network treatment, and “unreasonable discrimination,” which allows discrimination so long as it is reasonable.

The migration from a strict nondiscrimination rule (subject, however, to reasonable network management) to a rule against “unreasonable” discrimination can be seen in the interim documents.  The Verizon-Google proposal, which called for a “Non-Discrimination Requirement,” nonetheless worded the requirement to ban only “undue discrimination against lawful Internet content, application, or service in a manner that causes meaningful harm to competition or to users.” (emphasis added)

Rep. Waxman’s draft bill, likewise, would have applied a somewhat different standard for wireline providers, who “shall not unjustly or unreasonably discriminate in transmitting lawful traffic over a consumer’s wireline broadband Internet access service,” also subject to reasonable network management.

Over time, the FCC recognized the error of its original draft and now agrees “with the diverse group of commenters who argue that any nondiscrimination rule should prohibit only unreasonable discrimination.” (¶ 77)

As between the suggested limiting terms “undue,” “unjust” and “unreasonable,” the FCC chose the last for the final rule.  Though many have complained that “unreasonable” is a nebulous, subjective term, it should be noted that of the three it is the only one with an understood (if not entirely clear) legal meaning, particularly in the context of the FCC’s long history of rulemaking and adjudication.

The earliest railroad regulations, for example, which also laid the foundation for the FCC’s eventual creation and its authority over communications industries, required reasonable rates of carriage, and empowered the Interstate Commerce Commission to intervene and eventually set the rates itself, much as the FCC later did with telephony.

One lesson of the railroad and telephone histories, however, is the danger of turning over to regulators decisions about what behaviors are reasonable. (Briefly, regulatory capture often ends up leaving the industry unable to respond to new forms of competition from disruptive technologies, with disastrous consequences.)

The V-G proposal gets to the heart of the problem in the text I italicized.  Despite the negative connotations of the word in common use, “discrimination” isn’t inherently bad. As the Report makes clear, in managing Internet access and network traffic, there are many forms of discrimination—which means, after all, affording different treatment to different things—that are entirely beneficial to overall network behavior and to the consumer’s experience with the Internet.

The draft rule, as the FCC now admits (see ¶ 77 of the Report), was dangerously rigid.  If any behavior should be regulated, it is the kind of discrimination whose principal purpose is to harm competition or users—though that kind of behavior is already illegal under various antitrust laws.

For one thing, users may want some kinds of traffic (e.g., voice and video) to receive higher priority than text and graphics, which do not suffer from latency problems.  Companies operating Virtual Private Networks for their employees may likewise want to limit Web access to selected sites and activities for workers while on the job.

A strict nondiscrimination rule would have also discouraged or perhaps banned tiered pricing, harming consumers who do not need the fastest speeds and the highest volume of downloads to accomplish what they want to do online.  (Without tiered pricing, such consumers effectively subsidize power users who, not surprisingly, are the most vociferous objectors to tiered pricing.)
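
A back-of-the-envelope calculation, using invented numbers purely for illustration, shows the cross-subsidy:

    # Hypothetical figures: a light user and a heavy user share one
    # flat-rate plan. All numbers are invented for illustration.
    COST_PER_GB = 0.50            # provider's assumed cost, dollars per GB
    usage_gb = {"light user": 10, "heavy user": 290}

    total_cost = COST_PER_GB * sum(usage_gb.values())
    flat_price = total_cost / len(usage_gb)   # everyone pays the same

    for user, gb in usage_gb.items():
        print(f"{user}: causes ${COST_PER_GB * gb:.2f} of cost, pays ${flat_price:.2f}")
    # light user: causes $5.00 of cost, pays $75.00
    # heavy user: causes $145.00 of cost, pays $75.00

Under the flat rate, the light user pays fifteen times the cost she imposes; tiered pricing lets her pay something closer to it.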

Discrimination may also be necessary to manage congestion during peak usage periods or when failing nodes put pressure on the backbone.  Discrimination against spam, viruses and other malware, much of which is not “lawful,” is also permitted and indeed encouraged.  (See ¶ 90-92.)

By comparison, the Report notes (¶ 75) three types of provider discrimination that are of particular concern.  These are: discrimination that harms competitors (e.g., VoIP providers of over-the-top telephone service, such as Skype or Vonage, that compete with the provider’s own telephone service); “inhibiting” end users from accessing content, services, and applications of their choice (but see the no-blocking rule, above, which already covers this); and discrimination that “impairs free expression,” including slowing or blocking access to a blog whose message the broadband provider does not approve.

On that last point, however, it’s important to note that Congress has already given broadband providers (and others) broad freedom to filter and otherwise curate content they do not approve of or which they believe their customers don’t want to see.  Under Section 230 of the Communications Decency Act,

“No provider or user of an interactive computer service shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

The goal of Section 230 was to immunize early Internet providers, including CompuServe and Prodigy, from liability for exercising editorial control over message boards whose content was provided by customers themselves.  But it gives providers broad discretion in determining what kind of content they believe their customers don’t want to see.  So long as the filtering is undertaken in “good faith” (e.g., not with the intent of harming a competitor), there is no liability for the provider, who does not, for example, become a “publisher” for purposes of defamation law.

The FCC (¶ 89) acknowledges the limit that Section 230 puts on the discrimination rule.

On the harm-to-competitors prong, the FCC waffles (see ¶ 76) on whether “pay for priority,” the bugaboo that launched the neutrality offensive in the first place, actually constitutes a violation of the rules.  While a broadband provider’s offering to prioritize the traffic of a particular source for a premium fee “would raise significant cause for concern,” the agency acknowledges that such behavior has occurred and thrived for years in the form of third party Content Delivery Networks.  (See footnote 236)  CDNs are allowed.  (More on CDNs in the next post.)

So in the end the discrimination rule doesn’t appear to add much to the blocking rule or existing antitrust laws.  Discrimination against competing over-the-top providers would violate antitrust.  Blocking or slowing access to disfavored content is already subject to the blocking rule.  And interference with the “free expression” of users is, to a significant extent, already permitted by Section 230.

What’s left?   “The rule rests on the general proposition,” the agency concludes (¶ 78), “that broadband providers should not pick winners and losers on the Internet,” even when doing so is independent of competitive interests.  What exactly this means—and how “reasonable” discrimination will be judged in the course of enforcing the rules—remains to be seen.

Next:  The exceptions and what they say about the real purpose of the rules