Category Archives: Infrastructure

FCC Votes for Reclassification, Dog Bites Man

Not surprisingly, FCC Commissioners voted 3 to 2 today to open a Notice of Inquiry on changing the classification of broadband Internet access from an “information service” under Title I of the Communications Act to “telecommunications” under Title II.  (Title II was written for telephone service, and most of its provisions pre-date the breakup of the former AT&T monopoly.)  The story has been widely reported, including coverage from The Washington Post, CNET, Computerworld, and The Hill.

As CNET’s Marguerite Reardon counts it, at least 282 members of Congress have already asked the FCC not to proceed with this strategy, including 74 Democrats.

I have written extensively about why a Title II regime is a very bad idea, even before the FCC began hinting it would make this attempt.  I’ve argued that the move is on extremely shaky legal grounds, usurps the authority of Congress in ways that challenge fundamental Constitutional principles of agency law, would cause serious harm to the Internet’s vibrant ecosystem, and would undermine the Commission’s worthy goals in implementing the National Broadband Plan.  No need to repeat any of these arguments here.  Reclassification is wrong on the facts, and wrong on the law.

What is Net Neutrality?

Instead, I thought it would be useful to return to the original problem, which is last fall’s Notice of Proposed Rulemaking on net neutrality.  For despite a smokescreen argument that reclassification is necessary to implement the NBP, everyone knows that today’s NOI was motivated by the Commission’s crushing defeat in Comcast v. FCC, which held that “ancillary authority” associated with Title I did not give the agency jurisdiction to enforce its existing net neutrality policy.

Rather than request an en banc rehearing of Comcast, or appeal the case, or follow the court’s advice and return to Congress for the authority to enforce the net neutrality rules, the FCC has chosen in the name of expediency simply to rewrite the Communications Act itself.

Many metaphors have been applied to this odd decision.  I liken it to setting your house on fire to light your cigarette.  (You shouldn’t be smoking in the first place.)

Let me be clear, once again, that I am all for an open and transparent Internet.  I believe the packet-switching architecture is one of the key reasons TCP/IP has become the dominant data communications protocol (and will soon dominate voice and video).

Packet-switching isn’t the only reason the Internet has triumphed.  Perhaps the other, more important secrets to TCP/IP’s success are that it is a non-proprietary standard (so long, SNA, DECnet, and OSI, and the corporate strategies their respective owners tried to pursue through them) and that it is simple enough to be baked into even the least powerful computing devices.  The Internet doesn’t care if you are an RFID tag or a supercomputer.  If you speak the language, you can participate in the network.
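That simplicity is worth pausing on: the same few lines of standard socket code work whether the endpoint is a sensor or a server.  Here is a minimal sketch in Python to make the point; the host, port, and request are placeholder assumptions, not anything from a particular deployment.

```python
# A minimal sketch of "if you speak the language, you can participate":
# the same standard TCP/IP calls work on any device, large or small.
# Host, port, and request below are hypothetical placeholders.
import socket

def say_hello(host="example.com", port=80):
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        # Any TCP/IP endpoint can answer; the network never asks
        # what kind of device made the request.
        return conn.recv(256)

# print(say_hello())  # uncomment to try it against a real host
```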

These features have made the Internet, as I first argued in 1998 in “Unleashing the Killer App,” an engine for remarkable innovation over the last ten years.

The question for me, as I wrote in Chapter 4 of “The Laws of Disruption,” comes down most importantly to one of institutional economics.  Who is best suited, legal authority aside, to enforce the features of the Internet’s architecture and protocols that make it work so well?  The market?  Industry self-regulation?  A global NGO?  The FCC?  Or put another way, why is a federal government agency (limited, by definition, to enforcing its authority only within the U.S.) such a poor choice for the job, despite the best intentions of its leadership and the obviously strong work ethic of its staff?

To answer that, let’s back all the way up.  Net neutrality is a political concept overlaid on a technical and business architecture.  That’s what makes this debate both dangerous and frustrating.

For starters, it’s hard to come up with a concise definition of net neutrality, largely because it’s one of those terms like “family values” that means something different to everyone who uses it.  For me it’s become something of a litmus test—people who use it positively are generally hostile to large communications companies.  People who use it negatively are generally hostile to regulatory agencies.  A lot of that anger, wherever it comes from, seems to get channeled into net neutrality.

In fact the FCC doesn’t even use the term—they talk about the “open and transparent” Internet instead.

But here’s the general idea.  The defining feature of the Internet is that information is broken up into small “packets” of data that are routed through any number of computers on the worldwide network and then reassembled when they reach their destination.

Up until now, with some notable exceptions, every participating computer relays those packets without knowing what’s in them or whom they come from.  The network operates on a packet-neutral model—when a computer receives a packet, it looks only to see where the packet is heading and sends it along, as quickly as traffic congestion at the time allows, to some other computer along the way.
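To make the “packet-neutral” idea concrete, here is a toy sketch of my own (not anything an ISP or the FCC publishes): the relay consults only the destination, never the payload or the sender.  The routing-table entries are made-up placeholders.

```python
# Toy illustration of packet-neutral forwarding: the relay looks only at the
# destination, never at the payload or the sender, and forwards as fast as it can.
# (A hypothetical sketch for exposition; real routers work on IP headers in hardware.)
from collections import namedtuple

Packet = namedtuple("Packet", ["source", "destination", "payload"])

ROUTING_TABLE = {            # hypothetical next hops
    "google.com": "hop-A",
    "youtube.com": "hop-B",
}

def forward(packet):
    # A neutral relay never inspects packet.payload or packet.source.
    return ROUTING_TABLE.get(packet.destination, "default-gateway")

print(forward(Packet("my-house", "google.com", b"...")))   # -> hop-A
```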

That’s still the model on which the Internet works.  The FCC’s concern is not with current practice, but with future problems.  Increasingly, they see a few dominant providers controlling the outgoing and incoming packets to and from consumers—the first and last mile.  So while the computers between my house and Google headquarters all treat my packets to Google and Google’s packets back to me in a neutral fashion, there’s no law that keeps Comcast (my provider) from opening those packets on their way in or on their way out and deciding to slow or speed up some or all of them.

(Well, the law of antitrust and unfair trade could in fact apply here, depending on how the non-neutral behavior was expressed and by whom.  See below.)

Why would they do that?  Perhaps they make a deal with Google to give priority to Google-related packets in exchange for a fee or a share of Google’s ad revenues.  Or, maybe they want to encourage me to watch Comcast programming instead of YouTube videos, and intentionally slow down YouTube packets to make those videos less appealing to watch.
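For contrast with the neutral relay sketched above, here is what a non-neutral policy would look like in the same toy terms: the relay peeks at where a packet comes from and deliberately delays disfavored sources.  Again, a hypothetical illustration of the concept; the delays and origin names are assumptions, not any provider’s actual behavior.

```python
# Toy contrast to the neutral relay: a non-neutral relay that looks at the
# packet's source and adds artificial delay for disfavored origins.
import time
from collections import namedtuple

Packet = namedtuple("Packet", ["source", "destination", "payload"])

THROTTLE_SECONDS = {"youtube.com": 0.5}   # made-up policy: slow these sources down
ROUTING_TABLE = {"my-house": "hop-C"}     # hypothetical next hops

def forward_non_neutral(packet):
    delay = THROTTLE_SECONDS.get(packet.source, 0.0)
    if delay:
        time.sleep(delay)                 # the "slow lane": deliberate, source-based delay
    return ROUTING_TABLE.get(packet.destination, "default-gateway")

print(forward_non_neutral(Packet("youtube.com", "my-house", b"video bytes")))
```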

Most of this is theoretical so far.  No ISP yet offers premium or “fast lane” service to individual applications.  Comcast, however, was caught a few years ago experimenting with slowing down the BitTorrent peer-to-peer protocol.  Some of Comcast’s most active customers were clogging the pipes sending and receiving very large files (mostly illegal copies of movies, it turns out).

When it was caught, the company agreed instead to stop offering “unlimited” access and to use more sophisticated network management techniques to ensure a few customers didn’t slow traffic for everyone else.  Comcast and BitTorrent made peace, but the FCC held hearings and sanctioned Comcast after the fact, leading to the court case that made clear the FCC has no authority to enforce its neutrality policies.

The simple-minded dichotomy of the ensuing “debate” leaves out some important and complicated technical details.  First, some applications already require and get “premium” treatment for their packets.  Voice and video packets have to arrive promptly and at a steady rate to maintain good quality, so Voice over IP telephone calls (Skype, Vonage, Comcast) get priority treatment, as do cable programming packets, which, after all, use the same connection to your home that the data does.
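One way to picture that priority treatment is a simple priority queue in which latency-sensitive classes are dequeued first.  The sketch below is purely illustrative; the traffic classes and priority numbers are my assumptions, not any provider’s actual scheduling scheme.

```python
# A minimal sketch of why latency-sensitive traffic already gets "premium"
# treatment: a priority queue that sends voice/video packets before bulk data.
import heapq
import itertools

PRIORITY = {"voice": 0, "video": 1, "data": 2}   # lower number = sent first
counter = itertools.count()                       # preserves FIFO order within a class
queue = []

def enqueue(traffic_class, packet):
    heapq.heappush(queue, (PRIORITY[traffic_class], next(counter), packet))

def dequeue():
    _, _, packet = heapq.heappop(queue)
    return packet

enqueue("data", "big file chunk")
enqueue("voice", "VoIP frame")
print(dequeue())   # -> "VoIP frame": the call goes out ahead of the bulk transfer
```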

Google, as one of the largest providers of outbound packets, has deals with some ISPs to locate Google-only servers in their hubs to ensure local copies of its web pages are always close by.  That service is offered more generally by companies such as Akamai, which caches copies of the most frequently used sites to speed things up for everyone.  In that sense, technology is being used to give priority even to data packets, about which no one should complain.
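The caching idea is easy to see in miniature: serve a request from a nearby copy if one exists, and only go back to the origin when it doesn’t.  This is a bare-bones sketch with hypothetical URLs and a stand-in fetch function, not Akamai’s or Google’s actual code.

```python
# A simplified sketch of edge caching: keep local copies of frequently
# requested pages so they can be served from nearby instead of the origin.
cache = {}

def fetch_from_origin(url):
    # Placeholder for a slow, long-haul request to the origin server.
    return f"<contents of {url}>"

def get(url):
    if url not in cache:                  # cache miss: go all the way to the origin
        cache[url] = fetch_from_origin(url)
    return cache[url]                     # cache hit: served from the local copy

get("http://example.com/popular-page")    # first request fills the cache
get("http://example.com/popular-page")    # second request is served locally
```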

Fighting over the Future

So the net neutrality fight, aside from leaving out any real appreciation for either technological or business realities, is really a fight about the future.  As cable and telephone companies invest billions in the next generation of technology, including fiber optics and next-generation cellular services, application providers fear they will be asked to shoulder more of the costs of that investment through premium service fees.

Consumer groups have been co-opted into this fight, and see it as one that pits big corporations against powerless customers who need outside advocates to save them from dangers they do not understand.  That increasingly quaint attitude, for one thing, grossly underestimates the growing power of consumers to effect change using the Internet itself (see:  Facebook et al.). Consumers can save themselves, thanks very much.

What is true is that consumers do not pay, and aren’t likely to be asked to pay, the true costs of broadband access, given the intense competition in major markets between large ISPs such as Comcast, AT&T, Verizon and others.  That is the source of anxiety for the application providers: they are seen as having more elasticity in pricing than end-users.

The existence of provider competition, however, also weighs heavily against the need for government intervention.  If an ISP interferes with the open and transparent Internet, customers will know and they will complain.  Ultimately they will find a provider that gives them full and unfettered access.  (There are plenty of interested parties who help consumers with the “know” part of that equation, but still, I fully support the principle of ISP transparency with regard to network management practices.  Few consumers would actually read those disclosures, and fewer still would understand them, but it’s still a good practice.)

If the market really does fail, or fails in significant local ways (rural or poor customers, for example), then some kind of regulatory intervention might make sense.  But it’s a bad idea to regulate ahead of a market failure, especially when dealing with technology that is evolving rapidly.  In the last ten years, as I argue in The Laws of Disruption, the Internet has proven to be a source of tremendous embarrassment for regulators trying to “fix” problems that shift under their feet even as they’re legislating.  Often the laws are meaningless by the time the ink is dry or, worse, end up aggravating the very problems they were meant to solve.

Nevertheless, in October of last year the FCC proposed—in a 107-page document—six net neutrality rules that would codify what I described above and a number of peripheral, perhaps unrelated, ideas.  Right now the agency has only a net neutrality policy, and that policy, the D.C. Circuit Court of Appeals ruled, doesn’t constitute enforceable law.  Implicit in that rulemaking was the assumption that someone needed to codify these principles, that the FCC was that someone, and that the agency had the authority from Congress to be that someone.  (The court’s ruling made clear that the last of these is not the case.)

There are good reasons to be skeptical that the FCC in particular is the right agency to solve this problem even if it is a problem.  Through most of its existence the agency has been focused on regulating a legal monopoly—the old phone company—and on managing what was a very limited broadcast spectrum, now largely supplanted by cable and by more sophisticated spectrum management technologies.

The FCC, recall, is the agency that watches broadcast (but not cable) television and issues fines for indecent content—an activity it pursues more, rather than less, even as broadcast becomes a trivial part of how programming is received.  Congress has three times tried to give the FCC authority to regulate indecency on the Internet as well, but the U.S. Supreme Court has blocked all three attempts.

So if the FCC were to be the “smart cop on the beat” as Chairman Genachowski characterized his view of net neutrality, how would the agency’s temptation to shape content itself be curbed?

Worse, no one seems to have thought ahead as to how the FCC would enforce these rules.  If I complain that my access is slow today and I believe that must mean my ISP is acting in a non-neutral fashion, the agency would have to look at the traffic and inside the packets in order to investigate my complaint.  Again, the temptation to use that information and to share it with law enforcement in the name of anti-terrorism or other popular goals would be strong—strong enough that it ought to worry some of the groups advocating for net neutrality laws as a placebo to keep the ISPs in line.

The Investment Effect

It should be obvious that the course being followed by the FCC (the enactment of net neutrality rules in the first place, and the increasingly desperate methods by which it hopes to establish its authority to do so) will cast a pall over the very investments in infrastructure the FCC is counting on to achieve the worthy goals of the NBP.  If nothing else, the reclassification NOI will inevitably end in some heavy-duty litigation, which is likely to take years to resolve.  Courts move even more slowly than legislators, who move more slowly than regulators, all of whom are standing still compared to the speed of technological innovation.

How serious a drag on the markets will regulatory uncertainty prove to be?  For what it’s worth, New York Law School’s Advanced Communications Law & Policy Institute today issued an economic analysis of the Commission’s proposed net neutrality rules, arguing that their passage could cost as many as 604,000 jobs and $80 billion in GDP.  Matthew Lasar at Ars Technica summarizes the report, which I have not yet read.

But one doesn’t need sophisticated economic analysis to understand why markets are already reacting poorly to the FCC’s sleight-of-hand.  The net neutrality rules the FCC proposed in October would, depending on how the agency decided to enforce them, greatly limit the future business arrangements that broadband providers could offer to their business customers.

Application providers worry that the offer of “fast lane” service invariably means everything else will become noticeably slower (not necessarily true from a technical standpoint).  But in any case, limiting providers’ future business innovations is bound to discourage, at least to some extent, up-front investments in broadband, which are characterized by high fixed costs and long payback periods.

Worse, the proposed rules would also apply to Internet access over cellular networks, which is still in a very early stage of development and has much more limited capacity.  Cellular providers have to limit access to video and other high-bandwidth applications just to keep the networks up and running.  (Some of those limits are the result of resistance from local regulators to allowing investments in cell towers and other infrastructure.)  The proposed rules would require them not to discriminate against any applications, no matter how resource-intensive.  That simply won’t work.

Investors are worried that the hundreds of billions they’ve spent so far on fiber optic, cellular, and cable upgrades, along with the amount left to be spent to get the U.S. to 100 Mbps speeds in the next ten years, will be hard to recover if they don’t have the flexibility to develop new business models and services.

To Wall Street, the net neutrality rules are perceived not as enshrining a level playing field for the Internet so much as a land grab by content providers to ensure they are the only ones who can innovate with a free hand, pushing the access providers increasingly toward a commodity business, as long distance telephony, for example, has become.  Why should investors spend hundreds of billions to upgrade the networks if they won’t be able to make their money back?

Investors are also concerned more generally that the FCC will implement and enforce the proposed neutrality rules in unpredictable ways, bowing to lobbying pressure by the content companies even more in the future.  Up until now, the FCC has played no meaningful role in regulating access or content, and the Internet has worked brilliantly.  The networks the FCC does regulate–local telephone, broadcast TV–are increasingly unprofitable.

How would the FCC proceed if the rules are enacted and upheld?  The NPRM says only that the Commission would investigate charges of non-neutral behavior “on a case-by-case basis.”  That approach is understandable when technology is changing rapidly, but at the same time it introduces even more uncertainty and more opportunities for regulatory mischief.  Better to wait until an identifiable problem arises, one that has an identifiable solution a regulatory agency can implement more efficiently than any other institution.

It’s possible of course that access providers, especially in areas where there is little competition, could use their leverage to make bad business decisions that would harm consumers, content providers, or both.  But that risk could be adequately covered by existing antitrust law, or, if necessary, by FCC action when the problem actually arises.

The problem isn’t here yet, other than a handful of anecdotal incidents dealt with quickly and without the need for federal intervention.  Again, the danger of rulemaking ahead of an actual failure of the market is acute, especially when one is dealing with an emerging and fast-changing set of technologies.

The more the FCC pushes ahead on the net neutrality rules, even in the face of a court decision that it has no authority to do so, the more irrational the agency appears to the investor community.  And given the almost complete reliance for the broadband plan on private investment, this seems a poor choice of battles for the FCC to be spending its political capital on now.

Preserving the Ecosystem

There’s a forest among all these trees.  So far, the Internet economy has thrived on a delicate balance between infrastructure and the innovation of new products and services that Internet companies build on top of it.  If the infrastructure isn’t constantly upgraded in speed, cost, and reliability, entrepreneurs won’t continue to spend time and money building new products.

At the same time, if infrastructure providers don’t think the applications will be there, there’s no reason to invest in more and better capacity.  So far, consumers have shown a voracious appetite for both capacity and applications, in part because there’s been little to make them doubt more of both are always coming.

Given the long lead time for capital investments, the infrastructure providers have to bet pretty far into the future without a lot of information.  Sometimes they overbuild, or build ahead of demand (this has happened at least twice in the last ten years); sometimes (in the case of cellular), the applications arrive faster than the capacity after a long period of relative quiet.   3G support was an industry embarrassment until the iPhone finally put it to good use.

By and large the last decade has seen remarkable success in getting the right infrastructure to the right applications at the right time, as evidenced by the fact that the U.S. is still the leader by far in Internet innovation.  The U.S., despite its geography and economic diversity, is also still the leader in broadband access, with availability to over 96% of U.S. residents.  According to the latest OECD data, the U.S. has twice the number of broadband subscribers as the next-largest market.  Our per capita adoption is lower than in several other OECD countries, however, as are our broadband speeds, both sources of understandable concern to the authors of the NBP.

The larger issue here is that regulatory intervention, or even the looming possibility of it, can throw a monkey wrench in all that machinery, and make it harder to make quick adjustments when one side gets too far ahead of the other.  Once the machine stalls, restarting it may be difficult if not impossible.   The Internet ecosystem works remarkably well.  By contrast, even regulatory changes intended to smooth out inefficiencies can wind up having the opposite effect, sometimes disastrously so.

That above all else should have given the FCC pause today in its vote.  Apparently not.

Updates to the “Media” Page

I’ve added almost twenty new posts to the Media Page from April and May. These were busy months for those interested in the dangerous intersection of technology and policy, the theme of The Laws of Disruption.

A major court decision upended the Federal Communications Commission’s efforts to pass new net neutrality regulations, leading the Commission to begin execution of its “nuclear option”: the reclassification of Internet access under ancient rules written for the old telephone monopoly.  While I support the principles of net neutrality, I am increasingly concerned about efforts by the FCC to appoint itself the “smart cop” on the Internet beat, as Chairman Julius Genachowski put it last fall.

As consumer computing outstripped business computing for the first time, privacy has emerged as a leading concern of both users and mainstream media sources.  Not surprisingly, legal developments in information security go hand-in-hand with conversations about privacy policy and regulation, and I have been speaking and commenting to the press extensively on these topics.

The new entries run the full range of topics, including copyright, identity theft, e-commerce, and new criminal laws for social networking behavior, as well as privacy, security, and communications policy.

In the last few months, I have continued writing not only for this blog but for the Technology Liberation Front, the Stanford Law School Center for Internet & Society, and for CNET.  I’ve also written op-eds for The Orange County Register, The Des Moines Register, and Info Tech & Telecom News.

I’ve appeared on CNN, Fox News, and National Public Radio, and have been interviewed by print media sources as varied as El Pais, The Christian Science Monitor, TechCrunch and Techdirt.

My work has also been quoted by a variety of business and mainstream publications, including The Atlantic, Reason, Fortune and Fast Company.

As they say, may you live in interesting times!

FCC Broadband Reclassification: Third Way or the Highway?

The announcement yesterday from key Congressional Democrats of an effort to reform the Communications Act put me in a nostalgic mood. Here follows one of my longest efforts yet to bury the lede.

One of my favorite courses in law school was Abner Mikva’s “Legislative Process” course, which he taught while serving on the D.C. Circuit Court of Appeals and before his tenure as White House counsel to President Clinton. Mikva had previously served in Congress; indeed, one of the first votes I ever cast was for Mikva while an undergraduate at Northwestern University.

(It was a remarkable period at the law school. The year Mikva signed on as a lecturer was also the first year on the faculty for three professors just starting their academic careers: Larry Lessig, Elena Kagan, and Barack Obama. I took two classes with Lessig, including an independent study on the impact of technology on the practice of law, but regrettably none from the other two.)

There were, Mikva made clear, two versions of the legislative process: the one we were learning, specified in the Constitution and the standing rules of the House and Senate, and the other kind, made up of byzantine posturing and back-room dealing.

I don’t consider myself an expert in either, but especially not the second kind, which seems to require living inside the Beltway just to follow. And even then, I suspect, the participants always imagine themselves to be like Josef K in Kafka’s “The Trial,” where everyone believes they are the only ones who understand what is really going on. (I read “The Trial” in another law school course, “Law and Literature,” taught by my future employer Richard Posner.)

“You know that there are so many various opinions about the procedure that they form into a great big pile and nobody can make any sense of them,” one character tells K. “This judge, for instance, sees proceedings as starting at a different point from where I do. A difference of opinion, nothing more. At a certain stage in the proceedings tradition has it that a sign is given by ringing a bell. This judge sees that as the point at which proceedings begin. I can’t set out all the opinions opposed to that view here, and you wouldn’t understand it anyway, suffice it to say that there are many reasons to disagree with him.”

Bipartisan, Issue-Focused


Yesterday began a legislative process of the second kind, with the announcement from Sen. Rockefeller and Rep. Waxman that they will soon begin a series of “bipartisan, issue-focused meetings” to evaluate changes to the Communications Act.

This announcement followed on the heels of a letter from 37 Republican Senators to FCC Chairman Julius Genachowski urging him to abandon his plans to reclassify broadband Internet access as a Title II telecommunications service, which Genachowski first proposed on May 6. Also on Monday, 74 Democratic Congressmen likewise urged the FCC to abandon its “third way” reclassification efforts. “The significant regulatory impact of reclassifying broadband service is not something that should be taken lightly and should not be done without additional direction from Congress,” the Democrats wrote. “We urge you not to move forward with a proposal that undermines critically important investment in broadband and the jobs that come with it.”

(It didn’t take a crystal ball to predict pushback from Congress on the FCC’s effort to end-run both Congress and the D.C. Circuit’s conclusion in Comcast v. FCC that the agency lacked jurisdiction to implement net neutrality rules, the major incentive behind the “third way” proposal. But for the record, I did predict it.)

My Complements to the Source


To recap, yesterday saw letters from Republican and Democratic lawmakers urging the FCC not to proceed with its “third way” efforts, and announcements from key Congressional committee chairmen of a plan to update the FCC’s organic statute, with an eye toward bringing it into line with a communications landscape greatly altered since 1996, the year the law was last overhauled.

One might imagine these maneuvers to be something other than a coincidence. Are they part of a coordinated effort by Congress to signal to the FCC a desire not to have the reclassification proposal brought to a vote by the full Commission? It certainly sounds like this is the end of the “third way.”

Well, maybe not. As Cecilia Kang notes in The Washington Post, the FCC had no comment about yesterday’s developments. But an article by Gautham Nagesh in The Hill included this coda:

Update: A spokesperson for Sen. Kerry sends the following:

“Senator Kerry believes that this process is complimentary to the efforts at the FCC, not a substitute for them. The deliberative process, both here and at the agency, will help inform and enhance our respective responsibilities to write and execute law and regulation that encourages innovation, inclusion, and consumer protections.”

Kang’s article included a similar statement:

“A Senate staffer, who was not authorized to speak on the record, said the announcement is a recognition that current law doesn’t reflect the changing landscape of the Web- and mobile-centric communications landscape. The staffer said the move was meant to complement the FCC’s broadband reclassification proposal. The lawmakers don’t intend, as proposed by some network operators, to preempt the FCC’s plan.”

A bit of linguistic forensics suggests that the “Senate staffer” cited by Kang was almost certainly the same person as the “spokesperson” for Sen. Kerry quoted by Nagesh. Note that the spokesperson commits the common error of using “complimentary” (with praise) when he or she actually meant “complementary” (supplying mutual needs). The original posting of Kang’s article repeated the error in paraphrasing the staffer, though that paragraph was later corrected.

So at the very least Senator Kerry is signaling that the Congressional effort to review the Communications Act isn’t meant to block Genachowski’s moves at reclassifying broadband. Those statements came, however, after the original announcement. Was Kerry trying to clarify the day’s events or spin them?

At the end of the day (literally), I’m left with more questions than answers from what seemed like straightforward legislative process: Who is Sen. Kerry trying to communicate with here, and what is the real message? Is the “third way” dead on arrival? Is Congress hedging its bets? What kind of reform does the Senate have in mind?

Two more questions: Did anyone hear a bell ring? And does it matter?

Update: see Declan McCullagh’s excellent analysis of the day’s events at CNET.

Albert Gallatin and the First National Broadband Plan

Over the weekend, I published an op-ed in The Des Moines Register encouraging the FCC to heed the lessons of the first national broadband plan, the one Secretary of the Treasury Albert Gallatin sent to Congress in 1808.

Gallatin was a remarkable figure in the early history of the federal government, and his accomplishments include being the longest-serving Treasury secretary (1801-1814) to date. His Report on the Subject of Public Roads and Canals, completed at the request of Congress, remains one of the seminal documents in the history of American infrastructure. It is a masterpiece of dispassionate policy-making and clear-headed writing.

Alas, the document is available nowhere online, and the only in-print copy I can find is published by the aptly-named Dodo Press. This is indeed unfortunate given the renewed interest in network infrastructure as a matter of national technology policy. The NBP published in March by the FCC, despite its nearly 400 pages and thousands of footnotes, makes no reference to Gallatin or his plan.

Net Neutrality and the Inconvenient Constitution

I appeared this afternoon on the inaugural edition of TechCrunch TV to talk about–what else?–Net Neutrality.

Multiple media sources are now reporting that the FCC, contrary to reports from earlier this week, has decided to go ahead with an effort to change the classification of broadband Internet service from a Title I “information service” to a Title II “telecommunications service,” if only to salvage the proposed rulemaking on the open and transparent Internet. (See stories on The Wall Street Journal and The Washington Post as well as Ars Technica.)

Those of us who aren’t on the FCC’s official leak list will have to wait with the rest of the rabble to get a look at just how the FCC proposes to effect this radical change in communications law. Will it apply to all broadband Internet access, including cable, fiber, DSL, satellite, wireless, and broadband over power lines? Will the FCC propose to regulate only as much as needed to get the jurisdiction the D.C. Circuit says it doesn’t have under Title I to implement the NPRM, or will it throw in additional provisions to achieve other goals, such as reform of the Universal Service Fund? Will state and local regulators get to share in the fun of telling ISPs how best to run their business?


Reality Check: “Reclassifying” Broadband Would be Hard—Thank Goodness

I have a long opinion piece on CNET today, arguing that much of the talk of “reclassifying” or “relabeling” broadband Internet access to bring it under the FCC’s regulatory authority is just that—talk.

On April 6th, the D.C. Circuit Court of Appeals ruled definitively that the squishy doctrine of “ancillary jurisdiction” provides no authority for the FCC to impose its net neutrality rules on broadband Internet providers.

Law professors and paid advocates are doing a good job of convincing journalists who don’t understand the finer points of administrative law that all the FCC needs to undo that decision is the will to change the classification of broadband and…problem solved.

Not quite.  Those who argue the FCC can simply wave a regulatory wand and give itself all the jurisdiction it needs under Title II of the Communications Act are engaging in serious wishful thinking, or worse.

Yesterday, for example, The New York Times ran an editorial that suffered from a surfeit of fairy dust:

Fortunately, the commission has the tools to fix this problem. It can reverse the Bush administration’s predictably antiregulatory decision to define broadband Internet access as an information service, like Google or Amazon, over which it has little regulatory power. Instead, it can define broadband as a communications service, like a phone company, over which the commission has indisputable authority.

Where to begin?

First, it was the FCC, not the Bush administration, that convinced the U.S. Supreme Court that broadband is an (unregulated) information service.  And that was not a decision the agency made with Congressionally delegated discretion.  The FCC didn’t “define” anything; it interpreted the statute.  That broadband is an “information service” reflected the FCC’s understanding of where Congress put broadband when it wrote the 1996 revisions to the Communications Act.

(Nothing in the definition of “information service” has anything to do with applications or web-based businesses such as Google or Amazon, by the way—that’s really out of left field.  As I say in the CNET piece, “information service” means data, as opposed to voice, communications.)

The Supreme Court agreed with the FCC’s interpretation in the Brand X case, and Congress has given not even a hint of a rumor of a private thought that they believed the agency and the courts got their intention wrong.

In Brand X and regulatory proceedings before and since, the FCC argued for treating broadband Internet access as an information service not because the agency thought that was the best way to regulate.  It argued that broadband was outside any common carrier regulation because Congress said so in the 1996 amendments to the Communications Act.  With the Brand X decision, all three branches of the government agreed with that understanding of the law.

Finally, there’s nothing in the statute that gives the agency the power to “define broadband as a communications service.”  (The actual term is “telecommunications”—nice fact-checking, New York Times.)  Agencies don’t get to define terms in their governing statute—Congress does.  If the FCC had the kind of authority the Times’ editors seem to think it has, the Communications Act would likely fail a constitutional challenge.  Congress cannot delegate lawmaking power to an agency of the executive branch.

Journalists aside, even the strongest proponents of the Title II panacea know in their hearts that the FCC can’t just wish itself new powers without authorization from Congress.  But they also know that going back to Congress for that authority—the logical response to any finding that an agency lacks authority it believes it needs to meet its statutory objectives—is a dicey proposition.

There never has been sufficient support in Congress for net neutrality to get anything passed.  The math looks worse now than it did a few years ago, when Comcast was first found to have secretly slowed or blocked some users’ BitTorrent downloads.

Indeed, the NPRM came after Congress failed for years to pass any of the proposed neutrality laws.  The FCC argued that ancillary jurisdiction was enough authority to act on its own, a gambit even pro-neutrality groups including the Electronic Frontier Foundation saw as more dangerous than the harm the agency was trying to abate.  Now the D.C. Circuit has signed on to that view, an entirely sensible limitation of agency power regardless of whether it is being exercised for good or perceived evil.

If the FCC wants to save its net neutrality proposal, it will have to go back to Congress one way or the other.  Or proceed, and face at least a decade of litigation that it will ultimately lose, wasting millions of taxpayer dollars that could go toward fulfilling the National Broadband Plan.

Sounds like an easy choice to me.  But some people want it to be even easier, despite those pesky facts that are getting in the way.