Category Archives: Infrastructure

SOC: Tempest in the Back of Your TV

I’m fascinated by the firestorm that has erupted over what sounds on paper like the most boring combination of a legal and technical discussion: the recent appeal by Hollywood for a waiver from the FCC’s Selectable Output Control (SOC) rule.

First a little background, greatly simplified. (Those wanting the gory details can read the excellent coverage of the story over at Ars Technica.) Older television sets receive cable programming through analog component wires. Newer TVs include the old analog interface but also added digital ports, such as HDMI, that can reproduce a higher-quality picture.

The SOC rule, adopted in 2003, prohibits content providers (including cable and phone companies offering television content) from manipulating transmissions in a way that turns off or otherwise disables the analog ports, which would force consumers either to use the digital interface or, if their sets lack digital ports, to buy a new set that has them.

In addition to quality, the other relevant difference between analog and digital outputs is that the latter can be programmed to obey increasingly sophisticated forms of digital rights management (DRM), used to limit the reception, quality and use of received content. HDMI interfaces, among other features, support signal encryption that ensures the output is being directed to an authorized device—a television set registered for on-demand viewing, for example.
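Conceptually, the handshake works like a gatekeeper: before releasing protected content, the source device checks that the sink can prove it is authorized. Here’s a minimal sketch in Python of that idea. To be clear, this is illustrative only—real HDMI copy protection (HDCP) uses a cryptographic key exchange, not a lookup table, and every name and key below is hypothetical:

```python
# Conceptual sketch only: real HDCP performs a cryptographic key
# exchange, not a registry lookup. All device IDs are hypothetical.

AUTHORIZED_SINKS = {"tv-registered-001", "settop-approved-002"}

def negotiate_output(sink_id, content_protected):
    """Decide what signal, if any, the source sends to a connected sink."""
    if not content_protected:
        return "clear-signal"        # unprotected content flows to any device
    if sink_id in AUTHORIZED_SINKS:
        return "encrypted-signal"    # authorized device receives the program
    return "no-signal"               # unauthorized sink: output disabled

print(negotiate_output("tv-registered-001", True))   # encrypted-signal
print(negotiate_output("old-analog-box", True))      # no-signal
```

The SOC debate is about the third branch: whether a provider may, for certain content, treat the analog port as an unauthorized sink and disable it outright.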

The MPAA has asked the FCC for a waiver to allow studios to broadcast new movies before they are made available on DVD. To make such broadcasts more secure, the MPAA wants permission to block the signal from being output through the analog interface. While digital outputs can be hacked and DRM bypassed, the MPAA believes that the most likely and most dangerous form of piracy of these early releases would come from users with active analog ports–what is sometimes referred to as “the analog hole.”

If the waiver is granted, content providers would be able to disable analog ports when transmitting early-release movies to the set. The digital ports could then be manipulated to ensure that the programming was not copied in violation of the new service’s terms.

The MPAA’s request is being supported by content providers including cable TV, satellite, and phone companies, as well as some device manufacturers. The principal opposition is coming from the Consumer Electronics Association, the main trade group for device manufacturers, as well as a coalition of public interest groups including Public Knowledge and the EFF. (See CEA President Gary Shapiro’s open letter to the FCC on The Huffington Post and John Bergmayer’s “SOC in Context” at Public Knowledge, as well as Ars Technica’s Matthew Lasar’s response to the cable industry.)

This is not an open-and-shut case, though both sides would like to characterize it as such. The objectors argue that consumers who only have analog outputs (25 million, according to CEA) should not have their TVs “broken” by SOC, in essence forced to upgrade to newer TVs if they want to watch early releases of new movies.

They also point out that there is no evidence that content piracy has anything to do with home viewers intercepting transmissions and translating them to media or file-sharing copies through analog interfaces or otherwise; that, indeed, the most significant source of piracy for new movies comes from insiders who get hold of production copies before or soon after movie releases.

The public interest groups are particularly concerned that a SOC waiver here is at best a Trojan Horse, giving the entertainment industry a foot in the door to control the use of more kinds of broadcasts, including those for which today there are no or fewer restrictions.

The waiver could be a start, in other words, toward more restrictive content limitations, along the lines of so-called “broadcast flag” technology embedded into TV sets that Hollywood had earlier convinced the FCC to mandate. (A federal appeals panel laughed the FCC out of court on that one, reversing the Commission as wildly out of its jurisdiction.)

Ultimately, the public interest groups believe, the SOC waiver could lead to the end of long-established rights for viewers to record and time-shift programming, a right that the Supreme Court underscored nearly 25 years ago when the same parties asked for a ban on VCRs.

The requested waiver, however, doesn’t apply beyond the “narrow” exception of the early-release movies. (A number of companies, including TiVo, have urged the FCC to allow the waiver, but only after significantly tightening up just how narrow the exception really is.) And supporters point out that without the SOC waiver that new service simply won’t be offered, harming everyone.

As the National Cable & Telecommunications Association (NCTA) put it in a letter to the FCC, “MPAA has sought waiver of the ban on SOC to permit [content providers] to provide consumers with more viewing options, which can only be made available if the ban on SOC is waived.” (emphasis added)

The implication of that statement is that the SOC waiver is a kind of technological requirement for early-release movies, which isn’t the case. What the NCTA means is that without the waiver, Hollywood won’t risk showing movies before DVD releases for fear that piracy will undermine the subsequent market for media.

Well, maybe. But even absent rampant piracy, the likelihood is that DVD and other media purchases will continue to decline (see “Hollywood: We Have Met the Enemy”) and the studios will be forced into new (potentially more-profitable) on-demand and/or subscription models. Indeed, Warner Home Video is already experimenting with early-release video-on-demand for new movies even without a SOC waiver.

I share the public interest groups’ fears of a slippery slope. As I write in “The Laws of Disruption,” since the advent of the Gutenberg press, content industries have demonstrated an uncanny consistency in lobbying and litigating against every new consumer technology that threatens their control over distribution and use, including several that have, ironically, saved them from extinction.

Consumers are understandably wary of promises from their content providers and the entertainment industry that a small inconvenience is needed in order to give consumers what they want. There’s something frankly irritating not so much in the argument but in the tone with which the NCTA is making its case (“sounds like a slam dunk to me, but surprisingly, some object,” writes NCTA head Kyle McSlarrow. Come on!)

On the other hand, the SOC waiver, more narrowly defined, won’t take away any abilities or rights consumers have now. Though the analogy isn’t perfect, consumers who want to see HD broadcasts must buy HD television sets. The MPAA is arguing that it won’t offer pre-release movies if it can’t stop them from coming out of an analog interface. (Technically, of course, they can come out of that interface.)

Eventually, the 25 million analog-only TV sets left in the U.S. will be replaced anyway, and the “analog hole” will be plugged with more secure (though hardly bulletproof) digital technology. At that point the analogy is a little better: if you want the new service, you need to plug it into a port you will likely have and not one that you have but which isn’t permitted.

In the end, I’m with TiVo and Sony (the latter both a content producer and a device manufacturer), which “now believes that under certain, very narrow, circumstances, SOC could bring benefits to consumers that on balance would outweigh any potential drawbacks.”

At the same time, I’d sure like to see Hollywood and its partners stop listening to consumers with such a tin ear. If they devoted even a fraction of the energy they put into trying to control the uncontrollable into experimenting with a holistic approach to offering access through all the channels—media, on-demand, virtual libraries, etc.—consumers are interested in, these fights could largely be avoided. Profits would be higher and more secure, and the pressure for piracy would be greatly diminished.

As Internet Research Group’s Peter Christy puts it, “The Internet is playing a fascinating role here by enabling experimentation. In the end the content owners are king, and their objective is maximum revenue capture from a menu of distribution alternatives. I think they are starting to know enough data not to act in a way that will cost them money.”

I’m not quite that optimistic.  But I hope he’s right.

Hollywood: We Have Met the Enemy…


Strategy under The Law of Disruption requires attention to detail.

Two recent articles with competing views of the fate of Hollywood content producers caught my attention.  The first, by CNET’s Greg Sandoval, reiterates long-standing predictions that for current industry giants the Internet spells doom.  “[T]he end is coming,” Sandoval concludes, “for DVDs, traditional movie rentals and yes, much of your cable money….”

The second, from New York Times reporter Bill Carter, reported surprising results from a recent change by ratings agency Nielsen.  In determining whether consumers are watching commercials and, therefore, what “rating” to assign a broadcast program, Nielsen now includes DVR views within three days of airing if commercials aren’t skipped.

The surprising result is that more than half of all DVR viewers don’t skip the commercials, even though they have a button that lets them do so with relative ease.  For some shows, including “Heroes” and “Fringe,” actual ratings jumped after taking the new data into account.  The DVR, seen as the destroyer of commercial television, may save the major networks, which are averaging a 10% increase in ratings under the new system.  “It’s completely counterintuitive,” the article quotes the president of research for NBC.  “But when the facts come in, there they are.”
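The counting rule Nielsen adopted is simple enough to sketch. The toy Python function below (field names are mine, and Nielsen’s actual methodology is of course far more involved) counts a DVR view toward a program’s rating only if it occurs within three days of airing and the commercials weren’t skipped:

```python
# Simplified sketch of the counting rule described above. A DVR view
# counts only if it happens within three days of airing AND the
# viewer did not skip the commercials. Field names are illustrative.

def counted_views(views):
    """views: iterable of dicts with 'live', 'days_after_air', 'skipped_ads'."""
    total = 0
    for v in views:
        if v["live"]:
            total += 1                                       # live views always count
        elif v["days_after_air"] <= 3 and not v["skipped_ads"]:
            total += 1                                       # qualifying DVR views
    return total

sample = [
    {"live": True,  "days_after_air": 0, "skipped_ads": False},
    {"live": False, "days_after_air": 2, "skipped_ads": False},  # counts under new rule
    {"live": False, "days_after_air": 2, "skipped_ads": True},   # skipped ads: excluded
    {"live": False, "days_after_air": 5, "skipped_ads": False},  # too late: excluded
]
print(counted_views(sample))  # 2
```

Under the old live-only rule, the sample program would have scored 1; under the new rule, 2—exactly the kind of jump the networks saw.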

Ah, facts.  The hobgoblins of pundits everywhere.

So is Hollywood dead or is it doing better than ever?  Let’s split the difference here.   The content industries are clearly in crisis.  But their doom is not inevitable.

Sandoval is right that digital technology, held back for years by litigation and cost, is now in the midst of thoroughly disrupting the entire content supply chain, from creation through consumption.  The lawsuits have failed (the MPAA recently fired its general counsel in a housecleaning) and, thanks to Moore’s Law, digital content is everywhere.

Sandoval thinks that the availability of cheap copying and rebroadcasting technologies (file-sharing, streaming, and of course the Internet everywhere) poses an insurmountable threat to the content industry.  After the fall, he says, “What will come out the other side is still uncertain but will likely be very much smaller.”

I don’t agree.  In fact, I think the answer is just the opposite, and the DVR data Nielsen is now collecting (in the teeth of initial opposition from the broadcasters, who thought it would lower their commercial-viewing share points) gives the best clue as to why.  More on why in a moment.

First, let’s be clear on the source of the crisis.  Though it’s convenient for media executives to see it this way, consumers aren’t evil–they aren’t breaking the rules because they hate rules.  They’re breaking them because they want something they aren’t getting.  And they don’t understand why broadcasters would object to their efforts to enjoy entertainment content however they like.  Legally speaking, we’re all felons.  But who taught us to think of broadcast content as something that magically appeared on the TV for free in the first place?

Consumers break the industry rules (and, often, the law of copyright) not because they want to destroy the industry but because they have access to technology that lets them do something they want to do but otherwise can’t.  Consumers want to watch what they want, when they want to, on whatever device they want it.

When the only way to do so was on the broadcaster’s timetable, on media (over the air, cable, videocassettes, DVDs, Blu-Rays) controlled by content owners, paid for by advertiser support, media purchase, cable subscription fees, or all of the above, that is the way media was consumed.

Now that there are other options–including legal ones such as Hulu (ad-based), iTunes (fee-based), and Netflix (subscription-based)–that remove some of the artificial constraints on time, quality, media, and frequency of viewing, consumers are embracing them, even as they continue, in smaller numbers, to buy media versions of movies and TV programs.

Here’s the key point: they are consuming much more media, whether legally or otherwise. They want more choices and more content. If Hollywood won’t give it to them, the Internet will. But it’s not as if we really care who gives us what we want. We’re willing to make all manner of trade-offs on quality, cost, ease-of-use.  If, that is, there is a real choice.

Consumers will always reject artificial constraints where technology allows them to do so.  Inherently, they understand that information is an inexhaustible commodity–that no matter how and how often and in what quality they watch “Star Wars” or last week’s “Flash Forward,” the programming is still intact, undamaged, and available any time in the future for them or any other viewer–simultaneously, if desired.

Now that most everything has been translated to bits (or starts life that way in the first place), the curtain has been lifted.

There are two ways to make consumers pay for this content in a manner that makes it profitable for creators, distributors and others in the supply chain to continue to produce it.  One way, the pre-Internet way, was to give them no choice.  Watch these commercials because you can’t skip them.  Buy these videotapes because there’s no other media.  Pay your cable bill because that’s the only way to get the channels you want.

The second way, which will now determine who wins and who loses in the content industry of the future, is to use whatever information you can get your grubby hands on about what consumers are actually doing with technology and learn from it.

Now that both the content and information about the content have become digital, the media industry needs to learn what it is that consumers actually want–that is, what consumers actually value–and offer it to them.

The DVR data, as a starting point, tells us two interesting things.  First, that many consumers don’t mind watching commercials, either because they like them, or they’re too dazed to skip them, or because they understand that the commercials subsidize the programming.  Media buyer Brad Adgate, quoted in the Times piece, notes something that hasn’t changed about the viewing experience:  “It’s still a passive activity.”

(To exploit that passivity, sponsors are going back to the original model of embedding product placements and commercials into the programming–a la “Top Chef” and “The Biggest Loser” and probably every other show that does it with more subtlety.)

But the second and more interesting insight from the DVR data is that resisting information because you think it will deliver bad news is a self-destructive behavior, especially during times of industry transformation.

Ten years into the digital revolution, Hollywood is still firmly stuck in the first stage of grief–denial.  Not only are they resisting change, they are resisting any knowledge about how the change is taking place.  Even when, as here, that knowledge tells them something valuable about how to thrive in the emerging new order.

(History repeats itself:  recall the industry reaction to the VCR, which MPAA President Jack Valenti famously equated to the Boston Strangler–the violent, insane destroyer of his industry.  In retrospect, the VCR saved Hollywood from itself.)

I don’t agree with Sandoval’s conclusion that the Hollywood of the future “will likely be much smaller.”  The popularity of YouTube and other user-produced content services, the explosion of cell phone apps for enjoying content, the success of indie studios and niche channels, and the continued interest of consumers in “collector edition” and other high-end media artifacts all suggest that the public’s appetite for entertainment is unfathomable.

Three networks, we have already learned, aren’t enough.  Hundreds of specialty cable channels aren’t enough.  Content produced and delivered on a take-it-or-leave-it basis in a vacuum of consumer insight beyond gross demographics and what-worked-last-year strategies is no longer a sustainable model.

But what will come out “the other side,” as so often happens when disruptive technologies rewrite the rules, will be a much bigger industry, with more profits to share.  True, the Hollywood of tomorrow won’t look much like the Hollywood of today.  But then, the Hollywood of today has almost nothing in common with the original industry model, dominated by the studio system and a handful of powerful decision-makers.  For one thing, it’s a whole lot bigger by any measure.  Technology makes things better–always, if eventually.

Here’s what else the DVR data tells us.  In the future, information about media consumption will prove as valuable as the entertainment itself.  Strike that:  it’s already happened.  For the first fifty years of its existence, TV Guide, which merely printed local listings summarizing what was on the few channels a household received, made more money than all three of the major broadcast networks combined.

Existing players in the collapsing Hollywood supply chain can either learn new ways to add value and thrive, or they can continue to resist the inevitable, close their eyes to valuable data, insist on business as usual, sue everyone and everything, and go the way of buggy whips and analog broadcast.  Add value, as a client once summarized it for me, or adios.

I know which one I would choose.  But then, I don’t run a multi-billion dollar public company.

Yet.

The Case Against the FCC's Neutrality Rules – CNET

My analysis of the FCC’s proposed neutrality rules appears this morning on CNET.

No surprise, I think the FCC’s plan is a bad idea, and I think, more to the point, that the FCC is the wrong organization to be “saving” the open Internet. Among other crimes, as the Electronic Frontier Foundation points out, the FCC is the same regulator who has ramped up the penalties and frequency of fines for “indecent” content over the airwaves.

The FCC is also the organization that has tried repeatedly to push through, at the behest of the media industries, the notorious “broadcast flag,” which would force electronics and software companies to limit the legal use of broadcast content.

Meanwhile, the agency that now believes there is a severe lack of competition in broadband provisioning–severe enough to regulate–has done everything it can to stop alternative broadband technologies, including broadband over power lines and municipal wireless projects.

The open Internet is a great thing, but the FCC is wrapping itself in the flag of Internet freedom and consumer advocacy in a most unconvincing manner.

Net Neutrality Debate: The Mistake that Keeps on Giving


Again, a long post on Net Neutrality.  Again, my apologies.

The fallout continues from FCC Chairman Julius Genachowski’s call to initiate new rulemaking to implement Net Neutrality principles promised by candidate Obama during the campaign.

The bottom line:  what proponents wish with all their hearts was a simple matter of mom and apple pie (“play fair, work hard, and get ahead” as Craigslist’s Craig Newmark explains it) is in fact a fight for leverage among powerful interests in the communications, software, and media industries.  Net neutrality, if nothing else, is turning out to be a complex technical problem—technical in both the engineering and regulatory sense.

As I write in Law Four of The Laws of Disruption, there’s nothing neutral about the rules under which Internet provisioning is regulated today, with broadband offered by phone companies subject to one set of rules (“common carrier”) and access offered by everyone else subject to, for the most part, no rules at all.  (Wireless Internet providers, who have far less bandwidth to offer, greatly restrict user behavior, but Genachowski indicated they too would be brought under the neutrality principles he outlined.)

There’s also nothing rational about the current rules.  That’s becoming abundantly clear as neutrality proponents start to back away from the firestorm they helped fund and as the messy details of current network management behavior become clearer.  As Vishesh Kumar and Christopher Rhoads of The Wall Street Journal noted in late 2008, Microsoft, Yahoo and perhaps Amazon have quietly backed away from their initial enthusiasm for more FCC oversight of Internet access and traffic management.  Microsoft’s official position:  “Network neutrality is a policy avenue the company is no longer pursuing.”

Nor should they.  Even as the regulatory process grinds on at its naturally-slow pace, Moore’s Law continues to change the technological landscape with breathtaking speed.  Which is a good thing.  Despite all the think-tank and lobbyist hand-wringing, every aspect of digital life has improved dramatically in the last decade—access options, connection speeds, applications, content, devices, you name it.  It’s possible that in the future all of this could come to a grinding halt because of uncompetitive and ultimately irrational behavior by a few market dominators.  But why legislate ahead of a problem in the area most certain to change dramatically regardless of regulation?

Here are just a few of the most recent developments:

  • Google complains to the FCC about Apple’s rejection of Google Voice from the iPhone (The full letter, originally redacted, is now available here)
  • AT&T complains to the FCC about Google Voice’s refusal to connect certain calls (a luxury that common carriers don’t have)
  • Seventy-two Democrats urge the FCC to tread carefully into Net Neutrality, encouraging Genachowski to “avoid tentative conclusions which favor government regulation.”
  • Wireless network providers object to being included under the Chairman’s proposed six rules for Net Neutrality (“The principles I’ve been speaking about apply to the Internet however accessed, and I will ask my fellow Commissioners to join me in confirming this.”)

AT&T’s complaint about Google Voice is informative.  On October 8th, the FCC announced it was investigating Google Voice’s treatment of calls to certain rural areas, where under FCC rules common carriers are required to pay higher connection fees to complete calls from their subscribers.  These fees are intended to help offset the extra costs rural phone companies must otherwise absorb in order to serve a dispersed customer base.  Unfortunately, as everyone knows, the local companies have abused that rule by hosting a variety of non-local services, including free conference call services and sex chat lines, and then splitting the profits with the service providers.

(The technical implementation of Google Voice is largely confidential.  The application, among other features, provides its users a single phone number, routes incoming calls to any phone device they have, and places outbound calls through Google (via its wholesale partner Bandwidth.com) for free or a small charge.  Free, that is, in the sense of being supported, of course, by ads.)

The response from Google?  Google Voice, as the company acknowledges, “restricts certain outbound calls from our Web platform to these high-priced destinations.”  But Google Voice is a “Web-based software application,” not a “broadband carrier,” and so is not subject to common carrier rules or existing Net Neutrality principles.  “We agree with AT&T,” Google says, “that the current carrier compensation system is badly flawed, and that the single best answer is for the FCC to take the necessary steps to fix it.”

Not surprisingly, AT&T argues that Google is violating both common carrier and Net Neutrality principles.  AT&T reports that its tests of Google Voice indicate the service blocks calls to ALL numbers in the rural exchanges, not just those for sex chat lines and teleconferencing services.  (Note in the quote above that Google says only that it restricts calls to “high-priced destinations,” leaving it unclear whether by “destination” they mean the over-priced services or the actual area codes.)
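Google Voice’s actual implementation is, as noted, confidential. But a hypothetical sketch shows why AT&T’s test results are plausible: if an application restricts calls by matching on the area code and exchange prefix (the “NPA-NXX”), it necessarily blocks every number in that exchange, ordinary subscribers included, not just the services that inflate termination fees. Everything below—the prefixes, the rule itself—is my illustration, not Google’s code:

```python
# Hypothetical sketch of prefix-based outbound call restriction.
# A blocklist keyed on area code + exchange (NPA-NXX) blocks every
# number in the exchange, not just the high-fee services hosted there.

BLOCKED_PREFIXES = {"712-432", "605-475"}  # illustrative rural exchanges

def allow_outbound(number):
    """number in NPA-NXX-XXXX form, e.g. '712-432-0001'."""
    npa_nxx = "-".join(number.split("-")[:2])
    return npa_nxx not in BLOCKED_PREFIXES

print(allow_outbound("712-432-0001"))  # False -- blocked, even if a home line
print(allow_outbound("212-555-0100"))  # True
```

Distinguishing a conference-call bridge from a farmhouse landline in the same exchange would require number-by-number intelligence that a prefix rule simply doesn’t have—which is exactly the ambiguity in Google’s “high-priced destinations” phrasing.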

The argument between the two companies breaks down to two simple but largely irresolvable questions:  (1) do Internet phone applications that look like traditional phone services, but which rely on customer-leased connections to initiate and terminate calls, need to abide by common carrier rules? (2) does non-Neutral behavior by an application mimicking many of the core functions of a broadband provider violate Net Neutrality, or do the principles (and those proposed by Genachowski) apply only to providers of last mile service?

Regardless of the answers the FCC reaches, here’s the point:  common carrier rules cannot be untangled from the Net Neutrality debate.  Personally, I believe consumers would be better off without either, a position neither company has publicly taken.

In a seemingly unrelated story, Wired’s Ryan Singel reports that Google appears to pay nothing to broadband carriers for its Internet connections.  This despite the fact that Google, in significant part because of its ownership of YouTube, may now account for as much as 10% of all Internet traffic.  That’s because during the great telecom meltdown that followed from the dot com boom, the company wisely snapped up a great deal of unused new fiber optic capacity on the cheap.  Google is now trading (the technical term is “peering”) that capacity with broadband providers in exchange for the company’s own connection.

The story has some interesting quotes from Arbor Network’s chief scientist, Craig Labovitz.  “[T]he real money is in the ads and the services in the packets, not in moving bits from computer to computer,” he told Wired.  Then this:  “Who pays whom is changing.  All sorts of negotiations are happening behind closed doors.”  Most of the net’s architecture, as Singel notes, “remains a secret cloaked in nondisclosure agreements.”

Don’t get me wrong.  I think Google is a great company that has introduced a tremendous range of innovative products and services to consumers, nearly all of which are paid for by an advertising model (which increasingly raises the ire of privacy advocates, but that’s another story).  Consumers, as I said before, have benefited from the technical and business decisions of the companies now publicly airing their dirty laundry in the Net Neutrality fight.  We get more stuff all the time, we get it faster and, for the most part, the cost is either holding steady or declining.

But irony, as Bart Simpson once said, is delicious.  The peering arrangements almost certainly mean that Google traffic is getting priority.  Not necessarily transit priority—that is, special privilege through the network.  But they do get what Internet Research Group’s Peter Christy calls “ingress priority,” that is, how you get into the provider’s network.  As Christy says, “If you go through some kind of general exchange then it is sort of a free for all and if traffic is heavy there may well be congestion and packet loss at this point.  With specific private peering you can assure that your traffic will get into the network unimpeded.”

It may not be a “fast lane,” in other words.  But it is a dedicated on-ramp.

So, does ingress priority through peering arrangements violate Net Neutrality?

Consider this explanation for why Neutrality is imperative:  “Some major broadband service providers have threatened to act as gatekeepers, playing favorites with particular applications or content providers, demonstrating that this threat is all too real.”

Guess who?  That’s right—it’s from Google’s own policy blog from 2008. The post goes on: “It’s no stretch to say that such discriminatory practices could have prevented Google from getting off the ground — and they could prevent the next Google from ever coming to be.”

Well I think that’s an awfully big stretch—now, and in 2008.  Nonetheless, if the company continues to beat the drum for completely open gates, they will find themselves increasingly hard-pressed to justify peering arrangements, content restrictions on use of their applications, and other deals aimed at improving performance for Google applications.  “Such discriminatory practices” could just as easily prevent new services—competitors to Google—from “getting off the ground.”  AT&T’s complaints that Google is straddling both sides of the fence sound increasingly accurate, regardless of their motivation for saying so.

(At the end of 2008, recall, the company was similarly forced to beat a rhetorical retreat when it was revealed that it had been negotiating peering arrangements for edge-caching devices—that is, for co-locating Google servers with broadband provider equipment to ensure faster access to Google content when consumers called for it.  What seemed again a contradiction to Net Neutrality principles was explained weakly as a “non-exclusive” arrangement that any content provider could also negotiate.  Any content provider with money to spend on caching servers and unused fiber optic cables, that is.)

It’s just going to get worse.  The FCC can no more likely navigate its way through these murky waters than it can decide whether an errant nipple on a live broadcast violates its broadcast decency rules.  (An appellate court recently threw out the Janet Jackson fines.)  The Commission is quite simply the worst possible arbiter of these complex business and technical problems.

So here’s an open invitation to Google, AT&T, Apple, and everyone else in the Net Neutrality slugfest.  Let’s call the whole thing off, before someone—that is, consumers—really gets hurt.

The End of the American Internet


Forty years after the first successful connection was made on the predecessor to the Internet, the U.S. has given up its fading claims to govern the network.

A fight over governance which erupted in 1998 has ended with a whimper.

In this case, I’m not talking about the regulation of human activity that takes place using the Internet, but of the internal working of the network itself.

As reported by the Advisory Committee of the Congressional Internet Caucus, the U.S. government’s agreement with ICANN was allowed to expire on September 29th. (The Department of Commerce has a separate agreement with ICANN, which was also significantly modified.)

ICANN is a non-profit corporation formed in 1998 to manage two key aspects of network governance: the assignment of domain names and website suffixes and of IP addresses for computers connected to the Internet. There are now over 110,000,000 registered domains.

Hard as it is to believe, before 1998 the management of names and addresses was largely left to the efforts of Jon Postel, a computer science professor at the University of Southern California. As the Internet shifted dramatically from an academic and government network to a consumer and business network, it became clear that some more formal mechanism of governance was required.
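What ICANN inherited from Postel is essentially a delegation hierarchy: the root delegates each suffix (“com,” “org,” and so on) to a registry, and registries record which names map to which addresses. Here’s a toy Python model of that lookup—greatly simplified, with illustrative entries rather than real registry data (the one real-looking address, for example.com, is the IANA-documented one, but treat it as an assumption):

```python
from typing import Optional

# Toy model of DNS delegation (greatly simplified). ICANN coordinates
# the root; each suffix is delegated to a registry, which records
# name-to-address mappings. Entries below are illustrative.

ROOT = {
    "com": {"example": "93.184.216.34"},
    "org": {"icann": "192.0.43.7"},
}

def resolve(domain: str) -> Optional[str]:
    """Resolve 'name.tld' by walking root -> TLD registry -> record."""
    name, tld = domain.rsplit(".", 1)
    return ROOT.get(tld, {}).get(name)

print(resolve("example.com"))        # 93.184.216.34
print(resolve("unregistered.net"))   # None
```

Who controls the top of that dictionary—who decides which suffixes exist and who runs each registry—is precisely the governance question ICANN was created to answer.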

But by then the Internet had become a global phenomenon. The U.S. government was adamant that it retain some measure of control over its invention; the rest of the world argued that resting authority for a global infrastructure with one national government would cripple it, or worse.  Hearings were held, speeches were made, the U.N. was called in (literally).

ICANN was the compromise, and it was an ugly compromise at that. ICANN has run through several executive directors and political battles. Just explaining the selection of members of its Board of Directors, as David Post demonstrates in Figure 10.3 of his book, “In Search of Jefferson’s Moose: Notes on the State of Cyberspace,” requires a flowchart with nearly fifty boxes.

It has also been the subject of regular criticism, in particular for the ways in which it subcontracts the registration of domain names, its resistance to creating new “dot” suffixes, and its evolving and weird process for resolving disputes over “ownership” of domains, typically involving a claim of trademark infringement or unfair competition. Former board member Karl Auerbach, quoted in Information Week, put it this way:

At the end of the day it comes down to this: ICANN remains a body that stands astride the Internet’s domain name system, not as a colossus but more as a Jabba the Hutt. ICANN is a trade guild in which member vendors meet, set prices, define products, agree to terms of sales, and allow only chosen new vendors to enter the guild and sell products.

Still, through dot.com boom and bust, Web 2.0 and social media, the Internet has continued to grow, operate, and reinvent itself as new technologies arrive on the scene.

And what started as a U.S. government project is now clearly a worldwide convenience. According to Christopher Rhoads in The Wall Street Journal, “today just 15% of the world’s estimated 1.7 billion Internet users reside in North America.”

Which is perhaps why the end of federal government oversight of ICANN received so little attention in 2009.

But in 1998, you would have thought the future of civilization depended on keeping the Internet an American property.

The Net Neutrality Walk of Shame

It’s the morning after Chairman Genachowski’s impassioned call for new FCC regulations to impose “net neutrality” rules on Internet access providers.  No surprise, everyone is reaching for the aspirin.  Communications users have been partying like it’s 1974, when U.S. regulators finally began the painful process of breaking up the long-sanctioned AT&T monopolies on long-distance and equipment.

In the interim, the FCC, in the name of de-regulation, has constructed a remarkably complex machinery of new regulation for existing (telephone) and emerging (data) services.  They’re always a good ten years behind the march of new technology–including new infrastructure technologies such as cable, satellite, wireless, and fiber, as well as new applications such as the Web, voice-over-IP telephony, and the convergence of voice, data, television and everything else.  And they always will be, no matter how many smart people are working on the problem.  (There are many smart people working at the FCC.)

In a déjà vu of the passage of the landmark 1996 Telecommunications Act, within days of Genachowski’s speech, everyone was crying foul.  Wireless operators can’t understand why they would be included in neutrality regulations, given the relative competitiveness of the wireless industry (new regulations are presumed to be needed when market mechanisms fail to correct anti-consumer behavior).  AT&T reminded us that the wireless spectrum they bought last year (part of what was given up by broadcast television in the switch to digital broadcast) came with no requirement to be open to any device or application.  Verizon bought a block that did come with that requirement, and AT&T paid “many billions more” to avoid the open rules.  Oops.

Then AT&T complained that Google, a leading proponent of neutrality rules, was itself violating some of the basic principles with its own Google Voice application, which blocks certain services (900 numbers, free conference call services, etc.) that charge high fees to the customer’s provider that cannot be passed along.   “Traditional” phone companies can’t block those services.  The fees, as the New York Times’ Saul Hansell points out, were originally authorized to help subsidize rural telephone service, but everyone understands that the system has now been thoroughly gamed, part of the post-1996 de-regulation of telephony and the end of Judge Greene’s oversight of the 1982 breakup of the old AT&T.  (Hate them all you want, but as recently as six months ago, AT&T was losing money on every iPhone customer it signed up under its exclusive deal with Apple.)

Google got into similar trouble in late 2008, when it became known that the company had offered to “co-locate” its own servers at key exchange locations of broadband providers in order to speed up delivery of Google content such as YouTube videos. This “fast lane” service seemed to be precisely what the net neutrality advocates feared most, yet it was coming from one of their chief allies and instigators. Google’s defense was that its offer was non-exclusive, meaning any other application provider could make a similar (non-neutral) arrangement.

Reading through some of the comments posted by readers of The New York Times and The Wall Street Journal articles reporting on these developments, one can’t avoid the sense that no one really knows what anyone else really means by neutrality.  “Net neutrality is about the Internet, not the telephone network,” says one reader.  “Neutrality in general does not apply to telephone companies, nor would it benefit them,” says another.

Harold Feld at Public Knowledge, acknowledging that “I do not know how other VOIP providers behave,” nonetheless is confident that neutrality is separable from telecommunications regulation.  “[I]t is easy to make [Google Voice] look like a network neutrality question and try to undermine network neutrality [rather] than focus on the merits of either the Google Voice question or the network neutrality question.”  It is, unfortunately, anything but easy.

All of these stories are conflated for a reason beyond simply trying to muddy the waters as much as possible.  Unfortunately, they really are all hopelessly intertwined. Much as the FCC wishes there was still a clear distinction between “the Internet” and “the telephone network,” technology has obliterated that difference.  Internet companies (Vonage, Skype) provide phone service using TCP/IP, “phone companies” offer Internet access over their equipment, while “cable companies” offer the same service over cable–along with phone services and television, which the phone companies also offer.

Under the law, “telecommunications” services are still treated (badly) as common carriers, a significant competitive disadvantage that may or may not still be justifiable.  “Information” services are not.  When AT&T offers Internet, it’s a telecommunications service.  When Vonage offers telephony, it’s an information service.  The current rules don’t distinguish between the two kinds of uses based on the protocol used, the network technology used, or the equipment used.  The current rules distinguish based solely on the historical (that is, pre-1982) business of the provider.  Providers that didn’t exist in 1982 (Vonage, Skype, Google) or who weren’t in the voice business at the time (Comcast, TimeWarner) are presumed to be offering information services, regardless of what services they are offering.
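That classification logic can be caricatured in a few lines of code (the data and function names here are hypothetical, purely to illustrate the point): the service actually being offered never enters into the decision, only what business the provider was in before 1982.

```python
# Hypothetical lookup: what business was each provider in before 1982?
LEGACY_BUSINESS = {
    "AT&T": "telephone",   # incumbent carrier in 1982
    "Comcast": "cable",    # not in the voice business then
    "Vonage": None,        # did not exist in 1982
}

def classify(provider: str, service: str) -> str:
    """Regulatory category under the current rules: it follows the
    provider's pre-1982 business.  The `service` argument is ignored
    on purpose -- that is the absurdity being illustrated."""
    if LEGACY_BUSINESS.get(provider) == "telephone":
        return "telecommunications"
    return "information"

print(classify("AT&T", "internet access"))  # telecommunications
print(classify("Vonage", "telephony"))      # information
```

Swap the two service strings and the outputs don’t change, which is exactly the point: identical bits, opposite regulatory treatment.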

AT&T’s point, hopelessly lost in 25 years of FCC rulemaking, is that regardless of whether it still makes sense to regulate the hell out of their copper-wire phone network, they ought to be held to the same rules as everyone else when offering new services on new infrastructure and new equipment.  Which is to say, far fewer rules than when they are offering POTS (Plain Old Telephone Service).

Pull one strand of this spider web, and every other strand responds.  Unfortunately, net neutrality is bigger than just net neutrality, and not just because Internet providers say it is.  It really is.

There’s a simple solution to all this, one that might make a rational conversation about net neutrality possible.  And that is to eliminate the distinction between common carriers and everyone else.  Hold everyone to the same rules regardless of what information they are transporting–whether voice, video, television, data.   Because regardless of who’s doing what, these days it’s all bits.  There is no rational reason to regulate the bits based on who is transporting them.  The FCC doesn’t even try to justify the distinction anymore.  Let’s just get rid of it.

We need, as I say in Chapter 6 of The Laws of Disruption, a cure for the common carrier.  We need to eliminate most if not all of the FCC’s byzantine sets of taxes, rules, funds, rate-setting, and (for broadcast but not cable TV) content oversight.  When the 1996 Telecommunications Act was signed into law, Congress predicted it would signal the end of the FCC, much as the deregulation of the airline industry ended the reign of terror of the Civil Aeronautics Board.

Today, the FCC’s budget is bigger than ever.  And in the thirteen years since deregulation, American consumers have fallen behind their counterparts in much of Europe and Asia by every conceivable metric.  We pay more and get less, even as Moore’s Law makes everything faster, cheaper, smaller and Metcalfe’s Law converges all the networks and protocols and applications into a single glorious buffet of information.

True consumer interests are absent from the picture here.  So even when they win, they often lose.  And what constitutes a win is anyone’s guess.

Consumer activists and consumers themselves would do well to channel their anger and energy away from individual providers and individual problems and focus on the real devil.  Let’s really deregulate the communications industry.  Let’s put the squeeze on the only group that actually makes a profit here–industry lobbyists and lawyers.

By the way, understanding the impact of federal regulation on the infrastructure industry (which no one does, or can) is only part of the puzzle.  Phone, cable, and wireless providers are also subject to local regulations, many of which add to and subtract from the perverse incentives that dictate industry behavior.  We need to get the almost entirely corrupt local agencies out of the regulatory business altogether.

Ready for that hangover helper now?