Category Archives: Digital Life

The Growth of Digital Life in Numbers

As I write in The Laws of Disruption, the pace with which digital life is developing and expanding is easy to measure but impossible to comprehend.  Changes in the ways in which we interact, experience entertainment and other information content, and exist as citizens of a digital realm happen so fast they outstrip our ability to stand back and observe them.

A website called Royal Pingdom has published some interesting metrics for 2009 Internet activity, which might help the quantitatively-minded to get their heads around the information revolution.

Here are a few that stood out for me:

– 90 trillion emails were sent in 2009 by 1.4 billion users.  That’s the good news.  The bad news is that 81% of those emails were spam.  It’s amazing that despite all that wasted traffic (much of it stopped before it ever reaches a user’s mailbox), the network continues to perform at ever-higher levels.

– There are now 1.73 billion Internet users worldwide.  Despite the economic chaos of last year (or perhaps in part because of it), that figure represents an 18% increase over 2008.  Nearly 1 billion of those users are in Asia.

– YouTube now serves up 1 billion videos each day.  It would, I think, greatly aid the debate over copyright and “piracy” to know what percentage of those videos are legally licensed to YouTube.  My guess is that it’s a much higher percentage than most people would expect.

Bret Swanson at Digital Society posted more statistics showing the growth of Internet activity over the course of the entire decade.  My favorite:  in 2000, Google’s index covered 1 billion web pages.  By 2008 the number was up to 1 trillion.

These are not numbers I can visualize.  Can anybody else?
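
The best I can do is convert the annual and daily totals into per-second rates.  Here’s a quick back-of-the-envelope sketch, using nothing but the figures reported above:

```python
# Back-of-the-envelope conversion of the 2009 figures above into
# per-second rates. The inputs are the reported totals; everything
# else is simple arithmetic.

SECONDS_PER_YEAR = 365 * 24 * 3600   # ~31.5 million
SECONDS_PER_DAY = 24 * 3600          # 86,400

emails_per_year = 90e12              # 90 trillion emails sent in 2009
spam_fraction = 0.81                 # 81% of those were spam
youtube_videos_per_day = 1e9         # 1 billion videos served daily

print(f"Emails per second: {emails_per_year / SECONDS_PER_YEAR:,.0f}")
print(f"Spam per second:   {emails_per_year * spam_fraction / SECONDS_PER_YEAR:,.0f}")
print(f"Videos per second: {youtube_videos_per_day / SECONDS_PER_DAY:,.0f}")

# Output: roughly 2.9 million emails per second, about 2.3 million of
# them spam, and some 11,600 YouTube videos served every second.
```

Millions of emails every second is still not a number I can picture, but at least it fits on one line.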

The White House’s New Internet Policy, and thoughts on Comcast v. FCC

I published the first of two pieces on CNET today about interesting and even encouraging developments in Washington over Internet policy. (See “New Year, New Policy Push for Universal Broadband”)

In short, I believe that over the past year the Obama administration has come to see Internet products and services as one of the best hopes for economic recovery and continued competitiveness for U.S. businesses.  At least as a matter of policy, this is the first administration to see digital life as a source of competitive advantage.

Tomorrow’s piece concerns the “spectrum crisis” and what the federal government hopes to do to solve it. (The federal government “owns” the radio waves, after all.)

Cut due to the length of the piece was a longer analysis of the arguments a few weeks ago in the U.S. Court of Appeals for the D.C. Circuit in Comcast v. FCC, in which cable provider Comcast challenged a sanction the FCC issued in 2008 for the company’s attempts to limit use by some customers of peer-to-peer applications including BitTorrent.

Depending on how the court rules, the commission’s proposed Net neutrality rules could be dead in the water for lack of authority to regulate in this space.

Here’s the longer version of that section:

The fate of net neutrality may depend on the outcome of an important development that took place during CES, but at the other end of the country: litigation over the one notorious instance of non-neutral behavior that largely reawakened the net neutrality debate in 2008.

Comcast admitted that it had used some fairly clumsy techniques to limit the speed, and sometimes the availability, of peer-to-peer Internet services that allowed users to share very large files, notably using the BitTorrent protocol. In the wake of that revelation, Comcast agreed to change its practices and make them more transparent, and made peace with BitTorrent developers. (The company is also in the process of settling a class-action lawsuit brought by customers affected by the limits.)

The FCC issued a non-monetary sanction against the company, claiming that the techniques violated net neutrality principles which, while not formally enacted by the FCC, nonetheless applied to Comcast. Comcast challenged the sanctions in the U.S. Court of Appeals for the D.C. Circuit, which hears all challenges to FCC rulings.

On Jan. 8th, hours before Chairman Genachowski took the stage at CES, the D.C. Circuit heard oral arguments in Comcast’s appeal. As was widely reported, the three-judge panel considering the appeal questioned government lawyers severely.

Some statements from the judges suggested they were skeptical at best about the Commission’s authority to sanction Comcast. Chief Judge Sentelle, who sat on the panel, complained about the Commission’s view of its own authority. “You can’t get an unbridled, roving commission to go about doing good,” he was reported to have said. Judge Randolph, another panelist, complained that the lawyers for the FCC could not “identify a specific statute” that authorized the sanction.

The case is certainly important, though too much was read into the tone of the arguments by a number of mainstream media sources. As a former circuit court law clerk, I can attest to the wisdom of FCC Commissioner McDowell’s warning at CES not to draw conclusions about the outcome of a case from comments, or even the appearance of hostility, by appellate judges at oral argument. (McDowell, notably, was one of the Commissioners who dissented from the Comcast sanction, on the grounds that the FCC did not have the authority to issue it.)

Wired, for example, ran the extreme headline: “Court to FCC: You Don’t Have Power to Enforce Net Neutrality.”

The accompanying article was a little less hyperbolic, but still misleading: “A federal appeals court gave notice Friday it likely would reject the Federal Communications Commission’s authority to sanction Comcast for throttling peer-to-peer applications.” That was an interpretation of the arguments echoed in many publications.

There is, however, no way to predict from the oral arguments how appellate judges are “likely” to rule. They may have just been in a bad mood, or annoyed with the government’s lawyers for reasons unrelated to the merits of the appeal. Unlike political office holders, federal judges are appointed for life, and do not measure their questions or comments at oral arguments to signal how they are likely to rule in a case.

Indeed, the judges may have objected not so much to the conclusion urged by the FCC as much as the line of reasoning the Commission followed in its briefs. The Commission may have relied too much on its general authority to regulate communications companies, for example, rather than citing more specific regulatory powers that Congress and the courts have already recognized.

Still, the outcome in this case could have serious repercussions for the proposed net neutrality rules. Why? Most of the FCC’s rulemaking authority comes from longstanding regulatory power over telephone companies, classified as “common carriers” who must follow nondiscriminatory practices overseen closely by the Commission.

But cable Internet providers are not common carriers; indeed, the FCC itself argued that point successfully in a 2005 Supreme Court case (the Brand X decision). The FCC later determined that traditional phone companies, when offering broadband Internet service, were also not subject to common carrier regulations.

So if broadband Internet services are not subject to common carrier rules, where does the FCC get the authority to propose net neutrality rules in the first place?

The Commission argued both in the Comcast case and in its proposed net neutrality rules that its jurisdiction comes from “traditional ancillary authority,” that is, from authority that is implicit in the governing statute that defines the FCC’s power. The skepticism expressed at oral argument seemed to be focused on the argument that ancillary jurisdiction was all the FCC needed to sanction Comcast’s behavior.

The D.C. Circuit could rule that such authority does not extend so far as to allow the FCC to create or enforce net neutrality. It could also rule more narrowly, and reject the sanctions only on the grounds that they were not issued pursuant to a formal rulemaking–that is, the kind of rules now being considered. Or, of course, the court could agree with the Commission that ancillary authority is sufficient both to issue the sanctions and to enact formal rules.

The proposed rules are not directly at issue in the Comcast case. Even if the court rules that ancillary jurisdiction is insufficient to support net neutrality rules (as, among others, the Electronic Frontier Foundation has argued; see “FCC Perils and Promise”), the FCC could technically still go ahead with its rulemaking. But the court would surely, and quickly, hold that the new rules exceed the agency’s power.

Rather than pass rules that would be dead on arrival, the Commission would likely head back to Congress for explicit authority to define and enforce net neutrality regulations. Since 2007, several bills that would grant precisely that power (and, indeed, mandate that the FCC use it) have been floating around committees, but none has been reported out. There are also bills explicitly forbidding the FCC from enacting net neutrality rules, likewise sitting in committee.

Net Neutrality Doublespeak: Deep Packet Inspection is a Bad Idea, Except When it Isn’t

An interesting tempest in a teapot has emerged this week following some overblown rhetoric by and in response to celebrity causemeister Bono. There’s a deeper lesson to the incident, however, one with important implications for the net neutrality debate. (More on that in a moment.)

In a New York Times op-ed column on Jan. 2, 2010, Bono provided “10 ideas that might make the next 10 years more interesting, healthy or civil.” These include the salvation of the entertainment industry from the clutches of peer-to-peer file sharers, who are just a few turns of Moore’s Law away from being able to “download an entire season of ‘24’ in 24 seconds.”

“Many will expect to get it for free,” Bono laments, apparently unaware that in the U.S. we don’t have a mandatory television license fee for broadcast content as they do in the U.K. (U.K. residents pay a £142.50-a-year tax, the principal source of income for the BBC.) So long as you watch 24 when Fox broadcasts it, you will expect to, and indeed will, get it “for free,” without breaking any laws whatsoever. Hooray for America.

Bono’s proposal to solve this problem, also factually challenged, is to force ISPs to clean up the illegal sharing of copyrighted content:

We’re the post office, they tell us; who knows what’s in the brown-paper packages? But we know from America’s noble effort to stop child pornography, not to mention China’s ignoble effort to suppress online dissent, that it’s perfectly possible to track content. Perhaps movie moguls will succeed where musicians and their moguls have failed so far, and rally America to defend the most creative economy in the world….

As several commentators have already pointed out, America’s “noble effort to stop child pornography” has almost nothing to do with looking inside the broken-up pieces of Internet transactions, a practice known as “deep packet inspection.”  Indeed, as I write in Law One (“Convergence”) of The Laws of Disruption, most federal and state efforts at solving that scourge, at least in the online world, have been so broad and clumsy that they instantly fail First Amendment scrutiny. (Another feature of American law that Bono may not fully appreciate.) Congress has tried three times to pass laws on the subject; two were declared unconstitutional, and the third was reined in to the point of being almost meaningless.
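
To make the term concrete: ordinary routing consults only a packet’s headers (addresses and ports), while deep packet inspection opens the payload itself and matches it against content signatures. Here’s a toy sketch of the difference using the Python scapy library; the flagging rule is a deliberately crude hypothetical, not a description of any ISP’s actual system (though unencrypted BitTorrent handshakes really do contain the string shown):

```python
# Toy illustration of routing vs. deep packet inspection (DPI).
# Requires scapy and the privileges to sniff network traffic.
from scapy.all import sniff, IP, Raw

def routing_view(pkt):
    # Ordinary routing: only the headers are consulted -- where the
    # packet came from and where it is going. The payload is ignored.
    return f"{pkt[IP].src} -> {pkt[IP].dst}"

def dpi_view(pkt):
    # DPI: the payload itself is opened and matched against content
    # signatures. Unencrypted BitTorrent handshakes carry the string
    # below; an encrypted payload would match nothing, which is why
    # critics predict an encryption arms race.
    if pkt.haslayer(Raw) and b"BitTorrent protocol" in pkt[Raw].load:
        return "flagged as peer-to-peer"
    return "no match"

for pkt in sniff(filter="tcp", count=25):
    if pkt.haslayer(IP):
        print(routing_view(pkt), "|", dpi_view(pkt))
```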

State efforts have been even more poorly crafted. I write in the book about Pennsylvania’s Child Sexual Exploitation Unit, formed in 2008 by act of the Pennsylvania legislature. Staffed by three former state troopers, the CSEU “analysts” surfed the web looking for sites they felt contained child porn, then wrote letters demanding that ISPs block access to those sites for all their Pennsylvania customers. (The easiest way for large ISPs, including AOL and Verizon, to do that was simply to block the sites for all of their customers everywhere, period.)

Quite apart from the regulators’ lack of training or standards, the sites that made the list included several hosting sites with hundreds or thousands of private websites that had nothing to do with pornography of any kind. By the time the courts put the CSEU out of business a year later, Pennsylvania had banned 1.19 million websites, only 376 of which actually contained content the troopers deemed offensive. (An official geographic survey of Spain and the International Philatelic Society made the banned list.) There was also no mechanism for getting a web address off the list, even if the site’s ownership or contents changed.

But that’s a mere quibble, as is the fact that Chinese censorship of content, hardly a “best practice,” apparently requires some 30,000 Internet police and perhaps millions of servers—and even then, the surveillance appears to happen on the back end, after the packets have already been reassembled. (Not surprisingly, China hasn’t exactly published its processes in the Harvard Business Review.)

Regular readers of this blog will be expecting the twist ending, and here it comes. I’m less interested in the misinformed opinions of a musician and humanitarian than in the responses they drew from Internet activists. Gigi Sohn of Public Knowledge characterized Bono’s proposal as “mind-bogglingly ignorant” both as to what really caused the fall of the music industry and as to the technology that would be required for ISPs to become the content police on behalf of copyright owners. Packet filtration, Sohn points out, would lead to “blocking lawful content and encouraging an encryption arms race that would allow filesharing to proceed unabated.” And anyway, the real problem here is overprotective IP laws. (I agree.)

Somewhat less hyperbolically, Leslie Harris of the Center for Democracy and Technology (CDT) wrote today on The Huffington Post that ISPs are taking concrete and responsible steps to reduce child pornography that don’t include deep packet inspection, and reiterated Sohn’s point about an encryption arms race.

More interesting, however, is Harris’s point about the danger of requiring ISPs to exert “centralized control over Internet communications.” Harris writes:

In this country, ISPs do not control what their users send to the Internet any more than a phone company controls the topics of someone’s phone call. Does the U.S. really want to move in the direction of the Chinese model of always-on surveillance? Once we begin to break into all Web traffic to search for copyright violations, evaluating content for its “decency” or appropriateness for children, then analyzing each user’s search habits to determine buying habits and government surveillance without lawful process (remember the NSA warrantless wiretapping) will follow close behind.

The U.S. has the most vibrant, free and innovative Internet because we don’t have gatekeepers in the middle of the network.

Well, at least we don’t yet.

As I’ve pointed out before (see, for example, “Zombieland – The Return of Net Neutrality”) my principal concern with net neutrality is not the idea that information should flow freely on the Internet. That’s a fine principle, and central to the success of this largely unregulated, non-proprietary infrastructure.

Rather, I worry about the unfortunate details of implementation. If net neutrality also means that ISPs are forbidden from offering premium or priority routing within the back-end segments of the network they control (that is, the last mile to the consumer), then it will necessarily fall to the government to monitor, audit, and investigate the flow of packets across the network, if only in response to complaints by consumers of real or perceived non-neutral behavior.

Under the rules proposed in the fall, the FCC has said only that it will investigate complaints of non-neutrality on “a case-by-case basis;” under the proposed Internet Freedom Preservation Act, any consumer would have the right to complain directly to the FCC, which would be required to investigate all complaints within 90 days.

How else can the FCC determine whether some packets are being given priority in defiance of neutrality rules without intercepting at least a random subset of those packets and opening them up?
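
To see why, consider what even a minimal audit would involve. Below is a rough sketch; the sampling rate and application signatures are my assumptions for illustration, not anything in the proposed rules. It assumes the auditor reads the DSCP bits of the IP header, where Differentiated Services priority markings conventionally live, and opens payloads to classify the application:

```python
# Rough sketch of sampling-based neutrality auditing. The sampling
# rate and application signatures are hypothetical; requires scapy.
import random
from scapy.all import sniff, IP, Raw

SAMPLE_RATE = 0.01  # audit roughly 1% of observed packets

def audit(pkt):
    if random.random() > SAMPLE_RATE or not pkt.haslayer(IP):
        return
    # Priority markings live in the top six bits of the IP TOS byte
    # (the DSCP field). Reading them means intercepting the packet.
    dscp = pkt[IP].tos >> 2
    # Knowing WHICH traffic was prioritized means opening the payload
    # -- exactly the kind of inspection described above.
    app = "unknown"
    if pkt.haslayer(Raw):
        payload = pkt[Raw].load
        if b"BitTorrent protocol" in payload:
            app = "bittorrent"
        elif payload.startswith((b"GET ", b"POST ", b"HTTP/")):
            app = "http"
    print(f"dscp={dscp} app={app}")

sniff(prn=audit, store=False, count=1000)
```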

Very quickly, the enforcement of net neutrality would lead us into the “model of always-on surveillance,” not by ISPs but, worse, by federal regulators. The opportunities for linking the FCC’s enforcement powers with “government surveillance” will be even more irresistible than they would be if the ISPs were the ones exerting the “centralized control.”

This, of course, is a worst-case scenario, but that is not to say the risk of its becoming reality is particularly low. Indeed, the history of FCC interference with broadcast TV content, a long and sad story that persists to this day, suggests that the worst case is also the most likely.

(On enforcement, Public Knowledge says only that it “supports a Network Neutrality system that can be enforced through a simple complaint process managed by the Federal Communications Commission, where the network operator must bear the burden of demonstrating that any interference with traffic is necessary to support a lawful goal.” Simple for whom? The complainant, not the investigator.)

I agree that the U.S. has the most free and innovative Internet because we don’t have “gatekeepers in the middle of the network.” So why do groups including Public Knowledge and the CDT, who clearly understand the risks of private and–even worse–of public interference with the flow of packets, advocate so strongly in favor of neutrality rules?

Perhaps because, like Bono, they haven’t thought through the implications of their rhetoric.

Two Smoking Guns and a Cold Case


The copyright war just isn’t dramatic enough to warrant a good novel, let alone a big movie deal.

Consider a few recent stories from the on-going battle between content owners and consumers:

  • In October, sources reported to CNET’s Greg Sandoval that part of the document exchange between Viacom and YouTube in the on-going $1.1 billion infringement case revealed evidence that YouTube management knew about rampant uploading of copyrighted film and TV clips.  Worse, the source indicated that there was also evidence that YouTube employees were among those uploading unauthorized material.  (A YouTube spokesman responded that Sandoval’s characterizations were “wrong, misleading, or lack important context.”)

  • In a separate case, federal authorities arrested Gilberto Sanchez, a New York man accused of uploading a stolen, unfinished copy of the film “X-Men Origins: Wolverine” to the Internet weeks before its theatrical release, where it was ultimately viewed millions of times.

Sounds pretty exciting, doesn’t it?

Wrong.

In the Wolverine case, Sandoval reported a few days later that the man accused of uploading the stolen film, Gilberto Sanchez, had purchased a DVD “from a Korean guy on the street for five bucks.  Then I uploaded it.”  In other words, Sanchez apparently has nothing to do with the real crime—that is, whoever inside the industry managed to steal the pre-release version of the film and put it in circulation in the first place.  The real case may have gone cold.

And Viacom’s potential smoking gun was greatly undermined yesterday when it was revealed the company asked the judge in the case for permission to remove 250 of its claims of infringement.  Why?  Well, at least 100 of the removed claims involved clips that had been intentionally uploaded to YouTube by Viacom employees.  It turns out that Viacom and other content owners regularly used and continue to use YouTube to promote their programming by uploading clips and hoping they go viral.  (According to a YouTube lawyer who attended last month’s Supernova conference in San Francisco, those uploads are often done anonymously to mask the fact that the clip is a marketing effort.)

Make no mistake.  Sanchez’s uploading of the bootleg DVD of the movie still constitutes a copyright infringement.  And just because the owner of a copyright, in Viacom’s case, decides to license some of its content without receiving any royalties doesn’t in any way negate its right to pursue third parties who do the same thing without permission.

But the two stories underline that the problems of copyright in the digital age are much more complicated than the battle of good vs. evil Hollywood portrays.  Content owners continue to hide behind the rhetoric of “pirates” and “stealing,” arguing that every file share or on-line viewing, no matter how poor the quality, represents precisely one less customer paying full retail price.

The reality, clearly, is something very different.  The Wolverine movie viewed 4.1 million times was an unfinished copy, missing special effects and other elements.  (I haven’t seen either version.)  It seems likely that some, perhaps most, of those who watched the unfinished movie before it was released later saw the finished film in an authorized format.  It also seems likely that some who saw the unfinished movie wouldn’t have seen the real thing in any case, or were moved to see the real thing precisely by what they saw in the pre-release.  (Think of it as a full-length trailer.)

Likewise, many who watched unauthorized YouTube clips of Viacom content may have already seen authorized versions of the same content and wanted to see it again without fussing with their DVRs or waiting for reruns, or may have been inspired by seeing a clip to start watching the program regularly.  Clearly Viacom’s marketing department thinks so, or they wouldn’t have put up at least 100,000 clips themselves.

I say “seems likely” because there’s no data to support these claims, or at best very incomplete data.  But there’s equally poor data to support the extreme view of copyright damages—that every unauthorized view is cash money out of the pocket of the content owner.

Copyright infringement in the digital age, in other words, isn’t about piracy and theft.  These cases are really about control over markets, many of which are new and emerging.  Their dynamics are still mysterious.  (Why does Viacom believe that an anonymous authorized post of a clip generates better buzz, for example, than an identified authorized post?)

Content owners shouldn’t be allowed to pursue damages—as courts often allow today and as Viacom is claiming in the YouTube case—on the theory that unauthorized uses are always destructive and always completely so.  That is, that unauthorized uses never help sales and indeed translate into fully-marked-up losses.

Rather than thieves and pirates, we ought to be talking about productive and destructive uses of content.  A productive use, as I write in The Laws of Disruption, is one that adds more value to the underlying information than it takes away.  A destructive use has the opposite effect.

Media companies needn’t be so apocalyptic in their rhetoric if not their strategy when it comes to unauthorized uses, especially those (like clips and short excerpts) that inherently promote their products.

Giving up some measure of control is hard for these companies, because they believe in a binary world in which one either controls one’s content or loses everything.  I’m not sure that binary world ever existed, but in digital life it clearly doesn’t.

Indeed, in the Olden Days, the law used to recognize that copyright holders couldn’t always be trusted to license content to maximize their own best interests.  The law used to allow for short excerpts, quotes, and clips to be reproduced without permission, in the form of reviews, commentaries and parodies.

The old law was called “fair use.”  It made a lot of sense, but content owners have managed to use the courts and Congress to rob it of any real meaning.

We should really think about putting it back.

Comcast: The New Forces at Work

My op-ed today in The Hill (see “The Winter of Our Content”) argues against those who want to derail the merger of Comcast and NBC Universal.  I don’t know enough to say whether the deal makes good business sense—that’s for the companies’ shareholders to decide in any case.  But I do know that every media or communications merger of the last twenty years has been resisted for the same reason—that the combined entity will both have and exercise excessive market power to the detriment of consumers.

That argument has turned out to be wrong every time.  It will be here as well.

Under the terms of the agreement, Comcast will get a 51% interest in NBC Universal, including the NBC broadcast network, Universal Studios, and several valuable cable channels such as MSNBC and Bravo.  Comcast already owns E!, the Golf Channel, and other content properties, as well as being a leading provider of cable TV, Internet access and, more recently, phone service.

A wide range of public advocacy groups have already objected that the new Comcast will be too powerful, and will have “every incentive” to keep programming it controls off the Internet, including new services such as Hulu, which is 33% owned by NBC.  Consumer groups also fear that Comcast will dismantle NBC’s broadcast network, all in the service of pushing American consumers onto paid cable TV subscriptions.

Why Comcast would want to use its leverage in the interest of only one part of its business I don’t understand.  But even if that was the goal, I very much doubt that goal would be achievable even with the new assets it will acquire.

As is typical in industries undergoing wrenching and dramatic consolidation and reallocation of assets, the urge to merge is a function of three principal forces, first introduced in my earlier book, Unleashing the Killer App. These forces—globalization, digitization, and deregulation—are themselves a function of the profound technological innovation that all of us know as consumers of devices, services, and products that didn’t exist just a few years ago.

There are several technologies involved here, including standards (the Internet protocols as well as compression and data structures for various media), software (the Web et al), hardware (faster-cheaper-smaller everything) and new forms of bit transportation, including cable, satellite, and fiber.  It’s the combination of these that makes possible the dramatic ascent of new applications—everything from Napster to YouTube to the iPhone to TiVo.  It’s why there are now hundreds if not thousands of channels of available programming, increasingly in high-definition and perhaps soon in 3D and other innovations.

With the advance of digital technology, driven by Moore’s Law and Metcalfe’s Law, all content is moving at accelerating speeds from analog to digital forms of creation, storage, and transport.  (This includes media content as well as user content—email, phone calls, home movies and photos.)  See my earlier post, “Hollywood:  We have met the enemy…”

That fundamental shift has made it easier to create global markets for content and in turn has put pressure on regulators to rethink what had been highly parochial approaches to protecting the diversity of content.  Until very recently, that diversity was represented in the U.S. by a whopping three choices of television programming—that of ABC, CBS, and NBC.

As globalization and digitization advance, the pressure to deregulate increases.  Caps and other artificial limitations of media ownership have been falling away over the last twenty years.  Clear rules separating who can transport data versus voice versus video make less and less sense, and have been removed.

Each of these changes has been resisted by consumer groups.  One long-forgotten change to the media industry occurred even before the rise of digital life, in the stone age of 1995.  That was the year the FCC eliminated the “financial interest and syndication” rules, or finsyn, which had been adopted in 1970 to limit the power of the three broadcast networks.  (See Capital Cities v. FCC, 29 F.3d 309 (7th Cir. 1994).)

Finsyn, among other controls, limited the ownership interests the networks could hold in prime-time programming, and prohibited them from syndicating the programming they did own directly.  Once a program, say “Gilligan’s Island,” finished its prime-time network run, the networks could syndicate it only through third-party syndicators.  The goal was to protect non-affiliated stations (mostly on the UHF band), which might not get a chance to buy syndicated programs at all if the networks kept control and syndicated only to their own affiliates.

Cable TV, which erased UHF’s signal disadvantage, along with the rise of Fox as a fourth network and of independent producers who self-syndicated (particularly Paramount, which produced several made-for-syndication Star Trek series), made clear that the finsyn rules were no longer necessary.  The independent stations and consumer advocates fought to retain them anyway, and lost.

Of course we now have more diversity of programming than anyone in 1995 would have ever imagined possible.  Not because finsyn was repealed, but in spite of that fact.  Technology, left alone, achieved multiples of whatever metric regulators established for their efforts.

Those who object to the reallocation of industry assets see these deals entirely as efforts by vested interests to resist change inspired by what I called “the new forces.”  In part these deals are surely trying to hold back the flood.  They may even be motivated by the belief that consolidation translates to control.

But it never works out that way.  Consumers always get what they want, usually sooner rather than later, and regardless of what entrenched industry providers may or may not want.  Artificial limits on who can do what do more to hold back the technologically inevitable than they do to protect consumers.

Resistance here is not only futile, it’s counter-productive.

Protecting consumers from Moore's Law: CNET


I write today on CNET News.com (see “FTC’s new strategy:  kick ’em when they’re down”) that the FTC’s decision yesterday to attack Intel seems oddly timed.

Regular readers of this blog will recall that only a month ago, I wrote that Intel’s settlement of long-standing disputes with rival AMD (see “The Intel/AMD Settlement:  Watch What Happens”) was likely to mean the end of government-sponsored litigation against Intel, or at least a toning down of the rhetoric.  I was, clearly, wrong.

It’s hard to know the real background here, but piecing together what has been reported, it appears that the FTC and Intel were close to resolving issues related to how the company sells CPU chips for personal computers when, perhaps at the urging of Nvidia and other graphics processing unit makers, the FTC began looking at the GPU market as well.  Intel flinched, the FTC got mad, and the agency filed a complaint that recites all over again the issues that appear in most of the other litigation, plus the new GPU complaints.

Hell hath no fury, it seems, like a regulator scorned.

Aside from the addition of the GPU complaints, there are several important differences between the FTC’s action and the rest of the pending or already-completed litigation.  Most disturbing is the proposed remedy.  Instead of money damages and fines, the FTC proposes, should it make its case, to dramatically redesign the way Intel–and therefore the rest of the semiconductor industry–does business.  Some of the relief the agency is seeking is, truly, draconian.  Intel would essentially be run by an outside monitor, and would need to clear most transactions and even advertising with the FTC in advance.

The FTC is charged with protecting consumers from fraudulent practices–false advertising, for example, or inadequate cigarette warnings, or misleading terms in credit card applications and the like.  It’s hard to see what expertise the agency has to offer in the chip market, which affects consumers only indirectly.  The likelihood that the agency’s actions will help consumers seems very, very low.

It’s also hard to see what the harm to consumers (harm to competitors aside) can be.  As I write in The Laws of Disruption, the continued operation of Moore’s Law means that computing power gets faster, cheaper and smaller all the time–indeed, on a predictable schedule.  The PS3 that now sells for $299 is the rough equivalent of enough early-era computers to fill the state of Washington.  Today’s cell phones have more processing power than yesterday’s supercomputers.  And so on.
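
The compounding behind those comparisons is easy to make explicit.  A quick sketch, using the conventional two-year doubling period (the actual cadence has varied over the decades, so treat the outputs as order-of-magnitude figures):

```python
# Compounding under Moore's Law: a doubling roughly every two years.
# The two-year period is the conventional figure; the real cadence
# has historically ranged between about 18 and 24 months.

DOUBLING_PERIOD_YEARS = 2.0

def moores_law_factor(years: float) -> float:
    """Performance/density multiple after the given number of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (2, 10, 20, 40):
    print(f"After {years:2d} years: {moores_law_factor(years):,.0f}x")

# After  2 years: 2x
# After 10 years: 32x
# After 20 years: 1,024x
# After 40 years: 1,048,576x
```

A million-fold improvement over four decades is why yesterday’s supercomputer fits in today’s pocket.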

Well, the FTC replies, maybe if Intel didn’t have a monopoly on PC CPUs those prices would fall even faster.  Maybe, though I doubt it.  In any case, doesn’t the agency have bigger problems and more broken industries to mess with?