Category Archives: Information Economics

Rescuing Mobile Broadband from Washington's Deadly Embrace

 (Adapted from Bloomberg BNA Daily Report for Executives, May 16th, 2012.)

Two years ago, the Federal Communications Commission’s National Broadband Plan raised alarms about the future of mobile broadband. Given unprecedented increases in consumer demand for new devices and new services, the agency said, network operators would need far more radio frequency assigned to them, and soon. Without additional spectrum, the report noted ominously, mobile networks could grind to a halt, hitting a wall as soon as 2015.

That’s one reason President Obama used last year’s State of the Union address to renew calls for the FCC and the National Telecommunications and Information Administration (NTIA) to take bold action, and to do so quickly. The White House, after all, had set an ambitious goal of making mobile broadband available to 98 percent of all Americans by 2016. To support that objective, the president told the agencies to quickly identify an additional 500 MHz of spectrum for mobile networks.

By auctioning that spectrum to network operators, the president noted, the deficit could be reduced by nearly $10 billion. That way, not only would the Internet economy be accelerated, but taxpayers would actually save money in the process.

A good plan. So how is it working out?

Unfortunately, the short answer is:  Not well.  Speaking this week at the annual meeting of the mobile trade group CTIA, FCC Chairman Julius Genachowski had to acknowledge the sad truth:  “the overall amount of spectrum available has not changed, except for steps we’re taking to add new spectrum on the market.”

The tortured grammar (how can “steps we’re taking to add new spectrum” constitute an exception to the statement that the amount of available spectrum “has not changed”?) belies the reality here—all the FCC Chairman can do is promise more spectrum sometime in the vague future.  For now, the FCC and the NTIA have put almost no new spectrum into actual use.  Instead, the two agencies have piled up a depressing list of delays, scandals, and wasted opportunities. Consider just a few:

– NTIA’s long-overdue report on freeing up government spectrum identified nearly 100 MHz of frequencies that could be reallocated for mobile broadband. But the 20 agencies involved in the study demanded 10 years and nearly $18 billion to vacate the spectrum—and insisted on moving to frequencies that are already assigned to other public or private license holders. An available 20 MHz of unassigned frequency, left over from the 2009 conversion to digital TV, was actually added to the government’s supply when it was set aside this year for a dedicated public safety network.

– After years of wrangling with Congress, the FCC finally won limited authority to hold “voluntary incentive auctions” for spectrum currently licensed to over-the-air television broadcasters. But those auctions will take years to complete, and a decided lack of enthusiasm among broadcasters doesn’t bode well for the outcome.  As for reducing the deficit, the agency has reserved the right to disqualify bidders it believes already hold too much spectrum—in the name of stimulating competition, even without any measurable signs of market failure. (Voice, data, and text prices continue to decline, according to the FCC’s own data.)

– LightSquared’s efforts to reallocate satellite spectrum for use in a competitive new mobile broadband network were crippled—perhaps fatally—by concerns raised by the Department of Defense and others over potential interference with some global positioning system (GPS) devices.  Initial permission to proceed was swiftly revoked—after the company had invested billions.  The FCC’s procedural blunders in the LightSquared case ignited a political scandal that continues to distract the agency. A similar effort by Dish Networks is now being put through the full set of administrative hurdles, delayed at least until after the election.

– Transactions in the secondary spectrum markets—long the only real source of supply for mobile network operators—have received an increasingly frosty reception. Last year, AT&T’s planned merger with T-Mobile USA was scuttled on the basis of dubious antitrust concerns the FCC backed up with data that was clumsily rigged by agency staff.  Now, the agency has expanded its review of Verizon’s efforts to buy spectrum from a consortium of cable companies—spectrum that currently isn’t being used for anything.

– After the FCC mandated data roaming agreements even for carriers who hold spectrum in the same markets, Sprint announced it would stop serving customers with its own network in two metropolitan areas, piggybacking instead on AT&T’s brand-new LTE facilities. Sprint’s move underscores concerns that mandatory roaming will reduce incentives for carriers to invest in infrastructure. According to the FCC, mobile industry investments have reached nearly 15 percent of total revenue in recent years. Of the leading providers, only Sprint decreased its investments during the recession.

Not an impressive showing, to say the least.  Meanwhile, in the real world, demand for mobile broadband continues to mushroom. Network usage has increased as much as 8,000% since 2007, when Apple’s iPhone first hit the market. It was followed by an explosion of new devices, operating systems, and software apps from a cottage industry of developers large and small. This remarkable ecosystem is driving lightning-fast adoption of mobile services, especially bandwidth-intensive video apps.

The mobile broadband ecosystem is one of the few bright spots in the sour economy, creating jobs and generating tax revenues. Makers of tablet computers, for example, expect to sell over 100 million units this year alone. Tablet users, by the way, already rely on the wildly popular devices for 15 percent of their TV viewing, raising the demand for high-bandwidth video services on existing mobile broadband networks.

Spectrum is the principal fuel of these fast-growing mobile applications. So FCC Chairman Julius Genachowski is right to repeatedly emphasize the catastrophic consequences of an imminent “spectrum crunch.”  The FCC is leading the chorus of doomsayers who believe that without more spectrum—and soon—our mobile revolution will never reach its full economic, educational, and social potential.

But the government has done nothing to head off that disaster. Instead, the FCC, the NTIA, and the Obama administration continue to make policy choices that do little to get more spectrum into the system. If anything, we’re moving backwards.

Many of these decisions appear to be driven by short-term political imperatives, overriding the worthy goal of making mobile broadband available to all Americans as quickly as possible. The AT&T/T-Mobile deal, for example, was killed simply because the FCC didn’t like the idea of taking even a failing carrier out of the competitive equation. Yet as part of the deal, AT&T had committed to deploying mobile broadband to 95 percent of all Americans—nearly meeting the president’s goal in a single stroke.

This is nothing new. The FCC has a very long and very messy history of using its spectrum management powers to shape emerging markets, and to pick winners and losers among new technologies, applications, and providers.  Its guiding principle for nearly 100 years has been the so-called “public interest” standard—an undefined and highly malleable policy tool the FCC employs like a bludgeon.

The era of micromanaging the airwaves by federal fiat must now end once and for all. For the first time in a century of federal stewardship, there is almost no inventory of usable spectrum. It has all been allocated to some 50,000 public and private license holders, each the one-time favorite of the FCC. Our spectrum frontier has closed.  And it wouldn’t have closed so soon had the FCC not remained so determined to manage a 21st-century resource as if it were still the 19th century.

Technology may come to our rescue, at least in part. Hardware and software for sharing spectrum, switching frequencies, and exploiting the technical properties of different bands continue to be part of the mobile industry’s innovation agenda. But it is unlikely these developments will be enough to keep spectrum supply even slightly ahead of unbridled consumer demand. Many of these technologies, in any case, still require FCC approval to be deployed. That means even more delays.

Saving the mobile ecosystem—and making way for the next generation of mobile innovation—demands a bold new strategy. For starters, it is time to stage an intervention for federal agencies hoarding spectrum. Private licensees who no longer need the spectrum they have must be able to sell their rights quickly in a working market, and be prodded when needed to do so. Buyers need the freedom to repurpose spectrum to new uses.

Also, we need to increase incentives for network operators to continue investing in better and more efficient infrastructure, not throw cold water on them in the name of a vague and largely undefined public interest.  The number of competitors isn’t what matters.  It’s the ability of consumers to get what they want at prices that, at least up until now, continue to decline.

In short, we need to take the FCC out of the middle of every transaction and every innovation, where it slows Silicon Valley-paced markets down to Washington speed.

With the appetite of mobile consumers growing more voracious, it is long past time for Congress to take a cold, sober look at our obsolete system for spectrum management and the antiquated agency that can’t stop fussing over it. We need a new system, if not a new FCC. That’s the only way to keep the mobile frontier booming, let alone meet the admirable goal of providing a homestead there for every American.

The Closing of the Spectrum Frontier

On Fierce Mobile IT, I’ve posted a detailed analysis of the NTIA’s recent report on government spectrum holdings in the 1755-1850 MHz. range and the possibility of freeing up some or all of it for mobile broadband users.

The report follows from a 2010 White House directive issued shortly after the FCC’s National Broadband Plan was published, in which the FCC raised the alarm of an imminent “spectrum crunch” for mobile users.

By the FCC’s estimates, mobile broadband will need an additional 300 MHz. of spectrum by 2015 and 500 MHz. by 2020, in order to satisfy increases in demand that have only amped up since the report was issued.  So far, only a small amount of additional spectrum has been allocated.  Increasingly, the FCC appears rudderless in efforts to supply the rest, and to do so in time.

It’s not entirely their fault.  At the core of the problem, the FCC is simply not constituted to resolve this increasingly urgent crisis.  That’s because, as I write in the article, the management of radio frequencies has entered new and uncharted territory.

For the first time since the FCC and its predecessor agencies began licensing spectrum nearly 100 years ago, there is no unassigned spectrum available, or at least none of which current technology can make effective use.

The spectrum frontier is now closed.  But the FCC, as created by Congress, is an agency that only functions at all on the frontier.

So it’s worth remembering what happened a hundred years earlier, when a young historian named Frederick Jackson Turner showed up at the 1893 annual meeting of the American Historical Association to present his paper on “The Significance of the Frontier in American History.”

The meeting took place that year on the grounds of the World’s Columbian Exposition in Chicago.  The weather was unspeakably hot, and Turner’s talk was poorly attended.  (The President of the AHA, Henry Adams, was in attendance but appears not to have heard Turner’s talk or ever to have read the paper—he was meditating in the Hall of Turbines, as he wrote in his autobiography, “The Education of Henry Adams,” having a nervous breakdown.)   But the paper has had an outsized and long-lasting impact, launching the field of western or frontier history.

Turner’s thesis was simple and unassailable.  Citing census data that showed there was no longer a recognizable line of American territory beyond which there was no settlement, Turner declared that by 1890 the frontier had “closed.”  The era of seemingly endless supplies of readily-available cheap land, dispensed for free or for nominal cost by the federal government, had come to an end.

For Turner, the history of the west was the history of the American experience.  And the defining feature of American life—shaping its laws, customs, culture and economy–had disappeared.  A new phase, with new rules, was beginning.

 

The FCC Only Functions, When it Functions at All, on the Frontier

Our problem, at least, is equally easy to describe.  The FCC, as created by Congress, is an agency that only functions, when it functions at all, on the frontier.

All the talk of “spectrum crunch” boils down to a simple but devastating fact:  it’s no longer possible to add capacity to existing mobile networks by assigning them unused ranges of radio frequencies.  While technology continues to expand the definition of “usable” frequencies, demand for mobile broadband is increasing faster than our ability to create new supply.

We need more spectrum.  And the only way to put more spectrum to use for the insatiable demands of mobile consumers is to reallocate spectrum that has already been licensed to someone else.

In the American west, reallocation of land was easy.  Land grants were given with full legal title, and holders were under no lasting obligation to use their land for any specific purpose or in any particular way.

The various acts of Congress that authorized the grants were intended to foster important social values—populating the frontier, developing agriculture, compensating freed slaves, building the railroads.  But those intentions were never translated into the kind of limited estates that plagued modern Europe after the feudal age came to an end.  (For a good example of the mischief a conditional estate can cause hundreds of years later, watch “Downton Abbey.”  Watch it even if you don’t want to see an example of inflexible estate law.)

Speculators sold to farmers, farmers to ranchers, ranchers to railroads and miners and oil drillers, and from there to developers of towns and other permanent settlements.  The market established the transfer price, and the government stood behind the change of title and its enforcement, where necessary.  Which was rarely.

So the closing of the western frontier, while it changed the nature of settlement in the American west, never threatened to bring future development to a screeching halt.

 

Reallocation Options are Few and Far Between

Unfortunately, spectrum licensing has never followed a property model, even though one was first proposed by Ronald Coase as early as 1959.  Under the FCC’s command-and-control model, spectrum assignments have historically been made to foster new technologies or new applications the FCC deems likely to advance national interests.  Spectrum has been licensed, usually at no or nominal cost to the licensee, for particular uses, with special (often unrelated) conditions attached.

In theory, of course, the FCC could begin revoking the licenses of public and private users who aren’t using the spectrum they already have, or who aren’t using it effectively or, to use the legal term of art, “in the public interest.”  Legally and politically, however, revoking (or even refusing to renew) licenses is a non-starter.

Consequently, the most disastrous side-effect of the “public interest” approach to licensing has been that when old technologies grow obsolete, there is no efficient way to reclaim the spectrum for new or more valuable uses.  The FCC must by law approve any transfer of an existing license on the secondary market, slowing the process at best and creating an opportunity to introduce new criteria and new conditions for the transfer at worst.

Even when the agency approves a transfer, the limitations on use and the existing conditions of the original license apply in full force to the new user.  That means that specific ranges of spectrum more-or-less arbitrarily set aside for a particular application remain forever set aside for that application, unless and until the FCC undertakes a rulemaking to reassign them.

That also takes time and effort, and offers the chance for new regulatory mischief.  (Only since 1999 has the FCC had the power, under limited circumstances, to grant flexible use licenses.  The power cannot be applied retroactively to existing licenses.)

With the spectrum frontier closed, mobile broadband providers must find additional capacity from existing license holders.  But because of the use restrictions and conditions, the universe of potential acquisition targets immediately and drastically shrinks to those making similar use of their licenses–that is, to current competitors.

So it’s no surprise that since 2005, as mobile use has exploded with the advent of 2G, 3G, and now 4G networks, the FCC has been called upon to approve over a dozen significant transfers within the mobile industry, including Sprint/Nextel, Verizon/Alltel, and Sprint Nextel/Clearwire.  Indeed, expanding capacity through merger seemed to be the agency’s preferred solution, and the one that required the least amount of time and effort.

But with the rejection last year of AT&T’s proposed merger with T-Mobile USA, the FCC has signaled that it no longer sees such transactions as a preferred or perhaps even potential avenue for acquiring additional capacity.  At least not for AT&T—and perhaps not for Verizon either, which is currently fighting to acquire unused spectrum held by a consortium of cable providers.

What other avenues are left?  With the approval of “voluntary incentive auction” legislation earlier this year, the FCC can now begin the process of gently coercing over-the-air television broadcasters to give up some or all of their licensed capacity in exchange for a share of the proceeds of any auctions the agency conducts to repurpose that spectrum for mobile broadband.

(Broadcast television seems the obvious place to start freeing up spectrum.  With the transition to digital TV, every station was given a 6 MHz. allocation in the 700 MHz. range.  But over-the-air viewership has collapsed to as few as 10% of homes in favor of cable and fiber systems, which today reach nearly every home in the country and offer far greater selection and services.  Many local broadcasters remain in business largely through the regulatory arbitrage of the FCC’s retransmission consent and must-carry rules.)

Those auctions will likely take years to complete, however, and the agency and Congress have already fallen out over how and how much the agency can “shape” the outcomes of these future auctions by disqualifying bidders who the agency feels already have too high a concentration of existing licenses.

And it’s far from clear that the broadcasters will be in any hurry to sign up, or that enough of them will to make the auctions worthwhile.  Participation is, at least so far, entirely voluntary.  Just getting Congress to agree to give the FCC even limited new auction authority took years.

There’s also the possibility of reassigning other kinds of spectrum to mobile use—increasing the pool of usable spectrum allocated to mobile, in other words.  That option, however, has also failed to produce results.  For example, the FCC initially gave start-up LightSquared a waiver that would allow it to repurpose unused spectrum allocated for satellite use for a new satellite and terrestrial-based LTE network.

But after concerns were raised by the Department of Defense and the GPS device industry about possible interference, the waiver was revoked and the company now stands on the brink of bankruptcy.  (Allegations of political favoritism in the granting of the waiver are holding up the nominations of two FCC commissioners.)

So when Dish Networks recently asked for a similar waiver, the agency traded speed and flexibility for the relative safety of full process.  The FCC has now published a formal Notice of Proposed Rulemaking to evaluate the request.  If the rulemaking is approved, Dish will be able to repurpose satellite spectrum for a terrestrial mobile broadband network (possibly a wholesale network, rather than a new competitor).  That, of course, will take time.  And given enough time, anything can and will happen.

Finally, there’s the potential to free up unused or underutilized spectrum currently licensed to the federal government, one of the largest holders of usable spectrum and a notoriously poor manager of this valuable resource.

That was the subject of the NTIA’s recent report, which seemed to suggest that the high-priority 1755-1850 MHz. range (internationally targeted for mobile users) could be cleared of government users within ten years—some in five years, and in some cases, with possible sharing of public and private use during a transitional phase.

But as I point out in the article, the details behind that encouraging headline suggest rather that some if not all of the twenty agencies that currently hold some 1,300 assignments in this band are in no hurry to vacate it.  Having paid nothing for their allocations, and with no option to have future auction proceeds earmarked to their agencies, the feds have little incentive to do so.  (NTIA can’t make them do much of anything.)  The offer to share may in fact be a stalling tactic to ensure they never actually have to vacate the frequencies.

 

What’s Left?  Perhaps Nothing, at Least as Far as the FCC is Concerned

The color-coded map of current assignments is so complicated it can’t actually be read at all except on very large screens.  There are currently some 50,000 active licenses.  The agency still doesn’t even have a working inventory of them.  This is the legacy of the FCC’s command-and-control approach to spectrum allocation over nearly 100 years.

Almost everyone agrees that even with advances in hardware and software that make spectrum usage and sharing more efficient, large quantities of additional spectrum must be allocated soon if we want to keep the mobile ecosystem healthy and the mobile revolution in full and glorious swing.

With the closing of the spectrum frontier, the easy solutions have all been extinguished.  And the century-long licensing regime, which tolerated tremendous inefficiency and waste when spectrum was cheap, has left the FCC, the NTIA, the mobile industry and consumers dangerously hamstrung in finding alternative methods to meet demand.  Existing spectrum, by and large, can’t be repurposed even when everyone involved wants to do so and where the market would easily catalyze mutually-beneficial transactions.

Given the law as it stands and the FCC’s current policy choices, carriers can’t get spectrum from outside the mobile industry, nor can they get it from their competitors.  They can’t get it from the government, and may not be allowed to participate in future auctions of spectrum agonizingly pried loose from broadcasters who aren’t using what they have cost-effectively—assuming those auctions ever take place.  They also can’t put up more towers and antennae to make better use of what they have, thanks to the foot-dragging and NIMBY policies of local zoning authorities.

And even when network operators do get more usable spectrum, it comes burdened with inflexible use limits and unrelated conditions that attach like barnacles at every stage of the process—from assignment to auction to transfer—and which require regular reporting, oversight, and supervision by the FCC.

 

A New Approach to Spectrum Management–Following an Old Model that Worked

The frontier system for spectrum management is hopelessly and dangerously broken.  It cannot be repaired.  For the mobile broadband economy to continue its remarkable development (one bright spot throughout the sour economy), Congress and the FCC must transition quickly to a new model that makes sense in a world without a spectrum frontier.

That model would look much more like the 19th century system of federal land management than the FCC’s legacy command-and-control system.  The new approach would start by taking the FCC out of the middle of every transaction, leaving it to the market to determine the best and highest use of our limited range of usable frequencies.  It would treat licenses as transferable property, just like federal land grants in the 18th and 19th centuries.

It would leave it to the market—with the legal system as backup—to work out problems of interference, just as the common law courts have stood as backup for land disputes.

And it would deal with any genuine problems of over-concentration (that is, those that cause demonstrable harm to consumers) through modern principles of antitrust applied by the Department of Justice, not the squishy and undefined “public interest” non-standard of the FCC.  It would correct problems once it was clear the market had failed to do so, not short-circuit the market at the first hint of theoretical trouble.  (Hello, net neutrality rules.)

That’s the system, according to Frederick Jackson Turner, that formed American culture and values, shaped American law and provided the fuel to create the engine of capitalism.

For starters.

On LightSquared and Dish Networks Use of Spectrum: What Would Ronald Coase Do?

 

On CNET today, I have a longish post on the FCC’s continued machinations over LightSquared’s and Dish Networks’ respective efforts to use existing satellite spectrum to build terrestrial mobile broadband networks. Both companies plan to build 4G LTE networks; LightSquared has already spent $4 billion on build-out for its network, which it plans to offer wholesale.

After first granting and then, a year later, revoking LightSquared’s waiver to repurpose its satellite spectrum, the agency has taken a more conservative (albeit slower) course with Dish. Yesterday, the agency initiated a Notice of Proposed Rulemaking that would, if adopted, assign flexible use rights to about 40 MHz of MSS spectrum licensed to Dish.

Current allocations of spectrum have little to do with the technical characteristics of different bands. That existing licenses limit Dish and LightSquared to satellite applications, for example, is simply an artifact of more-or-less random carve-outs in the absurdly complicated spectrum map managed by the agency since 1934. Advances in technology make it possible to use many different bands successfully for many different purposes.

But the legacy of the FCC’s command-and-control model—allocating spectrum to favor “new” services (new, that is, until they are made obsolete years or decades later) and shaping competition to its changing whims—is a confusing and unnecessary pile-up of limitations and conditions that severely and artificially limit the ways in which spectrum can be redeployed as technology and consumer demands change. Today, the FCC sits squarely in the middle of each of over 50,000 licenses, a huge bottleneck that is making the imminent spectrum crisis in mobile broadband even worse.

Even with the best of intentions, the agency can’t possibly continue to micromanage the map. And, as the LightSquared and Dish stories demonstrate yet again, the risk of agency capture and political pressure often means the agency doesn’t do the right thing even when it does act.

Who would be the more efficient and neutral regulator? According to Nobel Prize-winning economist Ronald Coase’s seminal 1959 article, “The Federal Communications Commission,” the answer is the market. In his trademark straightforward, common-sense style, Coase elegantly dismantles the idea that scarce spectrum resources demand a non-market solution of government management.

For one thing, Coase demonstrates how screwed up the system already was over fifty years ago. There’s little doubt that the problems he describes have only gotten worse with time and increased demand on the airwaves by insatiable consumers.

Instead, Coase proposed to treat spectrum like any other industry input–as property. The FCC, he said, should auction spectrum rights to the highest bidder, without licenses, conditions, or limitations on use, and then stand back. (He acknowledged the risk of antitrust problems, but, as in any industry, such problems could be addressed by antitrust regulators and not the FCC.) Spectrum rights would efficiently change hands when new applications and devices created higher-value uses.

Potential interference problems—such as those raised by GPS device manufacturers in the case of LightSquared—would be resolved precisely as they are in other property contexts. Without an FCC to run to, the parties would be forced to negotiate against a backdrop of established liability rules and a safety net of potential litigation. Indeed, LightSquared and GPS offer a classic example of Coase’s later work demonstrating that regardless of how property is initially allocated, liability rules ensure that parties will bargain to the most socially-efficient solution to interference.
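Coase’s bargaining result can be sketched as a toy model (the cost figures below are hypothetical, not numbers from the LightSquared dispute): whichever party can cure the interference more cheaply ends up doing so, and the initial allocation of rights determines only who pays for the fix.

```python
# Toy illustration of Coasean bargaining over interference.
# Hypothetical costs (in $M): filtering the transmitter vs. redesigning receivers.

def efficient_fix(filter_cost: float, redesign_cost: float) -> str:
    """The socially efficient remedy is simply the cheaper one."""
    return "filter_transmitter" if filter_cost < redesign_cost else "redesign_receivers"

def bargain(right_holder: str, filter_cost: float, redesign_cost: float) -> tuple:
    """Under clear liability rules, the parties bargain to the efficient fix
    no matter who initially holds the spectrum right; the holder's identity
    only decides the direction of the side payment."""
    fix = efficient_fix(filter_cost, redesign_cost)
    payer = "GPS makers" if right_holder == "LightSquared" else "LightSquared"
    return fix, payer

# Same remedy is chosen under either initial allocation; only the payer changes.
assert bargain("LightSquared", 300, 200)[0] == bargain("GPS makers", 300, 200)[0]
```

Under these assumed numbers, both allocations converge on the same remedy; only the party writing the check differs, which is the point of Coase’s argument.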

Of course we’ll never know if the socially-optimal solution here is for LightSquared to prevent GPS devices from receiving its signal or for device manufacturers to change their designs to stay out of LightSquared’s bands. The heavy hand of the regulator has foreclosed a market solution, or even an attempt at negotiations.

Instead, we have the disaster of the FCC’s decision in January 2011 to grant a conditional waiver to LightSquared and then, last month, to revoke it indefinitely. Meanwhile, LightSquared spent $4 billion on infrastructure it may never use, and lost its CEO and key customers, including Sprint. No one is happy, and no one can reasonably argue this was an optimal outcome, or even close to one.

For Dish, the NPRM will ensure a more orderly process, but at the cost of months or more of delay before Dish can begin building its terrestrial network. And in the interim, all sorts of irrelevant issues may interfere with an orderly (and expensive) resolution.

When Coase proposed a property model for spectrum in 1959, the idea was considered too radical. Congress and the FCC have, slowly but surely, taken pieces of the proposal to heart, introducing auctions (but not property rights) in the 1990s. Yesterday’s NPRM takes a small step toward more flexible use licenses, but this may be too little reform, too late. We have all the evidence we need that micromanagement of spectrum can’t possibly keep up with the pace of innovation. Time to try a new, fifty-year-old approach.

Updates to the Media Page

2012 is off to a fast start, and we’re trying hard just to keep up. We’ve already added over thirty posts to the Media Page, including articles, radio and television interviews, and quotes in a wide range of online and offline publications. There are several video and audio clips for your enjoyment.

The year began with two big stories: the successful fight to halt efforts for quick passage of SOPA and PIPA, two bills that would have added dangerous new legal remedies for government and private parties to tinker with the underlying engineering of the Internet in a foolhardy effort to curb unlicensed copying by consumers. Larry was front and center, making several trips to Washington to urge Members of Congress to reconsider the bills, and reported as well from the annual Consumer Electronics Show in Las Vegas, where the tide turned definitively against the bills.  Larry’s work, including a controversial article for Forbes on “Who Really Stopped SOPA, and Why,” was cited in publications as varied as The National Review, Aljazeera, The National Post, TechCrunch, Techdirt, AdWeek and a radio interview on WHYY’s “Radio Times.”

The second big story was Larry’s barn-burning article for Forbes on the failure of electronics retailer Best Buy to adapt to changing market and technology dynamics.  The original article now has nearly 3,000,000 pageviews, and set off a firestorm of response both positive and negative. The article spawned several follow-up pieces on Forbes, as well as extensive coverage nearly everywhere else, including The Financial Times, The Wall Street Journal, The New York Times, TechCrunch, Slashdot, MetaFilter, Reddit, The Huffington Post, The Motley Fool, MSN Money and CNN. Some of these articles generated thousands of user comments, in addition to over a thousand that appeared on Forbes.

With the SOPA and PIPA fights temporarily on hold, Larry pivoted back to other important innovation and policy matters, including reform of the FCC’s troubled Lifeline program, Internet privacy, and a fight over spectrum auctions vital to the future of mobile broadband.  Look for articles in CNET and Forbes as well as interviews in U.S. News, This Week in Law, The Los Angeles Times, The Hill, WebProNews and the Heartland Institute.

Stay tuned!  It’s going to be an exciting year.

What Makes an Idea a Meme?

Ceci est un mème.

On Forbes today, I look at the phenomenon of memes in the legal and economic context, using my now notorious “Best Buy” post as an example. Along the way, I talk antitrust, copyright, trademark, network effects, Robert Metcalfe and Ronald Coase.

It’s now been a month and a half since I wrote that electronics retailer Best Buy was going out of business…gradually.  The post, a preview of an article and future book that I’ve been researching on-and-off for the last year, continues to have a life of its own.

Commentary about the post has appeared in online and offline publications, including The Financial Times, The Wall Street Journal, The New York Times, TechCrunch, Slashdot, MetaFilter, Reddit, The Huffington Post, The Motley Fool, and CNN. Some of these articles generated hundreds of user comments, in addition to those that appeared here at Forbes.

(I was also interviewed by a variety of news sources, including TechCrunch’s Andrew Keen.)

Today, the original post hit another milestone, passing 2.9 million page views.

Watching the article move through the Internet, I’ve gotten a first-hand lesson in how network effects can generate real value.

Network effects describe an economic principle under which certain goods and services experience increasing returns to scale.  That means the more users a particular product or service has, the more valuable the product becomes and the more rapidly its overall value increases.  A barrel of oil, like many commodity goods, does not experience network effects: only one person can own it at a time, and once it’s been burned, it’s gone.

In sharp contrast, networked goods increase in value as they are consumed.  Indeed, the more they are used, the faster the increase, generating a kind of momentum or gravitational pull.  As Robert Metcalfe, founder of 3Com and co-inventor of Ethernet, explained it, the value of a network can be plotted as the square of the number of connected users or devices—a curve that rises ever more steeply until most everything that can be connected already is.  George Gilder called that formula “Metcalfe’s Law.”
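Metcalfe’s formula is simple enough to sketch in a few lines of Python. The proportionality constant and the user counts below are purely illustrative assumptions, not data from any real network; the point is only to show how quadratic growth behaves:

```python
def metcalfe_value(users: int, k: float = 1.0) -> float:
    """Model network value as k * n^2, per Metcalfe's Law.

    `k` is an arbitrary scaling constant; real networks would
    calibrate it empirically (if the law holds at all).
    """
    return k * users ** 2

# Doubling the user base quadruples the modeled value:
print(metcalfe_value(200) / metcalfe_value(100))  # prints 4.0

# The curve steepens as the network grows:
for n in (10, 100, 1000):
    print(n, metcalfe_value(n))
```

Note the contrast with a commodity good, whose total value grows only linearly with the number of units: under this model, adding the millionth user creates far more value than adding the hundredth.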

Since information can be used simultaneously by everyone and never gets used up, nearly all information products can be the beneficiaries of network effects.  Standards are the obvious example.  TCP/IP, the basic protocol that governs interactions between computers connected to the Internet, started out humbly as an information exchange standard for government and research university users.  But in part because it was non-proprietary and therefore free for anyone to use without permission or licensing fees, it spread from public to private sector users, slowly at first but over time at accelerating rates.

Gradually, then suddenly, TCP/IP became, in effect, a least common denominator standard by which otherwise incompatible systems could share information.  As momentum grew, TCP/IP and related protocols overtook and replaced better-marketed and more robust standards, including IBM’s SNA and DEC’s DECnet.  These proprietary standards, artificially limited to the devices of a particular manufacturer, couldn’t spread as quickly or as smoothly as TCP/IP.

From computing applications, Internet standards spread even faster, taking over switched telephone networks (Voice over IP), television (over-the-top services such as YouTube and Hulu), radio (Pandora, Spotify)—you name it.

Today the TCP/IP family of protocols, still free of charge, is the de facto global standard for information exchange, the linchpin of the Internet revolution.  The standards continue to improve, thanks to the largely voluntary efforts of The Internet Society and its virtual engineering task forces.  They’re the best example I know of network effects in action, and they’ve created both a platform and a blueprint for other networked goods that make use of the standards.

Beyond standards, network effects are natural features of other information products, including software.  Since the marginal cost of a copy is low (essentially free in the post-media days of Web-based distribution and cloud services), establishing market share can happen at relatively low cost.  Once a piece of software—Microsoft Windows, AOL Instant Messenger in the old days, Facebook and Twitter more recently—starts ramping up the curve, it gains considerable momentum, which may be all it takes to beat out a rival or displace an older leader.  At saturation, a software product becomes, in essence, the standard.

From a legal standpoint, unfortunately, market saturation begins to resemble an illegal monopoly, especially when viewed through the lens of industrial age ideas about markets and competition.  (That, of course, is the lens that even 21st century regulators still use.)  But what legal academics, notably Columbia’s Tim Wu, misunderstand about this phenomenon is that such products dominate for only a relatively short life-cycle.  These “information empires,” as Wu calls them, are short-lived, but not, as Wu argues, because regulators cut them down.

Even without government intervention, information products are replaced at accelerating speeds by new disruptors relying on new (or greatly improved) technologies, themselves the beneficiaries of network effects.  The actual need for legal intervention is rare.  Panicked interference with the natural cycle, on the other hand, results in unintended consequences that damage emerging markets rather than correcting them.  Distracted by lingering antitrust battles at home and abroad, Microsoft lost momentum in the last decade.  No consumer benefited from that “remedy.”

For more, see “What Makes an Idea a Meme?” on Forbes.

 

How the SOPA Fight Was Won…For Now

On Forbes yesterday, I posted a detailed analysis of the successful (so far) fight to block quick passage of the Protect-IP Act (PIPA) and the Stop Online Piracy Act (SOPA). (See “Who Really Stopped SOPA, and Why?“) I’m delighted that the article, despite its length, has gotten such positive response.

As regular readers know, I’ve been following these bills closely from the beginning, and made several trips to Capitol Hill to urge lawmakers to think more carefully about some of the more half-baked provisions.

But beyond traditional advocacy–of which there was a great deal–something remarkable happened in the last several months. A new, self-organizing protest movement emerged on the Internet, using social news and social networking tools including Reddit, Tumblr, Facebook and Twitter to stage virtual teach-ins, sit-ins, boycotts, and other protests.