Category Archives: Information Economics

Big Bang Launch of "Big Bang Disruption"–and a Note on Regulatory Implications

In the upcoming issue of Harvard Business Review, my colleague Paul Nunes at Accenture’s Institute for High Performance and I are publishing the first of many articles from an ongoing research project on what we are calling “Big Bang Disruption.”

The project is looking at the emerging ecosystem for innovation based on disruptive technologies, following up on work we have done separately and now together over the last fifteen years.

Our chief finding is that the nature of innovation has changed dramatically, calling into question much of the conventional wisdom on business strategy and competition in information-intensive industries–which is to say, these days, every industry.

The drivers of this new ecosystem are ever-cheaper, faster, and smaller computing devices, cloud-based virtualization, crowdsourced financing, collaborative development and marketing, and the proliferation of mobile everything (including, increasingly, not just people but things).

The result is that new innovations now enter the market cheaper, better, and more customizable than the products and services they challenge.  (For example, smartphone-based navigation apps versus standalone GPS devices.)  In the strategy literature, such innovation would be characterized as thoroughly “undisciplined.”  It shouldn’t succeed.  But it does.

So when the disruptor arrives and takes off with a bang, often after a series of low-cost, failed experiments, incumbents have no time for a competitive response.  The old rules for dealing with disruptive technologies, most famously from the work of Harvard’s Clayton Christensen, have become counter-productive.   If incumbents haven’t learned to read the new tea leaves ahead of time, it’s game over.

The HBR article doesn’t go into much depth on the policy implications of this new innovation model, but the book we are now writing will.  The answer should be obvious.

This radical new model for product and service introduction underscores the robustness of market behaviors that quickly and efficiently correct many transient examples of dominance, especially in high-tech markets.

As a general rule (though obviously not one without exceptions), the big bang phenomenon further weakens the case for regulatory intervention.  Market dominance is sustainable for ever-shorter periods of time, with little opportunity for incumbents to exploit it.

A predictable next wave of technology will likely put a quick and definitive end to any “information empires” that have formed from the last generation of technologies.

Or, at the very least, do so more quickly and more cost-effectively than regulatory alternatives could.  The law, to paraphrase Mark Twain, will still be putting its shoes on while the big bang disruptor has spread halfway around the world.

Unfortunately, much of the contemporary literature on competition policy from legal academics is woefully ignorant of even the conventional wisdom on strategy, not to mention the engineering realities of disruptive technologies already in the market.  Looking at markets solely through the lens of legal theory is, truly, an academic exercise, one with increasingly limited real-world applications.

Indeed, we can think of many examples where legacy regulation actually makes it harder for the incumbents to adapt as quickly as necessary in order to survive the explosive arrival of a big bang disruptor.  But that is a story for another day.

Much more to come.

Related links:

  1. “Why Best Buy is Going out of Business…Gradually”
  2. “What Makes an Idea a Meme?”
  3. “The Five Most Disruptive Technologies at CES 2013”


Disruptive Technologies and the Watchful Waiting Principle

By the time the smoke cleared and I found myself half caught up on sleep, the information and sensory overload that was CES 2013 had ended.

There was a kind of split personality to how I approached the event this year.  Monday through Wednesday I spent in conference tracks, above all the excellent Innovation Policy Summit put together by the Consumer Electronics Association.  (Kudos again to Gary Shapiro, Michael Petricone and their team of logistics judo masters.)

The Summit has become an important annual event bringing together legislators, regulators, industry and advocates to help solidify the technology policy agenda for the coming year and, in this case, a new Congress.

I spent Thursday and Friday on the show floor, looking in particular for technologies that satisfy what I coined The Law of Disruption: social, political, and economic systems change incrementally, but technology changes exponentially.

What I found, as I wrote in a long post-mortem for Forbes, is that such technologies are well-represented at CES, but are mostly found at the edges of the show–literally.

In small booths away from the mega-displays of the TV, automotive, smartphone, and computer vendors, in hospitality suites in nearby hotels, or even in sponsored and spontaneous hackathons going on around town, I found ample evidence of a new breed of innovation and innovators, whose efforts may yield nothing today or even in a year, but which could become sudden, overnight market disruptors.

Increasingly, it’s one extreme or the other: nothing at all, or sudden market dominance.  That alone is saying something.  For one thing, how do incumbents compete with such all-or-nothing innovations?

That, however, is a subject for another day.

For now, consider again the policy implications of such dramatic transformations.  As those of us sitting in room N254 debated the finer points of software patents, IP transition, copyright reform, and the misapplication of antitrust law to fast-changing technology industries (increasingly, that means ALL industries), just a few feet away the real world was changing under our feet.

The policy conference was notably tranquil this year, without such previous hot-button topics as net neutrality, SOPA, or the lack of progress on spectrum reform to generate antagonism among the participants.  But as I wrote at the conclusion of last year’s Summit, at CES, the only law that really matters is Moore’s Law.  Technology gets faster, smaller, and cheaper, not just predictably but exponentially.
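To make the contrast concrete, here is a minimal back-of-the-envelope sketch in Python; the two-year doubling period and the five percent incremental rate are illustrative assumptions, not measurements.

```python
# Illustrative sketch of the Law of Disruption: technology improves
# exponentially (Moore's Law-style doubling) while social, political,
# and economic systems improve incrementally. The two-year doubling
# period and 5% annual rate are assumptions chosen for illustration.

DOUBLING_PERIOD_YEARS = 2.0  # assumed Moore's Law doubling period
INCREMENTAL_RATE = 0.05      # assumed 5% annual improvement

def tech_capability(years: float) -> float:
    """Relative capability of a technology that doubles every period."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

def system_capability(years: float) -> float:
    """Relative capability of a system improving a fixed 5% per year."""
    return (1 + INCREMENTAL_RATE) ** years

for years in (2, 5, 10, 20):
    print(f"{years:>2} yrs: technology x{tech_capability(years):7.1f}, "
          f"incremental system x{system_capability(years):.2f}")
# After 20 years: technology is ~1024x better; the incremental
# system is ~2.65x better. That gap is the Law of Disruption.
```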

As a result, the contrast between what the regulators talk about and what the innovators do gets more dramatic every year, accentuating the figurative if not the literal distance between the policy Summit and the show floor.  I felt as if I had moved between two worlds, one that follows a dainty 19th century wind-up clock and the other that marks time using the Pebble watch, a fully-connected new timepiece funded entirely through Kickstarter.

The lesson for policymakers is sobering, and largely ignored.  Humility, caution, and a Hippocratic-like oath of first-do-no-harm are, ironically, the most useful tools regulators can bring to bear if, as they repeat at ever-shorter intervals, their true goal is to spur innovation, create jobs, and rescue American entrepreneurialism.

The new wisdom is simple, deceptively so.  Don’t intervene unless and until it’s clear (1) that there is demonstrable harm to consumers (not competitors), (2) that there’s a remedy for the harm that won’t, even unintentionally, make things worse, and (3) that the next batch of innovations won’t solve the problem more quickly and cheaply.

Or, as they say to new interns in the Emergency Room, “Don’t just do something.  Stand there.”

That’s a hard lesson for those of us who think we’re surgical policy geniuses, only to find, increasingly, that we’re working with bloodletting and leeches.  And no anesthesia.

In some ways, it’s the opposite of the approach Adam Thierer calls the Technology Precautionary Principle.  Instead of panicking when new technologies raise new (but likely transient) issues, first try to let Moore’s Law sort them out, unless and until it becomes crystal clear that it can’t.  Instead of a hasty response, opt for a delayed response.  Call it the Watchful Waiting Principle.
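For fun, the principle can even be written down as a decision rule.  Here is a tongue-in-cheek sketch; the three predicates are stand-ins for the judgment calls described above, and none of them is actually computable in practice.

```python
# Tongue-in-cheek sketch of the Watchful Waiting Principle as a
# decision rule. The three predicates stand in for the judgment
# calls described above; none is actually computable in practice.

def should_intervene(demonstrable_consumer_harm: bool,
                     remedy_wont_make_it_worse: bool,
                     next_wave_will_fix_it: bool) -> bool:
    """Regulate only when all three tests of the 'new wisdom' pass."""
    return (demonstrable_consumer_harm      # harm to consumers, not competitors
            and remedy_wont_make_it_worse   # a remedy that won't backfire
            and not next_wave_will_fix_it)  # Moore's Law won't solve it first

# Default posture: "Don't just do something. Stand there."
print(should_intervene(True, False, True))  # False: keep watching
print(should_intervene(True, True, False))  # True: now intervene
```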

Not as much fun as fuming, ranting, and regulating at the first sign of chaos, of course, but far more helpful.

That, in any case, is the thread running through my dispatches from Vegas:

  1. “Telcos Race Toward an All-IP Future,” CNET
  2. “At CES, Companies Large and Small Bash Broken Patent System,” Forbes
  3. “FCC, Stakeholders Align on Communications Policy—For Now,” CNET
  4. “The Five Most Disruptive Technologies at CES 2013,” Forbes

The Latest Leak Makes Even Clearer UN Plans to Take Over Internet Governance

On Friday evening, I posted on CNET a detailed analysis of the most recent proposal to surface from the secretive upcoming World Conference on International Telecommunications (WCIT-12).  The conference will discuss updates to a 1988 UN treaty administered by the International Telecommunication Union (ITU), and throughout the year there have been reports that both governmental and non-governmental members of the ITU have been trying to use the rewrite to put the agency squarely in the Internet business.

The Russian Federation’s proposal, submitted to the ITU on Nov. 13, would explicitly bring “IP-based Networks” under the auspices of the ITU and would, in particular, substantially if not completely change the role of ICANN in overseeing domain names and IP addresses.

According to the proposal, “Member States shall have the sovereign right to manage the Internet within their national territory, as well as to manage national Internet domain names.”  And a second revision, also aimed straight at the heart of today’s multi-stakeholder process, reads:  “Member States shall have equal rights in the international allocation of Internet addressing and identification resources.”

Of course the Russian Federation, along with other repressive governments, uses every opportunity to gain control over the free flow of information, and sees the Internet as its most formidable enemy.  Earlier this year, Prime Minister Vladimir Putin told ITU Secretary-General Hamadoun Touré that Russia was keen on the idea of “establishing international control over the Internet using the monitoring and supervisory capability of the International Telecommunication Union.”

As I point out in the CNET piece, the ITU’s claims that WCIT has nothing to do with Internet governance and that the agency itself has no stake in expanding its jurisdiction ring more hollow all the time.  Days after receiving the Russian proposal, the ITU wrote in a post on its blog that, “There have not been any proposals calling for a change from the bottom-up multistakeholder model of Internet governance to an ITU-controlled model.”

This would appear to be an outright lie, and also a contradiction of an earlier acknowledgment by Dr. Touré.  In a September interview, Touré told Bloomberg BNA that “Internet Governance as we know it today” concerns only “Domain Names and addresses.  These are issues that we’re not talking about at all,” Touré said.  “We’re not pushing that, we don’t need to.”

The BNA article continues:

Touré, expanding on his emailed remarks, told BNA that the proposals that appear to involve the ITU in internet numbering and addressing were preliminary and subject to change.

‘These are preliminary proposals,’ he said, ‘and I suspect that someone else will bring another counterproposal to this, we will analyze it and say yes, this is going beyond, and we’ll stop it.’

Another tidbit from the BNA interview now seems ironic:

Touré disagreed with the suggestion that numerous proposals to add a new section 3.5 to the ITRs might have the effect of expanding the treaty to internet governance.

‘That is telecommunication numbering,’ he said, something that preceded the internet. Some people, Touré added, will hijack a country code and open a phone line for pornography. ‘These are the types of things we are talking about, and they came before the internet.’

I haven’t seen all of the proposals, of course, which are technically secret.  But the Russian proposal’s most outrageous provisions are contained in a proposed new section 3A, titled “IP-based Networks.”

There’s more on the ITU’s subterfuge in Friday’s CNET piece, as well as these earlier posts:

1.  “Why is the UN Trying to Take Over the Internet?”, Aug. 9, 2012.

2.  “UN Agency Reassures:  We Just Want to Break the Internet, Not Take It Over,” Oct. 1, 2012.

Praise for California passage of law protecting VoIP from local utility regulators

On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from any new regulation of Voice over Internet Protocol or other IP-based services without the legislature’s authorization.

California now joins over twenty states that have enacted similar legislation.

The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February.  It passed both houses of the California legislature with wide bipartisan majorities.

California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.

Whatever the cost-benefit calculus of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice, or Apple’s FaceTime.

Historically, as I argued in a Forbes piece last month, the imposition of public utility status on a service provider has been an extreme response to an extreme situation—a monopoly provider, unlikely to face competition because of the high cost of building and operating competing infrastructure (a so-called “natural monopoly”), offering a service that is indispensable to everyday life.

Service providers meeting that definition are transformed by PUC oversight into entities that are much closer to government agencies than private companies.  The PUC sets and modifies the utility’s pricing in excruciating detail.  PUC approval is required for each and every change or improvement to the utility’s asset base, or to add new services or retire obsolete offerings.

In exchange for offering service to all residents, utilities in turn are granted eminent domain and rights of way to lay and maintain pipes, wires and other infrastructure.

VoIP services may resemble traditional switched telephone networks, but they have none of the features of a traditional public utility.  Most do not even charge for basic service, nor do they rely on their own dedicated infrastructure.  Indeed, the reason VoIP is so much cheaper to offer than traditional telephony is that it can take advantage of the existing and ever-improving Internet as its delivery mechanism.

Because entry is cheap, VoIP providers have no monopoly, natural or otherwise.  In California, according to the FCC, residents have their choice of over 125 providers—more than enough competition to ensure market discipline.

Nor would residents be in any way helped by interposing a regulator to review and pre-approve each and every change to a VoIP provider’s service offerings.  Rather, the lightning-fast evolution of Internet services provides perhaps the worst mismatch possible for the deliberate and public processes of a local PUC.

Software developers don’t need eminent domain.

But the most serious mismatch between PUCs and VoIP providers is that there is little inherently local about VoIP offerings.  Where a case can be made for local oversight of public utilities operating extensive–even pervasive–local infrastructure, it’s hard to see what expertise a local PUC brings to the table in supervising a national or even international VoIP service.

On the other hand, it’s not hard to imagine the chaos and uncertainty VoIP providers and their customers would face if they had to satisfy fifty different state PUCs, not to mention municipal regulators and regulators in other countries.

In most cases that would mean dealing with regulators on a daily basis, on every minor aspect of a service offering.  In the typical PUC relationship, the regulator becomes the true customer and the residents mere “rate-payers” or even just “meters.”

Public utilities are not known for their constant innovation, and for good reason.

Whatever oversight VoIP providers require, local PUCs are clearly the wrong choice.  It’s no surprise, then, that SB 1161 was endorsed by major Silicon Valley trade groups, including TechNet, TechAmerica, and the Silicon Valley Leadership Group.

The law is a win for California residents and California businesses—both high-tech and otherwise.


Related links:

  1. “Government Control of Net is Always a Bad Idea,” CNET, June 4, 2012.
  2. “Memo to Jerry Brown:  Sign SB 1161 for all Internet users,” CNET, August 30, 2012.
  3. “The Madness of Regulating VoIP as a Public Utility,” Sept. 10, 2012.
  4. “Brown Endorses Hands off Stance on Internet Calls,” The San Francisco Chronicle, Sept. 28, 2012.

What Google Fiber, Gig.U and US Ignite Teach Us About the Painful Cost of Legacy Regulation

On Forbes today, I have a long article on the progress being made to build gigabit Internet testbeds in the U.S., particularly by Gig.U.

Gig.U is a consortium of research universities and their surrounding communities, created a year ago by Blair Levin, an Aspen Institute Fellow and, until recently, the principal architect of the FCC’s National Broadband Plan.  Its goal is to work with private companies to build ultra high-speed broadband networks with sustainable business models.

Gig.U, Google Fiber’s Kansas City project, and the White House’s recently-announced US Ignite project all spring from similar origins and have similar goals.  Their shared belief is that by building ultra high-speed broadband in selected communities, consumers, developers, network operators and investors will get a clear sense of the true value of Internet speeds that are 100 times as fast as those available today through high-speed cable-based networks.  And then go build a lot more of them.

Google Fiber, for example, announced last week that it would be offering fully-symmetrical 1 Gbps connections in Kansas City, perhaps as soon as next year.  (By comparison, my home broadband service from Xfinity is 10 Mbps download and considerably slower going up.)
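To put those numbers in perspective, here is a quick back-of-the-envelope sketch; the 5 GB file size is an arbitrary stand-in for an HD movie, and real-world throughput would run somewhat lower.

```python
# Back-of-the-envelope transfer times at 10 Mbps (typical cable
# broadband) versus Google Fiber's 1 Gbps. The 5 GB file is an
# arbitrary stand-in for an HD movie; real throughput runs lower.

FILE_SIZE_GB = 5
FILE_SIZE_MEGABITS = FILE_SIZE_GB * 8 * 1000  # 1 GB = 8,000 megabits

for name, mbps in [("10 Mbps cable", 10), ("1 Gbps fiber", 1000)]:
    minutes = FILE_SIZE_MEGABITS / mbps / 60
    print(f"{name}: {minutes:.1f} minutes")
# 10 Mbps cable: 66.7 minutes
# 1 Gbps fiber: 0.7 minutes
```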

US Ignite is encouraging public-private partnerships to build demonstration applications that could take advantage of next generation networks and near-universal adoption.  It is also looking at the most obvious regulatory impediments at the federal level that make fiber deployments unnecessarily complicated, painfully slow, and unduly expensive.

I think these projects are encouraging signs of native entrepreneurship focused on solving a worrisome problem:  the U.S. is nearing a dangerous stalemate in its communications infrastructure.  We have the technology and scale necessary to replace much of our legacy wireline phone networks with native IP broadband.  Right now, ultra high-speed broadband is technically possible by running fiber to the home.  Indeed, Verizon’s FiOS network currently delivers 300 Mbps broadband and is available to some 15 million homes.

But the kinds of visionary applications in smart grid, classroom-free education, advanced telemedicine, high-definition video, mobile backhaul and true teleworking that would make full use of a fiber network don’t really exist yet.  Consumers (and many businesses) aren’t demanding these speeds, and Wall Street isn’t especially interested in building ahead of demand.  There’s already plenty of dark fiber deployed, the legacy of earlier speculation that so far hasn’t paid off.

So the hope is that by deploying fiber to showcase communities and encouraging the development of demonstration applications, entrepreneurs and investors will get inspired to build next generation networks.

Let’s hope they’re right.

What interests me personally about the projects, however, is what they expose about regulatory disincentives that unnecessarily and perhaps fatally retard private investment in next-generation infrastructure.  In the Forbes piece, I note almost a dozen examples from the Google Fiber development agreement where Kansas City voluntarily waived permits, fees, and plodding processes that would otherwise delay the project.  In several key areas, moreover, the city actually commits to cooperate and collaborate with Google Fiber to expedite and promote the project.

As Levin notes, Kansas City isn’t offering any funding or general tax breaks to Google Fiber.  But the regulatory concessions, which implicitly acknowledge the heavy burdens (many of them the legacy of the early days of cable TV deployment) imposed on those who want to deploy new privately-funded infrastructure, may still be enough to “change the math,” as Levin puts it, making otherwise unprofitable investments justifiable after all.

Just removing some of the regulatory debris, in other words, might itself be enough to break the stalemate that makes building next generation IP networks unprofitable today.

The regulatory cost puts a heavy thumb on the scale against investment.  Indeed, as fellow Forbes contributor Elise Ackerman pointed out last week, Google has explicitly said that part of what made Kansas City attractive was the lack of excessive infrastructure regulation, and the willingness and ability of the city to waive or otherwise expedite the requirements that were on the books.  (Despite the city’s promises to bend over backwards for the project, she notes, there have still been expensive regulatory delays that promoted no public values.)

Particularly painful to me was testimony by Google Vice President Milo Medin, who explained why none of the California-based proposals ever had a real chance.  “Many fine California city proposals for the Google Fiber project were ultimately passed over,” he told Congress, “in part because of the regulatory complexity here brought about by [the California Environmental Quality Act] and other rules. Other states have equivalent processes in place to protect the environment without causing such harm to business processes, and therefore create incentives for new services to be deployed there instead.”


This is a crucial insight.  Our next-generation communications infrastructure will surely come, when it does come, from private investment.  The National Broadband Plan estimated it would take $350 billion to get 100 Mbps Internet to 100 million Americans through a combination of fiber, cable, satellite and high-speed mobile networks.  Mindful of reality, however, the plan didn’t even bother to consider the possibility of full or even significant taxpayer funding to reach that goal.
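The scale of that estimate is easier to grasp per person; here is the trivial arithmetic, using only the Plan’s own figures.

```python
# Per-person arithmetic on the National Broadband Plan's estimate:
# $350 billion to bring 100 Mbps service to 100 million Americans.

TOTAL_COST_USD = 350e9   # the Plan's $350 billion estimate
PEOPLE_REACHED = 100e6   # 100 million Americans

print(f"${TOTAL_COST_USD / PEOPLE_REACHED:,.0f} per person reached")
# -> $3,500 per person reached, a bill no plausible taxpayer
#    appropriation was going to cover.
```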

Unlike South Korea, we aren’t geographically small, with a largely urban population living in just a few cities.  We don’t have a largely nationalized and taxpayer-subsidized communications infrastructure.  On a per-person basis, deploying broadband in the U.S. is much harder, more complicated, and more expensive than it is in many competing nations in the global economy.

Of course, nationwide fiber and mobile deployments by network operators including Verizon and AT&T can’t rely on gimmicks like Google Fiber’s hugely successful competition, where 1,100 communities applied to become a test site.  Nor can they, like Gig.U, cherry-pick research university towns, which have the most attractive demographics and density to start with.  Nor can they simply call themselves start-ups and negotiate the kind of freedom from regulation that Google and Gig.U’s membership can.

Large-scale network operators need to build, if not everywhere, then to an awful lot of somewheres.  That’s a political reality of their size and operating model, as well as the multi-layer regulatory environment in which they must operate.  And it’s a necessity of meeting the ambitious goal of near-universal high-speed broadband access, and of many of the applications that would use it.

Under the current regulatory and economic climate, large-scale fiber deployment has all but stopped for now.  Given the long lead-time for new construction, we need to find ways to restart it.

So everyone who agrees that gigabit Internet is a critical element in U.S. competitiveness in the next decade or so ought to look closely at the lessons, intended or otherwise, of the various testbed projects.  They are exposing in stark detail a dangerous and useless legacy of multi-level regulation that makes essential private infrastructure investment economically impossible.

Don’t get me wrong.  The demonstration projects and testbeds are great.  Google Fiber, Gig.U, and US Ignite are all valuable efforts.  But if we want to overcome our “strategic bandwidth deficit,” we’ll need something more fundamental than high-profile projects and demonstration applications.  To start with, we’ll need a serious housecleaning of legacy regulation at the federal, state, and local level.

Regulatory reform might not be as sexy as gigabit Internet demonstrations, but the latter ultimately won’t make much difference without the former.  Time to break out the heavy demolition equipment—for both.

The Federal Government Plays the Spectrum Shell Game

On CNET today, I’ve posted a long critique of the recent report by the President’s Council of Advisors on Science and Technology (PCAST) urging the White House to reverse course on a two-year-old order to free up more spectrum for mobile users.

In 2010, soon after the FCC’s National Broadband Plan raised alarms about the need for more spectrum to handle an explosion in mobile broadband use, President Obama issued a Memorandum ordering federal agencies to free up as much as 500 MHz of radio frequencies currently assigned to them.

After a great deal of dawdling, the National Telecommunications and Information Administration, which oversees spectrum assignments within the federal government, issued a report earlier this year that seemed to offer progress.  It concluded that 95 MHz of very attractive spectrum could in fact be cleared within the ten years called for by the White House.
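For scale, using only the figures quoted above, here is the simple arithmetic on how far that identified spectrum goes.

```python
# How far the NTIA's identified 95 MHz goes toward the White House's
# 500 MHz goal, using only the figures quoted above.

TARGET_MHZ = 500
IDENTIFIED_MHZ = 95

print(f"{IDENTIFIED_MHZ / TARGET_MHZ:.0%} of the goal; "
      f"{TARGET_MHZ - IDENTIFIED_MHZ} MHz still unaccounted for")
# -> 19% of the goal; 405 MHz still unaccounted for
```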

But reading between the lines, it was clear that the 20 agencies involved in the plan had no serious intention of cooperating.  Their cost estimates (which were simply reported by NTIA without any indication of how they’d been arrived at or even whether NTIA had been given any details) for relocation appeared to be based on an amount that would make any move economically impossible.

And the NTIA’s suggestion that some of the bands could be “shared” sounded appealing until the details revealed that the feds would place impossible conditions on that sharing.

In the end, the NTIA report was 200 pages of classic smoke-and-mirrors from an entrenched bureaucracy that is expert at avoiding change.

The PCAST report seemed to throw in its cards and accept the political reality that actual spectrum clearing in the federal bands would never happen.  Instead, the President’s advisors doubled down on “sharing,” and called for a new “Spectrum Access System” that would be based on sharing technologies it admitted don’t exist yet.

SAS might be a better system in the long term, but current technical and political limitations make such a system impractical.  I argue in the piece that the NTIA and PCAST reports are just providing cover for federal agencies, notably the DoD and Justice, to avoid actually having to follow the President’s order and take aggressive steps to free up spectrum that is needed now.  Whether this is intentional or not I leave to savvier tea-leaf readers.