Category Archives: Information Economics

How Should Governments Approach Technology Policy? Modestly.

Earlier this week, I spoke in San Francisco at an event co-sponsored by the Reason Foundation, TechFreedom, and the Koch Institute.  Hosted by my CNET colleague Declan McCullagh, the panel took up the topic “DC vs. SF: A Clash of Visions for Tech Policy.”

The discussion ranged widely, from copyrights and patents to NSA surveillance to the failure of the government’s Healthcare.gov website.  Although panelists from the political left and right disagreed on some issues, there was, as usual, widespread consensus that from the standpoint of entrepreneurs and engineers, the core problem in technology policy is that the pace of innovation continues to accelerate while the pace of government, at best, remains constant.  Moore’s Law increasingly trumps legislated law, often unintentionally and often with negative consequences.

At the same time, as I emphasized, the shift in commerce, social life, civil life and nearly everything else from offline to online media means that future collisions at the intersection of innovation and regulation are inevitable and, indeed, certain to increase in both frequency and the degree of collateral damage.  Governments claim a monopoly on regulating much of this activity, after all, and, like any institution that believes in its own mission, they are unlikely to let themselves go quietly out of business as their markets change.

Governments rely for revenue on taxes. As more traditionally taxable activity migrates online, lawmakers are certain to follow. That’s been true in the development of any frontier, physical or virtual.

The longstanding Silicon Valley approach of ignoring Washington in hopes lawmakers won’t notice what we’re doing was always a dangerous policy, and has now become downright reckless.  So how should innovators engage with regulators?

Watch the video below to find out!

DC v. SF: A Clash of Visions for Tech Policy from Charles Koch Institute on FORA.tv

The return of Net Neutrality

Over on Forbes today, I have a very long post inspired by Monday’s oral arguments in Verizon’s challenge to the FCC’s Open Internet rules, passed in 2010.

I say “inspired” because the post has nothing to say about the oral arguments, which, in any case, I did not attend.  Mainstream journalists can’t resist the temptation to try to read into the questions asked or the mood of the judges some indication of how the decision will come out.

But as anyone who has ever worked in a court or followed appellate practice well knows, the tone of oral arguments signals nothing about a judge’s point of view.  Often, the harshest questioning is reserved for the side a judge is leaning towards supporting, perhaps because the briefs filed were inadequate.  Bad briefs create more work for the judge and her clerks.

I use the occasion of the hearing to take a fresh look at the net neutrality “debate,” which has been on-going since at least 2005, when I first started paying attention to it.  In particular, I try to disentangle the political term “net neutrality” (undefined and, indeed, not even used in the 2010 Open Internet order) from the engineering principles of packet routing.

According to advocates for government regulation of broadband access, net neutrality regulation would simply codify the Internet’s design.  But regardless of whether it would even make sense to transform the FCC into the governing body of engineering protocols for the network (the Internet Society and its engineering task forces are and always have been doing a fine job, thanks very much), the reality is that the political argument has almost nothing to do with the underlying engineering.

Indeed, those most strongly advocating for more government regulation either don’t understand the engineering or intentionally mischaracterize it, or both.  That’s clear from the wide range of supposed competitive problems that have been lumped together under the banner of “net neutrality” issues over the years–almost none of which have anything to do with packet routing.

Fortunately, very little of the larger political agenda of the loose coalition of net neutrality advocates is reflected in the rules ultimately passed by a bare majority of the FCC in 2010.  Even so, those rules, limited as they were, face many challenges.

For one thing, the FCC, despite over a year of dedicated attention to the problem, could identify only four incidents that suggested any kind of market failure, only one of which (the Comcast-BitTorrent incident) was ever actually considered in detail by the Commission.  (Two of the others never even rose to the level of a complaint.)  The agency was left to regulate on the basis of “preserving” the Open Internet through what it called (nearly a dozen times) “prophylactic” rules.

Second, and of particular interest in the D.C. Circuit proceeding, Congress has never authorized the FCC to issue rules dealing with broadband Internet access.  Though many authorizing bills have circulated over the years, none have ever made it out of committee.  With no legal basis to regulate, the agency was left pointing to irrelevant provisions of the existing Communications Act–most of which were already rejected by the same court in the Comcast case.  Nothing in the law has changed since Comcast, and on that basis, regardless of the merits of Internet regulation, the FCC is very likely to lose.  Which the Commission surely knew in passing the rules in 2010.

The piece ends by describing, as I did in my testimony before the House Judiciary Committee in early 2011, how the Report and Order betrays the technical reality that, from an engineering standpoint, even the supposed neutrality of packet routing is largely a sentimental myth.  The FCC identified and exempted a dozen network management technologies, practices, and protocols that it acknowledged do not follow the neutrality principle but are essential to effective and efficient management of the network.  There is no “neutral” Internet to preserve, and never was.

The agency was right to exempt these practices.  But the problem with the rules as written is that they could not and did not extend to future innovations that new applications and new users will certainly make as essential as today’s management techniques.
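To make the engineering point concrete, here is a minimal, purely illustrative sketch (my own toy example, not anything taken from the Order) of priority-based queuing, one of the most ordinary network management techniques: latency-sensitive traffic such as voice is transmitted ahead of bulk transfers, which is by definition non-neutral treatment of packets.

# Toy example only: a priority queue of the sort used for "reasonable network management."
# Latency-sensitive packets (voice) are sent before bulk transfers, regardless of arrival order.
import heapq
from itertools import count

PRIORITY = {"voip": 0, "video": 1, "web": 2, "bulk": 3}  # lower number = transmitted sooner

class Scheduler:
    def __init__(self):
        self._queue = []
        self._seq = count()  # preserves first-in, first-out order within a priority class

    def enqueue(self, kind, payload):
        heapq.heappush(self._queue, (PRIORITY[kind], next(self._seq), payload))

    def transmit_next(self):
        _, _, payload = heapq.heappop(self._queue)
        return payload

sched = Scheduler()
sched.enqueue("bulk", "backup chunk 1")
sched.enqueue("voip", "voice frame 17")
sched.enqueue("web", "GET /index.html")
print(sched.transmit_next())  # "voice frame 17" jumps the queue, and the network works better for it

Whether that kind of prioritization counts as sensible engineering or forbidden discrimination is exactly the line the rules never manage to draw cleanly.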

If the rules stand, network engineers, application developers, device makers and others in the vibrant, dynamic Internet ecosystem will be forced to seek permission to innovate from the FCC, which will both slow the high-speed world of Internet design to a crawl and introduce a decision maker with no technical expertise and lots of political baggage.

That, of course, was the kind of counter-productive and unnecessary regulatory intrusion that Internet users successfully rose up against last year, when the UN’s International Telecommunication Union threatened to assert itself in basic Internet governance, or the year before that, when Congress, without even the most basic technical understanding, tried to re-architect the Internet on behalf of media companies in the failed SOPA and PIPA legislation.

If the FCC gains a foothold in broadband access with the Open Internet rules or other efforts to gain oversight where Congress has delegated none, expect a similar reaction.  Or, in any case, hope for one.

Big Bang Disruption vs. Conventional Wisdom

Last month, Paul F. Nunes and I hosted an audio webinar for Harvard Business Review on “Big Bang Disruption,” our article from the March 2013 issue of the magazine.  An archive of the webinar has now been posted, including the PowerPoint slides.

We had a lively discussion with the audience, who posted some terrific questions and comments during the course of the session.

What companies, products, start-ups, and industries do you see being transformed by this new kind of disruptive innovation?  We’d like to know.

Where to next for the FCC?


Tuesday was a big day for the FCC.  The Senate Commerce, Science and Transportation Committee held an oversight hearing with all five Commissioners, the same day that reply comments were due on the design of eventual “incentive auctions” for over-the-air broadcast spectrum.  And the proposed merger of T-Mobile USA and MetroPCS was approved.

All this activity reflects the stark reality that the Commission stands at a crossroads.  As once-separate wired and wireless communications networks for voice, video, and data converge on the single IP standard, and as mobile users continue to demonstrate insatiable demand for bandwidth for new apps, the FCC can serve as midwife in the transition to next-generation networks.  Or, the agency can put on the blinkers and mechanically apply rules and regulations designed for a bygone era.

FCC Chairman Julius Genachowski, for one, believes the agency is clearly on the side of the future.  In an op-ed last week in the Wall Street Journal, the Chairman took justifiable pride in the focus his agency has demonstrated in advancing America’s broadband advantage, particularly for mobile users.

Mobile broadband has clearly been a bright spot in an otherwise bleak economy.  Network providers and their investors, according to the FCC’s most recent analysis, have spent over a trillion dollars since 1996 building next-generation mobile networks, today based on 4G LTE technology.

These investments are essential for high-bandwidth smartphones and tablet devices and the remarkable ecosystem of voice, video, and data apps they have enabled.  This platform for disruptive innovation has powered a level of “creative destruction” that would do Joseph Schumpeter proud.

Mobile disruptors, however, are entirely dependent on the continued availability of new radio spectrum.  In the first five years following the 2007 introduction of the iPhone, mobile data traffic increased 20,000%.  No surprise, then, that the FCC’s 2010 National Broadband Plan conservatively estimated that mobile consumers would desperately need an additional 300 MHz of spectrum by 2015 and 500 MHz by 2020.
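For perspective, here is a quick back-of-the-envelope check of what that growth figure implies (my own arithmetic, assuming the 20,000% figure means an increase over the 2007 starting level):

# Rough arithmetic only: what a 20,000% increase over five years implies.
growth_pct = 20_000
years = 5
multiple = 1 + growth_pct / 100            # traffic ends at roughly 201x its 2007 level
annual_rate = multiple ** (1 / years) - 1  # implied compound annual growth rate
print(f"~{multiple:.0f}x overall, ~{annual_rate:.0%} per year")  # ~201x overall, ~189% per year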

With nearly all usable spectrum long since allocated, the Plan acknowledged the need for creative new strategies for repurposing existing allocations to maximize the public interest.  But some current licensees, including over-the-air television broadcasters and the federal government itself, are resisting Chairman Genachowski’s efforts to keep the spectrum pipeline open and flowing.

So far, despite bold plans from the FCC for new unlicensed uses of TV “white spaces” and the  passage early in 2012 of “incentive auction” legislation from Congress, almost no new spectrum has been made available for mobile consumers.  The last significant auction the agency conducted was in 2008, based on capacity freed up in the digital television transition.

The “shared” spectrum the agency has recently been touting would have to be shared with the Department of Defense and other federal agencies, which have so far stonewalled a 2010 Executive Order from President Obama to vacate their unused or underutilized allocations.  (The federal government is, by far, the largest holder of usable spectrum today, with as much as 60% of the total.)

And after over a year of on-going design, there is still no timetable for the incentive auctions.  Last week, FCC Commissioner Jessica Rosenworcel, speaking to the National Association of Broadcasters, urged her colleagues at least to pencil in some dates.  But even in the best-case scenario, it will be years before significant new spectrum comes online for mobile devices.  The statute gives the agency until 2022.

In the interim, the mobile revolution has been kept alive by creative use of secondary markets, where mobile providers have bought and sold existing licenses to optimize current allocations, and by mergers and acquisitions, which allow network operators to combine spectrum and towers to improve coverage and efficiency.  Many transactions have been approved, but others have not.  Efforts to reallocate or reassign underutilized satellite spectrum are languishing in regulatory limbo.  Local zoning bodies continue to slow or refuse permission for the installation of new equipment.  Delays are endemic.

So even as the FCC pursues its visionary long-term plan for spectrum reform, the agency must redouble efforts to encourage optimal use of existing resources.  The agency and the Department of Justice must accelerate review of secondary market transactions, and place the immediate needs of mobile users ahead of hypothetical competitive harms that have yet to emerge.

In conducting the incentive auctions, the agency needs to keep unrelated conditions and pet projects out of the mix, and it must not artificially limit qualified bidders in order to advance vague policy objectives that have previously spoiled some auctions and unnecessarily depressed prices in others.

Let’s hope today’s oversight hearing will hold Chairman Genachowski to his promise to “[keep] discussions focused on solving problems, and on facts and data….so that innovation, private investment and jobs follow.”  We badly need all three.

(A condensed version of this essay appears today in Roll Call.)

Big Bang Launch of "Big Bang Disruption"–and a Note on Regulatory Implications

In the upcoming issue of Harvard Business Review, my colleague Paul Nunes at Accenture’s Institute for High Performance and I are publishing the first of many articles from an on-going research project on what we are calling “Big Bang Disruption.”

The project is looking at the emerging ecosystem for innovation based on disruptive technologies, following up on work we have done separately and now together over the last fifteen years.

Our chief finding is that the nature of innovation has changed dramatically, calling into question much of the conventional wisdom on business strategy and competition in information-intensive industries–which is to say, these days, every industry.

The drivers of this new ecosystem are ever-cheaper, faster, and smaller computing devices, cloud-based virtualization, crowdsourced financing, collaborative development and marketing, and the proliferation of mobile everything (including, increasingly, not just people but things).

The result is that new innovations now enter the market cheaper, better, and more customizable than the products and services they challenge.  (For example, smartphone-based navigation apps versus standalone GPS devices.)  In the strategy literature, such innovation would be characterized as thoroughly “undisciplined.”  It shouldn’t succeed.  But it does.

So when the disruptor arrives and takes off with a bang, often after a series of low-cost, failed experiments, incumbents have no time for a competitive response.  The old rules for dealing with disruptive technologies, most famously from the work of Harvard’s Clayton Christensen, have become counter-productive.   If incumbents haven’t learned to read the new tea leaves ahead of time, it’s game over.

The HBR article doesn’t go into much depth on the policy implications of this new innovation model, but the book we are now writing will.  The answer should be obvious.

This radical new model for product and service introduction underscores the robustness of market behaviors that quickly and efficiently correct many transient examples of dominance, especially in high-tech markets.

As a general rule (though obviously not one without exceptions), the big bang phenomenon further weakens the case for regulatory intervention.  Market dominance is sustainable for ever-shorter periods of time, with little opportunity for incumbents to exploit it.

A predictable next wave of technology will likely put a quick and definitive end to any “information empires” that have formed from the last generation of technologies.

Or, at the very least, do so more quickly and more cost-effectively than alternative solutions from regulation.  The law, to paraphrase Mark Twain, will still be putting its shoes on while the big bang disruptor has spread halfway around the world.

Unfortunately, much of the contemporary literature on competition policy from legal academics is woefully ignorant of even the conventional wisdom on strategy, not to mention the engineering realities of disruptive technologies already in the market.  Looking at markets solely through the lens of legal theory is, truly, an academic exercise, one with increasingly limited real-world applications.

Indeed, we can think of many examples where legacy regulation actually makes it harder for the incumbents to adapt as quickly as necessary in order to survive the explosive arrival of a big bang disruptor.  But that is a story for another day.

Much more to come.

Related links:

“Why Best Buy is Going out of Business…Gradually,” Forbes.com

“What Makes an Idea a Meme?”, Forbes.com

“The Five Most Disruptive Technologies at CES 2013,” Forbes.com

 

Disruptive Technologies and the Watchful Waiting Principle

When the smoke cleared and I found myself half caught-up on sleep, the information and sensory overload that was CES 2013 had ended.

There was a kind of split-personality to how I approached the event this year.  Monday through Wednesday was spent in conference tracks, most of all the excellent Innovation Policy Summit put together by the Consumer Electronics Association.  (Kudos again to Gary Shapiro, Michael Petricone and their team of logistics judo masters.)

The Summit has become an important annual event bringing together legislators, regulators, industry and advocates to help solidify the technology policy agenda for the coming year and, in this case, a new Congress.

I spent Thursday and Friday on the show floor, looking in particular for technologies that satisfy the Law of Disruption, a term I coined: social, political, and economic systems change incrementally, but technology changes exponentially.

What I found, as I wrote in a long post-mortem for Forbes, is that such technologies are well-represented at CES, but are mostly found at the edges of the show–literally.

In small booths away from the mega-displays of the TV, automotive, smartphone, and computer vendors, in hospitality suites in nearby hotels, or even in sponsored and spontaneous hackathons going on around town, I found ample evidence of a new breed of innovation and innovators, whose efforts may yield nothing today or even in a year, but which could become sudden, overnight market disruptors.

Increasingly, it’s one or the other: nothing at all, or overnight disruption.  That is saying something all by itself.  For one thing, how do incumbents compete with such all-or-nothing innovations?

That, however, is a subject for another day.

For now, consider again the policy implications of such dramatic transformations.  As those of us sitting in room N254 debated the finer points of software patents, the IP transition, copyright reform, and the misapplication of antitrust law to fast-changing technology industries (increasingly, that means ALL industries), just a few feet away the real world was changing all around us.

The policy conference was notably tranquil this year, without such previous hot-button topics as net neutrality, SOPA, or the lack of progress on spectrum reform to generate antagonism among the participants.  But as I wrote at the conclusion of last year’s Summit, at CES, the only law that really matters is Moore’s Law.  Technology gets faster, smaller, and cheaper, not just predictably but exponentially.
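To see how quickly that gap compounds, here is a trivial illustration (my own shorthand, assuming the common rule of thumb that capability doubles roughly every two years):

# Illustrative arithmetic only: exponential improvement under an assumed two-year doubling period.
def capability_multiple(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

for horizon in (2, 6, 10, 20):
    print(f"{horizon:>2} years: ~{capability_multiple(horizon):,.0f}x")
# 2 years: ~2x | 6 years: ~8x | 10 years: ~32x | 20 years: ~1,024x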

As a result, the contrast between what the regulators talk about and what the innovators do gets more dramatic every year, accentuating the figurative if not the literal distance between the policy Summit and the show floor.  I felt as if I had moved between two worlds, one that follows a dainty 19th century wind-up clock and the other that marks time using the Pebble watch, a fully-connected new timepiece funded entirely by Kickstarter.

The lesson for policymakers is sobering, and largely ignored.  Humility, caution, and a Hippocratic-like oath to first do no harm are, ironically, the most useful tools regulators can bring to bear if, as they repeat at ever-shorter intervals, their true goal is to spur innovation, create jobs, and rescue American entrepreneurialism.

The new wisdom is simple, deceptively so.  Don’t intervene unless and until it’s clear that there is demonstrable harm to consumers (not competitors), that there’s a remedy for the harm that doesn’t make things, if only unintentionally, worse, and that the next batch of innovations won’t solve the problem more quickly and cheaply.

Or, as they say to new interns in the Emergency Room, “Don’t just do something.  Stand there.”

That’s a hard lesson to learn for those of us who think we’re actually surgical policy geniuses, only to find increasingly we’re working with blood-letting and leeches.  And no anesthesia.

In some ways, it’s the opposite of the approach Adam Thierer calls the Technology Precautionary Principle.  Instead of panicking when new technologies raise new (but likely transient) issues, first try to let Moore’s Law sort them out, unless and until it becomes crystal clear that it can’t.  Instead of a hasty response, opt for a delayed response.  Call it the Watchful Waiting Principle.

Not as much fun as fuming, ranting, and regulating at the first sign of chaos, of course, but far more helpful.

That, in any case, is the thread running through my dispatches from Vegas:

  1. “Telcos Race Toward an all-IP Future,” CNET
  2. “At CES, Companies Large and Small Bash Broken Patent System,” Forbes
  3. “FCC, Stakeholders Align on Communications Policy—For Now,” CNET
  4. “The Five Most Disruptive Technologies at CES 2013,” Forbes