
Praise for California passage of law protecting VoIP from local utility regulators

On Friday, California Governor Jerry Brown signed SB 1161, which prohibits the state’s Public Utilities Commission from any new regulation of Voice over Internet Protocol or other IP-based services without the legislature’s authorization.

California now joins over twenty states that have enacted similar legislation.

The bill, which is only a few pages long, was introduced by State Senator Alex Padilla (D) in February.  It passed both houses of the California legislature with wide bipartisan majorities.

California lawmakers and the governor are to be praised for quickly enacting this sensible piece of legislation.

Whatever the cost-benefit of continued state regulation of traditional utilities such as water, power, and landline telephone services, it’s clear that the toolkit of state and local PUCs is a terrible fit for Internet services such as Skype, Google Voice or Apple’s FaceTime.

Historically, as I argued in a Forbes piece last month, the imposition of public utility status on a service provider has been an extreme response to an extreme situation—a monopoly provider, unlikely to have competition because of the high cost of building and operating competing infrastructure (so-called “natural monopoly”), offering a service that is indispensable to everyday life.

Service providers meeting that definition are transformed by PUC oversight into entities that are much closer to government agencies than private companies.  The PUC sets and modifies the utility’s pricing in excruciating detail.  PUC approval is required for each and every change or improvement to the utility’s asset base, or to add new services or retire obsolete offerings.

In exchange for offering service to all residents, utilities in turn are granted eminent domain and rights of way to lay and maintain pipes, wires and other infrastructure.

VoIP services may resemble traditional switched telephone networks, but they have none of the features of a traditional public utility.  Most do not even charge for basic service, nor do they rely on their own dedicated infrastructure.  Indeed, the reason VoIP is so much cheaper to offer than traditional telephony is that it can take advantage of the existing and ever-improving Internet as its delivery mechanism.

Because entry is cheap, VoIP providers have no monopoly, natural or otherwise.  In California, according to the FCC, residents have their choice of over 125 providers—more than enough competition to ensure market discipline.

Nor would residents be in any way helped by interposing a regulator to review and pre-approve each and every change to a VoIP provider’s service offerings.  Rather, the lightning-fast evolution of Internet services provides perhaps the worst mismatch possible for the deliberate and public processes of a local PUC.

Software developers don’t need eminent domain.

But the most serious mismatch between PUCs and VoIP providers is that there is little inherently local about VoIP offerings.  Where a case can be made for local oversight of public utilities operating extensive—even pervasive—local infrastructure, it’s hard to see what expertise a local PUC brings to the table in supervising a national or even international VoIP service.

On the other hand, it’s not hard to imagine the chaos and uncertainty VoIP providers and their customers would face if they had to satisfy fifty different state PUCs, not to mention municipal regulators and regulators in other countries.

In most cases that would mean dealing with regulators on a daily basis, on every minor aspect of a service offering.  In the typical PUC relationship, the regulator becomes the true customer and the residents mere “rate-payers” or even just “meters.”

Public utilities are not known for their constant innovation, and for good reason.

Whatever oversight VoIP providers require, local PUCs are clearly the wrong choice.  It’s no surprise, then, that SB 1161 was endorsed by major Silicon Valley trade groups, including TechNet, TechAmerica, and the Silicon Valley Leadership Group.

The law is a win for California residents and California businesses—both high-tech and otherwise.

Links

  1. “Government Control of Net is Always a Bad Idea,” CNET News.com, June 4, 2012.
  2. “Memo to Jerry Brown:  Sign SB 1161 for all Internet users,” CNET News.com, August 30, 2012.
  3. “The Madness of Regulating VoIP as a Public Utility,” Forbes.com, Sept. 10, 2012.
  4. “Brown Endorses Hands off Stance on Internet Calls,” The San Francisco Chronicle, Sept. 28, 2012.

What Google Fiber, Gig.U and US Ignite Teach Us About the Painful Cost of Legacy Regulation

On Forbes today, I have a long article on the progress being made to build gigabit Internet testbeds in the U.S., particularly by Gig.U.

Gig.U is a consortium of research universities and their surrounding communities created a year ago by Blair Levin, an Aspen Institute Fellow and, before that, the principal architect of the FCC’s National Broadband Plan.  Its goal is to work with private companies to build ultra high-speed broadband networks with sustainable business models.

Gig.U, Google Fiber’s Kansas City project, and the White House’s recently-announced US Ignite project spring from similar origins and share similar goals.  Their general belief is that by building ultra high-speed broadband in selected communities, consumers, developers, network operators and investors will get a clear sense of the true value of Internet speeds that are 100 times as fast as those available today through high-speed cable-based networks.  The hope is that they will then go build a lot more of them.

Google Fiber, for example, announced last week that it would be offering fully-symmetrical 1 Gbps connections in Kansas City, perhaps as soon as next year.  (By comparison, my home broadband service from Xfinity is 10 Mbps download and considerably slower going up.)

US Ignite is encouraging public-private partnerships to build demonstration applications that could take advantage of next generation networks and near-universal adoption.  It is also looking at the most obvious regulatory impediments at the federal level that make fiber deployments unnecessarily complicated, painfully slow, and unduly expensive.

I think these projects are encouraging signs of native entrepreneurship focused on solving a worrisome problem:  the U.S. is nearing a dangerous stalemate in its communications infrastructure.  We have the technology and scale necessary to replace much of our legacy wireline phone networks with native IP broadband.  Right now, ultra high-speed broadband is technically possible by running fiber to the home.  Indeed, Verizon’s FiOS network currently delivers 300 Mbps broadband and is available to some 15 million homes.

But the kinds of visionary applications in smart grid, classroom-free education, advanced telemedicine, high-definition video, mobile backhaul and true teleworking that would make full use of a fiber network don’t really exist yet.  Consumers (and many businesses) aren’t demanding these speeds, and Wall Street isn’t especially interested in building ahead of demand.  There’s already plenty of dark fiber deployed, the legacy of earlier speculation that so far hasn’t paid off.

So the hope is that by deploying fiber to showcase communities and encouraging the development of demonstration applications, entrepreneurs and investors will get inspired to build next generation networks.

Let’s hope they’re right.

What interests me personally about the projects, however, is what they expose about regulatory disincentives that unnecessarily and perhaps fatally retard private investment in next-generation infrastructure.  In the Forbes piece, I note almost a dozen examples from the Google Fiber development agreement where Kansas City voluntarily waived permits, fees, and plodding processes that would otherwise delay the project.  As well, in several key areas the city actually commits to cooperate and collaborate with Google Fiber to expedite and promote the project.

As Levin notes, Kansas City isn’t offering any funding or general tax breaks to Google Fiber.  But the regulatory concessions, which implicitly acknowledge the heavy burden imposed on those who want to deploy new privately-funded infrastructure (many of them the legacy of the early days of cable TV deployments), may still be enough to “change the math,” as Levin puts it, making otherwise unprofitable investments justifiable after all.

Just removing some of the regulatory debris, in other words, might itself be enough to break the stalemate that makes building next generation IP networks unprofitable today.

The regulatory cost puts a heavy thumb on the side of the scale that discourages investment.  Indeed, as fellow Forbes contributor Elise Ackerman pointed out last week, Google has explicitly said that part of what made Kansas City attractive was the lack of excessive infrastructure regulation, and the willingness and ability of the city to waive or otherwise expedite the requirements that were on the books.  (Despite the city’s promises to bend over backwards for the project, she notes, there have still been expensive regulatory delays that promoted no public values.)

Particularly painful to me was testimony by Google Vice President Milo Medin, who explained why none of the California-based proposals ever had a real chance.  “Many fine California city proposals for the Google Fiber project were ultimately passed over,” he told Congress, “in part because of the regulatory complexity here brought about by [the California Environmental Quality Act] and other rules. Other states have equivalent processes in place to protect the environment without causing such harm to business processes, and therefore create incentives for new services to be deployed there instead.”

Ouch.

This is a crucial insight.  Our next-generation communications infrastructure will surely come, when it does come, from private investment.  The National Broadband Plan estimated it would take $350 billion to get 100 Mbps Internet to 100 million Americans through a combination of fiber, cable, satellite and high-speed mobile networks.  Mindful of reality, however, the plan didn’t even bother to consider the possibility of full or even significant taxpayer funding to reach that goal.

Unlike South Korea, we aren’t geographically small, with a largely urban population living in just a few cities.  We don’t have a largely nationalized and taxpayer-subsidized communications infrastructure.  On a per-person basis, deploying broadband in the U.S. is much harder, more complicated, and more expensive than it is in many competing nations in the global economy.

Of course, nationwide fiber and mobile deployments by network operators including Verizon and AT&T can’t rely on gimmicks like Google Fiber’s hugely successful competition, where 1,100 communities applied to become a test site.  Nor can they, like Gig.U, cherry-pick research university towns, which have the most attractive demographics and density to start with.  Nor can they simply call themselves start-ups and negotiate the kind of freedom from regulation that Google and Gig.U’s membership can.

Large-scale network operators need to build, if not everywhere, then to an awful lot of somewheres.  That’s a political reality of their size and operating model, as well as the multi-layer regulatory environment in which they must operate.  And it’s a necessity of meeting the ambitious goal of near-universal high-speed broadband access, and of many of the applications that would use it.

Under the current regulatory and economic climate, large-scale fiber deployment has all but stopped for now.  Given the long lead-time for new construction, we need to find ways to restart it.

So everyone who agrees that gigabit Internet is a critical element in U.S. competitiveness in the next decade or so ought to look closely at the lessons, intended or otherwise, of the various testbed projects.  They are exposing in stark detail a dangerous and useless legacy of multi-level regulation that makes essential private infrastructure investment economically impossible.

Don’t get me wrong.  The demonstration projects and testbeds are great.  Google Fiber, Gig.U, and US Ignite are all valuable efforts.  But if we want to overcome our “strategic bandwidth deficit,” we’ll need something more fundamental than high-profile projects and demonstration applications.  To start with, we’ll need a serious housecleaning of legacy regulation at the federal, state, and local level.

Regulatory reform might not be as sexy as gigabit Internet demonstrations, but the latter ultimately won’t make much difference without the former.  Time to break out the heavy demolition equipment—for both.

The Federal Government Plays the Spectrum Shell Game

On CNET today, I’ve posted a long critique of the recent report by the President’s Council of Advisors on Science and Technology (PCAST) urging the White House to reverse course on a two-year old order to free up more spectrum for mobile users.

In 2010, soon after the FCC’s National Broadband Plan raised alarms about the need for more spectrum for an explosion in mobile broadband use, President Obama issued a Memorandum ordering federal agencies to free up as much as 500 MHz of radio frequencies currently assigned to them.

After a great deal of dawdling, the National Telecommunications and Information Administration, which oversees spectrum assignments within the federal government, issued a report earlier this year that seemed to offer progress:  95 MHz of very attractive spectrum could in fact be cleared within the ten years called for by the White House.

But reading between the lines, it was clear that the 20 agencies involved in the plan had no serious intention of cooperating.  Their cost estimates for relocation (which NTIA simply reported without any indication of how they had been arrived at, or even whether NTIA had been given any details) were set at levels that would make any move economically impossible.

And the NTIA’s suggestion that some of the bands could be “shared” sounded appealing until the details revealed that the feds would place impossible conditions on that sharing.

In the end, the NTIA report was 200 pages of classic smoke-and-mirrors from an entrenched bureaucracy that is expert at avoiding change.

The PCAST report seemed to throw in its cards and accept the political reality that actual spectrum clearing in the federal bands would never happen.  Instead, the President’s advisors doubled down on “sharing,” calling for a new “Spectrum Access System” based on sharing technologies that, as the report itself admitted, don’t exist yet.

An SAS might be a better system in the long term, but current technical and political limitations make such a system impractical.  I argue in the piece that the NTIA and PCAST reports are just providing cover for federal agencies, notably the Departments of Defense and Justice, to avoid actually having to follow the President’s order and take aggressive steps to free up spectrum that is needed now.  Whether this is intentional or not I leave to more savvy tea-leaf readers.

Updates to the Media Page

We’ve added over a dozen new posts to the Media page, covering some of the highlights in articles and press coverage for April and May, 2012.

Topics include privacy, security, copyright, net neutrality, spectrum policy, the continued fall of Best Buy and antitrust.

The new posts include links to Larry’s inaugural writing for several publications, including Techdirt, Fierce Mobile IT, and Engine Advocacy.

There are also several new video clips, including Larry’s interview of Andrew Keen, author of the provocative new book, “Digital Vertigo,” which took place at the Privacy Identity and Innovation conference in Seattle.

June was just as busy as the rest of the year, and we hope to catch up with the links soon.

Everyone Out of the Internet!

I remember a bumper sticker from the 1970s that summed up the prevailing anti-colonial attitude that had developed during the late 1960s:  “U.S. Out of North America.”

That sentiment nicely reflects my activities this week, which include three articles decrying efforts by regulators to oversee key aspects of the Internet economy.  Of course their intentions—at least publicly—are always good.  But even with the right idea, the unintended negative consequences always overwhelm the benefits by a wide margin.

Governments are just too slow to respond to the pace of innovation in information technology.  Nothing will fix that.  So better just to leave well enough alone and intercede only when genuine consumer harm is occurring.  And provable.

The articles cover the spectrum from state (California) to federal (FCC) to international (ITU) regulators, and a wide range of truly bad ideas: from the desire of California’s Public Utilities Commission to “protect” consumers of VoIP services, to the FCC’s latest effort to elbow its way into regulating broadband Internet access at the middle mile, to a proposal from European telcos to have the U.N. implement a tariff system on Internet traffic originating from the U.S.

Here they are:

  1. “Government Control of the Net is Always a Bad Idea” (CNET) – http://news.cnet.com/8301-13578_3-57446383-38/government-control-of-net-is-always-a-bad-idea/?tag=mncol;cnetRiver
  2. “The FCC Noses Under the Broadband Internet Tent” (Forbes) – http://www.forbes.com/sites/larrydownes/2012/06/06/the-fcc-noses-under-the-broadband-internet-tent/
  3. “U.N. Could Tax U.S.-based Websites, Leaked Docs Show” (CNET) – http://news.cnet.com/8301-1009_3-57449375-83/u.n-could-tax-u.s.-based-web-sites-leaked-docs-show/?tag=mncol;topStories

That third one, by the way, was written with CNET’s Chief Political Correspondent Declan McCullagh.  It represents a genuine scoop, based on leaked documents posted by my Tech Liberation Front colleagues Jerry Brito and Eli Dourado on WCITLeaks.org!

Rescuing Mobile Broadband from Washington's Deadly Embrace

(Adapted from Bloomberg BNA Daily Report for Executives, May 16, 2012.)

Two years ago, the Federal Communications Commission’s National Broadband Plan raised alarms about the future of mobile broadband. Given unprecedented increases in consumer demand for new devices and new services, the agency said, network operators would need far more radio frequency assigned to them, and soon. Without additional spectrum, the report noted ominously, mobile networks could grind to a halt, hitting a wall as soon as 2015.

That’s one reason President Obama used last year’s State of the Union address to renew calls for the FCC and the National Telecommunications and Information Administration (NTIA) to take bold action, and to do so quickly. The White House, after all, had set an ambitious goal of making mobile broadband available to 98 percent of all Americans by 2016. To support that objective, the president told the agencies to identify quickly an additional 500 MHz of spectrum for mobile networks.

By auctioning that spectrum to network operators, the president noted, the federal government could reduce the deficit by nearly $10 billion.  That way, not only would the Internet economy be accelerated, but taxpayers would actually save money in the process.

A good plan. So how is it working out?

Unfortunately, the short answer is:  Not well.  Speaking this week at the annual meeting of the mobile trade group CTIA, FCC Chairman Julius Genachowski had to acknowledge the sad truth:  “the overall amount of spectrum available has not changed, except for steps we’re taking to add new spectrum on the market.”

The tortured grammar (how can “steps we’re taking to add new spectrum” constitute an exception to the statement that the amount of available spectrum “has not changed”?) betrays the reality here—all the FCC Chairman can do is promise more spectrum sometime in the vague future.  For now, the FCC and the NTIA have put almost no new spectrum into actual use.  Instead, the two agencies have piled up a depressing list of delays, scandals, and wasted opportunities.  Consider just a few:

– NTIA’s long-overdue report on freeing up government spectrum identified nearly 100 MHz of frequencies that could be reallocated for mobile broadband. But the 20 agencies involved in the study demanded 10 years and nearly $18 billion to vacate the spectrum—and insisted on moving to frequencies that are already assigned to other public or private license holders. An available 20 MHz of unassigned frequency, left over from the 2009 conversion to digital TV, was actually added to the government’s supply when it was set aside this year for a dedicated public safety network.

– After years of wrangling with Congress, the FCC finally won limited authority to hold “voluntary incentive auctions” for spectrum currently licensed to over-the-air television broadcasters. But those auctions will take years to complete, and a decided lack of enthusiasm among broadcasters doesn’t portend well for the outcome.  As for reducing the deficit, the agency has reserved the right to disqualify bidders it believes already hold more spectrum than the agency thinks best for stimulating competition, even without any measurable signs of market failure.  (Voice, data, and text prices continue to decline, according to the FCC’s own data.)

– LightSquared’s efforts to reallocate satellite spectrum for use in a competitive new mobile broadband network were crippled—perhaps fatally—by concerns raised by the Department of Defense and others over potential interference with some global positioning system (GPS) devices.  Initial permission to proceed was swiftly revoked—after the company had invested billions.  The FCC’s procedural blunders in the LightSquared case ignited a political scandal that continues to distract the agency. A similar effort by Dish Network is now being put through the full set of administrative hurdles, delayed at least until after the election.

– Transactions in the secondary spectrum markets—long the only real source of supply for mobile network operators—have received an increasingly frosty reception. Last year, AT&T’s planned merger with T-Mobile USA was scuttled on the basis of dubious antitrust concerns the FCC backed up with data that was clumsily rigged by agency staff.  Now, the agency has expanded its review of Verizon’s efforts to buy spectrum from a consortium of cable companies—spectrum that currently isn’t being used for anything.

– After the FCC mandated data roaming agreements even for carriers who hold spectrum in the same markets, Sprint announced it would stop serving customers with its own network in two metropolitan areas, piggybacking instead on AT&T’s brand-new LTE facilities. Sprint’s move underscores concerns that mandatory roaming will reduce incentives for carriers to invest in infrastructure. According to the FCC, mobile industry investments have reached nearly 15 percent of total revenue in recent years. Of the leading providers, only Sprint decreased its investments during the recession.

Not an impressive showing, to say the least.  Meanwhile, in the real world, demand for mobile broadband continues to mushroom. Network usage has increased as much as 8,000% since 2007, when Apple’s iPhone first hit the market. It was followed by an explosion of new devices, operating systems, and software apps from a cottage industry of developers large and small. This remarkable ecosystem is driving lightning-fast adoption of mobile services, especially bandwidth-intensive video apps.

The mobile broadband ecosystem is one of the few bright spots in the sour economy, creating jobs and generating tax revenues. Makers of tablet computers, for example, expect to sell over 100 million units this year alone. Tablet users, by the way, already rely on the wildly popular devices for 15 percent of their TV viewing, raising the demand for high-bandwidth video services on existing mobile broadband networks.

Spectrum is the principal fuel of these fast-growing mobile applications. So FCC Chairman Julius Genachowski is right to repeatedly emphasize the catastrophic consequences of an imminent “spectrum crunch.”  The FCC is leading the chorus of doomsayers who believe that without more spectrum—and soon—our  mobile revolution will never reach its full economic, educational, and social potential.

But the government has done nothing to head off that disaster. Instead, the FCC, the NTIA, and the Obama administration continue to make policy choices that do little to get more spectrum into the system. If anything, we’re moving backwards.

Many of these decisions appear to be driven by short-term political imperatives, overriding the worthy goal of making mobile broadband available to all Americans as quickly as possible. The AT&T/T-Mobile deal, for example, was killed simply because the FCC didn’t like the idea of taking even a failing carrier out of the competitive equation. Yet had the deal gone through, AT&T was committed to deploying mobile broadband to 95 percent of all Americans—nearly meeting the president’s goal in a single stroke.

This is nothing new. The FCC has a very long and very messy history of using its spectrum management powers to shape emerging markets, and to pick winners and losers among new technologies, applications, and providers.  Its guiding principle for nearly 100 years has been the so-called “public interest” standard—an undefined and highly malleable policy tool the FCC employs like a bludgeon.

The era of micromanaging the airwaves by federal fiat must now end once and for all. For the first time in a century of federal stewardship, there is almost no inventory of usable spectrum. It has all been allocated to some 50,000 public and private license holders, each the one-time favorite of the FCC. Our spectrum frontier has closed.  And it wouldn’t have closed so soon if the FCC hadn’t remained so determined to manage a 21st century resource as if it were still the 19th century.

Technology may come to our rescue, at least in part. Hardware and software for sharing spectrum, switching frequencies, and maximizing the technical properties of different bandwidths continue to be part of the innovation agenda of the mobile industry. But it is unlikely these developments will be enough to keep spectrum supply even slightly ahead of unbridled consumer demand. Many of these technologies, in any case, still require FCC approval to be deployed. That means even more delays.

Saving the mobile ecosystem—and making way for the next generation of mobile innovation—demands a bold new strategy. For starters, it is time to stage an intervention for federal agencies hoarding spectrum. Private licensees who no longer need the spectrum they have must be able to sell their rights quickly in a working market, and be prodded when needed to do so. Buyers need the freedom to repurpose spectrum to new uses.

Also, we need to increase incentives for network operators to continue investing in better and more efficient infrastructure, not throw cold water on them in the name of a vague and largely undefined public interest.  The number of competitors isn’t what matters.  It’s the ability of consumers to get what they want at prices that, at least up until now, continue to decline.

In short, we need to take the FCC out of the middle of every transaction and each innovation, slowing Silicon Valley-paced markets down to Washington speed.

With the appetite of mobile consumers growing more voracious, it is long past time for Congress to take a cold, sober look at our obsolete system for spectrum management and the antiquated agency that can’t stop fussing over it. We need a new system, if not a new FCC. That’s the only way to keep the mobile frontier booming, let alone meet the admirable goal of providing a homestead there for every American.