Once More into the Tar Pits of Privacy Policy

No doubt the gooiest problem at the intersection of technology and law continues to be what is unhelpfully referred to as “privacy.” I’ve clipped five articles on the subject in just the last week, including several about Facebook’s efforts to appease users and government regulators in Canada, demands by Switzerland that Google stop its Street View application, and a report from InformationWeek about proposals from a coalition of U.S. public interest groups for new legislation to beef up U.S. privacy law.

The term “privacy” is unhelpful because, as I explain in Chapter 3 of “The Laws of Disruption,” it is a very broad term applied to a wide range of problems. “Privacy” is shorthand for problems including government surveillance, criminal misuse of information, concealment of personal information from friends and family, and protection of potentially harmful or embarrassing information from employers and other private parties (e.g., insurance firms) who might use it to change the terms of business interactions (e.g., increasing premiums when your vehicle’s RFID toll tag tracks your tendency to speed).

Unfortunately, all of these problems are included in discussions about privacy, and in many cases it’s clear the different parties in the conversation are actually talking about very different aspects of the problem.

Beyond the definitional issues, there is also what one might think of as the irrational or emotional side of privacy. Instinctively, and for different cultural reasons, most of us have a strong reaction to new products and services that will make use of what we think of as intimate facts about ourselves and our activities, even when the goal of that use is to improve the usefulness of such products or make them available at a lower cost.

In the U.S., the national character of a “frontier” nation stressed the desire of eccentric individuals to escape the moral, religious, or social strictures of European life.

In Europe, historical events in which private information was abused to horrific ends (the Inquisition, the Holocaust, the oppression of life in the Soviet Union and its satellites, such as East Germany, where as many as one in three citizens were paid informants on their neighbors) bubble just below the surface of the “debate” about “privacy.”

At one extreme, a small but vocal group of pseudo-millennialists believe that identification technologies are a sign of the coming of the Antichrist, as prophesied in the Book of Revelation.

Of course the problems faced by policy makers on a day-to-day basis seem modest, even trivial, in isolation. Users of Facebook applications (quizzes and the like) allow outside software developers to use the identity of their friends to pass scores around, or to challenge other users. Google Street View, which aims to enhance Google Maps with real photos of streets and houses, inadvertently and perhaps unavoidably takes photos that show random but identifiable people and vehicles that happen to be present when the photos are taken.

Behavioral advertising aims to use contextual information about what users are doing online to present ads that are more likely to be of interest than the random guessing that has historically characterized advertising, such as the spots that run during a television program.

I have to be honest and say I too have a visceral reaction when a targeted ad pops up in an unexpected context, as happens regularly on Facebook, Google, and other applications where I might be engaged in a variety of personal and business communications.

It always reminds me of the time, in the early 1980s, when I called in to a local cable access program in New York where a hippie astrologer gave consultations on the air, aided by a small, well-dressed man sitting next to her with a laptop computer. There was no tape delay on the show, and when I was on the air, the TV was literally talking directly to me, a true out-of-body experience. (The astrologer also gave profoundly good advice!)

Like most consumers, however, I quickly get over that response and realize that the appearance of intimacy, indeed of inappropriate intimacy, is just that—an appearance. Google isn’t trying to get photos of people who aren’t where they’re supposed to be. Facebook isn’t trying to undermine my personal relationships.

Behavioral ads appear to be personal, but the reality of course is that ALL of the processing is being done by cold, lifeless, uncaring computers. Gmail may “read” the contents of my messages in order to serve me certain ads. But Gmail is not a person, and there is no person or army of persons at Google sitting around reading my mail. No one, sad to say, would care enough to do so. I’m not worth blackmailing. And blackmail is already a crime.

National governments and public interest groups can and will continue to impose new conditions on Internet products and services. (Europeans, for example, have powerful rights under Directive 95/46 over any use of their personally identifiable information.)

The reality, however, is that such regulations are always straining for a balance between the visceral response to “new” privacy invasions and the benefits to consumers that come from allowing the information to be used. Benefiting consumers, in the case of business use of information, is always the goal, even if it’s consumers as a whole who benefit rather than the individual, as in my driving habits/insurance example. (Companies make lots of mistakes, and launch ill-conceived products and services, and some of the abuses have been spectacular and public. Criminal use, such as identity theft, and government surveillance, again, are different problems.)

For the most part, consumers, if only unconsciously, seem to know how to weigh the pluses and minuses of new uses of “private” information and decide which ones to allow. I don’t mean to suggest that the market is always right, and never needs external correction. But for every change in privacy law that requires new disclosures, opt-in or opt-out provisions, and other consumer protections, it doesn’t seem to take long for most, if not the vast majority, of consumers to agree to let the information flow where it may. Information, even private information, wants to be free.