The china in the electronic-spy agency’s dining room was exquisite, as was the meal. Ron Rivest, co-inventor of the RSA approach to public-key cryptography, and I were having lunch with the National Security Agency’s director, Bobby Inman. We were trying to impress on him that the forthcoming growth of the Information Marketplace would create severe privacy problems, and that the agency should extend the role of cryptography from ensuring secure communications within the U.S. government (and breakable ones outside it) to protecting the privacy of U.S. citizens and organizations, with approaches like RSA. The admiral didn’t believe us; our claims of a widely interconnected civilian world in the ’90s sounded like pie in the sky. Two decades later, in April 1999, The Economist swung to the other extreme, proclaiming on its cover “The End of Privacy.”
Under-reaction then! Over-reaction now!
No doubt, the technologies of information can be used to attack our privacy. But they can also be used to protect it. For example, if we agreed that everyone using the Internet did so under the RSA regime of creating and using their own public and private keys, we would end up with secure communications and files, and the ability to digitally sign contracts and checks as effectively as we do now by hand. This high level of personal privacy would, however, preclude governments from legally tapping a suspect’s private data and would also prevent anonymity, thereby angering Right and Left simultaneously. If we don’t like this outcome, we have technologies on hand to establish nearly any desired blend of personal privacy, anonymity and governmental intervention.
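To make the key-pair idea concrete, here is a deliberately toy sketch in Python, using tiny textbook numbers: the holder of the private key signs, and anyone holding the matching public key can check the signature. It illustrates the principle only; real systems use enormous primes, hashing and padding, and none of the numbers or function names below come from any actual product.

```python
# Toy, textbook-RSA sketch (tiny primes, no hashing or padding).
# Illustration of the principle only -- not secure cryptography.

p, q = 61, 53                 # two (very small) primes, known only to the key owner
n = p * q                     # modulus, part of the public key
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

def sign(message: int, priv: int = d, modulus: int = n) -> int:
    """'Sign' by exponentiating the message with the private key."""
    return pow(message, priv, modulus)

def verify(message: int, signature: int, pub: int = e, modulus: int = n) -> bool:
    """Anyone with the public key can check that the signature matches."""
    return pow(signature, pub, modulus) == message

m = 65                        # a message, encoded as a number smaller than n
s = sign(m)
assert verify(m, s)           # a genuine signature checks out
assert not verify(m + 1, s)   # a tampered message does not
```

The asymmetry is the whole point: only the keeper of the private exponent can produce a valid signature, yet anyone can verify it, which is what makes digitally signed contracts and checks possible.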
Such cryptographic approaches would not stop companies with which you do business from selling the personal data you give them, corrupting it, or tracking the Web sites you frequent. Not to worry. There is technology around to handle these problems as well: a scheme called P3P, developed by the World Wide Web Consortium, places software within your browser and in the Web sites of vendors. In a P3P personal profile, which you write once, you specify the personal information you are willing to give away, along with what others are allowed to do with it. A similar script in the vendor’s software identifies the personal information the vendor requires and its planned disposition. These two pieces of software “shake hands” before every business transaction and allow it to proceed only if both privacy declarations are satisfied. In a variation of this scheme, governments could introduce absolute privacy policies by requiring, for example, a minimum level of privacy in the P3P profile of every citizen.
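A much-simplified sketch of that handshake might look like the Python below. The field names and structure are hypothetical; real P3P policies are XML documents interpreted by the browser, not dictionaries.

```python
# Hypothetical, much-simplified sketch of the P3P "handshake" described above.
# The data elements and purposes are illustrative assumptions, not the W3C format.

user_profile = {
    # data element -> purposes the user permits it to be used for
    "email":       {"order-confirmation"},
    "postal-code": {"order-confirmation", "aggregate-statistics"},
    # "browsing-history" is deliberately absent: the user will not share it
}

vendor_policy = {
    # data element the vendor requires -> how it intends to use it
    "email":       {"order-confirmation"},
    "postal-code": {"aggregate-statistics"},
}

def handshake(profile: dict, policy: dict) -> bool:
    """Allow the transaction only if every vendor requirement is covered
    by the user's declared preferences."""
    return all(
        element in profile and purposes <= profile[element]
        for element, purposes in policy.items()
    )

if handshake(user_profile, vendor_policy):
    print("privacy declarations compatible -- proceed with transaction")
else:
    print("declarations conflict -- block the transaction")
```

The point is that the comparison is mechanical: if every purpose the vendor declares is covered by the user’s stated preferences, the transaction goes through; otherwise it is blocked before any data changes hands.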
These examples accurately suggest that we have enough technology around to provide nearly any level of privacy we want. But what do we want? In the United States, consumers have become accustomed to treating privacy as a tradable commodity: we don’t mind giving some of it away to get the goods and services we desire. Vendors are pushing for this approach because they are moving away from mass marketing to one-on-one selling, and are therefore anxious to build intimate knowledge of individual interests and habits.
To most non-Americans, however, privacy is not a tradable commodity but an inalienable right that must be guaranteed and protected, especially in the case of minors. The European Union, flexing its muscle, recently threatened to forbid its citizenry from doing electronic commerce with organizations (read: American ones) that do not meet a minimal threshold of absolute privacy guarantees. The Europeans have since backed down and gone to committee, as they and their American partners search for common ground. Last February at the World Economic Forum in Davos, Switzerland, a few industrialists tried to establish a voluntary code under which vendors would give you, upon request, all the personal information they have on you, explain what they plan to do with it, and correct it if asked. Adoption of this code seemed a small and achievable step, but it failed to pass: the American vendors saw it as an expensive and difficult proposition to implement, and a potential leak of their marketing approaches to adversaries.
Clearly, we disagree about the kind of privacy we want. And we don’t seem serious enough about reaching agreement: at that same meeting in Davos, I almost fell out of my chair when several world leaders asked the technologists present to “go figure out a solution to the privacy problems you brought upon us!” This abrogation of what should be a central responsibility of politicians and legislators must stop.
Let’s not surrender our privacy to the big lie of technological inevitability. Let us, instead, augment the debates of privacy specialists with a far broader discussion in the national legislatures of the industrial world and within international organizations, focusing on one issue: the kind of privacy people want. And let’s be flexible; even though the United States sports most of the world’s Web sites, we cannot expect six billion people to automatically adopt American constitutional amendments and habits. Reaching agreement on the kind of privacy people want, nationally and internationally, is an important and achievable goal at this stage of our history: we should be able to do it, as we have already done with passports, trade, airlines and cross-border justice.