5 Va. J.L. & Tech. 6 (2000) <http://www.vjolt.net>
1522-1687 / © 2000 Virginia Journal of Law and Technology Association
VIRGINIA JOURNAL of
LAW and TECHNOLOGY
UNIVERSITY OF VIRGINIA
5 VA. J.L. & TECH. 6
Between Big Brother and the Bottom Line: Privacy in Cyberspace
By Seth Safier[*]
II. Cyberspace and the Digital Revolution
III. The Reagan Revolution
V. A Digital Taxonomy
A. Collection
1. Traditional Collection
2. Transaction Generated Information
B. Storage and Processing
1. List Vending and Direct Marketing
2. Intelligent Agents and Push Technology: Cyberapplications
VI. Existing Legal Protection
A. Federal Statutory Protection
1. Fair Credit Reporting Act
2. Privacy Act
3. Cable Communications Policy Act
4. Electronic Communications Privacy Act
5. Video Privacy Protection Act
6. Telephone Consumer Protection Act
7. Proposed Legislation
8. Federal Trade Commission Initiatives
B. State Statutory Protection
C. The European Model
D. Constitutional Protection
E. Common Law Protection
VII. Self-Regulation and Market Based Solutions
A. Opt In/Opt Out
C. OPS and P3P
VIII. The Existing Regulatory Debate
IX. Enhancing the Digital Trust
X. Shifting Ground
The right to be let alone--the most comprehensive of rights, and the right most valued by civilized men.
Justice Louis Brandeis, dissenting in Olmstead v. U.S., 277 U.S. 438 (1928)
- On January 25, 1994, in the prepared text of his first State of the Union Address, President Clinton declared,
We must work with the private sector to connect every classroom, every clinic, every library, and every hospital
in America to a national information superhighway by the year 2000. Instant access to information will increase
productivity, help educate our children, provide better medical care and create jobs. I call on Congress this
year to pass legislation to establish the information superhighway.
- In the speech, President Clinton formally introduced the population to what academics, computer scientists,
techies and digerati call "Cyberspace and the Information Age." Over six years later, as the technology
at the heart of the "information superhighway" continues to develop at exponential rates, cyberspace,
the information age, and the information superhighway remain difficult to define, establish or develop,
and are still only partially understood, both technically and sociologically.
II. Cyberspace and the Digital Revolution
- In the mid-1960s, around the time that IBM introduced its commercially successful System/360 mainframe,
the business world was one of file cabinets, carbon copies, Dictaphones, ribbon typewriters and handwritten receipts
and ledgers. Customers primarily bought goods and services with cash drawn from neighborly tellers on wages earned
from local employers. Research and development were patriotic and the IBM man de rigueur.
- Gradually, the research and development began to pay dividends: rudimentary photocopiers replaced carbon
paper, and cumbersome, incipient computers and printers replaced ribbon typewriters. File cabinets eventually gave
way to databases, and cash transactions gave way to charge accounts. By the late 1970s, computers were fixtures in most businesses,
and growing numbers of consumers were buying home computers such as the Apple II, TRS-80 and Commodore PET. By
the 1980s, new computer chip-driven technologies were exploding and heralding the information age. Cellular phones,
fax machines, powerful personal computers, digital databases, electronic cash registers, and intricate interoperable
networking systems bombarded businesses and consumers alike. By the early 1990s, mail went electronic, computing
went super, voice went digital, cable went fiber optic and "surfing" went from a board on the ocean to
a mouse on a desktop.
- In a 1992 opinion survey, 79% of Americans agreed that "computers have improved the quality of life in
our society." The ubiquity of computers and networks had drastically
affected daily existence in Western societies. Whether through computer-aided drafting, word processing, just-in-time
manufacturing, finding point-to-point directions in less than 30 seconds on the World Wide Web ("Web"
or "WWW") or visiting the ATM, computers enhanced efficiency, and consumers understood and appreciated
it. Computer chips had established themselves as the engines driving the age.
- With the same speed, however, these technologies and resultant social changes began to precipitate challenges
to individual privacy. Just as the advent of the wiretap created
a dilemma for Fourth Amendment jurisprudence in 1928, the digitization
of records, parabolic microphones, remote sensing satellites, smartcards, lie detector tests, genetic fingerprinting,
caller-identification, cookie.txt files, clickstream data collection, "push technologies" and "intelligent agents" are
similarly forcing us to pause and ask fundamental questions about cyberspace and the limits of individual privacy.
- Due to enhanced processing, retrieval and storage power, intricate networks and the ubiquity of chip driven
technologies, 68% of Americans surveyed in the same 1992 opinion poll also agreed that "the present use of
computers represents an actual threat to personal privacy," an increase from 1974 and 1978, when only 38%
and 37% of Americans, respectively, agreed with the statement. A
1993 public opinion survey subsequently revealed that 83% of Americans were "concerned" with threats
to personal privacy. A 1995 Louis Harris poll showed
that the number of people who were "very concerned" about privacy had increased almost 50% between 1978
and 1995. In recent years, numerous corporations and governmental
organizations, including Lexis-Nexis, Blockbuster, Lotus Development Corporation, Equifax Marketing Decision Systems, America Online and the Social Security Administration, have changed policies, or altered business decisions, in reaction
to public outcry over privacy concerns.
- While the collection, processing, use and storage of personal information in cyberspace may raise pivotal concerns
about privacy, if numbers are accurate indicators, personal information primarily raises revenue. Venture capitalists
are betting big dollars on cyberspace's potential in markets such as banking, wholesale business transactions,
entertainment, retail, investment, marketing and even universal currency. The Internet economy is estimated to grow past the $1 trillion mark in 2001 and then to $2.8 trillion
in 2003. A recent study from Ciemax-WEFA, an economics consulting
group, commissioned by the Direct Marketing Association, indicated that one of every 13 jobs in the United States
was the result of direct marketing sales activity, including jobs designing and selling advertising, supplying
or delivering goods, and selling other support services, such as customer lists and consumer profiles to direct-response
businesses. The same study revealed that direct marketing sales to consumers reached $630 billion in 1996, up from
$458 billion in 1991. Business to business sales were another $540 billion in 1996, up from $349 billion in 1991.
- Beyond their remarkable profit potential, the common denominator of these chip-driven technologies--simultaneously
improving our daily lives and threatening our privacy--is that they all function as gateways to cyberspace. Essentially,
cyberspace is the space where digitized information lives, works and dies. More fundamentally, cyberspace knows
few if any physical limitations; inherently, it is a social construct. Physics does not exist in cyberspace. Rather,
the most comprehensible and malleable limitation in cyberspace is technology or, in Harvard Law School Professor
Lawrence Lessig's terminology, code. Thus, unless it is prohibitively
expensive in terms of cost of storage, time or effect on customer relationships or unless the cyber transaction
has been technologically secured, in theory every purchase, page
turned, call made, e-mail sent and key stroked can be archived, stored, filtered, correlated, networked, regressed,
matched, connected, catalogued, categorized, compared and/or labeled.
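The archival capacity described above can be sketched in a few lines of code. The following Python fragment is purely illustrative: the class, identifiers and event names are invented, and no real collection system is this simple, but it shows how cheaply every interaction can be retained and later filtered.

```python
# Hypothetical sketch: because storage in cyberspace is cheap, every user
# event can be archived and later filtered or correlated. All names here
# are illustrative, not drawn from any real system.

from collections import defaultdict

class EventArchive:
    """Archives every event keyed by user, and supports simple filtering."""
    def __init__(self):
        self._events = defaultdict(list)  # user_id -> list of (kind, detail)

    def record(self, user_id, kind, detail):
        # Nothing is discarded: each purchase, page view, call or
        # keystroke simply appends to the user's permanent record.
        self._events[user_id].append((kind, detail))

    def filter(self, user_id, kind):
        # Retrieve every archived event of one kind for one user.
        return [d for k, d in self._events[user_id] if k == kind]

archive = EventArchive()
archive.record("user-17", "purchase", "maroon blazer")
archive.record("user-17", "page_view", "/outerwear")
archive.record("user-17", "purchase", "scarf")
print(archive.filter("user-17", "purchase"))  # ['maroon blazer', 'scarf']
```

The point of the sketch is that retention is the default: discarding data would take an affirmative step, while archiving takes none.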
- As computers and advanced telecommunications networks increasingly render cyberspace the least restrictive
or most efficient medium in which to collect, process and utilize personal information the world has ever known,
the fundamental concern about privacy in cyberspace becomes the manner in which the medium, and the technologies
driving the medium, permit, enable and enhance the collection, processing, use and storage of vast amounts of in-depth,
and potentially sensitive, personal information. One need not think long nor hard regarding the possibilities and
implications of new technology to develop Orwellian visions regarding the capacity of the Government and, ever
increasingly, the private sector to gather, sort and process massive amounts of information regarding ourselves.
Yet the futility of eliminating either the information or the efficient means of collecting and sifting it must
serve as the impetus for a fuller understanding, and potentially a more deliberate shaping and regulating,
of the people and technologies collecting the data, employing the information, and building and dreaming
about the next technology with which to do so. As always, the potential is simultaneously exhilarating and terrifying.
III. The Reagan Revolution
- As the digital revolution rendered life more efficient and steadily eroded traditionally private domains,
a parallel ideological revolution was occurring in the minds of the American polity and halls of government. Arguably,
since independence the American political and social milieu has been ideologically characterized by forceful arguments
for decentralization of government power. Whether the arguments took the Federalist or laissez faire form, historically Americans have been ill at ease with centralized authority.
In practice and rhetoric, market and individual freedom have been inextricably linked.
- However, because of the severity and length of the Great Depression in the 1930s, government welfare programs
and government intervention in the market both increased significantly. The suffering experienced in the Great
Depression further reinforced popular support for social welfare programs. For many, the recovery attributable
to Franklin D. Roosevelt’s New Deal initiatives established a newfound faith in government regulation of market
mechanisms and political support for the establishment of a mixed economy and the institutionalization of government
programs guaranteeing social insurance and concomitant redistributive or transfer payments to support those programs.
- Where the Social Justice movement and the Progressive Era of the early twentieth century began to set the tone
for liberalism, the strong presidencies of Theodore Roosevelt
and Woodrow Wilson gave way to the bursts of centralizing legislative activity that marked Franklin D. Roosevelt’s
New Deal, Harry Truman’s Fair Deal and John Kennedy’s New Frontier.
During the period following World War II, the American political and social order continued to be characterized
by a schizophrenia on a number of national issues such as the relationship of government to the economy, the proper
size of the welfare state, and the scope of labor power in business.
But, with the support of the Truman administration, increasingly labor began to hold sway in the debates surrounding
economic regulation such as tax reform, unemployment insurance, minimum wage and the continued existence of the
Office of Price Administration.
- Where the post-war period was marked by a burgeoning economy,
the 1960s were characterized by social and political turmoil and activism. During both periods, government agencies were
established and reorganized at a brisk pace. Government found
itself funding higher education through the GI Bill and health and social welfare through Social Security payments,
such as Old Age benefits, Survivors and Disability insurance, Aid to Families with Dependent Children ("AFDC")
and school lunch programs. Consumer protection legislation spawned public and private organizations for enforcement.
In total, the initiatives, legislation and government spending combined to cement the growth and institutionalization
of a massive administrative/regulatory state.
- As Frances Fox Piven and Richard A. Cloward have convincingly argued, the political and social turmoil and
public aid revolt found its response in Lyndon Johnson’s Great Society program and War on Poverty, arguably the
apogee of government intervention. Through a mix of new laws and
regulations, Johnson simultaneously gave previously unrecognized groups of citizens new rights and entitlements
and expanded the federal government’s role in protecting and administering those rights and entitlements. Johnson
launched a war on poverty that writer Nicholas Von Hoffman has referred to as the nation’s second Civil War and
Reconstruction. Johnson secured the passage of the Voting Rights
Act and a fair immigration law, won legislation strengthening cancer and stroke research, established the Medicaid and Medicare
programs and tightened pollution controls. These initiatives built
upon, and furthered, the belief that government had a responsibility to protect and play a larger role in the lives
of its citizens. Although marginally diluted by President Nixon, this basic liberal bent towards welfare interventionism
and market regulation remained extant through the Carter administration.
- On January 21, 1981, in his inaugural address, President Ronald Reagan stated, "government is not the
solution to our problem." With those words, at a minimum,
Reagan set the stage for the predicament we currently find ourselves in with regard to privacy rights in cyberspace.
According to Reagan, "outside of its legitimate functions, government does nothing as well or economically
as the private sector." Ideologically, Reagan, and the men
around him, believed in rugged individualism, survival of the fittest and the primacy of an unfettered and unregulated
free market to solve the nation’s ills. Not since the New Deal
had a group entered office with such a determination to remake the American political and economic system. Their
economic philosophy not only changed the way the economy had been run in the United States since the New Deal,
but was arguably the biggest development in American economic philosophy since Keynesianism. Over the next 12 years,
Reagan and Bush set about dismantling the web of regulation and bureaucracy spun over the previous six decades.
- Throughout the 1980s, Reagan and Bush slashed funding and curbed the regulatory power of many governmental
agencies such as the E.P.A., S.E.C., F.C.C. and H.U.D. Reagan and Bush also deregulated and privatized a number
of industries including communications, utilities and transportation. Rhetorically, at least, the Reagan/Bush era
stood for a time of deregulation and increased distrust of government.
More importantly, during the Reagan/Bush era the ideological baseline returned to the laissez faire individualism
of the early twentieth century. Again, free markets were trumpeted, a resurgence of Social Darwinism rationalized
deregulation and decentralization, and proposing a regulatory solution to negative externalities became tantamount
to political suicide.
- At some point in this narrative, the contemporaneous digital and Reagan revolutions became intertwined. For
better or worse, the confluence of these revolutions, especially among Netizens and with regard to cyberspace,
has been marked by the emergence of a fervently libertarian political and ideological culture that is increasingly
rhetorically dominant. John Gilmore’s "[t]he Net interprets
censorship as damage and routes around it" and Stewart Brand’s "[i]nformation wants to be free"
became mantras for the space. Several quotes from the January 1998 issue of Wired magazine, The State
of the Planet 1998, further illustrate the continuing trend and ideological understanding: "Networks are
inherently decentralizing and anti-hierarchical . . . . Networks tend to leach power out of traditional institutions,
including electoral politics and the state." According to
the digitarians, technology has evolved to the point where government regulation is superfluous. Moreover, in the eyes of both the cyber-intelligentsia and the average citizen, Big Brother
is not to be trusted, especially, with personal information. At a minimum, it is apparent that government is no
longer welcome to protect personal information from private entities through legislative initiatives. On a macro
level, the conjunction of the dawning digital revolution, as embodied by cyberspace, and the vestiges of the Reagan
revolution, reveals that the debate surrounding this particular issue may be indicative of a larger debate surrounding
the future of government regulation in, for want of a better term, the post-technopolitical age.
- As cyberspace presents so many new legal issues and problems, it has quickly surfaced as the place where the
regulatory debate is most heated. Yet the traditional regulatory debate has become so recast that it appears increasingly
schizophrenic. For example, when the Clinton administration attempted to extend government protection to intellectual
property rights on-line, sharp criticism rapidly descended from both sides of the political spectrum. Immediately,
Wired folk, such as Stewart Brand and John Perry Barlow, a former lyricist for the Grateful Dead, and right-wing
intellectuals like George Gilder of the Manhattan Institute, a Newt Gingrich adviser, found themselves in the
oddest of alliances. Even President Clinton, who three years earlier
brazenly called on Congress to pass legislation to establish the information superhighway and wire every classroom,
moderated his interventionist position and announced a "hands off" policy for cyberspace. Governments, according to Clinton's new understanding, should not "stand in the way"
of the Internet, but instead they should simply enforce "a predictable, minimalist, consistent and simple
legal environment for commerce."
- Until this point in the United States, the libertarian argument, embodied by the industry and digitarian understanding,
has successfully resisted the application of government regulation aimed at enhancing informational privacy. There
can be little doubt that relative to the informational privacy rights of consumers, our legislators have assigned
greater value to the laissez faire principle and ideology. The comments of Robert Posch Jr., a Vice-President
and Marketing Law specialist at Doubleday, illustrate the common foundation of the industry’s and legislators’
baseline positions on the existence of privacy rights in personal information. Posch vehemently argues that protecting
consumer privacy is antithetical to the comprehensive development of cyberspace and the success of the information
intensive industries that support it. He states that "[t]hose advocating the restriction of aggregate data
to satisfy an imagined problem could take us out of the leadership of the 21st century economy [because] reducing
privacy burdens on the free flow of information is the surest way to stimulate the information economy." Indeed Posch mockingly says "[privacy is] the ultimate subjective,
touchy-feely issue, . . . just some notion of the right to be left alone. Spare me."
- While the rhetoric and practice of "self regulation" has carried the debate, it comes at a price
to informational privacy. The result, at least for now, is that we find ourselves caught in a straitjacket: on
balance, we no longer trust government regulation to enhance much of anything,
let alone privacy. And, increasingly, we are rapidly discovering that private industry, and its bottom line, is
not much better. A study released in March 1997, by the Boston Consulting Group, revealed that 41 percent of Internet
users avoided sites that requested personal information, out of a concern for how the data might be used. According to the 6th annual World Wide Web survey run by the Graphics, Visualization and Usability
Center of the Georgia Institute of Technology (commonly called the Annual GVU Survey), 70% of consumers surveyed
cited privacy concerns as their primary reason for not registering demographic information with Websites on the
Internet, and 86% of consumers surveyed expressed a desire to control the use of their demographic information. Another study commissioned by TRUSTe confirmed these findings. In
its study, TRUSTe learned that 78% of individuals surveyed would feel more comfortable in providing information
over the Internet when visiting sites that provide privacy assurance.
- Thus, as individuals become more transparent in cyberspace and companies, employers and governments become more
invisible and anonymous, we begin to perceive and understand the effects of a demise of trust: gone are the previous
generations that grew to trust government and, increasingly, gone are the consumers and employees that grew to
trust corporations, employers and the invisible hand of the market. As cyberspace is an inherently social construct, the most frightening aspect of the debate may well be that,
from our position on the cusp of the information age and at the end of the Reagan Revolution, finding where and how we
balance these continuums and issues, however delicate they may be, descends to the level of debating, defining
and assigning value to personal privacy, technological advancement, economic efficiency and profiteering. The decisions
are ours alone to make; the problems ours to solve.
- In a sense, the entire cyber debate might be recast into a wider debate on post-technological politics. In
that regard, any workable solution to the problem of privacy in cyberspace must be free from the rhetoric, fears
and perceived realities of Big Brother and the bottom line. In the following, I will argue that while the information
we seek to protect may be binary, the options for doing so are not. In particular, potential solutions lie beyond
the artificially limiting dichotomy that pits governmental regulatory intervention against laissez faire market
solutions. Until we shift the contemporary debate away from that dichotomy, informational privacy will erode at
the pace of technological development.
- To facilitate this arguably more fruitful discussion, I will begin with an in-depth analysis of the emerging
technological methods for collecting, storing and processing, and using personal information. I will then argue that the current regulatory debate and attempts at enhancing informational privacy
incorrectly focus on regulating the use of information. I will use several statutory and common law proposals and
examples to illustrate the deficiencies of focusing on data use. Thereafter, I will present a proposal for refocusing
the regulatory debate on the collection stage of personal information. By combining the positive characteristics
of market based and regulatory approaches, the interaction between consumers and the personal information industry
will ideally lead to the realization of the commercial potential of cyberspace while still maximizing informational
privacy. Finally, I will suggest that the privacy debate in cyberspace is a model for a wider reformulation of
the blurring polarities of governance.
V. A Digital Taxonomy
- The collection and use of personal information is not a modern phenomenon. Historians document that as far
back as the eleventh century, monarchs collected information on their subjects for the purposes of planning taxation
and other state affairs. William the Conqueror, for example, collected
information on his subjects in the Domesday Book. First, William’s
assistants collected information about the subjects via interviews and observation, and stored this information
in the Domesday Book using pen, ink and vellum. His aides used a ledger system to organize the data. Because they
had organized the raw data, William’s servants were able to levy taxes using the information in the Domesday Book.
- Up until roughly twenty years ago, the collection, processing, storage and use of personal information was
similar to the method used to compile the Domesday Book. It was time consuming, subject to broad error and relatively
expensive. In fact, other than the means of collection and modes of processing, little has changed in the method
of collecting and utilizing personal information.
- Although modern processes remain similar, contemporary technological advancements have resulted in greatly
enhanced storage capacity, retrieval speed, processing and utilization of personal information. In fact, the whole
process has become so efficient and integrated that it is often impossible to separate it into its component parts.
For example, online technologies like the DoubleClick network are capable of rapidly collecting information (reading
a Web site visitor’s consumer information and preferences), processing it (statistically correlating it with existing
information), storing it in databases and using it (supplying the Web site with advertising tailored to each
user’s personal profile). Like the machine and programs at the
heart of data collection, filtration, storage and utilization, the process is contingent and iterative, characteristically
defying simplistic categorization. However, analysis of each of the component parts of the personal information
system is still the best way to understand the entire process of information collection and use.
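The integrated pipeline just described, from collection through processing and storage to use, can be sketched in miniature. The Python fragment below is a hypothetical illustration only: the profile database, ad inventory, category names and cookie identifier are all invented, and no real advertising network works this simply.

```python
# Hypothetical sketch of an integrated collect/process/store/use pipeline.
# All identifiers, categories and advertisements are invented.

profiles = {}  # identifier -> set of interest categories (the "database")

ads = {
    "golf": "Discount tee times!",
    "travel": "Cheap flights to Paris!",
    "default": "Visit our sponsor!",
}

def collect(identifier, page_categories):
    # Collection: read the categories of the page the visitor requests.
    return identifier, set(page_categories)

def process_and_store(identifier, observed):
    # Processing + storage: merge the new observations into the profile.
    profiles.setdefault(identifier, set()).update(observed)
    return profiles[identifier]

def use(identifier):
    # Use: serve an advertisement matched to any known interest.
    for interest in profiles.get(identifier, ()):
        if interest in ads:
            return ads[interest]
    return ads["default"]

visitor, seen = collect("cookie-8841", ["golf", "finance"])
process_and_store(visitor, seen)
print(use(visitor))
```

Even in this toy form, the stages blur together exactly as the text suggests: a single page request triggers collection, enriches the stored profile and produces a targeted use, all in one pass.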
- While the methods of collecting personal information, or data, are complex and varied, there are essentially
two modes by which information or data is gathered.
A. Collection
1. Traditional Collection
- For lack of a better term, I call the first, and foundational, method, traditional or "ask and answer"
collection. Traditional collection requires an affirmative step on the individual’s behalf, which usually ensures
that the individual is at least aware that information collection is occurring. A consumer, employee or citizen,
either voluntarily or necessarily, provides personal information when she registers, applies, enrolls or requests
information, products, services or jobs. Information is transferred
via a number of media and usually flows from the consumer, citizen or employee in writing, orally or by depressing
computer keys, telephone dialing pads or touch screens, often in response to questions or requests for information.
The nature of the information collected and the methods of collection are numerous and familiar and vary with the collector.
- Government, for example, gathers a tremendous amount of information
through the traditional method. Virtually all American citizens and immigrants apply for a Social Security card,
visa or green card. The government also assigns other numbers, locators and indicators. Most people over a certain age have
a driver’s license, passport or some other form of official identification. Taxpayers fill out tax forms such as
the 1040EZ and W4. Americans apply for tax extensions, food stamps, Social Security or disability benefits, financial
aid, GI benefits and disaster relief by filling out forms. They apply to, and enroll in, public elementary schools,
state universities and community colleges. They register for the selective service and some register to vote. In
2000, the federal government will conduct a census, which will provide it with volumes of demographic information
regarding the race, age, number of children, socio-economic status and geographic location of the population.
- The government gathers information primarily for the purpose of administering bureaucratic functions such as
Social Security, taxation, Medicaid, Medicare, transportation, commerce, national security, education and welfare.
A 1986 study by the U.S. Office of Technology Assessment (USOTA) revealed that 12 government agencies maintained
539 records systems classified under the Privacy Act, which contained
more than 3.5 billion records, 60 percent of which were computerized.
- Private industry, usually corporate entities, also uses traditional methods of collection. Consumers apply
for credit cards, membership in promotional programs and frequent buyer programs, subscribe to magazines, register
for access to websites and enter contests. Doing so inevitably requires certain information such as name, address,
Social Security number, place of employment and/or a reference. Consumer surveys and warranty cards request information
about consumer preferences. Magazine subscriptions require name, address and often age. Registering a new Web browser
requires a name or alias, e-mail address, age or date of birth and a password. Occasionally, consumers answer long
consumer surveys or participate in polls. Sometimes they are paid for this information with coupons, free T-shirts,
prizes or money back refunds. Usually, however, if consumers do not provide the information, they will not be granted
privileged use of the service or product.
- Moreover, products and services are increasingly becoming contingent. Certain products and/or services are
prerequisites for others, making the cost of exit or foregoing the product or service higher. For example, video rental stores often require prospective members to provide a credit card
number before they will issue them membership cards. As a result, choosing not to provide information to credit
card issuers could mean that an individual not only does not have access to a credit card, but also cannot rent
movies. Video rentals are but one example of a contingent service. Others can affect an individual’s career or
- Until recently, perhaps, the greatest source of personal information available to corporations and the government
was the information available to them as employers. Employers require employees to fill out applications, reports,
timesheets and surveys for health care, employment and payroll purposes, and to gauge employee performance and satisfaction. Now, however, the greatest sources of information are information
clearinghouses, list brokers and the like. This might indicate that buying and selling personal information has
proven more valuable than knowing your employees.
- Generally, traditional methods are slow and inefficient. Filling out a card or bubble sheet and mailing it
to an address to be processed takes time and considerable expense. In addition, the information, or data, is still
raw, and for the most part worthless. After it has been processed, digitized, or, that is, after it enters cyberspace,
the information becomes more valuable because it is "understood" and inexpensively and efficiently transferable.
Currently, information collected, even via the traditional method, is rarely collected in raw form. Rather, the
information goes from the consumer’s, employee’s or citizen’s possession directly to cyberspace via some networked
gateway technology such as a computer terminal, telephone or other numeric keypad. More importantly, once it is
digitized, information derived from small or specialized applications, surveys or registration forms is combined
with more in-depth information previously collected, such as census information or lengthy consumer surveys.
2. Transaction Generated Information
- The second, more efficient method piggybacks on traditional collection methods. In this process, called transaction
generated information (TGI), the individual interfaces directly with cyberspace through one of a variety of networked
technologies. A person may establish a modem link via a desktop computer, dial a telephone number, slide a bank
card into an ATM, purchase something such as gas or food with a credit, frequent shopper or debit card, clock in
at work, enter a library with a student identification or even vote. In the vast majority of these situations,
the user, consumer, employee or citizen has already been through some strain of traditional information collection.
The person might have filled out an application to open a bank account and received an ATM card, registered at
a particular Web site and received a cookie.txt file or filled out the preference information on their Web browser,
signed up for a frequent shopper, library or Diners Club card, filed their W-2 or Social Security application form,
or agreed to have a Nielsen rating box in their home. Just as each traditional method requires different information,
each transaction varies as to what information is collected, how it is collected, and to whom or what the information
is connected. While nuanced, TGI is built on the concept of universal identification.
- Just as consumers identify themselves through, or in contrast to, certain causes, religions or ideologies,
they are identified by much of what they carry in their wallets, store in their computers and, eventually, by their
fingerprints. Ultimately, the "ask and answer" method of data collection produces identifiers. Each new
sign up, application or registration creates new identification numbers or symbols like credit card numbers, Social
Security numbers or aliases and passwords. The human corpus does not currently digitize well, but people are linked
to identifiers that do. In sufficiently complex networks, identifiers link individual transactions to a stored
body of previous transactions and information. Increasingly, one identifier links to previous identifiers and the
body of information grows with each successive transaction. For example, imagine that Bob purchases a new maroon
blazer from the Gap with his recently acquired Gap charge card. Perhaps, Bob's first identifier (the Gap identifier
or charge account number) links to his bank account number, which then links to his credit card number, and all
the corresponding information. The credit card identifier might, in turn, be linked to a Social Security number,
and, thereby, Bob's census, IRS, health, insurance, spring break arrest and employer information. Although this
information may be linked using a single identifier or series of connected identifiers, individual information
providers, like the Gap, might not have access to all the information contained in other information providers’
databases. However, the more seamless the network, the more seamless the ascertainable informational profile of
Bob will be. From income to religion, as long as the marginal utility of the information is more than the cost
of ascertaining it, the information can and will be collected and employed.
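- The chain of linked identifiers described above can be sketched in a few lines of code. The following is a minimal illustration, not any actual system; every database, identifier and record in it is invented:

```python
# Each "database" maps one identifier to a record that may contain
# further identifiers, so a profile can be assembled by chasing links.
# All names, numbers and records here are hypothetical placeholders.
gap_accounts = {"GAP-1001": {"name": "Bob", "bank_acct": "BA-77"}}
bank_accounts = {"BA-77": {"credit_card": "CC-42"}}
credit_cards = {"CC-42": {"ssn": "000-00-0000"}}
ssn_records = {"000-00-0000": {"irs": "...", "census": "..."}}

def build_profile(gap_id):
    """Follow the chain of identifiers and merge everything found."""
    profile = {}
    rec = gap_accounts[gap_id]          # Gap charge account
    profile.update(rec)
    rec = bank_accounts[rec["bank_acct"]]   # links to bank account
    profile.update(rec)
    rec = credit_cards[rec["credit_card"]]  # links to credit card
    profile.update(rec)
    profile.update(ssn_records[rec["ssn"]])  # links to SSN records
    return profile

print(build_profile("GAP-1001"))
```

The more databases share an identifier scheme, the longer this chain becomes, which is the sense in which a seamless network yields a seamless profile.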
- Another example of TGI is telephone transaction-generated information (TTGI). TTGI is information
generated by telephone usage and transactions related to telephone services.
In addition to information about telephone subscribers generated by the application forms and billing information,
call-detail records provide comprehensive and potentially useful information about individuals. Information generated
by "exchange and interexchange phone calls [includes] the date and the time of the call, the number called,
the calling number, the geographic location or address of caller and call recipient, the duration of the call and
the charge." Consumers and businesses most commonly make
use of this information through caller ID.
- A routine cyberspace example of TGI helps to complete the collection picture. The last time I visited the Center
for Democracy and Technology’s Website at http://www.13x.com/cgi-bin/cdt/snoop.pl from a law library computer,
the exchange proceeded as follows:
- Hi! This is what we know about you:
- You are affiliated with Harvard University
- You are located around Cambridge, MA
- Your Internet browser is Mozilla/4.02[en]
- You are coming from langopen2-13973.law.harvard.edu
- I see you’ve recently been visiting this page at www.cdt.org
- Someone in the computer center input this information into the Netscape browser on the library’s computer network.
When I visit the same site from my laptop computer, the information known and, theoretically, collected refers
directly to my home address, personal e-mail, and other personal data. Each time I enter and exit the site, the
information bank grows in direct relation with the sites previously visited.
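- A rough sketch of how a server might assemble such a greeting from the information a browser volunteers with every request (the header values below are invented stand-ins, not a captured request):

```python
# Minimal sketch: a server builds a "what we know about you" message
# from the remote host name and standard HTTP request headers.
def snoop(remote_host, headers):
    lines = ["Hi! This is what we know about you:"]
    lines.append("You are coming from " + remote_host)
    if "User-Agent" in headers:
        # The browser identifies itself on every request.
        lines.append("Your Internet browser is " + headers["User-Agent"])
    if "Referer" in headers:
        # The browser reports the page you just came from.
        lines.append("I see you've recently been visiting " + headers["Referer"])
    return "\n".join(lines)

print(snoop("langopen2-13973.law.harvard.edu",
            {"User-Agent": "Mozilla/4.02[en]",
             "Referer": "http://www.cdt.org"}))
```

Nothing here requires the visitor's cooperation; each datum arrives as a side effect of the transaction itself.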
- Of course, if you know that the information is being collected, or generated by your transaction, you can thwart
information collection by visiting an anonymizing site when you first log in, by leaving your browser preferences
blank (or filling them out with false information), or by abstaining from the transaction. Here the fundamental
difference between TGI and the traditional method becomes apparent: TGI is collected silently. Given the recent flood of press attention devoted
to this phenomenon, more and more people are aware of the collection of TGI. Unfortunately, as users gain awareness
and engage in blocking or other techniques to avoid collection, the collection technologies gain the equal and
opposite sophistication. What results is a tug of war, a technological arms race wherein capital remains elementary.
- Since the portals to cyberspace are currently varied, identifiers are similarly varied. Eventually, all identifiers,
and identification technologies, might collapse into one. Perhaps it will be a digital fingerprint or handshake.
Placing your thumbprint to a screen might allow you to pass your resume to a potential employer, purchase a smoothie
with funds automatically deducted from your checking account or make reservations at your favorite trattoria. Until
that time, we will continue to have many identifiers, and a variety of technologies will be required to read and
digitize the information. Frequent shopping cards may continue to depend on bar coding and UPCs, while credit cards
continue to employ magnetized strips. Certain supermarkets may also continue to be unlinked to popular ATM networks
such as Plus, Cirrus or Most. A greater number of dead-ends on the network may mean greater information privacy
for consumers since businesses will be unable to link information that they have gathered with other information.
Unfortunately, dead-ends, or disconnects, also force consumers to carry many cards in their wallets and PINs in their heads.
- Before moving on to processing, storage and use, it is important to briefly elaborate on the other side of
the collection coin--the benefits to consumers of collection through identification. Primarily, identification
mechanisms enhance efficiency by saving consumers time in situations which require screening and authentication.
Instead of having to produce your driver’s license and a major credit card every time you go shopping with a personal
check or facing a teller every time you want cash, you simply scan your shopper’s card or insert your ATM card.
Every time you revisit www.thenewrepublic.com, you do not have to reenter your address, name, password and age.
Rather, the New Republic’s server recognizes your cookie.txt file and welcomes you back. With time, these technologies
will only get better. One really smart card might someday contain all the essentials, including passport, credit
cards, bank cards, frequent shopper’s cards, driver’s license, digital cash, Social Security number, automatic
tax deductions and exemptions, bookkeeping functions and even pictures of the kids. Perhaps, more importantly,
the card may enjoy military strength encryption and a digital fingerprint, rendering it useless to anyone other
than its owner. Back in real space, however, the principal dividends from direct marketing and advertising, such
as coupons and other incentives, are currently available to consumers only after the information has been processed.
B. Storage and Processing
- As explained above, William the Conqueror understood that the data he collected was essentially worthless until
it was organized, processed, sorted or understood. Organization adds value to raw data. A tidy, alphabetized file
cabinet is infinitely more valuable than messy, coffee stained stacks of the same documents. Depending on use,
the same principles apply to information that is sorted according to age, name, zip code, religion and race. Data
correlated by the same factors would, in theory, be more valuable still.
- Typically, personal information is processed and stored in databases simultaneously. Personal information, or
data, can be stored in any number of forms and repositories. File cabinets, punch cards and ledgers were rudimentary
storage vessels. For the most part, they were relatively labor intensive, slow and spatially inefficient. The data
sector subsequently embraced vast storage capacity and instantaneous retrieval. Magnetic tapes and mainframes gradually
replaced file cabinets and ledgers. Currently, the bulk of information worldwide is stored in computerized database
systems. A database is nothing more than a horizontally structured and vertically integrated collection of information. Database management programs or systems (DBMS) became readily available
to large businesses in the late 1970s. By the early 1980s, many of the programs were inexpensively available
for microcomputing systems. DBMS computer programs automated filing in virtual cabinets inside computers with what
seemed like endless capacity. Storage and retrieval of data similarly became amazingly efficient. When a user retrieved
information, the computer provided her with a copy of it and the original data remained safely in the database.
Data and records could be used in several locations simultaneously and, depending on the network, utilized worldwide.
Again, DBMS’s were more than just storage units, because the programs could collect information as well as process it.
- Increasingly, databases are networked. For example, the U.S. Government has networked the databases of the
Customs Service, the Drug Enforcement Agency (D.E.A.), the I.R.S., the Federal Reserve and the State Department. Additionally, the Counter Narcotics Center combines the database
power of the F.B.I., C.I.A., D.E.A., National Security Agency, the Department of Defense, State Department and
Coast Guard. The Treasury Department’s Financial Crimes Enforcement
Network (FinCEN) has compiled a large and sweeping database to combat money-laundering activities. To some extent, market forces delimit the networking of databases in the private sector. However,
a great deal of centralization has effectively taken place because of the tendency of the industry to move toward
oligopoly and because almost all private actors buy and sell information
lists, archives and datasets with increasing frequency. This creates, to the extent that public and private databases
are networked, the gradual ascendancy of one central database. This trend, in turn, enhances the ability to assemble
broad-based information selectively or to correlate existing information. One author’s description of this as "functionally
the equivalent to the ability to create new information," illustrates the blurring of the lines between the
storage, collection and processing of personal information.
- For data to be processed and stored electronically, it must be readable. Thus, data that is collected in non-digital
form must first be digitized or turned into binary code--computer-readable bits of information. Depending on end-use,
digitizing proceeds upon a predetermined model or program (DBMS). Utilizing any number of rubrics, technologies,
processes, software programs or hardware systems, raw information/data is assessed, labeled, classified, categorized,
zoned, sorted, matched, clustered, segmented or filtered. In the simplification process, the collected information
is assigned a numeric value and sorted accordingly. For example, a survey might ask consumers to respond to three
questions: (1) age, (2) income and (3) toothpaste preference. Given five age ranges, six income brackets and four
brands of toothpaste, the information in age might be assigned a signifier of 1-1 through 1-5, income 2-1 though
2-6, and toothpaste 3-1 through 3-4. After assigning numbers to all the possible answers, the program might then
sort all the responses into categories. Depending on the results sought by the commissioner of the survey, the
program would then compare, correlate or regress the categories. Eventually, the results will be analyzed, either
by the computer or, in this case, a marketer. The results might indicate that individuals in categories 1-3 and
2-4 buy almost no 3-3. While this may be enough for some enterprises, the sorting does not necessarily
stop there. Rather, the results and possibilities increase in complexity and value in direct relation to enhanced processing.
- A fuller understanding of the storage and processing of information requires a rudimentary comprehension of
computer software and hardware. Computers essentially process information in the form of two electrical impulses,
on and off. Each electronic impulse is read as a number, 1 (for "on") and 0 (for "off"). By
translating the electronic impulses, or information, into a series of 1s and 0s, computers are capable of performing
mathematical operations. Early computers contained circuitry designed,
or were "hardwired," to perform specific tasks. For example, in 1946, one of the very first computers,
the Electronic Numerical Integrator and Calculator ("ENIAC") consisted of 18,000 vacuum tubes, 6,000
switches, 10,000 capacitors, 70,000 resistors, and 1,500 relays. The computer took up 1,800 square feet of space. The ENIAC was basically a supercharged calculator, performing 5,000
calculations per second, a thousand times faster than any previous calculator.
- Modern computers, especially personal computers, are required to perform a number of tasks that require a high
degree of flexibility. As hardwiring limits flexibility, most computers are hardwired to perform more general functions
and depend on software programs for more specific, detailed instructions that direct a computer’s hardware (central
processing unit (CPU) and memory) to produce certain results or perform specific functions such as data processing.
- The two most important components of databases and database technology are storage and processing/utilization.
Databases as storage units hold data on entities and transactions of different typology and allow the retrieval
of mass quantities of information. Relational databases, such as Oracle’s main product, hold data in simple structures
called, variously, "records" or "rows," each of which contains a subset of "fields."
A database of people, for example, might contain a number of records or rows, each of which would also contain
a number of fields for name, birth date, address, sex, telephone number, etc. Some database fields are "references"
to entire records in the same or alternate databases. In this manner, for instance, the employer field in a person
record might reference a large record in a company database. More cutting edge "object" databases hold
data in less rigid structures, allowing for easier modeling of complex data relations, sometimes at the expense
of data retrieval efficiency.
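- The record, field and reference structure described above can be sketched with a toy relational database (the schema and rows below are invented for illustration):

```python
# Minimal sketch of records, fields, and a reference field, using
# Python's built-in sqlite3 module.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE companies (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("""CREATE TABLE persons (
    name TEXT, birth_date TEXT, address TEXT,
    employer_id INTEGER REFERENCES companies(id))""")
db.execute("INSERT INTO companies VALUES (1, 'Acme Corp')")
db.execute("INSERT INTO persons VALUES ('Bob', '1970-01-01', '1 Main St', 1)")

# The employer field in a person record references a full company record.
row = db.execute("""SELECT p.name, c.name
                    FROM persons p JOIN companies c
                    ON p.employer_id = c.id""").fetchone()
print(row)  # ('Bob', 'Acme Corp')
```

Each row is a record, each column a field, and `employer_id` is the kind of reference field that lets one database point into another.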
- The utilization of data stored in databases is more complex, though storage is ultimately designed with an
eye toward process and function. In general, there are two types of data utilization techniques: store and query.
Storing data occurs either upon the creation of a record or in the modification of one or more of a record’s fields.
This is relatively simple, though many database products offer tools for designating how users enter the data to
be stored. Data queries might take milliseconds or months to complete. A typical query might ask, "Give me
a list of all persons who work for a company headquartered in Canada, and give me an average of their salaries
by industry. Sort the persons alphabetically by industry, then by company, with the industry sorted by gross sales
highest to lowest, then alphabetically by last name." Implementing this query might require accessing a number
of different databases and a great deal of sorting and applying selection conditions. All database products come
with means to enter such queries (relational databases conform to a particular query language called SQL) and return
results. The means of presenting these results in terms of reports, charts, graphs, etc. is another aspect of data
utilization. A result of a query might also be used to drive programs to take one fork rather than another. And,
of course, it is important to optimize the query processing, as users desire efficiency in response time and storage.
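- The sample query above can be sketched against a toy schema in SQL, the query language relational databases share (all tables, columns and rows here are hypothetical):

```python
# Minimal sketch of the Canada-salary query: join persons to their
# employers, restrict to Canadian headquarters, average by industry.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE companies (id INTEGER PRIMARY KEY, name TEXT,
                        hq_country TEXT, industry TEXT, gross_sales REAL);
CREATE TABLE persons (last_name TEXT, salary REAL,
                      employer_id INTEGER REFERENCES companies(id));
INSERT INTO companies VALUES (1, 'Maple Ltd', 'Canada', 'Timber', 9.0);
INSERT INTO companies VALUES (2, 'Pine Inc',  'Canada', 'Timber', 5.0);
INSERT INTO persons VALUES ('Adams', 50000, 1);
INSERT INTO persons VALUES ('Baker', 70000, 2);
""")

for row in db.execute("""
    SELECT c.industry, AVG(p.salary)
    FROM persons p JOIN companies c ON p.employer_id = c.id
    WHERE c.hq_country = 'Canada'
    GROUP BY c.industry"""):
    print(row)  # ('Timber', 60000.0)
```

The full query in the text would add further ORDER BY clauses (gross sales descending, then last name), but the join, selection condition and aggregate shown here are its core.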
- Thus, a database system is a combination of hardware, software and a compilation of coded data known as the
database. Hardware is required to store the database and to process
or manipulate the data according to the dictates of the software. Hardware runs the gamut from powerful mainframe
computer systems to networked desktop models. The database itself is made up of previously processed files that
contain related records. Each data record is further divided into subfields by the DBMS, generally according to
mathematical formulae or algorithms. File organization and storage are inextricably linked in terms of capacity
and speed of recovery. Databases are designed to minimize the time required to access, retrieve or update the records
and to minimize the database’s storage space. A database system provides efficient access to large amounts of information
and allows users to manufacture new arrangements, configurations or sorts based on evolving software inputs. Although discussed more in-depth below, the newest, cutting edge
database processing programs in cyberspace are called intelligent agents or bots. Many agents depend on a patented
algorithm called automated collaborative filtering (ACF). After
a user completes the threshold ask and answer form, the user interfaces with an "agent." Utilizing the
baseline information and the TGI, the collaborative filtering sorts data in much the same way that William’s civil
servants did--assigning, averaging, correlating and memorizing numbers, symbols and/or values. However, ACF is
different from traditional techniques because it correlates the data by juxtaposing incoming information, such
as opinions, tastes and preferences gathered by the intelligent agent, with previously filtered information (first
from this particular user and then from all other users) stored in company databases. The "strongest,"
or most significantly positive, correlations are returned to the user, with the significant correlations, in one
instance, translated into recommendations or "intelligent information." This loop back feature opens
a line of communication between the user and storage unit mediated by the (pre-programmed) filter or sort. The
more data it receives, the smarter it gets. The more an individual clicks, the more complete the filtration becomes.
In theory, every option selected by the user is another step towards the essential categorization and storage of
the user's self. Again, because agents are networked devices, the access, collection and utilization of "intelligent"
personal information increasingly becomes unlimited.
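- A minimal sketch in the spirit of the collaborative filtering described, though not the patented ACF algorithm itself: the incoming user's ratings are correlated against stored users' ratings, and the best match's favorite unseen item is returned as a recommendation. All users and ratings below are invented:

```python
# Sketch of user-to-user collaborative filtering with Pearson correlation.
from math import sqrt

# Previously filtered ratings stored in the company database (invented).
stored = {
    "alice": {"opera": 5, "jazz": 4, "polka": 1, "tango": 5},
    "carol": {"opera": 1, "jazz": 2, "polka": 5, "tango": 1},
}

def correlation(a, b):
    """Pearson correlation over the items two users have both rated."""
    common = [k for k in a if k in b]
    n = len(common)
    if n == 0:
        return 0.0
    xs = [a[k] for k in common]
    ys = [b[k] for k in common]
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) *
               sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0

def recommend(user_ratings):
    """Return the best-matched stored user's top unseen item."""
    best = max(stored, key=lambda u: correlation(user_ratings, stored[u]))
    unseen = {k: v for k, v in stored[best].items() if k not in user_ratings}
    return max(unseen, key=unseen.get)

print(recommend({"opera": 5, "jazz": 5, "polka": 2}))  # 'tango'
```

Each new rating the user supplies tightens the correlations, which is the "loop back" feature: the more data the filter receives, the smarter it gets.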
- Once the raw data has been processed and stored, it is ready for use. Just as William used the information
to increase his revenue from taxation, the manufacturer of toothpaste 3-3 (the commissioner of the survey in the
example above) will logically change marketing strategy according to the results of the survey. Perhaps, it might
start targeting the 1-2 and 2-4 niche, get a new advertising agency or write the group off altogether. After it
has been processed and stored, the information becomes more valuable in terms of what it illustrates and predicts.
Currently, the most widespread use of collected and processed information is direct marketing done with products
from list vendors.
1. List Vending and Direct Marketing
- David Shenk traces the ascendancy of list vending, and niche marketing, back to Claritas, the market research
company founded by Jonathan Robbin. In the 1970s, Claritas invented
direct marketing. In direct marketing (also known as niche-marketing, one-to-one marketing, relationship marketing,
loyal marketing and dialogue marketing) goods, services, ideologies or opinions are positioned to appeal to small,
defined groups, previously identified by demographics. Marketing and advertising are tailored to these audiences
and placed in media outlets that, based on previous data analysis, each particular audience is known to frequent. It seems foreign to the modern observer, but in the recent past,
Ivory soap, Froot Loops and STP were targeted only at the general audience listening to or watching a particular program.
Niche marketing was not yet widely used. Technological innovations led to the development and management of electronic
databases on consumer behavior that perform statistical analysis of that data and determine the precise geographic
location of consumers. In 1976, Robbin introduced the PRIZM database, a comprehensive geo-demographic analysis
tool of amazing complexity and ability. The database was organized on a set of refined algorithms designed to extract
the statistically relevant similarities between demographic information and geographic location nationwide, enabling
the organization, and segmentation, of a nation into the sum of its ideological beliefs and consumer habits.
- Currently, list vendors compile information by buying, selling and trading lists from a variety of public and
private sources. Given the raw data, they simply process or reprocess (using a variety of "enhancement"
techniques) the information to produce "new" lists tailored to specified interests. Lists can be obtained
for virtually any category of consumer or belief. Conceivably, there are lists of expectant mothers and their due
dates and lists of middle class (above $75,000 in annual salary) Latino men under 6 feet who voted Republican in
1988 and drive European sports cars originally purchased for over $50,000. You name it, they list it. And, if they do not, they’ll figure out a statistically persuasive argument for why another
list is predictive or indicative of what a list buyer is searching for.
- As described above, both list vending and direct marketing are big businesses. There are a number of firms
that specialize in selling lists of tailored information that can be used by businesses in conjunction with existing
internally gathered information. Donnelly Marketing Information Services, a division of InfoUSA, sells access to
a database covering 200 million U.S. households. Subscribers to
the information can generate profiles of consumers according to demographics, life-styles, and retail expenditures,
such as mail responsiveness, credit worthiness, vehicle information, financial investments, hobbies, occupations
and census demographics. The other major players in this domain are the credit reporting agencies, Equifax, TRW
and TransUnion, which maintain files on more than 90% of adult Americans.
Numerous other agencies offer similar information products.
- According to the Direct Marketing Association (DMA), the largest trade association for businesses engaged in
database marketing, manufacture and collection, with more than 3600 member companies from the United States and
47 nations worldwide, over 50 percent of direct marketers use
the Internet and Web for advertising and 48 percent actively mine the membership rosters of major computer online
services for e-mail addresses and other information.
- Lists, polls, surveys and data enhancement services drive virtually all advertising and marketing nationwide.
Personal information is the backbone of everything from telemarketing to the President’s radio addresses. At
a point, the information becomes self-reinforcing, an autonomous referent.
2. Intelligent Agents and Push Technology: Cyberapplications
- Intelligent agents, push technologies and other cutting edge cyberspace technologies have been referred to
as "direct marketing on steroids." While the paradigm
is analogous to direct marketing, push and agent technologies are more efficient and have a greater potential to
reach consumers. These technologies, in a certain sense, have aptitude, an ability to learn as they iterate. They keep the data channels open and information constantly flowing
in both directions. The more you use your agent, the smarter it gets. The smarter a consumer’s agent gets, or the
more she receives pushed content, the better it will serve her and the more she will use it, until eventually,
according to Kai G. Wussow, a director of Eutelis Consult, a German consulting firm, "[the marketers, advertisers,
list vendors, etc.] know what [she] like[s] to have better than [she] do[es] [her]self." However, current e-commerce companies have been criticized for not utilizing the power of agents
and the most interesting work on agents is still being done in labs.
- In strong form, agents will memorize, and process, every mouseclick and purchase, and the amount of money and
time a consumer spent doing so. Agents take note of a consumer’s reading material and her correspondence, including
the most frequent e-mail addresses, coming and going. They memorize each piece of information and tirelessly compare
and contrast the next.
- Agents come in all shapes and sizes. Some are programmed to act like humans. Behind the facade, agents are
sophisticated programs interfaced with powerful computers and databases. Some agents are sedentary, in that they
remain on a particular server and scan the desktop for aberrations such as viruses or notify the user of abnormalities
like unsaved revisions. Other agents roam cyberspace searching for information that their users have requested
or that the agent deems the user might want.
- Analogous to DBMS’s, agents filter information. The difference is that agents are personalized: they work
closely with their subjects. For example, a consumer could conceivably program her agent to find and purchase seats
to an upcoming opera. The agent, having access to her personal digital assistant ("PDA"), would check
her calendar, find the cheapest available seats, purchase the tickets and e-mail her.
- Push technologies are based on similar technological innovations. In 1997, Wired Magazine broadly predicted
that the Web browser would soon be replaced by push technologies.
While push technology has thus far not fulfilled analysts’ initial high expectations, some commentators have
recently declared that it is staging a comeback because of the emergence of new, more viable business models. Furthermore,
personalized services from web portals, like Yahoo!’s My Yahoo! service (http://my.yahoo.com) and Netscape’s My
Netscape (http://my.netscape.com), also allow individuals to make selections about what push content they would
like to receive. Wired initially argued that the noise and congestion on the Internet that diminished its utility
could be avoided via push technologies such as PointCast. Push
technology, like intelligent agents, revolves around customer profiles. The user sets a profile of what interests
her and the push programs do the rest. Each time the user logs on, an identifier trips a certain profile and the
program starts grabbing content and advertising, targeted, or filtered, according to the user’s profile. The push
programs will monitor pull and push practices, gaining intelligence while manufacturing data commensurate with
use. Filtering processes, again, are fundamental to the technology. They use the same model as the direct marketers,
yet the channel and connection are always open and literally one to one.
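- The profile-driven filtering described above can be sketched simply: the user's stated interests select which items get pushed. The profile and content items below are invented:

```python
# Minimal sketch of push filtering: content is "grabbed" for the user
# whenever its topics overlap her stated interest profile.
profile = {"interests": {"technology", "opera"}}

content_feed = [
    {"headline": "New browser released", "topics": {"technology"}},
    {"headline": "Playoff results", "topics": {"sports"}},
    {"headline": "Season opens at the Met", "topics": {"opera", "arts"}},
]

def push(profile, feed):
    """Return headlines whose topics intersect the user's interests."""
    return [item["headline"] for item in feed
            if item["topics"] & profile["interests"]]

print(push(profile, content_feed))
# ['New browser released', 'Season opens at the Met']
```

A real push system would also log which items the user actually reads and fold that back into the profile, which is how the channel stays open in both directions.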
- The marriage of agents and push will offer companies the ability to inexpensively gather data far more detailed
than the standard Madison Avenue demographic fare of consumer ZAGs (zip code, age and gender). As John Sviokla,
a marketing professor at Harvard Business School, proclaims, "[i]t’s a fundamentally cheaper way to identify
customers, sort them, and sell to them." However, agents
and push do much more than Sviokla suggests. They create a fundamentally different paradigm in marketing.
Push and agent software no longer offer products to people, but rather people to products--a potentially more efficient
manner of consumption.
- A glance back reveals how far we have traveled. Traditionally, data collection and employment was product oriented
(PO). Essentially, marketers reverse-engineered information regarding customer choice to infer why customers’ decisions
were made. Advertisers then used available demographics to further understand whether a particular class of consumers
might prefer one product over another. This is rapidly changing.
- Today agents and push technologies are still PO. However, their "P" stands for people not product.
Eventually list vendors will sell people. No longer, will it be sufficient for marketing data to reveal aggregate
ZAGs on who buys what and how much. Rather, agents and push will offer individual preferences and desires. People,
through their intelligent agent or push preferences, will map their next purchase, the brand and how much they
can afford. The only marketing that retailers will have to do will be buying consumer preferences, locating the
person and selling to them at a price they can afford. In the future, people may not need to self-consciously create
their own images. Instead, direct marketers will be able to appeal to their true personae and unconscious desires
through computer programs that analyze their credit card purchases, television viewing, movie selection and their
taste in partners.
- If this sounds impressive, we must also bear in mind that the real masters of the agents and the peddlers of
push are the companies that wrote and patented the code or bought the company that did. More fundamentally, all
the potentially sensitive information rendering agents intelligent and powering the push is secured in the purveyor’s
databases. They own it.
- Inevitably, popular articles discussing these new technologies and their potential either paint a rosy picture
of a bright future or tell a horror story of just how little personal privacy remains. The stories are commonplace:
what happens when an employer, or, God forbid, a health insurer, finds out that an employee is interested in
medicinal marijuana? While technological innovations have always spawned
a mixture of fear and speculation, these technologies are not going to disappear. When they transcend the Web and
find their way to the cyberspace inhabited by the rank and file--the ATMs, grocery stores, telephone, etc.--the
real fun and problems will begin. The issue will then become finding a comfortable middle ground, a domain where
the future is bright, but where some sort of protection salvages the requisite amount of shade for those who value their privacy.
VI. Existing Legal Protection
- The United States lacks a comprehensive or omnibus law to protect personal informational privacy. Rather, personal
privacy rights in the U.S. are protected through a loose, and often ineffective, patchwork of Constitutional, statutory,
common law and private sector guidelines, which at best provide piecemeal protection. As will be discussed below, regulatory efforts have targeted the use or employment, as opposed
to the collection, storage or processing, of personal information. The contemporary regulatory debate regarding
methods for enhancing informational privacy also focuses on the identification and regulation of categories of
malignant uses of personal information.
A. Federal Statutory Protection
- Absent omnibus protection of informational privacy, Congress has reactively passed piecemeal, industry-specific
statutes and regulations to control the use of information according to the specialized intricacies and interests
of particular industries and consumer groups. Because of the ascendancy of technology and the greatly enhanced
capacities for storage, collection and use of personal information, this piecemeal and reactive approach is increasingly
ineffective in protecting consumers.
1. Fair Credit Reporting Act
- In 1970, Congress passed the Fair Credit Reporting Act ("FCRA"). The FCRA, perhaps the most comprehensive protection of consumer privacy rights, provides a
list of permissible purposes for which personal information about a consumer may be released without the consumer’s
consent. For example, under the FCRA, credit agencies may furnish
credit reports without the consumer’s consent under a number of circumstances (e.g., for the purpose of establishing
an individual’s creditworthiness, employability or "other legitimate business need"). When credit is
denied to an individual, the FCRA mandates that the user of the credit report supply the name and address of the
credit-reporting agency and follow reasonable procedures to ensure the accuracy of the credit information. Credit
agencies must also have established administrative procedures for investigating disputes and alleged inequities.
Certain "obsolete" information may not be disclosed, although the obsolescence threshold has been set
exceedingly low. It should also be noted that the FCRA only
purports to regulate credit-reporting companies. Other than the laughable "obsolete" standard, the FCRA
is silent on information regarding consumer preferences and purchases collected by or from credit agencies or card
issuers.
2. Privacy Act
- The primary instrument for regulating the information practices of the federal government, the Privacy Act
of 1974 ("PA"), is thought to have been passed in
response to the excesses and abuses of Watergate. Pursuant
to the PA, federal agencies are permitted to collect and maintain records containing personal information to the
extent that the information is "relevant to accomplishing" the agency’s purpose. Information that is collected, however, must be maintained accurately and completely, and,
where practicable, gathered from first parties. Excluding seven
classifications of records, primarily related to law enforcement and defense,
the PA specifically requires that every federal agency maintain a system to: 1) permit the individual to control
disclosure of the information in the record; 2) retain records
of information that has been disclosed; 3) permit the individual
to review and maintain a copy of the information in the agency’s records;
and 4) allow the individual to request an amendment of information contained in an agency’s records. These requirements, however, are diluted by the empowerment of agency heads to promulgate rules
exempting any system of records within the agency from the reach of the PA.
Finally, because it applies only to governmental actors, the PA does nothing to restrain marauding private entities.
3. Cable Communications Policy Act
- In 1984, Congress enacted the Cable Communications Policy Act ("CCPA"). The CCPA requires cable television companies to provide annual notification to subscribers
regarding the use and disclosure of their personal information.
Furthermore, the CCPA prevents cable companies from utilizing the cable system to collect or disclose personal
information about subscribers without their consent except as required to render cable services or detect unauthorized
cable reception, or pursuant to a court order. In addition, a cable
operator must destroy any information "unnecessary" for the purposes for which it was collected. Nonetheless,
a cable company can distribute a mailing list of subscribers provided it allows each subscriber an opportunity
to remove her name from the list. While the remedies available
to subscribers for violation of the CCPA include actual and punitive damages and reasonable attorney’s fees, these
regulations may be easily circumvented since they apply only to cable companies. Thus, once a particular list has
left the hands of a cable company these restrictions do not apply.
4. Electronic Communications Privacy Act
- The Electronic Communications Privacy Act of 1986 ("ECPA"), like the Communications Act of 1934,
prohibits the unauthorized collection and recording of the contents of telephone conversations or data transmissions,
including the contents of e-mail messages. This statute provides
only limited protection of personal privacy from the state, since it allows the government to seek a court order
for a specified law enforcement purpose. The only bite in the statute comes from a provision prohibiting a public
telecommunications service from disclosing the contents of communications or an electronic message without either
the consent of one of the parties or an authorized law enforcement action.
While this prohibition has yet to be tested, a suit by a decorated Naval Officer may shed some light on the statute
and issues discussed herein. The Naval Officer, Timothy McVeigh
(no relation), was threatened with discharge from the Navy after a Naval legal officer acquired information from
McVeigh's personal profile from America Online. In his personal profile, McVeigh listed his hobbies as "driving,
boy watching, collecting pictures of other young studs," and his marital status as "gay." McVeigh filed suit in Federal District Court in Washington against the United States Navy and
America Online alleging, inter alia, a violation of the ECPA.
5. Video Privacy Protection Act
- The media’s access to the list of videos rented by Supreme Court nominee Judge Robert Bork worried many members
of Congress. Congress reacted by quickly passing the Video Privacy Protection Act of 1988 ("VPPA"). The VPPA is a federal statute regulating the disclosure of information
about videotape rentals. Specifically, the VPPA prohibits the disclosure of the title, description or subject matter
of a film rented by a particular customer without written consent.
Names and addresses, however, are excluded. Like other federal
privacy statutes that regulate private businesses, the law only applies to a narrow category of information and
a specific industry.
6. Telephone Consumer Protection Act
- The Telephone Consumer Protection Act of 1991 ("TCPA") 
was aimed at the companies responsible for millions of dinnertime interruptions--direct telephone marketers. Under
the TCPA, telemarketers cannot use automatic telephone dialing systems or pre-recorded voice messages to call patient
rooms in health care facilities, emergency lines or any telecommunication receiver where the called party must
pay for the call. More substantially, the TCPA prohibits, with
certain exceptions, pre-recorded calls to residential lines without the prior consent of the recipient. The TCPA also empowers the Federal Communications Commission ("FCC") to further regulate
calls to businesses and to exempt from liability certain non-commercial calls which would not "adversely affect"
privacy rights. Pursuant to the TCPA, the FCC is permitted
to amass a database of telephone numbers of residential subscribers who object to receiving telephone solicitations,
and prohibit unsolicited calls to persons listed in that database.
An interesting question remains: what statute would stop the FCC from selling a list of those names?
7. Proposed Legislation
- In recent years there have been numerous legislative initiatives drafted to address the shortcomings
of existing informational privacy protection. Representatives have proposed bills expanding online privacy as well
as bills protecting financial and health care related personal information. However, with the exception of the
Children’s Online Privacy Protection Act ("COPPA"),
none of the proposed bills has been enacted.
- One of the failed bills, H.R. 3508, the Children’s Privacy Bill, was introduced by Bob Franks (R-NJ) in late
1996. The bill would have made it a crime, punishable by up
to one year imprisonment and subject to civil action, for a list broker to engage in any of the following acts:
1) to buy or sell personal information about a child without parental consent; 2) to knowingly fail to comply with the request of a parent to disclose the source of information
about the child; 3) to knowingly fail to disclose all information that the broker has sold regarding the child
or to disclose all people who have received information about that child;
4) to contact the child or parents for commercial purposes; 5) to fail to comply with the request of a parent to
disclose the source of the information; 6) to knowingly use prison labor to process information about children; or 7) to knowingly distribute or receive any information about a child.
- Edward Markey (D-MA) authored another failed bill, entitled the Communications Privacy and Consumer Empowerment
Act. Markey’s bill would have required the FCC to study the
impact of new technology on privacy rights and, if necessary, to take protective action. The legislation became
part of the larger debate surrounding government regulation of the Internet, and never emerged from committee.
- On January 7, 1997, Representative Bruce Vento (D-MN) introduced the Consumer Internet Privacy Protection Act
of 1997. The bill endeavored to prohibit the disclosure of
personally identifiable information without the consent of the individual. In general, the bill stated that an
interactive computer service shall not disclose to a third
party any personally identifiable information provided by a
subscriber to such service without the subscriber's prior informed written consent. Furthermore, "such service shall permit a subscriber to revoke the consent granted under
paragraph (1) at any time, and upon such revocation, such service shall cease disclosing information to a third
party." Knowing disclosure of falsified personally identifiable
information to a third party was also prohibited. Under the
proposed bill, subscribers were explicitly granted access to personally identifiable information and to the identity
of third party recipients. Upon receiving access to such information, subscribers could verify or correct such
information without sustaining any fees or charges. The bill would have granted the Federal Trade Commission ("FTC")
the authority to examine and investigate an interactive computer service to determine whether it had violated the
Act. Under the bill, if the FTC determined that an interactive computer service had engaged in any act or practice
prohibited by the bill, the FTC could issue a cease and desist order. Finally, a subscriber aggrieved by a violation
of Section 2 (of the Act) could, in a civil action, obtain appropriate relief. For better or worse, this bill did
not pass the House.
8. Federal Trade Commission Initiatives
- In response to the public’s outcry surrounding the Lexis-Nexis P-Trak database, on October 8, 1996, Senators Bryan, Pressler and Hollings sent a letter to the Chairman of
the FTC requesting a study of possible violations of consumer privacy rights by companies that operate computer
databases. In December 1996, the FTC released a Staff Report on "Enhancing Consumer Privacy Online." The report outlined the problems pertaining to online consumer privacy and provided
some statistics on consumers’ attitudes about privacy and interactive media. The report took the position that
consumers must receive notice of information practices and maintain choice with respect to whether and how their
information is used. Further, the report underlined a concern with sensitive data, such as medical and financial
information online, analyzed three possible technological solutions (I/CODE, cookies, and PICS), mentioned self-regulation
and the possibility of government regulation, but came to no firm conclusions as to the best way to proceed.
- In summary, current, and proposed, federal statutory and regulatory protections of informational privacy are
unsatisfactory in two respects: their failure to comprehensively target private industry, and their dependence
on a piecemeal (use based) approach. By not targeting private industry, the largest collectors and users of personal
information remain essentially unregulated in their collection and use of potentially sensitive personal information.
More importantly, by focusing regulatory attention reactively on ephemeral and inchoate uses of personal information,
federal regulation remains fundamentally incapable of keeping pace with technological advances in the art of collection
and use of personal information.
B. State Statutory Protection
- In June 1995, the National Association of Attorneys General ("NAAG") Internet Working group was established.
In November, members of the group gathered for a speech by Minnesota Attorney General Hubert H. Humphrey III.
Humphrey was pleased to report that "the states, [were working], under a resolution of the NAAG, to pool resources
and meet the [Internet] challenges of the future." To
their credit some state Attorneys General have promulgated regulations to enhance consumer, employee and citizen
privacy in personal information, but most states are waiting for the FTC to take the lead.
- Some states have opted to directly enact laws concerning the manner in which personal information is collected
and disseminated. Not surprisingly, California and New York are leading the charge. In California, for example, Article I of the state constitution expressly states that the right
to privacy is an inalienable right of all people. Unlike the
penumbra of the U.S. Constitution, the California Constitution
has been interpreted to protect against government snooping,
the overbroad collection and retention of unnecessary personal information, and the improper use of information properly obtained for a specific purpose for some other business
or government purpose. Furthermore, the California Constitution has
been interpreted to provide a reasonable check on the accuracy of information already collected and to require a "compelling interest" for the intrusion into individual privacy.
- To a lesser extent, New York Civil Rights Law establishes similar privacy rights and protections. For example, New York codified the common law doctrine of misappropriation of name or likeness
of an individual for commercial purposes. Furthermore, New
York law limits public access to various personal records such as personnel records and the identities of certain
categories of crime victims.
- While some states have enacted laws protecting informational privacy, these states are currently the exception.
Analogous to federal and common law, privacy protection under state law remains piecemeal and inadequate in the
face of the challenges presented by the technological explosion. This is especially true when one considers the
difficult jurisdictional issues associated with cyberspace.
C. The European Model
- The European privacy model solves many of the inadequacies of the contemporary American regime. Previously,
piecemeal data protection laws existed in some European countries.
In an effort to synchronize existing laws, on July 24, 1995, the Council of the European Union adopted an omnibus Council
Directive aimed at the "Protection of Individuals with Regard to the Processing of Personal Data and on the
Free Movement of Such Data" ("European Data Directive," "Directive" or "EDD").
- The EDD protects individual informational privacy and prevents obstacles, previously unavoidable under the
cacophony of regulation, to the free flow of information within the EU. For the purposes of this article, the most
important aspect of the Directive is the affirmative obligation it imposes on EU governments and private industries to collect and process data only for specified and legitimate purposes. "Processing" should be interpreted broadly, and thereby
encompasses collecting, recording, altering, and making data available in any form. Because the EDD is modeled on the property regime paradigm, either the person concerned must
contractually consent to the processing or collection of their personal information, or the processing must be
necessary to carry out pre-contractual measures undertaken at the request of the person or contract to which the
person involved is a party. Processing may also occur where
it is necessary for compliance with legal obligations or where
the activity involved is an assignment of public interest, not involving an infringement of fundamental rights and freedoms.
- The EDD grants the subjects of information collection the privilege of requesting that erroneous data be corrected.
In certain instances, collectees may also oppose the prospective use of the information. Furthermore, collectees must be given notice of informational processing and collection and
the intended uses of collected data. While not wholly specified in the actual Directive, the EDD does threaten
meaningful liability and sanction for transgressors. Also, the E.U. has established governmental agencies to oversee
the development and implementation of the Directive and assure the protection of subjects’ rights. The agencies
will require public registration, reporting, and justification of the methodologies, categories and employment
of personal data actively being collected on employees and customers.
- Understandably, American companies, especially those active in European markets, and digitarians are having
nightmares about the EDD and the possibility that similar regulation will be implemented in the United States.
Because of its focus on collection and its acceptance of an individual property right in personal information,
the EDD is unquestionably a step towards more vigorous protection of personal information and is potentially many
times more effective than the existing American system. However, the EDD fails to mitigate many of the deficiencies
of the property regime discussed below. More importantly, the EDD is a prime example of the danger of grafting
antiquated regulatory thinking (and with it an arbitrarily imposed privacy baseline) onto a fundamentally different
environment. Although it does enhance consumer information privacy, the EDD does not spawn discussion or provide
a fluid model capable of changing as cyberspace matures. Instead, the Directive threatens to stifle the potential
of cyberspace by capriciously defining and limiting the manner
in which information collection, processing and use can occur.
D. Constitutional Protection
- While the United States Constitution makes no explicit mention of privacy, under the "penumbra theory,"
inferred from the Bill of Rights, the Supreme Court has referred to, and protected, certain fundamental or substantive
due process rights. Privacy protection under the penumbra of
substantive due process is limited to the protection of individuals from governmental or public intrusion under
a rational basis due process balancing test.
- Whalen v. Roe, 429 U.S. 589 (1977), is the Supreme Court’s most in-depth, modern discussion of informational
privacy in a "government context." At issue in Whalen
was whether it was constitutionally permissible for the state of New York to keep a computerized list of prescription
records for "dangerous drugs" and require physicians to disclose the names of patients to whom they prescribed
those drugs. The Court found that the right to privacy generally
includes the "right to be left alone," which encompasses "the individual interest in avoiding disclosure
of personal matters." Balancing the competing interests,
however, the Supreme Court upheld New York’s program as constitutional in that it was sufficiently narrowly tailored
with adequate security provisions to reduce the danger of unauthorized disclosure. Nonetheless, the Court left
the door open to future restrictions in light of technological change,
noting that it was "not unaware of the threat to privacy implicit in the accumulation of vast amounts of personal
information in computerized data banks or other massive government files."
- In Tureen v. Equifax, 571 F.2d 411 (8th Cir. 1978), the Eighth Circuit briefly flirted with the idea
of extending the penumbra to private encroachment. The Court stated, "in order to make informed judgments
in these matters, it may be necessary for the decision maker to have information which normally would be considered
private, provided the information is legitimately related to a legitimate purpose of the decision maker. In such
a case, the public interest provides the defendant a shield which is similar in principle to qualified privilege."
- Successive interpretations of the Whalen and Tureen decisions have opted for narrow readings
of the holdings. Effectively, the privacy penumbra stops at government action, rendering constitutional protection
impotent in the face of the erosion of informational privacy rights by corporations.
E. Common Law Protection
- In a landmark law review article, Samuel D. Warren and Louis Brandeis persuasively argued for the extension
of common law protection of personal privacy to non-governmental or private party intrusion. Warren and Brandeis defined this zone of privacy as "the right to be left alone." Today, consumers and employees receive a modicum of protection
from private parties under the common law tort doctrine of invasion of privacy. Professor William Prosser divides
the invasion of privacy into four doctrinal categories: 1)
intrusion upon one’s seclusion; 2) the public disclosure of
private facts; 3) false light privacy; and 4) the misappropriation of one’s name and likeness for commercial purposes. Although an exhaustive survey of the doctrine in these four categories is outside
the scope of this discussion, several notable cases illuminate the doctrine’s structure and deficiencies.
- After subscribing to a weekly periodical under a misspelled name (Avrahani) and receiving junk mail at his
home addressed to the same, Ram Avrahami, a Wharton graduate, filed suit in state court against U.S. News &
World Report, arguing that the magazine tortiously sold his name and address to a third party without prior consent.
On February 7, 1996, General District Judge Karen A. Hennenberg dismissed the suit for lack of jurisdiction to
hear equity issues. On June 12, 1996, Circuit Judge William T. Newman Jr. held that Avrahami had no property rights
to a "fake name." Avrahami appealed the Circuit Court’s
decision to the Virginia Supreme Court and posted a copy of the petition for appeal on-line. However, in December of 1996, the Virginia Supreme Court declined to hear the appeal without comment.
- Other courts have held that the sale of information to direct mail advertisers without the consent of subscribers
does not constitute an invasion of subscribers’ privacy, even if it amounts to a sale of personal profiles, inasmuch
as the profiles are only used to determine what kind of advertisement is to be sent. One such court found that the "appropriation of one’s personality," required to illustrate
tortious conduct, refers only to those situations "where the plaintiff’s name or likeness is displayed to
the public to indicate that the plaintiff endorses the defendant’s product or business."
- In June of 1997, the New York Times ran a front page story detailing the plight of Beverly Dennis. According
to the article, Beverly Dennis, an Ohio factory worker, filled out a product preference survey in 1994. In the
summer of 1996, Dennis received a 12-page letter mentioning her birthday, marital status and product preferences.
The letter also contained sexual suggestions and proposed a visit to Dennis' home. The writer was a convicted rapist
serving a six-and-a-half-year sentence in a Texas prison that had contracted with Metromail Corp. under a prison
work release program. In the spring of 1996, Dennis filed suit in Travis County, Texas District Court, seeking
to represent all U.S. citizens against R.R. Donnelley and its subsidiary Metromail Corp. Dennis alleged that the
defendants committed the tort of intentional and/or reckless disregard for safety when Metromail disclosed plaintiffs'
personal information to third parties. Under the settlement
terms, Metromail has proposed never again to use prison labor, to disclose in clear language how it will use personal
information and to adopt new confidentiality practices. Finally, Metromail will establish a fund to compensate claimants
who were or are injured by prior privacy breaches.
- In January 2000, a consumer filed a class action suit against Amazon and its Alexa Internet subsidiary. Newby
v. Alexa Internet and Amazon.com, C 00 0054, U.S. District Court, Northern District of California (filed Jan.
6, 2000). The consumer, Joel Newby, alleges that Alexa Internet, a Bay Area company that offers client-side
software that works with users' browsers to provide "useful information about the sites you are viewing and
suggesting related sites," secretly intercepted and sent confidential information to Amazon.com, without his
consent. Newby's suit seeks class certification, damages, attorney's fees and restitution of profits made by both
companies as a result of legal violations. While the facts
are interesting, it is doubtful whether the Amazon case will generate any novel legal findings.
- DoubleClick has also been hit by a similar suit accusing the company of gathering personal information from
Internet users--such as names, addresses, and patterns of online browsing and buying--without their knowledge or
consent, and of combining data collected from users with identifying information accessed through Abacus Direct Corp., a direct marketing
firm the company acquired in 1999.
- The DoubleClick lawsuit was filed in California Superior Court in Marin County. Judnick v. DoubleClick,
Inc., CV 000421, Superior Court, Marin County, California (filed Jan. 27, 2000). Interestingly, the complaint
is seemingly premised on an unfair business and trade practices cause of action brought under California Business
and Professions Code § 17200. Thereunder, the attorney seeks to represent the state's general public. In California, these
suits are often viewed as a plea for a quick settlement. Again, it is unlikely that the DoubleClick case will amount
to much other than a tidy sum in legal fees. However, because of the impact of other private lawsuits coupled with
public scrutiny from the FTC and state attorneys general, DoubleClick has recently decided not to merge its anonymous
web-browsing data with Abacus’ personally identifying information. It is also offering consumers the ability to
block its cookies.
- In theory, protection of informational privacy via a liability regime would function ex post through rules
deterring violations of privacy interests by requiring transgressors to pay victims for the harm suffered. Courts,
by and large, employ a negligence standard regarding what the party may and may not do with the information. As
alluded to earlier, courts generally maintain a fairly deferential negligence standard and require significant
personal injury before requiring transgressors to pay victims for harm suffered. There are a number of other glaring deficiencies inherent in the existing liability regime
that serve to undermine privacy interests. Liability rules create obstacles for individuals to solve collective
problems. If many people are minimally injured by the disclosure of personal information, the judgment value of
their individual cases invariably prevents adjudication and often settlement. While this might argue for a class
action, judges often refuse to certify classes on these matters due to the individual nature of the harm and damages. More significantly, litigation on the disclosure of personal
information may only perversely exacerbate the injury by focusing attention on information that litigants wanted
to keep private.
- In the 1970s, privacy protection for personal information was considered to exist in the nature of a contract
between the individual and information collector. Accordingly,
the individual divulged personal information to a second party, who then conferred some benefit on the individual
in return. The assumption was that a good faith contract existed between the two parties and that the record keeper
was bound not to "misuse the information," in derogation of the contract. However, the information holder's post-use obligations were rarely formalized, and
there was no monitoring of the bargain, due, in part, to the high transaction costs involved therein. By divesting
the individual of any power to prevent or limit disclosure of their personal information, the common law and statutory
default position over the subsequent twenty years has largely moved away from a property theory. This trend seems to be reversing.
- A recent example of the employment of a property regime is evident in a California Supreme Court decision concerning
the right to control and benefit from the exploitation of individual genetic information. In Moore v. Regents
of the University of California, 793 P.2d 479 (Cal. 1990), the lower court’s decision granting the plaintiff
property rights in his genetic code was overturned by the California Supreme Court. The California Supreme Court
focused on the chilling impact propertization of genetic code, or personal information, would have had on medical
research. While the Court opted to delay this delicate balancing and leave open the question of propertization
of less "socially precious" personal information, it did offer Moore a remedy upholding the claim that
university researchers breached a contractual or fiduciary duty by failing to obtain his informed consent before
doing research on his DNA for potentially commercial purposes. The court also maintained that if public support
existed for a right to compensation under these circumstances, the legislature could and should provide it. By
saying that the legislature could act, the court merely highlighted the existing deficiencies and limitations of
common law and statutory protections that stem from the legislature’s inaction.
- In the well-developed debate on the efficiencies and inefficiencies of liability versus property regimes, a
number of theorists have argued that a doctrinal shift from a liability to property regime, in common law decision-making,
will further enhance informational privacy. According to Professor
Coase's theorem, given zero transaction costs and assuming parties intend to contract, it would be irrelevant which
legal regime was adopted, as the most efficient outcome would obtain regardless. As the collection, storage, processing and use of personal information increasingly occur
in cyberspace, if cyberspace were devoid of transaction costs (for example, in contracting or consummating mutually
beneficial bargains) common law rulings and regimes would be irrelevant to the domain.
- While the debate regarding transaction costs in cyberspace is relatively open, transaction costs there are unquestionably
quite low, although not zero, relative to traditional real space collection, storage, processing and use of personal information.
Here Guido Calabresi and A. Douglas Melamed added to Coase's theorem by convincingly arguing
that high transaction costs (due to information asymmetries or the impracticality of bargaining) will render liability
rules, in the form of damage awards, more efficient in protecting entitlements. Conversely, when transaction costs were low, Calabresi and Melamed predicted that
property rules would be a more efficient manner of protecting legal entitlements.
- There is a cognizable trend, in theory and practice, of replacing the liability regime currently in place with
a property or contractual understanding of protecting legal entitlements. Thus, in a "primary information
market," the individual voluntarily discloses personal
information in exchange for some benefit, thereby forming a contract. In cyberspace negotiating these exchanges
is quick, easy and largely cost-free. When people encounter an information-collecting technology, in theory, it
should clearly state the information required and how it will be used. Again, in theory, people will either accept
this option, or refuse it and look elsewhere because, at least currently, the cost of exit is minimal.
- Given the existing legal geography and activity online, without opting into a higher property standard, many
companies are relatively impervious to consumer suits for tortious disclosure of personal information. Consumers seemingly have intuited the governing legal regime and for the most part understand
that they are relatively unprotected under a liability regime. Thus, for many cyber-companies attempting to use
personal information in the next generation of cyber-technologies, the nature of the business and the existence
of palpable consumer distrust have required companies to voluntarily grant consumers greater assurances for access
to their personal information.
- Although substantively distinguishable, the underlying issue involved in Stratton Oakmont, Inc. v. Prodigy
Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. 1995), motion for renewal of argument denied, 1995 WL 805178 (N.Y. Sup. Ct.),
illustrates market forces pushing cyberspace business entities toward guaranteeing their customers greater legal
protection. Prodigy’s policy of moderating statements made
in its discussion forums is analogous to the voluntary expansion of privacy rights by other Internet companies.
In Stratton, the court held that Prodigy, an Internet Service Provider ("ISP"), constituted a
"publisher" with respect to statements made on an online bulletin board that stated inter alia
that Stratton Oakmont, a broker, had committed criminal violations of the securities laws. The court found that
Prodigy held itself out as an online service that exercised editorial control over the content of messages posted
on its computer bulletin boards. Because Prodigy expressly differentiated itself from its competition and explicitly likened itself to a newspaper, the court held it liable under a publisher standard, a higher standard than the default distributor standard previously applied to ISPs.
- Akin to the Prodigy model, a number of companies involved in the collection, storage, processing and use of
personal information (on the Internet/Web) have voluntarily held themselves to a stricter legal standard with regard
to informational privacy. These companies are moving towards
an understanding of a person's information as her property. Many on-line companies offer their users a quasi-contract
that implicitly exchanges the use of a product or a service for personal information. Companies also make promises defining exactly how they will use this information.
- Beyond consumer distrust of cyberspace and new technology, the usual problems surrounding imperfect information--lack
of consumer education, unequal bargaining power and the inability to individually tailor agreements (though this
might easily be taken care of technologically)--combine to further dilute the privacy enhancing value of the property
regime. In light of these problems, Professor A. Michael Froomkin has convincingly argued that a property rights
approach to privacy enhancement is unlikely to have much real influence so long as "courts refuse to rewrite
or ignore contracts of adhesion, and as long as in each individual transaction the cost of not providing the information
is disproportionate to the loss (which is a function of the accumulation of transactions, not any single transaction)
. . ." Froomkin believes that the realities of modern transactional life will eventually defeat a property regime: the cost of exit may skyrocket, much as the cost of not having a credit card has. While this is not yet an issue for on-line companies and many market niches in cyberspace, in time it may become fundamental. Instead of a property regime, Froomkin and others propose using technological or market based solutions,
such as anonymous communication (including sensitive consumer transactions with anonymous digital cash), to do
a much better job of protecting privacy rights and privacy enhancement in cyberspace.
VII. Self-Regulation and Market Based Solutions
- In June of 1997, the FTC sponsored a four-day discussion between government officials, consumer privacy advocates
and, for the most part, representatives and executives from interested corporations, including such heavyweights
as Microsoft, Lexis-Nexis, IBM, AOL, Visa, Mastercard and American Express. Predictably, both industry and consumer
groups put significant pressure on regulators.
- While the FTC has conducted numerous inquiries and investigations into online data privacy, outside the context
of children’s privacy, it supports a policy of industry self-regulation.
In short, the FTC has honored the wishes of the regulated, and officially opted for a policy of self-regulation
intended to protect and enhance informational privacy. Accordingly, Netscape spokesman Sean Gaddis points out the
major industry concern. Gaddis states that the information dependent industries must "do everything possible
to self-regulate. [Because] if it becomes a mess, then the FTC will step in." Thus, companies that manufacture, use, sell and depend on personal information have offered
many arguments and proposals for how personal privacy rights can be protected and enhanced through self-regulation.
Outlined and discussed below are several proposals and options offered by the FTC and industry as potential market
solutions to the problem of informational privacy.
A. Opt In/Opt Out
- The first, and most elementary, proposal is often referred to as opt in/opt out. The DMA is among the most
outspoken proponents of this proposal. In its most consumer friendly form, the opt in/opt out proposal allows consumers
to designate whether they want their information collected by responding to yes/no questions asked by a computer.
For example, if a consumer entered a particular Website or carried out a grocery transaction, the clerk or the
internet browser would ask the consumer whether or not the company could collect and disseminate information from
the transaction. Essentially, this amounts to a quick contract of adhesion, as discussed above in relation to the
property regime. At worst, the proposal is opt out. Under this method, the default standard permits the company
to collect and use the information unless the customer indicates otherwise. Often it is time consuming or difficult
for consumers to locate the opt out, thus reducing the likelihood of consumers opting out. This proposal, where
the burden of justification falls on individuals seeking protection in a perplexing new market, is the dominant
form of information collection today. While some businesses allow a user or consumer to opt out of information
collection, currently most businesses collecting and selling personal information will not grant a user or consumer
access to their site or products without the deposit of at least some personal information.
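The difference between the two defaults can be sketched in a few lines. This is a purely illustrative fragment; the function name and interface are invented:

```python
def may_collect(consent_response, opt_out_regime=True):
    """consent_response is True, False, or None (the consumer never
    answered, or never found the setting). Under an opt-out regime,
    silence permits collection; under an opt-in regime, it forbids it."""
    if consent_response is None:
        return opt_out_regime
    return consent_response

# The silent consumer -- the common case described above, where the
# opt out is hard to find -- fares very differently under each default:
assert may_collect(None, opt_out_regime=True) is True   # opt-out: collected
assert may_collect(None, opt_out_regime=False) is False # opt-in: not collected
```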
- The deficiencies with this system are readily apparent. First, levels of consumer education and understanding
of technological advances in informational collection vary widely. Many consumers have no idea that their information
is being stored, collected and used. While this may argue for greater education, many companies will undoubtedly
stay one step ahead of the education, introducing generations of progressively more stealthy collection and employment
technologies. Furthermore, as the cost of exit grows, consumers will find it increasingly difficult to eschew the
use of many cyber goods and services.
- Industry pundits respond that when consumers place a high enough marginal value on informational privacy, the market, as it has with organic vegetables, will provide purveyors of goods and services that abstain from collecting or using
personal information. However, as personal information is valuable, these purveyors, to survive, would necessarily
have to pass along some of their lost revenues to consumers in the form of higher prices. At that point, in theory,
some consumers will be priced out of the market. In no uncertain terms, the result will be a market system of informational
privacy, further reducing privacy "from an assumed right to the unceremonious status of a commodity." This is ultimately the same situation of the status quo, which
very few people consciously support.
- The opt in/out system is not wholly differentiable from the property regime discussed above. By opting in or
out, consumers essentially enter into a contract of adhesion. Given existing technologies, opting in or out is
time consuming and thereby expensive and inefficient. Above and beyond efficiency concerns, the fundamental question
remains: what if a company disregarded its contractual obligation? In the current legal climate, the injured consumer
would have little recourse given the disparities between consumers and institutions.
- Another market-based solution is based on the idea that encryption technology and anonymity can enhance privacy
just as other technological advancements have eroded it. Under this argument, if people are really interested in
informational privacy, companies will develop software that provides a means of blocking the initial collection
of information. For example, Community ConneXion created a website called the Anonymizer for people who wanted
to browse the Web anonymously. The Anonymizer shields a consumer’s
personal information from the other Web sites that she visits. By visiting the Anonymizer site prior to visiting
other sites, a consumer is assigned an anonymous identity, which is revealed (instead of her real identity) as
she surfs the Web. This permits a consumer to surf freely, even if she follows a hypertext link to another site,
without having to worry about whether the site is keeping track of her comings and goings and personal preferences
regardless of the site’s stated policies. Of course, the consumer still views any advertisements on the site. Furthermore,
since Community ConneXion knows the true identity of the anonymous surfer, in theory, it could compile a profile
based on the anonymous surfer’s habits. Again, the anonymous surfer must trust Community ConneXion not to collect
or disclose her personal information, since any legal remedy available to the consumer is tenuous or non-existent.
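The proxy mechanism the Anonymizer embodies can be sketched as follows. This is hypothetical code, not Community ConneXion's actual implementation, and the header names are merely illustrative:

```python
import uuid

PSEUDONYMS = {}  # real identity -> stable anonymous identity

def anonymize_request(user_id, headers):
    # Assign each user a stable pseudonym once. Note that the proxy
    # itself retains this mapping -- precisely the residual trust
    # problem identified in the text.
    pseudonym = PSEUDONYMS.setdefault(user_id, "anon-" + uuid.uuid4().hex[:8])
    # Strip headers that could identify the surfer to the destination site.
    scrubbed = {k: v for k, v in headers.items()
                if k.lower() not in ("cookie", "referer", "from", "user-agent")}
    scrubbed["X-Anonymous-Id"] = pseudonym
    return scrubbed

request = {"Cookie": "session=42", "Accept": "text/html", "Referer": "http://home"}
forwarded = anonymize_request("alice", request)
assert "Cookie" not in forwarded and "Referer" not in forwarded
assert forwarded["Accept"] == "text/html"
```

Because the pseudonym is stable across visits, the destination sites see a consistent but unlinkable identity, while the proxy alone can connect it back to the real user.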
- Froomkin, a major proponent of online anonymity, advocates privacy enhancement through anonymous communication
and untraceable digital money. To Froomkin, anonymous communication is a form of speech that deserves constitutional
protection. His theory would easily lead to increased privacy--if computers can only collect, filter and use information
without an identifier, they cannot gain individual information about the searcher, purchaser or interested party.
- An Economist cartoon sheds light on several of the problems with anonymity. The cartoon depicts two dogs sitting
in front of a computer. "On the Internet," ran the caption, "nobody knows you are a dog." Perhaps not. But increasingly, the real question has, in effect,
become: if someone knows that you purchase Purina instead of Alpo eight times out of ten and on the other two occasions
you had a 50 cent coupon found in Parade magazine; you subscribe to Poodle Monthly; your Internet address is email@example.com;
your home address is Doghouse, 12 Elm Street; your Web browser is set to filter out any and all content mentioning
or depicting cats; and the last four Websites you visited were www.dogtalk.com, www.leashlaws.gov, www.fleasbegone.net
and www.milkbones.com, does it really matter whether they "know you are a dog?"
- In the not so distant future, there is little doubt that Froomkin’s model will be vindicated and anonymity
and encryption will play an important role in getting there. However, in the short term, anonymity has a number
of flaws that render it inoperable for many businesses and consumers. First, transactions that take place in cyberspace
must eventually connect back to the real world. For example, the goods a consumer purchases on the Web or on the
phone must be shipped to an address, which is inextricably linked to that person. Most people still prefer credit
cards and still shop in person. Second, absolute anonymity, the inability to retrace an interest or a transaction
to a personal identifier, would reduce the newly expanded capacity for gathering, processing and using personal
information to the level of 1950’s pollsters. Thus, many technologies,
such as agents and smartcards, in theory, would only be useful to the extent consumers were willing to pay to be
deluged with intelligent information.
- Perhaps Froomkin’s major fault is that he is seemingly five dimensions ahead of the rest of us. Unfortunately,
to get to the fifth dimension, there are four others along the way. While we are not yet at this point technologically,
getting there will either require a great deal of education or a marked increase in consumer trust of security
and encryption systems. This, I maintain, will not happen without at least a shadow of regulatory protection. Finally,
new companies currently working on advanced means of data collection, processing and use will need to develop marketable
products for today using current technology, before they can offer the technology of Froomkin’s fifth dimension.
Moving the technology to that point will require capital and consumer trust. Because of this, I predict that the
newest technologies will necessarily be designed to produce and sell volumes of "intelligent personal information."
As a result, until there is a critical mass of consumer trust, the new and unknown technological entities of encryption
and anonymity will only exacerbate the consumers’ distrust of the government and increasingly the private sector.
C. OPS and P3P
- The deficiencies of the aforementioned market based proposals have led to several new protocols/platforms.
Netscape, Firefly and VeriSign jointly proposed the Open Profiling System ("OPS") to the World Wide Web
Consortium ("W3C") on May 27, 1997. The W3C also offers its own Platform for Privacy Preferences Project
("P3P"). These platforms are an extremely important step toward a new paradigm of technological regulation
that could simultaneously enhance consumer informational privacy while expanding the size, potential and efficiencies
of the market for personal information.
- Both platforms are ingenious examples of secure, automated one to one applications for the instantaneous formulation
of legal and social contracts and agreements and associated business processes. In both platforms, a consumer creates
a "Personal Profile" with her information. This profile
is stored on her personal computer and at the user's option may be securely stored in a corporate-wide or global
directory. The first time that an individual visits a website that supports OPS, the website requests information
from the Personal Profile. The individual then chooses to release all, some or none of the requested information
to the website. Additionally, if the website collects additional information about the individual's preferences,
it can (with the individual's permission) store that information in the user's Personal Profile for future use.
On subsequent visits, the individual can authorize the website to retrieve the same personal information without
asking permission each time.
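The request/release/remember cycle just described might look like this in outline. Field and function names are invented for illustration; this is not the actual OPS format:

```python
profile = {"name": "J. Doe", "zip": "22903", "phone": "555-0100"}
authorized = {}  # website -> fields the user has already approved

def respond(site, requested, user_choice):
    """First visit: release only the fields the user approves, and
    remember the grant. Later visits: reuse the stored authorization
    without asking again."""
    if site not in authorized:
        authorized[site] = {f for f in requested if user_choice.get(f)}
    granted = [f for f in requested if f in authorized[site]]
    return {f: profile[f] for f in granted if f in profile}

# First visit: the user releases her zip code but blocks her phone number.
assert respond("news.example", ["zip", "phone"],
               {"zip": True, "phone": False}) == {"zip": "22903"}
# Subsequent visit: the same grant is honored with no further prompt.
assert respond("news.example", ["zip", "phone"], {}) == {"zip": "22903"}
```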
- In theory, an individual can predetermine a level of privacy with which she is comfortable. For example, the
user might desire to block all her personal information, release the information regarding her preference for sexually
explicit material or block only her telephone number. When a user connects to a site that supports OPS or a similar
standard, a discussion ensues. The site either grants her access according to the terms of her profile or indicates
that it requires access to certain information as a prerequisite for use.
- While still in the initial stages of development, the P3P platform is a similar protocol that allows a Web
site and a visitor to establish privacy policies by negotiating between the site's data collection practices and
the user's privacy preferences. When practices and preferences match, the website grants visitors seamless access
to the site at a preferred level of service. Otherwise, the site notifies the user of the difference and offers
alternative methods of gaining access. P3P also enables surfers to download preference settings recommended by industry associations and consumer advocacy groups. While making P3P easier to use for consumers, this feature
will also provide models for developers to follow when establishing privacy policies for their sites. The W3C's
eventual plans for P3P include the development of tools for a coherent privacy preference graphic user interface
and a privacy transportation mechanism that is embeddable in cookies, channels, HTTP and related technologies.
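The practice/preference mediation point can be sketched as a simple set comparison. The vocabulary here is invented; actual P3P policies use a defined XML vocabulary:

```python
def negotiate(site_practices, user_preferences):
    # The site declares the data categories it requires; the user's
    # profile declares the categories she is willing to release.
    demanded = set(site_practices["required"])
    allowed = set(user_preferences["release"])
    if demanded <= allowed:
        # Practices and preferences match: seamless access.
        return {"access": "granted", "released": sorted(demanded)}
    # Mismatch: report what blocked access -- the basis for notifying
    # the user and offering alternative methods of gaining access.
    return {"access": "denied", "blocked_on": sorted(demanded - allowed)}

assert negotiate({"required": ["zip"]},
                 {"release": ["zip", "age"]})["access"] == "granted"
assert negotiate({"required": ["zip", "phone"]},
                 {"release": ["zip"]})["blocked_on"] == ["phone"]
```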
- The W3C also plans to design a "negotiation" protocol that will operate at the practice/preference
mediation point. This protocol will draw upon related W3C projects, including the Platform for Internet Content
Selection (PICS) and the Joint Electronic Payment Initiative (JEPI).
- While P3P is quite similar to the OPS initiative, the focus and genesis of each technology is different. P3P
initially focused on enabling the expression of privacy practices and preferences--more a vocabulary and system
of notification than anything else. OPS focused on the secure storage, transport, and control of user data. Yet, the developers of P3P claim to have understood that "data exchange" was relevant
to P3P from the beginning. When OPS was submitted to W3C, P3P
members decided to examine OPS and determine how to integrate P3P with data exchange technologies. P3P originally
allowed a service and user to reach an explicit understanding about the site's data privacy practices. OPS allows
users to control the release of their data in a secure manner. In the future, however, P3P will allow users and
services to reach similar agreements and help ensure that any release of data is in accordance with the agreement,
thereby enabling sites to declare their privacy practices in a way that is understandable to the user's browser.
Once this occurs, seemingly, P3P and OPS will do the same things and will compete for market dominance.
- Although these technologies are currently tied to the Web, their purveyors undoubtedly have grander visions.
In time, the platforms and profiles will be applicable to cyberspace writ large, whether it be encoded in a chip
on a credit card, driver's license, identification card or an amalgamation thereof, and a number of companies plan
to take advantage of this opportunity.
- In its FAQ page regarding OPS, Netscape unintentionally reveals the major shortcoming of OPS, P3P and other
similar initiatives. Netscape writes,
Once an individual releases his or her Personal Profile to a website, there is no technical way to prevent that
website from retaining the information for reuse, or sharing it with others. Therefore, websites that adopt OPS
are strongly encouraged to adopt a recognized privacy assurance program that includes third-party auditing, and
to clearly and widely post their privacy policies on their website where visitors can see them. In addition, consumers
are cautioned not to release their Personal Profile to any site that does not post its privacy policies and submit
to third-party auditing.
- This is familiar territory--we loop back to the ultimate question: what happens
when these companies misuse the information that has been collected? Or, worse yet, what sanctions or remedies
exist for companies that disregard informational preferences and grab all the information? While this may be alleviated by promised encryption, the previously identified problems with anonymity, encryption and consumer trust, along with a technological arms race between consumers and information collectors, remain.
- At this point, it becomes clear that the sustainability, novelty and utility of these market proposals and
potential solutions run up against the usual ultimate issue of consumer trust in cyberspace. Again, because consumers
intuitively sense their helplessness in a technological arms race and the contemporary legal milieu, all these
solutions are unworkable as long as consumers do not trust the technology, its corporate purveyors or the government
with their sensitive information.
VIII. The Existing Regulatory Debate
- Having analyzed several market-based and rule and sanction approaches to the problem, if this were an exercise
in legal centrism, I would now attempt to balance the competing
perspectives and weigh in on one side or the other--the market or the government, digitarianism or regulatory interventionism.
However, this would perpetuate an analytical framework that I find to be part of the problem. Instead, I argue
that moving toward a workable solution requires a shift in the debate and contemporary understanding of the issue.
- Heuristically speaking, there are two primary ideological camps in the informational privacy debate: the digitarians,
who are well versed in their understanding of the context and oppose any and all regulation of cyberspace, and those who believe that government regulation is capable of
enhancing informational privacy without compromising the integrity of cyberspace. The digitarians and interested
industry players regard the regulation of cyberspace as a zero-sum game. Under this view, government regulation,
no matter how minimal, is antithetical to the medium and will necessarily destroy the integrity of the space. Given
their superior knowledge and understanding of cyberspace, the digitarians have been quite successful in framing
the debate in their own politically charged language. Thus, in the current debate, the digitarians have succeeded
in portraying regulatory solutions as the ultimate slippery slope. They allow no middle ground or the possibility
of balancing interests. By virtue of the space and, perhaps, the contemporary socio-political zeitgeist, all discussion
related to the realm of informational privacy inevitably transcends the immediate issue and plays out on the larger
ideological screen. Thereby, any question, discussion, debate or proposed solution quickly devolves from the evaluation
of informational privacy to a politically charged defense of living free and dying the same.
- The regulators and academics arguing in favor of regulatory approaches for enhancing informational privacy
are also guilty of myopia in their approach. Traditionally,
as discussed above, regulation of personal information has proceeded through an employment or use based categorical,
or substance determination, understanding of personal information. As discussed above, the VPPA (video rental),
FCRA (credit), CCPA (cable communications), and TCPA (telephone) each regulate potentially detrimental uses of
personal information through rule and sanction. Today, articles waxing apocalyptic regarding the lack of protection
for sensitive domains such as medical information are all too common.
Authors and theorists argue convincingly that the detrimental use of medical information should be regulated. Again,
the problem with this approach, in the digital and networked age, is that flimsy categorizations (e.g., distinguishing
between a video rental in a store and downloading one on-line
or, more fundamentally, distinguishing the books you buy on living with cancer and an actual medical diagnosis)
are increasingly meaningless. As evidenced by clickstream data and TGI, in general, the lines between use, storage
and processing are blurring at the rate of technological innovation. More fundamentally, these line blurring, amorphous
and enigmatic characteristics of cyberspace, to many cyber-thinkers, make up the core of the digitarian arguments
for the impossibility of meaningful regulation of cyberspace. To these same thinkers, these characteristics also
account for cyberspace’s beauty and vast potential. In large part, they are correct.
- Sensing the inadequacy and antiquation of this categorical or use based approach, a number of theorists favor
a reevaluation and translation of informational privacy protection to the digital age. Justice Brandeis took this approach in his Olmstead dissent.
As Professor Lessig comments,
- If there is a Justice who deserves c-world's praise, if there is an opinion of the Supreme Court that should
be the model for cyber-activists, if there is a first chapter in the fight to protect cyberspace, it is [Brandeis],
[his] opinion and [Olmstead]. Here, in as clear an example as any, is a method that will be central to cyberspace's
survival as a place where values of individual liberty are sustained. The method is translation: Brandeis first
identifies values from the original Fourth Amendment, and then translates these values into the context of cyberspace.
He read beyond the specific applications that the Framers had in mind, to find the meaning they intended to constitutionalize.
He found a way to read the Constitution in 1928 to preserve the meaning it had in 1791.
- Theorists making this argument believe that enhancing informational privacy will require a process similar
to Brandeis’: first, identify our values with regard to informational privacy and then craft legislation or decide
cases to protect those norms.
- Such a consensus might well resolve many informational privacy problems and issues. However, given the current
context of the debate, we are fundamentally incapable of isolating a discrete set of ontological privacy values.
Even if we were capable of doing so, these values would be grafted on to the broader regulatory debate and weighed
against an absolute disdain for cyberspace regulation writ large.
- Thus, a number of forces within the existing regulatory debate combine to make a regulatory solution to the
existing problems impracticable or impossible. The first force is the ideologically charged nature of the debate:
the minimally supported assumption that any cyberspace regulation will destroy the space. This assumes that there
are only two choices: regulation and non-regulation. The second assumption is that rule and sanction regulation
will necessarily maximize informational privacy and that informational privacy will necessarily be eroded by private
industry under the current digitarian and industry regime. A corollary holds that efficiency, economic value and
utility of personal information will all be diminished by regulation and maximized by digitarian policies. Another
assumption is that rule and sanction regulation can only be applied to the people and technologies at the endpoints
of the process or at the use/employment stage of personal information. In truth, there has been very little thought
or discussion surrounding the possibility of controlling the mechanisms of collection. Finally, supporters of regulation
assume that regulation must either assess value and balance interests or categorically determine harm and benefit.
For the remainder of this article, I will argue that these assumptions, while not necessarily incorrect, approach
a new problem (or, at a minimum, one markedly different in degree) through antiquated modes of understanding, which
necessarily detracts from the discussion and, thereby, potential solutions.
IX. Enhancing the Digital Trust
- In the absence of these charged assumptions and for the sake of argument, one might put forward a regulatory
proposal built upon the understanding implicit in the OPS and P3P platforms: 1) propertization of consumer information,
2) perfectly discriminated and informed, instantaneous one to one contracting, and 3) ex ante technological regulation
of the collection of information. For example, if the FTC promulgated a regulation mandating that collectors of
consumer information employ a standard or technology similar
to OPS or P3P, it would ameliorate many of the previously discussed problems.
The FTC, by doing so, would essentially be requiring companies (who collect personal information) to offer the
consumer the advantages implicit in a "trusted systems" architecture.
- For example, imagine that a consumer enters a grocery store and purchases a number of goods. In accordance
with FTC regulations, this particular supermarket, which collects and uses consumer information, operates a privacy
platform on its cash register system. The customer's credit card or frequent shopper card is encoded with her privacy
preferences. Thus, when it comes time to pay, the toothpaste, potato chip, mustard and cola purchases automatically
flow into the supermarket's database. However, the cash register system does not collect the information surrounding
the customer’s purchase of alcohol and health related products, such as an early pregnancy test, because this consumer
blocked that information through the encoded privacy preference mechanism. The technological capacity is limitless--preferences
can and will become quite minute in detail. For example, in the future, a peculiar consumer may only block purchases
of Budweiser and Denorex.
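The register-side filter in this scenario reduces to a few lines. The categories and code are hypothetical, sketched only to make the mechanism concrete:

```python
BLOCKED = {"alcohol", "health"}  # categories encoded on the shopper's card

def recordable(purchases):
    # purchases is a list of (item, category) pairs; only items in
    # unblocked categories flow into the supermarket's database.
    return [item for item, category in purchases if category not in BLOCKED]

basket = [("toothpaste", "toiletries"), ("potato chips", "grocery"),
          ("mustard", "grocery"), ("cola", "grocery"),
          ("pregnancy test", "health"), ("beer", "alcohol")]
assert recordable(basket) == ["toothpaste", "potato chips", "mustard", "cola"]
```

Finer-grained preferences, such as blocking individual brands rather than whole categories, would simply swap the category test for a lookup against a per-item block list.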
- Essentially, this scenario has the same unresolved problems as OPS and P3P, consumer trust and the threat of
marauding information collectors. To confront these problems and thus complete the proposal, the FTC might also
provide consumer education. However, education alone will be insufficient. Instead, in order to deter violation
of consumer trust, sanctions for breaching the one to one contracts would be required to level certain contractual
and technological asymmetries. I predict that once consumers understand that the technology is backed by an enforcement
mechanism and that they have practical and available protection for their informational privacy rights, the levels
of consumer trust will rise, especially given the potential ubiquity of use and concomitant benefits.
- The ascendance of consumer trust, in turn, will expand the market mechanism and increase the amount of consumer
information available to interested parties. Ideally, the market for consumer information would then lead the transformation
of an analog environment into a fluid, digital market with the very real potential of becoming perfectly informed
and segmented by individual privacy and consumer preferences.
X. Shifting Ground
- Because of the absence of government regulation in cyberspace, informational privacy is becoming a thing of
the past. Theoretically, citizens maintain a say in government through the ballot box. Yet, the separation of ownership
and control entrenched in American corporations assures most of us that the bottom line will continue to be the
only mechanism holding sway over industry intrusion into this domain of personal privacy. Thus, while the theoretical
distinction still exists between the government and private sector, any functional distinction seems to be fleeting.
- The quasi-regulatory solution proposed in this article may enhance informational privacy and harness the potential
of cyberspace without threatening or stunting its growth, or ossifying its potential. However, arriving at a point
where we can seriously discuss such novel proposals, requires that we free ourselves from the limiting tendency
to view regulation as ontologically beneficial or detrimental and the tendency to automatically associate lesser
or greater privacy protection, efficiency gains or market value with digital libertarianism. Finally, we must not
limit our regulatory thinking to the use or employment of personal information organized by category. Thus, any
workable solution to the problem of informational privacy in cyberspace will require an understanding free from
the rhetoric, fears and perceived realities of Big Brother and the bottom line. This will ideally produce a discussion
and understanding with a greater degree of functional novelty and non-dogmatic understanding than the current debate.
Failure to shift the debate may well result in the erosion of privacy rights at the pace of technological development.
- Ideally, by approaching the current debate from a different perspective and harnessing the beneficial characteristics
of a "market-based regulatory solution," consumers and the personal information industry
will help realize the potential of cyberspace as an architecture and a commercial space, while maximally enhancing
and individualizing informational privacy. In a sense, this argument might serve to recast the entire
cyber-debate into a wider debate on post-technological politics. More optimistically, the privacy debate in cyberspace
might be viewed as the leading edge in coming to grips with a reformation in the blurring polarities of governance.
[*] Business Development, QUIQ, Inc., firstname.lastname@example.org; Harvard University,
J.D. 1998; Brandeis University, B.A. 1993. The author would like to thank Isabel Dedring, Jassmine Safier, and
Aaron Naparstek for ideas, edits, and arguments. Of course, they are also responsible for any and all errors and/or
oversights contained herein.
 The term "cyberspace" is usually traced back to William
Gibson’s 1984 science-fiction novel, Neuromancer. Gibson, and other like-minded free-radicals, regard cyberspace
as "a consensual hallucination . . . .A graphic representation of data abstracted from the bank of every computer
in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations
of data. Like city lights receding." WILLIAM GIBSON, NEUROMANCER
51 (1984). To civic-minded cyberthusiasts, like Howard Rheingold, cyberspace is "the conceptual space where
words, human relationships, data, wealth and power are manifested by people using CMC [computer-mediated communications]
technology." HOWARD RHEINGOLD, THE VIRTUAL COMMUNITY 5 (1994). Closer to home, Trotter Hardy defines cyberspace
as a "means of communication directly between human beings." Trotter Hardy, The Proper Regime for
Cyberspace, 55 U. PITT. L. REV. 993, 1000 (1994). In the end, while
functionally irrelevant, "cyberspace," as used herein shall refer to the ether (whether it be the human
mind, coaxial cable, fiber optics, computer RAM, hard drive or floppy, network, integrated database or modem interface)
where information of any shape or form is stored, processed, transported, collected, formatted, uplinked, downloaded,
communicated and/or debated. In short, cyberspace is the space where digital information lives, works and dies.
 Ironically, Wired magazine owns a trademark in the phrase "digital
 See GINI GRAHAM SCOTT, MIND
YOUR OWN BUSINESS 1-10 (1995).
 EQUIFAX INC., HARRIS-EQUIFAX CONSUMER PRIVACY
SURVEY, 4 (1992).
 It has been said that the right to privacy "as used in law
has as many meanings as a hydra has heads." Diane Zimmerman, False Light Invasion of Privacy: The Light
That Failed, 64 N.Y.U. L. REV. 364 (1989). I won’t add another. Rather, see Patricia
Mell, Seeking Shade in a Land of Perpetual Sunlight: Privacy as Property in the Electronic Wilderness, 11
BERKELEY TECH. L.J. 1, 81 (1996), for a comparison of federal statutes
regulating informational privacy. Like Professor Michael Froomkin, for the most part I use privacy to mean "the
control of information about oneself." Michael Froomkin, Regulation and Computing and Information Technology:
Flood Control on the Information Ocean, 15 J.L. & COM. 395 n.3 (1996). The "right
to privacy" or "privacy rights" remains an implicit assumption in this discussion. For more normative,
if not philosophical, debate regarding whether there is a "right" to privacy or what that right may mean,
see Judith Jarvis Thompson, The Right to Privacy, 4 PHIL. & PUB.
AFF. 295 (1975). See also, Warren & Brandeis, The Right To Privacy, 4 HARV. L. REV. 193, 205 (1890) ("the right to be left alone"); Charles
Fried, Privacy, 77 YALE L.J. 475 (1968) ("control of personal access to oneself");
Ruth Gavison, Privacy and the Limits of the Law, 89 YALE L.J. 421 (1980) ("having control
of [one’s] entire realm of intimate decisions"); and William Prosser, Privacy, 48 CAL.
L. REV. 383 (1960).
 See Olmstead v. United States, 277 U.S. 438 (1928).
 Cookie.txt refers to a file kept on the hard drives of computers
using the Netscape Browser. The file collects data about the user’s previous access to individual Web sites. Often
the technology allows repeat visitors to bypass CIG (or sign in) screens. However, the information is not necessarily
secure and extensive media coverage raised serious concerns regarding the threats to personal privacy associated
with cookie files. See John M. Moran, Cybercommerce: While You Browse, Someone Else on the Web is Taking
Very Careful Aim, L.A. TIMES, June 10, 1996, at D2. Or for a more beneficent view, see
<http://www.netscape.com/security/basics/privacy.html#cookies>. Of course, Microsoft’s Internet Explorer
("IE") utilizes a similar technology under a different name.
 This is information manufactured by clicks of the mouse (e.g.,
turning Web pages, following hypertext links, or selecting "25-30 years of age, male" on a drop-down menu).
 EQUIFAX INC., HARRIS-EQUIFAX CONSUMER PRIVACY
SURVEY, 4 (1992).
 U.S. DEP'T OF COMMERCE, NATIONAL TELECOMMUNICATIONS AND INFORMATION ADMIN.,
PRIVACY AND THE NII, SAFEGUARDING
TELECOMMUNICATIONS-RELATED PERSONAL INFORMATION
 As cited in Shapiro, Privacy For Sale, THE
NATION, 11 (June 23, 1997).
 After it was reported in September of 1996 that the Lexis-Nexis
online database P-Trak listed millions of private names, addresses and social security numbers, the company was
deluged with thousands of angry phone calls and e-mail messages from people requesting removal. Jim Dillon, Lexis-Nexis:
Rumor Unleashes ‘Net Flap, DAYTON DAILY NEWS,
Sept. 19, 1996, at A1.
 The Wall Street Journal reported that Blockbuster was planning
to sell mailing lists of its customers, categorized according to preference in movie genres. A few days later,
Blockbuster’s CEO announced that a vice-president "misspoke" when discussing the plan with the journalist.
See Michael Miller, Blockbuster Contradicts Official, Saying It Won’t Sell Rental Data, WALL
ST. J., January 2, 1991, at B6.
 These two companies canceled the release of Lotus Marketplace:
Households, a CDROM software package that would have provided small businesses with information regarding American
households. The information would have been used for targeted marketing. After 30,000 consumers requested removal
from the CDROM, the companies capitulated. Once consumer outrage subsided, Lotus sold its interest in the software
and another company released the information under another name. Shelby Gilje, Consumers Deliver A Loud No!
To 'Households', SEATTLE TIMES, Jan. 24, 1991, at G1.
 In response to consumer outrage, AOL abandoned its plans to
begin providing lists of its customers’ telephone numbers to telemarketers and other direct-sales merchants. Associated
Press, AOL Learns Power of Riled Customers (July 26, 1997).
 Citing privacy concerns, on April 9, 1997, the SSA shut down
a feature on its Web site where users could get "personal earning and benefits estimate statements" by
providing a name, address, telephone number, place of birth, social security number, and mother’s maiden name.
A letter from congressional leaders to John J. Callahan, the Acting Commissioner of Social Security, stated that
the agency’s on-line service "may not afford sufficient protections against violations of individual privacy."
Social Security Closes On-Line Site, Citing Risks to Privacy, N.Y. TIMES, Apr. 10,
1997, at A15.
 The Internet specifically refers to a worldwide group of interconnected
networks that use the Transmission Control Protocol/Internet Protocol, a sort of universal language for computer
communication via the telecommunications infrastructure. See What is the Internet <www.infoctr.edu/pa>.
However, for my purposes, the internet, cyberspace, the information superhighway, the Global Information Infrastructure
("GII"), and other related terms, while nuanced, are used interchangeably.
 IDC Expects the Worldwide Internet Economy to Exceed $1Trillion
by 2001, press release, Nov. 3, 1999, <http://www.idc.com/Data/Internet/content/NET110399PR.htm>.
 Bernstein, N.Y. TIMES, June 12, 1997,
 See Lawrence Lessig, The Constitution of Code: Limitations
on Choice-Based Critiques of Cyberspace Regulation, 5 COMMLAW CONSPECTUS 181 (1997) and Lawrence Lessig, Symposium, The Zones of Cyberspace, 48 STAN. L. REV. 1403 (1996). While other factors may limit cyberspace, such
as legal regulation, social norms and the market, these factors limit people, corporations and governments in real
space. For example, law might inform people that certain actions in cyberspace will result in jail time in real
space. Or, market analysis might indicate that proceeding in one direction in cyberspace will result in filing
Chapter 7 in real space.
 Security or encryption plays a dynamic role in this debate.
See text infra.
 See THE FEDERALIST
NO. 17 (Alexander Hamilton) (attempting to assuage fears of an overly strong central government);
Michael W. McConnell, Federalism: Evaluating the Founder’s Design, 54 U. CHI. L. REV. 1484 (1987).
 See McConnell.
 Or, a national impulse toward reform and a perfected society
in American politics. See HAYNES JOHNSON, SLEEPWALKING
THROUGH HISTORY 65 (1991).
 See ELIZABETH A. FONES-WOLF, SELLING FREE ENTERPRISE: THE BUSINESS ASSAULT ON LABOR AND LIBERALISM, 1945-60 (1994).
 Id. at 79.
 Id. at 79-140.
 See generally MORTON J. HORWITZ, THE TRANSFORMATION OF
AMERICAN LAW, 1870-1960 (1992).
 FRANCES FOX PIVEN & RICHARD A. CLOWARD, REGULATING
THE POOR: THE FUNCTIONS
OF PUBLIC WELFARE (1971).
 JOHNSON at 66.
 Princeton historian Arthur S. Link has described these decades
of continuing expansion of government’s role in American life as representative of a profound shift from the dominant
nineteenth-century tradition of "’the neutral, umpire state, which gave special favors to no class or interest,’
. . . to a twentieth-century one exemplified by Theodore Roosevelt’s belief ‘that the federal government should
play a dynamic, positive role by directly regulating business activities and giving special protection to labor
and farmers.’" ARTHUR S. LINK, AMERICAN
EPOCH: A HISTORY OF THE
UNITED STATES SINCE THE
1890’S, at 69 (1955), cited in JOHNSON at 65-66.
 Ronald Reagan, Inaugural Address (Jan. 20, 1981).
 Ronald Reagan, speech, A Time for Choosing (Oct. 27, 1964).
 Winfred M. Adams, Reagan’s cabinet secretary and adviser,
encapsulated the ideology. "The Roosevelt administration was the beginning of what we have increasingly had
since, [essentially, it] is that the government’s responsibility is more or less cradle-to-grave. . . . I don’t
think most of us in my generation agree with that philosophy or with that approach. I think probably in this country
we have basically two viewpoints on government. . . . One would be liberal and one would be conservative. I think
it is difficult for people like myself to think in terms of the common man. . . . Reagan or I or most of us really
[can’t] think in terms of the common man. We don’t think that way. We think more in terms of the individual."
Winfred M. Adams, cited in JOHNSON at 73.
 See generally David C. King, The Polarization of
American Political Parties and Mistrust of Government, available at <http://www.ksg.harvard.edu/prg/king/polar.htm>,
in WHY PEOPLE DON'T TRUST GOVERNMENT, (David C. King et al. eds., 1997).
 Jamie Boyle refers to this "set of political and legal
assumptions" as digital libertarianism. See James Boyle, Foucault in Cyberspace: Surveillance, Sovereignty,
and Hardwired Censors, 66 U. CIN. L. REV. 177 (1997). I will draw
heavily on Boyle's criticism that this brand of libertarianism is inadequate because of its blindness towards the
effects of private power, and the state's own power in cyberspace. I will also draw heavily upon Lawrence Lessig's
related, though more understated, conclusions in The Constitution of Code: Limitations on Choice-Based Critiques
of Cyberspace Regulation, 5 COMMLAW CONSPECTUS 181 (1997).
 The State of the Planet 1998, WIRED
6.01, Jan. 1998, 163-207.
 For an early statement of this argument, see Ithiel
de Sola Pool, TECHNOLOGIES OF FREEDOM (1983).
 See generally Boyle; Lessig, The Constitution of
Code; and Gerald Frug, Decentering Decentralization, 60 U. CHI. L. REV. 253 (1993).
 President William J. Clinton, White House Press Release (July
 President William J. Clinton, A Framework for Global Electronic
Commerce, Executive Summary (July 1, 1997).
 Robert Posch, quoted in Oscar H. Gandy Jr., Legitimate
Business Interest: No End in Sight? An Inquiry into the Status of Privacy in Cyberspace, 1996 U. CHI.
LEGAL F. 77, 135 (1996).
 For an in-depth analysis on the roots and existence of contemporary
distrust for government, see King, supra note 38.
 TRUSTe Privacy Study, Boston Consulting Group for CommerceNet
and the Electronic Frontier Foundation (Mar. 1997) <http://www.truste.org/webpublishers/pub_surveys.html>.
 Georgia Institute of Technology, Graphics, Visualization
and Usability Center Annual Survey (Oct. 1996) <http://www.gvu.gatech.edu/user_surveys/survey-10-1996/>.
 Boston Consulting Group, supra note 49.
 See Lessig, supra note 43.
 While I occasionally make reference to the problems associated
with privacy in the workplace (employer and employee) and in politics (elected officials and the electorate), the
primary focus of this discussion remains privacy in the marketplace, that is, consumer privacy.
 REGAN, LEGISLATING
PRIVACY, 69 (1995).
 See <http://www.domesdaybook.co.uk/> for a searchable version of the Domesday Book.
 For an interesting account of the circumstances surrounding
the making of the Domesday book, see Sue Arnold, The British are making book on Great and Little Domesday,
SMITHSONIAN 82 (July 1986).
 DoubleClick’s method of matching web surfers to personally
identifiable information worked like this: first, when a surfer visited a site participating in the DoubleClick
advertising network, DoubleClick would send a cookie to her computer containing a unique identification number.
Next, DoubleClick would send the same ID number to a site that knew who the surfer was because of her
registration there. That company would then send back the data that DoubleClick needed to look her up in the Abacus database.
This departed from DoubleClick’s earlier stated policy that it would track only computers connected to the internet
through cookies and would not attempt to link the data to individual users. Will Rodger, Activists Charge DoubleClick
Double Cross, USATODAY.com (Feb. 21, 2000) <http://www.usatoday.com/life/cyber/tech/cth211.htm>.
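The three-step matching scheme described above can be sketched as a simple join between an anonymous cookie identifier and an identified offline database; every identifier, name, and record below is invented for the example.

```python
# Illustrative reconstruction of the cookie-to-identity matching described
# above; all data is hypothetical.

abacus = {"J. Doe": {"catalog_purchases": ["hiking boots"]}}  # offline database
registration_site = {"cookie-42": "J. Doe"}  # site holding the surfer's registration

def match(cookie_id):
    # Step 1: the ad network reads its cookie's unique ID from the browser.
    # Step 2: the registration site resolves that ID to a real name.
    name = registration_site.get(cookie_id)
    # Step 3: the name keys into the offline purchase database.
    return name, abacus.get(name)

name, profile = match("cookie-42")
# links anonymous browsing ("cookie-42") to an identified offline profile
```

The sketch makes the policy departure concrete: the cookie alone tracks only a computer, but one lookup against a registration record converts it into a dossier on a named individual.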
 Gandy separates personal contributions to machine readable
or network linked data files into 11 categories: personal information for identification and qualification, financial
information, insurance information, social services information, utility services information, real estate information,
entertainment and leisure information, consumer information, employment information, educational information, and
legal information. See GANDY, THE PANOPTIC
SORT, 63 (1993).
 Unless otherwise indicated, government includes federal, state
and local governments.
 See note 105 and accompanying text, infra.
 GANDY, PANOPTIC
at 55. For an argument that these numbers were grossly underestimated because the act of recording their existence
was complicated by the growth in the use of personal computers by employees in government agencies, see
Regan, Privacy, Government Information, and Technology, 46 PUBLIC ADMIN.
REV. 629-34 (Nov/Dec 1986).
 David R. Johnson and David Post argue that the cost of exit
(leaving, or opting out of, a particular jurisdiction, service or product) in cyberspace is so low that government
regulation is futile. However, as the cost of exit inches upward, the need for some sort of regulation, whether
it be governmental or other, as argued herein, concomitantly increases. See Johnson & Post, Law and
Borders: The Rise of Law in Cyberspace, 48 STAN. L. REV. 1367 (1996). See
also David G. Post, Anarchy, State, and the Internet: An Essay on Law-Making in Cyberspace, 1995 J.
ONLINE L. 3 (1995) <http://warthog.cc.wm.edu/law/publications/jol/post.html>.
 Employer collection and use of personal information is an
increasingly important issue. However, given my consumer-oriented focus, it is plainly outside the scope of this discussion.
 This foundational premise has been referred to as personal
 See THOMAS E. MCMANUS, TELEPHONE TRANSACTION-GENERATED
INFORMATION: RIGHTS AND RESTRICTIONS
1 (1990) cited in Gandy, 1996 U. CHI. LEGAL F. 77, 106 (1996).
 Id. at 107.
 See note 221 and accompanying text, infra.
 While it may still be possible to provide false information
on, or not fill out, browser preferences, often this is not possible if you want to use the good or service. This
ability might also be phased out once certain companies and products gain market share and consumers become increasingly
 DANIEL MARTIN,
ADVANCED DATABASE TECHNIQUES 5 (1986).
 Froomkin, Flood Control on the Information Ocean, 15
J.L. & COM. 395, 489 (1996).
 Robert Garcia, Garbage In, Gospel Out: Criminal Discovery,
Computer Reliability, and the Constitution, 38 UCLA L. REV. 1043, 1065 (1991).
 See Steven A. Bercu, Toward Universal Surveillance
in an Information Age Economy: Can We Handle Treasury’s New Police Technology?, 34 JURIMETRICS
J. 383, 429 (1994).
 See GANDY, PANOPTIC
 COLIN BENNETT,
REGULATING PRIVACY: DATA PROTECTION
AND PUBLIC POLICY IN EUROPE AND
THE UNITED STATES 18 (1992).
 ROBERT P. MERGES,
ET AL., INTELLECTUAL PROPERTY
IN THE NEW TECHNOLOGICAL
AGE 830 (1997).
 DAVID SHENK, DATA SMOG 114-115 (1997).
 See Herman, The Protection of Computer Databases
Through Copyright, PENN. BAR ASSOC. QUARTERLY 35 (Jan. 1994).
 Patent US05790426, Issued Aug. 4, 1998, Automated collaborative
filtering system, assignee: Athenium L.L.C., Cambridge, Mass., inventor: Robinson, Gary B.; Ellsworth, Me.
 SHENK, supra note 76 at 113.
 Id. at 115-116.
 For a laundry list of lists, see Jeff Sovern, Opting
In, Opting Out, Or No Options At All: The Fight For Control Of Personal Information, 74 WASH.
L. REV. 1033 (1999).
 See (visited Apr. 17, 2000) <http://www.infousa.com>.
 Virginia G. Maurer & Robert E. Thomas, Getting Credit
Where Credit Is Due: Proposed Changes In The Fair Credit Reporting Act, 34 AM. BUS. L.J. 607, 611 (1997).
 See (visited Apr. 17, 2000) <http://www.the-dma.org/aboutdma/whatisthedma.shtml>.
 Many chapters have been written extolling the virtues, or
decrying the existence, of agents and push technologies. For example, Jaron Lanier maintains that the ascendancy
of agents is "both wrong and evil." See Jaron Lanier, My Problem with Agents, WIRED
4.11, November 1996, and Jaron Lanier, Agents of Alienation, 2 J. of CONSCIOUSNESS
STUD. 1 (1995), <http://www.well.com/user/jaron/agentalien.html>. Pattie Maes of MIT
and Nicholas Negroponte of MIT and Wired, with some reservations, have supported the technology. Maes founded,
and Negroponte sits on the board of, Agents, Inc. (now Firefly, Inc.) in Cambridge, MA. Predictably, Firefly was
purchased by Microsoft. Negroponte, however, remains sufficiently concerned about the flattening of some dialectical
processes, and argues for a certain level of "serendipity" to be built into intelligent agents. In response
to such concern, some companies have developed technologies wherein the original user informs her agent that she
is interested in a filtration through the perspective of certain personalities, reviewers or even random individuals.
Predictably, at this point, this feature cannot be found in any company’s products. Nonetheless, the sociological,
psychological and semiotic issues and possibilities surrounding these technologies are at once terrifying and fascinating.
A particularly good (semiotic) discussion can be found in STEVEN JOHNSON,
INTERFACE CULTURE: HOW NEW
TECHNOLOGY TRANSFORMS THE WAY
WE CREATE AND COMMUNICATE (1997).
 For a general introduction to intelligent agents see
James Hendler, Is There an Intelligent Agent in Your Future? NATURE (March 11, 1999)
<http://helix.nature.com/webmatters/agents/agents.html>. For a discussion of where agent research is going
see Hyacinth S. Nwana & Divine T. Ndumu, A Perspective on Software Agents Research, 14 THE KNOWLEDGE ENGINEERING REV.
2, 1-18 (1999) <http://agents.umbc.edu/introduction/hn-dn-ker99.html>.
 Paul C. Judge, Why Firefly Has Madison Ave. Buzzing: The
Internet Startup Takes Word of Mouth to a New Level, BUSINESS WEEK,
Oct. 7, 1996.
 Chip Bayers, Capitalist Econstruction, WIRED
8.03, March 2000.
 See Hendler, supra note 89.
 Kevin Kelly & Gary Wolf, Kiss Your Browser Goodbye:
The Radical Future of Media Beyond the Web, WIRED 5.03 at 12, Mar. 1997. See also,
Kevin Kelly et al., Kill Your Browser, WIRED 5.03 at 12-23, Mar. 1997.
 Malcolm Maclachlan, Pointcast Comes Full Circle With Idealab
Sale, CMP TechWeb, May 11, 1999.
 See David Einstein, The Return of Push, FORBES DIGITAL TOOL, Nov. 8, 1999, <http://www.forbes.com/tool/html/99/nov/1108/feat.htm>,
arguing that despite the bankruptcy of nearly two dozen push startups, push technology will play a key role in
the way that content will be distributed on the Internet in the future.
 PointCast Network (now Entrypoint, Inc., <http://www.entrypoint.com>)
is a Silicon Valley company that made its way onto the Internet in the winter of 1996. PointCast allowed users
to download its original software for free. Using the PointCast interface, a user defined an interest profile or
selected a "filter." The software then installed a modified screen saver on the user’s computer. When
the computer was jacked in, a persistent connection was established between the user and PointCast’s server. PointCast
pushed (filtered) content and advertising while the computer remained connected, and cached information to run
when the connection was interrupted. At the height of the push craze in 1997, PointCast reportedly turned down
a $450 million takeover bid by News Corp. After PointCast scrapped IPO plans, a venture capital firm acquired it
for $7 million in May 1999. CMP TECHWIRE (May 11, 1999) available at 1999 WL 2495095.
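The push model the footnote describes reduces to a simple loop: while the connection holds, the server pushes items matching the user's filter and the client caches them so content still displays after the connection drops. The filter names and headlines below are invented.

```python
# Minimal sketch of the PointCast-style push model described above;
# filters and headlines are hypothetical.

def push_cycle(stream, user_filter, connected):
    cache = []
    for item in stream:
        if not connected:
            break                           # offline: serve from cache instead
        if user_filter in item["topics"]:
            cache.append(item["headline"])  # pushed to the client and cached
    return cache

stream = [
    {"headline": "Markets rally", "topics": {"finance"}},
    {"headline": "New browser ships", "topics": {"technology"}},
]
cached = push_cycle(stream, "technology", connected=True)
```

The cache is what distinguished push from ordinary browsing: filtered content arrived in the background and survived a dropped connection, which is also why the model depended so heavily on the user-defined profile discussed in the text.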
 Judge, supra note 90.
 See Nicholas Negroponte, 000 000 111-Double Agents,
WIRED 3.03, Mar. 1995.
 For example, see CVS and Giant Food, infra
note 236. Or, any of the Lanier pieces at note 88, supra.
 In Hannah Arendt's words, "A life spent entirely in
the public, in the presence of others, becomes, as we would say, shallow. While it retains its visibility, it loses
the quality of rising into sight from some darker ground which must remain hidden if it is not to lose its depth
in a very real, non-subjective sense. The only efficient way to guarantee the darkness of what needs to be hidden
against the light of publicity is private property, a privately owned place to hide in." ARENDT,
THE HUMAN CONDITION 71 (1958).
 See generally Gellman, Fragmented, Incomplete,
and Discontinuous: The Failure of Federal Privacy Regulatory Proposals and Institutions, 6 SOFTWARE
L.J. 199 (1993).
 Fair Credit Reporting Act, 15 U.S.C. § 1681, et seq.
 Id. at § 1681(b).
 Privacy Act, 5 U.S.C. § 552a (1974).
 See GANDY, PANOPTIC
 5 U.S.C. § 552a(e)(1).
 Id. at § 552a(e)(2).
 Id. at § 552a(k). Exemptions include records
that are investigatory in nature, such as those used for law enforcement purposes, information used to protect
the President, statistical records like the census, information used for eligibility for federal employment or
service, and information used to evaluate positions in the armed forces.
 Id. at § 552a(b).
 Id. at § 552a(c).
 Id. at § 552a(d)(1).
 Id. at § 552a(d).
 Id. at § 552a(j)-(k).
 Cable Communications Policy Act, 47 U.S.C. § 551 (1994).
 Id. at § 551(a)(1)(A).
 Id. at § 551(c)(2).
 Id. at § 551(c)(2)(C).
 See 18 U.S.C. § 2702.
 Id. at § 2702(b).
 See Philip Shenon, Navy Case Combines Gay Rights
and On-Line Privacy, N.Y. TIMES, January 16, 1998, at A-6.
 Citing harm to McVeigh and the public interest in privacy
and the enforcement of the ECPA, Judge Sporkin (D.D.C.) issued an injunction permitting McVeigh to remain in
active service. Sporkin held that the injunction was supported by the officer's likely success on the merits of
his claims that the Navy violated the "Don't Ask, Don't Tell, Don't Pursue" policy in obtaining the identity
of the e-mail account holder from America Online. See Timothy R. McVeigh v. William S. Cohen, 983 F. Supp. 215 (D.D.C. 1998).
See also McVeigh v. Cohen, 996 F. Supp. 59 (D.D.C. 1998) (holding that the court had the authority
to issue and enforce its remedial order).
 Video Privacy Protection Act, 18 U.S.C. § 2710 (1988).
 Id. at § 2710(b)(2)(D).
 Telephone Consumer Protection Act, 47 U.S.C. § 227 (1994).
 Id. at § 227 (b)(1)(A).
 Id. at § 227 (b)(1)(B).
 Id. at § 227 (b)(2).
 Id. at § 227 (c)(3)(F).
 The answer, with sufficient reservation for an argument
about the Privacy Act, is none.
 Children's Online Privacy Protection Act of 1998, Pub.
L. No. 105-277, 112 Stat. 2681, tit. XIII (1998). COPPA limits the ability of websites to collect personal information
from children. Websites that are directed to children must obtain prior parental consent before collecting data
from children under thirteen. The bill also gives the Federal Trade Commission authority to enact regulations and
enforce the bill. For the FTC’s final rule under the act, see <http://www.ftc.gov/os/1999/9910/childrensprivacy.pdf>.
 For current information on the status of pending privacy
bills, see <http://www.epic.org/privacy/bill_track.html> and <http://www.techlawjournal.com/cong106/privacy/Default.htm>.
 Children’s Privacy Protection and Parental Empowerment
Act of 1996, H.R. 3508, 104th Cong. (1996).
 Id. at § 2(a).
 Communications Privacy and Consumer Empowerment Act, H.R.
3685, 104th Cong. (1996).
 The proposed Act, available at <http://www.epic.org/privacy/internet/hr_98.html>,
was atrocious. Suffice it to say, all non-servers, prima facie, were not covered under the proposed
act. Rather, only commercial servers such as AOL, Lexis-Nexis, The Well, and Prodigy were targeted. One might guess
that the lobbyists had yet to swarm, although for the most part the act seems relatively toothless. In essence,
the act requires the first guy who answers the cyberspace door to offer you a written contract before he takes
and uses your information, whereas the next guy may strip search you while the Act remains silent. Finally, there
was no indication whether a server may deny you access if you refuse to release your personal stats. Obviously,
the problems continue...
 Defined as "any information service that provides
computer access to multiple users via modem to the Internet." Consumer Internet Privacy Protection Act of
1997 § 4(1), H.R. 98, 105th Cong. (1997).
 Id. at § 4(3) (equating with the meaning of
such term in § 631 of the Communications Act of 1934 (47 U.S.C. 551)).
 Defined as a statement--in writing and freely signed by
a subscriber--consenting to the disclosures such services will make of the information provided, and describing
the rights of the subscriber under this Act. Id. at § 4(4)(A)-(C), § 2(a)(1).
 Id. at § 2(a)(2).
 Id. at § 2(b).
 See supra note 12.
 See <www.ftc.gov/bcp/reports/privacy/privacy1.htm>.
 National Association of Attorneys General, Consumer
Protection Report (Dec. 1995).
 See, e.g., CAL. PUB.
UTIL. CODE § 2891 (West Supp. 1997) (limiting the extent to which
telephone and telegraph companies may use personal information).
 See generally Maureen S. Dorney, Privacy and
the Internet, 19 HASTINGS COMM. & ENT
L.J. 635, 648-50 (1997).
 CAL. CONST.
Art I, §1.
 See infra note 173 and accompanying text.
 See White v. Davis, 533 P.2d 222, 234 (Cal. 1975).
 See Porten v. University of San Francisco, 134
Cal. Rptr. 839, 841-44 (Cal. Ct. App. 1976).
 See generally Dorney, supra note 151,
 See N.Y. CIV. RIGHTS
LAW §§ 50-52 (McKinney 1992).
 See N.Y. CIV. RIGHTS
LAW §§ 50(a)-(d) (McKinney 1992).
 James R. Maxeiner, Freedom of Information and the EU
Data Protection Directive, 48 FED. COM. L.J. 93, 95 (1995).
 Council Directive of 24 October 1995 on the Protection of
Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data ("Directive"),
available on-line at <www2.echo.lu/legal/en/dataprot/directiv>.
 The Directive covers the private and public sectors, but
does not apply to data processed for national security, defense, and public security purposes. See Directive
at Art. 13(1)(a)-(c).
 Id. at Art. 6(1)(b).
 Id. at Art. 2(b).
 Id. at Art. 7(a)-(b).
 Id. at Art. 7(c).
 Id. at Art. 7(d)-(f).
 Id. at Art. 12.
 Id. at Art. 26(1) - (2).
 Here the plasticity of cyberspace is exceedingly evident.
Seemingly, the EDD, in its current form, is a direct threat to that plasticity. By forcing a conformity that many
argue is inefficient, unwise and out of step with the infinite possibilities of the environment, the EDD may well
succeed in collapsing the potentialities of the space into the myopia of the present.
 Cf. Griswold v. Connecticut, 381 U.S. 479, 484 (1965)
(holding "specific guarantees in the Bill of Rights have penumbras, formed by emanations from those guarantees
that help give them life and substance. Various guarantees create zones of privacy.") (citation omitted).
See also Loving v. Virginia, 388 U.S. 1, 12 (1967).
 Whalen v. Roe, 429 U.S. 589 (1977).
 Id. at 599 n.25.
 Id. at 598-99 (citing Olmstead at 478).
 See A. Michael Froomkin, Flood Control on the
Information Ocean: Living with Anonymity, Digital Cash, and Distributed Databases,15 J.L. & COM.
395, 493 (1996).
 Whalen, 429 U.S. at 605. But cf. Tureen v.
Equifax, 571 F.2d 411, 416 (8th Cir. 1978) ("in order to make informed judgments in these matters, it may
be necessary for the decision maker to have information which normally would be considered private, provided the
information is legitimately related to a legitimate purpose of the decision maker. In such a case, the public interest
provides the defendant a shield which is similar to qualified privilege in libel.").
 Tureen v. Equifax, 571 F.2d 411, 416 (8th Cir. 1978).
 Samuel D. Warren & Louis Brandeis, The Right to
Privacy, 4 HARV. L. REV. 193 (1890).
 Id. at 195.
 See William Prosser, Privacy, 48 CAL.
L. REV. 383, 389 (1960).
 The tort of intrusion upon one’s seclusion is based on
a right, or (Hohfeldian) privilege, to remain free from intentional intrusion "upon the solitude or seclusion
of another or his private affairs." See Restatement (Second) of Torts § 652B (1977). However,
according to the Restatement (Second) of Torts, to qualify as an intrusion, the conduct must be highly offensive
to a reasonable person, and the personal information must not be voluntarily disclosed to the public. Id.
at § 652B(c),(d).
 The doctrine penalizing the public disclosure of private
facts prohibits specific uses of personal information regardless of how the information was obtained. Again, to
recover a plaintiff must illustrate that the disclosure of information would be highly offensive to an ordinary
person. Id. at § 652D(c). This common law claim, however, is limited by the First Amendment,
especially when the injured party happens to be a public official or the matter is of "legitimate public concern."
 Here the privilege is security from publicity that places
an individual in a false light. Traditional defamation law shadows this doctrinal area. In short, an objectionable
false representation which does not meet or satisfy the defamation standard and which was revealed to the public
is actionable. Id. at § 652E.
 The misappropriation of name or likeness protects against
the use of an individual’s name, voice or likeness in a commercial capacity. Id. at § 652C(b). Most
often, plaintiffs making a misappropriation of name or likeness claim are celebrities. However, there is a definite
potential to extend this doctrine to the commercial exploitation of personal information and preferences.
 Avrahami v. U.S. News & World Report, No. 96-203 (Arlington
County, Va. Cir. Ct. 1996).
 See <www.epic.org/privacy/junk_mail/petition.html>.
 See Shibley v. Time, Inc., 341 N.E.2d 337, 339 (Ohio Ct. App. 1975).
 Id. See also, Heights Community Congress
v. Veterans Administration, 732 F.2d 526 (6th Cir. 1984) (holding, inter alia, that the production of property
addresses, loan amounts, and identities of lenders on VA insured loans granted in a particular suburb would have
constituted an invasion of privacy which was not outweighed by an asserted public interest in protecting the rights
of minority veterans to receive their benefits and determining whether VA loans had been manipulated to resegregate
the area through racial steering); see generally Joel R. Reidenberg, Setting Standards for Fair Information Practice
in the U.S. Private Sector, 80 IOWA L. REV. 497 (1995); Joel R. Reidenberg,
Privacy in the Information Economy: A Fortress or Frontier for Individual Rights?, 44 FED. COMM. L.J. 195 (1992).
 Kathryn Ericson, Suit Over Prisoner Access to Marketing
Survey May Open Data Privacy Discussion, WEST LEGAL NEWS
6347, 1996 WL 359993, July 1, 1996, at *1.
 See Amazon, Alexa Internet Sued Over Web Tracking,
DALLAS MORNING NEWS, Jan. 7, 2000, at 11D.
 See Sasha Samberg-Champion, ITAA Seeks Administration
Help On Privacy Using Digital Divide, 3/6/00 COMM. DAILY (Mar.
6, 2000) available at 2000 WL 4694599.
 Andrew J. Frackman & Rebecca C. Martin, Surfing the
Wave of On-Line Privacy Litigation, N.Y.L.J., Mar. 14, 2000, at 5.
 See supra notes 190-191.
 See FED. R. CIV.
P. 23 (1996); cf. Wilcher v. City of Wilmington, 1998 WL 113931 (3d Cir. 1998).
 See Mell, supra note 5.
 See, e.g., Maureen A. O'Rourke, Fencing Cyberspace:
Drawing Borders in a Virtual World, 83 MINN. L. REV. 609 (1998);
JAMES BOYLE, SHAMANS, SOFTWARE, AND SPLEENS (1996); J.H. Reichman & Pamela Samuelson, Intellectual
Property Rights in Data?, 50 VAND. L. REV. 51 (1997).
 See generally RONALD COASE, THE FIRM, THE MARKET,
AND THE LAW (1990).
 See Guido Calabresi & A. Douglas Melamed,
Property Rules, Liability Rules and Inalienability: One View of the Cathedral, 85 HARV.
L. REV. 1089 (1972). Calabresi and Melamed recognize that before determining the most effective
way to protect an entitlement, the threshold question of what entitlements to protect must be addressed. Here,
I make an implicit assumption that people have an entitlement to privacy, or at least to keep some aspects
of themselves secret. This may be intuitive, but it is not a given; nor (as evidenced supra) does the common law or
existing statutes explicitly recognize it.
 See Mell, supra note 5, at 4.
 See Post, supra note 82.
 See notes 174-188 and accompanying text, supra.
 Note that because of the subsequent enactment of 47 U.S.C.
§ 230, Stratton is no longer good law for the proposition that ISPs should be held to a strict liability
standard in defamation actions.
 For example, Prodigy ran ads stating, "[w]e make no
apology for pursuing a value system that reflects the culture of the millions of American families we aspire to
serve. Certainly no responsible newspaper does less when it chooses the type of advertising it publishes, the letters
it prints, the degree of nudity and unsupported gossip its editors tolerate." Stratton Oakmont, Inc. v. Prodigy
Servs. Co., 1995 WL 323710, *3 (N.Y. Sup. Ct. 1995).
 Id. But cf. Cubby Inc. v. CompuServe Inc.,
776 F. Supp. 135 (S.D.N.Y. 1991).
 For example, Barnes and Noble, Amazon, Netscape, etc.
 In the "secondary information market," where
information collected, stored and filtered is sold and used by third parties with whom the individual did not deal
or by any party for purposes the individual did not foresee, bargaining becomes quantitatively more difficult.
Down the road, this problem too will likely be addressed by a technological fix permitting consumers
to indicate to whom their information may or may not flow; a concomitant ability to trace, and possibly retract,
personal information will be bundled with this capability. When this occurs, the issue will become solely one
of consumer trust in cyberspace.
 For example, if you do not provide the information, the good
or service will remain unavailable. See note 62, supra.
 See Froomkin, supra, at 462.
 See Self-Regulation and Privacy Online: A Federal
Trade Commission Report to Congress, <http://www.ftc.gov/os/1999/9907/privacy99.pdf>. The report states that
"the Commission believes that legislation to address online privacy is not appropriate at this time. We also
believe that industry faces some substantial challenges. Specifically, the present challenge is to educate those
companies which still do not understand the importance of consumer privacy and to create incentives for further
progress toward effective, widespread implementation."
 See Mitchell Patrick, Open Profiling System Introduced
to Protect Online Privacy, 1997 WL 9025558 (1997).
 Andrew Shapiro, Privacy for Sale, THE
NATION, June 23, 1997, at 15.
 Id. at 16. As Shapiro frames the issue, "[i]f
privacy is for sale, will we peddle our digits or save our data souls?" Id.
 THE ECONOMIST,
May 31, 1997, at 22.
 To people like Potsch, this fix is infinitely worse than
regulatory intervention. Certainly, Potsch has enough confidence in his ability to lobby to ensure that any
privacy-enhancing regulation will constitute a compromise position. With anonymity, Potsch fears that the information will disappear
altogether unless consumers are paid a price for divulging their digits.
 OPS, for example, allows for the fully extensible, trusted
exchange of information of any sort. For ease of use, a small number of "well known sections" are
contained in Personal Profiles. The first is a Unique Identifier that is assigned to the Personal Profile when it is
first created. The second is a Unique Identifier that is assigned to each service visited, and is available only to
that service. The third is basic demographic information (country, zip code, age and gender) that is of use to a
broad range of Websites. The fourth is contact information (based on the vCard standard), such as name, address,
zip or postal code, country of residence, telephone number, fax number and electronic mail address. It will
also be possible to create sections for commerce information (such as credit card numbers, eCash, etc.)
and site-specific information, such as detailed personal preferences (favorite books, magazines and music) that
are of value in the context of one or a small group of Websites. In theory, OPS thereby places individuals
in full control of their personal information: they can choose to release all, some or none of their information
to Websites that request it.
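The "well known sections" described above can be illustrated with a short sketch. This is not the OPS wire format; the field names and the Python representation below are hypothetical, chosen only to mirror the four sections the standard describes and the individual's ability to release some sections while withholding others.

```python
def make_personal_profile(profile_id, country, zip_code, age, gender, contact):
    """Build a minimal OPS-style Personal Profile containing the four
    'well known sections' described in the text (names are illustrative)."""
    return {
        "unique_id": profile_id,        # assigned when the profile is first created
        "per_service_ids": {},          # one opaque identifier per service visited
        "demographics": {               # coarse data useful to a broad range of sites
            "country": country, "zip": zip_code, "age": age, "gender": gender,
        },
        "contact": contact,             # vCard-style name/address/phone/email
    }

def release(profile, sections):
    """The individual chooses which sections to release to a requesting site."""
    return {k: v for k, v in profile.items() if k in sections}

profile = make_personal_profile("abc-123", "US", "22903", 34, "F",
                                {"name": "A. Example", "email": "a@example.org"})
# Release only demographics; contact details stay private.
partial = release(profile, {"demographics"})
```

The point of the sketch is the last line: disclosure is section-by-section and at the individual's option, which is the sense in which OPS claims to place users "in full control."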
 See Open Profiling Standard (OPS) Frequently Asked Questions
(May 27, 1997) <http://developer.netscape.com/ops/opsfaq.html>. The W3C proposal is also available at
W3C (visited Apr. 24, 2000) <http://www.w3.org/Submission/1997/6/>.
 Composite Capability/Preference Profiles (CC/PP): A
User Side Framework for Content Negotiation (visited Apr. 24, 2000) <http://www.w3.org/TR/NOTE-CCPP/>.
 See P3P and Privacy on the Web FAQ (visited
Apr. 24, 2000) <http://www.w3.org/P3P/P3FAQ.html>.
 Open Profiling Standard, supra note 225.
 The Open Profiling Standard includes safeguards to help
keep Personal Profiles away from unauthorized parties. Personal Profiles may be sent between individuals and Websites
through the Secure Sockets Layer (SSL Version 3.0) as encrypted messages, and the standard recommends that Personal Profiles
be encrypted on the individual's hard disk.
 See supra; see generally OSCAR H. GANDY, JR.,
THE PANOPTIC SORT (1993).
 See generally Oliver E. Williamson, Credible
Commitments: Using Hostages to Support Exchange, 73 AM. ECON. REV. 519 (1983).
 It is important to appreciate the context in which this
debate occurs. For the most part, the digital libertarians are cybertheorists in that they understand the nuances
of cyberspace quite well. Amongst these cybertheorists and cyberlosophers, there is often a profound disrespect
for the perceived "analog understanding" of the regulators. This dynamic is played out in the debate
quite frequently. Perhaps the best example surrounds the reaction to the Communications Decency Act, where, in
the press and courtroom, the DOJ lawyers looked less than fluent in the medium.
 Although individual differences and nuances exist, prime
examples of this thinking may be found in Johnson & Post, supra note 62. See also, Post,
supra note 62. Furthermore, the understanding is embodied by such organizations as the Electronic Frontier
Foundation and magazines such as Wired.
 See, e.g., Joshua B. Sessler, Computer Cookie
Control: Transaction Generated Information and Privacy Regulation of the Internet, 5 J.L. & POL'Y 627, 676 (1997) (arguing for new legislation and "something more: a substantive expansion
of legal privacy protection doctrine to include tort and property rights in order to guard against the non-consensual
use of TGI."); Duncan et al., Recommendations, in PRIVATE LIVES
AND PUBLIC POLICIES 219 (1993).
 For example, the Washington Post recently ran the following:
CVS Corp. and Giant Food Inc. are using a computer database marketing specialist to send personalized letters to
customers who haven't refilled their prescriptions, reminding them to keep taking their medicine and pitching new
products that treat the customer's ailments. The editor of the Journal of the American Medical Association calls
the practice a "breach of fundamental medical issues" and asks: "Do you want ... the great computer
in the sky to have a computer list of every drug you take, from which can be deduced your likely diseases -- and
all without your permission?" Robert O'Harrow Jr., Prescription Sales, Privacy Fears; CVS, Giant Share
Customer Records With Drug Marketing Firm, WASH. POST, Feb. 15,
1998, at A1.
 See note 59 and accompanying text, supra.
 This argument was very well presented in Lessig, Reading
the Constitution in Cyberspace, 45 EMORY L.J. 869 (1996). See also
REGAN, LEGISLATING PRIVACY (1994) (arguing for legislation based on a recontextualization of privacy,
from an aggregation of individual preferences to a privacy derived from a sense of connection and mutuality).
 See Olmstead v. U.S., 277 U.S. 438 (1928).
 Lessig, Reading the Constitution in Cyberspace, supra, at 12-13.
 Some discussion is already occurring around the use of
such a standard. This discussion, however, for the most part currently revolves around which product is more worthy.
While this is not the forum for that debate, the discussion should eventually turn toward implementation and function
(e.g., should it be public, private, open, etc.).
 The specifics of the proposal must be left to later stages.
For now, a rough outline must suffice.
 A secure viewer acts as a sort of "embassy on the
Net." It enables "extraterritorial" enforcement of a data provider's access restrictions. Data is
distributed in encrypted form and can be accessed or managed only through the secure viewer controlled by the information
distributor. See generally Mark Stefik, Trusted Systems, SCI. AM.,
Mar. 1997, at 78-81, as cited by Joel R. Reidenberg, Lex Informatica: The Formulation Of Information Policy
Rules Through Technology, 76 TEX. L. REV. 553, 568 (1998).