WINTER 2007                          VOLUME 41, NUMBER 2                                          351

                                      JAMES P. NEHF

                  Shopping for Privacy on the Internet
             Privacy is a concern for all major stakeholders in modern society, and
             technology to erode privacy continually emerges. Studies show that
             individuals are concerned about database privacy; yet, they seldom
             make privacy a salient attribute when deciding among competing alter-
             natives. Although privacy policies are present on many Web sites, Web
             users rarely bother to read them. Professor Nehf explores why this is so,
             identifying rational reasons why Web users do not shop for privacy and
             discussing the implications for the expanding market for consumer
             information. Unless privacy becomes a salient attribute influencing
             consumer choice, Web site operators will continue to obtain and use
             more personal information than Web users would choose to provide
             in a more transparent exchange. In a responding commentary, Profes-
             sors Pitt and Watson use an ecosystem approach that explores the mul-
             tiple dimensions of privacy. Investigating the interactions between the
three major players (citizen/consumer/investor, government, and
corporation), they identify reasons for the failure of market mechanisms
             to arise to protect privacy.

   Protecting consumer privacy in the United States is largely the respon-
sibility of individuals who are expected to guard their personal information
and take steps to minimize the risk that it will be used in an unauthorized
way. Although federal (and a few state) laws restrict sharing some kinds of
personal information (in health-related fields under the Health Insurance
Portability and Accountability Act of 1996,1 in the financial services
industry under the Gramm-Leach-Bliley Act,2 and in a handful of other
economic sectors such as video rentals, children's Web sites, and telecom
industries3), the restrictions are
riddled with exceptions. In most aspects of daily life, individuals are
expected to take steps to protect their own privacy interests (Solove

  James P. Nehf is a professor of law and Cleon H. Foust fellow at the Indiana University School of
Law, Indianapolis, IN (jnehf@iupui.edu).
  This article draws upon three of the author's previous publications (Nehf 2003, 2005a, 2005b) on this
subject.

   1. Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936
(codified as amended in scattered sections of 18, 26, 29, & 42 U.S.C.).
   2. Gramm-Leach-Bliley Act, 15 U.S.C. §§ 6801-6809 (2000).
   3. Cable Communications Policy Act of 1984, 47 U.S.C. § 551 (2000); Children's Online Privacy
Protection Act of 1998, 15 U.S.C. § 6501 (2000); Telecommunications Act of 1996, 47 U.S.C. § 222
(2000); Video Privacy Protection Act of 1988, 18 U.S.C. §§ 2710-2711 (2000).

The Journal of Consumer Affairs, Vol. 41, No. 2, 2007
ISSN 0022-0078
Copyright 2007 by The American Council on Consumer Interests

2001). This is particularly true for consumer transactions on the Internet,
most of which are not subject to state or federal privacy laws.
    The self-policing model would be more effective if a market for infor-
mation privacy were conducive to individuals shopping their privacy pref-
erences online. This paper summarizes many of the reasons privacy
shopping seldom occurs.
    On the surface, market incentives seem to be present. Many online busi-
nesses purport to collect only a minimum of customer data and to keep it
secure. On the consumer side, many individuals are concerned about iden-
tity theft or the embarrassing release of private facts about them (Hoar
2001; Norberg, Horne, and Horne 2007; Saunders and Zucker 1999)
and they give as little personal information as possible in online transac-
tions (Sheehan and Hoy 1999).
    For most consumers and businesses, however, privacy-enhancing mar-
ket incentives are weak, and the conditions for market failure are strong.
Consumers do not shop for privacy, and there are several reasons why.

         AGGREGATION AND EASY TRANSFER OF DATA

   A system that relies on individuals to police their privacy rights pre-
sumes that individuals can value privacy rights meaningfully. If people
do not know what information is being collected, how it could be used,
and what harm might result from its collection and use, they have no
way to judge how much it is worth to them (in time, money, or other
trade-offs). To make an informed choice about whether and how to share
personal information, and whether to make an effort to protect it, people
need to know what is at stake.
   Most people have no idea what information a Web site collects and how
it will be used. In rare instances, a user will take time to read a Web site's
privacy policy, but even then the information is only marginally helpful.
Most privacy policies are obtuse and noncommittal (LaRose and Rifon
2007; Milne, Culnan, and Greene 2006), but even a straightforward policy
can be deceiving. For example, many privacy policies state that the site uses
cookies and other means to obtain customer information and that it shares
customer data only with affiliated companies and firms that have entered
into joint marketing agreements with the site host. A customer might decide
to use the site, especially if the site is only requesting a few simple facts
(e.g., name and postal or e-mail address). Yet, affiliated companies and
joint marketers could be numerous and involved in entirely different lines
of business, each with its own bits of information about the customer in its
own database already. Each likely will have its own set of information

practices, unknowable to the customer. Thus, even diligent Web site
users lack the information necessary to evaluate the risks of a proposed
information exchange (Varian 1992).

      SIGNALING MECHANISMS ARE NOT YET EFFECTIVE

    The information asymmetry might be ameliorated by signaling mecha-
nisms that supplement an individual's knowledge. Voluntary privacy seals,
trust marks, and similar indicators could signal strong privacy practices and
thereby help the privacy market work better (Franz 2001; Miyazaki and
Krishnamurthy 2002; Rifon, LaRose, and Choi 2005). Unfortunately, in
online interactions the current signals given by such seals are poor (LaRose
and Rifon 2007). At present, the scope of mark assurance is narrower than
one might expect. The most popular marks, at best, ensure only that the
business discloses a privacy policy with minimal protection of consumer
interests and that the mark issuer has no knowledge that the business is not
following its policy as stated. Licensors of marks do not require that sub-
scribers limit or reduce the amount of information they collect in any mean-
ingful way, nor do they dictate how collected information can be shared and
with whom. Mark issuers also do not perform regular and rigorous audits on
their clients to ensure that the site's policy is being honored (Pippin 1999).
Thus, a nonmarking business might not collect or share any information at
all, whereas a trust mark subscriber might be collecting data and selling
information to numerous outside entities (Miyazaki and Krishnamurthy
2002; Rifon, LaRose, and Choi 2005).
    Moreover, market incentives are not driving Web site operators to tough-
minded trust mark licensors (Miyazaki and Krishnamurthy 2002). Trust
marks allow Web sites to appear concerned about privacy, but they do
not provide restrictions that are specific, limiting, and enforceable.

              ACCOUNTABILITY PROBLEMS INSULATE
                     PRIVACY VIOLATORS

   For individuals to protect their privacy interests, they must be able to
identify the person who broke a law, breached a privacy policy, or allowed
access to its database because of lax security procedures. Businesses that
collect data must fear that they will be exposed and held accountable if they
do something wrong.
   There are two fundamental accountability problems. First, individuals
seldom know when a privacy breach has occurred. The vast majority of

data collection, lawful and unlawful, occurs outside of public view
(Bellotti 1997). Although on occasion a breach of privacy norms results
in media exposure, far more frequently, breaches remain hidden for
months, years, or indefinitely.
   Second, even if an injury or breach is detected, individuals may find it
impossible to trace the problem to a particular cause or source. With per-
sonal information residing in countless databases, often there will be no
way to locate the entity that caused a particular problem, sold the data,
or permitted a hack or leak that ultimately caused someone to be harmed.
Even with a noticeable harm such as identity theft, it may be impossible to
learn how the thief obtained the personal information. Tracing the injury to
the originating source often will be difficult or impossible.

          PRIVACY MUST BE SALIENT TO CONSUMERS

    For individuals to police their privacy preferences, they must incorporate
privacy concerns into their decisions whether to share personal information.
If privacy is not salient, businesses that wish to collect and share data will
offer weaker privacy terms than consumers prefer because they pay little or
no market penalty for their practices.
    Research on the saliency of privacy is conflicting. On the one hand,
behavioral economics studies suggest that consumers are concerned about
information privacy. Consumers in controlled studies have been asked to
make decisions that reveal their privacy preferences in a way that places the
question firmly into the decision-making process. When this happens, sev-
eral conclusions emerge:
    First, consumers are generally aware of privacy issues, and they are con-
cerned about guarding their personal information (Dommeyer and Gross
2003; Hann et al. 2003).
    Second, although consumers value their information, they also are will-
ing to trade information for other benefits. Consumers who are aware of the
value of their information will ask for rewards in exchange for disclosure,
suggesting that consumers can place a value on personal information, and
data can be elicited through monetary and other trade-offs (Caudill and
Murphy 2000; LaRose and Rifon 2007; Olivero and Lunt 2004; Sheehan
and Hoy 2000).
   Third, since many consumers assume the information will be sold to
third parties, an increasing number tend to disclose only those bits of infor-
mation that are not perceived to be particularly risky or too valuable to risk
trading without high rewards in exchange (Olivero and Lunt 2004; Sheehan
and Hoy 2000).

   Fourth, educated, experienced, and knowledgeable consumers tend to be
more concerned and take more precautions to protect their personal infor-
mation. High levels of technical knowledge are positively correlated with
privacy concerns (Olivero and Lunt 2004). Better educated and more afflu-
ent computer users are more likely to refuse to share personal information
online (Equifax-Harris 1995 Mid-Decade Consumer Privacy Survey;
Milne and Rohm 2000; Phelps, Nowak, and Ferrell 2000). More savvy
online consumers may even provide false information about themselves
in an effort to remain anonymous (Milne 2000).
   Fifth, perceived risk is reduced, and more personal information shared,
when consumers have a feeling of trust with the data collector (Milne and
Rohm 2000). When consumers are faced with uncertainty and risk, the rep-
utation of the data collector becomes increasingly important. People are
more willing to disclose data when the collector is well known and has
an image to maintain because a data collector's desire to maintain its rep-
utation is a perceived deterrent to data misuse (Olivero and Lunt 2004). If
consumers have an established relationship with the data collector, they
usually have fewer privacy concerns (Olivero and Lunt 2004; Sheehan
and Hoy 2000).
   These and other findings suggest that consumers have incentives and are
motivated to shop their privacy preferences. They also show that online
firms have incentives to respond to those preferences. All of this suggests
that a market for information privacy is emerging, but the suggestion is
misleading.

                  Privacy Is Seldom Salient in Practice

   Unfortunately, what occurs in a controlled research environment does
not happen in the online world. While consumers in controlled environ-
ments seem to value privacy and strive to protect it in their decisions about
sharing information, their decisions about disclosing information in online
transactions often do not match their stated privacy concerns (Dommeyer
and Gross 2003; Norberg, Horne, and Horne 2007). Consumers seldom
read privacy policies, and seldom even cite privacy as a factor in deciding
which business to use or which Web sites to frequent (Fogg et al. 2002).
There are several reasons why.
   Generally speaking, consumers make decisions under conditions of lim-
ited or bounded rationality, and decisions about sharing personal informa-
tion online are no different. People have a limited capacity for obtaining,
understanding, and using information at each stage in a decision-making
process (Apter 1992; Jacoby 2000; Simon 1955). Most consumer behavior

is predicated upon low-effort or low-involvement decision making that
involves a limited number of salient attributes, with the consumer disregarding
less salient attributes to choose the best alternative (Hoyer and MacInnis
1997).
   This does not necessarily mean that people act irrationally but that they
pursue goals other than the strict accuracy of the decision. In making deci-
sions about interactions with Web sites, consumers pursue other goals that
render privacy less salient than other attributes.

      Rational Decision-Making Goals Other Than Maximum Accuracy

    People choose decision strategies that are a compromise between their
desire for complete accuracy (choosing the alternative that best serves
their interests) and their desire to achieve other goals. Other than maxi-
mizing the accuracy of the decision, another important goal is the min-
imization of cognitive effort (Bettman, Luce, and Payne 1998). When
making decisions, people tend to expend only as much effort as is nec-
essary to reach a satisfactory, rather than optimal, decision (Garbarino and
Edell 1997). As circumstances require more cognitive effort to process
available information, decision makers often choose decision methods
that are easier to implement, though less accurate because important fac-
tors are left out of the calculus (Garbarino and Edell 1997; Johnson,
Payne, and Bettman 1998; Lussier and Olshavsky 1979). Moreover, when
people are required to exert more cognitive effort to evaluate a particular
alternative, they often are less inclined to prefer it to alternatives that
require less effort to evaluate, unless that alternative is clearly superior
in the end (Garbarino and Edell 1997). In other words, exerting
more cognitive effort can produce negative affect associated with that
alternative and can make that alternative less appealing simply because it
is harder to evaluate.
   Even when individuals are motivated to exert cognitive effort to eval-
uate alternatives accurately, practical problems can create obstacles that
affect saliency. Research suggests that the number of attributes decision
makers are capable of investigating and integrating into the decision pro-
cess is as few as five, though the number will vary depending on the per-
ceived importance of the decision (Bettman, Luce, and Payne 1998;
Lussier and Olshavsky 1979; Olshavsky 1979). In addition, if people
do not notice an attribute, it cannot have an impact on the decision process
(Fogg 2003). Time constraints also can be important. When time to make
a decision is scarce, people switch from more complete decision-making
strategies to strategies that accelerate their information processing (Payne,

Bettman, and Luce 1996; Pieters and Warlop 1997; Wright 1974). While
there may be plenty of time to read the privacy practices of each Web site
visited, to do so would substantially impair one of the principal benefits of
going online: a fast and convenient way to learn information, communi-
cate with others, and purchase goods and services. Regardless of time con-
straints, however, if there are limits on the number of attributes consumers
can effectively investigate when making choices, privacy has to be impor-
tant enough to be in that top tier.
   Another important goal in consumer decision making is minimizing the
negative emotional response that people experience when forced to make
difficult trade-offs. We are emotional beings, and choices sometimes
involve wrenching decisions, giving up something of value that we do
not wish to lose (Bettman, Luce, and Payne 1998; Lazarus 1991). People
want to minimize the discomfort that arises from facing emotion-laden
choices, and they tend to select decision strategies that further
this goal. This can reduce the accuracy of the decision because the indi-
vidual will avoid certain parts of the calculus that require discomforting
comparisons. When this occurs, individuals focus their attention
elsewhere and choose strategies that allow them to avoid making uncom-
fortable comparisons (Luce 1998; Tetlock 1992; Tversky and Shafir 1992).
    Depending on the context, one or more of these goals (accuracy of the
decision, cognitive ease, and emotional comfort) may be more prominent
in the decision process. For example, when faced with an irreversible
decision that will have profound effects on one's life, the decision maker
may care less about cognitive ease and emotional comfort and work hard
to make the most accurate choice. The relative weight given to each goal
also is influenced by the decision maker's ability to get feedback about the
choice. In general, feedback about cognitive effort and emotional comfort
will be more immediate and less ambiguous than feedback about the accu-
racy of the choice, which may come at a later time (Bettman, Luce, and
Payne 1998; Einhorn 1980). When that occurs, the decision maker likely
gives less weight to the accuracy goal and more weight to the other two
goals.

               Other Factors Influencing Consumer Choice

   As consumers pursue these decision-making goals, the likelihood that
a consumer will process a particular attribute, and thereby make it salient
in the decision-making process, is influenced by many factors. In the con-
text of online decision making, several factors are relevant.

Inferences

   If an attribute is important but not easy to evaluate, people may infer the
missing value rather than investigate it (Fogg 2003). They may infer a value
from the values they already know. For example, they may assume that the
attribute is similar across brands (e.g., all Mercedes Benz car warranties
probably are similar). Or, they may infer a value in line with the values
they assigned to other attributes of the given option (e.g., since Mercedes
Benz engineering is first rate, the warranty probably is as well). Consumers
often use a brand as a proxy for credibility rather than investigating the
important characteristics more completely (Smith and Brynjolfsson
2001; Wernerfelt 1988).

Framing Effects

   The form and manner in which information is provided will affect its
saliency (Magat, Viscusi, and Huber 1987; Viscusi 1966). Consumers
process information in a way that is congruent with the format of its pre-
sentation, processing the information in the form presented without
rearranging it (Bettman and Zins 1979; Slovic 1972). The effect is most
pronounced when consumers perceive the costs of accepting the given for-
mat (both the effort required to delve into the subject more deeply and the
lost accuracy in accepting the information as given) to be low. Only if the costs of
format acceptance are perceived to be high, or if the information is pre-
sented in a disorganized or confusing way, will consumers discount the
format as presented and seek additional information. Thus, people often
choose between descriptions of options rather than the options themselves,
accepting the description as accurate (Baron 1997; Jenni and Loewenstein
1997).

The Availability Heuristic

   People overrespond to risks that are well known because of news cov-
erage or immediacy. Such risks are "available" in people's minds, and they
can therefore bring the information into the decision process more readily
(Kuran and Sunstein 1999). The availability heuristic becomes relevant
when people base judgments on the probability of certain events happen-
ing. Judgments about probability often are affected by how familiar deci-
sion makers are with instances of the event occurring. "Availability
entrepreneurs" exploit the heuristic by focusing public attention on events
to ensure that the event will be more available and more salient in the

decision-making process. The availability heuristic can work the other way
as well. People underestimate the likelihood of certain events because those
events do not come to their attention often (Jolls, Sunstein, and Thaler
1998).

        IMPLICATIONS FOR PRIVACY SHOPPING ONLINE

   The decision strategies, goals, and behavior patterns outlined above have
important implications for consumers making decisions about information
privacy online.
   If consumers use decision strategies rationally to pursue goals other than
maximum accuracy of the decision, one outcome may be that they prefer to
forego the cognitive effort that is needed to read and decipher privacy pol-
icies. Thus, while it may be in a Web site's interest to post a privacy policy
or display a trust mark to give the impression that it cares about safeguard-
ing user information, it may not be in the site's interest to encourage or
direct consumers to view privacy terms before entering into a transaction,
or to require a click on an "I agree" button. This is the case even if the
site has a stronger privacy policy than its competitors, because any such
step increases the effort needed to use the site. Unless the site
can demonstrate a substantially superior privacy practice, its efforts may
be counterproductive.
    Because of the desire to minimize emotional conflict, people may avoid
comparing attributes that are dissimilar. A well-
informed consumer may learn that a Web site does not retain or sell personal
information of any kind, but find it difficult to compare the value of that site's
privacy policy with different benefits (such as lower prices) from another site.
When considering the alternatives, people are faced with a fundamental
incomparability among competing options (Adler 1998; Chang 1997). What
is the value of knowing that the details of one's life are not sold to third
parties? Is it worth giving up the benefits offered by the competing, but less
private, alternative? Comparing disparate categories of benefits and costs is
extremely difficult in any circumstance, and when making decisions about
privacy the attributes we are asked to compare vary widely. The emotional
conflict created by the comparison is heightened when a person is asked to
put a price on something she believes should not be commodified or traded
away (Bettman, Luce, and Payne 1998; Bettman and Sujan 1987). The prob-
lem is most acute when people are asked to trade values they view as sacred
or protected (Baron and Spranca 1997; Tetlock, Peterson, and Lerner 1996);
for most people privacy is such a value.

    Rather than struggle to make a difficult comparison, individuals may
turn to affect cues (feelings derived from a consumer's experiences with
a particular alternative) as a decision-making guide. Literature shows that
affect cues exert a stronger influence on choice when consumers have
diminished ability to judge alternatives rationally (Pham 1998). When
consumers find it difficult to process and compare the information neces-
sary to make an accurate decision, affect cues become even more pro-
nounced (Pham 1998; Winkielman, Zajonc, and Schwarz 1997). As
a result, feelings generated from a Web user's experience interacting with
a Web site may affect decisions about sharing personal information, but
those feelings can lead to inaccurate decisions. Feelings of confidence
and security about a site, for instance, may not correlate with the site's
privacy practices.
    Moreover, the more immediate and concrete the feedback about a par-
ticular goal, the more emphasis one is likely to give it in making choices.
This is important in the market for privacy protection because the accuracy
of any decision about revealing personal information usually will not be
apparent until long after the transaction has ended (if ever). Only rarely
will a consumer be able to trace the spam, identity theft, consumer profiling,
annoying advertising campaign, or junk mail to a particular Web site's
weak privacy practices. In contrast, feedback on cognitive effort and emo-
tional conflict is experienced at the same time as the Web user is making
a decision about sharing information or choosing which site to use. As
a result, the latter two goals tend to weigh more heavily in the decision
strategy, and the user therefore is less likely to search for and choose
the privacy practices that best align with the user's privacy preferences.
   Framing effects also can contribute to the decreased saliency of a Web
site's privacy practices. Because consumers tend to process information in
the form in which it is displayed without transforming it, a Web site may
give the impression that it has a strong privacy policy when in fact it does
not, knowing that consumers will take them at their word without discov-
ering the details. Unless consumers believe that the costs of accepting the
given format are high (i.e., they have suspicions about a privacy claim and
fear that they will pay a high cost if they do not verify the claim), they will
not be motivated to obtain additional information. Web seals and trust
marks, in particular, take advantage of framing effects because they signal
a genuine concern about privacy when the site could, in fact, be compar-
atively lax in its privacy practices (Miyazaki and Krishnamurthy 2002;
Rifon, LaRose, and Choi 2005).
   Inferences also can lead to erroneous assumptions about a site's privacy
practices. Consumers may assume erroneously that the privacy policies of

similar retailers are roughly alike or that brand name retailers must have
strong privacy policies because they are generally reliable and credible
in other aspects of their business. Competing Web sites have to work hard
to overcome such inferences if they want to distinguish themselves as
strong privacy providers. Yet, if competitors do make efforts to draw atten-
tion to their privacy practices, they risk increasing the cognitive effort of
users and forcing emotion-laden comparisons, both of which can make the
site less appealing.
    The availability heuristic also may direct consumers away from shop-
ping for privacy online. People may underestimate the effects of informa-
tion disclosure, and its potential costs, if the adverse consequences of weak
privacy practices come to their attention infrequently. While there is
increasing publicity about security leaks and unauthorized access
to consumer databases, such as the highly publicized security breach at
ChoicePoint (CNN Money 2005), consumers seldom hear about the actual
harms resulting from weak privacy practices. Hearing about security
leaks in the news raises a societal concern about privacy, but because con-
sumers seldom know what information about them is collected and sold
by and to whom, connecting a risk to particular data brokers is extremely
difficult.
    Even in the ChoicePoint incident, where thousands of consumers were
notified that their files were compromised, people likely will not know if the
security breach resulted in any harm to them. Even if a consumer suffers
from identity theft at some future date, the source of the problem likely will
never be known. Moreover, data brokers such as ChoicePoint do not deal
with consumers directly, and few consumers know how brokers build their
databases and what sources they use. Without knowing the sources, con-
sumers cannot avoid sharing information with them to protect against future
similar problems. Thus, while publicity can increase societal concern about
information privacy in general, it does little to raise the saliency of privacy
in any particular decision-making process.
    Concerns about data collection and sharing frequently are met with
assurances that the concern is temporary, and evolving behavioral patterns
or emerging technologies will address the problem in time (Swindle 2000).
New generations of consumers may be better equipped to protect their pri-
vacy interests in an increasingly digital world, and only time will tell. Tech-
nological advances promise increased privacy protection, but if technology
is to become an effective control against data collection practices on the
Internet, one of two things must happen. First, many more Internet users
must become capable of selecting and working with the required technol-
ogy (Wall Street Journal 2001). While this transformation may occur, it

seems unlikely in the near term. The data collection industry is continually
developing more sophisticated methods of data mining, and the technology
required to defend against it will have to keep pace with equal or greater
sophistication.
    Second, a universal and mandatory privacy software standard could be
developed. To succeed, it would have to be compatible with most Internet
sites, personal computers, servers, and interfacing hardware and software;
be relatively easy for ordinary consumers to use; and be readily updated so
that data seekers would find it difficult to evade. Creating such a universal
standard seems both politically and technologically infeasible at present. If
mandated by government, such a measure might face constitutional chal-
lenges as well (Volokh 2000). For the time being, economic incentives on
the Internet produce technologies that foster data collection and sharing
more than they restrict it (Reidenberg 2000).
    More importantly, during any period of evolving consumer behavior and
emerging technologies, the privacy interests of many citizens will be com-
promised in ways that could be prevented by stronger laws and more rig-
orous enforcement of existing laws. For those who are harmed when their
personal information falls into the wrong hands, there is little comfort in
knowing that they are participants in a larger evolutionary process that will
result in a policy initiative to benefit future generations. Remaining passive
has other costs as well. As businesses continue to collect, manipulate, and
share personal data in increasingly sophisticated ways, practices and atti-
tudes about privacy will crystallize, thus making it more difficult to change
the status quo and initiate reforms at a later time. Politically, arguments
against policy change become stronger as vested interests become more
entrenched. In short, unless change comes quickly, our self-policing pri-
vacy regime may be with us for a very long time.

                                     REFERENCES

Adler, Matthew. 1998. Law and Incommensurability: Introduction. University of Pennsylvania Law Review, 146 (5): 1169-1184.
Apter, Michael J. 1992. The Dangerous Edge: The Psychology of Excitement. New York: Free Press.
Baron, Jonathan. 1997. Confusion of Relative and Absolute Risk in Valuation. Journal of Risk and Uncertainty, 14 (3): 301-309.
Baron, Jonathan and Mark D. Spranca. 1997. Protected Values. Organizational Behavior and Human Decision Processes, 70 (1): 1-16.
Bellotti, Victoria. 1997. Design for Privacy in Multimedia Computer and Communications Environments. In Technology and Privacy: The New Landscape, edited by Philip E. Agre and Marc Rotenberg (63-98). Cambridge: MIT Press.
Bettman, James R., Mary Frances Luce, and John W. Payne. 1998. Constructive Consumer Choice Processes. Journal of Consumer Research, 25 (3): 187-217.
Bettman, James R. and Mita Sujan. 1987. Effects of Framing on Evaluation of Comparable and Noncomparable Alternatives by Expert and Novice Consumers. Journal of Consumer Research, 14 (2): 141-154.
Bettman, James R. and Michel A. Zins. 1979. Information Format and Choice Task Effects in Decision Making. Journal of Consumer Research, 6 (2): 141-153.
Caudill, Eve M. and Patrick E. Murphy. 2000. Consumer Online Privacy: Legal and Ethical Issues. Journal of Public Policy and Marketing, 19 (1): 7-19.
Chang, Ruth. 1997. Introduction. In Incommensurability, Incomparability, and Practical Reason. Cambridge: Harvard University Press.
CNN Money. 2005. ChoicePoint: More ID Theft Warnings. February 17. http://money.cnn.com/2005/02/17/technology/personaltech/choicepoint/.
Dommeyer, Curt J. and Barbara J. Gross. 2003. What Consumers Know and What They Do: An Investigation of Consumer Knowledge, Awareness, and Use of Privacy Protection Strategies. Journal of Interactive Marketing, 17 (2): 34-51.
Einhorn, Hillel J. 1980. Learning from Experience and Suboptimal Rules in Decision Making. In Cognitive Processes in Choice and Decision Behavior, edited by Thomas S. Wallsten (1-20). Hillsdale, NJ: Lawrence Erlbaum.
Equifax-Harris. 1995. Equifax-Harris Mid-Decade Consumer Privacy Survey. New York: Louis Harris.
Fogg, B.J. 2003. Prominence-Interpretation Theory: Explaining How People Assess Credibility Online. Proceedings of ACM CHI 2003 Conference on Human Factors in Computing Systems. http://credibility.stanford.edu/pit.html.
Fogg, B.J., Cathy Soohoo, David Danielson, Leslie Marable, Julianna Stanford, and Ellen R. Tauber. 2002. How Do People Evaluate a Web Site's Credibility? Results from a Large Study. Consumer Reports WebWatch and Stanford University. October 29. http://consumerwebwatch.org/dynamic/web-credibility-reports-evaluate-abstract.cfm.
Franz, Raphael. 2001. Privacy Standards for Web Sites: Web Seals. Internet Law Journal. February 5. http://www.tilj.com/content/ecomarticle02050103.htm.
Garbarino, Ellen C. and Julie A. Edell. 1997. Cognitive Effort, Affect, and Choice. Journal of Consumer Research, 24 (2): 147-158.
Hann, Il-Horn, Kai-Lung Hui, Sang-Yong Tom Lee, and Ivan P.L. Png. 2003. The Value of Online Information Privacy: An Empirical Investigation. SSRN Working Paper (March). http://ssrn.com/abstract=391993.
Hoar, Sean B. 2001. Identity Theft: The Crime of the New Millennium. Oregon Law Review, 80 (4): 1423-1448.
Hoyer, Wayne D. and Deborah J. MacInnis. 1997. Consumer Behavior. Boston: Houghton Mifflin.
Jacoby, Jacob. 2000. Is It Rational to Assume Consumer Rationality? Some Consumer Psychological Perspectives on Rational Choice Theory. Roger Williams University Law Review, 6 (1): 81-162.
Jenni, Karen and George Loewenstein. 1997. Explaining the "Identifiable Victim Effect." Journal of Risk and Uncertainty, 14 (3): 235-257.
Johnson, Eric J., John W. Payne, and James R. Bettman. 1988. Information Displays and Preference Reversals. Organizational Behavior and Human Decision Processes, 42 (1): 1-21.
Jolls, Christine, Cass R. Sunstein, and Richard Thaler. 1998. A Behavioral Approach to Law and Economics. Stanford Law Review, 50 (5): 1471-1550.
Kuran, Timur and Cass Sunstein. 1999. Availability Cascades and Risk Regulation. Stanford Law Review, 51 (4): 683-768.
LaRose, Robert and Nora J. Rifon. 2007. Promoting i-Safety: Effects of Privacy Warnings and Privacy Seals on Risk Assessment and Online Privacy Behavior. Journal of Consumer Affairs, 41 (1): 127-149.
Lazarus, Richard S. 1991. Progress on a Cognitive-Motivational-Relational Theory of Emotion. American Psychologist, 46 (8): 819-834.
Luce, Mary Frances. 1998. Choosing to Avoid: Coping with Negatively Emotion-Laden Consumer Decisions. Journal of Consumer Research, 24 (4): 409-433.
Lussier, Denis A. and Richard W. Olshavsky. 1979. Task Complexity and Contingent Processing in Brand Choice. Journal of Consumer Research, 6 (2): 154-165.
Magat, Wesley, W. Kip Viscusi, and Joel Huber. 1987. An Investigation of the Rationality of Consumer Valuations of Multiple Health Risks. The RAND Journal of Economics, 18 (4): 465-479.
Milne, George R. 2000. Privacy and Ethical Issues in Database/Interactive Marketing and Public Policy: A Research Framework and Overview of the Special Issue. Journal of Public Policy and Marketing, 19 (1): 1-6.
Milne, George R., Mary J. Culnan, and Henry Greene. 2006. A Longitudinal Assessment of Online Privacy Notice Readability. Journal of Public Policy and Marketing, 25 (2): 238-249.
Milne, George R. and Andrew J. Rohm. 2000. Consumer Privacy and Name Removal across Direct Marketing Channels: Exploring Opt-in and Opt-out Alternatives. Journal of Public Policy and Marketing, 19 (2): 238-249.
Miyazaki, Anthony D. and Sandeep Krishnamurthy. 2002. Internet Seals of Approval: Effects on Online Privacy Policies and Consumer Perceptions. Journal of Consumer Affairs, 36 (1): 28-49.
Nehf, James P. 2003. Recognizing the Societal Value in Information Privacy. Washington Law Review, 78 (1): 1-92.
———. 2005a. Incomparability and the Passive Virtues of Ad-Hoc Privacy Policy. University of Colorado Law Review, 76 (1): 1-56.
———. 2005b. Shopping for Privacy Online: Consumer Decision Making Strategies and the Emerging Market for Information Privacy. Journal of Law, Technology & Policy (1): 1-57.
Norberg, Patricia A., Daniel R. Horne, and David A. Horne. 2007. The Privacy Paradox: Personal Information Disclosure Intentions Versus Behaviors. Journal of Consumer Affairs, 41 (1): 100-127.
Olivero, Nadia and Peter Lunt. 2004. Privacy Versus Willingness to Disclose in E-Commerce Exchanges: The Effect of Risk Awareness on the Relative Role of Trust and Control. Journal of Economic Psychology, 25 (2): 243-262.
Olshavsky, Richard W. 1979. Task Complexity and Contingent Processing in Decision Making: A Replication and Extension. Organizational Behavior and Human Decision Processes, 24 (August): 300-316.
Payne, John W., James R. Bettman, and Mary Frances Luce. 1996. When Time is Money: Decision Behavior Under Opportunity-Cost Time Pressure. Organizational Behavior and Human Decision Processes, 66 (2): 131-152.
Pham, Michael Tuan. 1998. Representativeness, Relevance, and the Use of Feelings in Decision Making. Journal of Consumer Research, 25 (2): 144-159.
Phelps, Joseph, Glenn Nowak, and Elizabeth Ferrell. 2000. Privacy Concerns and Consumer Willingness to Provide Personal Information. Journal of Public Policy and Marketing, 19 (1): 27-41.
Pieters, Rik and Luk Warlop. 1997. The Effect of Time Pressure and Task Motivation on Visual Attention to Brands. In Advances in Consumer Research, edited by Merrie Brucks and Deborah J. MacInnis, 24 (1): 281-287.
Pippin, R. Ken. 1999. Consumer Privacy on the Internet: It's Surfer Beware. Air Force Law Review, 47 (1): 125.
Reidenberg, Joel R. 2000. Statement before Oversight Hearing on Privacy and Electronic Commerce before the House Subcommittee on Courts and Intellectual Property, House Committee on the Judiciary. May 18.
Rifon, Nora J., Robert LaRose, and Sejung Marina Choi. 2005. Your Privacy Is Sealed: Effects of Web Privacy Seals on Trust and Personal Disclosure. Journal of Consumer Affairs, 39 (2): 337-360.
Saunders, Kurt M. and Bruce Zucker. 1999. Counteracting Identity Fraud in the Information Age: The Identity Theft and Assumption Deterrence Act. Cornell Journal of Law and Public Policy, 8 (August): 661-667.
Sheehan, Kim Bartel and Mariea Grubbs Hoy. 1999. Flaming, Complaining, Abstaining: How Online Users Respond to Privacy Concerns. Journal of Advertising, 28 (3): 37-52.
———. 2000. Dimensions of Privacy Concern among Online Consumers. Journal of Public Policy and Marketing, 19 (1): 62-73.
Simon, Herbert A. 1955. A Behavioral Model of Rational Choice. Quarterly Journal of Economics, 69 (1): 99-118.
Slovic, Paul. 1972. From Shakespeare to Simon: Speculations and Some Evidence About Man's Ability to Process Information. Oregon Research Institute Bulletin, 12 (2): 1-19.
Smith, Michael D. and Erik Brynjolfsson. 2001. Consumer Decision-Making at an Internet Shopbot. Journal of Industrial Economics, 49 (4): 541-558.
Solove, Daniel J. 2001. Privacy and Power: Computer Databases and Metaphors for Information Privacy. Stanford Law Review, 53 (6): 1393-1462.
Swindle, Orson. 2000. Dissenting Statement, Federal Trade Commission. Privacy Online: Fair Information Practices in the Online Environment. http://www.ftc.gov/reports/privacy2000/swindledissent.pdf.
Tetlock, Philip. 1992. The Impact of Accountability on Judgment and Choice: Toward a Social Contingency Model. In Advances in Experimental Social Psychology, edited by Mark P. Zanna (331-376). New York: Academic Press.
Tetlock, Philip E., Randall S. Peterson, and Jennifer S. Lerner. 1996. Revising the Value Pluralism Model: Incorporating Social Content and Context Postulates. In The Psychology of Values: The Ontario Symposium on Personality and Social Psychology, edited by Clive Seligman, James M. Olson, and Mark P. Zanna (25-51). Hillsdale, NJ: Lawrence Erlbaum.
Tversky, Amos and Eldar Shafir. 1992. Choice under Conflict: The Dynamics of Deferred Decisions. Psychological Science, 3 (6): 358-361.
Varian, Hal R. 1992. Microeconomic Analysis, 3rd edition. New York: W.W. Norton.
Viscusi, W. Kip. 1996. Individual Rationality, Hazard Warnings, and the Foundations of Tort Law. Rutgers Law Review, 48 (3): 625-672.
Volokh, Eugene. 2000. Freedom of Speech, Information Privacy: The Troubling Implications of a Right to Stop People from Speaking about You. Stanford Law Review, 52 (5): 1049-1124.
Wall Street Journal. 2001. Exposure in Cyberspace. Wall Street Journal. March 21.
Wernerfelt, Birger. 1988. Umbrella Branding as a Signal of New Product Quality: An Example of Signaling by Posting a Bond. The RAND Journal of Economics, 19 (3): 458-466.
Winkielman, P., R.B. Zajonc, and N. Schwarz. 1997. Subliminal Affective Priming Resists Attributional Interventions. Cognition and Emotion, 11 (4): 433-465.
Wright, Peter. 1974. The Harassed Decision Maker: Time Pressures, Distractions, and the Use of Evidence. Journal of Applied Psychology, 59 (5): 555-561.

             A Reply: An Ecosystem Perspective on Privacy by
                  Leyland F. Pitt and Richard T. Watson

    Privacy is the right to be alone-the most comprehensive of rights, and the right most
    valued by civilized man. (Louis D. Brandeis)

   Professor Nehf provides an excellent perspective on Internet privacy.
He explores a diversity of disciplines, including behavioral economics,
cognitive and consumer psychology, and public policy to construct an
argument that the market for online consumer privacy is inefficient. Like
many important issues, privacy is embedded in an ever-changing ecosystem,

   Leyland F. Pitt is a professor of marketing at the Faculty of Business Administration, Simon Fraser
University, Vancouver, Canada. Richard T. Watson is the J. Rex Fuqua Distinguished Chair of Internet
Strategy at the Terry College of Business, University of Georgia, Athens, GA.
and we believe our broad examination in this article complements Nehf's
deep analysis of a portion of the ecosystem.
   The paradox of technologies is that for humans they are never universally good or entirely bad (Mick and Fournier 1998). Automobiles are
good-they transport us speedily, in comfort, and give us both pleasure
and status. Simultaneously, many hours are lost in traffic congestion, peo-
ple die in crashes, and cars are a major cause of environmental degradation.
Computers are good-they make us more productive and leverage our cre-
ativity. Yet, most humans wonder what they did with their time before com-
puters kept them busy all day, and then there are the ever-present error
messages, malfunctions, and crashes. The Internet is no exception-it
presents the greatest opportunity in history to find and use information,
to interact with others everywhere, to serve oneself, and control one's
own destiny. Simultaneously, it is the single biggest threat to individual
privacy and a malicious means of laying one's life bare to the world. Tech-
nology is not universally good or entirely bad-it is merely indifferent to
the human condition. Internet privacy is not a new privacy problem; it is
merely a privacy issue accelerated into overdrive by technology.

                             PRIVACY DEFINED

   You have zero privacy anyway ... Get over it. (Scott McNealy)

   Privacy, according to the Oxford English Dictionary, is "The state or
condition of being withdrawn from the society of others, or from public
interest; seclusion" (Simpson and Weiner 1989). It is an old English word,
with the first noted occurrence around 1450. The modern notion of privacy
connotes being "free from public attention, as a matter of choice or right;
freedom from interference or intrusion" (Simpson and Weiner 1989)
and began to evolve through the courts in the 19th century. Warren and
Brandeis (1890) observed, "The question whether our law will recognize
and protect the right to privacy ... must soon come before our courts for
consideration." More recently, privacy has been defined as "the ability of
the individual to control the terms under which personal information is
acquired and used" (Westin 1967, p. 7), while information privacy has
been used to refer to "the ability of the individual to personally control
information about one's self" (Stone et al. 1983, p. 461).
   The preceding definitions create the illusion that privacy is entirely
attainable-it never has been, is not, and likely never will be (Dinev and Hart
2006). The minute one interacts, one surrenders privacy; as soon as an indi-
vidual trades, that person sacrifices privacy. This is as true for all prior eras as
it is for this electronic age. As soon as an individual visited a trader, even for
the simplest household requirements, and that trader remembered facets of the
individual's personal details, that individual forfeited some privacy. Even if
this had merely meant the revelation of tastes and preferences, some privacy
was foregone. Only those prepared to exist in complete isolation, never to
interact with others, to live the life of a hermit, can aspire to perfect privacy.
Few are willing or even desire to achieve this state of isolation. Absolute pri-
vacy exacts high social, functional, and emotional costs.
    In pre-industrial days, customers' details were stored in the minds and
memories of traders, who undoubtedly used this knowledge to exchange
advantageously. As writing tools and techniques advanced, customer
 information was committed to hard copy. Much later, these data were
stored electronically, and a range of devices was used to record all manner
of customer information, preferences, behaviors, and transaction histories.
While the Internet automates much of the capturing of extensive customer
data, it is not the only technology with vast potential to invade customer
 privacy. Security cameras observe citizens in a myriad of places in many
 large cities, devices to overhear conversations can be purchased over the
 counter in many stores, and telephone tapping is within the reach of many
 outside of law enforcement agencies. Radio-frequency identification (RFID)
 technology has the potential to speed up supermarket checkout; yet, it also
 has the potential to create privacy nightmares for innocent individuals
 (Ohkubo, Suzuki, and Kinoshita 2005). Picture a consumer who purchased
 a can of RFID-tagged soda, consumed it, and then disposed of the can at
 a spot where a crime later was committed. Law enforcement authorities might
 pick up the can as evidence, and then use the RFID tag not only to identify the
 store at which it was purchased but also to track the purchaser's loyalty and
 credit cards. Technology with the potential to invade the individual's privacy
 constantly evolves because humans are driven by a desire for access to all
 pertinent information at all times in all places (Junglas and Watson 2006;
 Watson et al. 2002).
     The concept of privacy, as captured in the earlier definitions, does not
 confine the domain of consideration to one particular group or entity.
 Privacy is not only an issue for citizens but also a concern for all
 stakeholders in modern society. Privacy exists within an interacting, ever-changing ecosystem of three major players: consumers,¹ governments,
  and corporations. We concentrate on privacy from the perspective of these
  three major societal agents (Figure 1).

    1. Depending on their role, they are also investors, citizens, and so forth, but we stick with consumer
 in this commentary.
FIGURE 1
Privacy Ecosystem

[Diagram: three interconnected nodes labeled Citizen/Consumer/Investor, Corporation, and Government]

    It is obvious that in their various internal and external interactions, each
 entity produces and consumes information, and it is the dissemination of
 these data that is the central issue of privacy (and its twin, transparency).
 Our perspective is that privacy is just one way of viewing the general issue
 of the availability of information because there are some issues where privacy is not in the best societal interests (e.g., secret courts and closed legislative sessions). It also should be apparent from Figure 1 that the three key
 stakeholders do not only interact with each other but also interact with
themselves, as will be discussed.
    The ecosystem is continually changing because of technology, the
actions of the major players, social change, and various threats (e.g., ter-
rorism). Just as technology with the potential to attack the individual's
confidentiality progresses relentlessly, so too does technology that enables
those individuals to protect their privacy. And then technology to overcome
that technology evolves in turn! Caller ID services made it possible for
telephone subscribers to protect themselves from crank callers and un-
wanted telemarketing. Corporations realized the marketing potential of this
development by using Caller ID to identify the numbers of callers, whose
data subsequently could be sold to others. Telephone companies exploited
this reversal of privacy protection by selling services that allow subscribers
to block their identity. Consumers also pushed governments to get involved
by creating "do-not-call lists" to reduce telemarketing. A technological change
reverberates throughout the ecosystem as each of the key actors attempts
to deploy the technology advantageously and then the other actors
react to new intrusions upon themselves or other constituents of the
ecosystem.
   We now turn our attention to analyzing privacy from the perspective of
each of the interactions depicted in Figure 1.

                     CONSUMER AND CONSUMERS

  On the Internet, nobody knows you're a dog. (Peter Steiner, in The New Yorker)

   While the old joke might be that no one knows you are a dog on the Internet, the reality in many cases is that other consumers can not only learn whether you are a dog but also discover your bark, pedigree, and much more.
Search engines such as Google permit consumers to find out significant
information about others. While much of this might be harmless-
a great way of tracking old school friends and their new addresses and
positions-there is the possibility that this type of search can lead to
cyber-stalking and the uncovering of information that others might not
want known. "Job hunters or co-op applicants wishing to expunge
one-time indiscretions or criminal pasts may be at the mercy of a googler"
(Vise and Malseed 2005). It is not only Google that represents a tool
for interpersonal privacy invasion. AOL offers users of its AIM instant
messaging service the capability to see where people on their buddy
lists are physically located (see www.aim.com). While the intent of
the service is obviously benevolent, it is easy to imagine the potential
for its abuse.
    Consumers are using the Internet to invade each other's privacy. They
can learn the value of a neighbor's house with a few clicks (e.g., zillow.
com). They also sometimes blatantly give up their privacy (e.g., Facebook
 and prosper.com). We need protection from each other and ourselves to
 preserve our privacy.

                    CONSUMER AND CORPORATION

   Some sites bury your rights in a long page of legal jargon so it's hard to find them
   and hard to understand them once you find them. (Former FTC Chairman Robert
   Pitofsky)

   In their dealings with corporations, consumers generally trade money for
goods and services. The quality of the goods and services they receive is not
only in direct proportion to the money they pay, but it also is usually in
proportion to the information they are prepared to divulge. Two simple ex-
amples suffice: a pair of slacks purchased by a consumer "off the peg" from
an online clothing retailer will not fit as well as those purchased from a Web
 site requiring the consumer to enter waist, hip, inner leg, and leg measure-
 ments. A consumer purchasing travel insurance from an online insurance
 provider might get a "one-size fits all" package for a trip's duration by
 entering general details; one who is prepared to divulge personal
 details such as age, state of health, complete details of the trip, value of
 goods, and so forth will be able to obtain a tailored package fitting unique
 needs.
     By the very nature of their business, many firms are able to gather massive amounts of information not only about consumers but also concerning
 what they are doing and saying. Information-intensive service firms, such
 as financial institutions, have long been able to extract information from
 consumers and also to observe the transactions that consumers make. How-
ever, the Internet has ramped up this activity to hitherto unprecedented
 levels-firms are now able to track the movement of consumers on
Web sites and observe their browsing behavior. This information ostensibly
can be used to target suitable offerings to consumers but the potential for
abuse is obvious.
    A contemporary classification of Internet privacy concerns (Table 1) is
surprisingly robust, but it obviously cannot anticipate the further advancement and refinement of techniques as technology progresses. Many of
Google's products, while obviously very useful to consumers, also have
been severely criticized for their potential to invade consumer privacy.
For example, Google's Desktop Search feature provides a very useful
tool to search for files on a personal computer. On the other hand, it also
potentially gives Google access to the contents of an individual's hard
drive. Likewise, Google's free e-mail package, Gmail, gives users virtually
unlimited storage, so that old emails never have to be deleted. Then again,
the Gmail software "reads" the content of emails in order to target

TABLE 1
A Taxonomy of Consumer Internet Privacy Concerns (Wang, Lee, and Wang 1998)

Action                    Description
Improper access           Infiltration of an Internet consumer's private computer without notice or acknowledgment
Improper collection       Collection of a consumer's private information from the Internet without notice or acknowledgment
Improper monitoring       Conducting surveillance on a consumer's Internet activities without notice or acknowledgment
Improper analysis         Analyzing a consumer's private information without proper notice, and deriving conclusions from such an analysis
Improper transfer         Transferring a consumer's private information to other businesses without notice to or acknowledgment from the consumer
advertising to the reader of the message. This feature was heavily criticized
at the launch (Vise and Malseed 2005) and leads some to avoid the free
service.
   There is a nexus between information and service; typically, the more
information the consumer supplies, the more easily the corporation can
determine and fulfill needs promptly and accurately. The problem is that
service is information driven, and this information needs to be maintained
to support future service activities. Consumers expect firms to know about
their past transactions. Corporations that fail to secure consumer data or
supply it without permission to other parties threaten privacy. The only
way to break this nexus between service and information is for consumers
to manage their personal data and anonymously supply it electronically and
selectively as needed to corporations (Watson et al. 2004).

                     CONSUMER AND GOVERNMENT

   Relying on the government to protect your privacy is like asking a peeping Tom to install
   your window blinds. (John Perry Barlow)

    Consumers trade money (in the form of taxes) and specific information
with government in return for certain services, protection, and general
information. The consumer has no choice in this regard (unlike in the case
of dealings with firms in most markets) and can be required to give very
detailed personal information. In return, consumers, particularly in
democracies, expect governments to secure their data and not share it with
other consumers, corporations, or government agencies. Governments,
however, have varied in their willingness to pass privacy protection
legislation, particularly with regard to the Internet. Some have enacted
strict legislation, others have relied on corporate codes of practice, and
still others have relied on markets and consumers themselves. Govern-
ments, unfortunately, often are negligent with data and consumers
can be unintentionally exposed. Even when there are laws protecting
particularly vulnerable individuals (e.g., CIA agents), governments are
tempted to dip into the information honey pot to deliberately further their
interests.
    While consumers rely on governments for privacy legislation to
protect them from misuse of personal data by all ecosystem members,
governments often rationalize away the privacy rights they have enacted.
Furthermore, electronic networks and supercomputers make massive
government surveillance feasible and less costly. In the information
age, the ability to capture all traffic in a network and analyze it tempts