There is another important reason why it might be in the industry's interest to regulate itself: to avoid mandatory government regulation. Consider how ICA members might rationally prefer an unregulated market to a self-regulated one. As mentioned earlier, businesses can benefit from using and selling personal data in an unregulated market, largely because customers struggle to monitor which companies have poor information practices. ICA members would therefore not prefer self-regulation to an unregulated market, at least until a credible threat of government regulation arose. At that point, the industry's calculus changes. The introduction of self-regulation tends to reduce the likelihood of government regulation, so the expected cost of self-regulation to industry may be lower than the expected cost of complying with government regulations. 20 A clear example is the Consumer Bankers Association's recent announcement of its new privacy guidelines. The trade press report on the guidelines states: “Consumer bankers have published the privacy guidelines to show the federal government that the banking industry is monitoring itself and that no new regulation is needed.” Barbara A.
Rehm, “Bank Group Issues Guidelines for Protecting Consumer Privacy,” Am. Banker, 22 November 1996. 25 Self-regulation may protect privacy better than government regulation, for example, when industry expertise is better applied through industry rules, or when ethical beliefs and community standards operate more effectively within an industry than within government. I tend to be cautious about such an optimistic assessment of the effectiveness of self-regulation. For a highly critical assessment of self-regulation by the Direct Marketing Association, see Paul M. Schwartz & Joel R. Reidenberg, Data Privacy Law: A Study of United States Data Protection 307-48 (1996). As discussed in the text, the more likely outcome is that government regulation leads to stricter privacy protections, but also to higher administrative and compliance costs.
It seems to me that this is a reasonably simple solution. The information could be provided in digital form, but at a price that reflects the transaction costs of acquiring the information using the old technology. The price paid for the information could then be used to cover the costs of making it available to the public. CME/CFA believes that children's privacy can be effectively protected only through a combination of full and understandable disclosure and verifiable parental consent. Currently, many websites provide no disclosure about the collection of information, whether aggregated and anonymous or personally identifiable.28 When websites do offer some form of disclosure, it is generally not enough to allow meaningful consent. When websites disclose that information is being collected, they almost never ask children to obtain parental consent before providing personal information. And on the handful of websites that do ask for parental consent, there is no way to verify that consent has actually been given. To say that privacy is a fundamental human right is a noble sentiment with which I agree, but it does not follow that privacy is therefore outside the mechanism of transactions. As mentioned earlier, a right is only an initial allocation. It can be granted freely and distributed universally regardless of wealth, but it is in human nature to have different preferences and needs, and to exchange what one has for what one wants. Whether we like it or not, people trade in rights all the time.
In doing so, they exercise a fundamental right: the right to free choice. The main task for the industry-reputation argument is to specify the conditions under which industry would actually provide the collective good. A first consideration is the extent to which maintaining an industry's reputation for privacy resembles maintaining a company's reputation for privacy. One problem in both cases is that the market does an imperfect job of monitoring reputation: it is hard for consumers to tell when a company or industry has misused personal information, and businesses and industry therefore have an incentive to abuse that information. Armed with sophisticated new research, advertisers and marketers have begun to target the growing number of children online. Websites and other interactive online services are being developed to capture the loyalty and purchasing power of the “lucrative cybertots category.” A variety of interactive advertising and marketing techniques have been developed specifically for this new medium. Many of them threaten to manipulate children and deprive them of their privacy. If allowed to develop without intervention, these practices will become widespread and even more blatant. Finally, does the public think that some sort of Federal Data Protection Commission or Agency, either with regulatory powers or as an advisory body, should be added to existing industry-based federal agencies such as the Federal Communications Commission and the Federal Trade Commission? We did not have a clear answer to this question before 1996. When such a question was asked in the 1990 Equifax survey, it offered three options — a federal regulator, a federal advisory body, and improving existing policies — and the public in 1990 was roughly evenly divided among the three.
Dual-key encryption software has emerged with the spread of the Internet: Pretty Good Privacy (PGP) uses dual-key cryptography and is distributed free for private use; business users pay.
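The dual-key idea behind PGP — a public key for encrypting, a matching private key for decrypting — can be sketched with a minimal textbook RSA example. The tiny primes and absence of padding below are illustrative assumptions only; real systems such as PGP use keys hundreds of digits long, padding, and hybrid schemes.

```python
# Minimal textbook RSA sketch of dual-key encryption.
# NOT how PGP is implemented; for illustration only.
p, q = 61, 53            # small primes (a real key uses huge ones)
n = p * q                # public modulus
phi = (p - 1) * (q - 1)  # Euler's totient of n
e = 17                   # public exponent, coprime with phi
d = pow(e, -1, phi)      # private exponent: modular inverse of e

def encrypt(m: int, e: int, n: int) -> int:
    """Anyone holding the public key (e, n) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int, d: int, n: int) -> int:
    """Only the holder of the private key (d, n) can decrypt."""
    return pow(c, d, n)

message = 42
cipher = encrypt(message, e, n)
assert decrypt(cipher, d, n) == message  # round trip succeeds
```

The asymmetry is the point: publishing `(e, n)` lets strangers send you confidential mail, while `d` never leaves your machine. PEM's design, mentioned next, pairs such a dual-key algorithm with a fast symmetric cipher (DES) for the message body.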
Privacy Enhanced Mail (PEM) uses DES encryption together with a dual-key algorithm to secure the transmission of email. 21 In addition, Commissioner Christine Varney confirmed the FTC's interest in this approach and stressed the need for swift action to resolve the issue. Commissioner Varney followed her workshop statement with a letter reaffirming the FTC's intention to re-examine the issue of data protection in six months and requesting a report at that time on the progress and feasibility of individual empowerment technologies. The Internet is alive with people engaged in a variety of activities that many consider sensitive.19 Both those involved in these activities and those who set up the areas where they take place have a strong interest in developing an environment that creates trust among their users. This may bode well for privacy. The combination of a medium that has been responsive to its users, early adopters who are well-known privacy fundamentalists, and a tradition of people engaging in activities they want to keep private can prove a powerful tonic for individual privacy. It should be noted that the 1996 question did not ask respondents whether a temporary federal commission for privacy study (as we had in 1975-77), or a federal data protection agency limited to research and advisory missions, would be considered a desirable or necessary step in the late 1990s. These are potential measures that would be useful for future privacy surveys to explore. Privacy is an interaction in which the rights of different parties collide. A has a certain preference about the information he takes in and gives out. B, on the other hand, may want to know more about A, perhaps to protect himself.
Controversies over caller identification, or disclosure of AIDS status by medical personnel, illustrate that privacy is a matter of controlling the flow of information, with an inherent complexity far greater than a conventional analysis of “consumers versus business” or “citizens versus the state” suggests.
In this case, different parties have different preferences in terms of “information permeability” and need a way to synchronize those preferences or remain in tension with each other. This suggests that interactive privacy negotiations would have a place in the creation and protection of privacy. Certain types of information may be collected and disseminated without revealing the identity of individuals. [Froomkin, 1996] explores some of the legal issues related to anonymity and pseudonymity; see [Camp, Harkavey, Yee, & Tygar, 1996] for a computer science perspective. [Karnow, 1994] proposes the interesting idea of “electronic persons” or “epers” that serve to protect privacy while conveying a relevant description of the person. While the blocking approach has its merits when applied to potentially offensive material, the same approach is fundamentally flawed when applied to children's personal information.37 Offensive content is discrete and can be identified by each individual based on his or her values. A person can point to specific content areas in the GII and explain, “The images on this website and the theme of this chat room are something I don't want my daughter to see.” The same girl's personal information, on the other hand, is much more malleable and changes from one context to another. For example, a father may want to prevent his son from giving his name and address to an online stranger or online seller, but may want his son to give the same information to his baseball coach and this week's carpool driver.
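One common way to collect and disseminate information without revealing who it describes, in the spirit of the pseudonymous “epers” idea above, is to replace identifiers with keyed-hash pseudonyms before records leave the collector. The sketch below is a hedged illustration using Python's standard library; the key and the record fields are invented for the example, and real deployments would add key management and rotation.

```python
import hmac
import hashlib

# Secret key held only by the data collector; without it, pseudonyms
# cannot be linked back to real identities. (Hypothetical key.)
SECRET_KEY = b"example-key-kept-private"

def pseudonym(identity: str) -> str:
    """Derive a stable pseudonym from an identity via HMAC-SHA256."""
    digest = hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability

# The same person always maps to the same pseudonym, so activity can
# be aggregated across records, yet the identity itself never appears.
record = {"user": pseudonym("alice@example.com"), "pages_visited": 12}
assert pseudonym("alice@example.com") == record["user"]
```

Because the mapping is stable, useful analysis (counting, linking a user's sessions) survives, while re-identification requires the secret key — roughly the property the text asks of information “collected and disseminated without revealing the identity of individuals.”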