Outlines by Alexander T. Zachos, 2005
Agre, Philip E., 1997. "Beyond the Mirror World: Privacy and the Representational Practices of Computing". Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press.
Chapter 1 "Beyond the Mirror World: Privacy and the Representational Practices of Computing" pp. 29-58
By Philip Agre
-January 1996: the California Air Resources Board (ARB) issued a request to incorporate transponders into vehicular on-board diagnostic systems. Starting in 1996, newer cars and light trucks will be equipped with an on-board diagnostic (OBD) system that lights an indicator on the dashboard when there is a problem.
This system is supposed to be used only to ensure that people are obeying emissions regulations, but a suitably modified form of it could be used to track the location of every car so equipped. The potential for misuse should be a factor in weighing the system's risks and benefits, yet the ARB is not concerned with the privacy questions its plan raises. Sierra Research put out a report on legal issues that addresses the matter. This system constitutes an "emerging infrastructure" (Agre, 1997, 30) that is decentralized but definitely invades the right to privacy.
-Social imagination has always linked computing and surveillance. Technologies have raised concerns about a "surveillance society". Until the Internet, surveillance was an important purpose of computing technology; since then the emphasis has shifted toward communication, and the architecture and system design of computers have changed accordingly. Purpose of this chapter: to explore the evolution of computer system design. Computers raise privacy issues when organizations use them to represent human beings and their lives. In analyzing the history of these representations, Agre hopes to show that this history is not one of linear progress. He argues that different models of data are coherently related to each other and to computing practice. He wants to explore the "practical logic" of computing work, which is influenced by its language, the constraints of its methods, and the stubbornness of a more complicated world beyond them.
Representing Activity- The first methods used to represent human activities on computers were born from the work-rationalization methods that engineers had developed since the 1910s. Information processing was born: the idea that computers run a factory whose raw material is information rather than anything tangible. Flow charts and time-and-motion studies were pioneering tools in this field.
Couger (1973) made a survey of systems-analysis methods; it is this work that draws the bridge between process charts and modern computing methods. The initial concern was an organized flow of information. Organizational use of paperwork long predates the invention of the computer. In the eyes of the systems analyst, the representational nature of the documents was secondary: the concern was with the forms themselves, not with what they represented. There was a transition from documents to Hollerith cards to databases, but this transition was not smooth or quick by any means.
The Mirror World- Original databases were modeled on the tabular information that used to be stored on paper in rows. This view of data was made formal by Chen (1976) in the entity-relationship model of data. There were many databases before Chen's time, but his model brought database design into focus. Its greatest innovation, perhaps, was the distinction between data records and the entities they represent. This point was based not on philosophy but on organization.
An entity is not a thing but a representation of a thing. Entities are collections of attributes that satisfy the set of rules established for relational modeling.
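Chen's record/entity distinction can be illustrated with a minimal sketch (the class and field names here are hypothetical, not from Agre's text): a record is a bundle of attributes governed by the model's rules, not the person it stands for.

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    """A representation of a customer, not the customer herself."""
    customer_id: int  # artificial key that exists only inside the database
    name: str
    city: str

# Two distinct real-world people can yield records that the model can
# only tell apart by the artificial key, illustrating the gap between
# data records and the entities they represent.
a = CustomerRecord(customer_id=1, name="J. Smith", city="Boston")
b = CustomerRecord(customer_id=2, name="J. Smith", city="Boston")
print(a == b)            # False: only the artificial key differs
print(a.name == b.name)  # True: the observable attributes are identical
```

The point of the sketch is Agre's: everything the system "knows" about the person is whatever the modeler chose to encode.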
In database research, more ambitious work on semantic data models and concepts had little influence on mainstream practice. Mainstream database work has leaned toward business priorities: large databases of simple records that can be processed quickly and efficiently.
1980s- Data became extremely important to computing. The design of the database accounts for only a small part of the cost of designing a system, but data structure is significant because it affects the algorithms. Out of these realities grew a shift of emphasis: where the older information-processing model was concerned with the structure of processing, the newer model is concerned with the structure of data.
The thesis of the new data-centered theory of computing was set out in the book Mirror Worlds. The book's metaphor is the computer as a mirror of the world. Gelernter says that the progress of computing will sooner or later produce a single vast computer system that creates a mirror image of reality. He was aware of the privacy issues that computers have raised and was concerned with the idea of public space throughout his book. Answering fears about computers being instruments of social control through surveillance, Gelernter argues that the Mirror World can make it possible for ordinary people to turn that same power back against the government. He recognized that security concerns will arise because many mirror worlds will contain confidential information, and thieves will be attracted. Despite his emphasis on the parts of the world that are open to public view, he also offers a model of selective access to sensitive information.
Despite his focus on the public realm, the ordinary rhetoric of organizational control resurfaces whenever his standpoint shifts from that of a citizen to that of a leader. There is a flaw in his theory: sometimes he talks about the public world, and other times about the entire world.
Representation and Professional Distance- Gelernter provides us with one way of giving a central theme to current stories about computing.
More sophisticated analysis begins with the mirror metaphor. There are many diverse forms of mediation between representation and reality:
1. standards- To mirror life, computers must employ a shared set of ontologies for representing human affairs.
2. instrumentation- A mirror can be installed next to an activity without affecting it, but if an activity is to be reflected in a computer database, its participants must be provided with equipment that can capture data.
3. authentication- A mirror world claims to report the activities of people and things, so it must have some material process to verify their identities.
4.interpretation- Gelernter predicts that Mirror Worlds, will eliminate the need for interest groups by providing everyone with equal access to a view of society.
5. selection- If a mirror world is easy to set up, then why can't I set one up wherever I want?
6. bias- Citizens who use a mirror world to get an overview of the real world will depend on the software that presents it.
7. performance- Once Mirror World surveillance becomes pervasive, people will design their activities with the consequences in mind.
The space between representation and reality does become visible.
Simsion wrote an outstanding text on data modeling. He warns against imagining that the answer can simply be read off an underlying reality. He refers to the entity-relationship approach, which distinguishes between entities and records.
Simsion provides a record of database designers' collective experience. The foreign-key maintenance problem is usually the most effective way of convincing programmers.
New Privacy Strategies- At the end of their 1996 meeting in England, the EU data commissioners released a statement reporting their decision to work over the next year on privacy-enhancing technologies and to promote these techniques.
The data commissioners' analysis centers on privacy-enhancing technologies (PETs). The underlying technology for identity protection was developed in large part by David Chaum. He observes that data records cannot cause privacy problems unless they can be traced back to the individuals whose lives they represent.
Chaum proposed an alternative approach: employ digital pseudonyms and present credentials under them. The purpose of the identity protector is to manage and authenticate these pseudonyms. The identity protector is interposed between a system user and many small "pseudo-domains".
Conveniently, these schemes offer advantages beyond privacy. Because databases indexed by pseudonyms do not contain individually identifiable information, they need not be secured as tightly, and information can be more readily transferred across organizational boundaries for statistical purposes. Authentication is a challenge for these schemes; the most appropriate method will depend on the particulars.
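The pseudonym idea can be sketched as follows. This is a simplified illustration, not Chaum's actual blind-signature protocol: the key, identities, and domain names are all hypothetical. An identity protector holds a secret key and hands each pseudo-domain a different but stable pseudonym, so records indexed by those pseudonyms cannot be linked to the person, or across domains, without the protector's cooperation.

```python
import hashlib
import hmac

# Secret held only by the identity protector (assumption for this sketch).
SECRET_KEY = b"identity-protector-secret"

def pseudonym(identity: str, domain: str) -> str:
    """Derive a stable, domain-specific pseudonym for an identity."""
    message = f"{domain}:{identity}".encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()[:16]

health = pseudonym("alice@example.org", "health-records")
library = pseudonym("alice@example.org", "library")

print(health != library)  # True: the two domains cannot link their records
# Stable within a domain, so records can still be authenticated:
print(health == pseudonym("alice@example.org", "health-records"))  # True
```

The stability-within-a-domain property is what lets authentication still work even though no database holds the real identity.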
Habit among system designers is another significant obstacle to the adoption of PETs. As long as the language of computer system design blurs the difference between representations and reality, treating data records as mirror images of the world, the protection that PETs provide for individual identity will remain incomprehensible.
Conclusion- Agre argues for the utility of a historical investigation of computing as a representational practice.
Privacy problems have arisen through the application of industrial methods to nonindustrial spheres of life, where the normal relations of representation and control are different. Database designers have been forced to clarify their methods on numerous occasions, when existing databases have been used for new purposes. In the 1980s, mathematicians motivated by a desire to protect privacy faced an uphill battle getting their ideas accepted. Information is not an industrial material or a mirror of reality. It is something deeply bound up with the material practices by which people organize their lives. Computer systems design will not escape its association with social control until it cultivates an awareness of these material practices.
Bellotti, Victoria, 1997. "Design for Privacy in Multimedia Computing and Communications Environments". Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. pp. 62-94
Chapter 2 Design for Privacy in Multimedia Computing and Communications Environments.
By Victoria Bellotti
In public and private places there are rules about acceptable behavior and interpersonal access rights. These depend on people's roles and relationships and on the place in which they are located. People learn these rules through normal socialization. Innovations in multimedia communications raise design issues involving the blurring of the line between public and private. The main concern: how we enable people to determine that they are presenting themselves appropriately, and how to control intrusive access, over computer-mediated communications infrastructures and systems.
Communications Growth and Privacy Erosion
There has been steady growth in the use of vast quantities of personal data supported by computers. This is vital for government, public services, business, and livelihood. However, it also gives others the capability to access, manipulate, and present private data about individuals.
2 major classes of privacy concerns: The first class covers technical aspects of systems, some of which are designed for insidious or unethical uses such as surveillance, often by otherwise legitimate organizations. The second class relates to multimedia computing and communications systems. These concerns have less to do with technical aspects; rather, they arise from the relationship between user-interface design and socially significant actions. Multimedia computing and communications systems are emerging as a way of supporting distributed communication and collaboration. Such systems may use audio, video, infrared, or other media to capture and process data. The user interfaces to these systems may give way not so much to unethical use of the technology as to situations conducive to inadvertent intrusions on privacy.
The need to understand these issues is becoming more important as computing power moves out of the box on the desk and into the world at large. Our environment will contain varieties of computing technology: microphones, cameras, and signal receivers used in systems for wireless communication which, combined with complicated network infrastructures, offer the potential to capture and store personal information.
Multimedia information is processed, accessed, and distributed in many ways. Many prototype systems offer activity-based information retrieval, diary services, document tracking, etc.
Two types of definitions of privacy are common: normative and operational. The normative definition involves the idea that some aspects of a person's life are private and shouldn't be revealed to anyone: privacy is the condition in which others are deprived of access to you. The normative definition is problematic because many other personal and contextual factors are involved in determining whether privacy has been violated.
The operational definition refers to a capability rather than to a set of norms, and can be thought of as access control: privacy is the ability to control information about oneself. Samarajiva refers to privacy as control over the outflow of information.
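The operational definition, privacy as control over information about oneself, maps directly onto access control. A minimal sketch (the class and names are hypothetical, not from Bellotti's text): the data subject alone decides who may read a piece of information, and can revoke that access.

```python
class PersonalRecord:
    """Information about a subject, released only at the subject's discretion."""

    def __init__(self, owner: str, data: str):
        self.owner = owner
        self._data = data
        self._allowed = {owner}  # the owner always has access

    def grant(self, who: str) -> None:
        self._allowed.add(who)

    def revoke(self, who: str) -> None:
        self._allowed.discard(who)

    def read(self, who: str) -> str:
        if who not in self._allowed:
            raise PermissionError(f"{who} may not read {self.owner}'s data")
        return self._data

record = PersonalRecord("alice", "whereabouts: cafe")
record.grant("bob")
print(record.read("bob"))  # Alice has let information flow out...
record.revoke("bob")       # ...and, per Samarajiva, can stop the outflow again
```

On this view a privacy violation is simply a read that the subject never authorized.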
Computing systems increasingly capture personal information, yet people often do not appreciate what the state of their information is and so fail to take steps to control it. The design framework may be relevant to the design of other computing systems, such as those involving device tracking, image processing, and online activity monitoring.
Maintaining Privacy in Media Spaces- Media spaces are a recent development in computing technology, combining audio, video, and computer networking.
Two important principles emerged in designing for privacy and accessibility: feedback and control.
Feedback- Informing people when and what information about them is being captured.
Control- Empowering people to stipulate what information they project and who can access it.
Representing Remote Presences
The User Experience Research Group at Apple installed a "virtual café" system at a company coffee bar.
The installation was not technologically trivial, costing a small but significant fortune.
Disembodiment and Dissociation.
Concerns with media spaces relate to the fact that microphones and cameras are unobtrusive; in a media space they are left on continuously, and it is difficult to tell whether they are switched on. Concerns about privacy arise because the social space in which people operate is not just the physical space. One can think of privacy in terms of 2 phenomena that describe what is lost in social interactions mediated by technology: disembodiment from the context into which one projects and from which one obtains information, and dissociation between actors and their actions. Dissociation means that a remote person may have no detectable presence at all. This occurs in CSCW applications when the results of actions are shared but the actions themselves are invisible or are not attributable to a particular actor.
Breakdown of Social and Behavioral Norms and Practices
The effects of disembodiment and dissociation manifest themselves through a variety of violations of social norms and breakdowns. Disembodiment shows up, for example, in a tendency for people to engage in long, unintentional observation of others over AV links.
2 major breakdowns: the inability to present oneself appropriately, and not knowing whether one is visible in some way to others.
A Design Framework
Disembodiment and dissociation in media spaces and other CSCW systems can be reduced through the provision of enriched feedback about, and control over, the information being captured.
Addressing the Breakdown
Mutual awareness normally taken for granted may be reduced or lost. Four questions address this:
Capture- What kind of information is being picked up? Candidates include voices, actual speech, moving video or frame-grabbed images, personal identity, work activity and its products, data, messages, and documents.
Construction- What happens to the information? Is it encrypted or processed? Is it combined with other information? Is it stored?
Accessibility- Is my information public, or available only to particular groups or individuals?
Purpose- To what use is the information put? How might it be used in the future?
These questions matter for those concerned with privacy and with the potential for subversion of such systems.
Evaluating Solutions- The framework is complemented by evaluation criteria for design solutions, identified from experience with the design and use of a range of computing services.
Appropriate timing- Feedback and control should be provided at the time when they are most likely to be required and effective.
Trustworthiness- Systems must be technically reliable.
Unobtrusiveness- Feedback should not distract or annoy; it should be selective and relevant and should not overload the recipient with information.
Minimal intrusiveness- Feedback should not involve information that compromises the privacy of others.
Fail safety- In cases where users omit to take explicit action to protect their privacy, the system should fail safe and protect it by default.
Flexibility- What counts as private varies according to context and interpersonal relationships, so mechanisms of control over user and system behaviors should be flexible.
Low effort- Design solutions must be lightweight to use, requiring as few actions and as little effort as possible.
Meaningfulness- Feedback and control must incorporate meaningful representations of the information and actions involved.
Learnability- Proposed designs should not require a complex mental model of how the system works.
Low cost- Naturally we wish to keep the costs of design solutions down.
The first of these criteria are directly relevant to the protection of privacy; the last four are more general design concerns. Bellotti also cites fair-information principles:
Collection limitation- Data must be obtained only by lawful means.
Data quality- Data collectors must collect only data relevant to their purposes.
Purpose specification- At the time of collection, the purposes to which the data will be applied must be disclosed to the data subject.
Use limitation- The data is not to be disclosed by the collector to outsiders.
Security safeguards- Data collectors must take reasonable precautions against loss and destruction.
Openness- Data subjects should be able to determine the whereabouts, use, and purpose of personal data relating to them. These principles all exist to prevent violations of privacy.
Individual participation- Data subjects have the right to inspect any data concerning themselves and the right to challenge the accuracy of such data and have it rectified or erased by the collector. Accountability- The collector is accountable for complying with these principles. These principles can be applied to an enormous range of aspects of system design.
Applying the Framework: Feedback and Control for video data from a public area
The chapter analyzes RAVE and the Apple Virtual Café in terms of the framework. At Apple, the first prototype of the Virtual Café displayed images taken from the camera, providing feedback and control mechanisms over video signals or images. The privacy framework prompts the following questions: What feedback is there about when and what information about me gets into the system? In RAVE, a monitor was positioned next to the camera to inform passersby when they were within range, and a mannequin holding a camera was placed so that it looked like another person in the room. Alternative solution for RAVE- movement sensors: infrared devices to alert people, with either an audio or a visual signal.
Existing Solutions in the Virtual Café- warning sign- a sign displaying information about the presence of the camera tells customers in the café about what the camera is being used for and encourages them to try the system for themselves.
Alternative Solution for Virtual Café- public installation- A second version of the Virtual Café server was built by Apple designers and placed prominently at the head of the line.
Proposed Solutions for Virtual Café- design refinements to signs and audio feedback. What feedback is there about what happens to information about me inside the system?
No existing solution for RAVE- The confidence monitor did not inform visitors and newcomers to the lab about what happened to the video signal inside the system.
Proposed Solutions for RAVE- LED display- A simple solution would have been to place an LED status display near the camera. Audio and video feedback- Audio could have been provided to indicate connections and image frame-grabbing, but it might have been obtrusive and annoying to provide repeated feedback in a public area.
Existing Solution for RAVE- textual information- To warn people that they might be watched. Proposed Solutions for RAVE- viewer display- One option would have been to display a list of names or pictures on the wall to indicate who was watching a public area. Appropriately timed, updated information could have been provided by adapting Portholes to this purpose.
Audio feedback- In private offices, audio feedback alerted occupants to connections.
Existing Solutions for the Virtual Café- The system displayed the names users chose to type into the authentication page on the web.
Proposed Solutions for Virtual Café- No simple way of overcoming the possibility of dissociation between users and their actions.
What feedback is provided about the purposes of using information about me? RAVE and Virtual Café: No Existing Solution- There is no technical feedback in either of these systems about the intentions of those who access the video signals or images.
Proposed Solution- We cannot provide technological feedback about people's intentions.
What control is there over when and when not to give out what information?
Existing Solutions for RAVE- moving off camera- People could move on or off camera; there was a clear video-free zone. Covering the camera- If certain activities were taking place in the Commons, the lens was covered.
Behaving appropriately- If people had to walk in front of the camera, they could orient themselves appropriately to it.
Existing Solutions for Virtual Café- behaving appropriately- The only realistic recourse for the customers at OH La La is to behave as if they might be being photographed.
What control is there over what happens to information, who can access it, and for what purposes it is used? Neither RAVE nor the Apple Virtual Café offers technical control over the capture, construction, and access of video signals, because each of them serves a public rather than a private space.
Privacy can be defined as the capability to determine what one wants to reveal and how accessible it will be. This is tied to the design challenges inherent in media spaces and similar systems, which blur the boundaries between private and public space and in which disembodiment and dissociation occur. How far should we go? The framework for design for privacy, together with its associated design criteria, is intended to help designers of CSCW systems. We must ask: how far do we need to go in order to protect privacy?
In private offices, individual users of systems like media spaces can decide whether to allow certain kinds of connections or whether to have a camera pointed at them.
How far can we generalize? The chapter focused on technical design features rather than on social or policy means of protecting privacy. Bellotti argues that this is an important design focus and offers the framework as guidance.
Privacy in the Balance: Some Further Issues- Providing too much awareness of other people's activities and availability may be seen as intrusive. Attitudes toward privacy have changed. Privacy will continue to be balanced against other concerns of modern life; it will be a matter of feedback and control.
Bennett, Colin J., 1997. "Convergence Revisited: Toward a Global Policy for the Protection of Personal Data". Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. pp. 99-120
Chapter 3 Convergence Revisited: Toward a Global Policy for the Protection of Personal Data
By Colin J. Bennett
Strong pressures for "policy convergence" forced different states to legislate a broad set of statutory principles, giving their citizens more control over personal information. The general pattern of policy development seemed to support the argument that globalizing forces of economic and technological development were overwhelming distinctive national institutional capacities. Countries that wanted an "information privacy" or "data protection" policy had limited options; they all selected a version of "fair information principles".
Evolving and pervasive information technologies shape the structure of debates in each country. As communications intensified throughout the 1970s, pressure was placed on international organizations to help out, marking the need for harmonization of data-protection policy.
By the end of the 1980s there were 3 major divergences in world data-protection politics: the first was scope; the second reflected a disagreement about whether laws should apply only to computerized personal data; the third and principal difference concerned the choice of policy instruments to enforce, oversee, and administer the implementation of the legislation.
One assumption behind this analysis was that "information privacy" or "data protection" could be considered a discrete legal or technological issue. Regulating Privacy was written at a time when only 17 developed countries had data-protection laws.
Forces for Policy Convergence in the 1990s- The distinction among policy content, policy instruments, and policy outcomes will prove useful. To what extent has the world's data-protection policy been converging?
National and Global Information Highways: The Technological Imperative
Technologies are now more interactive and less centralized. The same dynamics are at work as existed in the 1960s with respect to the mainframe "data banks" at which the initial data-protection laws were directed. One danger is pervasiveness. The development of global information networks has changed and intensified the character of the privacy-protection problem. In the 60s and 70s the privacy challenge could be nationally conceived; privacy is now less threatened by a centralized "Big Brother" than by the "new surveillance", which is decentralized, routine, and increasingly global.
Another characteristic of the new technologies is their rapid growth. Laws and regulations have been made and enforced by people who lack a thorough understanding of the ways computers work. Privacy and data-protection law is intended to increase trust in technologies and organizations by establishing procedures. Technology shapes the structure of the battle, but not every outcome. Policymakers and institutions have considerable autonomy to define problems and make choices.
Policy Harmonization: The EU Data Protection Directive- The 1970s consensus on data protection had, by the 80s, expanded and become institutionalized. In September 1990 the European Commission published its proposal concerning the protection of individuals in relation to the processing of personal data. European governments have until 1998 to bring their laws up to this new European standard. The directive pushes convergence of public policy in 3 areas: 1. distinctions between the public and private sectors were almost completely removed; 2. many battles were fought over the question of the inclusion of "manual" data within the scope; 3. the directive tries to harmonize policy instruments through data-protection principles. The directive also specifies the nature and function of a member state's "supervisory authority". In this international agreement, we see an attempt to extend the process of policy convergence to policy instruments.
Data Protection through Penetration: The External Impact of the EU
The Data Protection Directive has had, and will continue to have, an impact on data protection outside the EU. The pressure on non-EU countries stems principally from a stipulation in Article 25 that data transfers to a third country may take place only if that country ensures an "adequate level of protection". The EU's 1995 Data Protection Directive constitutes the rules for the increasingly global character of data-processing operations.
The Desire to Be in the "Data-Protection Club" The Emulation of the Majority
The EU directive is not only an instrument of pressure and economic interdependence; it is also, conceivably, a legal framework for others to emulate.
The Divergence of Policy Experience
There are some countervailing forces that suggest a continuing divergence.
By the end of 1996, of 24 developed countries, 6 had failed to pass privacy acts: the U.S., Canada, Australia, Japan, Greece, and Turkey. Bills are being discussed. The private sector in the United States is regulated through an incomplete patchwork of federal and state provisions, and in the U.S. system it is very easy to prevent legislation from passing. By the end of the 1990s there could be data-protection legislation in every advanced industrial country.
Privacy Codes and Standards
Analysts have observed a discernible shift in the mid-1980s to legislation that displayed sensitivity to the information-processing practices and technological conditions of different sectors. The move toward sectoral regulation is reflected in the "third generation" of data-protection laws.
A mechanism to implement flexibility is the "code of practice," which accompanies data-protection legislation. In the U.S. and Canada, privacy codes have generally been developed in conformity with the OECD Guidelines of 1981. "Voluntary" codes are numerous but display variability in terms of scope and extent of enforcement.
The CSA Model Code was passed without any dissenting vote on Sept. 20, 1995, and was subsequently approved by the Standards Council of Canada. The privacy code is then adopted by different sectors and adapted to specific circumstances.
The CSA Model Code is a rather different instrument from those developed by companies or trade associations.
There is no question that information technologies can be a friend to privacy as well as a threat. Systems can be designed to delete and forget personal data, and encryption can be incorporated into software. Data on personal transactions has economic value. The widespread application of privacy-enhancing technologies will also require a massive educational effort. Privacy-enhancing technologies cannot be a substitute for public policy, and their adoption may be slowed by the slow development of standards. The application of these technologies will be governed by the complex interplay of market forces, consumer pressure and education, and government sponsorship.
Conclusion: The Limits to Harmonization
The European Union's Data Protection Directive will tend to advance the process of policy convergence at the level of policy content. The EU Directive will also push greater conformity at the level of policy instruments. Evidence for convergence of policy outcomes is far more difficult to trace. At root, the problem may be conceptual rather than political, technological, or economic.
What is needed is a more holistic perspective that sees data protection as a process involving a wide network of actors and regulators. The successful implementation of data protection requires a shift in organizational culture and citizen behavior. There are clear limits to the evaluation of policy success, and thus to the observation of policy convergence. Rhetoric to the contrary, we may well have confronted those limits.
Burkert, Herbert, 1997. "Privacy-Enhancing Technologies: Typology, Critique, Vision." Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. pp. 125-140
Chapter 4 Privacy-Enhancing Technologies: Typology, Critique, Vision
By Herbert Burkert
PETs refer to technical and organizational concepts that aim at protecting personal identity. PETs are set apart from data-security technologies: data-security measures seek to render data processing safe regardless of its legitimacy. Data security is a necessary but not a sufficient condition for privacy protection. PETs seek to eliminate the use of personal data altogether, or to give the individual direct control over it, and are therefore closer to social goals.
PET concepts start from the observation that personal information is accumulated in interactions and transactions: as actions between subjects relating to objects within systems. On this basis we can differentiate 4 types of PET concepts:
Subject-Oriented Concepts- These seek to eliminate or substantially reduce the capability to personally identify the acting subject. This aim is achieved by implementing proxies: individuals may be given identifiers. To implement a credit-card system as a PET system, we could take the name of the cardholder and its link to the credit-card number out of the system and put them into a "data safe" that makes such information accessible only with the consent of the individual.
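The credit-card example can be sketched as a small "data safe" (a hypothetical interface, not from Burkert's text): the transaction system keeps only the card number as a proxy, while the link between number and cardholder is released solely with the individual's consent.

```python
class DataSafe:
    """Holds the card-number/name linkage apart from the transaction system."""

    def __init__(self):
        self._links = {}    # card number -> cardholder name
        self._consent = {}  # card number -> has the individual consented?

    def deposit(self, card_number: str, name: str) -> None:
        self._links[card_number] = name
        self._consent[card_number] = False  # no consent by default

    def give_consent(self, card_number: str) -> None:
        self._consent[card_number] = True

    def identify(self, card_number: str) -> str:
        if not self._consent.get(card_number, False):
            raise PermissionError("cardholder has not consented to identification")
        return self._links[card_number]

safe = DataSafe()
safe.deposit("4111-0000-0000-0000", "A. Cardholder")
# The transaction system processes only the number; identification
# succeeds only after the individual gives consent:
safe.give_consent("4111-0000-0000-0000")
print(safe.identify("4111-0000-0000-0000"))  # A. Cardholder
```

The design point is that the merchant's database never contains the name at all, so a breach of that database exposes only proxies.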
Object-Oriented Concepts- These build on the observation that transactions often involve exchange or barter, and that an object given in exchange carries traces that allow identification of the exchanging subjects. The aim is to free the exchanged object from all such traces without eliminating the object itself.
Transaction-Oriented Concepts- These concepts, which Burkert has not yet seen discussed extensively in the context of PETs, seek to address the trace that the transaction process itself leaves behind. Here one could make use of technical devices such as combinations of data and small programs currently being discussed in the context of protecting intellectual property rights.
System-Oriented Concepts- Finally, we might imagine concepts that seek to integrate some or all of the elements described, thereby creating zones of interaction where the identity of the subjects is being hidden.
Attempt at a Critique-The Achievements of PET Design
For those advocating PETs, the first question asked in PET design is whether personal information is needed at all. The availability of PETs creates a burden of legitimation on those who want to have personal information in their systems. PETs also take pressure off the consent principle: with a wider availability of PET designs, one could avoid from the beginning situations in which consent would have to be requested. One should ensure, however, that the PET approach does not set in before the consent principle has been examined. PETs as a concept have brought the "soft" social concern of how to reduce personal data down to the level of "hard" system-design considerations. Addressing privacy at the design level is an essential precondition to addressing privacy at the political level. With PETs available, whatever personal information remains in a system can be assumed to be there because it is regarded as politically desirable.
The Limitations of PET Design- There are limitations that are inherent in the designs and are generally understood to be so by PET designers. The internal limitations are the one-directional perspective, the problem of identifying information, the "systems perspective," and the technical assumption.
The one-directional perception- Some subject-oriented PET concepts are built on the perception that within an interaction there is usually one party who merits protection against his counterpart. But there are situations where the customer is a potent organization and the seller is a small company with a legitimate interest in being protected against the buyer's economic power. Two implications follow: PETs, like data-security designs, remain technical; and the normative decision still has to be made by us, since it cannot be replaced by the PET concept as such.
In political environments that are less formed by a "freedom of information" culture, we might foresee the use of PETs by administrators who intend to protect the identities of persons who have to operate in hostile environments. In implementing PETs, careful attention has to be paid to the normative question in order to handle the "dual-use" problem of these technologies. Identifying information- Some PET concepts rely on the capability to differentiate information that merely relates to a person from "identifying information." The capability to identify persons behind anonymous information depends on the purpose of our identification, on the "hidden information" connected to this anonymous information, and on the extent of "additional knowledge" that may be provided by other information systems.
The systems view- The attraction of PET concepts is that they take the system designer's view of the world and talk to designers in their own terms. But this view has limits: turning a school planning system and a library planning system for a minority into PET systems would lose much of its privacy-enhancing effect if a subscription file for a paper read exclusively by that minority were to remain a non-PET system.
The technical assumption- Some PET systems are based on a specific technical assessment: the significant imbalance between the effort needed to encrypt and the effort needed to decrypt.
External limitations- Resistant economic, social, and political forces have to be reckoned with when trying to implement PET concepts. Among the issues to be considered are the economics of information, the needs of mobilization, and the concept of privacy itself.
The economics of information- Personal information given away in a transaction process is part of the payment for the desired service or good. If this information is no longer there, the price of the good or service is likely to change.
Mobilization- Traditional economic approaches build on a more general social trait: in the process of social modernization, disturbances in social bonds are compensated by strategies of mobilization in both the public and private sectors. The force of mobilization as a social need should not be underestimated.
The concept of privacy- The concept of privacy on which PET design still largely seems to be built equates privacy with anonymity, or, in more advanced settings, with a conscious choice between anonymity and identification.
Privacy and the Role of PETs- There is a need to accompany privacy legislation not only with data-security measures but also with privacy-ensuring technical designs. Even the basic rule that the best data protection is to have no personal data at all was mentioned only occasionally in the annual reports of data-protection agencies. Three trends will be relevant to the future development of the concept of privacy and of PET or PET-like design approaches: information balance, identity, and trust.
Information Balance- The notion of information balance is the traditional concept that builds on the presumption of a continuous social contractual relationship where balances have to be made between the interests of the "data subject" and the interests of other individuals.
First, the design process of PETs has to include individuals directly. Burkert refers to the need to open up the "traditional understanding of privacy" to a political understanding of the concept: PETs are public utilities. Data-protection agencies need to play an important role in adopting procedures for the design and implementation of information systems and their PET components. Second, PET design should also pay more attention to participatory processes in the realm of "electronic democracy."
Identity- A major argument against privacy is that it misses the essence of identity. "Identity isn't a constant, but a process" (Burkert, 97, 138). We don't develop identity by isolating ourselves from others; we develop it in exchange with others, and our identity is largely what others know about us.
Trust- The main argument against broad implementation of PETs is "the need or the habit of meeting the insecurity of life with greater confidence and of maintaining or regaining certainty in a world of uncertainty" (Burkert, 97, 138). Trust in this context might best be understood as reliability: a conscious decision to interact although there is risk. In this sense we mean more than "trusted systems"; trust will have to play a larger role in "the design of social systems" (Burkert, 97, 138). Trust may be the answer to the surveillance paradox: surveillance has to be controlled, yet each layer of surveillance calls for more surveillance.
Conclusion- We might look at PETs as a technical innovation that helps us solve a set of problems. Their implementation forces us to return to social innovation. Data-protection regulation has already proven important in the field of social innovation, and PETs can help us to meet further challenges.
Davies, Simon G., 1997, "Re-Engineering the Right to Privacy: How Privacy Has Been Transformed from a Right to a Commodity." Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. Pp. 143-162.
Chapter 5 Re-Engineering the Right to Privacy
By Simon G. Davies
Privacy as a concept has shifted, within a generation, from a civil and political rights issue motivated by strong ideology to a consumer-rights issue undercut by data-protection principles and by the law of trading standards. Private right and public interest have been redefined. Privacy has changed from an issue about society's power relationships to a narrow and constrained legal right. This chapter discusses "the changing nature of privacy" (Davies, 97, 143) and the new set of rules that characterize public interest and private right.
Privacy Activism- Not uniform, but fueled by issues of sovereignty, technophobia, power, and autonomy. In the 1970s there was a fear of a Big Brother society: a group that watched you and ran your life through coercion and fear. Since the 1980s there has been a shift in the way the public views privacy and privacy invasion. There have been fundamental changes to traditional privacy issues in many countries, attributable to five factors: 1. from privacy protection to data protection- formal rules in the form of a data-protection mindset satisfied some of the public but have failed to stop the growth of surveillance. 2. the creation of partnerships- all stakeholders, whether or not proponents of surveillance, have been collected together into a "partnership" with common goals and desires. 3. the illusion of voluntariness- many surveillance schemes now involve a "voluntary" component that only has the effect of neutralizing public concern. 4. privacy rights as commodities- many traditional rights have fallen by the wayside, converting privacy rights into consumer issues. 5. the concept of "public interest"- this has expanded substantially in recent years.
History- In the 1960s and 1970s, information technology emerged as a tool for public administration. In Britain, concern about government databases ranked alongside unemployment and education. Since the mid-1980s, polls have registered great concern about computers, and surveillance has only grown. Caller ID has become a rallying point for privacy advocates.
The Great Angst: Personal Concerns with Privacy- Concerns about computers, surveillance, and privacy cause stress; however, in developed nations, anxiety over unemployment has been more devastating. A second fundamental anxiety has grown for twenty years: that of "personal security" (Davies, 97, 149).
Privacy Paralysis in Practice: Closed-Circuit TV in Britain- A new generation of closed-circuit television surveillance equipment has revived fears of Big Brother, and CCTV has grown rapidly.
Defining the Private Right, the Public Interest, and the Question of Balance
In matters of public interest, privacy tends to be whatever is left over once other claims are satisfied. Appeals to security are intimidating. Privacy is cast as the "bete noir" of law enforcement, while privacy invasion and its related technologies are promoted in neutral language. One pioneer said privacy is "part philosophy, some semantics, and pure passion" (Davies, 97, 153).
The Quantum Shift in Privacy Activism- In the 1960s and 1970s, activism was fueled by a strong defense of democratic rights. The Dutch group was the world's most successful non-government privacy organization. Privacy activism is now in decline.
Assessing the 5 Fundamental Transitions in Privacy- There are many causal factors, amplified by private and government interest groups, by legal organizations and intergovernmental organizations, and of course by the media.
From Privacy Protection to Data Protection- A group of principles began to take shape and eventually was formulated into law; the privacy of personal information is protected through these principles. There are two serious problems with the core principles. First, they allow many privacy violations to occur through exemptions for law enforcement. Second, and a worse problem, data-protection law does nothing to prevent or limit the collection of information. There is evidence that the adoption of the principles has calmed some of the public down. "Protections in law, where they exist, are sometimes ineffective and even counter-productive" (Davies, 97, 157).
Subjects of Surveillance Are Becoming Partners in Surveillance- Closed-circuit TV schemes are now commonplace in many countries. There is a strong togetherness movement in Europe that sponsors partnerships. The "partnership" or "stakeholder" image is becoming popular in areas of privacy invasion; the stakeholder model brings together parties who have long been rivals.
The Illusion of Voluntariness- Many surveillance schemes involve a "voluntary" component that neutralizes public concern about surveillance. In some areas of surveillance, governments are less inclined to make privacy invasion mandatory, preferring that participation be a free choice. Australians have been given a "pseudo-voluntary" ID card, and the British government has taken the lead with ID cards. Those who don't volunteer bring it upon themselves.
Privacy Rights Are Becoming Commodities- Privacy has had to make the journey from the political realm to the business realm, and a new set of values comes along with it. Placing privacy in the "free market" alongside such choices as color, durability, and size creates an environment in which privacy is just another product feature. This process of commodification is inimical to privacy.
The Triumph of "Public Interest"- The public-interest concept has expanded substantially in recent years. It is likely that public interest can be manufactured almost limitlessly in any environment in which political choice is at a minimum. For privacy advocates, there is cause for concern: the concept of privacy has become fragmented and dispersed among many sectors and interest groups.
Flaherty, David H., 1997, "Controlling Surveillance: Can Privacy Protection Be Made Effective?" Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. Pp. 167-190
Chapter 6 Controlling Surveillance: Can Privacy Protection Be Made Effective?
By David H. Flaherty
There is significant theoretical and empirical support for the importance of privacy in our lives; there are, however, disagreements among nation-states about how best to implement the protection of individual privacy in the public and private sectors. Flaherty chose to set aside the freedom-of-information side of his work and to concentrate on how to control surveillance by making data protection effective in the public sector.
The Need for Data-Protection Laws- "the existence of data protection laws gives us some hope that the twenty-first century will not be a world in which personal privacy has been fully eroded by the forces promoting surveillance" (Flaherty, 89, 169). Flaherty thinks the pro-privacy side has won the argument for now. The United States, he notes, is the odd Western nation out, lacking comprehensive laws for the collection, use, retention, and disclosure of personal information in both sectors.
1980s- The deregulation movement of Margaret Thatcher and Ronald Reagan coincided with advances in information technology of extremely broad scope and quick pace. This added urgency to the argument that data protection was needed. During this time there were enormous pressures in the U.S. and Canada to install surveillance cameras, and much government pressure for the rationalization of identity checking of applicants for government benefits; everyone's information was put into a "common registry." This posed a big problem for Flaherty, who was there to protect the personal privacy of individuals.
Concern for privacy is understood by many to be clear, rational thought. Others view it as a barrier that stands in the way of data-protection laws.
Defining Privacy Interests to Limit Surveillance- Those in government do not so much need to define privacy interests as they need to pay attention to limiting surveillance. Flaherty says he has had no more success at concentrating legislative efforts on articulating central privacy interests. He also pushes for supporting classic liberalism when he calls for "the need to balance privacy against competing values and legitimate governmental purposes" (Flaherty, 89, 172).
The Need for Data-Protection Agencies- British Columbia created the Office of the Information and Privacy Commissioner, imitating Ontario and Quebec. Flaherty argues that the best evidence of the need for privacy-protection bodies is the embarrassing state of data protection in the US, where very few states have agencies equivalent to Canada's. The US suffers from the lack of an institutionalized body of "specialized privacy expertise" (Flaherty, 97, 174) that should be available to all levels of government. The US has a small network of talented privacy advocates, including many lawyers in private practice; however, this is not enough to compensate for the gap in numbers and quality. Flaherty found that data protectors always go up against significant resistance from organizations. As he puts it, ". . . data protection commissioners or agencies are an alarm system for the protection of privacy" (Flaherty, 89, 175).
The Effective Conduct of Data Protection- Flaherty adopted and applied a functional, expansive, and empirical approach to his statutory tasks, learning as much as possible about how a system or modification would or would not work. He advises avoiding public and private confrontations by being pragmatic in negotiation.
Flaherty urged that one should avoid bureaucracy in a data-protection organization and the risk of having "an overworked and overcommitted staff that provides only the illusion of data protection" (Flaherty, 89, 178). He also holds that the main function of a data-protection agency is to articulate and advance the privacy interests that must be defended.
Independence and the Exercise of Power- The exercise of independence by a regulatory authority, subject to certain outside controls, is essential to successful data protection. His staff consists of public servants hired under the Public Service Act, which makes them insiders to government in terms of job mobility; none of them are union members, at the request of the legislature. His concern for independence is counterbalanced by the desire to build an effective network in "government circles," which then accept the mediatory role of his office in settling as many requests as possible with diplomacy. Flaherty was in turn watched by two groups: the Freedom of Information and Privacy Association and the BC Civil Liberties Association.
The Adequacy of Advisory Powers- Flaherty changed his mind about having advisory powers in this particular bureaucracy. He concludes that a privacy commissioner should have total regulatory power at his disposal.
-One weakness that Flaherty identified in the Freedom of Information Act is that there are no criminal sanctions in the act itself.
The Primacy of Data-Protection Concerns
Flaherty argued in 1989 that privacy protectors should stick to their jobs and avoid direct responsibility for other information-policy issues. The implication was that they should leave alone issues not related to controlling surveillance of the population.
Complaints, Audits, and Access Rights-Complaints are a safety valve for an aggrieved public, and do help to set priorities for audits and inspections. Must adhere to the Freedom of Information and Protection of Privacy Act
He calls for a more mechanistic system for redressing grievances. Flaherty has always been an admirer of the audits and inspections conducted by German federal and state data-protection offices. He says his greatest feat was helping to shape the BC Freedom of Information and Protection of Privacy Act in 1992, where he successfully urged the inclusion of broad authority for the commissioner.
Monitoring Surveillance Technology- Data-protection agencies have a special role in monitoring technological developments. "Technological imperatives are increasingly harnessed to government's goals of reducing costs . . ." (Flaherty, 97, 187). He feels that the BC public sector has not been able to afford the software and data-matching resources available in the United States. Efforts are being made in British Columbia to reach "specialists in informatics" who can "mobilize technological expertise for protective purposes" (Flaherty, 97, 187). There has been modest success in BC in promoting the preparation of privacy-impact assessments, though he cannot find the resources to prepare similar documentation except under duress in crisis management.
Strengthening Data-Protection Legislation- Flaherty's office is preoccupied with its day-to-day tasks; for this reason, opportunities to step back and consider the big picture are valuable. What remains problematic in British Columbia? Despite the European Union's 1995 Directive on Data Protection, there has been no effort to extend data protection to the rest of the private sector. He is optimistic that the situation will change in the next several years, though he says his written and oral efforts have not been listened to.
Toward the Future- In 1989 he asked what data-protection authorities would look like in the year 2000; he did not anticipate that he would be partly responsible for that future. He believes his office has been diligent and resourceful to date, but he is less inclined to draw an optimistic conclusion because of the technological revolution and the increasingly digital economy.
Gellman, Robert, 1997, "Does Privacy Law Work?". Technology and Privacy: The New Landscape, eds. Agre, Philip E. and Marc Rotenberg. Cambridge, MA: The MIT Press. Pp. 193-215.
Chapter 7 Does Privacy Law Work?
By Robert Gellman
Logical first step- We must evaluate the law as a mechanism for regulating privacy, though this might be impossible: to identify privacy laws, we must first define privacy. In the US, privacy is a broad and nearly limitless subject, cited to include everything from control over personal information to personal reproductive rights to limits on government intrusion. The Constitution guarantees individuals a right to be secure in their homes against unreasonable searches and seizures, and there are other constitutional principles associated with privacy interests; however, the word "privacy" does not appear in the Constitution.
Lawyers, judges, philosophers, and scholars have all attempted to define the scope and meaning of privacy; perhaps they have not succeeded, but it would be unfair to suggest that they have failed. The main focus here is a little slice of privacy known as "data protection," which refers to rules about the collection, use, and dissemination of personal information.
Statutory Codes of Fair Information Practices- Concerns about the government's ability to use and abuse personal information prompted the first legislative response: the Privacy Act of 1974. This law established rules for "the collection, maintenance, use, and disclosure of personal information held by federal agencies" (Gellman, 97, 194).
The first principle of fair information practices is openness: there shouldn't be any secret record-keeping systems. Between 1975 and 1990, the number of federal record systems shrank from 7000 to 5000.
The second principle is individual participation. The subject of a record should be able to see a record and correct it. There has been considerable litigation seeking enforcement of correction rights.
The third principle of fair information practices is that there should be limits on the collection of personal information. The fourth is that personal data should be relevant to the purposes for which they are collected. The fifth principle requires limits on the internal use of personal data by the record keeper; this exposes a shortcoming in the Privacy Act.
Other privacy statutes define the responsibilities of record keepers and the rights of consumers. Perhaps the most powerful privacy law regulating private-sector use of personal information is the Fair Credit Reporting Act.
Constitutional Protections- The Supreme Court has addressed privacy issues in many different ways. Griswold vs. Connecticut involved the constitutionality of a state statute restricting the availability of contraceptives; Justice Douglas described privacy as found within the "penumbra" of the Bill of Rights. Olmstead vs. United States, an important case from 1928, held that neither the 4th Amendment nor the 5th Amendment provided protection against wiretapping.
Katz vs. United States- The court overturned Olmstead and ruled that wiretapping constituted a search under the 4th Amendment.
Whalen vs. Roe- This case involved the question of informational privacy and reflects the theme of privacy and technology. The court upheld a state law creating a statewide databank of prescription records.
United States vs. Miller- The court decided that an individual had no expectation of privacy in account records held by a bank; the account owner was not entitled to notice or the opportunity to object when the government sought records from the bank.
Greidinger vs. Davis- The plaintiff challenged the public-disclosure requirement of the law as an unconstitutional burden on the right to vote. There is no constitutional right of privacy in a Social Security number as such.
The clear conclusion is that privacy law is more likely to be successful when there is independent oversight and enforcement. Litigation has been only partly successful under the Privacy Act.
Common Law and Other forms of Privacy Protection- Limiting the power of government is a popular doctrine of American political philosophy. Government was perceived as the principal threat to personal privacy interests. Controls on the governmentís ability to arrest or investigate individuals, to regulate behavior, and to collect and use information were high priorities.
20th century, tort law developed some common-law remedies for invasions of privacy. These are largely but not exclusively aimed at private and not governmental gains.
Can tort law really provide meaningful remedies for individuals? Litigation begun in Virginia in 1995 questioned the sale of individual names through the rental of mailing lists. The plaintiff lost, and an appeal is pending. A ruling that the sale of a mailing list violates the rights of the individuals on the list would have a significant effect on the marketing and mailing-list businesses.
If restrictions on the use of personal information are to come through tort remedies, new approaches will be needed.
Conclusion- It is difficult to say whether the law is really an effective device for protecting privacy; different attempts have produced a mixed bag of results. In the US, the broad-based privacy law imposed on the federal government has been constantly weakened. The underlying issue in creating formal and enforceable fair-information-practices laws may be one of incentive: privacy principles have generally not been implemented in ways that offer natural incentives for record keepers to comply.
Finally, legal mechanisms are available to enforce privacy policies. The code of fair information practices offers a comprehensive outline of those policies, but its application in the United States is spotty at best. If the will for better privacy rules develops, the law can provide a way to accomplish the objectives.