Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Revision of 12 April 1999
© Xamax Consultancy Pty Ltd, 1999
These notes were prepared to accompany a presentation at AT&T Research Labs, Florham Park NJ, on 5 April 1999; and revised after the event
This document is at http://www.anu.edu.au/people/Roger.Clarke/DV/Florham.html
Privacy protection is vital to the establishment of trust and public confidence in business-related uses of information infrastructure, such as consumer electronic commerce and electronic service delivery by governments.
Privacy protection comprises a set of legal, organisational and technological features that together implement a delicate balance among conflicting interests, and reflect a value system.
Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations. It is not a single interest, but rather has several dimensions, including privacy of the person, privacy of personal behaviour, privacy of personal communications, and privacy of personal data. The focus here is primarily (though not exclusively) on the privacy of communications and of personal data, which are usefully referred to as 'information privacy'. An overview of information privacy matters is in Clarke (1997b).
Privacy Protection is a process of finding appropriate balances between privacy and multiple competing interests. A competing interest of especial importance is accountability. The formulation of detailed, operational rules about privacy protection is a difficult exercise. It is all the more challenging in the context of rapid technological change. Particular technologies that raise concerns include intelligent transportation systems, chip-cards, and biometrics.
Particular points relevant to the argument that follows are that:
That the level of public concern is high, and getting higher, is evidenced by surveys of public attitudes (Clarke 1996a). These concerns are widely seen to be holding back consumer electronic commerce and electronic service delivery (Clarke 1999a, 1999c).
This talk is concerned primarily with information infrastructure in general, and the Internet in particular. Privacy issues arising in the context of the Internet are reviewed in Clarke (1997c), and examined in greater detail in Clarke (1998b).
There are several alternative approaches that may be adopted to the establishment of privacy protections. Further information is provided in Clarke (1998g) relating to the philosophical, technological, and international regulatory contexts. This talk considers non-regulation (which is conventionally referred to using the euphemism 'self-regulation'), conventional governmental regulation, and a recently-emerged model that represents a blend between the two.
This approach commences with the assumption that organisations and industry sectors have incentives to exercise restraint on their own behaviour and on that of their competitors. It is associated with the excusatory literature of Alan Westin, which placed 'administrative convenience' as the primary goal, with privacy protection as a constraint.
There are many circumstances in which organisations do not have sufficient incentives to be good corporate citizens. In addition, there are major constraints on the capacity of industry associations to impose standards on their members; and associations have no power whatsoever in relation to the behaviour of non-members. The argument is summarised at Clarke (1998c), and can be expressed in the aphorism "Wolves self-regulate for the good of themselves and the pack, not the deer".
The proposition that self-regulation could somehow be sufficient has only ever been accorded credibility in the United States. The failure of consumer electronic commerce to grow with the explosive force of other Internet activities (examined in Clarke 1999a), together with a range of cavalier entrepreneurial behaviours, has brought the naivete of the proposition into sharp relief.
At the other extremity would be statutory requirements coupled with a hard-fisted regulatory agency wielding substantial powers. No country has ever legislated such a regulatory regime in relation to privacy. The closest has perhaps been the excessive registration requirements of the U.K. Data Protection Act 1984, and the (necessary, and successful) regulation of credit reporting in Australia.
Moderately hard-nosed regimes have, however, been applied in virtually all European countries, in most cases affecting both the public and the private sectors. Many date from the 1970s, but have been the subject of enhancements over the years. Some additional requirements are coming into effect as a result of the EU Directive of 1995, which came into force in October 1998. This requires those members of the European Union whose legislative provisions do not already apply to the private sector to extend them.
Business associations and privacy advocates are in agreement that no-one's interests are served by the over-reaction that 'hard' regulation would represent: privacy invasions are not to be compared with murder or even breaking-and-entering. They need to be addressed primarily through educative processes, adjustments to procedures, and complaints-determination-and-compensation arrangements, with damages and criminal offences in reserve for serious and for continued abuses.
The most constructive approach is to create incentives for meaningful self-regulation, by establishing back-end legal mechanisms. This is usefully described as 'co-regulation'. The statute popularly regarded as the first implementation was the New Zealand Privacy Act 1993. This drew on experience with the application of the Australian Privacy Act 1988, and created a tiered approach comprising legislation, statutory principles, and a process for the establishment and approval of sector-specific codes of practice that articulate the abstract principles within that particular context. Although the term is of recent origin, the approach is already well-established in such areas as environmental protections, and consumer rights.
The co-regulatory approach involves the following elements. The aspects that differentiate the approach from what has been hitherto conventional in Europe and elsewhere are shown in boldface type:
A somewhat more detailed specification for a co-regulatory framework is at Clarke (1999b).
A co-regulatory privacy protection regime is dependent on an abstract body of principles that is capable of being articulated in particular circumstances. The primary reference point for such an abstract body of principles is the OECD Guidelines (1980). A short form of these Principles is in Clarke (1999b), and a template for evaluating compliance with the OECD Guidelines as a whole is in Clarke (1989). Bodies of principles based on the OECD's formulation have been expressed in legislation and government reports in many countries. They tend to be structured along the following lines:
These requirements are supplemented by Openness (or Public Access) and Accountability Principles.
The OECD Guidelines are showing their advanced age. Enhancements are overdue, to cope with the ravages of technological advance, and the increasing expectations of consumers and citizens. More up-to-date formulations are to be found in the EU Directive of 1995, and instruments such as the Australian Privacy Charter of 1994.
Even more fundamentally, however, the weaknesses of the IPPs reflect their origins in an overriding commitment to administrative efficiency rather than to privacy interests. The so-called 'Fair Information Practices' movement within which these IPPs were derived adopted the approach that the efficiency and administrative convenience of business and government should not be hindered: there should merely be some conditions applied to business processes. Aided by this business-friendly approach, practices have become increasingly information-intensive and the personal data collected has become increasingly fine-grained.
Mere 'fairness' has been demonstrated to be inadequate to protect people's interests. The weaknesses of the IPPs include:
All of these need to be addressed. The most critical of the policy imperatives for the coming century are the following.
This should not be misinterpreted as denying that identification may be appropriate in a few particular circumstances, such as:
What this Policy Imperative does deny is the increasingly common assumption by the State and corporations that individuals are to be subjected to the same efficiency-engendering technologies as are appropriate to manufactured objects. Policy issues arising in relation to anonymous, identified and pseudonymous transactions are addressed in Clarke (1996d) and Clarke (1999d).
This set of propositions denies the ability of the State and of corporations to consolidate personal data through the use of common identifiers. It further stresses the central importance of pseudonymity as a primary means of achieving the necessary balance between the needs for privacy and for accountability.
There are many technologies that were conceived expressly to be privacy-invasive technologies (reasonably described as being the PITs). Examples include data-trail generation through the denial of anonymity, data-trail intensification (e.g. identified phones, SVCs, and ITS), data warehousing and data mining, stored biometrics, and imposed biometrics. Given public concerns about privacy, the strong trend towards a privacy regulatory regime, and the need to unlock the promise of consumer electronic commerce and electronic services delivery, it is clear that technologies of these kinds are not regarded by the public as value-neutral, but rather are perceived to be a significant part of a serious problem.
The last decade has seen the emergence of technologies that are expressly designed as privacy-enhancing technologies (PETs). These set out to reverse the trend in technological development, in order to produce tools that directly assist the protection of the privacy interest. These are commonly tools that provide genuine, untraceable anonymity. For examples, see EPIC's list of privacy-enhancing tools, and Lorrie Faith Cranor's page of P3P tools. Tools of this kind were discussed during a panel session at the Computers, Freedom & Privacy Conference in April 1999.
Technology is proving capable of delivering genuine anonymity; and this, in turn, is leading to an improved appreciation of the balance that's needed between privacy and other interests, especially accountability. From this, I suggest, a mature concept of privacy-sympathetic technologies (PSTs) should shortly emerge. The remaining sections provide a framework within which privacy-enhancing and privacy-sympathetic technologies can be placed, considering emergent technologies within the following classifications:
Services are available that provide outright anonymity. By this is meant the ability of a party to conduct transactions without any other party being able to discover the first party's identity. A common form for these arrangements is a succession of intermediary-operated proxy-services. Each intermediary knows the identities of the adjacent proxies, but has too little information to enable it to identify the prior and subsequent proxies, and hence cannot track the communication back to the originator or forward to the ultimate recipient.
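The chain-of-proxies arrangement can be sketched in outline. The following Python sketch is illustrative only, not any real service's protocol: real systems (such as mix networks) encrypt each layer so that the inner payload is genuinely opaque to each hop, whereas here plain nesting is enough to show the principle that each intermediary learns only the next hop, never the whole route.

```python
# Illustrative sketch of a proxy chain (hypothetical structure, not a
# real service's wire format). Each intermediary sees only the next hop
# and an opaque payload, so none can link originator to recipient.

def wrap(message, route):
    """Wrap a message so that each hop reveals only the following hop."""
    packet = {"deliver": message}
    for hop in reversed(route):
        packet = {"next": hop, "payload": packet}
    return packet

def relay(packet):
    """An intermediary's view: the next hop, plus an opaque inner packet."""
    return packet["next"], packet["payload"]

packet = wrap("hello", ["proxy-a", "proxy-b", "proxy-c"])
hop, packet = relay(packet)   # the sender learns only to hand off to proxy-a
```

Each subsequent call to `relay` peels one layer; only the innermost layer contains the message, so the full route is never visible at any single point.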
It is technically feasible for a person to operate anonymously, and yet provide a persistent presentation (referred to as a 'nym'), and possess a profile (i.e. have data such as demographics) associated with the nym.
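As a minimal illustration (the field names are assumptions for exposition, not drawn from any actual service), a nym can be modelled as a record carrying a persistent handle and a profile, with no field at all that links it to a real-world identity:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a nym record: a stable presentation plus
# profile data, and deliberately no attribute for the real identity.

@dataclass
class Nym:
    handle: str                                   # persistent presentation
    profile: dict = field(default_factory=dict)   # e.g. demographics

alias = Nym("night-owl", {"age_band": "25-34", "region": "ACT"})
```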
Other approaches that facilitate anonymity include:
Full anonymity appears to deny conventional forms of accountability, precludes a straightforward balancing between the conflicting interests of privacy and accountability, and is therefore privacy-enhancing rather than privacy-sympathetic technology. There may, however, be other ways in which accountability can be achieved in a context of anonymous transactions.
A pseudonym is a nym whose relationship with an underlying person is able to be established, but only under particular circumstances. Examples of pseudonymity services include:
A pseudonym is discoverable, it may or may not be persistent, and it may or may not have profile data associated with it.
Because pseudonymity services are so critical to addressing the two C2K policy imperatives discussed above, it is worth documenting some key requirements. Fundamental to pseudonymity services are that:
Additional features important to a mature service include the following:
Pseudonymity enables the veil of anonymity to be pierced. It therefore provides a context within which conventional forms of accountability can be sustained, and balance between privacy and accountability interests achieved. It is therefore appropriately described as privacy-sympathetic technology.
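The essential mechanism can be sketched as follows. This is a hypothetical outline, not a specification: the class name, the escrowed mapping, and the single authorisation token stand in for what would in practice be legally and cryptographically enforced disclosure conditions.

```python
# Hypothetical sketch of a pseudonymity service: the nym-to-identity
# mapping is held in escrow and disclosed only when a defined
# authorisation test is satisfied (here crudely modelled as a token).

class PseudonymService:
    def __init__(self):
        self._mapping = {}                 # nym -> real identity, in escrow

    def register(self, nym, identity):
        self._mapping[nym] = identity

    def resolve(self, nym, authorisation):
        """Pierce the veil only under the defined circumstances."""
        if authorisation != "court-order":
            raise PermissionError("disclosure conditions not met")
        return self._mapping[nym]

svc = PseudonymService()
svc.register("night-owl", "R. Citizen")
```

Ordinary correspondents deal only with the nym; the balance with accountability lies in the narrowness and enforceability of the disclosure condition.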
Privacy-sympathetic technology may also be designed to work in an entirely identified context, by protecting personal data.
The risks of data being intercepted by unauthorised parties can be addressed by the application of symmetric cryptography to channel protection, commonly using SSL; although key management remains a weakness.
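In today's terms the channel protection described here is provided by TLS, the successor to SSL, in which an asymmetric handshake establishes the symmetric session keys. A minimal client-side sketch using Python's standard `ssl` module (which post-dates this document) is given below; the host name is a placeholder:

```python
import ssl

# Sketch of channel protection: TLS negotiates symmetric session keys
# for the channel. The default context enables certificate checking;
# we additionally refuse legacy protocol versions.

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# A client would then wrap its socket, e.g.:
#   with socket.create_connection(("example.org", 443)) as raw:
#       with context.wrap_socket(raw, server_hostname="example.org") as tls:
#           ...   # traffic on `tls` is now encrypted end-to-end
```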
Security of personal data in storage on server-machines and client-machines, and during processing by server-software and client-software, is the subject of a great deal of attention. Computer, server and client security products are available which can provide comprehensive services, and high levels of protection. In addition to investment, however, they require levels of competence and concentration that appear to be well beyond the vast majority of organisations.
Asymmetric cryptography in principle enables the authentication of sender and receiver to be performed, and the repudiation risks to be addressed. In practice, it has become clear that a very substantial infrastructure is needed before the risk is comprehensively reduced.
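The principle can be illustrated with textbook RSA on deliberately tiny numbers. This is a toy for exposition only: real deployments require large keys, padding schemes, and precisely the supporting infrastructure whose difficulties are discussed in the text.

```python
# Toy RSA illustration (never use numbers this small in practice).
# Only the key-holder can produce the signature; anyone holding the
# public key can verify it, which is the basis of authentication and
# non-repudiation.

p, q = 61, 53
n = p * q                              # public modulus
e = 17                                 # public exponent
d = pow(e, -1, (p - 1) * (q - 1))      # private exponent (Python 3.8+)

def sign(digest, d, n):
    return pow(digest, d, n)           # requires the private key

def verify(digest, signature, e, n):
    return pow(signature, e, n) == digest   # needs only the public key
```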
Public key infrastructure can take a variety of forms, but most of them (and all of the obvious ones) are themselves highly privacy-intrusive (Greenleaf & Clarke 1997). Securing each person's private key in such a manner that they (and only they) can apply it, and can do so with relative ease, is a considerable challenge. Some of the schemes proposed involve highly intrusive applications of biometrics. Schemes proposed to date fail to measure up against the requirements (Clarke 1998f).
When data is exposed to a party, however, it is important that trust exist in the practices that the other party will apply to that data. One aspect of this is the application of appropriate security precautions on the other party's machines and network. Another aspect is confidence concerning the uses to which the data will and will not be put, and the circumstances under which it may be disclosed to some other party.
A number of schemes have been launched that are based on privacy statements and trademarks. Examples of such pseudo-technologies include TRUSTe, WebTrust and CPA WebTrust, and the Better Business Bureau. Their primary purpose is to project an image of privacy protection; in practice they contribute little.
Infrastructures can be conceived that are privacy-sympathetic, i.e. that embody an appropriate balance between privacy and accountability. In the best of all possible worlds, that would begin at TCP/IP, and even the Layer 1 protocols, and work upwards, in a coherent fashion. Even ISO OSI, however, strongly architected though it was, lacked even a security thread through its Layers, let alone evidence of embedded privacy-awareness. The result is that Internet-based infrastructure must have privacy-sympathy grafted onto it, and gradually integrated into it.
The most developed of the frameworks is W3C's Platform for Privacy Preferences (P3P). This creates the possibility of standards-based protections for personal data that is resident on clients, and/or held by intermediaries; and enables programmatic negotiation between server and client (Cranor 1998, Clarke 1998d, 1998e).
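The kind of programmatic negotiation P3P enables can be sketched as a preference-matching test performed on the client before personal data is released. The field names below are illustrative assumptions, not the actual P3P vocabulary:

```python
# Hypothetical sketch of client-side policy matching in the spirit of
# P3P: compare a site's declared practices with the user's limits.

def acceptable(policy, preferences):
    """True only if every declared practice falls within the user's limits."""
    return (set(policy["purposes"]) <= set(preferences["allowed_purposes"])
            and policy["retention_days"] <= preferences["max_retention_days"]
            and (preferences["allow_disclosure"]
                 or not policy["third_party_disclosure"]))

site_policy = {"purposes": ["order-fulfilment"],
               "retention_days": 30,
               "third_party_disclosure": False}
user_prefs = {"allowed_purposes": ["order-fulfilment", "support"],
              "max_retention_days": 90,
              "allow_disclosure": False}
```

A user agent applying such a test could withhold data, warn the user, or negotiate a narrower policy with the server.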
An appreciation of developments in privacy protection regimes provides context to, and motivation for, the conception and realisation of privacy-sympathetic and privacy-enhancing technologies.
This list comprises works by the presenter. Many of these contain references to other literature. In addition, the following is a set of references to the international and Australian literatures:
Clarke R. (1998) 'A History of Privacy in Australia: References', December 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/OzHR.html
Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988), at http://www.anu.edu.au/people/Roger.Clarke/DV/CACM88.html
Clarke R. (1989) 'The OECD Data Protection Guidelines: A Template for Evaluating Information Privacy Law and Proposals for Information Privacy Law', April 1989, at http://www.anu.edu.au/people/Roger.Clarke/DV/PaperOECD.html
Clarke R. (1994a) 'The Digital Persona and its Application to Data Surveillance', The Information Society 10, 2 (June 1994), at http://www.anu.edu.au/people/Roger.Clarke/DV/DigPersona.html
Clarke R. (1994b) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7,4 (December 1994), at http://www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html
Clarke R. (1996a) 'Reference List: Surveys of Privacy Attitudes', March 1996, at http://www.anu.edu.au/people/Roger.Clarke/DV/Surveys.html
Clarke R. (1996b) 'Trails in the Sand' (May 1996), at http://www.anu.edu.au/people/Roger.Clarke/DV/Trails.html
Clarke R. (1996c) 'Privacy and Dataveillance, and Organisational Strategy', Proc. Conf. EDPAC'96, Perth, 28 May 1996, at http://www.anu.edu.au/people/Roger.Clarke/DV/PStrat.html
Clarke R. (1996d) 'Identification, Anonymity and Pseudonymity in Consumer Transactions: A Vital Systems Design and Public Policy Issue', Conference on 'Smart Cards: The Issues', Sydney, 18 October 1996, at http://www.anu.edu.au/people/Roger.Clarke/DV/AnonPsPol.html
Clarke R. (1997a) 'Cookies', February 1997, at http://www.anu.edu.au/people/Roger.Clarke/II/Cookies.html
Clarke R. (1997b) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' August 1997, at http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html
Clarke R. (1997c) 'Privacy On the Internet: Threats, Countermeasures and Policy', Invited Address to the IBC 1997 Australian Privacy Forum, Gazebo Hotel, Sydney, 21-22 October 1997, at http://www.anu.edu.au/people/Roger.Clarke/DV/Internet.html
Clarke R. (1998a) 'Direct Marketing and Privacy', Proc. AIC Conf. on the Direct Distribution of Financial Services, Sydney, 24 February 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/DirectMkting.html
Clarke R. (1998b) 'Information Privacy On the Internet: Cyberspace Invades Personal Space' Telecommunication Journal of Australia 48, 2 (May/June 1998), at http://www.anu.edu.au/people/Roger.Clarke/DV/IPrivacy.html
Clarke R. (1998c) 'Submission to the Senate Legal and Constitutional References Committee Inquiry Into Privacy and the Private Sector', 7 July 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/SLCCPte.html, plus a 'Supplementary Submission', 4 August, at http://www.anu.edu.au/people/Roger.Clarke/DV/SLCCPteSupp.html
Clarke R. (1998d) 'Platform for Privacy Preferences: An Overview' (April 1998), Privacy Law & Policy Reporter 5, 2 (July 1998) 35-39, at http://www.anu.edu.au/people/Roger.Clarke/DV/P3POview.html
Clarke R. (1998e) 'Platform for Privacy Preferences: A Critique' (April 1998), Privacy Law & Policy Reporter 5, 3 (August 1998) 46-48, at http://www.anu.edu.au/people/Roger.Clarke/DV/P3PCrit.html
Clarke R. (1998f) 'Public Key Infrastructure: Position Statement', May 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/PKIPosn.html
Clarke R. (1998g) 'A History of Privacy in Australia: Context', October 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/OzHC.html
Clarke R. (1999a) 'Key Issues in Electronic Commerce and Electronic Publishing', Proc. Conf. Information Online and On Disc 99, Sydney, January 1999, at http://www.anu.edu.au/people/Roger.Clarke/EC/Issues98.html
Clarke R. (1999b) 'Internet Privacy Concerns Confirm the Case for Intervention', Communications of the ACM 42, 2 (February 1999) 60-67, at http://www.anu.edu.au/people/Roger.Clarke/DV/CACM99.html
Clarke R. (1999c) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report' Forthcoming, Proc. 12th Int'l Conf. on Electronic Commerce, Bled, Slovenia, 8-9 June 1999, at http://www.anu.edu.au/people/Roger.Clarke/EC/WillPay.html
Clarke R. (1999d) 'Anonymous, Pseudonymous and Identified Transactions: The Spectrum of Choice' Forthcoming, Proc. Conf., User Identification & Privacy Protection, Stockholm, 14-15 June 1999, Extended Abstract at http://www.anu.edu.au/people/Roger.Clarke/DV/UIPP99EA.html, paper forthcoming at http://www.anu.edu.au/people/Roger.Clarke/DV/UIPP99.html
Greenleaf G.W. & Clarke R. (1997) 'Privacy Implications of Digital Signatures', Invited Address, IBC Conference on Digital Signatures, Sydney, 12 March 1997, at http://www.anu.edu.au/people/Roger.Clarke/DV/DigSig.html
Created: 29 March 1999
Last Amended: 12 April 1999