Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Version of 13 October 1996
© Xamax Consultancy Pty Ltd, 1996
Available under an AEShareNet licence
Invited Presentation to the Conference on 'Smart Cards: The Issues', Sydney, 18 October 1996
This document is at http://www.rogerclarke.com/DV/AnonPsPol.html
Applications of smart cards tend to convert anonymous transactions into identified ones. This represents a very significant increase in the data trails that people leave behind them, and hence in the privacy-invasiveness of systems. There is a strong likelihood of a public backlash against such schemes, which may take such forms as their outright rejection by consumers, widespread adoption of multiple identities, or rampant obscuration and falsification of personal data.
Smart cards offer systems designers a great deal of flexibility. That flexibility can and should be used to ensure that the privacy-abusive potential of card-based schemes is not realised. This short paper outlines techniques whereby anonymity and pseudonymity can be achieved. It argues for maximum use of anonymity in schemes, for very careful justification of direct identification, and for maximum use of indirect identification.
The presumption is too often made that transactions must be either identified or anonymous. Payment schemes that involve identification of the parties generate audit trails and hence enable some degree of traceability; but that rich trail of data represents a threat to security and privacy. Anonymous schemes, on the other hand, protect the parties' identities, but work against such interests as law enforcement.
There is an intermediate alternative which, carefully applied, enables the various interests to be balanced. 'Pseudonymous' or 'indirectly identified' transactions involve the recording of a pseudo-identifier. The cross-index between the pseudo-identifier and the real identifier is protected by appropriate technical, organisational and legal measures.
This paper provides background to pseudonymity, and argues that the technique should be used to address the vital policy issues raised by smart-card, biometric, and networking technologies.
An identified transaction is one in which the data can be readily related to a particular individual. This may be because it carries a direct identifier of the person concerned, or because it contains data which, in combination with other available data, links the data to a particular person.
There are some kinds of transactions that cannot reasonably be performed unless the parties' identities are disclosed and recorded. Examples include:
There are some further kinds of transactions which cannot reasonably be performed unless the parties' identities are disclosed, but where the identity need not be recorded. For example, where a person collects a privacy-sensitive document (such as medical information), or a token intended to serve as evidence of identity (such as a passport), it is appropriate for some evidence of identity to be provided. It may be entirely sufficient, however, to record merely the fact that evidence of identity was produced, rather than the specifics of that evidence (such as the person's driving licence number).
Where transaction data is retained in identified form, it can result in a collection of data that reveals a great deal about the individual and their behaviour. Such 'data trails' may be used to trace back over a person's past, or analysed to provide an abstract model of the person, or 'digital persona'. This digital persona is then used by government agencies as a means of social control, and by marketers as a means of identifying prospects. Descriptions of data trails can be found in Clarke (1996a), and of the digital persona in Clarke (1994a).
Detailed examination of the techniques of human identification, and discussion of the associated policy issues, are provided in Clarke (1994b).
One area in which the tendency towards tighter identification requirements is especially evident is electronic commerce (EC) and electronic services delivery (ESD), particularly over the public Internet. It is argued that, in environments in which business is conducted without ever seeing the other party, much greater care must be taken in authenticating that party's identity. Authentication is being addressed through the technique of the high-integrity 'digital signature'. An introduction to these concepts is provided in Clarke (1996b), and an assessment of the current state of play in Clarke (1996c).
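By way of illustration only, the following sketch shows a digital signature being created and verified. It assumes the third-party Python 'cryptography' package and an Ed25519 key-pair; it is not a prescription for any particular scheme, and says nothing about how the signer's public key is certified.

```python
# Sketch of digital-signature creation and verification (assumes the
# third-party 'cryptography' package; key distribution and certification
# are out of scope here).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # held by the transacting party
public_key = private_key.public_key()        # made available to verifiers

message = b"order: 100 units, account 42"
signature = private_key.sign(message)        # binds the message to the key-holder

try:
    public_key.verify(signature, message)    # raises if message or signature altered
    print("signature verifies")
except InvalidSignature:
    print("signature does not verify")
```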
Anonymity refers to the complete absence of identification data in a transaction. The key characteristic of an anonymous transaction is that the specific identity of one or more of the parties to the transaction cannot be extracted from the data itself, nor by combining the transaction with other data.
Some examples of non-identified, anonymous transactions include:
People desire anonymity for a variety of reasons. Some of these are of dubious social value: avoiding detection of their whereabouts in order to escape responsibilities such as paying debts and supporting the children of a broken marriage; avoiding retribution for financial fraud; and obscuring the flow of funds arising from illegal activities such as theft, drug-trading and extortion (commonly referred to as 'money-laundering').
Other reasons for seeking anonymity are of arguably significant social value. Examples include:
It is often blithely assumed that the interests of parties to a transaction cannot be protected if the transaction is conducted anonymously. This assumption is sometimes correct, but not always so.
An important example of a technique that can be used to protect people's interests is 'eligibility authentication'. This involves the provision of evidence of some attribute of the individual (such as their age, or their club or association membership, or their eligibility for a concession). This is distinguished from 'user authentication', which necessarily involves the provision of evidence of the identity of the individual.
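To make the distinction concrete, the following sketch (in Python, with all names and the shared-key arrangement assumed purely for the purpose of the example) shows a verifier checking an attribute claim such as 'age>=18' without any identifier of the holder being presented or recorded.

```python
# Sketch of eligibility authentication: the token attests to an attribute,
# not to an identity. The shared issuer/verifier key is a simplification
# assumed for this example only.
import hmac, hashlib

ISSUER_KEY = b"issuer-secret-key"  # assumed: held by the card issuer and the verifier

def issue_eligibility_token(attribute: str) -> bytes:
    """Issuer attests to a bare attribute, e.g. 'age>=18' or 'member:club-x'.
    No identifier of the holder is embedded in the token."""
    return hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).digest()

def verify_eligibility(attribute: str, token: bytes) -> bool:
    """The merchant learns only that the bearer holds the claimed attribute."""
    expected = hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)

token = issue_eligibility_token("age>=18")
assert verify_eligibility("age>=18", token)   # eligible, yet anonymous
```

User authentication, by contrast, would require the token to carry, and the verifier to check, an identifier of the individual.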
It is commonly assumed that tension exists between the proponents of all transaction data being identified (typified by the presumption that "the only people who want privacy are the ones with something to hide"), and the adherents to the view that all data is private. In fact, another alternative exists which can be applied to address the desires of both sides.
A pseudonym is an identifier for a party to a transaction, which is not, in the normal course of events, sufficient to associate the transaction with a particular human being. Hence a transaction is pseudonymous in relation to a particular party if the transaction data contains no direct identifier for that party, and can only be related to them in the event that a very specific piece of additional data is associated with it. The data may, however, be indirectly associated with the person, if particular procedures are followed.
There are several ways in which the requirements of pseudonymity can be implemented. One is the storage of partial identifiers by two or more organisations (e.g. two halves of a person's driving licence number). Both would have to provide their portions of the transaction trail in order for the identity of the party to be reconstructed.
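A minimal sketch of such a split follows (in Python; the XOR-based sharing is one possible realisation, assumed here for illustration rather than drawn from any particular scheme).

```python
# Sketch of a partial-identifier split between two custodian organisations:
# neither share reveals anything on its own; both are needed to reconstruct.
import secrets

def split_identifier(identifier: str) -> tuple[bytes, bytes]:
    """Produce two shares of a direct identifier for separate custodians."""
    data = identifier.encode()
    share_a = secrets.token_bytes(len(data))                # random share for organisation A
    share_b = bytes(x ^ y for x, y in zip(share_a, data))   # complementary share for organisation B
    return share_a, share_b

def recover_identifier(share_a: bytes, share_b: bytes) -> str:
    """Only when both organisations co-operate can the identity be rebuilt."""
    return bytes(x ^ y for x, y in zip(share_a, share_b)).decode()

share_a, share_b = split_identifier("LICENCE-1234567")
assert recover_identifier(share_a, share_b) == "LICENCE-1234567"
```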
A more common way is to apply the concept of 'identity escrow'. This involves the following elements:
The trusted third party must maintain a cross-index between the indirect identifier and the person's real identity. It must apply appropriate technical and organisational security measures, and divulge the link only in circumstances specified under legal authority, such as contract, legislation, search warrant or court order.
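As a sketch of how such an arrangement might operate (in Python, with the class, the set of recognised authorities and the registration interface all assumed for the purpose of the example), consider:

```python
# Sketch of identity escrow: a trusted third party keeps the cross-index and
# divulges a linkage only under a specified legal authority. All names here
# are assumptions made for illustration.
import secrets

class IdentityEscrowAgent:
    LEGAL_AUTHORITIES = {"contract", "legislation", "search warrant", "court order"}

    def __init__(self) -> None:
        self._cross_index: dict[str, str] = {}   # indirect identifier -> real identity

    def register(self, real_identity: str) -> str:
        """Issue an indirect identifier; the linkage stays with the agent alone."""
        pseudonym = secrets.token_hex(8)
        self._cross_index[pseudonym] = real_identity
        return pseudonym

    def divulge(self, pseudonym: str, authority: str) -> str:
        """Reveal the real identity only in circumstances specified under legal authority."""
        if authority not in self.LEGAL_AUTHORITIES:
            raise PermissionError("linkage may not be divulged without legal authority")
        return self._cross_index[pseudonym]

agent = IdentityEscrowAgent()
pid = agent.register("Jane Citizen")
# Transaction records carry only pid; re-identification requires, e.g., a court order.
assert agent.divulge(pid, "court order") == "Jane Citizen"
```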
Such mechanisms already exist in a variety of settings; for example, epidemiological research in the health-care and social-science arenas needs longitudinal data, including demographic data about the individuals concerned, but does not necessarily need to know their identities: a pseudo-identity is sufficient.
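One simple way of deriving such a pseudo-identity (sketched in Python; the keyed-hash approach and the custodian arrangement are assumptions for illustration, not a description of any existing register) is for a data custodian to apply a keyed hash to the real identifier, so that records about the same person remain linkable over time while researchers never hold the identity itself.

```python
# Sketch of pseudonymisation for longitudinal research data: a keyed hash
# gives a stable pseudo-identifier; without the custodian's key the mapping
# cannot be reversed or recomputed.
import hmac, hashlib

CUSTODIAN_KEY = b"custodian-secret"   # assumed: held only by the data custodian

def pseudo_identity(real_identifier: str) -> str:
    """Same person -> same pseudo-identifier; not reversible by the researcher."""
    return hmac.new(CUSTODIAN_KEY, real_identifier.encode(), hashlib.sha256).hexdigest()[:16]

research_record = {
    "pseudo_id": pseudo_identity("HEALTH-ID-98765"),
    "year": 1996,
    "diagnosis_code": "J45",   # asthma, by way of example
}
```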
Another example is the 'anonymous re-mailer', which enables individuals to obscure their identities when they send email messages, by filtering them through a service which undertakes to protect the linkage between real and nominal identity. Such undertakings can provide an iron-clad guarantee of anonymity only if the service-operator and its clients forgo a transaction trail, and thereby any form of traceability. In many cases, however, a transaction trail is likely to be maintained, and to be subject to, for example, court orders, search warrants and subpoenas; the messages are therefore pseudonymous rather than anonymous.
There are also applications in the area of financial services. For example, buyers and sellers on exchanges which deal in stocks, shares, financial derivatives and foreign currencies do not, and do not need to, know the identity of the other party to the transaction.
In addition, financial institutions in some countries protect the identities of companies and individuals that hold deposits with them, or undertake transactions through them. A noteworthy element of recent public discussions about Swiss banking secrecy is that the original intention (the protection of Jews who broke German law of the 1930s by depositing value in Swiss banks) has been subverted, and the technique applied to other purposes, without any adjustment to the checks and balances within the system. The need is for both identity-protection methods and the means to override those protections when the public interest demands it.
Innovative mechanisms which have been developed to serve the interests of the wealthy are capable of adaptation to the needs of people generally.
This paper argues the following:
Clarke R. (1994a) 'The Digital Persona and its Application to Data Surveillance', The Information Society 10, 2 (June 1994), at http://www.rogerclarke.com/DV/DigPersona.html
Clarke R. (1994b) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues', Information Technology & People 7, 4 (December 1994) 6-37, at http://www.rogerclarke.com/DV/HumanID.html
Clarke R. (1995) 'When Do They Need to Know 'Whodunnit?' The Justification for Transaction Identification; The Scope for Transaction Anonymity and Pseudonymity', Proc. Conf. Computers, Freedom & Privacy, San Francisco, 31 March 1995, at http://www.rogerclarke.com/DV/PaperCFP95
Clarke R. (1996a) 'Trails in the Sand', May 1996, at http://www.rogerclarke.com/DV/Trails.html
Clarke R. (1996b) 'Cryptography in Plain Text', Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 24-27, at http://www.rogerclarke.com/II/CryptoSecy.html
Clarke R. (1996c) 'Crypto-Confusion: Mutual Non-Comprehension Threatens Exploitation of the GII', Privacy Law & Policy Reporter 3, 2 (May 1996), pp. 30-33, at http://www.rogerclarke.com/II/CryptoConf.html