Principal, Xamax Consultancy Pty Ltd, Canberra
Visiting Fellow, Department of Computer Science, Australian National University
Version of 30 July 2002
© Xamax Consultancy Pty Ltd, 2002
Statement for a Panel Session on 'Understanding e-Business: Can we remain anonymous in the marketplace?' 24th International Conference of Data Protection & Privacy Commissioners, Cardiff UK, 9-11 September 2002
This document is at http://www.anu.edu.au/people/Roger.Clarke/DV/AnonDPPC02.html
The accompanying PowerPoint slide-set is at http://www.anu.edu.au/people/Roger.Clarke/DV/AnonDPPC02.ppt (500KB)
Three panellists, invited and chaired by Mrs Lynn Keig, Isle of Man Data Protection Registrar, were asked to address the following topic:
"Identification and authentication are as important to business as to government. Businesses need to know who they are dealing with, whether those people have authority for particular transactions, what the risks are in dealing with a particular individual and how those risks can be minimised.
"In the e-commerce world, the emphasis is placed on secure transactions, but security is only part of the answer. Limiting the use of identity is equally important. Can we achieve the same anonymity in the world of e-commerce as when we buy goods for cash at a street market? Do we want to?".
Claims that consumer identity is necessary in all cyberspace transactions are based on two myths: that it is sellers who carry the risk in consumer transactions, and that the authentication that matters is authentication of the consumer's identity.
The first is a myth because sellers use their market power to require consumers to pay before delivery, and hence in the majority of circumstances it is consumers, not sellers, who carry the transaction risk. The second is a myth because there are many different kinds of assertion that may need to be authenticated, and assertions about identity are seldom as important as assertions about the consumer's attributes and about the value of the consideration being offered.
Anonymity is appropriate in many circumstances, and can be achieved in most circumstances without exposing marketers to transaction risks. (It does of course reduce their capacity to collect data about consumers without express consent). Persistent nyms, i.e. identifiers that cannot be readily related back to individuals, need to be implemented as a means of balancing the interests of consumers and marketers.
Unlike anonyms, pseudonyms can sustain accountability, provided that they enable the veil to be penetrated, but are subject to legal, organisational and technical protections that are together strong enough to withstand abuse by powerful government agencies and corporations. Data Protection and Privacy Commissioners need to deny marketer claims that they need consumers' identities, and to place much more emphasis on research and development in relation to nymous transaction mechanisms.
In every kind of marketplace, many risks arise. For example, the other party might fail to fulfil its part of the bargain, the marketplace operator might not do its job, or delivery services, banking or insurance might break down, the item might not be of the expected quality or might be delivered late, and the consideration offered might not be worth what it was supposed to be.
A common explanation for the slower-than-expected uptake of eCommerce is that marketspaces involve even more risks, and even more serious risks. The confidence that arises from meeting people in the flesh, and from seeing their real-world foot-print, is argued to be missing from cyberspace transactions. Hence, so the popular wisdom goes, we all have to abandon anonymity, yield up our identifiers, and submit to intrusive identity authentication procedures (Clarke 2000b).
There's some truth in the assertion that consumers are at greater risk in eCommerce transactions than they are in the physical world. That's because sellers use their market power to force consumers to pay first. The primary risks of non-performance and malperformance therefore fall on the consumer. Hence, in most kinds of transactions, sellers are not exposed to risk if the consumer declines to identify themselves or uses an identifier that cannot readily be linked to a real person.
Marketers have been guilty of a confidence trick, and the proposition in the preamble to this Panel Session that "businesses need to know who they are dealing with" needs to be taken with a sackful of salt. Marketers are certainly motivated to acquire consumers' identities, but risk management is not the reason. Marketers want consumers' identities in order to acquire data-trails, correlate data-trails, build profiles, reduce the costs of marketing communications, increase their sales-to-ads ratio, and manipulate consumer behaviour (Clarke 1998a).
Linked with this gross misunderstanding is the serious misapprehension that 'authentication' is inherently about identity. It's not. Authentication is the confirmation of an assertion, and assertions are of many kinds (Clarke 2001g).
One important kind of assertion is a person's claim to have a particular attribute, as arises with trade-sales (is that person really a plumber?), membership-based discounts, and age-restricted goods and services. The identity of the consumer is not the issue: a card may carry a person's identifier as a means of relating the card to the person, but the information that's relevant to the transaction is the attribute that the card reveals, not the identifier.
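To make the distinction concrete, here is a minimal sketch in Python of a transaction decision that rests on an attribute assertion alone. The names (issue_attribute_token, seller_accepts) are illustrative, HMAC stands in for the issuer's digital signature, and a real scheme would give the verifier the issuer's public key rather than a shared secret; it is not a description of any deployed credential system.

```python
# A sketch only: the seller's decision depends on a vouched-for attribute,
# never on an identifier for the consumer.
import hashlib
import hmac

ISSUER_KEY = b"issuer-secret"  # hypothetical key held by the attribute issuer


def issue_attribute_token(attribute: str) -> dict:
    """Issuer vouches for an attribute (e.g. 'age>=18') without naming the holder."""
    tag = hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).hexdigest()
    return {"attribute": attribute, "tag": tag}


def seller_accepts(token: dict, required_attribute: str) -> bool:
    """The seller checks the assertion it actually needs; no identifier is involved."""
    expected = hmac.new(ISSUER_KEY, token["attribute"].encode(),
                        hashlib.sha256).hexdigest()
    return (token["attribute"] == required_attribute
            and hmac.compare_digest(token["tag"], expected))


token = issue_attribute_token("age>=18")
print(seller_accepts(token, "age>=18"))  # True: the sale proceeds, identity unknown
```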
A particular kind of attribute assertion relates to principal-agent relationships, and the nature and extent of the delegation that the agent carries. This arises in consumer transactions where, for example, a parent transacts on behalf of a child, or a solicitor on behalf of a client.
But the primary form of assertion that's important in marketspaces is, as it always was in marketplaces, that 'the consideration I'm offering has a particular value'. Value authentication used to involve testing the give of a coin against the teeth; then the recipient needed to look for the metal strip in the banknote; and later it became necessary to compare the cheque against the cheque guarantee card. In contemporary shops and eCommerce alike, it involves the clearance of payment-card details against the entries on a remote database.
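The same point can be made about value authentication with a toy sketch, assuming a prepaid, bearer-style instrument and a hypothetical in-memory stand-in for the issuer's remote database. The check concerns the worth of the consideration, not the identity of the person tendering it.

```python
# A toy issuer-side ledger, keyed by an opaque instrument number that says
# nothing about the holder's identity. Purely illustrative values.
LEDGER = {"instrument-7731": 150.00}


def authenticate_value(instrument_id: str, amount: float) -> bool:
    """Confirm the assertion 'this instrument is good for at least `amount`'."""
    return LEDGER.get(instrument_id, 0.0) >= amount


def settle(instrument_id: str, amount: float) -> bool:
    """Debit the instrument only if the value assertion holds."""
    if not authenticate_value(instrument_id, amount):
        return False
    LEDGER[instrument_id] -= amount
    return True


print(settle("instrument-7731", 49.95))  # True: payment clears; the buyer stays anonymous
```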
Marketers gain great benefits from habituating consumers to yielding up their identity in the form of debit-, credit- and 'loyalty'-cards. They have successfully held off anonymous forms of e-payment such as Chaumian Digicash, and have avoided implementing pseudonymous credit-card schemes, leaving e-consumers no payment option other than using identified credit-cards (Clarke 1995).
There are a few specific circumstances in which the protection of a corporation's interests is challenging unless its customer's identity is available to it, for example where credit is extended to the customer, or where legal recourse against the particular individual may be needed.
For the large majority of transactions, however, there is no intrinsic justification for consumer identity to have to be disclosed. It is merely a convenience to marketers.
Much more attention needs to be paid to the alternatives to identification (Clarke 1994, 1999b). Anonymity precludes the association of data or a transaction with a particular person. Anonymity makes conservative people nervous, and liberal people wary, because it undermines one of the main ways in which accountability is achieved. The argument needs to be moved on, well beyond the trivial "anonymity = unaccountability and is used by crooks, so it's evil" level of understanding. Anonymity will continue to exist, and has important functions as well as offering scope for the criminal. It must and will survive the assault by the national security and law enforcement lobbies that was launched on 12 September 2001.
Pseudonymity, unlike anonymity, does not entirely preclude the linking of data and actions to an individual. Instead, it creates barriers that are significant, but penetrable. For effective pseudonymity to be delivered, it must support persistence (i.e. the nym must be able to be used on an ongoing basis, such that new transactions can be linked to existing data). But, to be credible, pseudonymity must also provide a raft of protections that are over-ridable for the right reasons, but not merely through the exercise of power: the conditions for piercing the veil must be set down in law, and backed by organisational and technical safeguards strong enough to withstand abuse by powerful government agencies and corporations.
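The following is a minimal sketch of how a persistent nym can work, assuming a hypothetical trusted intermediary that alone holds the linking key: the nym is stable for a given merchant, so an ongoing relationship is possible, but merchants cannot correlate nyms across organisations, and re-identification requires the intermediary to apply its key under whatever legal and organisational controls govern it. All names and values are illustrative.

```python
import hashlib
import hmac

# The linking key is the 'veil': only the intermediary holds it, and it is
# applied only under the legal, organisational and technical protections
# discussed above.
LINKING_KEY = b"held-only-by-the-intermediary"


def persistent_nym(person_id: str, merchant_id: str) -> str:
    """Derive a stable, merchant-specific pseudonym for the person."""
    msg = f"{person_id}|{merchant_id}".encode()
    return hmac.new(LINKING_KEY, msg, hashlib.sha256).hexdigest()[:16]


# The nym persists across transactions with one merchant ...
assert persistent_nym("jane", "books.example") == persistent_nym("jane", "books.example")
# ... but differs between merchants, so their data cannot be correlated,
# and without the linking key no merchant can recover the person behind it.
assert persistent_nym("jane", "books.example") != persistent_nym("jane", "shoes.example")
print(persistent_nym("jane", "books.example"))
```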
The lack of attention that has been given to pseudonymity to date is a symptom of the utter inadequacy of the 'fair information practices' (FIPs) model, as originated by Westin at the end of the 1960s, expressed in the OECD Guidelines in 1980, and extended by the EU Directive of 1995. FIP(s) was conceived as minimalist regulation, business- and government-friendly, sufficient to quieten consumer and citizen concern. It was codified in order to avoid inconvenience to corporations that operate internationally, and it addressed the IT of the 1970s. For a comprehensive assessment of the inadequacies of FIPs, see Clarke (2000a).
It has been clear for a long time that identity is central to dataveillance techniques (Clarke 1988). The threats have become progressively more serious since then, with the dramatic growth in dataveillance technologies (Clarke 2001a), and particularly in person location and tracking technologies (Clarke 1999c) and biometrics (Clarke 2001d, 2002a, 2002b). It is crucial to privacy protection that organisations be denied ready access to personal data, and the first step is to deny the association of identifiers with data at every opportunity, and to permit it only where the justification outweighs the risks.
Businesses might want to know who they're dealing with, but in most circumstances they don't need to know. They have succeeded in contriving arrangements whereby consumers are forced to provide an identifier, and those contrivances must be exposed, and practices changed.
Consumer anonymity is highly desirable, in order to protect against the exercise of market power by marketers. Where ongoing relationships need to be supported, persistent nyms can be used, which may be either entirely anonymous or pseudonymous. Multiple identifiers or nyms are highly desirable, in order to prevent marketers from breaking through to the underlying person, and to prevent them from correlating data from multiple sources.
Data Protection and Privacy Commissioners must join with privacy advocates in denying corporations the use of specious arguments for mandatory consumer identification as a pre-condition for the conduct of e-commerce. They must also strongly support the availability of consumer anonymity and effective pseudonymity.
I've drawn attention to the problems of unjustified demands for consumers' identity in a succession of papers commencing with Clarke (1997). See also Clarke (1998b) and Clarke (2001b, 2001e and 2001f).
There is an index to my papers on identification, anonymity and pseudonymity topics. Probably the most usable expression of the analysis is in Clarke (1999b).
A summary of my position on trust is in Clarke (2000b). The most compact form of my analysis of appropriate authentication models is at Clarke (2001g).
Clarke R. (1988) 'Information Technology and Dataveillance', Commun. ACM 31, 5 (May 1988) 498-512, at http://www.anu.edu.au/people/Roger.Clarke/DV/CACM88.html
Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7,4 (December 1994), at http://www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html
Clarke R. (1995) 'Net-Based Payment Schemes' February 1995, at http://www.anu.edu.au/people/Roger.Clarke/EC/EPMEPM.html
Clarke R. (1997) 'Promises and Threats in Electronic Commerce' August 1997, at http://www.anu.edu.au/people/Roger.Clarke/EC/Quantum.html
Clarke R. (1998a) 'Direct Marketing and Privacy', Proc. AIC Conf. on the Direct Distribution of Financial Services, Sydney, 24 February 1998, at http://www.anu.edu.au/people/Roger.Clarke/DV/DirectMkting.html
Clarke R. (1998b) 'Information Privacy On the Internet: Cyberspace Invades Personal Space' Telecommunication Journal of Australia 48, 2 (May/June 1998), at http://www.anu.edu.au/people/Roger.Clarke/DV/IPrivacy.html
Clarke R. (1999a) 'Privacy-Enhancing and Privacy-Sympathetic Technologies - Resources', April 1999, at http://www.anu.edu.au/people/Roger.Clarke/DV/PEPST.html
Clarke R. (1999b) 'Anonymous, Pseudonymous and Identified Transactions: The Spectrum of Choice', Proc. IFIP User Identification & Privacy Protection Conference, Stockholm, June 1999, at http://www.anu.edu.au/people/Roger.Clarke/DV/UIPP99.html
Clarke R. (1999c) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. Data Protection & Privacy Commissioner's Conference, Hong Kong, September 1999, and Information Technology & People 14, 2 (Summer 2001) 206-231, at http://www.anu.edu.au/people/Roger.Clarke/DV/PLT.html
Clarke R. (2000a) 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century' January 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/PP21C.html
Clarke R. (2000b) 'Trust in the Context of e-Business' July 2000. An edited version was published in the Internet Law Bulletin 4, 5 (February 2002) 56-59. At http://www.anu.edu.au/people/Roger.Clarke/EC/Trust.html
Clarke R. (2001a) 'While You Were Sleeping ... Surveillance Technologies Arrived' Australian Quarterly 73, 1 (January-February 2001), at http://www.anu.edu.au/people/Roger.Clarke/DV/AQ2001.html
Clarke R. (2001b) 'Of Trustworthiness and Pets: What Lawyers Haven't Done for e-Business' February 2001, Proc. 5th Biennial Pacific Rim Computer Law Conference, February 2001, at http://www.anu.edu.au/people/Roger.Clarke/EC/PacRimCL01.html
Clarke R. (2001c) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001) 181-183, 188, at http://www.anu.edu.au/people/Roger.Clarke/DV/PITsPETs.html
Clarke R. (2001d) 'Biometrics and Privacy' April 2001, at http://www.anu.edu.au/people/Roger.Clarke/DV/Biometrics.html
Clarke R. (2001e) 'Trust in Cyberspace: What eCommerce Doesn't Get', U.N.S.W. Continuing Legal Education Seminar, May 2001, at http://www.anu.edu.au/people/Roger.Clarke/EC/TrustCLE01.html
Clarke R. (2001f) 'Privacy as a Means of Engendering Trust in Cyberspace' UNSW L. J. 24, 1 (June 2001), at http://www.anu.edu.au/people/Roger.Clarke/DV/eTrust.html
Clarke R. (2001g) 'Authentication: A Sufficiently Rich Model to Enable e-Business' December 2001, at http://www.anu.edu.au/people/Roger.Clarke/EC/AuthModel.html
Clarke R. (2002a) 'Biometrics Inadequacies & Threats, & Privacy-Protective Architecture', April 2002, at http://www.anu.edu.au/people/Roger.Clarke/DV/NotesCFP02.html#BiomRC
Clarke R. (2002b) 'Biometrics Inadequacies & Threats, & Privacy-Protective Architecture', May 2002, at http://www.anu.edu.au/people/Roger.Clarke/DV/BiomHKU.ppt
Created: 28 July 2002
Last Amended: 30 July 2002
These community service pages are a joint offering of the Australian National University (which provides the infrastructure) and Roger Clarke (who provides the content).
The Australian National University: Visiting Fellow, Faculty of Engineering and Information Technology, Information Sciences Building Room 211.
Xamax Consultancy Pty Ltd, ACN: 002 360 456, 78 Sidaway St, Chapman ACT 2611, AUSTRALIA. Tel: +61 2 6288 1472, 6288 6916