
Biometrics and Privacy

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Fellow, Department of Computer Science, Australian National University

Notes of 15 April 2001

© Xamax Consultancy Pty Ltd, 2001

This document is at http://www.rogerclarke.com/DV/Biometrics.html


Introduction

Biometrics is a general term for measurements of humans designed to be used to identify them or authenticate that they are who they claim to be. Many biometric technologies have been developed in the last few decades, and the technologies have already been applied in a variety of settings.

Biometric technologies have extremely serious implications for human rights in general, and privacy in particular. Their uses to date have been to enable powerful organisations to exercise social control over people, and the designs have been highly insensitive to the interests of the individuals they're imposed upon.

This document provides background to the topic, and pointers to some resources. It's a set of preliminary notes on a complex and challenging subject. Constructively negative feedback is sought, to Roger.Clarke@xamax.com.au.

The material is organised under the following headings:

  * Identification and Identity Authentication
  * Biometrics
  * How Biometrics Technologies Work
  * Uses of Biometrics
  * Effectiveness, Accuracy and Benefits
  * Threats
  * Safeguards
  * Conclusions


Identification and Identity Authentication

Identification is a process whereby a real-world entity is recognised, and its 'identity' established. The concept of 'identity' is quite complex, but for the purposes of this document it implies a particular person. An identity is signified by an identifier. This may be a purpose-designed code; or it may be a compound of such data as given and family name, date-of-birth and postcode of residence.

Organisations acquire a person's identifier in order to conduct a search through existing files, and thereby associate the current transaction with previously recorded information. Hence identification is a process of searching for one among many. These concepts are examined in considerable detail in Clarke (1994).

Authentication is the process whereby a degree of confidence is established about the truth of an assertion. The assertion may be of many kinds, such as 'this thing I'm giving you is worth $Aust100', or 'the bearer of this card is a qualified medical practitioner'. Identity authentication is the process whereby a degree of confidence is established about the truth of an assertion by an entity that they have a particular identity, or are properly signified by a particular identifier.

Organisations acquire information about a person in order to check it against previously recorded information. Hence authentication is a process of one-to-one comparison, rather than a mass searching process. Further information is available at Clarke (1999a).
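To make the distinction concrete, the following minimal sketch in Python (record contents and identifiers are hypothetical) contrasts identification, a search of one-among-many records, with identity authentication, a one-to-one comparison against the record stored for the claimed identifier.

    # Hypothetical sketch: identification is one-among-many searching,
    # identity authentication is a one-to-one comparison.

    records = {
        "ID-1001": {"family_name": "Citizen", "postcode": "2611"},
        "ID-1002": {"family_name": "Resident", "postcode": "2600"},
    }

    def identify(observed):
        """Identification: search the whole collection for records matching the observed data."""
        return [ident for ident, rec in records.items()
                if all(rec.get(k) == v for k, v in observed.items())]

    def authenticate(claimed_identifier, presented):
        """Identity authentication: compare presented data against the single stored record."""
        rec = records.get(claimed_identifier)
        return rec is not None and all(rec.get(k) == v for k, v in presented.items())

    print(identify({"postcode": "2611"}))                        # -> ['ID-1001']
    print(authenticate("ID-1002", {"family_name": "Resident"}))  # -> True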

There is a tendency among providers of biometric technologies to use the term 'identity verification' to refer to what is called in this document 'identity authentication'. (See, for example, the IBIA FAQ). This is an exaggeration, used by such corporations in order to convey the impression that the technologies they provide are foolproof, and can deliver verity, or Truth.


Biometrics

Biometrics is a generic term encompassing a wide range of measures of human physiography and behaviour.

Some technologies measure relatively stable aspects of the body such as fingerprints, thumb geometry, aspects of the iris and ear-lobes, and DNA. Other technologies focus on dynamic aspects of human behaviour such as the process (as distinct from the product) of creating a hand-written signature, and the process of keying a password. Further information is provided at Clarke (1994) and in Clarke (1997).

The following are examples of characteristics on which biometric technologies can be based:

  * fingerprints;
  * thumb and hand geometry;
  * patterns of the iris and of the retina;
  * ear-lobe geometry;
  * DNA;
  * the dynamics of producing a hand-written signature; and
  * the dynamics of keying a password.

Technologies that have achieved some degree of use on humans to date include fingerprints, thumb geometry, iris-scanning, and anklets. Embedded micro-chips have been applied to pets and livestock.


How Biometrics Technologies Work

Biometric applications depend on comparing a new measure against a previously captured measure. In order to establish the necessary reference-point, some aspect of a person is measured; the measure may be processed; and the resulting data is stored. At a subsequent time, the same aspect of a person is measured, and compared against the stored data. If it is being used for authentication, the new data is compared against the data already in storage for that person. If it is being used for identification, the entire database is searched, in order to locate one or more individuals that are a close fit to the new data.

Most biometrics technologies do not seek exact equality between the new and the stored measures. Instead they have a pre-set tolerance range within which the two are deemed to be sufficiently close.
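As an illustration of that tolerance-based comparison, here is a hedged Python sketch; the feature vectors, the Euclidean distance measure and the threshold value are illustrative assumptions, not any particular vendor's matching algorithm.

    import math

    TOLERANCE = 0.25  # pre-set tolerance within which two measures are deemed sufficiently close

    def distance(a, b):
        """Euclidean distance between two feature vectors (illustrative only)."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def verify(new_measure, stored_measure):
        """Authentication: compare the new measure against the one stored for that person."""
        return distance(new_measure, stored_measure) <= TOLERANCE

    def identify(new_measure, database):
        """Identification: search the entire database for records that are a close fit."""
        return [person for person, stored in database.items()
                if distance(new_measure, stored) <= TOLERANCE]

    db = {"alice": [0.10, 0.80, 0.40], "bob": [0.70, 0.20, 0.90]}
    print(verify([0.12, 0.78, 0.41], db["alice"]))  # True: within tolerance, not exactly equal
    print(identify([0.69, 0.21, 0.88], db))         # ['bob']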

The biometric measure itself may be stored. Alternatively the data may be subjected to processing of some kind, and the results of that processing stored instead. The kinds of processing include:

  * selection of a sub-set of the captured data (such as the number and orientation of ridges on a fingerprint, or the location and size of features in an iris);
  * compression of the data, which may or may not be 'lossy'; and
  * hashing of the data, which may or may not be one-way.

Data-storage may be in a central location, or may be local to the place where it is used (e.g. in a particular building where the person works), or it may be on a device carried by the person themselves (such as a smartcard, a watch or a ring), or in two of the above, or even all three.

The data may be stored in such a manner that it is only usable for a single purpose by a single organisation, or for multiple purposes by a single organisation, or for multiple purposes by multiple organisations.


Uses of Biometrics

Biometrics may be used for identification. A new measurement is compared against a database containing information about a lot of entities. An example of this is the comparison by police investigators of fingerprints from the scene of a crime against a collection of prints from persons previously convicted of serious criminal offences. In some jurisdictions, government agencies are permitted to collect biometrics without a conviction, or even a charge; and there are increasing attempts to compulsorily or pseudo-voluntarily acquire biometrics from many categories of people only remotely associated with crime (such as visitors to prisons, and people in geographical and/or temporal proximity to the scene of a crime).

Biometrics may be used for authentication. A new measurement that purports to belong to a particular entity is compared against the data stored in relation to that entity. If the measurements match, the assertion that the person is who they say they are is regarded as being authenticated. Some building access schemes work this way, with the system comparing the new measure against the company's employee database.

The primary objectives of sponsors of biometrics schemes are generally of a security nature. They include such purposes as:

  * control of access to buildings and other premises;
  * control of access to software and data;
  * detection and investigation of crime; and
  * prevention of masquerade by one person as another.


Effectiveness, Accuracy and Benefits

Depending on a range of factors, biometric schemes can be devised that have substantial accuracy. On the other hand, such schemes are generally limited to specific contexts, are expensive, and are inconvenient and intrusive for the person forced to submit to them. Moreover, there are ways in which even well-designed technology and well-designed applications can be defeated.

In practice, measurements are seldom identical, and the comparison depends on the tolerance level that has been set for the application. This has the result that sometimes 'false positives' will occur, i.e. the assertion will be authenticated even though the person who presented was not the one the system thought it was. At the other end of the spectrum, sometimes the right person will be rejected, in what are called 'false negatives' (e.g. they may have moved their finger at the wrong time, the part of the body used may be injured, or the behaviour being measured may have changed).

In general, the tighter the tolerances are set (to avoid false positives), the more false negatives will arise; and the looser the tolerances are set (to avoid false negatives), the more false positives occur. The tolerance is therefore set to reflect the interests of the scheme's primary sponsor, with little attention to the concerns of other stakeholders.
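The trade-off can be made explicit with a small Python sketch; the distance samples are synthetic and purely illustrative, but they show that tightening the tolerance reduces false positives while increasing false negatives, and vice versa.

    # Synthetic, purely illustrative distances between new and stored measures.
    genuine_distances  = [0.05, 0.10, 0.12, 0.20, 0.30, 0.45]  # same person, repeat measurements
    impostor_distances = [0.25, 0.40, 0.55, 0.70, 0.85, 0.95]  # different people

    def error_rates(tolerance):
        """Return (false-positive rate, false-negative rate) for a given tolerance setting."""
        fp = sum(d <= tolerance for d in impostor_distances) / len(impostor_distances)
        fn = sum(d > tolerance for d in genuine_distances) / len(genuine_distances)
        return fp, fn

    for tol in (0.15, 0.30, 0.50):
        fp, fn = error_rates(tol)
        print(f"tolerance={tol:.2f}  false positives={fp:.2f}  false negatives={fn:.2f}")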

Provided that a biometric scheme is well-designed, that people are successfully forced to comply with its requirements, and that sufficiently committed and resourceful people and organisations do not overcome the scheme's security design, biometrics schemes are capable of achieving the objectives of their sponsors. The negative impacts on the people subjected to the scheme, on the other hand, are seldom taken into account in the design of these schemes.


Threats

The last four decades of the twentieth century saw a dramatic explosion in the use of information technologies to subject people to data surveillance. This was examined in depth in Clarke (1988). The kind of centrally-controllable society that novels like '1984' foresaw depended on three conditions being fulfilled:

  1. a range of personal data systems need to exist, each processing data for specific purposes;
  2. some, preferably all, of those personal data systems need to be connected via one or more telecommunications networks; and
  3. the people to whom the data relates need to be identified consistently.

The first two of those conditions were satisfied by about 1980. The lack of consistent identification of individuals is the sole factor that has held back what has been referred to as 'the dossier society' and 'the surveillance state'. The threats inherent in human identification are examined in depth in Clarke (1994) and Clarke (1997).

Biometrics technologies are expressly designed as means of identifying individuals more reliably and more consistently. They threaten to break through the last remaining protection against dominance of individuals by governments and corporations. The specific threats embodied in biometrics technologies are examined in the following sub-sections.

(1) Privacy of the Person

Biometric technologies don't just involve collection of information about the person, but rather information of the person, intrinsic to them. That alone makes the very idea of these technologies distasteful to people in many cultures, and of many religious persuasions.

In addition, each person has to submit to examination, in some cases in a manner that many people regard as demeaning. For example, the provision of a quality thumbprint involves one's forearm and hand being grasped by a specialist and rolled firmly and without hesitation across a piece of paper or a platen; and an iris-print or a retinal print requires the eye to be presented in a manner compliant with the engineering specifications of the supplier's machine. Some technologies, such as those based on DNA, go so far as to require the person to provide a sample of body-fluids or body-tissue.

(2) Privacy of Personal Data

Many schemes require the provision of personal data to assist in the administration of the scheme. Some are operated in close conjunction with other data-rich systems such as personnel or welfare administration. This consolidation of data enhances the opportunity for the organisation to exercise control over the population for whom it holds biometrics.

(3) Privacy of Personal Behaviour

The monitoring of people's movements and actions through the use of biometrics increases the transparency of individuals' behaviour to organisations. Those organisations are in a better position to anticipate actions that they would prefer to prevent. Moreover, an organisation that performs biometrics-aided monitoring is in a position to share personal data with other organisations, such as contracted suppliers and customers, 'business partners', and corporations and government agencies with which it 'enjoys a strategic relationship'.

(4) Multi-Purpose and General-Purpose Identification

Biometric schemes are expensive. They also require the individuals that are subjected to them to register with some authority. Some schemes also require the individual to carry a token such as a card. To share costs, organisations are therefore motivated to apply biometric schemes for multiple purposes.

Any multiple usage of identifiers represents a serious threat to privacy, because it provides the organisations with simple means of sharing the data that each of them gathers, and hence with means to exercise control over the individuals involved.

There are no natural barriers to data-sharing, many countries lack laws to preclude it, and a strong tendency exists for organisations to break down such legal impediments as do exist. Hence the multiple purposes to which a biometric scheme is applied can readily extend beyond a single organisation to encompass multiple organisations in both the private and public sectors.

(5) Denial of Anonymity and Pseudonymity

Until very recent times, the vast majority of actions and transactions undertaken by people were anonymous, or were identified only to the extent that an observer saw them and might remember them, but no records of the event were kept.

Corporations and government agencies have been working very hard to deny people the ability to keep their transactions anonymous. As a result of new forms of information technology, the cost of data capture has plummeted, and huge numbers of transactions are now recorded which would have been uneconomic to record in the past. These records carry enough information to identify who the person was who conducted them, and systems are designed so as to readily associate the data with that person.

Biometric technologies create new capabilities for the association of identity with transactions that have never been recorded before, such as passing through a door within a building, across an intersection, or into a public place or an entertainment facility. They provide a powerful weapon to corporations and governments, whereby yet more of the remnant anonymity of human action can be stripped away.

(6) Masquerade

The storage of biometrics makes much easier the fabrication of tools, or the synthesis of signals, that are highly convincing replicas of a particular person's physiometrics. This raises the prospect of people having acts attributed to them that they did not do.

The feasibility of the manoeuvre varies depending on the kind of biometric. The technology to fabricate a convincing iris, based on the data captured and stored by an iris-reading device, would seem to be challenging, and may well not exist. On the other hand, if a biometric comprises measurements of some part of a person's body, such as the first knuckle of the right thumb, then technology is probably already available that can produce a synthetic equivalent of that body-part.

Moreover, some biometric techniques select a small sub-set of the captured data, such as the number and orientation of ridges on a fingerprint, or the location and size of features in an iris. The risk is all the greater if the biometric is used in its raw form, or the compression is insufficiently 'lossy' and hence the compressed form can be used to generate an adequate masquerade, or the hashing algorithm is not one-way.

A significant risk exists that an imposter could produce means to trick devices into identifying or authenticating a person even if they are not present. Possible uses would be to gain access to buildings, software or data, digitally sign messages and transactions, capture the person's identity, harm the person's reputation, or 'frame' the person.
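A brief Python sketch (values hypothetical, matcher grossly simplified) illustrates the core of the risk: if the stored measure is held in raw or reversible form and is obtained by an imposter, simply replaying it will satisfy the tolerance check.

    TOLERANCE = 0.25

    def distance(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))

    # Raw biometric measure held in the scheme's database (hypothetical values).
    stored_template = [0.11, 0.79, 0.42]

    def matcher(presented):
        """Accept any presentation within the pre-set tolerance of the stored measure."""
        return distance(presented, stored_template) <= TOLERANCE

    # An imposter who obtains the stored template (or synthesises the body-part it
    # describes) can replay it and be accepted as the genuine person:
    leaked_copy = list(stored_template)
    print(matcher(leaked_copy))  # True -- the scheme accepts the masquerade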

Any identification or authentication scheme that involves storage of a biometric is fraught with enormous risks. These risks will very likely rebound on the person, whether or not they harm the organisation that sponsors the scheme.

(7) Permanent Identity-Theft

An act of masquerading as another person is a single event. If the imposter conducts a succession of masquerades, their behaviour amounts to taking over the person's identity. Cases of identity theft have been reported already, which have had very serious consequences for the victims. Organisations cannot distinguish the acts and transactions of the two individuals using the one identity, and hence they are merged together. A typical outcome is that the person faces demands for payment from organisations they have never purchased anything from, and shortly afterwards can no longer gain access to loans.

Under these circumstances, the identity can become so tainted that the person has to abandon that identity and adopt a new one. That is challenging, because such an act is readily interpreted as an admission of guilt, and an attempt to avoid the consequences of actions that are presumed to be actions of that person, rather than of the imposter.

Biometrics adds a frightening new dimension to identity theft. The purveyors of the technology convey the message that it is foolproof, in order to keep making sales. The organisations that sponsor schemes want to believe that it is foolproof, in order to avoid liabilities for problems. The resulting aura of accuracy and reliability will make it extraordinarily difficult for an individual who has been subjected to identity theft to establish their innocence.

Any biometric is an extraordinarily dangerous measure, because it's the equivalent of a PIN that can't be changed. Lose it once, and you're forever subject to masquerade by each person or organisation that gains access to it.

(8) Automated Denial of Identity

Identity theft is not limited to individual criminals. For example, a corporation could apply biometrics to the denial of access to premises by ex-employees, customers previously found guilty of shop-lifting, and in the case of casinos, problem-gamblers.

Proposals of this nature have arisen in the context of football grounds, and it was reported that such a scheme was applied to the thousands of people who streamed into the U.S. Super Bowl in January 2001 (e.g. Greene 2001).

The technique could of course be extended to the denial of access by customers suspected of shop-lifting, complainants, or known agitators against the company's practices. Government agencies could find scores of applications, such as preventing targeted people from using transport facilities. This scenario was investigated many years ago in the sci-fi novel 'The Shockwave Rider' (Brunner 1975).

(9) Chilling Effect on Freedom, and on Democracy

Biometric technologies, building as they do on a substantial set of other surveillance mechanisms, create an environment in which organisations have enormous power over individuals. Faced with the prospect of being alienated by employers, by providers of consumer goods and services, and by government agencies, individuals are less ready to voice dissent, or even to complain.

That is completely contrary to the patterns that have been associated with the rise of personal freedoms and free, open societies. It represents the kind of closed-minded society that the Soviet bloc created, and which the free world decried. The once-free world is submitting to a 'technological imperative', and permitting surveillance technologies to change society for the worse. Biometrics tools are among the most threatening of all surveillance technologies, and herald the severe curtailment of freedoms, and the repression of 'different-thinkers', public interest advocates and 'troublemakers'.

Clearly, this undermines democracy, because candidates, dependent on parties, sponsors and the media, are less willing to risk being marginalised; supporters are less prepared to be seen to be supporters; and voters become fearful of the consequences if their voting patterns become visible.

Less clearly, the suppression of different-thinkers strangles the economy. It does this because the adaptability of supply is dependent on experimentation, choice, and the scope for consumers to change their demand patterns.

(10) Dehumanisation

Beyond the fairly practical considerations of freedom of thought and action, democracy and economic behaviour, there is the question of the ethics of the matter. If we're happy to treat humans in the same manner as manufactured goods, shipping cartons, and pets, then biometrics technologies are unobjectionable. If, on the other hand, humans continue to be accorded special respect, then biometrics technologies are repugnant to contemporary free societies.

Authoritarian governments ride rough-shod over personal freedoms and human rights. They will establish legal authority for and enforcement of the capture of biometrics for every transaction, and at every doorway. Such governments see consent and even awareness by the person as being irrelevant, because they consider that the interests of society or 'the State' (i.e. of the currently powerful cliques) dominate the interests of individuals.

In the free world as well, substantial momentum exists within governments and corporations to apply those same technologies, and in the process destroy civil rights in those countries.


Safeguards

It's possible that at least the more serious implications of biometric technology may be precluded by the processes of democracies. This section considers the extent to which safeguards exist or can be created.

(1) Self-Regulation

The possibility exists that purveyors of biometric technologies might recognise the significance of their products, and exercise self-restraint. Unfortunately, this is a highly unlikely result in a world that demands that corporations act to maximise profit, market-share, and shareholder value. Biometrics providers are eager suppliers to repressive governments in the third world; and they use the lessons from those pilot schemes to supply technologies to governments in the hitherto free world.

A Code of sorts has been expressed by at least one of the burgeoning industry associations designed to serve the common interests of biometrics technology providers. The International Biometrics Industry Association's Code, promulgated in 2000, is a trivial document whose sole function is to convey an impression of concern, and which contributes nothing whatsoever to the protection of the people forced to submit to biometric measurement.

Alternatively, the organisations that apply biometrics technology might resist the blandishments of technology providers, design schemes that balance the various interests involved, refuse to acquire technologies that abuse human rights, and demand technologies that are sensitive to human needs. For example, the following principle was developed in relation to a proposed smart card application by an Australian social welfare agency:

Smartcard applications should only utilise biometric technology when there is a clear acceptance by the public of this type of authentication, the security of biometric systems is conclusive, and there is no prospect of a central register of biometric information.

Unfortunately, the agency did not adopt the principle, and there are few signs of any agency or corporation applying biometric technologies even having nominal commitments of this nature, let alone applying such principles to their designs and their dealings with technology providers.

It's necessary to look elsewhere for protections.

(2) Compulsory Social Impact Assessments

The principle has gradually become established that schemes that have significant potential impacts on the physical environment need to be subject to comprehensive assessment prior to any decision being made to proceed, and as a guide to the detailed design of the scheme.

If the nature of our society is important to us, then it's essential that social impact assessments be required whenever consideration is being given to applying technologies that have the potential for significant impact on people. Data surveillance technologies in general, and biometrics in particular, should therefore be subject to requirements of this kind. Guidelines on privacy impact assessment are available (e.g. Clarke 1998b). The process requires:

Unfortunately, very few countries require social impact assessments, and there does not appear to be much of a tendency for any country to do so. Arguably the trend is in the opposite direction, given the disbandment a few years ago of the Office of Technology Assessment of the U.S. Congress.

(3) Generic Privacy Laws

Perhaps biometric technologies might be precluded or regulated by existing privacy protection laws.

For the last 30 years, free world governments have been operating within what's commonly called the 'Fair Information Practices' approach to information privacy protections. This has resulted in modest regulation of personal data-handling in Western Europe, and a scatter of provisions in North America. This approach was inadequate when it was conceived around 1970, and is utterly incapable of dealing with the surveillance technologies that have been developed and deployed since then (Clarke 2000a).

In Australia, weak laws have been imposed on the federal public sector and on those of N.S.W. and Victoria. In December 2000, an appalling law was passed that purports to impose regulation on the private sector, but actually legitimises a range of privacy-invasive practices by corporations (Clarke 2000b, Clarke 2000c).

Such generic laws as exist are far too naive and weak to represent any kind of curb on the explosion of biometric technologies.

(4) Specific Regulation

It may be that there is already sufficient understanding of the nature, impacts and implications of biometric technologies that they can be subjected to specific legislative regulation.

Some principles that can be extracted from the brief review in this paper are as follows:

There appear to be no steps in train to impose any such forms of regulation.

(5) A Moratorium on the Application of Biometrics

Given the extraordinarily serious implications of biometric technologies, and the absence of any effective protections, a ban is needed on all applications. The ban needs to remain in force until after a comprehensive set of design requirements and protections has been devised, implemented, and is actually in force.

Only in this manner can biometric technology providers and scheme sponsors be forced to balance the interests of all parties, rather than serving the interests of only the powerful and repressing the individuals whose biometrics are to be captured.


Conclusions

Biometrics is one of the most serious among the many technologies of surveillance that are threatening the freedom of individuals and of societies (Clarke 2001).

In one possible future, biometrics will fall into ill-repute in relatively free countries. But in authoritarian countries, biometrics will be successfully imposed on the population, resulting in freedoms being reduced even further. Biometrics providers will flourish by selling their technology to repressive governments, and achieve footholds in relatively free countries by looking for soft targets, starting in some cases with animals, and in others with captive populations like the frail aged, prisoners, employees, insurance consumers, and welfare recipients. All relatively free countries will become more repressive. Public confidence in corporations and government agencies will spiral much lower. This scenario leads away from freedoms, and towards subjugation of the individual to powerful organisations.

The other alternative is that societies appreciate the seriousness of the threats, and impose substantial constraints on technologies and their use. This demands commitment by the public, and courage by elected representatives, who must withstand pressure from large corporations, and from the national security and law enforcement apparatus that invokes such bogeymen as terrorism, illegal immigration, and domestic law and order as justifications for the implementation of repressive technologies. This scenario embodies scope for achieving balance among the needs of individuals and society as a whole.

Right now, the choice is ours.

But it won't be for much longer, because corporations and governments are moving to implement such schemes, and once they're implemented the scope to participate in opposition to centrally-approved schemes will be drastically reduced.


References

Brunner J. (1975) 'The Shockwave Rider' Ballantine, 1975

Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1993) 'Introduction to Chip-Cards and Smart Cards', 1993, at http://www.rogerclarke.com/EC/ChipIntro.html

Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7, 4 (December 1994), at http://www.rogerclarke.com/DV/HumanID.html

Clarke R. (1997) 'Chip-Based ID: Promise and Peril', Int'l Conf. on Privacy, Montreal (September 1997), at http://www.rogerclarke.com/DV/IDCards97.html

Clarke R. (1998a) 'Smart Cards in Identification and Authentication' Chapter 7 of Smart Card Technical Issues Starter Kit, Centrelink, August 1998, at http://www.rogerclarke.com/DV/SCTISK7.html

Clarke R. (1998b) 'Privacy Impact Analysis' November 1998, at http://www.rogerclarke.com/DV/PIA.html

Clarke R. (1999a) 'Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice' Proc. User Identification & Privacy Protection Conf. Stockholm, 14-15 June 1999, at http://www.rogerclarke.com/DV/UIPP99.html

Clarke R. (1999b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. (2000a) 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century' January 2000, at http://www.rogerclarke.com/DV/PP21C.html

Clarke R. (2000b) 'Submission to the Commonwealth Attorney-General Re: 'A privacy scheme for the private sector: Release of Key Provisions' of 14 December 1999' January 2000, at http://www.rogerclarke.com/DV/PAPSSub0001.html

Clarke R. (2000c) 'Submission to the Inquiry into the Privacy Amendment (Private Sector) Bill 2000 by the Senate Legal and Constitutional Legislation Committee' September 2000, at http://www.rogerclarke.com/DV/SenatePBSub2000.html

Daugman J. (1998) 'History and Development of Iris Recognition', at http://www.cl.cam.ac.uk/users/jgd1000/history.html

Daugman J. (1999) 'Iris Recognition for Personal Identification', at http://www.cl.cam.ac.uk/users/jgd1000/iris_recognition.html

Daugman J. (1999) 'Biometric decision landscapes' Technical Report No. TR482, University of Cambridge Computer Laboratory, at http://www.cl.cam.ac.uk/users/jgd1000/biomdecis.pdf

Daugman J. (2000) 'Combining Multiple Biometrics', at http://www.cl.cam.ac.uk/users/jgd1000/combine/combine.html

Davies S. G. (1994) 'Touching Big Brother: How biometric technology will fuse flesh and machine' Information Technology & People 7, 4 1994

Davis A. (1997) 'The Body as Password' Wired 5.07 (July 1997), at http://www.wired.com/wired/archive/5.07/biometrics_pr.html

Ebringer T. (2001a) 'Feigning fingerprints' Fairfax IT News, March 19, 2001, at http://it.mycareer.com.au/industry/20010319/A30358-2001Mar19.html

Ebringer T. (2001b) 'A cautionary tale about authentication integrity' Fairfax IT News, March 19, 2001, at http://it.mycareer.com.au/industry/20010319/A30359-2001Mar19.html

Economist (2000) 'The measure of man' The Economist, 9 September 2000, at http://www.economist.com/PrinterFriendly.cfm?Story_ID=360238&CFID=361701&CFTOKEN=64612627

Greene T.C. (2001) 'Feds use biometrics against Super Bowl fans' The Register, U.K. Edition, 1 February 2001, at http://www.theregister.co.uk/content/archive/16561.html

IBIA (2000) 'IBIA Privacy Principles', International Biometrics Industry Association, October 2000, at http://www.ibia.org/privacy.htm

IPCO (1999a) 'Biometrics and Policing: Comments from a Privacy Perspective' Information and Privacy Commissioner, Ontario, August 1999

IPCO (1999b) 'Privacy and Biometrics' Information and Privacy Commissioner, Ontario, September 1999, at http://www.ipc.on.ca/english/pubpres/sum_pap/papers/pri-biom.htm (188K)

IPCO (1999c) 'Consumer Biometric Applications: A Discussion Paper' Information and Privacy Commissioner, Ontario, September 1999, at http://www.ipc.on.ca/english/pubpres/sum_pap/papers/cons-bio.htm (376K)

Privacy International (1995-) 'National ID Cards', at http://www.privacy.org/pi/activities/idcard/

Privacy International (1995-) 'ID Cards FAQ', at http://www.privacy.org/pi/activities/idcard/idcard_faq.html

Privacy International (1995-) 'Campaigns of Opposition to ID Card Schemes', at http://www.privacy.org/pi/activities/idcard/campaigns.html

Silberman S. (2001) 'The New ID' Wired 9:01, at http://www.wired.com/wired/archive/9.01/ideo_pr.html

Tomko G. (1998) 'Biometrics as a Privacy-Enhancing Technology: Friend or Foe of Privacy?' Proc. 9th Privacy Commissioners'/Data Protection Workshop, 1998, at http://www.dss.state.ct.us/digital/tomko.htm

Warwick K. (2000) 'Cyborg 1.0' Wired 8.02 (February 2000), at http://www.wired.com/wired/archive/8.02/warwick_pr.html

Woodward J.D. (1997) 'Biometric Scanning, Law and Policy: Identifying the Concerns--Drafting the Biometric Blueprint' University of Pittsburgh Law Review 59.97 (1997), at http://www.pitt.edu/~lawrev/59-1/woodward.htm


Other Biometrics Resources


Biometrics Industry Associations

Biometric Consortium (U.S.), at http://www.biometrics.org/, including a list of sites, at http://www.biometrics.org/html/sites.html

Association for Biometrics (U.K.), http://www.afb.org.uk/, including a glossary, at http://www.afb.org.uk/public/glossuk1.html

International Biometric Industry Association (IBIA), at http://www.ibia.org/, including an FAQ, at http://www.ibia.org/faqs.htm


Biometrics Industry Materials

Biometix (2000-) 'Analysis Papers' The Biometric Resource Centre, at http://www.biomet.org/analysis_gen.html

Biometix (2000-) 'Industry Product Catalogue' The Biometric Resource Centre, at http://www.biomet.org/products.html

Bowman E. (2000) 'Everything You Need to Know About Biometrics', Identix Corporation, January 2000 (in PDF, 27K), at http://www.ibia.org/EverythingAboutBiometrics.PDF

Connecticut DSS Biometric Project, at http://www.dss.state.ct.us/digital.htm

PC Magazine (1999) 'Are you ready for Biometrics' PC Magazine, 23 February 1999, at http://www.zdnet.com/pcmag/features/biometrics/

Rand (2000) 'Biometrics: Will Digital Fingerprints, Iris Scans and Speaker Recognition Soon Replace Passwords and Personal Identification Numbers?', at http://www.rand.org/natsec/products/bionav.html

A site hosted at Michigan State University, including a list of suppliers


