Roger Clarke
Version of 29 March 2004
Prepared for IPAA/NOIE, and included in a NOIE publication in September 2004
© Xamax Consultancy Pty Ltd, 2004
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://www.rogerclarke.com/DV/Wobble.html
A balance-beam is a device that enables a confident athlete to demonstrate their stability.
A wobble-board is a device that causes a nervous patient to flap their arms in a desperate, largely symbolic, and usually fruitless attempt to achieve balance.
Privacy is a legal right grounded in international human rights law. Lack of respect for it continues to hold back eBusiness.
Privacy has many dimensions, including privacy of the physical person, of personal behaviour, of personal communications, and of personal data. Privacy protection is vital, but because the privacy interest is one among many, privacy protection is an exercise in balance against multiple interests of the individual, of other individuals, of organisations, and of society as a whole.
In the context of eBusiness, the process of seeking appropriate balances needs to reflect the notion of 'trust'. Consumer Internet commerce continues to grow far more slowly than other Internet metrics, and eGovernment initiatives generate suspicion as well as interest. This is because neither privacy protection nor its importance is well enough understood, and because site-designs and merchant practices generate more distrust than trust; hence the balances that are currently being imposed are not appropriate.
To achieve progress, a three-pronged approach is needed. One element is the reining in of privacy-invasive technologies ('the PITs'), and implementation of privacy-enhancing technologies ('PETs'). The second element is the replacement of the limited 'fair information practices' notions that underlie existing privacy legislation. This is necessary in order to deliver a regime that is much less facilitative of business, and much more protective of people. The third element is the implementation of effective consultation with stakeholders, and their involvement in the design and operation of eBusiness systems.
The purpose of this paper is to provide background to the concept of privacy as it relates to the unlocking of eBusiness, including both so-called 'B2C' eCommerce and eGovernment. The central theme is that privacy protection is a permanent balancing act, but one which has been performed very badly, which is still being performed very badly, and which is undermining trust by citizens and consumers in the institutions that they deal with.
The sections of the paper deal in succession with the following topics:
The origins of the privacy concept need to be traced along a number of lines. It emerged as a significant human value during the Renaissance and the industrial revolution. By the end of the nineteenth century, learned justices in the United States were arguing that "the right to be let alone ... secures the exercise of extensive civil privileges".
Between 1920 and 1950, the genre of 'anti-utopian' novels described futures repugnant to humanity. The classic image of an information-rich government dominating citizens' thoughts and actions is associated with Zamyatin's 'We' of 1922 and Orwell's '1984' of 1948, although the technological basis of the surveillance culture had been established as early as 1791, by Jeremy Bentham's designs for a model prison incorporating the ubiquitous, all-seeing 'panopticon'. These initial warnings were stimulated by the spectre of authoritarian governments (variously fascist and communist) harnessing technology to their anti-democratic aims.
From about 1950 onwards, a gradual shift is discernible towards the perception that technology, or at least the application of technology, is a primary determinant of the directions of change. Expressions of concern in non-fiction literatures exploded between the mid-1950s and the mid-1970s. Many of the later works focussed on computer technology as the culprit. In 1975, Foucault used the symbol of the panopticon to argue that the prison metaphor was the leitmotiv of authoritarian society. By the turn of the 21st century, the collective term 'privacy-invasive technologies' (the PITs) was being used.
Many analyses of privacy-invasive behaviours and technologies are available. Key concepts include the 'information-intensity' of administration during the twentieth century, resulting in the collection, maintenance and dissemination of ever more data, ever more 'finely grained'.
The information-intensity phenomenon has arisen from the increasing scale of human organisations. This has created 'social distance' between them and their clients, and made them more dependent on abstract, stored data rather than on their employees' personal knowledge about their customers. Other factors have been an increasing level of education among organisations' employees, the concomitant trend toward 'scientific management' and 'rational decision-models', and, particularly since the middle of the twentieth century, the brisk development in information technologies to serve the needs of data management. Decision-making by large organisations about individuals has largely ceased to be based on personal knowledge, judgement and trust, and is based almost entirely on data and business rules.
The primary approach to privacy is to perceive it as a fundamental human right. It is expressly recognised in the key international instruments, most fundamentally Article 12 of the Universal Declaration of Human Rights (UDHR) and Article 17 of the International Covenant on Civil and Political Rights (ICCPR). It is also embedded in the Constitutions of various countries.
A secondary approach that surfaces from time to time is to conceive of a property right in personal data. Proposals are continually recycled, particularly in North America, for the creation of such a right, which would vest in the individual concerned, and which would be tradeable. The notion is an invention of economic rationalists, and would benefit corporations by enabling them to arrange for individuals to contract out of their legal rights.
In Australia, constitutional protection for human rights is extremely limited. The 1973 Morison Report suggested that it is more convenient for the purposes of analysis to approach privacy as an important interest that humans have. Hence, according to Morison:
Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations.
Whether approached as a right or as an important interest, privacy has several dimensions: privacy of the physical person, of personal behaviour, of personal communications, and of personal data.
Much of the discussion and analysis that purports to address 'privacy' is actually limited to the fourth important sub-set of issues:
Information Privacy or Data Privacy is the interest an individual has in controlling, or at least significantly influencing, the handling of data about themselves.
Another weakness in much of the discussion and analysis of privacy issues is the implicit assumption made by many people that the justification for privacy protections is based entirely on the needs of individuals. It is clear that individuals have strong psychological needs for privacy. But there are also strong social needs for individuals within society to enjoy privacy protections, because without them social, artistic, scientific and economic innovation are severely constrained. Moreover, there is an urgent political need for privacy, to ensure that democratic freedoms are not infringed by covert and overt pressure on those members of the population who are politically active.
Privacy has to be balanced against many other, often competing, interests. Hence:
Privacy Protection is a process of finding appropriate balances between privacy and multiple competing interests.
There are a number of significantly different categories of conflicts among rights. Important among them are:
The recent dominance of 'economic rationalist' thought has been elevating the interests of organisations to a plane that competes with those of individuals and society. Based on purely economic arguments, the interests of creditors, insurers and even marketers, are being permitted to dominate those of people.
A later section considers the processes whereby balance can be achieved. Firstly, however, it is necessary to review the origins and nature of existing privacy laws, and the extent to which trust in eBusiness has been achieved.
It does not appear that any country has enacted laws to protect privacy as a whole. In Australia, as elsewhere, privacy of the person, of personal behaviour and of personal communications are subject to an array of accidental and incidental protections which lack any underlying philosophy, and which leave enormous, unregulated gaps. Privacy of personal data, on the other hand, has been the subject of a wave of legislation worldwide, commencing in the 1970s. This section traces the origins and trajectory of that movement.
The 'Fair Information Practices' (FIPs) model of information privacy protection emerged between 1965 and 1975. It focuses on 'data protection', i.e. the protection of data about people, rather than of people themselves. Important early activity in the United States included publications by Westin in 1967 and 1974, and an Advisory Committee to the then Department of Health, Education and Welfare (HEW) in 1973.
In those early years of personal data systems, the dominant school of thought, legitimised by Westin's publications, was that the invisible economic hand guiding business and government activity would ensure that information technology did not result in excessive privacy invasion. Hence privacy regulation was unnecessary, or, to the extent that it was imposed, it was essential that the detrimental effects on business and government be minimised.
Following the Westin line, the U.S. Congress passed a Privacy Act in 1974, which regulated federal government agencies. A report on early experiences is to be found in the 1977 Report of the Privacy Protection Study Commission. The efforts of the U.S. Administration succeeded in emasculating the legislation and the report. No generic legislation has been passed to regulate the private sector. The U.S. Congress and the legislatures of the 50 American States have since passed several hundred laws, in a scattergun approach reactive to current issues and devoid of an underlying philosophy.
Activities in Europe proceeded in parallel with those on the other side of the Atlantic. Burkert reported in 1999 that Westin's 1967 publication was available in German translation, courtesy of IBM, as early as 1970, and significantly influenced developments in Germany. This reflects the close association that has always existed between Westin's work and the needs of business and government. The German Land of Hesse was the first to enact legislation, in 1970. Sweden passed the world's first national legislation in 1973. Most other western European countries and states followed during the next decade.
Concern quickly arose that inconsistencies among the regulatory regimes in European countries might become a restraint on trade. To address this risk, the OECD codified the FIPs-based regime in a set of Guidelines. Although they are non-binding, they have greatly influenced all subsequent national and sub-national legislation within and beyond the OECD's membership. They remain the primary reference-point for discussions.
The OECD work was expressly not an attempt to flesh out more general documents concerning human rights. The prime concern was to "advance the free flow of information between Member countries and to avoid the creation of unjustified obstacles to the development of economic and social relations among Member countries" (at p.7). The concern to ensure that member-countries had a clear statement of international expectations regarding privacy protection was quite secondary to the facilitation of international business. The dominance of economic over social interests is embedded in FIPs regimes.
From a privacy protection viewpoint, the OECD Guidelines have many further deficiencies. Their formulation was constrained by the need to leave existing legislation unaffected, and their wording reflected the need for cross-cultural comprehensibility. And they are a late 1970s codification of early- and mid-1970s statutory language which reflected inadequate understanding of technologies of the 1960s.
The European Union, in seeking a greater degree of harmonisation among its member-nations, has moved well beyond the OECD requirements. It has published a succession of Directives, the most important being that of 1995, which came into effect in October 1998.
The EU Directives created a real prospect of disadvantage for U.S. corporations' activities in Europe. The U.S. Administration has fought hard to sustain a 'self-regulatory' regime, code-named 'safe harbor', whose purpose is to enable American corporations to continue to be substantially unfettered in their use of data about people worldwide. Free Trade Agreements, including that being arranged with Australia, are another vehicle whereby the U.S. Administration is imposing its corporations' interests on other countries. The tension continues, especially in the context of the Internet.
In Australia, a call to arms was issued in the Boyer Lecture Series of 1969, by a subsequent Governor-General, Zelman Cowen. The Morison Report was provided to the Commonwealth and State Attorneys-General in 1973, but only one of the nation's nine jurisdictions acted upon it. A study by the Law Reform Commission dragged on for more than six years, from 1976 to 1983, and was published so late that the momentum had been lost.
Australia acceded to the OECD Guidelines in 1984. Legislation affecting the public sector was passed only in 1988, and even then only as a byproduct of a failed attempt to instigate a national identification scheme. Legislation affecting the private sector was passed only in 2000. It is the most extreme instance of a law designed to facilitate business rather than protect privacy. It provides extremely weak protections, and legitimises privacy-invasive activities by corporations. Despite its nominal responsibilities in relation to the private sector, the Office of the Privacy Commissioner was provided with no net resources beyond those it already had before the law was enacted. Enforcement activities are, to say the least, muted.
Meanwhile, since 2002, the U.S. and Australian Governments have been utilising the Asia-Pacific Economic Cooperation forum as a means whereby Australian privacy protections can be lowered even further. Draft 9 of the APEC Privacy Principles falls far, far below the already inadequate benchmark set by the OECD Guidelines. This manoeuvre has been countered by an emergent Asia-Pacific Privacy Charter.
Exploitation of eBusiness opportunities depends on all parties having sufficient confidence to deal with one another using the inevitably somewhat mysterious medium of electronic communications. The success of 'B2C' eCommerce is dependent on trust by consumers in distant suppliers. In the eGovernment arena, agencies are seeking increasing amounts of information from their clients. Some people give that information up willingly, some feel threatened, and some are active opponents, for reasons that range from the noble to the criminal. Trust therefore plays a critical role in government-with-citizen eBusiness as well as eCommerce.
Trust involves a complex of factors including consumer rights, freedom of expression and social equity; but importantly also privacy. This section examines the concept of 'trust', and the extent to which citizen/consumer trust in eBusiness is being encouraged or undermined.
The behaviour of people and organisations takes place in conditions of uncertainty. It is never clear what the future will bring; and there are often factors that are in principle knowable, but in practice are unknown because the cost or time involved in gathering and assimilating the information is too great.
When one party is dependent on the behaviour of another party, the uncertainties give rise to risks. These risks may be managed in various ways, such as through warranties and insurance; but it is unusual for the risks to be entirely covered. Each party has little option but to tolerate the residual risk inherent in the transaction. The notion of trust involves having confidence in the other parties, and hence having an expectation that the risks are unlikely to result in loss:
Trust is confident reliance by one party on the behaviour of other parties.
The basis of trust can be sought in the branch of philosophy called ethics, and the topic is frequently discussed in the context of social and democratic processes. The origins of the concept are in family and social settings: in its fundamental form, trust is associated with cultural affinities and inter-dependencies.
In organisational settings, on the other hand, cultural affinity is limited. In governmental contexts, the individual is generally forced into a relationship with an agency. In commercial contexts, the individual is dealing with a business enterprise that has advantages over them, in the forms of scale, resources, information and expertise. Even small, unincorporated business enterprises have an evident economic incentive to maximise their profit at the expense of the other party. In the case of corporations, the incentive has been institutionalised, through the legal requirement that Directors and employees make decisions in the best interests of the organisation, not of the parties it deals with. As a result, trust in the context of business is not grounded in culture, but is merely what a party has to depend on when no other form of risk amelioration strategy is available.
Situations arise in which a party is effectively compelled to enter into a transaction, and rely on the other party. In such circumstances, trust is of no great import. In other situations, either or both parties may have a real choice as to whether or not to conduct the transaction. It is in these circumstances that trust is a vital factor in determining whether eBusiness will take place.
There is a variety of bases on which trust can be founded. The most straightforward and effective is a direct relationship, such as kinship, membership of the same cultural group, and mateship. In the context of economic rather than social motivations, formalised relationships have been devised to mimic kinship, and hence to engender trust by one party in another party. A key example is a principal-agent relationship, which may be entered into with a broker, or an accountant or solicitor. The agent is bound in law to act on the principal's behalf and in their best interests. Another such relationship is a contract, whereby the terms of the relationship are defined, and contingencies addressed. The parties consciously provide undertakings to one another to perform specific acts, and thereby gain legally recognised rights and responsibilities.
A less formal, but still potentially effective, means of generating trust is a relationship based on multiple prior transactions that have been conducted in an appropriate manner, resulting in a mutual perception of loyalty between the parties. An intermediate step along the way to having a relationship is the state of having conducted a prior transaction, or having prior exposure in the form of vicarious experience (e.g. having watched someone else conduct transactions). A proxy for a prior transaction is a trial transaction or a demonstration.
A less effective form is 'referred trust', which arises as a result of a reference being provided by someone else. This involves the notion of 'transitivity' of trust, i.e. if I trust a party, do I therefore trust a further party that they recommend, endorse, or introduce me to? The basic form of referred trust is 'word-of-mouth'. The idea of reputation is a generalised form of 'word-of-mouth referred trust'. The individuals whose combined evaluation is depended upon extend beyond a party's own, direct contacts, because reputation arises within a social network. Innovation diffusion theory explains adoption processes in large measure on the basis of communications through particular channels, over time, among the members of a social network.
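The transitivity question above can be made concrete with a toy computational sketch (my illustration, not Clarke's). It assumes a hypothetical rule that trust attenuates multiplicatively along a chain of referrals, which captures why the paper calls referred trust "a less effective form": trust via an intermediary is always weaker than either direct link. The names, trust levels, and attenuation rule are all invented for illustration.

```python
def referred_trust(network, source, target, visited=None):
    """Best referred-trust level from source to target.

    network maps each person to the people they directly trust, with
    levels in [0, 1]. Trust along a chain of referrals is taken to be
    the product of the direct levels (a hypothetical attenuation rule),
    and the strongest simple path wins.
    """
    if source == target:
        return 1.0
    visited = (visited or set()) | {source}  # avoid cycles on this path
    best = 0.0
    for referee, level in network.get(source, {}).items():
        if referee not in visited:
            best = max(best, level * referred_trust(network, referee, target, visited))
    return best

# Alice trusts Bob directly; Bob vouches for Carol. Alice's referred
# trust in Carol is weaker than either direct link: 0.9 * 0.8 = 0.72.
network = {
    "alice": {"bob": 0.9},
    "bob": {"carol": 0.8},
}
```

Under this sketch each extra referral hop can only weaken trust, and where no chain of referrals exists at all, the referred trust is zero, mirroring the observation that reputation only propagates within a social network.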
An institutionalised form of reputation is accreditation, whereby a party relies on some trusted organisation (such as a licensing or registration board) to permit registration by, or only grant licences to, parties that satisfy a set of conditions, and that are affirmed by audit to continue to satisfy them.
The weakest form of reputation is that asserted through images of trustworthiness. A brand is a logo or word that an organisation projects as a signifier for reputation. Although brands are nominally based on quality, relevance and consistency of product, in practice branding activities focus very heavily on image-manufacture, manipulation and maintenance. Where no relationship is developed, and slippage occurs in the quality, relevance or consistency, trust based on branding alone is brittle.
During recent decades, branding has been applied to accreditation as well. The result has been meta-brands. Examples include ISO9001 seals of approval, and, in the eCommerce arena, TRUSTe and WebTrust. These provide a veneer of registration and audit, but are in practice focussed almost entirely on the building and maintenance of image.
Given the complexities involved, it is no surprise that the quality of an act of trusting may be reasonable, or not. In particular, trust may be:
Trust by consumers in marketers is not easily gained, and very easily lost. Unfortunately, during the Internet's first decade, a great many of the activities of governments, and especially of business enterprises, have actively worked against trust in eBusiness by consumer/citizens.
The contemporary view is that a corporation has a simple objective function: maximise shareholder value. That objective function is subject to constraints, such as the conflicting interests of the corporation's agents, particularly its employees, environmental impacts, and interference from regulators, consumers, and the public generally. Large corporations, which have achieved market power, are required by law to exercise that market power over the organisations and people they deal with. An important corollary of the pursuit of shareholder value and the exercise of market power is that consumers are quarry. The language of marketing and selling attests to that, in such words as campaigns, targets, suspects and customer acquisition. The word 'quarry' is usefully ambiguous: consumer data is 'fair game', wherever it may be acquired from; and consumer data is there to be mined, or 'quarried'. The data's purpose is to achieve efficiency in marketing communications - efficiency from the corporation's viewpoint.
For most of the twentieth century, consumer marketing used broadcast media such as newspapers, radio, television and billboards. This resulted in a culture of one-way mass marketing. The alternative of 'direct' marketing involved acquisition by the marketer of data about the consumer, frequently on a non-consensual basis, and use of that data to selectively communicate to the consumer, and thereby manipulate the consumer's behaviour.
In the case of B2C eCommerce, marketers have continued this 'consumer as quarry' attitude. The very expression 'B2C' is part of the problem, because it expressly conveys one-sided 'push'. Internet marketing is suffering the legacy of a succession of inappropriate and largely unsuccessful manoeuvres attempted during the second half of the 1990s.
Against the ravages of technology-driven privacy invasion, natural defences such as inertia have proven inadequate. Data is increasingly collected in personalised form. Storage technology ensures that it remains available. Database technologies make it discoverable. And telecommunications enables its rapid reticulation. Organisations are only faintly restrained by professional and industry association codes.
Cost-constraints continue to diminish rapidly. In any case, economic limitations have simply not acted as a constraint on the explosion in the use of personal data. Westin asserted in the 1960s that no regulation was needed because privacy-invasive behaviour would be automatically self-correcting. That may have been convenient for business and government; but it was clearly wrong. Government programs are continued even after they have been clearly demonstrated to be financially unjustifiable; and in the private sector many highly privacy-invasive practices are routinely undertaken by corporations precisely because they are economic, from the organisation's own, limited perspective. Companies' interest in efficiency dominates consumer interests, and hence corporate power is exercised over people.
Exercise of countervailing power by individuals has had limited impact. The most effective 'campaign' to date has been the public's apathy and inaction: one of the major reasons for the slow adoption of eBusiness is the serious lack of trust by consumers and small business in corporations and governments. There is a need for change.
Foundations are required, on which trust can be built, and the poor start overcome.
One approach is consumer-oriented. eBusiness typically involves significant risk for the party that delivers or performs first and is hence exposed to the possibility of default by the other party. An important focus therefore needs to be on safeguards that address various contingencies, combined with clear statements of the residual risks borne by the consumer.
Enforcement is entirely lacking, but useful guidance is available in relation to consumer-friendly principles and technologies, and to privacy aspects of trust in the context of consumer Internet commerce.
The conditions precedent for trust have to do with laws, detailed codes of conduct, sanctions, and enforcement mechanisms, and they are needed in relation to privacy just as much as consumer rights. Trust has to be based on infrastructure that the consumer has, is familiar with, and is confident in. Assurances of security have to be built into that infrastructure, and not just technical security, but also commercial security and data security. And there must be relatively few 'bad news' stories, and rapid action to address not just the instances, but also the root causes.
The preceding sections have examined privacy, privacy protection, the nature of existing privacy laws, and trust. The final sections offer suggestions as to what is needed in order for progress to be achieved.
Various proposals for change have been mooted. Questions are raised from time to time as to whether privacy is dead, or should be dead. Proponents of this notion perceive that the public is untrustworthy, and that the benefits of modern society must only be granted in return for substantially enhanced organisational access to personal data. Rationalist economic precepts may support this idea, and the business efficiency criterion may be axiomatic to corporate executives, and may even be convincing to government officials steeped in the same ideology. But the proposition may not be attractive to their citizen-consumer adversaries.
Suggestions have also been made that the technological imperative is irresistible; and that privacy protections are futile. The primary proponent of this argument, 'sci-fi' author David Brin, believes that privacy can only be sustained by focussing instead on freedom of information for everyone: if privacy through secrecy is no longer tenable, achieve privacy through openness. But Brin's argument is based on the premise that the watchers will not exercise political power in order to preclude others from watching them. The political history of societies suggests otherwise.
Further suggestions have been made to the effect that corporate innovation can solve the problem. According to this line of thought, technological measures can fill the void left by the reluctance of parliaments and governments to provide formal, legal protections; self-regulation can somehow ensure balanced behaviour by powerful corporations; and codes of conduct and trademarks established by industry associations can make a difference to corporate behaviour. But these claims do not correspond with the citizen/consumers' experiences.
Alternatives to privacy protection cannot deliver what is needed right now: a solution to the crisis in public confidence in governments' and corporations' use of information technology, a crisis that is being brought to a head by the rapid growth and far-reaching impact of the Internet.
Appropriate balances need to be found. And because of rapid change in business processes, technologies and public expectations, social processes need to be in place to continually re-adjust those balances.
To date, when decisions have been made about privacy, there has been a strong tendency for privacy to be compromised, but for the interests of governments and corporations to be left untouched. This is not balance at all, but simply represents the subjugation of social needs to economic efficiencies as perceived by powerful organisations.
If trust is to be achieved, and progress is to be made in eBusiness, then processes are needed that actively seek a balance among interests, and that value privacy highly. Inevitably, the balancing process is political in nature, involving the exercise of power deriving from authority, markets, or any other available source. Consumer/citizens may be politically weaker than the other players, but a balanced outcome cannot be achieved in a context in which they are excluded from discussions and decision-making about system designs.
The degree of trust by consumer/citizens that is necessary to support eBusiness has not been achieved. If there is to be real progress in eCommerce and eGovernment, then a positive approach needs to be taken to the structuring of privacy protections. A three-pronged attack is necessary, addressing technological, legal and organisational aspects.
Privacy-invasive technologies (the PITs) need to be exposed, documented and analysed. Designs using them must be negotiated among affected parties rather than just imposed by scheme sponsors. This requires:
In addition, privacy-enhancing technologies need to be developed, promoted and applied. Some PETs are countermeasures against PITs, others support strong anonymity, and yet others offer better balance between privacy and accountability, by means of protected pseudonymity.
For a half-century, organisations have been collecting, storing, using and disseminating ever more data, ever more 'finely grained'. FIPs legislation has facilitated more 'information-intensive' relationships between organisations and individuals, in return for quite limited constraints on organisational behaviour in relation to personal data.
Technologies have increased enormously in sophistication, in capacity, and in intrusiveness. Convergence has taken place between computing, telecommunications and robotics, to deliver what is currently referred to as 'information technology'. Identifiers, automated data collection devices, transponders, locator mechanisms, biometrics and RFID tags have been developed and are being deployed.
During the last few years, corporations and industry associations have had considerable success in lobbying for adjustment of Australian laws away from the European 'data protection law' interpretation of FIPs towards the much more corporation-friendly U.S. 'self-regulatory' interpretation. One example is the encroachment of the U.S. anti-privacy 'opt-out' philosophy, which is diametrically opposed to the 'opt-in', consent-based approach that is fundamental to the OECD Guidelines and to the data protection laws of almost all countries that have enacted them. Another example is the joint U.S.-Australian attempt to use APEC as a means of watering down the interpretation of the OECD Guidelines to be used in the region.
These manoeuvres are seriously out of touch with the reality of consumer/citizen grievances. The OECD Guidelines were conceived in a period when information technology was far less mature, powerful, integrated and threatening than it is now, and they were in any case inadequately protective when they were formulated. In order to attract greater adoption of Citizen-with-Government and Consumer-with-Business net-based services, existing FIPs-style 'data protection' laws need to be re-visited, and greatly strengthened.
The public senses the inadequacies of existing protections and the great shift in the balance of power in favour of governments and corporations. Public interest advocates clamour for more substantial protections. The precepts on which privacy-protective infrastructure for the twenty-first century is built must transcend the limited principles of 'fair information practice'. In particular, additional principles are emerging in areas such as anonymity and pseudonymity, limitations on multiple use of identifiers, controls over the application of biometrics, mandatory privacy impact assessments, public justification for privacy-invasive systems and practices, and effective implementation of consents and denials.
A credible co-regulatory environment would have a number of characteristics. It would involve general principles that were applied somewhat differently depending on the circumstances. Dispute resolution procedures would operate at the levels of individual organisations and industry associations, in order to focus on problem-solving first and arbitration second. Effective sanctions would be specified, and access provided to quasi-judicial (tribunal) and court enforcement procedures, in order to bring misbehaviour under control. The regime would be tied together with formal legislative obligations and sanctions, and a regulator vested with workable statutory powers and provided with the necessary resources.
Corporations and government agencies have taken far too long to recognise that privacy is a strategic variable, and that it needs to be approached in that light. It is a factor that can be ignored or handled badly and become an impediment to eBusiness, or it can be appreciated and addressed and become a positive feature of business relationships and processes.
It is remarkable that, in an era in which public policy discussions talk of 'stakeholders', individuals and their representatives and advocates are largely excluded from participation in the development of large-scale schemes involving inherently privacy-invasive technologies.
Meaningful consultative processes are critical. The involvement needs to be longitudinal, commencing when the project is being framed, continuing through the analysis of requirements, and the conceptual and detailed design, and on through the implementation and operation phases.
Old-style Environmental Impact Statements (EIS) have matured into Environmental Impact Assessment (EIA) processes, which require consultation and interaction rather than the mere publication of a document sculpted so as to justify approval of the project. These ideas have been broadened in the direction of Social Impact Assessment, encompassing additional aspects such as service quality assurance, accessibility, equity and privacy.
Privacy Impact Assessment (PIA) is now a legal requirement in many countries in Australia's reference group. PIAs are most effective when they build confidence through ongoing linkages among the stakeholder groups. This argues for standing arrangements that involve relevant public interest representative and advocacy groups.
Adoption of eCommerce and public acceptance of eGovernment depend on more than 24/7 service availability and a 'value proposition'. Privacy will remain a major impediment until and unless businesses and agencies commit to it as a strategic variable that significantly influences the conception, requirements analysis, design and operation of their business processes and supporting information systems.
Privacy has always been about balance and trade-off. But the approach taken to privacy protection to date in Australia has not produced the positive image of a confident athlete on a balance-beam. Instead, it has all the uncertainty and instability of a patient on a wobble-board.
Information law demands the formalisation of balancing processes between, on the one hand, freedoms to know, to publish and to express, and, on the other, freedoms to be, to hide, and to deny. The information economy is dependent on trust, trust has to be earned, and intrusion-permissive and intrusion-enabling arrangements undermine trust.
Privacy is both sustainable and a necessary focal point of the information society, fundamentally as a means of resisting the commoditisation of human beings, but also as a means of enabling eBusiness. Industry self-regulation and the development and application of privacy-enhancing technologies are necessary, but they are not sufficient.
PITs need to be controlled, and PETs need to be stimulated. Strong privacy-protective laws are essential. In order to cope with the last quarter-century's dramatic enhancements to the capabilities and capacity of information technology, the principles underlying those laws must extend well beyond the outdated set codified in the 1980 OECD Guidelines. The addition of stakeholder consultation and participation to the technical and legal protections can ensure that consumer/citizens perceive their interests to have been reflected in the ongoing balancing act that is privacy protection. Privacy protections can thereby underpin the trust that is essential to effective eBusiness.
Rather than building references into the text, the approach taken in this paper has been to provide a comprehensive set of resources underlying the various sections. Some of the resources are pointers to original sources. Many are papers by this author on specific topics, many of which provide references to the broader literature, in some cases in significant quantity.
Clarke R. (1988) 'Information Technology and Dataveillance' Comm. ACM 31,5 (May 1988) Re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, at http://www.rogerclarke.com/DV/CACM88.html
Clarke R. (1993) 'A 'Future Trace' on Dataveillance: The Anti-Utopian and Cyberpunk Literary Genre' Xamax Consultancy Pty Ltd, March 1993, at http://www.rogerclarke.com/DV/NotesAntiUtopia.html
Clarke R. (1994) 'Information Technology: Weapon of Authoritarianism or Tool of Democracy?' Proc. World Congress, Int'l Fed. of Info. Processing, Hamburg, September 1994, at http://www.rogerclarke.com/DV/PaperAuthism.html
Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Info. Technology & People 7,4 (December 1994), at http://www.rogerclarke.com/DV/HumanID.html
Clarke R. (1997) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' Xamax Consultancy Pty Ltd, August 1997, at http://www.rogerclarke.com/DV/Intro.html
Clarke R. (1998) 'A History of Privacy in Australia - Context' Xamax Consultancy Pty Ltd, December 1998, at http://www.rogerclarke.com/DV/OzHC.html
Clarke R. (1998) 'Information Privacy On the Internet: Cyberspace Invades Personal Space' Telecommunication Journal of Australia 48, 2 (May/June 1998), at http://www.rogerclarke.com/DV/IPrivacy.html
Clarke R. (1999) 'Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice' Proc. User Identification & Privacy Protection Conf., Stockholm, 14-15 June 1999, at http://www.rogerclarke.com/DV/UIPP99.html
Clarke R. (2000) 'Beyond the OECD Guidelines: Privacy Protection for the 21st Century' Xamax Consultancy Pty Ltd, January 2000, at http://www.rogerclarke.com/DV/PP21C.html
Clarke R. (2001) 'While You Were Sleeping ... Surveillance Technologies Arrived', Australian Quarterly 73, 1 (January-February 2001), at http://www.rogerclarke.com/DV/AQ2001.html
Clarke R. (2001) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001) 181-183, 188, at http://www.rogerclarke.com/DV/PITsPETs.html
UDHR (1948) 'The Universal Declaration of Human Rights' United Nations, 1948, at http://www.un.org/Overview/rights.html
ICCPR (1966) 'The International Covenant on Civil and Political Rights' United Nations, at http://www.unhchr.ch/html/menu3/b/a_ccpr.htm
APEC (2004) 'APEC Privacy Principles; Version 9 (consultation draft)' 27 February 2004, at http://www.bakercyberlawcentre.org/appcc/apec_draft_v9.htm
APF (1998-) 'Privacy Laws - Commonwealth' Australian Privacy Foundation Resource-Pages, at http://www.privacy.org.au/Resources/PLawsClth.html
APF (1998-) 'Privacy Laws - States and Territories' Australian Privacy Foundation Resource-Pages, at http://www.privacy.org.au/Resources/PLawsST.html
APF (1998-) 'Privacy Laws - World' Australian Privacy Foundation Resource-Pages, at http://www.privacy.org.au/Resources/PLawsWorld.html
APF (1998-) 'Privacy Laws - International' Australian Privacy Foundation Resource-Pages, at http://www.privacy.org.au/Resources/PLawsIntl.html
Clarke R. (1989) 'The OECD Data Protection Guidelines: A Template for Evaluating Information Privacy Law and Proposals for Information Privacy Law' Xamax Consultancy Pty Ltd, April 1989, at http://www.rogerclarke.com/DV/PaperOECD.html
Greenleaf G.W. (2004) 'Criticisms of the APEC Privacy Principles (Version 9), and recommendations for improvements', Working Paper, March 2004, at http://www2.austlii.edu.au/%7Egraham/publications/2004/APEC_V9_critique/APEC_V9_critique.html
OECD (1980) 'Guidelines on the Protection of Privacy and Transborder Flows of Personal Data', Organisation for Economic Cooperation and Development, 1980, at http://www.oecd.int/document/20/0,2340,en_2649_201185_15589524_1_1_1_1,00.html
Clarke R. (2001) 'Privacy as a Means of Engendering Trust in Cyberspace' UNSW L. J. 24, 1 (July 2001) 290-297, at http://www.rogerclarke.com/DV/eTrust.html
Clarke R. (2001) 'Trust in Cyberspace: What eCommerce Doesn't Get' Proc. U.N.S.W. Continuing Legal Education Seminar on 'Cyberspace Regulation: eCommerce and Content', 25 May 2001, Sydney, at http://www.rogerclarke.com/EC/TrustCLE01.html
Clarke R. (2001) 'Of Trustworthiness and Pets: What Lawyers Haven't Done for e-Business' Proc. Pacific Rim Computer Law Conf., Sydney, February 2001, at http://www.rogerclarke.com/EC/PacRimCL01.html
Clarke R. (2001) 'Meta-Brands' Privacy Law & Policy Reporter 7, 11 (May 2001), at http://www.rogerclarke.com/DV/MetaBrands.html
Clarke R. (2002) 'Trust in the Context of e-Business' Internet Law Bulletin 4, 5 (February 2002) 56-59, at http://www.rogerclarke.com/EC/Trust.html
Clarke R. (2002) 'e-Consent: A Critical Element of Trust in e-Business' Proc. 15th Bled Electronic Commerce Conf., Bled, Slovenia, 17-19 June 2002, at http://www.rogerclarke.com/EC/eConsent.html
Treasury (2000) 'Building Consumer Sovereignty in Electronic Commerce: A best practice model for business', Department of the Treasury, May 2000, at http://www.ecommerce.treasury.gov.au/publications/BuildingConsumerSovereigntyInElectronicCommerce-ABestPracticeModelForBusiness/index.htm
Clarke R. (1992) 'Extra-Organisational Systems: A Challenge to the Software Engineering Paradigm' Proc. IFIP World Congress, Madrid, September 1992, at http://www.rogerclarke.com/SOS/PaperExtraOrgSys.html
Clarke R. (1996) 'Privacy and Dataveillance, and Organisational Strategy' Proc. Conf. I.S. Audit & Control Association (EDPAC'96), Perth, 28 May 1996, at http://www.rogerclarke.com/DV/PStrat.html
Clarke R. (1997) 'Exemptions from General Principles Versus Balanced Implementation of Universal Principles' Xamax Consultancy Pty Ltd, February 1997, at http://www.rogerclarke.com/DV/Except.html
Clarke R. (1998) 'Privacy Impact Assessments' Xamax Consultancy Pty Ltd, February 1998, at http://www.rogerclarke.com/DV/PIA.html
Clarke R. (1999) 'Internet Privacy Concerns Confirm the Case for Intervention', Communications of the ACM 42, 2 (February 1999), at http://www.rogerclarke.com/DV/CACM99.html
Clarke R. (2003) 'Emergent Privacy Protection Principles' Xamax Consultancy Pty Ltd, April 2003, at http://www.rogerclarke.com/DV/EPPP.html
Clarke R. (2004) 'A History of Privacy Impact Assessments' Xamax Consultancy Pty Ltd, February 2004, at http://www.rogerclarke.com/DV/PIAHist.html
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., a Visiting Professor in the E-Commerce Programme at the University of Hong Kong, and a Visiting Professor in the Department of Computer Science at the Australian National University.