For presentation in a Session on 'Privacy from the Informatics Point of View'
at the Interdisciplinary Conference on 'Privacy and Freedom'
at Bielefeld University - 4-5 May 2015
Notes of 13 April 2015
Roger Clarke **
© Xamax Consultancy Pty Ltd, 2015
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://www.rogerclarke.com/DV/Biel15.html
The accompanying slide-set is at http://www.rogerclarke.com/DV/Biel15.pdf
An international framework for freedoms exists in the form of an International Covenant (ICCPR). Although this includes no fundamental right of individual self-determination, a set of rights and freedoms associated with privacy provides a substantial basis for individual freedom.
Privacy is, however, a complex construct. One of its dimensions, data privacy, has attracted a great deal of attention since the 1960s; but data protection laws, even in Europe, provide incomplete protections of the real human need. Three further dimensions - privacy of the person, of personal communications, and of personal behaviour - are subject to a scatter of protections that lack comprehensiveness and cohesion. A fifth dimension - privacy of personal experience - has come to prominence only during the last decade.
Information technology has been utilised by organisations to intrude mightily into all dimensions of privacy. Underpinning the many forms of surveillance are the capabilities to identify individuals, to locate them, and to track them.
Achieving societal resilience against this onslaught requires precautionary and preventative actions during the formative phases of new privacy-intrusive applications of technology. Where schemes proceed, further measures are needed, to ensure resistance, robustness and the resilience to recover freedoms after the repressive measures have been overcome. In addition to being a weapon of privacy-abuse, information technology has the capacity to support societal resilience, and to sustain freedom and privacy.
This event has as its focus the nature and importance of privacy as an element of civic freedoms. The event is intended to contribute to a research program which initially examines the interactions among privacy, freedoms and information technology, and which will ultimately assess their role in ensuring the survival of political freedom.
This paper commences by reviewing the notion of freedom, including the basis for protections of freedom provided by international law. It then addresses privacy, identifying the motivations for it, and its dimensions, and the nature of privacy protection. It notes the negative impacts of information technologies on freedoms and privacy. It then identifies important aspects of the design of pro-privacy and privacy-sensitive applications of technology.
Freedom in the context of this paper refers to the scope for people to enjoy a substantial degree of liberty, subject only to those constraints that are necessary in order to sustain the freedoms of others, social and political cohesion and economic wellbeing and sustainability.
Freedom encompasses both positive human rights - to express views and to perform acts - and negative human rights - in the form of freedom from unreasonable external constraints on thoughts, words and deeds. The term 'rights' refers to legal, social, or moral freedoms to act or refrain from acting, or entitlements to be acted upon or not acted upon. In practical terms, the protection of a person's freedom depends on the rule of law, laws that give rise to rights, and enforcement of those laws. Further discussion is in Clarke (2011d).
It is conventional to distinguish rights to liberty and participation in political life - addressed in the Universal Declaration of Human Rights (UDHR 1948, at Articles 3-21), from rights to equality (UDHR Articles 22-27), and from further rights that have received less broad international acceptance.
The International Covenant on Civil and Political Rights (ICCPR 1966) provides a framework for the first category of rights that is recognised in international law. The rights to equality, on the other hand, are expressed in the less widely adopted International Covenant on Economic, Social and Cultural Rights (ICESCR 1966). A summary of the rights recognised within the ICCPR and ICESCR is provided in AHRC (2015), and reproduced in Appendix 1. ICCPR rights that are of particular relevance to the analysis conducted here are highlighted in the following section.
Privacy is a pivotal value, which is reflected in a dozen ICCPR Articles, and which underpins many of the rights that are constituents of freedom. Further discussion is in Clarke (2014h). The following sections provide a working definition of the term, and then probe deeper into the motivations underlying it, the dimensions of the privacy notion, and the nature of privacy protections.
The philosophical notion of self-determination is associated with the idea of 'free will', and refers to the capacity of an individual to make their own decisions about their own behaviour. A legal framework that strongly valued individualism would have individual self-determination at its core. However, no such right has been formalised in human rights documents. The concept of 'self-determination of peoples' is expressed in ICCPR Article 1, but that has to do with the capacity of a (voting) populace within a nation-state to determine that country's political, economic, social and cultural activities.
Although individual self-determination is not currently a legal right, a more qualified form of freedom exists, in the form of privacy. The meaning and scope of the term 'privacy' is subject to endless, and frequently fruitless, debate. Jurisdictions around the world have generally avoided even attempting a legal definition of it. A practical approach is as follows (Morison 1973, Clarke 1997-):
Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations
This recognises the 'interest', but leaves open the ways in which support for that interest is provided by specific freedoms and legal rights. The following sections investigate relevant aspects of the notion and the associated rights.
In order to reach more deeply into the practical meaning of privacy, it is important to appreciate the reasons why people want it. The motivations for privacy are apparent at multiple levels. Analyses can be pursued from such perspectives as religion, philosophy and artistic expression. However, the perspectives in Table 1 are those most commonly encountered (Clarke 1997-, 2006b).
This presentation is concerned with the full spread of motivations for privacy. In addition, it is intended to lay the foundation for further work on the specific topic of political freedom.
Privacy is not a simple concept, but rather a construct of considerable complexity. The diversity of motivations outlined above is associated with a similar degree of variety in what I refer to as 'dimensions' and what some other authors prefer to call 'aspects' or 'types' (Finn et al. 2013). It is important to distinguish the following dimensions in order to ensure that discussions about privacy invasiveness and privacy protections are not limited to one or a few elements of human needs, but are comprehensive. For discussions of these dimensions, see Clarke (1997-, 2006b, 2014h).
The deepest-seated need is for privacy of the physical person. A person who is in danger, wet and cold, or seriously hungry, does not have the luxury of worrying about needs higher up the Maslowian hierarchy. On the other hand, people in many societies enjoy pleasant living conditions, and place considerable value on four further dimensions of their privacy, for psychological, social, economic and political reasons.
This dimension of privacy is subject to a number of formally-recognised freedoms, including the right to life (ICCPR Article 6), right to freedom from torture and non-consensual experimentation (7), freedom from slavery (8), liberty and security and freedom from arbitrary arrest and detention (9), the right to humane treatment (10), and equality before the law (14).
Individuals have an interest in being able to communicate with people who they choose to interact with, without others being privy to the conversation. Speech has always been at risk of being overheard, and since at least the early days of the telegraph in the 1840s, messages have been subjected to electronic interception. Moreover, unlike earlier forms of surveillance, electronic interception is usually conducted covertly. The Snowden revelations have caused far more people to appreciate that invasions of the privacy of human communications have reached epidemic proportions and that law enforcement agencies see interception capacity as an entitlement.
This dimension of privacy is subject to a number of formally-recognised freedoms, including freedom from arbitrary or unlawful interference with correspondence (ICCPR 17), freedom of conscience and religion (18), freedom of expression and the right to seek, receive and impart information and ideas (19), and the rights to peaceful assembly (21) and freedom of association (22).
There was limited need to assert rights to data privacy while personal data was scattered across discrete physical records and read only by the few people who could gain access to each of them. Since the application of computing technologies to administrative data commencing about 1955, however, the privacy of personal data has also been subject to rapidly escalating intrusive behaviours and applications of technologies (Clarke 1988).
During the 1970s, business and government moved to defuse public concerns by creating a chimera rather than a shield. The express purpose of the OECD Guidelines was "to advance the free flow of information between Member countries and to avoid the creation of unjustified obstacles to the development of economic and social relations among Member countries" (OECD 1980). As a result, the real function of 'data protection' laws is not to protect personal data, let alone the person, but to authorise privacy-invasive behaviours by organisations subject to a limited regulatory framework.
The ICCPR was based on the 1948 UDHR document, which predates administrative applications of computing, and as a result the concept of personal data does not appear in the document. The primary civil right that supports data privacy is generally regarded as being freedom from arbitrary or unlawful interference with privacy (Article 17).
People have an interest in having considerable freedom to behave as they wish, subject only to the constraints of avoiding harm to the interests of others, of communities and societies. An individual's behaviour may be constrained by means of acts directly against their person, or by indirect influences, in particular through surveillance.
Surveillance constrains how people act, whether it is conducted in a physical manner (using the eyes and ears of humans), aided by technologies (such as directional microphones and recording apparatus), or entirely automatically. For discussions, see Clarke (2000, 2001a, 2007, 2009d, 2010). Covert surveillance causes many people to have a generalised fear of the 'pan-optic', and this can have an even more substantial impact on their freedom of behaviour. The pan-optic 'chilling effect' ranges from being highly desirable (where it creates a disincentive for criminal, sociopathic or psychopathic behaviour) to highly undesirable (where it reduces artistic creativity, scientific and engineering inventiveness, economic innovation or political speech).
Relevant freedoms include the rights to liberty and security of person, and against arbitrary arrest or detention (ICCPR 9), freedom of movement (12), freedom from arbitrary or unlawful interference with privacy, family and home (17), freedom of conscience and religion (18), the right to hold opinions, freedom of expression, and freedom to seek, receive and impart information and ideas (19), the rights to peaceful assembly (21) and freedom of association (22), and the right to participation in public affairs, to vote and to be elected (25).
A further dimension of human concerns has been threatened by technological changes of the first two decades of the new century. What an individual reads and views, and the ideas that they gain access to through meetings and other events, have been converted from unrecorded ephemera to stored data. This is achieved through the logging of content sought (search-terms), text read (e-books and web-pages), audio listened to, and image and video viewed. That data is under the control of, and exploitable by, for-profit corporations, and is available to government agencies. The privacy of personal thought may not yet be directly under assault, but the privacy of personal experience is a dangerously close proxy for it.
The ICCPR contains a range of prescriptions that are relevant to this dimension of privacy, including freedom of movement (12), freedom from arbitrary or unlawful interference with privacy and correspondence (17), freedom of thought, conscience and religion (18), the freedom to seek and receive information and ideas (19), the right to peaceful assembly (21), freedom of association (22), and the right to participation in public affairs (25).
The primary focus of the Research Program of which this event forms a part is on the often-underplayed political motivation for privacy, and hence to a considerable extent on the behavioural and experiential dimensions of privacy.
As with all other individual freedoms, the privacy of one individual comes into conflict with other needs of the individual, with the interests of other individuals, and with the interests of communities, societies and economies. Privacy rights are accordingly qualified rather than absolute; and privacy protection is always an exercise in balance among competing interests.
Since the 1960s, the vast majority of privacy discussion has been about the protection of data, and hence almost solely about the privacy of personal data. In addition, a moderate amount of legislation exists relating to data communications, although the function of much of that law is to authorise breaches of communications privacy. Recently, the Snowden revelations have lifted public awareness of the extent to which public expectations, and even the law, are breached. Privacy of the person and of personal behaviour are subject to scatters of largely incoherent law. Privacy of personal experience is a new dimension, and the law has yet to grapple with it.
On the basis of the above analyses of freedom and privacy, the following sections consider the impacts of information technologies.
The negative impacts of information technology on data privacy have been much-discussed, e.g. Clarke (1988). Similarly, impacts of technology on communications privacy have attracted considerable attention. This attention has intensified recently. More widespread understanding now exists that freedoms have been compromised by the excessive provisions of post-2001 statutes passed in a state of legislative hysteria, that controls over the use of those laws are desperately inadequate, and that even those limited laws are continually subverted, circumvented and breached by national security agencies throughout the world.
Impacts of technology on the privacy of personal behaviour and of personal experience, on the other hand, are much less well-studied. Building in particular on Clarke (2009d, 2010), Table 2 provides a brief summary of key categories of technology-aided activity that have negative impacts on freedom and privacy.
All of which are underpinned by Identification, and by Location and Tracking, and which enable Interference with Behaviours.
Recent decades have seen a great deal of development in surveillance technologies. Less progress has been made in relation to direct interdiction of people's movements and interference with people's behaviour. On the other hand, 'no-fly lists' and 'open prison' techniques using anklets with embedded chips may be harbingers of broader categories of interventional technologies.
Table 2 notes two key enabling technologies. Identity technologies are examined in Clarke (1987, 1994b, 1999b, 2001c, 2006a, 2008, 2009c, 2014e) and Clarke & Wigan (2011). For material on location and tracking technologies, see Clarke (2001c), Wigan & Clarke (2006, 2009), Clarke (2009a), Clarke & Wigan (2011), Michael & Clarke (2013) and Clarke (2014f, 2014g).
In addition to specific technologies, it is necessary to consider a broader view of the nature of computing and communications technologies. An earlier model of developments during the 65 years since the emergence of computing divided the period into three eras (Clarke 1994a, 2004), but it is now necessary to postulate a fourth:
This tendency back towards control is also evident in the successive attempts by repressive governments to gain control of Internet management and architecture, initially through the Government Advisory Committee (GAC) of the Internet Corporation for Assigned Names and Numbers (ICANN), and currently through the International Telecommunication Union (ITU) and its World Conference on International Telecommunications (WCIT).
The analysis in this section shows that privacy-invasive technologies are already well-established and are gaining in their intensity and intrusiveness. The remaining sections of this paper focus on the ways in which freedom and privacy can survive technologically-assisted intrusions, firstly by examining principles underlying the resilience of society, and then by identifying some specific examples of ICT designs to support freedom and privacy.
A framework is needed within which IT's usefulness in sustaining free societies can be assessed. The analysis in this section derives from work undertaken while the author was a member of the Advisory Group of the EU-funded project Increasing Resilience in Surveillance Societies (IRISS, 2012-2015). See Clarke (2013a). It suggests that the notion of 'Societal Resilience' against privacy-intrusive institutions comprises four segments of activity, discussed below in chronological order.
Precautionary or Preventative Measures are actions that are taken at an early stage, and that are intended to ensure that proposals that embody threats are subject to an appropriate process. As summarised in Table 3, the fundamental requirements are that proposals for threatening schemes be subjected to evaluation, and that schemes do not proceed unless they satisfy seven further 'meta-principles' (Clarke 2007, APF 2009, 2013, Clarke 2010, 2014i).
A variety of methods exist for the assessment of technologies and their application to particular purposes and in particular contexts (Clarke 2014a). These methods include Technology Assessment (TA), Social Impact Assessment, Privacy Impact Assessment (PIA - Clarke 2009b), and Surveillance Impact Assessment (Wright & Raab 2012, Wright et al. 2015).
Such actions might be taken by the sponsors of the scheme, but - particularly in contexts in which institutional or market power is available to sponsors - that is unlikely and may in any case result in a deficient or biased evaluation. Hence it is usually necessary for actions to be taken by other parties.
In practice, in most countries, proposals that embody privacy threats are not subject to any institutional framework that ensures an appropriate evaluation process is undertaken. For example, such Offices of Technology Assessment (OTAs) as exist, mainly in Europe, lack the power to ensure evaluation is undertaken, and that outcomes of evaluations are reflected in designs (Clarke 2014a); and PIAs are optional and/or superficial and/or fail the criteria of effective process (Clarke 2011a).
The dictum 'the price of liberty is eternal vigilance' is of little use unless the discovery of a threat is followed by action to deal with that threat. Hence a fundamental exercise of political freedoms is agitation to ensure that appropriate evaluation processes are applied. When one or more organisations is seeking to apply their institutional and/or market power in order to impose a scheme that is privacy-threatening, measures are needed to activate or generate countervailing power, in order to prevent the scheme proceeding, or to force changes in the process whereby the scheme is developed and/or changes in the design of the scheme.
During the period when a scheme is being approved, developed and implemented, and even after it has come into being, Resistance comprises actions that are intended to have a detrimental effect on its effectiveness. Resistance may take many forms, some of which are passive, such as avoidance and civil disobedience; and others of which are active, ranging from political speech and media coverage, via demonstrations and court actions, to sabotage. A handbook presenting 100 Campaign Principles is provided by Davies (2014).
Robustness is the quality whereby important values and processes are sustained, despite the constraints placed on freedom and privacy. The focus here is primarily on schemes that are unjustified or excessive and/or embody inadequate measures to mitigate unnecessary negative impacts.
Robustness exists if a large majority of institutions and individuals continue to exhibit relevant social, economic and political behaviours despite the imposition of the constraints. This may be because the constraints are ignored, or even an object of derision; or because the institutions and individuals have become inured to them and take the risk; or because they intentionally transgress norms that authoritarian agencies seek to impose, in effect daring those organisations to take action.
The term 'resilience' was used in the IRISS Project as a collective word encompassing all phases of the process whereby political freedoms are sustained in the face of surveillance schemes. The term also has a narrow usage, which refers to the quality whereby freedom and privacy are recovered, following the imposition of significant constraints. Resilience is evidenced by a majority of institutions and individuals resuming social, economic and political behaviours after a period during which privacy has been subjected to constraints that were not justifiable, or are no longer justifiable.
Within this framework, the following section considers the contributions that technology can make to the protection of privacy.
This section identifies several ways in which the implementation of societal resilience principles is supported by information technology.
The term 'privacy-enhancing technology' (PET) was coined at least as long ago as 1991, and has been mainstream at least since the joint work by Cavoukian & Hustinx in 1995. Since the late 1990s, I have juxtaposed against it the term 'Privacy-Invasive Technologies' (the PITs), as a convenient way of referring to the vast array of technologies and applications of technologies that threaten privacy (Clarke 1999a). Some technologies can of course be applied in a positive or a negative manner, and hence a more appropriate term than 'the PITs' might be 'Privacy-Invasive Technologies-in-Use' (PITUs).
Table 4 distinguishes a number of categories of PETs (Clarke 2001b).
An alternative approach to the categorisation of PETs might focus on five key areas in which protections are needed. See Table 5. In principle, it is desirable that normal social, cultural and political activities not need such protections; but the practicalities, even in free nations, are that these capabilities need to be created, maintained and exercised, and in some circumstances used on a routine basis.
A great many PETs exist, and have been catalogued. See for example PRISM-Break, the EPIC Online Guide to Practical Privacy Tools, and Kissell (2014). The adoption-rate of PETs has been considerably slower than their developers had anticipated (Clarke 2014b). The desperately low levels of security associated with consumer devices, and the untrustworthiness of government agencies, are together encouraging more rapid adoption. Nonetheless, the tools need to be simplified, routinised and embedded in products and practices (Clarke 2014c).
A specific example of a service that is badly needed, but not yet delivered, is Consumer-Oriented Social Media (COSM). This term refers to means for people to interact with other individuals, broadcast messages, and collaborate with others, without being subject to the highly exploitative business model of contemporary social networking services, and without exposing their profiles, identities and locations. Categories of 'persons-at-risk' who have particular need for such a service include victims of domestic violence, protected witnesses and undercover operatives (Clarke 2001d, GFW 2011). Table 6 identifies key features of COSM as proposed in Clarke (2014d):
This paper has reviewed the nature of freedom and privacy. It has identified ways in which informatics has greatly increased privacy intrusions. In addition, however, it has shown that information technologies provide tools for the maintenance of freedom and privacy, through protections for data, messages, identity and social networks.
An optimist perceives that the ongoing technological tendencies towards inexpensive and highly distributed technologies will afford support for freedom and privacy, which will enable repressive forces to be held at bay. A pessimist sees the wave of open and distributed technologies as waning, and being replaced by closed and centrally-controlled technologies, with the forces of repression actively subverting designs in order to align them with their own interests, and to establish controls over people and their behaviour. An alternative is to see continual tension and flux between the two extremes, with brief periods of dominance by each, but mostly varying degrees of balance between them.
Unfortunately, the winners are generally the powerful, and the powerful are almost never the public and almost always large corporations, large government agencies, and organised crime. In the information era, the maintenance of freedom and privacy is utterly dependent on individuals understanding, implementing and applying technology in order to protect free society against powerful institutions.
Extract from AHRC (2015), emphases added
AHRC (2015) 'Rights and freedoms: right by right' Australian Human Rights Commission, April 2015, at http://www.humanrights.gov.au/rights-and-freedoms-right-right-0
APF (2009) 'Visual Surveillance, incl. CCTV' Australian Privacy Foundation, October 2009, at https://www.privacy.org.au/Papers/PS-CCTV.html
APF (2013) 'Meta-Principles for Privacy Protection' Australian Privacy Foundation, March 2013, at https://www.privacy.org.au/Papers/PS-MetaP.html
Clarke R. (1987) ''Just Another Piece of Plastic for Your Wallet': The Australia Card' Prometheus 5,1 June 1987 Republished in Computers & Society 18,1 (January 1988), with an Addendum in Computers & Society 18,3 (July 1988), at http://www.rogerclarke.com/DV/OzCard.html
Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) Re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, PrePrint at http://www.rogerclarke.com/DV/CACM88.html
Clarke R. (1994a) 'The Eras of Dataveillance' Xamax Consultancy Pty Ltd, March 1994, at http://www.rogerclarke.com/DV/NotesDVEras.html
Clarke R. (1994b) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994), at http://www.rogerclarke.com/DV/DigPersona.html
Clarke R. (1994c) 'Information Technology: Weapon of Authoritarianism or Tool of Democracy?' Proc. World Congress, Int'l Fed. of Info. Processing, Hamburg, September 1994, PrePrint at http://www.rogerclarke.com/DV/PaperAuthism.html
Clarke R. (1997-) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' Xamax Consultancy Pty Ltd, August 1997, revisions to March 2015, at http://www.rogerclarke.com/DV/Intro.html
Clarke R. (1999a) 'Privacy-Enhancing and Privacy-Sympathetic Technologies: Resources' Xamax Consultancy Pty Ltd, April 1999, at http://www.rogerclarke.com/DV/PEPST.html
Clarke R. (1999b) 'Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice' Proc. User Identification & Privacy Protection Conf., Stockholm, 14-15 June 1999, PrePrint at http://www.rogerclarke.com/DV/UIPP99.html
Clarke R. (2000) 'Technologies of Mass Observation' Notes for the 'Mass Observation Movement' Forum, Melbourne, 26 October 2000, at http://www.anu.edu.au/people/Roger.Clarke/DV/MassObsT.html
Clarke R. (2001a) 'While You Were Sleeping ... Surveillance Technologies Arrived' Australian Quarterly 73, 1 (January-February 2001), PrePrint at http://www.rogerclarke.com/DV/AQ2001.html
Clarke R. (2001b) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001) 181-183, 188, PrePrint at http://www.rogerclarke.com/DV/PITsPETs.html
Clarke R. (2001c) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Information Technology & People 14, 2 (Summer 2001) 206-231, PrePrint at http://www.rogerclarke.com/DV/PLT.html
Clarke R. (2001d) 'Research Challenges in Emergent e-Health Technologies' Xamax Consultancy Pty Ltd, July 2001, at http://www.rogerclarke.com/EC/eHlthRes.html#PAR
Clarke R. (2004) 'Origins and Nature of the Internet in Australia' Xamax Consultancy Pty Ltd, January 2004, various published versions, master-copy at http://www.rogerclarke.com/II/OzI04.html#IImpl
Clarke R. (2006a) 'National Identity Schemes - The Elements' Xamax Consultancy Pty Ltd, February 2006, at http://www.rogerclarke.com/DV/NatIDSchemeElms.html
Clarke R. (2006b) 'What's 'Privacy'?' Xamax Consultancy Pty Ltd, August 2006, at http://www.rogerclarke.com/DV/Privacy.html
Clarke R. (2007) 'The Regulation of Surveillance' Xamax Consultancy Pty Ltd, August 2007, at http://www.rogerclarke.com/DV/SReg.html
Clarke R. (2008) 'Dissidentity: The Political Dimension of Identity and Privacy' Identity in the Information Society 1, 1 (December 2008) 221-228, PrePrint at http://www.rogerclarke.com/DV/Dissidentity.html
Clarke R. (2009a) 'The Covert Implementation of Mass Vehicle Surveillance in Australia' Proc. 4th Workshop on the Social Implications of National Security: Covert Policing, April 2009, ANU, Canberra, at http://www.rogerclarke.com/DV/ANPR-Surv.html
Clarke R. (2009b) 'Privacy Impact Assessment: Its Origins and Development' Computer Law & Security Review 25, 2 (April 2009) 123-135, at http://www.rogerclarke.com/DV/PIAHist-08.html
Clarke R. (2009c) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, at http://www.rogerclarke.com/ID/IdModel-090605.html
Clarke R. (2009d) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html
Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, at http://www.rogerclarke.com/DV/RNSA07.html
Clarke R. (2011a) 'An Evaluation of Privacy Impact Assessment Guidance Documents' International Data Privacy Law 1, 2 (March 2011), at http://www.rogerclarke.com/DV/PIAG-Eval.html
Clarke R. (2011b) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011
Clarke R. (2011d) 'Cyborg Rights' IEEE Technology and Society 30, 3 (Fall 2011) 49-57, PrePrint at http://www.rogerclarke.com/SOS/CyRts-1102.html
Clarke R. (2013a) 'The Resilience of Society in the Face of Surveillance' Xamax Consultancy Pty Ltd, February 2013, at http://www.rogerclarke.com/DV/IRISSR.html#Res
Clarke R. (2013b) 'Consumer-Oriented Social Media: The Identification of Key Characteristics' Xamax Consultancy Pty Ltd, February 2013, at http://www.rogerclarke.com/II/COSM-1301.html
Clarke R. (2014a) 'Approaches to Impact Assessment' Panellist's Notes, CPDP'14, Brussels, 22 January 2014, Xamax Consultancy Pty Ltd, January 2014, at http://www.rogerclarke.com/SOS/IA-1401.html
Clarke R. (2014b) 'Key Factors in the Limited Adoption of End-User PETs' Proc. Politics of Surveillance Workshop, University of Ottawa, 8-10 May 2014, at http://www.rogerclarke.com/DV/UPETs-1405.html
Clarke R. (2014c) 'How to Promote PET Usage' Panellist's Notes, Politics of Surveillance Workshop, University of Ottawa, Xamax Consultancy Pty Ltd, May 2014, at http://www.rogerclarke.com/DV/PETPromo-1405.html
Clarke R. (2014d) 'The Prospects for Consumer-Oriented Social Media' Proc. Bled eConference, June 2014, PrePrint at http://www.rogerclarke.com/II/COSM-1402.html
Clarke R. (2014e) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182-207, at http://www.rogerclarke.com/ID/DP12.html
Clarke R. (2014f) 'The Regulation of Point of View Surveillance: A Review of Australian Law' IEEE Technology & Society 33, 2 (Jun 2014) 40-46, at http://www.rogerclarke.com/DV/POVSRA.html
Clarke R. (2014g) 'The Regulation of Civilian Drones' Impacts on Behavioural Privacy' Computer Law & Security Review 30, 3 (June 2014) 286-305, PrePrint at http://www.rogerclarke.com/SOS/Drones-BP.html
Clarke R. (2014h) 'Privacy and Free Speech' Invited Presentation at the Australian Human Rights Commission Symposium on Free Speech, Sydney, 7 August 2014, Xamax Consultancy Pty Ltd, August 2014, at http://www.rogerclarke.com/DV/PFS-1408.html
Clarke R. (2014i) 'Surveillance, Resilience and Democracy' Xamax Consultancy Pty Ltd, October 2014, at http://www.rogerclarke.com/DV/SRD14.html
Clarke R. & Wigan M.R. (2011) 'You Are Where You've Been: The Privacy Implications of Location and Tracking Technologies' Journal of Location Based Services 5, 3-4 (December 2011) 138-155, at http://www.rogerclarke.com/DV/YAWYB-CWP.html
Davies S. (2014) 'Ideas for Change: Campaign principles that shift the world' The Privacy Surgeon, December 2014, at http://www.privacysurgeon.org/resources/ideas-for-change/
Finn R.L., Wright D. & Friedewald M. (2013) 'Seven Types of Privacy' pp. 3-32 of Gutwirth S., Leenes R., de Hert P. & Poullet Y. (eds.) 'European Data Protection: Coming of Age' Springer, 2013
GFW (2011) 'Who is harmed by a "Real Names" policy?' Geek Feminism Wiki, undated, apparently of 2011, at http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy%3F
ICESCR (1966) 'International Covenant on Economic, Social and Cultural Rights' United Nations, 16 December 1966, at http://www2.ohchr.org/english/law/cescr.htm
ICCPR (1966) 'International Covenant on Civil and Political Rights' United Nations, 16 December 1966, at http://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx
Kissell J. (2014) 'Take Control of Your Online Privacy' TidBITS Publishing Inc., March 2014, at http://email.agilebits.com/t/r-l-ckltdlt-kjiuxtlh-t/
Michael K. & Clarke R. (2013) 'Location and Tracking of Mobile Devices: Überveillance Stalks the Streets' Computer Law & Security Review 29, 3 (June 2013) 216-228, at http://www.rogerclarke.com/DV/LTMD.html
Morison W.L. (1973) 'Report on the Law of Privacy' Government Printer, Sydney, 1973
OECD (1980) 'OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data' OECD, 1980, at http://www.oecd.org/internet/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm
UDHR (1948) 'Universal Declaration of Human Rights' United Nations, 10 December 1948, at http://www.un.org/en/documents/udhr/index.shtml
Wigan M.R. & Clarke R. (2006) 'Social Impacts of Transport Surveillance' Prometheus 24, 4 (December 2006) 389-403, at http://www.rogerclarke.com/DV/SITS-0604.html
Wigan M.R. & Clarke R. (2009) 'Transport and Surveillance Aspects of Location-Based Services' Transportation Research Record 2105 (September 2009) 92-99
Wright D. & Raab C.D. (2012) 'Constructing a surveillance impact assessment' Computer Law & Security Review 28, 6 (December 2012) 613-626
Wright D., Friedewald M. & Gellert R. (2015) 'Developing and testing a surveillance impact assessment methodology' International Data Privacy Law 5, 1 (2015) 40-53
This paper was developed in response to an invitation from Prof. Dr. Rüdiger Grimm to present, as part of a session on 'Privacy from the Informatics Point of View', to an Interdisciplinary Conference on 'Privacy and Freedom' held at Bielefeld University on 4-5 May 2015. The specific topic Prof. Grimm challenged me with was 'Political Freedom: Positive and Negative Effects of Mobile and Internet Applications', but with an initial emphasis on the psychological and social aspects of privacy that underpin political freedoms.
The paper draws heavily on a long series of papers previously published or presented in a wide range of venues, several with co-authors Dr Marcus Wigan and Prof. Katina Michael. The section on Resilience Principles derives from a contribution made as a member of the Advisory Group for the IRISS Project. It has benefited from comments by Rüdiger Grimm, Alana Maurushat, Dan Svantesson and Nigel Waters.
At this stage of its development, the majority of the citations in the paper are to prior works of the author. However, these comprise 25 refereed papers and a further 16 unrefereed papers, which together contain several hundred references to the literature, and which have a combined Google citation-count of about 1,500.
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in Computer Science at the Australian National University.
He has spent 28 years as a Board member of the world's longest-established privacy advocacy organisation, the Australian Privacy Foundation, including 8 years as its Chair, and is Secretary of the Internet Society of Australia.
Created: 12 March 2015 - Last Amended: 13 April 2015 by Roger Clarke