
A Framework for Analysing
Technology's Negative and Positive Impacts
on Freedom and Privacy

Published in Datenschutz und Datensicherheit 40, 1 (January 2016) 79-83

Version of 16 August 2015

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2015

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/DV/Biel15-DuD.html


Abstract

The term 'privacy' refers to a cluster of values within the broad framework of human rights. Developments in information technologies have substantial negative impacts on all of the dimensions of privacy. Beyond the well-documented harm to personal data and personal communications, we are imposing on one another increasingly serious intrusions into the privacy of the physical person, of personal behaviour, and of personal experience. The survival of human society and polity is dependent on technology being brought under human control, through actions to prevent harmful applications, resist their implementation, and ensure robustness and resilience against their effects. Some technologies offer themselves as weapons in that battle.


Contents

1. Introduction
2. Freedom and Privacy
3. Technology's Negative Impacts
4. Societal Resistance - The Principles
5. Technology's Positive Impacts
6. Conclusions
References

1. Introduction

Concerns about humans' creations gaining dominance over humans themselves have been mainstream in the arts for a century, whereas appreciation by scientists and engineers of the gravity and immediacy of the challenge has been very slow to develop. Rather than the robotics and AI technologies that have recently attracted the attention of Stephen Hawking, this paper is concerned with the impact on privacy and freedom of Information and Communications Technologies (ICT).

The paper commences by identifying, within the broad area of human rights, the central cluster of interests that make up privacy. An outline is then provided of the negative impacts of ICT on freedom and privacy. A set of principles is then declared that needs to be applied in order to combat technology's negative effects. This provides a basis for identifying positive contributions that technology can make to the protection of privacy.


2. Freedom and Privacy

The term 'freedom' refers to the scope for people to enjoy a substantial degree of liberty, subject only to those constraints that are necessary in order to sustain the freedoms of others, social and political cohesion, and economic wellbeing and sustainability. Freedom encompasses both positive human rights - to express views and to perform acts - and negative human rights - in the form of freedom from unreasonable external constraints on thoughts, words and deeds.

A foundation for international recognition of these freedoms was provided by the Universal Declaration of Human Rights (UDHR 1948, at Articles 3-21). An international legal framework was later established in the form of the International Covenant on Civil and Political Rights (ICCPR 1966). A summary of these rights is provided in Appendix 1 to this paper.

The philosophical notion of self-determination is associated with the idea of 'free will', and refers to the capacity of an individual to make their own decisions about their own behaviour. Individual self-determination is not currently recognised in international instruments as a legal right. A qualified form of self-determination does exist, however.

The meaning and scope of the term 'privacy' is subject to endless, and generally fruitless, debate. Jurisdictions around the world have generally avoided even attempting a legal definition of it. However, a practical approach is to define privacy as a broad-ranging interest (after Morison 1973):

Privacy is the interest that individuals have in sustaining 'personal space', free from interference by other people and organisations

The motivations for privacy are apparent at multiple levels. Analyses can be pursued from such perspectives as religion, philosophy and artistic expression. More usefully, motivations for privacy can be described at physiological, psychological, social, economic and political levels. A brief summary of each is in Appendix 2, with further detail in Clarke (1997, 2006).

Privacy is a complex construct. One of its dimensions, data privacy, has attracted a great deal of attention since the 1960s; but data protection laws, even in Europe, provide incomplete protections of the real human need. Three further dimensions - privacy of personal communications, of the person, and of personal behaviour - are subject to a scatter of protections that lack comprehensiveness and cohesion. A fifth dimension - privacy of personal experience - has come to prominence only during the last decade. Outlines of these dimensions are provided in Appendix 3. The ways in which ICCPR supports the various dimensions of privacy are described in Appendix 4. The primary focus of the debate to which this paper is a contribution is on the often-underplayed political motivation for privacy, and hence to a considerable extent on the behavioural and experiential dimensions of privacy.


3. Technology's Negative Impacts

This section identifies ways in which informatics negatively impacts each of the dimensions of privacy. To provide context for the analysis, it is useful to commence with a broad view of the nature of ICT. Table 1 presents a model of developments during the 65 years since the emergence of computing. An earlier analysis divided the period into three eras (Clarke 1994, 2004), but a fourth is now clearly evident.

Table 1: Eras of Information and Communications Technologies

  1. The Era of Centralisation (1940-1980)
    This was justified by the then economics of computing, when scale was necessary to achieve efficiency
  2. An Intervening Era of Tension (1980-2000)
    This was characterised by centralisation that was appealing to organisations, but that was no longer justified by economics, and that was undermined by the appeal to many users of distributing hardware and processing to the peripheries
  3. The Distributed Era (since 2000, but already in decline)
    This was justified by the changed economics of computing and communications, because an architecture based on many-and-small devices under user control was demonstrably more adaptable than one that assumes control by few-and-large devices
  4. The Era of Capture (since 2010, and already in ascendancy)
    This features a sharp movement of power back to the centre. Elements include the following:

In the late 1990s, I coined the term 'Privacy-Invasive Technologies' (the PITs), as a convenient way of referring to the array of ICT and its applications that threaten privacy (Clarke 1999a). Negative impacts on the privacy of personal data have been much-discussed, e.g. Clarke (1988). Similarly, impacts on communications privacy have attracted considerable attention, which has intensified following the release of the Snowden cache of documents. More widespread understanding now exists that freedoms have been compromised by the excessive provisions of post-2001 statutes passed in a state of legislative hysteria, that controls over the use of those laws are desperately inadequate, and that even those limited laws are continually subverted, circumvented and breached by national security agencies throughout the world. Meanwhile, Google, Facebook, Apple, Amazon and, with Windows 10, a resurgent, more intrusive Microsoft expropriate the content of our communications and team with myriad partners to exploit it.

Impacts of technology on the privacy of personal behaviour and of personal experience, on the other hand, are much less well-studied. Building in particular on Clarke (2009c and 2010), Appendix 5 identifies key categories of technology-aided activity that have negative impacts on freedom and privacy. Surveillance technologies are distinguished from enabling technologies.

Privacy-invasive technologies are already very well-established and are gaining in their intensity and intrusiveness. The remaining sections of this paper focus on the ways in which freedom and privacy can survive technologically-assisted intrusions. The next section examines principles underlying the resilience of society, and a further section identifies examples of ICT that support freedom and privacy.


4. Societal Resistance - The Principles

A framework is needed within which IT's usefulness in sustaining free societies can be assessed. The analysis in this section derives from work undertaken while a member of the Advisory Group of the EU-funded project Increasing Resilience in Surveillance Societies (IRISS, 2012-2015). See Clarke (2013). It suggests that the notion of 'Societal Resilience' against privacy-intrusive institutions comprises four segments of activity, discussed below in chronological order. Most fundamentally, precautionary and preventative actions are necessary during the formative phases of new privacy-intrusive applications of technology.

(1) Precaution or Prevention

Precautionary and preventative measures are actions taken at an early stage, intended to ensure that proposals that embody threats are subject to an appropriate assessment process. As summarised in Table 2, the fundamental requirements are that proposals for threatening schemes be subjected to evaluation, and that schemes not proceed unless they satisfy seven further 'meta-principles' (Clarke 2007, APF 2013, Clarke 2014d).

Table 2: Meta-Principles for Privacy Protection

  1. Evaluation
  2. Consultation
  3. Transparency
  4. Justification
  5. Proportionality
  6. Mitigation
  7. Controls
  8. Audit

After APF (2013)

A variety of methods exist for the assessment of technologies and their application to particular purposes and in particular contexts (Clarke 2014a). These methods include Technology Assessment (TA), Social Impact Assessment, Privacy Impact Assessment (PIA - Clarke 2009a, Wright & De Hert 2012), and Surveillance Impact Assessment (Wright & Raab 2012, Wright et al. 2015). Such actions might be taken by the sponsors of the scheme, but - particularly in contexts in which institutional or market power is available to sponsors - that is unlikely, and may in any case result in a deficient or biased evaluation. Hence it is usually necessary for actions to be taken by other parties.

In practice, in most countries, proposals that embody privacy threats are not subject to any institutional framework that ensures an appropriate evaluation process is undertaken. For example, such Offices of Technology Assessment (OTAs) as exist, mainly in Europe, lack the power to ensure that evaluation is undertaken, and that outcomes of evaluations are reflected in designs (Clarke 2014a); and PIAs are optional and/or superficial and/or fail the criteria of effective process (Clarke 2011a). A recent study of the application of PIAs to counter-terrorism measures in Australia is reported in Clarke (2015).

(2) Resistance

The dictum 'the price of liberty is eternal vigilance' is of little use unless the discovery of a threat is followed by action to deal with that threat. Hence a fundamental exercise of political freedoms is agitation to ensure that appropriate evaluation processes are applied. When one or more organisations are seeking to apply their institutional and/or market power in order to impose a scheme that is privacy-threatening, measures are needed to activate or generate countervailing power, in order to prevent the scheme from proceeding, or to force changes in the process whereby the scheme is developed and/or changes in the design of the scheme.

During the period when a scheme is being approved, developed and implemented, and even after it has come into being, Resistance comprises actions that are intended to have a detrimental effect on its effectiveness. Resistance may take many forms, some of which are passive, such as avoidance and civil disobedience; and others of which are active, ranging from political speech and media coverage, via demonstrations and court actions, to sabotage. A handbook presenting 100 Campaign Principles is provided by Davies (2014).

(3) Robustness

Robustness is the quality whereby important values and processes are sustained, despite the constraints placed on freedom and privacy. The focus here is primarily on schemes that are unjustified or excessive and/or embody inadequate measures to mitigate unnecessary negative impacts. Robustness exists if a large majority of institutions and individuals continue to exhibit relevant social, economic and political behaviours despite the imposition of the constraints. This may be because the constraints are ignored, or are even an object of derision; or because the institutions and individuals have become inured to them and take the risk; or because they intentionally transgress norms that authoritarian agencies seek to impose, in effect daring those organisations to take action.

(4) Resilience (in the narrow)

The term 'resilience' was used in the IRISS Project as a collective word encompassing all phases of the process whereby political freedoms are sustained in the face of surveillance schemes. The term also has a narrow usage, which refers to the quality whereby freedom and privacy are recovered, after a period during which significant constraints existed. Resilience is evidenced by a majority of institutions and individuals resuming social, economic and political behaviours after a period during which privacy has been subjected to constraints that were not justifiable, or are no longer justifiable.

The following section applies this framework in order to identify contributions that technology can make to the protection of privacy.


5. Technology's Positive Impacts

In addition to being a weapon of privacy-abuse, ICT has the capacity to support societal resilience and to sustain freedom and privacy. The term 'privacy-enhancing technology' (PET) was coined at least as long ago as 1991, and has been mainstream since the joint work by Cavoukian & Hustinx (1995). Table 3 distinguishes a number of categories of PETs (Clarke 2001).

Table 3: Categories of PETs

An alternative approach to the categorisation of PETs places the focus on five key areas in which obfuscation and falsification are necessary features of privacy-friendly ICT. See Appendix 6. In principle, it is desirable that normal social, cultural and political activities not need such protections; but the practicalities, even in free nations, are that these capabilities need to be created, maintained and exercised, and in some circumstances used on a routine basis.
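
By way of illustration only, the Python sketch below shows two simple obfuscation techniques of the kind referred to above: coarsening a disclosed location so that only an approximate area is revealed, and mixing a genuine query with decoys so that an observer cannot tell which request reflects the person's real interest. All names and parameters are hypothetical and are not drawn from the paper or its appendices.

    # A minimal sketch of two obfuscation techniques; illustrative only,
    # not drawn from the paper or its appendices.
    import random

    def coarsen_location(lat, lon, cell_size=0.05):
        """Snap coordinates to a coarse grid, so only an approximate area is disclosed."""
        return (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)

    def with_decoys(real_query, decoy_pool, n_decoys=3):
        """Mix the real query with plausible decoys, in random order."""
        queries = random.sample(decoy_pool, k=min(n_decoys, len(decoy_pool))) + [real_query]
        random.shuffle(queries)
        return queries

    if __name__ == "__main__":
        print(coarsen_location(-35.3081, 149.1245))   # approximately (-35.30, 149.10)
        print(with_decoys("rare disease clinic",
                          ["weather", "football scores", "train timetable", "recipes"]))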

A great many PETs exist. See for example the catalogues at PRISM-Break, the EPIC Online Guide to Practical Privacy Tools, and Kissell (2014). However, the adoption-rate of PETs has been considerably slower than their developers had anticipated (Clarke 2014b). The desperately low levels of security associated with consumer devices, and the untrustworthiness of government agencies, are together encouraging more rapid adoption. Nonetheless, an essential pre-condition for widespread adoption is that the tools be simplified, routinised and embedded in products and practices (Clarke 2014c). A specific example of a service that is badly needed, but not yet delivered, is Consumer-Oriented Social Media (COSM), outlined at Appendix 7.
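
As a minimal, hypothetical illustration of what 'simplified and routinised' can mean in practice, the Python sketch below encrypts and decrypts a message using the high-level Fernet recipe from the 'cryptography' package. The choice of library is mine, not the paper's, and key management is deliberately glossed over; the point is only that a well-packaged tool reduces a privacy protection to a few lines that can be embedded invisibly in a product.

    # Minimal sketch of a 'routinised' privacy tool: symmetric encryption via the
    # high-level Fernet recipe in the Python 'cryptography' package.
    # The library choice is an example only; the paper does not prescribe any tool.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # in practice, stored in the user's keychain
    f = Fernet(key)

    token = f.encrypt(b"meet at the usual place at 7pm")   # ciphertext, safe to transmit
    print(f.decrypt(token).decode())                       # only a key-holder can read it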


6. Conclusions

This paper has reviewed the nature of freedom and privacy. It has identified ways in which informatics has greatly increased privacy intrusions. In addition, however, it has shown that information technologies provide tools for the maintenance of freedom and privacy, through protections for data, messages, identity and social networks.

An optimist perceives that the ongoing technological tendencies towards inexpensive and highly distributed technologies will afford support for freedom and privacy, which will enable repressive forces to be held at bay. A pessimist sees the wave of open and distributed technologies as waning, and being replaced by closed and centrally-controlled technologies, with the forces of repression actively subverting designs in order to align them with their own interests and to establish controls over people and their behaviour. An alternative is to see continual tension and flux between the two extremes, with brief periods of dominance by each, but mostly varying degrees of balance between them.

Unfortunately, the winners are generally the powerful, and the powerful are almost never the public and almost always large corporations, large government agencies, and organised crime overlaid across both. In the current fourth era of ICT, the maintenance of freedom and privacy is utterly dependent on individuals understanding, implementing and applying technology in order to protect free society against powerful institutions.


References

APF (2013) 'Meta-Principles for Privacy Protection' Australian Privacy Foundation, March 2013, at https://www.privacy.org.au/Papers/PS-MetaP.html

Cavoukian A. & Hustinx G. (1995) 'Privacy-Enhancing Technologies: The Path to Anonymity' Information and Privacy Commissioner of Ontario and Registratiekamer (Rijswijk) of The Netherlands, 1995

Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31, 5 (May 1988), re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991, PrePrint at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1994) 'The Eras of Dataveillance' Xamax Consultancy Pty Ltd, March 1994, at http://www.rogerclarke.com/DV/NotesDVEras.html

Clarke R. (1997-) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' Xamax Consultancy Pty Ltd, August 1997, revisions to March 2015, at http://www.rogerclarke.com/DV/Intro.html

Clarke R. (1999a) 'Privacy-Enhancing and Privacy-Sympathetic Technologies: Resources' Xamax Consultancy Pty Ltd, April 1999, at http://www.rogerclarke.com/DV/PEPST.html

Clarke R. (2001) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001) 181-183, 188, PrePrint at http://www.rogerclarke.com/DV/PITsPETs.html

Clarke R. (2004) 'Origins and Nature of the Internet in Australia' Xamax Consultancy Pty Ltd, January 2004, various published versions, master-copy at http://www.rogerclarke.com/II/OzI04.html#IImpl

Clarke R. (2006) 'What's 'Privacy'?' Xamax Consultancy Pty Ltd, August 2006, at http://www.rogerclarke.com/DV/Privacy.html

Clarke R. (2007) 'The Regulation of Surveillance' Xamax Consultancy Pty Ltd, August 2007, at http://www.rogerclarke.com/DV/SReg.html

Clarke R. (2009a) 'Privacy Impact Assessment: Its Origins and Development' Computer Law & Security Review 25, 2 (April 2009) 123-135, at http://www.rogerclarke.com/DV/PIAHist-08.html

Clarke R. (2009b) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, at http://www.rogerclarke.com/ID/IdModel-090605.html

Clarke R. (2009c) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html

Clarke R. (2011a) 'An Evaluation of Privacy Impact Assessment Guidance Documents' International Data Privacy Law 1, 2 (March 2011), at http://www.rogerclarke.com/DV/PIAG-Eval.html

Clarke R. (2011b) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, at http://www.rogerclarke.com/EC/CCC.html

Clarke R. (2013) 'The Resilience of Society in the Face of Surveillance' Xamax Consultancy Pty Ltd, February 2013, at http://www.rogerclarke.com/DV/IRISSR.html#Res

Clarke R. (2014a) 'Approaches to Impact Assessment' Panellist's Notes, CPDP'14, Brussels, 22 January 2014, Xamax Consultancy Pty Ltd, January 2014, at http://www.rogerclarke.com/SOS/IA-1401.html

Clarke R. (2014b) 'Key Factors in the Limited Adoption of End-User PETs' Proc. Politics of Surveillance Workshop, University of Ottawa, 8-10 May 2014, at http://www.rogerclarke.com/DV/UPETs-1405.html

Clarke R. (2014c) 'How to Promote PET Usage' Panellist's Notes, Politics of Surveillance Workshop, University of Ottawa, Xamax Consultancy Pty Ltd, May 2014, at http://www.rogerclarke.com/DV/PETPromo-1405.html

Clarke R. (2014d) 'Surveillance, Resilience and Democracy' Xamax Consultancy Pty Ltd, October 2014, at http://www.rogerclarke.com/DV/SRD14.html

Clarke R. (2015) 'Privacy Impact Assessments as a Control Mechanism for Australian National Security Initiatives' Working Paper, Xamax Consultancy Pty Ltd, August 2015, at http://www.rogerclarke.com/DV/IANS.html

Davies S. (2014) 'Ideas for Change: Campaign principles that shift the world' The Privacy Surgeon, December 2014, at http://www.privacysurgeon.org/resources/ideas-for-change/

ICCPR (1966) 'International Covenant on Civil and Political Rights' United Nations, 1966, at http://www.ohchr.org/en/professionalinterest/pages/ccpr.aspx

Kissell J. (2014) 'Take Control of Your Online Privacy' TidBITS Publishing Inc., March 2014, at http://email.agilebits.com/t/r-l-ckltdlt-kjiuxtlh-t/

Morison W.L. (1973) 'Report on the Law of Privacy' Government Printer, Sydney, 1973

UDHR (1948) 'Universal Declaration of Human Rights' United Nations, 10 December 1948, at http://www.un.org/en/documents/udhr/index.shtml

Wright D. & De Hert P. (eds) (2012) 'Privacy Impact Assessments' Springer, 2012

Wright D. & Raab C.D. (2012) 'Constructing a surveillance impact assessment' Computer Law & Security Review 28, 6 (December 2012) 613-626

Wright D., Friedewald M. & Gellert R. (2015) 'Developing and testing a surveillance impact assessment methodology' International Data Privacy Law 5, 1 (2015) 40-53


Acknowledgements

This is a revised version of an invited paper presented at the Interdisciplinary Conference on 'Privacy and Freedom' at Bielefeld University on 4-5 May 2015, responding to a specific challenge posed to me by Prof. Dr. Rüdiger Grimm.

The paper draws heavily on a long series of papers previously published or presented in a wide range of venues, several with co-authors Dr Marcus Wigan and Prof. Katina Michael. The section on Resilience Principles derives from a contribution made as a member of the Advisory Group for the IRISS Project. It has benefited from comments by Rüdiger Grimm, Alana Maurushat, Dan Svantesson and Nigel Waters.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in Computer Science at the Australian National University.


