
Roger Clarke's 'Dataveillance 2020'

Notes for an Interview on Dataveillance

By Jasmine Guffond for disclaimer.org.au

Audio-Interview (887MB)

Version of 11 December 2020

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2020

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.

This document is at http://rogerclarke.com/DV/DV20.html


JG: Could you introduce yourself?

RC: My name is Roger Clarke. For the last 25 years, I've been a consultant in the strategic and policy implications of advanced information technologies. Prior to that, I was a professional in business information systems, followed by a decade as a senior academic.

I'm probably better known from my many articles on a wide variety of topics in the information technology arena. My articles investigate how to apply IT in order to achieve benefits, limit the harm, and manage the risks. But a great many organisations use IT in ways that are seriously harmful to people, so a lot of my articles look at the downsides, and how to deal with them.

JG: I first came across your work via a seminal article you wrote in 1988. It is often cited in the surveillance studies literature due to your coining of the term 'dataveillance'. You define dataveillance as "the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons".

At the time of writing in 1988, you describe a situation where the computer as a means of surveillance was somewhat new, or an emerging trend. The aim of the article was to define and clarify the implications of dataveillance for academics, IT professionals, and the public more broadly, so that these emerging surveillance techniques could be formally regulated.

How do you feel the situation regarding dataveillance in Australia in 1988 compares to today, especially in light of COVID-19 contact-tracing technologies?

RC: A lot's happened since the mid-1980s, so I need to answer that step-by-step.

Dataveillance research initially had as its focus personal data that was created manually, and that was stored and accessed internally, within the creating organisation.

But data transfers became more common, and inter-organisational systems enabled direct access to internal data by external parties. So dataveillance research had to adapt to that new reality.

Progressively, individuals were drawn into saving organisations the effort of capturing data, and substantial streams of data became available as by-products of individuals' actions, for example at automated teller machines, EFTPOS terminals, and on public transport.

IT has developed enormously over the last 40 years, in its capabilities, its power and its reach. Multiple kinds of technology have been integrated: computing combined with communications, combined with robotics, combined with digital representation of all forms of reality, combined with many different ways of producing software.

Designers and coders have increasingly lost touch with their creations. The world's first programmer, Ada Lovelace, had the insight from the beginning that computers don't do what you want them to do, but what you tell them to do. Increasingly, however, we're losing control of their behaviour, by accidentally, and sometimes intentionally, delegating decision-making to them. So now Ada would say:

Computers don't do what you want them to do. They do what they do.

In the mobile era, dataveillance has leapt to new levels, with vast volumes of 'breadcrumbs' / 'data trails' / 'data exhaust' generated. There has also been a major shift, in that mobile devices are 'appliances' controlled by organisations and their strategic partners, and hence facilities and features are embedded in them that are designed very much to serve the interests of organisations rather than device-users.

We all enjoy a great deal of convenience as a result of advances in IT. Unfortunately, however, a lot of organisations have applied IT to the control of people's behaviour.

Corporations gather data about individuals and use it to infer their interests and attitudes, and structure and schedule communications to them in order to manipulate their behaviour. The current business models make up what I call the 'digital surveillance economy'. Shoshana Zuboff thinks business is being habituated to this way of doing things, and that a new polity is emerging, which she calls 'surveillance capitalism'.

Governments, meanwhile, are increasing their use of IT to gather more data about each individual, to consolidate data from multiple sources, and to impose a single 'digital persona' on each of us. The foreground purpose is to achieve social control, reduce waste, and catch small-time cheats and, occasionally, criminals. But the function creep into political control is well under way, as political and social deviants, whistleblowers, and employees who dare to voice an opinion different from their employer's are detected, tracked and dealt with.

COVID-19 has been a great many things. An aspect that's relevant to dataveillance is the way in which governments worldwide have grasped at straws. The vogue term for this is 'technological solutionism', but it can be parodied as 'There's an app for that'.

The best-known family of COVID apps pretends that Bluetooth signal strength is a reliable indicator of proximity between handheld devices. Governments have cajoled people into installing such apps, enabling vast quantities of data to tumble into, in most cases, centralised data-stores. The effectiveness of these apps in contact-tracing is dismally poor. Countries that have reined in the impact of the virus have done so because of quality work by trained contact-tracing staff, not because of the apps. Another example of a tracking tool is the suddenly-mainstream use of QR codes to register your presence at a venue. In many jurisdictions, the data is centralised, under government control, and protections against abuse of the data are minimal.
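
As a rough illustration of why that pretence fails, apps of this kind have to turn a received signal strength (RSSI) reading into an estimate of distance, typically via something like a log-distance path-loss model. The Python sketch below is purely illustrative, not code from any actual contact-tracing app, and the transmit power, path-loss exponent and attenuation figures in it are assumptions:

    # Sketch: inferring distance from Bluetooth signal strength (RSSI) using a
    # log-distance path-loss model. All numbers here are illustrative
    # assumptions, not parameters taken from any real contact-tracing app.
    def estimated_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
        """Invert rssi = tx_power - 10 * n * log10(distance)."""
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

    # Two handsets about 1.5 metres apart. In clear space the receiver might see
    # roughly -62 dBm, but a pocket, a handbag or a human body in the signal path
    # can easily add 10-20 dB of attenuation, so the same pair of phones appears
    # to be many metres apart (and, conversely, distant phones can appear close).
    for label, rssi in [("clear space", -62), ("phone in pocket", -75), ("body in between", -82)]:
        print(f"{label:>16}: RSSI {rssi} dBm -> ~{estimated_distance_m(rssi):.1f} m")

Under those assumed figures, the same 1.5-metre separation can be reported as anything from about 1.4 metres to about 14 metres, which is why signal strength is such a poor proxy for epidemiologically meaningful proximity.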

In short, COVID-19 has been an opportunity for organisations interested in social control to routinise and embed surveillance. The public has swallowed the idea that enhancing the tracking capabilities of their own devices is a good thing. Auto-surveillance, that is to say surveillance of your self, is being enabled. Organisations now track us because we willingly generate a trail of locations as a by-product of our actions.

JG: 1988 is just before the World Wide Web was invented by Tim Berners-Lee in 1989. How do you feel the World Wide Web has contributed to cultures of dataveillance?

RC: The original Web, as conceived by TBL, was a straightforward information access mechanism. If you knew where to ask, you asked, and you received. That was it: a simple request-response pair, end of transaction. With some extensions, particularly indexing of content, and search facilities, this was a powerful tool, for people.
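
To make that request-response pair concrete, here is a minimal sketch using only Python's standard library; the target is simply this document's own address, and the comments map each step to the description above:

    # Sketch: one request-response pair on the original Web.
    import http.client

    conn = http.client.HTTPConnection("www.rogerclarke.com")
    conn.request("GET", "/DV/DV20.html")      # you knew where to ask, so you asked
    response = conn.getresponse()             # and you received
    print(response.status, response.reason)   # e.g. 200 OK
    page = response.read()                    # the document itself: end of transaction
    conn.close()

Everything else, from content indexing and search through to Web 2.0, has been layered on top of that one simple exchange.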

Corporations fought for years to subvert the Web. It was like sunlight - if you can't get control of it, it's hard to make a profit out of it. It took them 15-20 years, but subvert it they did. Web 2.0 has re-engineered the technology. The power of the original Web's users to exercise its 'request-response' nature has been replaced by control exercised by the business end of the link. Marketers fight for your 'eyes' and 'attention'. The tools at their disposal enable them to use their insights into human behaviour to extract large quantities of data, and use it for their own benefit and against the interests of consumers.

JG: On your website you have put together an extensive archive of research surveys into public opinion regarding privacy. Do you think Australians' relationship to privacy has changed over time, and if so, in what ways?

RC: The privacy notion has an enormous number of facets. Philosophical treatises about it are a waste of time, because they're so abstract that they provide little guidance to us in addressing the issues. To have sensible conversations about privacy, you have to focus on at least one dimension at a time, and often you then need to narrow still further to a particular context.

People have different backgrounds, characteristics and circumstances, and hence their concerns and priorities differ a great deal: 'Everyone has something to hide', but what it is varies a great deal between people, between situations, and over time.

Many people have little bits of their personal history that they're sensitive about. Their ethnic background, the country they came from, the unsavoury ancestor, their birthplace, their birthyear, the treatment they had a few years ago, the club they belonged to when they were young, the long-ago minor crime they committed. Those concerns are very real. But I spend most of my time thinking about the larger-scale issues at the level of society, the economy, the polity and culture. If people's behaviour is chilled, they become less innovative and interesting, and less prepared to ask questions of authority, challenge conventions, and confront the powerful.

My usual approach to discussions like this is to pick out a couple of categories of persons-at-risk, and distinguish their concerns. Victims of domestic violence need to obscure their living location and their movements - yet somehow keep contact with the friends who are providing them with psychological support. Similar concerns dominate the lives of protected witnesses, and of people trying to withdraw from membership of a criminal gang. Human rights activists living in hostile countries need to obscure what they read, and who they are in contact with. Some people inherit a valuable painting from their rich aunt, or don't trust banks and keep some cash under the mattress, but don't have the money or the inclination to install security in their home.

The absolute minimum we all have to concern ourselves about are the passwords and PINs which cause us serious grief if we lose them, and much worse if someone else acquires them and empties our accounts. Organisations no longer know the people they deal with and instead depend on the data that they store about us. As a result, we face the risk of someone masquerading as us, taking out loans, incurring fines, making life awkward, and even making our normal identifiers unusable. So we need to protect all manner of identifiers, and stop organisations making and storing copies of documents like drivers' licences, health cards and passports.

So there have been changes in what people are nervous about, and in the intensity of many of the threats; but a lot of the concerns are common.

There's one further point worth making here. People commonly assume that each new generation of people (no longer every 25 years, but about every 10-15 years) is less concerned about privacy. And that common assumption is wildly wrong.

What actually happens is that each cohort, no matter which decade the people were born in, starts out young, carefree and without much in the way of psychic assets or liabilities. As the cohort moves through time, they accumulate experiences, many pleasant, some not. They also accumulate a history, much of which is known only to a few, or even only to themselves. Assets and liabilities, both psychic and financial, loom larger. So, as they move through their lives, most people, in all cohorts, become much less carefree, including about particular aspects of privacy, and in particular situations.

Rather than society becoming less concerned about privacy, as corporate spruikers like to argue, I actually contend that the opposite is happening. Each new generation is seeing more damage being done by surveillance of all kinds, and is coming to better understand the existence and impacts of the hidden form of dataveillance. So, as earlier cohorts fade away and later ones come to dominate, privacy concerns are increasing.

JG: You are one of the founders of the Australian Privacy Foundation, which was formed in 1987. Could you explain its origins and ongoing work regarding Federal Attorney-General Christian Porter's call for submissions to the review of the federal Privacy Act 1988, or the 'Assistance and Access' Bill?

RC: We formed the APF to fight a particularly nasty proposal to impose a comprehensive national identification scheme, which the government of the day called the 'Australia Card' and dressed up in our real national colours of green and gold, hoping to sell the idea by linking it to patriotism. We ran information campaigns, and once the public realised what the proposal actually was, they got out on the streets as almost never before or since (except the Moratorium against the Vietnam War, and perhaps the demonstrations against South African apartheid).

From the wreckage of that proposal, we rescued a so-called Privacy Act. It was pretty minimalist in 1988, and it's been whittled away by government and business to the point at which both it and the oversight agency are almost valueless.

The current review of that Act is being conducted by a government agency that is dominated by national security interests. The process is intended to adjust the law so that it impedes the operations of government agencies and corporations even less than it does now. It is emphatically not intended to improve privacy protections.

But Australia is an outlier in what we once called 'the free world'. Almost alone among nations, it has no constitutional protection whatsoever for human rights of any kind, least of all privacy. It acceded to the ICCPR, but has never implemented its provisions. Unlike Europe and many other countries, it is not balancing privacy against other interests. It is swimming against the tide by compromising privacy in favour of economic interests, social control by government agencies, and political control by national security agencies.

JG: This interview is part of a series of interviews with artists, academics and activists about strategies for addressing pervasive contemporary surveillance. How effective do you think government policy, regulation and legislation is in addressing digital surveillance?

RC: In Australia, it's been an abject failure, and it's getting worse. Fortunately, Australia is unusual. I don't want to make the mistake of wearing rose-coloured glasses. But the GDPR since 2018 has been multiple steps beyond the OECD Guidelines of 1980, and Europe is drawing multiple other countries along with it. European Data Protection Commissioners actually apply the law, slap significant fines on large corporations and achieve change. Committed advocates like Max Schrems and his colleagues have court processes available to them whereby breaches by corporations and governments can be pursued.

There are ways to achieve regulation of the invasiveness of dataveillance techniques, and to achieve balance in the application of each new technology. We need ongoing pressure from the public, and ongoing commitment among advocates and policy-makers, to wind back excesses, install safeguards, monitor organisations' behaviour, and impose sanctions.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor associated with the Allens Hub for Technology, Law and Innovation in UNSW Law, and a Visiting Professor in the Research School of Computer Science at the Australian National University.


