

Risks Inherent in the Digital Surveillance Economy:
A Research Agenda

Review Draft of 10 April 2017

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2016-17

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/DV/InDigR.html


Abstract

The digitisation of a considerable amount of information about the world relevant to business has given rise to a new phase of 'digitalisation'. This involves a substantial shift in business models and industrial organisation, such that the interpretation and management of the world through human perception and cognition has been to a considerable extent replaced by processes that are almost entirely dependent on digital data. Some applications of digitalisation, in addition to creating opportunities, give rise to threats to individuals, and risks to people, society and polity.

A review of the notions of information society, surveillance society, the surveillance state, and surveillance capitalism provides a basis for appreciating the nature of what is referred to here as 'the digital surveillance economy' - a new form of business model that was initiated by Google at the beginning of the 21st century. This model is predicated on the acquisition, expropriation and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. In the digital surveillance economy, not only is the consumer converted from the customer to the product, but consumers' interests have almost no impact on the process, and are ignored. In the words of the model's architects, users are 'bribed' and 'induced' to make their data available at minimal cost to marketers.

The processes of digitisation of the individual, followed by the digitalisation of business processes, have given rise to a digital surveillance economy. This harbours great threats to the interests of individuals, and to the relationship between corporations, on the one hand, and society and polity on the other. The new economic wave may prove to be a tsunami that swamps the social dimension and washes away the last five centuries' individualism and humanism. Alternatively, institutions may achieve regulatory adaptation in order to overcome the worst of the negative impacts; or a breaking-point could be reached and consumers might rebel against corporate domination. A research framework is suggested, within which the alternative scenarios can be investigated.



1. Introduction

During the last few decades, a succession of terms has been used to refer to the most impactful forms of information technology (IT). These have included strategic information systems, IT alignment, eCommerce, eBusiness, disruptive IT and digital transformation. Following on from these predecessor notions, the term 'digitalisation' has recently emerged as a focal point for research into IT-enabled and IT-driven change. The work reported in this paper is motivated by the author's concern that the digitalisation movement not only offers promise, but also harbours threats that are very substantial but that are far from being understood and managed, and that are for the most part being overlooked.

The terms 'digitisation' and 'digitalisation' are frequently used very loosely, even in many sources relevant to the present analysis. In this paper, following the OED and Brennen & Kreiss (2016), the following distinction is drawn:

In a revolution that was largely completed by the end of the 20th century, almost all aspects of the handling of almost all forms of data have come to be digitised. This includes not only the conversion and creation of data, but also its (near-costless) replication, transmission, access, analysis and manipulation (Negroponte 1995). The digitisation of data, together with the extraordinarily low cost of handling it, had a great deal of impact on the accessibility of information, the economics of activities that handled information, and the structure of industry sectors. The revenue streams of established players were undermined by digitisation, as more nimble competitors took advantage of greatly lowered cost-profiles to erode the incumbents' customer-base.

However, it took some time for a second round of more fundamental change to come about, and to be recognised. One term that has been applied to this is 'datafication', or, less commonly, 'datification'. Some authors merely use that term as a synonym for 'digitisation', and others for "the ability to render into data many aspects of the world that have never been quantified before" (Cukier & Mayer-Schoenberger 2013, p.28), or for "the reliance of enterprises on data" (Lycett 2014). Citing Normann (2001), Lycett uses 'dematerialisation' to refer to the expression of digital data, 'liquifaction' to refer to the ease of manipulation and transmission of digital data, and 'density' as the "(re)combination of resources, mobilised for a particular context, at a given time and place".

The term 'digitalisation' is used here to describe a shift that has occurred in the business models of enterprises within those industry sectors in which information performs a central role. The term is used in the titles of active programmes within the EU and the OECD. In some respects, it is merely the latest in a long line of neologisms; but it will be argued in this paper that convergence among multiple threads of development is indeed leading to substantial change.

Digitalisation is occurring in business and government, but also in economies and societies more broadly. Digitalisation depends on attributes and behaviour of real-world entities having been reduced to digital form, as data-items and data-streams associated with digital identifiers (Clarke 2009a). All categories of real-world entities can be represented in this way, including inanimate natural objects and artefacts; geo-physical, geological and biological processes that occur independently of people; economic and social processes that are created and managed by humankind through groups, communities, associations and societies, and by government agencies, corporations and not-for-profit organisations; and all kinds of living beings, vegetable and animal.

The focus of this paper is on the impact of the digitalisation shift on people. This topic is currently attracting a great deal of attention. Most of the discussion is upbeat, and focussed on opportunities that are perceived to arise from streams of digital data about individuals, e.g.: "the digitization of private lives enables suppliers of digital technologies to form closer and stronger connections with their customers and to build services and devices that better match their expectations and improve their everyday lives" (EM Call for Papers, 2016). This paper considers, on the other hand, the "implications for individuals' lives", referred to at the beginning of the Special Issue's Theme description, but adopts the perspectives of the individuals themselves, rather than being concerned with corporate interests as the objective and individuals' interests as constraints.

The project has included a scan of literature considering the risks to individuals arising from digitalisation. The field has changed rapidly under the onslaught of technological innovation, however. The research reported here is accordingly preliminary, and designed to lay the foundations for a research programme in the area. The intention is to delineate the dimensions of a suitable framework within which more detailed work can be undertaken, including the articulation of relevant research questions.

The paper commences with a brief description of the challenges facing research of this nature, and an outline of the approach adopted. The nature and origins of the digitalisation concept are then discussed, including the technologies that are driving it. Diverse literatures are drawn on, in order to surface the threats to the individual that digitalisation embodies. These threats affect not only the individuals themselves, but also societies, economies and polities. The aspects that are proposed as most needing attention are summarised, and are characterised by the expression 'the digital surveillance economy'.

A range of alternative future paths exists. Because the phenomena remain fluid at this early stage in the new era, it is necessary to articulate broad research questions, identify a range of circumstances, typify them using vignettes, and work towards the construction of more detailed scenarios. A broad research agenda is outlined.


2. The Research Approach Adopted

In the research reported in this paper, the object of study is a contemporary shift in the patterns of behaviour of organisations. The purpose of the research is not merely to describe, explain or even predict, but rather to provide insight, and thereby guide decisions and actions by business enterprises and by policy-makers. These features of the research give rise to a number of challenges.

The conduct of empirical research is predicated on the assumption that the phenomena being observed exhibit a degree of stability. The rigour demanded of empirical research is undermined to the extent that the results cannot be reliably reproduced because the population from which samples are extracted exhibits continual changes of behaviour. This is inevitably the case with emergent phenomena, such as those that arise from the application of multiple new digital technologies. Remarkably, instability of the object of study appears to be little-discussed in the IS literature. However, for a discussion of the difficulties that the dynamic nature of the eCommerce and eBusiness eras presented to researchers, see Clarke (2001).

A further challenge arising with future-oriented, instrumentalist research is that it must deal with contexts in which stakeholders will be differentially impacted by change, and in which it is commonly the case that Pareto optimality cannot be achieved and hence the interests of at least some stakeholders suffer harm. This necessarily draws the researcher beyond the merely descriptive, explanatory and predictive, and into realms characterised by value-laden contexts, normative modes and prescriptive outcomes. If research is to deliver outputs relevant to business strategy and government policy, it is necessary to adopt research techniques that can cope with both unstable phenomena and contested values, and seek such levels of rigour as are feasible in the circumstances.

The audience to whom the reports of this research are addressed must also be considered. To some extent, the outcomes are intended to influence decision-makers in large corporations, by highlighting the broader implications of current changes in business models. However, the work also contributes to public policy research, whose purpose is to support the development of policies, and the articulation of programmes, that are intended to achieve normative objectives expressed in all of social, economic and environmental terms, and that depend on effective cooperation among disparate organisations, across the public and private sectors and often also the voluntary sector. The diversity of the audience adds to the challenges of research conception, design, conduct and reporting.

Appropriate research techniques need to be identified. Conventional empirical research techniques are of limited applicability to research questions of the kind being addressed in this paper, because the objects of study are unstable, emergent and to some extent not yet observable in the field. Some guidance is provided by Gray & Hovav (2008) and Majchrzak & Markus (2014). See also Niederman et al. (2016) and works on futures research such as Martino (1993), Dreborg (1996) and Bell (2004). In addition to existing techniques such as environmental scanning, technology assessment, Delphi rounds, cost/benefit/risk analysis and privacy impact assessment, some innovation may be needed, as with quasi-empirical scenario analysis (Clarke 2015b), and even some outright invention, as with instrumentalist futurism (Clarke 1997).

The approach adopted to the research reported in this paper was to conduct and present a review of relevant threads of development, identify threats embedded in those threads, and outline a framework whereby the emergent phenomena can be researched, understood and managed. The research questions, rather than being declared at the outset, are an outcome of the analysis, and are presented at the end of the paper. The analysis draws on many sources, including multiple strands of research previously conducted by the author. The self-citations are mostly on topics that attract few researchers and hence few publications, but they have in excess of 1,000 Google citations, and provide access to a substantial set of additional references, above and beyond those in the Reference List to the present paper.


3. Foundations

Digitalisation has been emergent for some time. This section briefly reviews the digitisation phase that laid the foundations for it and shaped it. It then traces the digitisation of individuals since the end of the nineteenth century. The notions of Information Society, Surveillance Society and the Surveillance State are revisited, culminating in consideration of Zuboff's recent writings in relation to Surveillance Capitalism. This lays the foundations for conceptualisation of the research domain being addressed in the current work as the Digital Surveillance Economy.

3.1 Digitisation

The first step in a digitisation process is the selection of attributes whose states or behaviour are to be represented in data. The expression of data is not a value-neutral event but a purposive activity, and choices are made that reflect the intentions of the collector. The commonly-used term 'data capture' embodies an implicit assumption that the data already exists. The term 'data creation' is more appropriate, because it conveys that the step is active rather than passive - even where the activity is limited, as occurs with sensing and analogue-to-digital 'conversion', where the characteristics of the conversion device and process, at the time the data is created, influence whether, when and what data comes into existence. A further consideration is that the collector's perception of purpose results in some attributes of some entities and their actions being represented in the data collection, while all other attributes of those entities and actions, and all attributes of all other entities and actions, are omitted from the collection entirely.

The creation of data may involve some form of sampling of the physical state of attributes, such as the use of a sensor to measure the height of water in a container, ambient temperature, or particulate content in air. Alternatively, the creation of data may be a by-product of some action using an artefact, such as the scanning of product codes and a payment card at a cash register. A further form taken by data creation is purposeful human actions that result in machine-readable data, including data capture via keyboard, but also by voice (where text-recognition software is used) and as images (where some form of data-extraction capability exists, such as optical character recognition for printed text, and so-called facial recognition technologies).

Data may be subject to various transformation operations, display to humans, input to software that draws inferences, copying, transmission, disclosure to other parties, and deletion. Data has many quality aspects, and data-items, records and record-collections exhibit very high variability of quality. For reviews in the context of 'big data', see Hazen et al. (2014) and Clarke (2016b). Evidence in support of quality assertions may or may not exist. Such evidence is sometimes referred to as 'meta-data', indicating that its purpose is to store data about other data. Meta-data has the same characteristics as any other kind of data and is subject to the same quality constraints; but it has further attributes and quality constraints as well.
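
By way of illustration only, the following minimal sketch (in Python, with invented attribute-names and values) shows a single data-item created from a sensor reading, together with accompanying meta-data asserting its provenance and quality. The selection of the attribute, of the quantisation step, and of the meta-data retained all reflect choices made by the collector, not properties inherent in the real-world phenomenon.

    # A hypothetical sensor reading, reduced to the attributes the collector chose to record.
    # Everything else about the real-world event is, by design, not represented at all.
    reading = {
        "entity_id": "container-0042",   # a digital identifier standing in for the real-world entity
        "attribute": "water_level_mm",   # the single attribute selected for representation
        "value": 318,                    # the digitised (quantised) measurement
    }

    # Meta-data: data about the data, asserting provenance and quality.
    # It is itself data, subject to the same quality constraints, plus some of its own.
    meta_data = {
        "created_at": "2017-04-10T09:30:00+00:00",
        "sensor_model": "HypotheticalSensor-3",    # characteristics of the conversion device
        "resolution_mm": 5,                        # quantisation step of the analogue-to-digital conversion
        "calibration_due": "2017-06-01",           # a quality claim that may or may not be reliable
    }

    record = {"data": reading, "meta": meta_data}
    print(record)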

3.2 The Digitisation of Individuals

Digitisation began to affect individuals as long ago as 1890, when the Hollerith card was applied to the US Census, and the era of machine-readable data began (Kistermann 1991). Following the emergence of electronic computing on both sides of the North Atlantic c. 1940, electronic data processing commenced in 1951 at the Lyons Tea Company in the U.K. (Land 2000). Within a decade, the opportunity was perceived for 'national data systems' to consolidate data about people, variously as a basis for sociological research (e.g. the Metropolit projects in Sweden and Denmark - Osler et al. 2006) and for social control (e.g. the National Data Center proposals in the USA - Dunn 1967).

By the 1980s, government agencies were conducting data matching programs. This involved the expropriation of personal data from two or more sources. The term 'expropriation' might be interpreted as being value-loaded. On the other hand, dictionary definitions refer to loss of control and deprivation of benefit, and that is what arises from the divorce of the data from its original purposes and contexts, its merger, and the use of the combined records to identify individuals whose behaviour was deemed deserving of suspicion on the basis of the matched data (Clarke 1994b). Similar approaches were applied by corporations to consumers (Larsen 1992), and to discrimination among consumer segments on the basis of location, class and race (Gandy 1993). Another early form of digitisation of individuals was profiling, which refers to the extraction of a set of characteristics of a particular category of person (Clarke 1993).
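
The core of a matching program can be conveyed by a minimal sketch (in Python, with invented agency holdings, identifiers and thresholds): records from two unrelated collections are merged on a shared identifier, and individuals whose combined records exhibit a discrepancy are flagged for attention, divorced from the purposes and contexts for which each record was originally created.

    # Hypothetical records from two separate collections, keyed on a shared identifier.
    benefits_data = {"P123": {"declared_income": 12000}, "P456": {"declared_income": 9500}}
    tax_data      = {"P123": {"assessed_income": 31000}, "P456": {"assessed_income": 9500}}

    DISCREPANCY_THRESHOLD = 5000   # an arbitrary, purely illustrative tolerance

    def match_and_flag(benefits, tax, threshold):
        """Merge the two sources on the shared identifier and flag apparent discrepancies."""
        flagged = []
        for person_id, b in benefits.items():
            t = tax.get(person_id)
            if t is None:
                continue   # no counterpart record; a real program might treat this as significant too
            if abs(t["assessed_income"] - b["declared_income"]) > threshold:
                flagged.append(person_id)
        return flagged

    print(match_and_flag(benefits_data, tax_data, DISCREPANCY_THRESHOLD))   # ['P123']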

Over the course of a few decades, c. 1980-2000, the basis for decisions about such matters as lending changed from face-to-face evaluation of the applicant, complemented by consideration of supporting documents, to a process almost entirely based on the data assembly that was thought to relate to the applicant, i.e. to the digital persona rather than to the person (Clarke 1994a, 2014).

The Internet brought with it scope for pseudonymity and even anonymity (Froomkin 1995). The original World Wide Web was structured in a manner inconvenient to business and government, because it contained limited means for organisations to extract data about the individuals communicating with the organisation's web-site. A review of early attempts to subvert the Web's original user-driven orientation is in Clarke (1999).

However, as the dot.com investment era boomed, and then c. 2000-01 imploded, 'new rules' were emerging (Kelly 1998). Web 2.0 was conceived as the means whereby the Web's architecture could be inverted into a marketer's data collection and consumer manipulation framework (O'Reilly 2005). The defining feature of Web 2.0 was originally claimed to be that businesses leverage user-generated content. The common aspect was identified a little differently in Clarke (2008b), as 'syndication', of which four forms were distinguished.

Since then, one of those forms - syndication of storage - has ebbed, as consumers have been drawn into the cloud computing model, and have abandoned control of their data to service-providers (Clarke 2011). The other three aspects of Web 2.0 syndication, on the other hand, are foundational elements of consumer marketing as it is practised in the second half of the century's second decade. In a mere 15 years, c. 1995-2010, corporations had greatly reduced the scope for user anonymity, and had many tools at their disposal for unmasking individuals. As noted by social scientists early in this phase, "Digitization facilitates a step change in the power, intensity and scope of surveillance" (Graham & Wood 2003).

Until c. 2010, there remained some optimism that consumer marketing would become less intrusive and manipulative, rather than increasing its arrogance and intensity. 'The Cluetrain Manifesto' "nailed 95 theses to the Web", arguing that the new context resulted in consumers gaining market power, and that 'markets are conversations' (Levine et al. 2000). Clarke (2006a, 2006d) discussed strategic approaches to consumer privacy, and Clarke (2006c, 2008) investigated the scope for positive marketer-consumer communications. As will be presented shortly, however, the digital surveillance economy for which Web 2.0 laid the foundations has proven to be anything but a two-sided conversation.

3.3 The Information Society

It had been clear for many decades that secondary industry was being overhauled by service industries as the engine-room of the economy and the major shaper of society. The term 'knowledge industries' was coined by Machlup (1962). Drucker (1968) took the idea further, writing that "knowledge, during the last few decades, has become the central capital, the cost centre and the crucial resource of the economy" (1968, p.9). However, his focus at that stage was on knowledge within large corporations, and he said little about society as a whole. Meanwhile, some influential authors remained as yet uncertain as to whether or not information was central to future developments. Toffler's 'Future Shock' (1970) considered information and knowledge only in relation to their capacity to overload and overwhelm the individual, and for a decade the common term for the emergent era remained 'the post-industrial society' (Touraine 1971, Bell 1976).

In 1981, however, Masuda depicted 'The Information Society as Post-Industrial Society', and the specialist journal The Information Society commenced publication. Even Toffler's 'The Third Wave' (1980) still contained little on the notion of the information society (pp.187-189). He did, however, propose that a new form of 'social memory' would arise, which would be "extensive and active" and therefore "propulsive". The 'Megatrends' of Naisbitt (1982), on the other hand, were based on the proposition that "we have ... changed to an economy based on the creation and distribution of information" (p.1). Beniger (1986) and Cawkwell (1987) further popularised the information society notion. The tone of the large majority of the relevant literature in the 1980s was very upbeat, strongly focussed on the opportunities, and with very little consideration of what distributive effects it might have, what threats it might embody, or whether negative impacts might fall unevenly.

3.4 Surveillance Society

A striking aspect of the primary references in information society literature is that the notions of personal data, data surveillance and even privacy are barely mentioned, and in many of the books do not even appear in the index. On the other hand, a strong thread of privacy-related literature had been developing since the mid-1960s, and much of this was motivated by the increasing capabilities of organisations to gather, store, process and apply data about individuals (Miller 1969, Rosenberg 1969, Warner & Stone 1970, Wessell 1974, Flaherty 1984, Flaherty 1989).

Roszak (1986) observed that the purpose of 'computerised surveillance' was "to reduce people to statistical skeletons for rapid assessment" (p.186). He declared himself to be aghast at Toffler's misplaced enthusiasm for, and lack of scepticism about, the new form of 'social memory': "What we confront in the burgeoning surveillance machinery of our society is not a value-neutral technological process ... It is, rather, the social vision of the Utilitarian philosophers at last fully realized in the computer. It yields a world without shadows, secrets or mysteries, where everything has become a naked quantity" (pp.186-7). He provided a caricature of the quality problems of what was then referred to as 'data mining' and is now re-badged as 'big data analytics' and 'data science': "garbage in - gospel out" (p.120).

The earliest uses of the term 'surveillance society' in formal literatures appear to have been by Marx (1985), Flaherty (1988), Weingarten (1988) and Gandy (1989). Broadly, it refers to a society in which pervasive monitoring of individuals' behaviour is routine. The majority of the surveillance literature has been contributed by social scientists, and careful definitions of the term are in short supply. However, the sentiments are well summed-up by this early description: "Technology could be offering the tools for building a surveillance society in which everything people do and are as human beings is an open book to those who assert some 'right' to monitor them" (Weingarten 1988, p.747).

The literature has also suffered from the dominance of a single metaphor. The term 'surveillance' was adopted into English c. 1800, derived from the French word for watching, in particular of prisoners. Bentham's late 18th century notion of a (physical) 'panopticon' was co-opted by Foucault (1977) as a virtual notion. Foucault's popularity has been such that discussions of monitoring have been for several decades heavily impregnated with the implicit assumption that surveillance is first and foremost visual in nature, and only secondarily takes other forms. Highly-cited works, such as Lyon (1994) and Norris & Armstrong (1999), embodied that assumption. Despite criticism that "social construction of the surveillance notion has been seriously limited by being rooted in the visual" (Clarke 2000), the metaphor is still in vogue (e.g. Gilliom & Monahan 2012).

Surveillance is usefully defined as the systematic investigation or monitoring of the actions or communications of one or more persons. The purpose of surveillance is commonly to support management of the behaviour of individuals or populations. Surveillance may well deter particular behaviours; but it is largely passive. It is an inappropriate stretch of meaning to use the term to also encompass active interventions intended to repress or constrain, or to enforce, particular behaviours. The more appropriate term for active measures for which surveillance is a key enabler is 'social control'.

The breadth of surveillance is far greater than physical observation, as was attested to by such contributions as Rule (1974). In addition to visual surveillance, other forms comprise physical surveillance, physical surveillance at a distance, auto-physical surveillance (by devices carried with, on or in the person), dataveillance (Clarke 1988), communications surveillance (of messages), and electronic surveillance (of space, traffic and movement). More detailed analysis of surveillance categories, techniques and technologies is in Clarke (2009b, 2010).

3.5 The Surveillance State

Related to the notion 'surveillance society' is that of the 'surveillance state' (Balkin & Levinson 2006, Harris 2010). Justifications advanced by the proponents of surveillance initiatives by governments are customised in order to invoke the 'ideas in good standing' at the particular time, in the particular place (Slobogin 2005). Excuses have included combatting organised crime and money-laundering, conducting the 'war on drugs', fighting terrorism, protecting the public against violent crime, preventing social unrest, identifying publishers and users of child pornography, and defending copyright-dependent corporations. The notion of protecting national security is (perhaps intentionally) highly vague and has proven to be extremely flexible, encompassing not only the many dimensions of national sovereignty interests, but also the protection of critical infrastructure (itself a highly extensible notion), the physical security of individuals deemed by the State to be very important, and the physical security of the general public. A common feature of surveillance initiatives claimed to protect such values is that they are not subjected to evaluation or transparency, nor to adequate controls, nor to audit, and that substantial doubts exist about their justification and proportionality, and the inclusion of appropriate safeguards and mitigation measures (APF 2013, Clarke 2016a).

Familiar examples of the surveillance state include the USSR under Stalin, China during the 'Cultural Revolution', pre-1989 East Germany (Funder 2003), and developing and under-developed countries ruled by despotic regimes; but the phenomenon is not restricted to those archetypes. Law enforcement and national security agencies, even in free-world countries, not only apply similar techniques but also behave at the margins of legality and even breach the law as they see fit. The USA and its collaborating allies, including the other members of the 'Five Eyes' group, have long had very substantial electronic surveillance networks in place, in defiance of national laws and the expectations of international diplomacy (Priest & Arkin 2010). The Snowden revelations confirmed what was already understood by surveillance-watchers, but that came as a surprise to segments of the media and the public that had wilfully closed their eyes to the evidence (Greenwald 2014).

A further threat to a healthy polity is the application of "six intertwined dynamics" to what Tufekci (2014) calls 'computational politics': "the rise of big data, the shift away from demographics to individualized targeting, the opacity and power of computational modeling, the use of persuasive behavioral science, digital media enabling dynamic real-time experimentation, and the growth of new power brokers who own the data or social media environments". These 'dynamics' will be re-visited in the following section.

The trust that governments ought to earn from their citizens has been, and continues to be, undermined by agency misbehaviour. The distrust in public sector institutions leads inevitably to distrust in corporations as well. This is reinforced by the parallel and continual narrative of corporate malperformance (such as grossly inadequate data security, resulting in continual, avoidable data breaches) and of corporate misbehaviour, in surveillance as in other areas (Clarke 2017). Another factor that conjoins distrust across the two sectors is the widespread outsourcing of public sector surveillance to private sector providers, and the 'public-private partnerships' that blur the edges of the (often mandated) personal data collection practices of government agencies and the (mostly at least nominally consensual) personal data collection processes of corporations. For a vision of the future, high-tech, corporatised-government State, see Schmidt & Cohen (2014).

3.6 Surveillance Capitalism

Further evidence of the nervousness in some parts of contemporary society is the recently-formulated notion of 'surveillance capitalism'. This is defined, accessibly in Zuboff (2016) and more formally in Zuboff (2015), as a "new form of information capitalism [which] aims to predict and modify human behavior as a means to produce revenue and market control" (2015, p.75), and to do so at scale.

Zuboff identifies the substantial change as having occurred c. 2000, as Google (at that stage a company that had yet to turn a profit) sought a way to 'monetise' its dominance of the search-engine marketplace (Zuboff 2015, p.79). More succinctly, Zuboff (2016) argues that "Google's profits derive from the unilateral surveillance and modification of human behavior", and, most brutally: "Capitalism has been hijacked by surveillance". She also contends that the model has become widespread since then.

Zuboff criticises the surreptitiousness of the techniques, and the absence of a relationship between individuals and the organisations monitoring them, not least because the lack of transparency precludes recourse. Moreover, the new model pays no respect to governmental regulation intended to achieve balance between corporate innovation and social needs (Zuboff 2015, p.80). For example, in a response to criticism in a leading German newspaper (Maier 2014), Google's Chair Eric Schmidt expressed his frustration with the prospect of being subject to public oversight, characterizing it as 'heavy-handed regulation' and threatening that it would create 'serious economic dangers' for Europe (Schmidt, 2014).

The notions of the information society, surveillance society, the surveillance state and surveillance capitalism provide the framework within which the remainder of this paper examines the nature and impacts of digitalisation.


4. The Digital Surveillance Economy

The study reported here is primarily concerned with economic and social rather than with political matters, and its analysis is intended to be relevant to business executives, policy-makers and consumers, rather than to be as broad-ranging and polemical as Zuboff's work. This paper accordingly treats 'surveillance capitalism' as an adjacent topic rather than adopting it as the frame of reference. Contemporary business models are adapting to the new, highly-information-rich context, and hence the appropriate conceptualisation of the research domain for the present study utilises the word 'economy'.

4.1 The Concept

The 'direct marketing' mechanisms of two to five decades ago have developed into heavy reliance by corporations on the acquisition and exploitation of detailed machine-readable dossiers of information on large numbers of individuals (Davies 1997, Hirst 2013). The term 'exploitation' is subject to interpretations both positive - as in the extraction of advantage from one's own assets - and negative - as in unfair utilisation of something belonging to someone else. Both senses of the word are applicable here, because some of the data arises from the organisation's interactions with the individuals concerned; whereas a great deal more arose in other contexts, and has been acquired from other sources.

Surveillance that produces machine-readable data, in quantity, about more-or-less everybody, has changed the game. A reasonable descriptor for that new game is 'the digital surveillance economy'. But what does that glib phrase mean, in terms amenable to sober analysis? As at December 2016, Google Scholar found a single, prior usage of the term 'digital surveillance economy'. That usage, published within the Media Studies discipline, is consistent with the interpretation adopted here: "I suggest that there is an emerging understanding on the part of users that the asymmetry and opacity of a 'big data divide' augurs an era of powerful but undetectable and un-anticipatable forms of data mining, contributing to their concern about potential downsides of the digital surveillance economy" (Andrejevic 2014, p.1678). This section builds on the predecessor notions, identifies the key features of the digital surveillance economy, and lays a foundation for analysis of its implications and of the reactions against it that may arise.

A longstanding aphorism in media theory is 'If you're not paying, you're the product'. For an early usage in the Web context, see Lewis (2010). This notion stands at the heart of the new approach. The foundations for the new economy lie neither in classical nor neo-classical economics (which are relevant to scarce resources), but rather in information economics (which applies to abundant resources). The business school text-book that heralded the revolution was Shapiro & Varian (1999). Its significance is underlined by the fact that Hal Varian was recruited from Berkeley in 2002 to be Chief Economist at Google.

The Varian thesis is that customers' attention is the marketer's stock-in-trade. It is to be traded for profit, and cross-leveraged with business partners. Shapiro & Varian are completely driven by supply-side thinking, and assume that demand is not only disorganised and powerless, but also that it will remain so. In their theory - and right now in the digital surveillance economy - the interests of consumers have almost no impact on the process, so they can be safely ignored. Relationship marketing, permission-based marketing, interactivity with and participation of consumers, the prosumer concept, customer needs, need satisfaction and consumer alliances are all overlooked. Consumer rights, privacy rights, and laws to protect them might as well not exist. Regulation is merely a barrier to be overcome, and to be replaced by the chimera of 'self-regulation'. The book lacks any vestige of the notions of community, society, equity, discrimination or individual wellbeing. Indeed, the key terms used in this paragraph do not even appear in Shapiro & Varian's index.

The techniques for acquiring personal data have become much more sophisticated during the nearly two decades since Shapiro & Varian wrote; but the principle was laid out very clearly: "[marketers'] strategy should be to bribe users to give them the appropriate demographics, which in turn can be passed onto advertisers ... [The game is about] inducing consumers to give [marketers] the information they want. ... we expect that many consumers will be happy to sell information about themselves for a nominal amount ..." (pp.35-36). On the surface, the authors appear to have been oblivious to the fact that, in most civilised countries, both 'bribing' and 'inducing' are criminal acts. Alternatively they regarded the criminal law as being merely an inconvenient regulatory hurdle that business enterprises needed to overcome.

4.2 Processes, Sources, Structure

Shapiro & Varian's work has been well-applied, and is evidenced in the key features of the new forms of consumer marketing. A partial model was put forward in Degli Esposti (2014), which considered: "the ability of reorienting, or nudging, individuals' future behavior by means of four classes of actions: 'recorded observation'; 'identification and tracking'; 'analytical intervention'; and 'behavioral manipulation'" (p.210). That list of actions is further articulated in Table 1, with greater detail on the elements of digital surveillance economics provided in Appendix 1.

Table 1: Key Processes of the Digital Surveillance Economy

  1. Acquisition of Personal Data from consumers, some through overt interactions with them, and much more through the use of various forms of spyware to gather data about their marketspace behaviour not only when they are in contact with the organisation but also when they are interacting with other sites and experiencing text, images, sound and video through other channels such as e-readers (Palmas 2011, Degli Esposti 2014, p.214, Christl & Spiekermann 2016, pp.45-72)
  2. Expropriation of Personal Data from sources other than the individual, some through the treatment of social media behaviour as though it were 'public domain' and free for use for any purpose; and some through various forms of trading in data, which may be camouflaged by such terms as 'strategic partnerships' (Christl & Spiekermann 2016, pp.76-117)
  3. Consolidation of Personal Data into Individual Consumer Profiles, variously as physical data collections and as 'on the fly' virtual records, whether constructed internally or by other corporations in the large and complex network that makes up the personal-data-exploitation industry (Christl & Spiekermann 2016, pp.87-90)
  4. Analysis of Data, in order to draw inferences about populations, sub-populations and individuals (Degli Esposti 2014, pp.215-219, Christl & Spiekermann 2016, pp.24-38, 84-86). This may include allocation of each individual into one or more categories or profiles. There are many such categorisations. For a sample set of c.60 profiles, see EPIC (2017)
  5. Narrowcasting of Targeted Advertisements, applying consumers' individual profiles so that the specific advertisement that appears on the individual's screen is attractive to them, fitting to their preferences, and to their interests at that particular time (Dwyer 2009)
  6. Behaviour Manipulation, in that the marketer is sufficiently aware of the individual consumer's socio-demographic attributes, prior behaviour and current interests that the message in the advertisement can be designed to be highly persuasive (Degli Esposti 2014, pp.220-221)
  7. Micro-Pricing, in that the marketer is able to gauge the point at which buyer-resistance is likely to arise, and pitch the offer just below it in order to extract the maximum revenue from each customer (Shapiro & Varian 1999 pp.7-50, Acquisti 2008, Christl & Spiekermann 2016, pp.41-44)

__________________

The last of the seven elements in Table 1 is deserving of further attention, because Shapiro & Varian's segment on differential pricing is looming as a particularly important aspect of the whole. The vast array of personal data that has become available to consumer marketers enables not merely the targeting of advertisements and the manipulation of behaviour through timing and shaping of offers. It enables marketers to move beyond pre-set pricing of their offerings - typically at a price the market as a whole will bear - and instead set a price that each individual will bear. Although some consumers may get a lower price, many will get a higher one, and hence the technique allows marketers to extract far greater revenue from the market as a whole - just as Shapiro & Varian (1999) recommended: "How do you extract the most value ...? ... First, personalize or customize your product ... Second, establish pricing arrangements that capture as much of that value as possible" (p.32).
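
The arithmetic is easily illustrated with a deliberately simplified sketch (in Python, with invented willingness-to-pay estimates): under a single set price, revenue is capped by the price that the market as a whole will bear, whereas pricing each individual just below their estimated point of buyer-resistance extracts more revenue in total, even though one or two individuals pay less than the old set price.

    # Hypothetical estimates of what each profiled consumer will bear, inferred from their data.
    estimated_willingness_to_pay = {"alice": 120, "bob": 80, "carol": 200, "dan": 95}

    SET_PRICE = 100   # a conventional single price, pitched at what the market as a whole will bear

    def revenue_set_price(wtp, price):
        # Only those whose tolerance reaches the set price buy at all.
        return sum(price for v in wtp.values() if v >= price)

    def revenue_micro_priced(wtp, margin=0.95):
        # Each individual is offered a price pitched just below their estimated resistance point.
        return sum(round(v * margin) for v in wtp.values())

    print(revenue_set_price(estimated_willingness_to_pay, SET_PRICE))   # 200: bob and dan do not buy
    print(revenue_micro_priced(estimated_willingness_to_pay))           # 470: everyone buys, most pay more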

In Table 2, an indication is provided of the enormously diverse range of sources that are exploited in the digital surveillance economy.

Table 2: Key Data Sources for the Digital Surveillance Economy

  1. eCommerce Transactions
    Some data arises from economic transactions knowingly conducted with an organisation, and the individual would understand that both they and the merchant have interests and rights in relation to that data. But that data is then expropriated for other purposes, and disclosed to other organisations
  2. Advertising 'Transactions'
    Data also arises from clicks on the ads pushed to browsers, and even from the appearance of ads in a browser-window. This is then available to not just the advertiser, but also to the long chains and large networks of intermediaries that feed off that data. Moreover, organisations that have nothing to do with the transaction contrive to get copies of some of the data, or access to services drawing on that data
  3. Payment Transactions
    Beyond the small numbers of high-value transactions that consumers conduct, increasing proportions of low-value payments have been converted from ephemeral and largely anonymous form into recorded and identified records. This has resulted in identified trails of individuals' activities and locations - which are all the more intensive since unauthenticated NFC-chip payments exploded post-2010. This provides at least each individual's financial institutions, but also the card-processing companies (primarily Visa and MasterCard), with a treasure-trove of monetisable personal data
  4. Search-Terms
    The searches for content that emanate from users' devices are an especially rich source, because they are mostly conscious and volitional, and hence closely associable with users' interests and psychological states. In 1998-2000, Digital Equipment and then Compaq failed to appreciate the goldmine represented by the search-terms gathered by its AltaVista search-engine. Google, from its humble beginnings as a late-mover that got lucky, has been able to leverage search-term data so well as to become the corporation with the world's second-largest market capitalisation. (In early 2017, numbers 1-4 were Apple, Alphabet i.e. Google, Microsoft and Amazon.com; number 6 was Facebook; and Alibaba was in hot pursuit)
  5. Media Experiences
    Views of web-pages generate flows of data associated with loginids, stable IP-addresses, browser-fingerprints and GPS coordinates. Video-downloads (be they delivered via YouTube, Netflix or any number of other channels) give rise to data that is associated with households and/or with individuals. So also does e-reading, variously by downloading, opening, page-turning, or dwelling on a page (Cohen 1996, Lynch 2017). Music streaming and purchasing do the same (Rosenblatt 2015)
  6. Social Media
    This diverse and busy cluster of services represents a bountiful source of data not just to the people with whom those individuals perceive themselves to be communicating, and not just to the service-provider, but also to the vast numbers of organisations in what industry writers like to refer to as an 'ecology'. Indicators (such as re-posts, re-tweets and 'likes'), postings (of text, of images, and of videos), and meta-data (such as tags, date-and-time-stamps, identifiers and geo-location data) can be directly exploited. Further information can be inferred and interpolated, particularly from the implicit social networks to which the individual belongs.
  7. The Quantified Self
    A further source is measures of aspects of the human body, emanating from 'wellness devices' (such as Fitbit), but also other forms of 'wearables', and from 'self-tracking' devices such as GPS-based pedometers and vehicle 'satnavs' (Lupton 2016)
  8. The Internet of Things
    As IoT passes through its 'hype-cycle', some applications will survive and flourish. Some of those will be quite expressly designed to generate yet more identified data-streams, and others will generate them as a potentially lucrative by-product

__________________

These streams can be combined, producing what Palmas (2011) refers to as 'panspectric surveillance'. Further, they can be integrated with the plethora of data-sets from pre-Internet expropriation rounds. These include the holdings of credit bureaux, mailing-list and database marketing operators, loyalty schemes, phonebook database operators, and the many organisations that have gained access to electoral roll data by fair means or foul. Many of these hold reasonably reliable socio-demographic data for vast numbers of consumers. Acxiom, Experian and Lexis Nexis are familiar names, originating in the USA, but operating in many other countries as well. Further details and case studies are provided in Christl & Spiekermann (2016, pp. 76-117).

The technical means that have been developed since 2001 to support this feeding-frenzy are enormously diverse. In some cases, protocols and data-formats are semi-standardised, e.g. the highly merchant-friendly and consumer-hostile HTML 5.0 specification (Lyne 2011). In other cases, they are proprietary subterfuges and spyware designs that would quite possibly be found to be technically illegal under the cybercrime provisions enacted in many countries following the Convention on Cybercrime (CoE 2001) - if cases could ever be successfully brought before courts. For an overview of data-streams in various contexts, see Christl & Spiekermann (2016, pp.45-75). For recent reviews of the array of techniques for achieving associations of Web data-streams with individuals, see Mayer & Mitchell (2012) and Bujlow et al. (2015). For data on the extent and effectiveness of tracking using these techniques, see Englehardt & Narayanan (2017). For a recent review of the extent to which Android and iOS apps scatter data to large numbers of organisations, see Zang et al. (2015).
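
As one illustration of the genre (a simplified sketch in Python, not drawn from any of the studies cited above), a 'browser fingerprint' can be derived by hashing a handful of attributes that browsers routinely disclose to every site they visit; the combination is frequently distinctive enough to re-identify a device across sites and sessions without any cookie being stored.

    import hashlib

    # Attributes of the kind a browser routinely discloses (the values here are invented).
    browser_attributes = {
        "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ExampleBrowser/57.0",
        "screen": "1920x1080x24",
        "timezone_offset": "-600",
        "language": "en-AU",
        "installed_fonts": "Arial;Courier New;Times New Roman;Ubuntu",
    }

    def fingerprint(attributes):
        """Concatenate the disclosed attributes and hash them into a compact device identifier."""
        serialised = "|".join("{}={}".format(k, v) for k, v in sorted(attributes.items()))
        return hashlib.sha256(serialised.encode("utf-8")).hexdigest()[:16]

    print(fingerprint(browser_attributes))   # the same device tends to yield the same value on every site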

The most successful exponent, and (not only in Zuboff's eyes) the primary driver of the emergence of the digital surveillance economy, has been Google (Clarke 2006b, Vaidhyanathan 2011, Newman 2011, Angwin 2016). However, multiple other large corporations have followed their own paths, which have begun in areas different from Google's origins in search-terms - Facebook in social media, Apple in music consumption, and Amazon in eCommerce transactions. Microsoft may eventually make up for missed opportunities and lost ground, e.g. through its cloud-based office suite.

The industry structure through which personal data expropriation, exploitation and cross-leveraging is conducted is in continual flux, and its processes are largely obscured from public view. At a simplistic level, it comprises sources, consolidators, analysts, service-providers and user organisations. But many sub-classifications exist, many overlaps occur, and organisations vie to take others over, and to be taken over. Profiling agencies such as VisualDNA, EFL and Cambridge Analytica specialise in processing data-hoards to address specific interests. See Mayer & Mitchell (2012) and the succession of industry models offered by Brinker (2016). Consumer tools such as Ghostery and AdBlock Plus, and their associated lists, identify many hundreds of organisations that are privy to individuals' web-traffic through such means as cookies, web-bugs, web-beacons and ad-/spy-ware.

User organisations that pay, in some cases for copies of data, and in other cases for services based on data, are found across the length and breadth of the consumer marketing, consumer credit, insurance and healthcare industries, and there are also multiple uses in the context of employment. Many government agencies are also sources of revenue - although in some cases they may pay 'in kind', by making 'their' data available for exploitation. Political parties and candidates have recently become an additional, and in some countries, highly lucrative user sector.

As discussed earlier, the digitalisation notion has broad applicability. Whatever the value of digitalisation in other contexts, however, its applications in the field of individual surveillance are being imposed on society, not offered as a choice. Digitalisation features customisation of services to individuals' profiles, enabled by comprehensive digital surveillance of individuals' behaviour. Organisations justify their applications of digital technology on the basis of 'marketing efficiency', the convenience that their products and services offer to people, and the empirical evidence of people being very easily attracted to adopt and use them. Adoption is achieved through bandwagon effects ('viral marketing'), network effects, and features that appeal to individuals' hedonism and that encourage the suspension of disbelief and the suppression of rational decision-making in individuals' own interests.


5. The Digital Surveillance Economy as Threat

Building on the ideas in the preceding sections, a range of issues can be identified. The focus of the first sub-section is on aspects of digitalisation's impacts that are relevant to individuals, and that of the second is on broader concerns.

5.1 Threats to the Individual

Individuals gain from the digital surveillance economy in a hedonistic sense. This includes the convenience of not needing to think about purchases because attractive opportunities continually present themselves, the entertainment value built into the 'customer experience', and perhaps a warm glow arising from the feeling that profit-making corporations understand them.

One negative factor is unnecessary purchases through compulsive buying. These arise in response to targeted ads that succeed in their aim of 'pushing that particular consumer's buttons'. A few individuals may pay lower prices than they would under the old set-price approach; but most pay more, because the attractiveness of factors other than price (in particular the way in which the offer is presented, and the time it appears) enables the seller to extract premiums from most buyers. A very substantial degree of consumer manipulation is inherent in the digital surveillance economy. These concerns have been mainstream since at least the mid-20th century (Packard 1960), but are greatly intensified during the digital era.

Further issues arise in relation to the way in which organisations make decisions about individuals, particularly in such contexts as financial services, insurance and health care services. Discriminatory techniques are used in ways that harm individuals' access, such as blacklisting and 'redlining'. Discrimination was originally based on residential address, but now draws on a far wider range of socio-demographic data to avoid conducting transactions with individuals deemed on the basis of their digital persona to be unattractive customers, or to inflate the prices quoted to them (Lyon 2003, Newell & Marabelli 2015, Ebrahimi et al. 2016).

A considerable proportion of the voluminous data that is available is likely to be irrelevant, of low quality and/or of considerable sensitivity. Further, digitalisation involves not only quantification but also automation of decision-making, unmoderated by people. To a considerable degree, transparency has already been lost, partly as a result of digitalisation, and partly because of the application not of 20th century procedural software development tools, but of later-generation software tools whose 'rationale' is obscure, or to which the notion of rationale doesn't even apply (Clarke 1991, 2016b). The absence of transparency means that unreasonable and even downright wrong decisions cannot be detected, and recourse is all-but impossible.

In a highly competitive market, this may not matter all that much. In the real world, however - especially in Internet contexts where network effects make late entry to the market very challenging - monopolies and oligopolies exist, and micro-monopolies abound, and it is in the interests of business enterprises to create and sustain them. Serious harm to consumers arises in circumstances where, for example, all lenders and all insurers apply the same creditworthiness and risk evaluation techniques to the same data, and where lenders and insurers reject applications and claims based on irrelevant or erroneous data, without transparency, and hence without any effective opportunity for clarification, challenge or rectification.
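
The systemic effect can be conveyed by a minimal sketch (in Python, with an invented scoring rule, invented lender names, and an invented digital persona containing one erroneous item): because every institution applies the same opaque evaluation function to the same consolidated record, a single wrong or irrelevant data-item excludes the individual from the entire market, and the absence of transparency leaves them no way of discovering why.

    # A consolidated digital persona containing one erroneous item (the recorded default never occurred).
    persona = {"postcode": "2999", "age": 34, "recorded_defaults": 1, "income": 72000}

    def shared_score(p):
        """An invented, opaque scoring rule of the kind assumed to be shared across an industry."""
        score = 600
        score += min(p["income"] // 1000, 100)
        score -= 250 * p["recorded_defaults"]                  # a single wrong data-item dominates the outcome
        score -= 50 if p["postcode"].startswith("29") else 0   # a crude locational proxy ('redlining')
        return score

    LENDERS = ["LenderA", "LenderB", "LenderC"]   # hypothetical institutions using the same technique
    decisions = {name: ("approve" if shared_score(persona) >= 500 else "reject") for name in LENDERS}
    print(decisions)   # every lender rejects, for the same undisclosed reason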

The intensity of data-holdings in the private sector results in increasing attractiveness of access to those holdings by public sector organisations. Moreover, there are strong tendencies towards sharing between business and government. For example, outsourcing of what have long been regarded as public functions is rife; so-called 'public-private partnerships' blur the boundaries between the sectors; the mantra of 'open access' to government data is being used to dramatically weaken data safeguards; and during the last 15 years law enforcement and national security agencies have been granted demand powers unprecedented in free nations prior to 2001 other than during circumstances of impending invasion. The risks arising from the digital surveillance economy therefore extend well beyond corporate behaviour, to include government agencies generally.

The connectivity enjoyed by city-dwellers brings with it surveillance that is not only ubiquitous within the areas in which they live, work and are entertained, but also continual, and for some even continuous. Intensive surveillance chills behaviour, undermines the feeling of personal sovereignty, and greatly reduces the scope for self-determination. The digital surveillance economy accordingly has impacts on the individual, in psychological, social and political terms: "psychic numbing ... inures people to the realities of being tracked, parsed, mined, and modified - or disposes them to rationalize the situation in resigned cynicism" (Hoofnagle et al., 2010). As Zuboff noted (2015, p.84), Varian and Google are confident that psychic numbing facilitates the new business model: " ... these digital assistants will be so useful that everyone will want one, and the statements you read today about them will just seem quaint and old fashioned" (Varian 2014, p. 29).

Many of the risks of chilling and oppression arise because of the likelihood of actions by government agencies. However, the incidence of corporate oppression of employees, public interest advocates and individual shareholders has been rising. It is reasonable to expect that the increased social distance between corporations and the public, combined with the arrogance of large corporations in relation to oversight agencies and even to the civil and criminal laws, will see much greater interference by organisations with the behaviour of individuals in the future.

The degree of concern that individuals have about these issues varies greatly. A small proportion, but a significant number, of people are concerned and even alarmed, as a matter of principle. Some of them carry their concern into the practical world, by seeking out and using means to obfuscate and falsify data and to mislead, corrupt and subvert corporations' data-holdings. At the other extreme, a large proportion, and a very large number, of people have very little understanding of the nature of the digital surveillance economy; and some of those who have at least a vague sense of the issues profess to have little or no concern about them, or argue that they can do nothing about it and hence there's no point in worrying about it. Some proportion of the uninformed and unconcerned do, however, suffer experiences that change their views. A meme that has gained traction in recent years has been associated with the word 'creepiness' (e.g. Preston 2016).

5.2 Broader Impacts

A healthy society depends on the effective chilling of some kinds of behaviour (such as violence, and incitement to violence, against individuals, minorities and the public generally), but also on other kinds of behaviours not being chilled. Creativity in scientific, technological and economic contexts is just as vitally dependent on freedom of speech and action as is creativity in the artistic, cultural, social and political realms.

Considerable concerns have been repeatedly expressed, over an extended period, about the capacity of surveillance to stultify economies, societies and polities. The dramatic increase in surveillance intensity that arrived with the new century gives rise to much higher levels of concern: "Using surveillance to achieve one's aims, no matter how grand or how miniscule [sic], bestows great power. ... Some interests will be served, while others will be marginalised. ... Surveillance coalesces in places where power accumulates, underpinning and enhancing the activities of those who rule and govern. The danger is that surveillance power becomes ubiquitous: embedded within systems, structures and the interests they represent. Its application becomes taken for granted and its consequences go un-noticed. ... it is important that this power ... is wielded fairly, responsibly, and with due respect to human rights, civil liberties and the law. ... Sifting through consumer records to create a profitable clientele means that certain groups obtain special treatment based on ability to pay whereas those deemed 'less valuable' fall by the wayside. ... If we are living in a society which relies on surveillance to get things done are we committing slow social suicide?" (SSN 2014).

More forcefully: "the modern age - which began with such an unprecedented and promising outburst of human activity - may end in the deadliest, most sterile passivity history has ever known" (Arendt 1998, p. 322, quoted in Zuboff 2015, p.82). It may be that the new business models will drive corporations to ignore and overwhelm governments and polities: "the waves of lawsuits breaking on the shores of the new surveillance fortress are unlikely to alter the behavior of surveillance capitalists. Were surveillance capitalists to abandon their contested practices according to the demands of aggrieved parties, the very logic of accumulation responsible for their rapid rise to immense wealth and historic concentrations of power would be undermined" (Zuboff 2015, p.86).

Art naturally preceded the hard reality. Beginning with Herbert (1965) and Brunner (1975), and in cyberpunk sci-fi from Gibson (1984) to Stephenson (1992), hypercorps (the successors to transnational corporations) dominate organised economic activity, and have associated with them polite society (the successor to 'corporation man'), the majority of the net and a great deal of information. Outside the official levels of society skulk large numbers of people, in communities in which formal law and order have broken down, and tribal patterns have re-emerged. Officialdom has not been able to sustain the myth that it was in control; society has become ungovernable: "[cyberpunk authors] depict Orwellian accumulations of power ..., but nearly always clutched in the secretive hands of a wealthy or corporate elite" (Brin 1998). The term 'corpocracy' has emerged, to refer to an economic or political system controlled by corporations (Monks 2008).

The pattern that this paper has described as the digital surveillance economy undermines a great deal more than just the interests of individual consumers. Many aspects of society and polity are threatened by the shift of power from the public sector - which is at least somewhat subject to ethical constraints, and to transparency requirements and regulatory processes that have some degree of effectiveness - to dominance of a private sector that is largely free of all of them.


6. A Research Agenda

Only a small proportion of the information systems literature concerns itself with the impacts and implications of IT applications. Within that small literature, even fewer works are instrumentalist in orientation. One of the few that suggests a research agenda in the area addressed in this paper (although the authors use 'datification' as the focal point) is Newell & Marabelli (2015). However, that work proposes the use of an 'ethical dilemma lens', which seems unlikely to gain traction with business enterprises, or even with regulatory agencies.

The purpose of this paper has been to lay the foundations for a research programme whereby concerns about the threats to individuals inherent in the digital surveillance economy can be investigated, and appropriate responses formulated. This section makes two contributions. The first sub-section suggests a set of research questions that reflect alternative future paths that society may trace following the explosive growth of the digital surveillance economy. The second sub-section then suggests the shape that a research programme might take.

6.1 Research Questions

Future developments might follow any of a range of alternative paths. Scenario analysis builds story-lines that represent composite or 'imagined but realistic' worlds, and hence provides insights into possible future paths; it is an appropriate approach to research into alternative futures (Wack 1985, Schwartz 1991, van Notten et al. 2003, Postma & Leibl 2005, Clarke 2015b). Three potential scenario settings are used below as an organising framework for research questions that arise from the preceding analysis.

Scenario (1) The Trajectory Is Set

In this scenario, the presumption is made that net technologies combine with, firstly, the weakening of the nation-state and, secondly, the self-interest, power and momentum of the powerful corporations at the centre of the digital surveillance economy, such that the new business model establishes itself as the primary determinant not only of economies but also of societies and polities.

Key questions that need investigation are:

As this paper has shown, a great deal of what is necessary for Scenario (1) to come to fruition is already in place, and hence this might be regarded as the default Scenario.

Scenario (2) Regulatory Adaptation

An alternative scenario investigates the possibility that the action arising from the digital surveillance economy begets an effective reaction:

Ceglowski (2015) suggests that the foundations of the current digital surveillance economy are rotten, and that the current, bloated Internet advertising industry structure will implode. This would of course influence processes and industry structure. It seems unlikely, however, that such changes would resolve the issues that this paper addresses. The rewards arising from targeted advertising based on intensive data about consumers appear to be so large that industry re-structuring is more likely to alter the allocation of benefits among corporations than to cause fundamental change in the practices. More likely paths may be found among the possible regulatory interventions identified in Christl & Spiekermann (2016, pp. 142-143).

Scenario (3) Counter-Revolution

Zuboff extracted this implication from her 'surveillance capitalism' analysis: "Nothing short of a social revolt that revokes collective agreement to the practices associated with the dispossession of behavior will alter surveillance capitalism's claim to manifest data destiny ... New interventions [are] necessary ... The future of this narrative will depend upon the indignant scholars and journalists drawn to this frontier project, indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities, and indignant citizens who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, and freedom from uncertainty is no freedom" (Zuboff 2016). The third scenario accordingly considers the possibility that the public takes the matter into its own hands:

The scope for societal resistance to digital surveillance has been studied in the abstract (e.g. Martin et al. 2009). Meanwhile, a wide variety of technological initiatives have been launched in support of resistance against corporate excess (Goldberg 2007, Shen & Pearson 2011, and see the proceedings of the PETS and SOUPS communities). Clarke (2015a) identifies the principles of obfuscation and falsification of data, messages, identities, locations and social networks. Schneier (2015a, 2015b) examines the battleground in greater detail, and offers a related set of principles: avoid, distort, block and break surveillance. Bösch et al. (2016), conversely, catalogue the 'privacy dark strategies' deployed against users: maximise, publish, centralise, preserve, obscure, deny, violate, fake. A large and rapidly-growing proportion of the world's user devices blocks ads (PageFair 2017), and it may be that this is a harbinger of greater awareness and activism among consumers.
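
As a concrete illustration of the obfuscation principle mentioned above, the following minimal sketch (hypothetical decoy terms, counts and the issue_query() stand-in are all invented for illustration; this is not a description of any particular tool) interleaves a user's genuine search query with randomly selected decoy queries, so that any profile inferred from the observed query stream becomes less reliable:

```python
# Illustrative sketch of the obfuscation principle: interleave genuine queries
# with decoys so that inferences drawn from the observed stream are degraded.
# All terms, counts and the issue_query() stand-in are hypothetical.
import random

DECOY_TOPICS = [
    "weather forecast", "public transport timetable", "recipe ideas",
    "local sports results", "gardening tips", "movie session times",
]

def issue_query(query: str) -> None:
    # Stand-in for actually submitting the query to a search service.
    print(f"submitted: {query}")

def obfuscated_search(genuine_query: str, decoys_per_query: int = 3) -> None:
    """Submit the genuine query hidden among randomly ordered decoys."""
    queries = random.sample(DECOY_TOPICS, k=decoys_per_query) + [genuine_query]
    random.shuffle(queries)          # an observer cannot tell which query is genuine
    for q in queries:
        issue_query(q)

obfuscated_search("treatment options for chronic illness")
```

Falsification goes a step further, substituting incorrect values rather than merely diluting correct ones; ad-blocking, by contrast, suppresses the delivery channel rather than degrading the data flow.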

6.2 A Research Programme

The analysis in the preceding sections may appear somewhat bold, futuristic and to a degree apocalyptic. There is, however, some recognition within at least the German Information Systems community of the need for a broadly-based research programme - although the authors' proposals are primarily for the benefit of business and only secondarily for individuals, and they seem to have given little consideration to society and polity: "research on Digital Life requires extensive multidisciplinary activities in mixed research teams with experts from, for instance, psychology, cognitive sciences, or engineering, and co-innovation in cooperation with start-ups, spin-offs, and SMEs" (Hess et al. 2014, p.248).

Scenario analysis was suggested above as an appropriate means of integrating the findings from multiple disciplines and professions. However, scenario analysis depends on a clear description of the current state, an appreciation of environmental and strategic factors that may play a role in future developments, and bodies of theory that suggest particular actions and reactions that may occur under the various contingencies. In order to provide the necessary springboard, existing expertise and theories need to be identified and applied.

A first step towards addressing Research Questions such as those identified in the previous section is to establish what existing research already tells us about the behaviours of individuals and societies under stress, and particularly under surveillance stress. That will enable the identification of gaps in knowledge, the formulation of more specific research questions to fill those gaps, and the conception and conduct of research to provide answers.

The programme can utilise the theories, insights and research techniques of many different disciplines. Some examples of relevant research are as follows:


7. Conclusions

This paper has identified the risks to individuals, society and polity inherent in digitalisation. In order to do this, it was necessary to review the insights gained through a succession of conceptualisations of the changing world that the march of information technologies has brought about during the last half-century. The high degree of optimism in the early post-industrial society and information society phases has been tempered, and has become more pessimistic, as surveillance society and the surveillance state have developed, the digital surveillance economy has emerged, and surveillance capitalism has been postulated.

During the last 15 years, the digital surveillance economy has come into existence; its impacts are measurable, its implications can at least be speculated upon, and some aspects may now be sufficiently stable for preliminary studies to be undertaken. Digitalisation harbours threats to individuals, to society, to economies and to polities. Whether those threats will have the impact that doom-sayers suggest needs to be investigated. And such investigations need to be conducted sufficiently early that, to the extent that the pessimism is justified, countermeasures can be developed and implemented before the harm becomes irreversible.

Conventional, rigorous, empirical research is useless in the face of such a challenge. Research techniques need to be deployed that are future-oriented and instrumentalist. Researchers are needed who are not terrified by the prospect of embracing value-laden propositions within their research models. For progress to be made, hard questions need to be asked, and insights must be drawn from the findings and theories of multiple disciplines. This paper suggests a research agenda that enables the challenge to be taken up.


References

Acquisti A. (2008) 'Identity management, Privacy, and Price discrimination' IEEE Security & Privacy 6, 2 (March/April 2008) 18-22, at http://www.heinz.cmu.edu/~acquisti/papers/j2acq.pdf

Andrejevic M. (2014) 'The Big Data Divide' International Journal of Communication 8 (2014), 1673-1689, at http://espace.library.uq.edu.au/view/UQ:348586/UQ348586_OA.pdf

Angwin J. (2016) 'Google has quietly dropped ban on personally identifiable Web tracking' ProPublica, 21 October 2016, at https://www.propublica.org/article/google-has-quietly-dropped-ban-on-personally-identifiable-web-tracking

APF (2013) 'Meta-Principles for Privacy Protection' Australian Privacy Foundation, 2013, at http://www.privacy.org.au/Papers/PS-MetaP.html

Balkin J.M. & Levinson S. (2006) 'The Processes of Constitutional Change: From Partisan Entrenchment to the National Surveillance State' Yale Law School Faculty Scholarship Series, Paper 231, at http://digitalcommons.law.yale.edu/fss_papers/231

Bell D. (1973) 'The Coming of Post Industrial Society' Penguin, 1973

Bell W. (2004) 'Foundations of Futures Studies: Human Science for a New Era: Values' Transaction Publishers, 2004

Beniger J.R. (1986) 'The Control Revolution: Technological and Economic Origins of the Information Society' Harvard Uni. Press, Cambridge MA, 1986

Brennen S. & Kreiss D. (2016) 'Digitalization and Digitization' International Encyclopedia of Communication Theory and Philosophy, October 2016, PrePrint at http://culturedigitally.org/2014/09/digitalization-and-digitization/

Brin D. (1998) 'The Transparent Society' Basic Books, 1998

Brinker S. (2016) 'Marketing Technology Landscape' Chief Marketing Technologist Blog, March 2016, at http://chiefmartec.com/2016/03/marketing-technology-landscape-supergraphic-2016/

Bösch C., Erb B., Kargl F., Kopp H. & Pfattheicher S. (2016) 'Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns' Proc. Privacy Enhancing Technologies 4 (2016) 237-254, at http://www.degruyter.com/downloadpdf/j/popets.2016.2016.issue-4/popets-2016-0038/popets-2016-0038.xml

Brunner J. (1975) 'The Shockwave Rider' Harper & Row, 1975

Bujlow T., Carela-Español V., Solé-Pareta J. & Barlet-Ros P. (2015) 'Web Tracking: Mechanisms, Implications, and Defenses' arXiv, 28 Jul 2015, at https://arxiv.org/abs/1507.07872

Cawkwell A.E. (Ed.) (1987) 'Evolution of an Information Society' Aslib, London, 1987

Ceglowski M. (2015) 'The Advertising Bubble' Idlewords.com, 14 November 2015, at http://idlewords.com/2015/11/the_advertising_bubble.htm

Christl W. & Spiekermann S. (2016) 'Networks of Control: A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy' Facultas, Wien, 2016

Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, PrePrint at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1991) 'A Contingency Approach to the Application Software Generations' Database 22, 3 (Summer 1991) 23 - 34, PrePrint at http://www.rogerclarke.com/SOS/SwareGenns.html

Clarke R. (1993) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance' Journal of Law and Information Science 4,2 (December 1993), PrePrint at http://www.rogerclarke.com/DV/PaperProfiling.html

Clarke R. (1994a) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, PrePrint at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. (1994b) 'Dataveillance by Governments: The Technique of Computer Matching' Information Technology & People 7,2 (December 1994) 46-85, PrePrint at http://www.rogerclarke.com/DV/MatchIntro.html

Clarke R. (1997) 'Instrumentalist Futurism: A Tool for Examining I.T. Impacts and Implications' Working Paper, Xamax Consultancy Pty Ltd, October 1997, at http://www.rogerclarke.com/DV/InstFut.html

Clarke R. (1999) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report' Proc. 12th Int'l Bled Electronic Commerce Conf., June 1999, PrePrint at http://www.rogerclarke.com/EC/WillPay.html

Clarke R. (2000) 'Technologies of Mass Observation' Mass Observation Movement Forum, Melbourne, October 2000, at http://www.rogerclarke.com/DV/MassObsT.html

Clarke R. (2001) 'If e-Business is Different Then So is Research in e-Business' Invited Plenary, Proc. IFIP TC8 Working Conference on E-Commerce/E-Business, Salzburg, 22-23 June 2001, published in Andersen K.V. et al. (eds.) 'Seeking Success in E-Business' Springer, New York, 2003, PrePrint at http://www.rogerclarke.com/EC/EBR0106.html

Clarke R. (2006a) 'A Pilot Study of the Effectiveness of Privacy Policy Statements' Proc. 19th Bled eCommerce Conf., Slovenia, June 2006, PrePrint at http://www.rogerclarke.com/EC/PPSE0601.html

Clarke R. (2006b) 'Google's Gauntlets' Computer Law & Security Report 22, 4 (July-August 2006) 287-297, Preprint at http://www.rogerclarke.com/II/Gurgle0604.html

Clarke R. (2006c) 'A Major Impediment to B2C Success is ... the Concept 'B2C'' Invited Keynote, ICEC'06, Fredericton NB, Canada, August 2006, PrePrint at http://www.rogerclarke.com/EC/ICEC06.html

Clarke R. (2006d) 'Make Privacy a Strategic Factor - The Why and the How' Cutter IT Journal 19, 11 (October 2006), PrePrint at http://www.rogerclarke.com/DV/APBD-0609.html

Clarke R. (2008a) 'B2C Distrust Factors in the Prosumer Era' Invited Keynote, Proc. CollECTeR Iberoamerica, Madrid, June 2008, PrePrint at http://www.rogerclarke.com/EC/Collecter08.html

Clarke R. (2008b) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#, PrePrint at http://www.rogerclarke.com/EC/Web2C.html

Clarke R. (2009a) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, revised version at http://www.rogerclarke.com/ID/IdModel-1002.html

Clarke R. (2009b) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html

Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, PrePrint at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, PrePrint at http://www.rogerclarke.com/EC/CCC.html

Clarke R. (2014) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182 - 207, PrePrint at http://www.rogerclarke.com/ID/DP12.html

Clarke R. (2015a) 'Freedom and Privacy: Positive and Negative Effects of Mobile and Internet Applications' Proc. Interdisciplinary Conference on 'Privacy and Freedom', Bielefeld University, 4-5 May 2015, at http://www.rogerclarke.com/DV/Biel15.html

Clarke R. (2015b) 'Quasi-Empirical Scenario Analysis and Its Application to Big Data Quality' Proc. 28th Bled eConference, Slovenia, 7-10 June 2015, PrePrint at http://www.rogerclarke.com/EC/BDSA.html

Clarke R. (2016a) 'Privacy Impact Assessments as a Control Mechanism for Australian National Security Initiatives' Computer Law & Security Review 32, 3 (May-June 2016) 403-418, PrePrint at http://www.rogerclarke.com/DV/IANS.html

Clarke R. (2016b) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at http://www.rogerclarke.com/EC/BDQAS.html

Clarke R. (2017) 'Vignettes of Corporate Privacy Disasters' Xamax Consultancy Pty Ltd, March 2017, at http://www.rogerclarke.com/DV/PrivCorp.html

CoE (2001) 'The Cybercrime Convention' Convention 185, Council of Europe, 2001, at https://www.coe.int/en/web/conventions/full-list/-/conventions/rms/0900001680081561

Cohen J.E. (1996) 'A right to read anonymously: A closer look at 'copyright management' in cyberspace' Conn. Law Review 28, 981 (1996) 1003-19, at http://scholarship.law.georgetown.edu/facpub/814/

Cukier K. & Mayer-Schoenberger V. (2013) 'The Rise of Big Data' Foreign Affairs (May/June) 28-40

Davies S. (1997) 'Time for a byte of privacy please' Index on Censorship 26, 6 (1997) 44-48

Davis B. (2014) 'A guide to the new power of Facebook advertising' Econsultancy, 29 May 2014, at https://econsultancy.com/blog/64924-a-guide-to-the-new-power-of-facebook-advertising/

Degli Esposti S. (2014) 'When big data meets dataveillance: The hidden side of analytics' Surveillance & Society 12, 2 (2014) 209-225, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/analytics/analytic

Dreborg K.H. (1996) 'Essence of Backcasting' Futures 28, 9 (1996) 813-828

Drucker P.F. (1968) 'The Age of Discontinuity' Pan Piper, 1968

Drucker P.F. (1993) 'Post-Capitalist Society' Harper Business, 1993

Dunn E.S. (1967) 'The idea of a national data center and the issue of personal privacy' The American Statistician, 1967

Dwyer C. (2009) 'Behavioral Targeting: A Case Study of Consumer Tracking on Levis.com' Proc. AMCIS, 2009, p.460

Ebrahimi S., Ghasemaghaei M. & Hassanein K. (2016) 'Understanding the Role of Data Analytics in Driving Discriminatory Managerial Decisions' Proc. ICIS 2016

Englehardt S. & Narayanan A. (2017) 'Online tracking: A 1-million-site measurement and analysis' WebTAP Project, Princeton University, January 2017, at https://webtransparency.cs.princeton.edu/webcensus/

EPIC (2017) 'Privacy and Consumer Profiling' Electronic Privacy Information Center, 2017, at https://epic.org/privacy/profiling/

Flaherty D.H. (1984) 'Privacy and Data Protection: An International Bibliography' Mansell, 1984

Flaherty D.H. (1988) 'The emergence of surveillance societies in the western world: Toward the year 2000' Government Information Quarterly 5, 4 (1988) 377-387

Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989

Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Pantheon, 1977

Froomkin A.M. (1995) 'Anonymity and its Enmities' Journal of Online Law 1, 4 (1995), at https://papers.ssrn.com/sol3/Delivery.cfm/SSRN_ID2715621_code69470.pdf?abstractid=2715621&mirid=1

Funder A. (2003) 'Stasiland: Stories from behind the Berlin Wall' Granta, 2003

Gandy O.H. (1989) 'The Surveillance Society: Information Technology and Bureaucratic Social Control' Journal of Communication 39, 3 (September 1989) 61-76

Gandy O.H. (1993) 'The Panoptic Sort. Critical Studies in Communication and in the Cultural Industries' Westview, Boulder CO, 1993

Gibson W. (1984) 'Neuromancer' Grafton/Collins, London, 1984

Gilliom J. & Monahan T. (2012) 'SuperVision: An Introduction to the Surveillance Society' University of Chicago Press, 2012

Goldberg I. (2007) 'Privacy Enhancing Technologies for the Internet III: Ten Years Later' Chapter 1 of Acquisti A. et al. (eds.) 'Digital Privacy: Theory, Technologies, and Practices' Auerbach, 2007, at https://cs.uwaterloo.ca/~iang/pubs/pet3.pdf

Graham S. & Wood D. (2003) 'Digitizing Surveillance: Categorization, Space, Inequality' Critical Social Policy 23, 2 (May 2003) 227-248

Gray P. & Hovav A. (2008) 'From Hindsight to Foresight: Applying Futures Research Techniques in Information Systems' Communications of the Association for Information Systems 22, 12 (2008), at http://aisel.aisnet.org/cais/vol22/iss1/12

Greenwald G. (2014) 'No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State' Metropolitan Books, 2014, at https://archive.org/download/pdfy-jczuhqxxCAsjKlM0/no%20place%20to%20hide%20(edward%20snowden%20and%20gleen%20greenwald)%20complete.pdf

Harris S. (2010) 'The Watchers: The Rise of America's Surveillance State' Penguin, 2010

Hazen B.T., Boone C.A., Ezell J.D. & Jones-Farmer L.A. (2014) 'Data Quality for Data Science, Predictive Analytics, and Big Data in Supply Chain Management: An Introduction to the Problem and Suggestions for Research and Applications' International Journal of Production Economics 154 (August 2014) 72-80, at https://www.researchgate.net/profile/Benjamin_Hazen/publication/261562559_Data_Quality_for_Data_Science_Predictive_Analytics_and_Big_Data_in_Supply_Chain_Management_An_Introduction_to_the_Problem_and_Suggestions_for_Research_and_Applications/links/0deec534b4af9ed874000000

Herbert F. (1965) 'Dune' Chilton Books, 1965

Hess T., Legner C., Esswein W., Maaß W., Matt, C., Österle H., Schlieter H., Richter P. & Zarnekow, R. (2014) 'Digital Life as a Topic of Business and Information Systems Engineering?' Business & Information Systems Engineering 6, 4 (2014) 247-253

Hirst M. (2013) 'Someone's looking at you: welcome to the surveillance economy' The Conversation, 26 July 2013, at http://theconversation.com/someones-looking-at-you-welcome-to-the-surveillance-economy-16357

Hoofnagle C.J., King J., Li S. & Turow J. (2010) 'How different are young adults from older adults when it comes to information privacy attitudes and policies?' SSRN, April 2010, at http://www.ssrn.com/abstract=1589864

Kelly K. (1998) 'New Rules for the New Economy' Penguin, 1998, at http://kk.org/books/KevinKelly-NewRules-withads.pdf

Kistermann F.W. (1991) 'The Invention and Development of the Hollerith Punched Card: In Commemoration of the 130th Anniversary of the Birth of Herman Hollerith and for the 100th Anniversary of Large Scale Data Processing' IEEE Annals of the History of Computing 13, 3 (July 1991) 245 - 259

Land F. (2000) 'The First Business Computer: A Case Study in User-Driven Innovation' J. Annals of the Hist. of Computing 22, 3 (July-September 2000) 16-26

Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt and Company, New York, 1992

Levine R., Locke C., Searls D. & Weinberger D. (2000) 'The Cluetrain Manifesto: The End of Business as Usual' Perseus, 2000, summary at http://www.cluetrain.com/

Lewis A. (2010) 'If you are not paying for it, you're not the customer; you're the product being sold' MetaFilter, August 2010, at http://www.metafilter.com/95152/Userdriven-discontent#3256046

Lupton D. (2016) 'The diverse domains of quantified selves: self-tracking modes and dataveillance' Economy and Society 45, 1 (2016) 101-122, at https://www.researchgate.net/profile/Deborah_Lupton/publication/301330849_The_diverse_domains_of_quantified_selves_self-tracking_modes_and_dataveillance/links/5738371308ae9ace840c4138.pdf

Lycett M. (2013) 'Datafication: Making Sense of (Big) Data in a Complex World' European Journal of Information Systems 22, 4 (December 2014) 381-386, at http://v-scheiner.brunel.ac.uk/bitstream/2438/8110/2/Fulltext.pdf

Lynch C. (2017) 'The rise of reading analytics and the emerging calculus of reader privacy in the digital world' First Monday 22, 24 (3 April 2017), at http://firstmonday.org/ojs/index.php/fm/article/view/7414/6096

Lyne J. (2011) 'HTML5 and Security on the New Web' Sophos Security, December 2011, at http://www.sophos.com/en-us/medialibrary/PDFs/other/sophosHTML5andsecurity.pdf

Lyon D. (1994) 'The electronic eye: The rise of surveillance society' Polity Press, 1994

Lyon D. (2001) 'Surveillance society' Open University Press, 2001

Lyon D. (2003) 'Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination' Routledge, 2003

Machlup F. (1962) 'Production and Distribution of Knowledge in the United States' Princeton University Press, 1962

Maier R.M. (2014) 'Angst vor Google: Von der Suchmaschine zur Weltmacht' Frankfurter Allgemeine Zeitung, 3 April 2014, at http://www.faz.net/aktuell/feuilleton/debatten/weltmacht-google-ist-gefahr-fuer-die-gesellschaft-12877120.html?printPagedArticle=true#pageIndex_2

Majchrzak A. & Markus M.L. (2014) 'Methods for Policy Research: Taking Socially Responsible Action' Sage, 2nd Edition, 2014

Martin A.K., Van Brakel R. & Bernhard D. (2009) 'Understanding resistance to digital surveillance: Towards a multi-disciplinary, multi-actor framework' Surveillance & Society 6, 3 (2009) 213-232, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/3282/3245

Martino J.P. (1993) 'Technological Forecasting for Decision Making' 3rd Edition, McGraw-Hill, 1993

Marx G.T. (1985) 'The Surveillance Society: The Threat Of 1984-Style Techniques' The Futurist, 1985

Masuda Y. (1981) 'The Information Society as Post-Industrial Society' World Future Society, Bethesda, 1981

Mayer J.R. & Mitchell J.C. (2012) 'Third-Party Web Tracking: Policy and Technology ' Proc. IEEE Symposium on Security and Privacy, 2012, at http://ieeexplore.ieee.org/iel5/6233637/6234400/06234427.pdf

Miller A.R. (1969) 'Computers and Privacy' Michigan L. Rev. 67 (1969) 1162-1246

Monks R.A.G. (2008) 'Corpocracy: How CEOs and the Business Roundtable Hijacked the World's Greatest Wealth Machine and How to Get it Back' Wiley, 2008

Naisbitt J. (1982) 'Megatrends' London, Macdonald, 1982

Negroponte N. (1995) 'Being Digital' Hodder & Stoughton, 1995

Newman N. (2011) 'You're Not Google's Customer -- You're the Product: Antitrust in a Web 2.0 World' The Huffington Post, 29 March 2011, at http://www.huffingtonpost.com/nathan-newman/youre-not-googles-custome_b_841599.html

Newell S. & Marabelli M. (2015) 'Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datification'' The Journal of Strategic Information Systems 24, 1 (2015) 3-14, at https://www.researchgate.net/profile/Marco_Marabelli/publication/273478334_Strategic_opportunities_and_challenges_of_algorithmic_decision-making_A_call_for_action_on_the_long-term_societal_effects_of_'datification'/links/558a2ce508ae2affe715374b/Strategic-opportunities-and-challenges-of-algorithmic-decision-making-A-call-for-action-on-the-long-term-societal-effects-of-datification

Niederman F, Applegate L., Beck R., Clarke R., King J.L. & Majchrzak A. (2016) 'IS Research and Policy' Notes from the 2015 ICIS Senior Scholar's Forum, Forthcoming Commun. Assoc. Infor. Syst., 2016

Normann R (2001) 'Reframing business: When the map changes the landscape' Wiley, 2001

Norris C. & Armstrong G. (1999) 'The maximum surveillance society: The rise of CCTV' Berg Publishers, 1999

van Notten P.W.F., Rotmans J., van Asselt M.B.A. & Rothman D.S. (2003) 'An Updated Scenario Typology' Futures 35 (2003) 423-443, at http://ejournal.narotama.ac.id/files/An%20updated%20scenario%20typology.pdf

Oesterle H. (2014) 'Business oder Life Engineering?' HMD Praxis der Wirtschaftsinformatik 51, 6 (December 2014) 744-761, at http://link.springer.com/article/10.1365/s40702-014-0097-x

O'Reilly T. (2005) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly 30 September 2005, at http://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html

Osler M., Lund R., Kriegbaum M., Christensen U. & Andersen A.-M.N. (2006) 'Cohort Profile: The Metropolit 1953 Danish Male Birth Cohort' Int. J. Epidemiol. 35, 3 (June 2006) 541-545, at http://ije.oxfordjournals.org/content/35/3/541.long

Packard V. (1960) 'The Hidden Persuaders' Penguin 1960

PageFair (2017) 'The state of the blocked web - 2017 Global Adblock Report' PageFair, February 2017, at https://pagefair.com/downloads/2017/01/PageFair-2017-Adblock-Report.pdf

Palmas K. (2011) 'Predicting What You'll Do Tomorrow: Panspectric Surveillance and the Contemporary Corporation' Surveillance & Society 8, 3 (2011) 338-354, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/4168/4170

PEW (2014) 'Digital Life in 2025' PEW Research Center, 11 March 2014, at http://www.pewinternet.org/2014/03/11/digital-life-in-2025/

Postma T.J.B.M. & Leibl F. (2005) 'How to improve scenario analysis as a strategic management tool?' Technological Forecasting & Social Change 72 (2005) 161-173, at https://www.researchgate.net/profile/Theo_Postma/publication/222668211_How_to_improve_scenario_analysis_as_a_strategic_management_tool/links/09e41508117c39a2c7000000.pdf

Preston R. (2016) 'Are These Employer Data Practices Creepy? It Depends...' Forbes, 8 November 2016, at http://www.forbes.com/sites/oracle/2016/11/08/are-employer-data-practices-creepy-it-depends/

Priest D. & Arkin W.M. (2010) 'Top Secret America -- A Hidden World, Growing Beyond Control' The Washington Post, 19 July 2010, at http://projects.washingtonpost.com/top-secret-america/articles/a-hidden-world-growing-beyond-control/

Rosenberg J.M. (1969) 'The Death of Privacy' Random House, 1969

Rosenblatt B. (2015) 'The myth of DRM-free music' Copyright and Technology (31 May 2015), at http://copyrightandtechnology.com/2015/05/31/the-myth-of-drm-free-music/

Roszak T. (1986) 'The Cult of Information' Pantheon, 1986

Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974

Schmidt E. (2014) 'A chance for growth' Frankfurter Allgemeine Zeitung, 9 April 2014, at http://www.faz.net/aktuell/feuilleton/debatten/eric-schmidt-about-the-good-things-google-does-a-chance-for-growth-12887909.html

Schmidt E. & Cohen J. (2014) 'The New Digital Age: Reshaping the Future of People, Nations and Business' Knopf, 2013

Schneier B. (2015a) 'Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World' Norton, March 2015

Schneier B. (2015b) 'How to mess with surveillance' Slate, 2 March 2015, at http://www.slate.com/articles/technology/future_tense/2015/03/data_and_goliath_excerpt_the_best_ways_to_undermine_surveillance.html

Schwartz P. (1991) 'The Art of the Long View: Planning for the Future in an Uncertain World' Doubleday, 1991

Shapiro C. & Varian H.R. (1999) 'Information Rules: A Strategic Guide to the Network Economy' Harvard Business School Press, 1999

Shen Y. & Pearson S. (2011) 'Privacy enhancing technologies: a review' HP Laboratories, 2011, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.377.2136&rep=rep1&type=pdf

Slobogin C. (2005) 'Transaction Surveillance by Government' Miss. L.J. 75, 139 (2005-2006) 139-191, at https://discoverarchive.vanderbilt.edu/bitstream/handle/1803/5720/Transaction%20Surveillance.pdf?sequence=1

SSN (2014) 'An introduction to the surveillance society', Surveillance Studies Network, 2014, at http://www.surveillance-studies.net/?page_id=119

Stephenson N. (1992) 'Snow Crash' Bantam, 1992

Toffler A. (1970) 'Future Shock' Pan, 1970

Toffler A. (1980) 'The Third Wave' Collins, London, 1980

Touraine A. (1971) 'The Post-Industrial Society: Tomorrow's Social History' Random House, 1971

Tufekci Z. (2014) 'Engineering the public: Big data, surveillance and computational politics' First Monday 19, 7 (7 July 2014), at http://firstmonday.org/ojs/index.php/fm/article/view/4901/4097

Vaidhyanathan S. (2011) 'The Googlization of Everything (And Why We Should Worry)' University of California Press, 2011

Varian H.R. (2014) 'Beyond Big Data' Business Economics 49, 1 (2014) 27-31

Wack P. (1985) 'Scenarios: Uncharted Waters Ahead' Harv. Bus. Rev. 63, 5 (September-October 1985) 73-89

Warner, M., and Stone, M. (1970) 'The Data Bank Society: Organisations, Computers and Social Freedom' George Allen and Unwin, 1970

Weingarten F.W. (1988) 'Communications Technology: New Challenges to Privacy' J. Marshall L. Rev. 21, 4 (Summer 1988) 735

Wessell M.R. (1974) 'Freedom's Edge: The Computer Threat to Society' Addison-Wesley, Reading, Mass., 1974

Zang J., Dummit K., Graves J., Lisker P. & Sweeney L. (2015) 'Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps' Technology Science, 2015103001, 30 October 2015, at http://techscience.org/a/2015103001

Zimmer M. (2008) 'The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance' in Spink A. & Zimmer M. (eds.) 'Web Search: Multidisciplinary Perspectives' Springer, 2008, pp. 77-102, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.457.8268&rep=rep1&type=pdf#page=83

Zuboff S. (2015) 'Big Other: Surveillance Capitalism and the Prospects of an Information Civilization' Journal of Information Technology 30 (2015) 75-89, at https://cryptome.org/2015/07/big-other.pdf

Zuboff S. (2016) 'The Secrets of Surveillance Capitalism' Frankfurter Allgemeine, 3 May 2016, at http://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html?printPagedArticle=true#pageIndex_2


Appendix 1: Some Elements of Digital Surveillance Society

Data Management Elements

Commercial Elements

Technical Elements

Contexts

Spheres of Digital Life

After (Oesterle 2014)

...


Acknowledgements

My thanks for interactions and feedback on the topic to Prof. Robert Davison of the City University of Hong Kong, Prof. Markus Beckmann of the Friedrich-Alexander University Nürnberg, Liam Pomfret of the University of Queensland, and renowned sci-fi author and futurist David Brin.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in Computer Science at the Australian National University.


