
Risks Inherent in the Digital Surveillance Economy:
A Research Agenda

Review Version of 8 September 2017

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2016-17

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.


The digitisation of data about the world relevant to business has given rise to a new phase of digitalisation of business itself. The digitisation of data about people has linked with the notions of information society, surveillance society, surveillance state and surveillance capitalism, and given rise to what is referred to in this paper as the digital surveillance economy. At the heart of this is a new form of business model that is predicated on the acquisition and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. In the words of the model's architects, users are 'bribed' and 'induced' to make their data available at minimal cost to marketers.

The digital surveillance economy harbours serious threats to the interests of individuals, societies and polities. That in turn creates risks for corporations. The new economic wave may prove to be a tsunami that swamps the social dimension and washes away the last five centuries' individualism and humanism. Alternatively, institutional adaptation might occur, overcoming the worst of the negative impacts; or a breaking-point could be reached and consumers might rebel against corporate domination. A research framework is proposed, within which the alternative scenarios can be investigated.


1. Introduction

During the second half of the 20th century, a digitisation revolution occurred, such that a large proportion of data is now 'born digital', and analogue data can be inexpensively converted into digital form (Negroponte 1995, Yoo et al. 2010). Digitisation laid the platform for developments in the early 21st century that are currently being referred to as 'digitalisation'. A shift has occurred from the interpretation and management of the world through human perception and cognition, to processes that are almost entirely dependent on digital data (Brennen & Kreiss 2016). What began as an opportunity for strategic differentiation has morphed into a stimulus for organisational and sectoral transformation and disruption.

Digitalisation can be applied to any kind of entity. Its application to people is currently attracting a great deal of attention, giving rise to literatures on 'datafication' (Lycett 2014), or less commonly 'datification' (Newell & Marabelli 2015). The discussion is mostly upbeat, with a focus on the opportunities for organisations that are perceived to arise from streams of digital data about individuals. This paper, on the other hand, primarily reflects the perspectives of the individuals themselves, with implications for corporate interests as the second-order consideration.

Contemporary business models are predicated on the acquisition and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. Zuboff (2015) perceives the negative implications to be of enormous scale, and depicts the new context as 'surveillance capitalism'. The research reported in this paper adopts a narrower focus, seeking firm ground in understanding of the relevant technologies and of the activities that apply them.

In the digital surveillance economy, genuine relationships between organisations and people are replaced by decision-making based on data that has been consolidated into digital personae. This paper conceptualises the institutions and processes that serve the new business model using the notion of the 'digital surveillance economy'. The consumer is converted from customer to product; consumers' interests have almost no impact on the process, and they can be largely ignored. In the words of the model's architects, consumers are "bribed" and "induced" to make their data available at minimal cost to marketers (Shapiro & Varian 1999, pp. 35-36).

The aims of the research reported in this paper are to provide sufficient context that the digital surveillance economy's nature and origins can be appreciated, to delineate its key features, to identify its implications for the interests of individuals, societies and polities, and to structure a research agenda whereby deeper understanding can be achieved.

The paper commences with a brief description of the challenges facing research of this nature, and an outline of the approach adopted. The precursors to digitalisation are outlined. The nature of the digital surveillance economy is then discussed, and a model proposed that identifies the activities that give rise to it. The threats are then examined, firstly those to individuals, followed by the more abstract implications in the societal, economic and political domains. A range of possible futures exists. Because the phenomena remain fluid at this early stage in the new era, the research reported here is necessarily preliminary. Broad research questions are articulated, structured within three suggested scenarios.

2. The Research Approach Adopted

The purpose of the research is to provide insight, and thereby guide decisions and actions by business enterprises, by policy-makers, by consumers and consumer advocacy organisations, and by technologists seeking to support consumers' interests. Research of this kind is confronted by a number of challenges, which need consideration prior to a research method being proposed.

The conduct of empirical research is predicated on the assumptions that the relevant phenomena already exist, and that they exhibit a sufficient degree of stability. The rigour demanded of empirical research is undermined to the extent that results cannot be reliably reproduced because the population from which samples are extracted exhibits continual changes of behaviour. This is inevitably the case with emergent phenomena, such as those that arise from the application of multiple new digital technologies. Remarkably, instability of the object of study appears to be little-discussed in the IS literature. However, for a discussion of the difficulties that the dynamic nature of the eCommerce and eBusiness eras presented to researchers, see Clarke (2001). Guidance in relation to research techniques in such contexts is provided by Gray & Hovav (2008) and Majchrzak & Markus (2014). See also Niederman et al. (2016) and works on futures research, e.g. Martino (1993), Dreborg (1996) and Bell (2004). In addition to existing techniques such as environmental scanning, technology assessment, Delphi rounds, scenario analysis, cost/benefit/risk analysis and impact assessment, some innovation may be needed.

A further challenge arising with future-oriented, instrumentalist research is that it must deal with contexts in which stakeholders will be differentially impacted by change, and in which it is commonly the case that Pareto optimality cannot be achieved and hence the interests of at least some stakeholders suffer harm. This necessarily draws the researcher beyond the merely descriptive, explanatory and predictive, and into realms characterised by value-laden contexts, value-conflicts, political negotiation processes, normative modes and prescriptive outcomes. If research is to deliver outputs relevant to business strategy, government policy and consumer behaviour, research techniques must be applied that can cope with both unstable phenomena and contested values, and that seek such levels of rigour as are feasible in the circumstances.

The results of this research are being addressed to diverse audiences. The outcomes are intended to influence decision-makers in corporations, by highlighting the broader implications of current changes in business models. The work's more direct contributions, however, are to the development of public policy. Beyond mere policy, guidance is needed for the articulation of programs intended to achieve normative objectives expressed in social, economic and environmental terms alike. These commonly depend on effective cooperation among disparate organisations, across the public and private sectors and often also the voluntary sector. The intended readership also includes a proportion of consumers, and particularly of consumer advocacy organisations and technology developers. Also within the target-zone are academics, most of whose work is currently conducted from the perspective of corporations, but who may consider broadening their scope to address the challenges identified in this paper. This diversity of audience adds to the challenges of research conception, design, conduct and reporting.

With these challenges in mind, an approach was adopted utilising what MacInnis (2011) refers to as 'a propositional inventory', to "lay out areas in which empirical research is needed" (pp. 2, 10). First, a contextual review was conducted of relevant threads of development. A conceptual framework was then outlined whereby the emergent phenomena could be researched, understood and integrated. By 'conceptual framework' I mean "the researcher's map of the territory being studied, which consists of the main concepts, constructs and variables and their related propositions" (Hassan 2015 p.12, citing Miles & Huberman 1994). The analysis drew on a wide range of sources. This included relevant prior research conducted by the author.

The broad conceptual framework was articulated by means of a model of the key processes within the digital surveillance economy. This was developed from observations of consumer marketing activities, primarily in contemporary Web-based eCommerce and mobile commerce. The term 'model' is used here in the sense of a "pre-theoretical imperfect copy of the phenomenon of interest" (Hassan 2015, p. 11, citing Hesse 1966).

The process model provided a basis for identifying the impacts of the new ways of doing business. It is also valuable as a means for testing the conceptual framework against continually changing real-world instances. In developing the framework, attention was paid to the guidelines for evaluating conceptual articles proposed in Yadav (2010, pp. 15-16). The research questions, rather than being declared at the outset, are an outcome of the analysis, and are presented at the end of the paper. They were developed by performing a first iteration of a scenario analysis approach.

The structure of the paper reflects the research process outlined above. The following section presents a brief review of several previous phases of technological change, the uses made of them by business enterprises, and the interpretations made of them by observers. In section 4, the concept of the digital surveillance economy is discussed, and the process model is presented. Threats to the individual that arise from this approach are then identified, and broader impacts are discussed. Finally, in section 6, research questions are framed within a set of scenarios.

3. Precursors

This section briefly summarises earlier conceptions that provide context and that condition observers' perceptions of contemporary consumer marketing. The digitisation phase was largely complete by the end of the twentieth century. A first, critical aspect of digitisation is that selection of attributes whose states or behaviour are to be represented in data is not a value-neutral event but rather a purposive activity, with choices made that reflect the intentions of the collector. Contrary to the implications of the term 'data capture', data is not already in existence and merely waiting to be gathered, but is actively created. The collector's perception of purpose results in some attributes of some entities and their actions being represented in the data collection, and the complete omission from the collection of representations of all other attributes of those entities and actions, and of all attributes of all other entities and actions. Further key issues are that data quality is a multi-dimensional notion, and that data-items, records and record-collections evidence very high variability in quality.

Digitisation of individuals can be traced at least to 1890, when the Hollerith card was applied to the US Census and the era of machine-readable data began (Kistermann 1991). The application of computing to administrative data commenced in 1951 at the Lyons Tea Company in the U.K. (Land 2000). Within a decade, the opportunity was perceived for 'national data systems' to consolidate data about people, variously as a basis for sociological research (e.g. the Metropolit projects in Sweden and Denmark - Osler et al. 2006) and for social control (e.g. the National Data Center proposals in the USA - Dunn 1967). The second half of the century saw an explosion in the creation of data about people. During that time, the basis for decisions about such matters as lending, insurance and social welfare changed from the evaluation of applicants face-to-face, complemented by consideration of supporting documents, to a process almost entirely based on the assembly of machine-readable data that is thought to relate to the applicant. The author coined the term 'digital persona' to refer to that assembly (Clarke 1994, 2014a).

The Internet brought with it scope for pseudonymity and even anonymity (Froomkin 1995). Moreover, as the dot-com investment era boomed and then, c. 2000, imploded, 'new rules' were emerging (Kelly 1998). Until c. 2010, there was some optimism that consumer marketing would become less intrusive and manipulative, and more collaborative (Levine et al. 2000, Clarke 2008a). In the meantime, however, Web 2.0 was conceived as the means whereby the Web's architecture could be inverted from its original, consumer-driven request-response pair into a marketer's data collection and consumer manipulation framework (O'Reilly 2005, Clarke 2008b). Web 2.0 features laid the foundations for the digital surveillance economy. Within less than a decade from 2005 onwards, consumer marketing became anything but the two-sided conversation that Levine et al. had envisaged.

Meanwhile, information technologies were having impacts more broadly than just in the commercial sphere. The notions of the 'knowledge economy' (Machlup 1962, Drucker 1968), and 'post-industrial society' (Touraine 1971, Bell 1976) were absorbed into that of 'the information society' (Masuda 1981). The tone of the large majority of the relevant literature in the 1980s was very upbeat, strongly focussed on the opportunities, and with very little consideration of what threats the new socio-economic arrangements might embody, how impactful they might be, and whether they might fall unevenly.

In parallel, the concept of 'surveillance society' developed (Marx 1985, Flaherty 1988, Weingarten 1988, Gandy 1989, Lyon 1994, Lyon 2001). Surveillance is usefully defined as the systematic investigation or monitoring of the actions or communications of one or more persons. The sentiments are well summed-up by this early description: "Technology could be offering the tools for building a surveillance society in which everything people do and are as human beings is an open book to those who assert some 'right' to monitor them" (Weingarten 1988, p.747).

Beyond the longstanding forms of visual, aural and communications surveillance, dataveillance rapidly emerged as a more cost-effective alternative (Clarke 1988). Roszak (1986) observed that the purpose of 'computerised surveillance' was "to reduce people to statistical skeletons for rapid assessment" (p.186). He provided a caricature of the quality problems of what was then referred to as 'data mining' and is now re-badged as 'big data analytics' and 'data science': "garbage in - gospel out" (p.120).

The early forms have since been joined by electronic surveillance (of space, traffic, movement, behaviour and experience) and auto-surveillance (by devices carried with, on or in the person). More detailed analysis of surveillance categories, techniques and technologies is in Clarke (2009b, 2010). A strong expression of nervousness about the future of free societies has more recently been provided by Zuboff (2015, 2016), who has coined the term 'surveillance capitalism' to refer to the aim of corporations "to predict and modify human behavior as a means to produce revenue and market control" (2015, p.75).

The political notion of the 'surveillance state' involves more or less ubiquitous monitoring of individuals by government agencies (Balkin & Levinson 2006, Harris 2010). The notion may seem to be not directly relevant to the present analysis. On the other hand, it is no longer limited to the USSR, China, East Germany (Funder 2003), and developing and under-developed countries ruled by despotic regimes. The Snowden revelations confirmed that government agencies in 'the free world' also abuse their powers and circumvent controls that they are nominally subject to (Greenwald 2014). Computational politics has recently added new layers of threat (Tufekci 2014). Meanwhile, the 'public-private partnership' notion may be morphing into a vision of a high-tech, corporatised-government State (Schmidt & Cohen 2014), and hence becoming a major threat to democracy. It is reasonable to expect that developments towards a surveillance state would contribute to rapidly declining trust by individuals not only in governments but also in corporations.

These threads provide the intellectual context within which the current approach to consumer marketing has developed and from which vantage-point it can be observed and interpreted.

4. The Digital Surveillance Economy

This section builds on the precursor notions discussed in the previous section, identifies the key features of the digital surveillance economy, and lays a foundation for analysis of its implications and of the reactions against it that may arise. The focus is primarily on economic and social rather than political matters, and the analysis is intended to be relevant to business executives, policy-makers and consumers.

4.1 The Concept

The 'direct marketing' mechanisms of two to five decades ago have developed into heavy reliance by corporations on the acquisition and exploitation of detailed machine-readable dossiers of information on large numbers of individuals (Davies 1997, Hirst 2013). The term 'exploitation' is subject to interpretations both positive - as in the extraction of advantage from one's own assets - and negative - as in unfair utilisation of something belonging to someone else. Both senses of the word are applicable to the use of data in the digital surveillance economy.

Surveillance that produces machine-readable data, in quantity, about more-or-less everything and everybody, has changed the game. The working definition adopted for the core concept in this paper is as follows:

The digital surveillance economy is a combination of institutions, institutional relationships and processes that enables corporations to exploit data arising from the monitoring of people's electronic behaviour, and on which consumer marketing corporations are rapidly becoming dependent.

There was no hint of a surveillance element in mid-1990s depictions of the 'digital economy', such as the highly-cited Tapscott (1996). By the beginning of the new century, however, the possibility of "a universal surveillance economy" had been raised (Rheingold 2002, p. xviii, quoted in Fuchs et al. 2012). The term 'digital surveillance economy', which this author adopts as a suitably descriptive label for the phenomenon, appears to have been first used in Andrejevic (2014, p.1678).

A notion that stands at the heart of the new approach is represented by a longstanding aphorism in media theory: 'If you're not paying, you're the product'. For an early usage in the Web context, see Lewis (2010). The foundations lie neither in classical nor neo-classical economics (which are relevant to scarce resources), but rather in information economics (which applies to abundant resources). The business school text-book that heralded the revolution was Shapiro & Varian (1999). Its significance is underlined by the fact that, since 2002, Hal Varian has had the title of Chief Economist at Google.

The Shapiro & Varian thesis is that customers' attention is the marketer's stock-in-trade. It is to be traded for revenue, market-share and profit, and cross-leveraged with business partners. Shapiro & Varian's argument is driven entirely by supply-side thinking, and assumes not only that demand is disorganised and powerless, but also that it will remain so. The book continues to use the term 'customer', but the implication of a relationship has been jettisoned, and the term becomes almost synonymous with the impersonal 'consumer'. In their theory - and right now in the digital surveillance economy - the interests of consumers have almost no impact on the process, so they can be safely ignored. Relationship marketing, permission-based marketing, interactivity with and participation of customers, the prosumer concept, customer needs, customer needs satisfaction and customer alliances are all overlooked.

A further significant aspect of the Shapiro & Varian approach is that consumer rights, privacy rights, and laws to protect them might as well not exist. Regulation is merely a barrier to be overcome and to be replaced by the chimera of 'self-regulation'. The book lacks any vestige of the notions of community, society, equity, discrimination or individual wellbeing. Indeed, the key terms used in this paragraph do not even appear in Shapiro & Varian's index.

The techniques for acquiring personal data have become much more sophisticated during the nearly two decades since Shapiro & Varian wrote; but the principle was laid out very clearly: "[marketers'] strategy should be to bribe users to give them the appropriate demographics, which in turn can be passed onto advertisers ... [The game is about] inducing consumers to give [marketers] the information they want. ... we expect that many consumers will be happy to sell information about themselves for a nominal amount ..." (pp. 35-36). On the surface, the authors appear to have been oblivious to the fact that, in most civilised countries, both 'bribing' and 'inducing' are criminal acts. Alternatively they regarded the criminal law as being merely an inconvenient regulatory hurdle that business enterprises needed to overcome.

Some social scientists drew attention to the implications at an early stage: "Digitization facilitates a step change in the power, intensity and scope of surveillance" (Graham & Wood 2003). In mid-2017, that article had just over 250 Google citations, whereas Shapiro & Varian had accelerated beyond 10,000. Much more significant than the book's citation-count, however, has been the application of its propositions in business processes and in features embedded in the information infrastructure that supports consumer eCommerce and mobile commerce.

4.2 A Process Model

Shapiro & Varian's proposals are evident in the key features of the new forms of consumer marketing. Few explicit and adequately articulated models of these processes have been located in the literature, but see Ur et al. (2012). A partial model was put forward in Degli Esposti (2014), which considered: "the ability of reorienting, or nudging, individuals' future behavior by means of four classes of actions: 'recorded observation'; 'identification and tracking'; 'analytical intervention'; and 'behavioral manipulation'" (p.210). That list of activities is further articulated below, and a model provided in Figure 1.

Figure 1: Key Processes of the Digital Surveillance Economy

(1) Data Acquisition

Some data is acquired by corporations through interactions with consumers in which the consumers are conscious of the transaction taking place, and aware, to some extent at least, that data about them has been acquired as part of that transaction (Activity 1A - Overt Direct Collection in Figure 1). The consumer may be conscious of which data-items have been acquired, and for what purposes, and may have given express or implied consent. More commonly, consumers are at best very hazy about the terms of the deal, and consent is merely inferred by the collector in ways that might or might not be judged by a court to satisfy the requirements of contract and consumer law.

In addition, a considerable amount of data is acquired from consumers through various activities of which the consumer is unaware (1B - Covert Direct Collection). Some such data-items are arguably a byproduct of the protocols used in conducting the transaction, such as the consumer's net-location, the software that they use, and in some circumstances even their physical location. On the other hand, a great deal of this data-collection is a result of active endeavours by marketers and their partners and service-providers, utilising surreptitious techniques such as cookies, web-bugs, web-beacons and browser fingerprints, and through adware/spyware. Consumer tools such as Ghostery and AdBlock Plus, and their associated lists, identify many hundreds of organisations that are privy to individuals' web-traffic through such means.
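The browser-fingerprinting technique mentioned above can be illustrated with a minimal, hypothetical sketch: a tracking script reads attributes that a browser routinely reports (the attribute names and values below are illustrative assumptions, not any real vendor's schema) and hashes them into a stable identifier, so that the same browser can be re-recognised on later visits without any cookie being stored.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine reported browser attributes into a single stable identifier.

    No cookie is needed: the same browser tends to report the same
    combination of values on every visit, so the hash re-identifies it.
    """
    # Canonicalise the attributes so ordering cannot change the hash.
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical attribute values of the kind a tracking script can read.
visit_1 = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/57.0",
    "screen": "1920x1080x24",
    "timezone": "UTC+10",
    "fonts": "Arial,Helvetica,Times",
}
visit_2 = dict(visit_1)  # the same browser, on a later visit

assert fingerprint(visit_1) == fingerprint(visit_2)
```

The point of the sketch is that even a handful of individually innocuous attributes can, in combination, be distinctive enough to track an individual browser across sites.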

Data about consumers' marketspace behaviour may be gained by marketers not only when the consumer is in contact with the organisation itself but also when they are interacting with other sites (1C - Covert Indirect Collection). This may be achieved using various forms of arrangements among consumer marketing corporations, or through spyware (Palmas 2011, Degli Esposti 2014, p.214, Christl & Spiekermann 2016, pp.45-72).

Other forms of data gathering exist which may not be associated with browser-usage (1D - Viewer Monitoring). The data-formats and access-tools whereby consumers experience text, images, sound and video (such as email, e-readers, streaming services and the viewing of previously-downloaded files) are increasingly designed to generate data about consumers' activities and pass it around the often very large network of corporations that are involved in the process, or merely have an interest in acquiring the data.

A great deal of data is also acquired from sources other than the individual. Some is from sources asserted to be 'public domain' and free for use for any purpose (2A - 'Public Domain' Collection). One important example is social media content. This includes a great deal of data about the individual persona (not only postings, but also personal profile-data and social networks), and about other personae associated with it.

There are, however, many collections of personal data whose accessibility and use individuals reasonably consider to be limited, but which corporations expropriate. For example, when consumers provide data to a telco, their reasonable expectation is that their consent is needed for it to be published in telephone subscriber listings, and that the purposes are constrained rather than being open to abuse for extraneous purposes. Even when they make mandated disclosures to, for example, an electoral commission, they expect access to and use of the resulting electoral roll to be constrained to the purpose for which the data was collected. Consumers lack the institutional power to prevent the massive abuses that electoral roll data is subject to. On the other hand, for a fee, they can opt out of telephone directories. Large proportions of the populations of many countries pay to keep their numbers unlisted (as it is called in the United States), ex-directory (in the United Kingdom), silent (in Australia), or private (in New Zealand and Canada). Few telcos publish statistics, but in various countries estimates of the proportion of the public that avoids publication range from 25% to >50%. Media reports suggest that two-thirds of UK numbers are ex-directory.

A great deal of personal data is acquired by organisations from other organisations (2B - Covert Secondary Collection). This may be by purchase, barter or other sharing, perhaps camouflaged by a term such as 'strategic partnership'. It is rare for the individual to be aware of such data transfers. Many such exchanges are conducted surreptitiously, variously because of their dubious legality and the risks of media coverage, public outrage, and responses by regulators and parliaments.

The diversity of data-streams that the digital surveillance economy draws on is enormous, and includes eCommerce and advertising transactions, search-terms (Zimmer 2008), media experiences (Cohen 1996, Rosenblatt 2015, Lynch 2017), social media contributions and accesses, and the increasingly 'quantified self' emanating from 'wellness devices' and other forms of 'wearables' (Lupton 2016). The many streams that have emerged since Web commerce began in the mid-1990s are additional to the plethora of data-sets from pre-Internet expropriation rounds, such as the holdings of credit bureaux, mailing-list and database marketing operators, loyalty schemes, phonebook database operators, and electoral rolls (Christl & Spiekermann 2016, pp. 76-117). Many of these hold reasonably reliable socio-demographic data for vast numbers of people.

The technical means that have been developed to support the feeding-frenzy represented by activities (1A-2B) are highly diverse. In some cases, protocols and data-formats are semi-standardised, e.g. in the highly merchant-friendly and consumer-hostile HTML 5.0 specification (Lyne 2011). In other cases, they are proprietary subterfuges and spyware designs that would quite possibly be found to be technically illegal under the cybercrime provisions enacted in many countries following the Convention on Cybercrime (CoE 2001) - if cases could ever be successfully brought before courts. For an overview of data-streams in various contexts, see Christl & Spiekermann (2016, pp. 45-75). For recent reviews of the array of techniques for achieving associations of Web data-streams with individuals, see Mayer & Mitchell (2012) and Bujlow et al. (2015). For data on the extent and effectiveness of tracking using these techniques, see Englehardt & Narayanan (2017). For a recent review of the extent to which Android and iOS apps scatter data to large numbers of organisations, see Zang et al. (2015).

User organisations that pay, in some cases for copies of data, and in other cases for services based on data, are found across the length and breadth of the consumer marketing, consumer credit, insurance and healthcare industries. There are also uses within the employment life-cycle, from talent-spotting, through hiring, to loyalty-monitoring and firing. Government agencies utilise these sources, for example in relation to taxation and welfare payment administration. Political parties and candidates have recently become an additional, and in some countries, highly lucrative user sector, with inferences drawn about electors' preferences in order to target political advertising, customise messages, and manipulate voters' choices.

(2) Data Exploitation

Each organisation that acquires data by means of activities 1A-2B is in a position to combine it into a single record for each individual (3 - Digital Persona Consolidation). Commonly-used terms in the industry include 'a data-point' to refer to the pairing of data-items with an identifier, and 'a consumer profile' to refer to the record as a whole. The term 'profile' has another established usage, which is relevant to activity (4), discussed in the next paragraph. The term 'digital persona' is therefore used in this paper to refer to the consolidated set of data that purports to be about a particular consumer. The consolidation process may be performed on the basis of some reasonably reliable identifier, such as a login-id, a stable IP-address, browser-fingerprints or GPS coordinates. In many cases, however, the data consolidation processes that allocate new data to a record rely on loose inferencing techniques. Where many streams are combined, the result is what Palmas (2011) refers to as 'panspectric surveillance'.
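The consolidation step can be sketched in miniature. In this hypothetical example (the identifiers, data-items and provenance comments are illustrative assumptions), data-points from several of the acquisition activities arrive as (identifier, item, value) triples and are merged into one digital persona per identifier.

```python
from collections import defaultdict

def consolidate(data_points):
    """Merge (identifier, item, value) triples into one record per identifier."""
    personae = defaultdict(dict)
    for identifier, item, value in data_points:
        personae[identifier][item] = value  # later streams overwrite earlier ones
    return dict(personae)

# Hypothetical data-points from activities 1A-2B, keyed on the same identifier.
stream = [
    ("user-193", "postcode", "2600"),     # overt direct collection (1A)
    ("user-193", "device", "Android"),    # covert direct collection (1B)
    ("user-193", "income_band", "high"),  # covert secondary collection (2B)
    ("user-407", "postcode", "3000"),
]
personae = consolidate(stream)
# personae["user-193"] is now a single consolidated digital persona
```

Real consolidation is far messier than this sketch, because, as noted above, the matching of streams to individuals often depends on loose inferencing rather than a shared, reliable identifier, and mis-allocation of data-points to the wrong persona is an ever-present risk.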

The consolidated persona can be subjected to many forms of processing in order to draw inferences from it (4 - Analysis). Inferences may relate to the population, a sub-population, or an individual. One important form of analysis compares each digital persona with one or more categories that are usefully referred to as 'abstract consumer profiles' (Clarke 1993, EPIC 2003, Degli Esposti 2014, pp.215-219, Christl & Spiekermann 2016, pp.24-38, 84-86). Abstract profiles may be defined on a more or less ad hoc basis, or may apply insights from studies and experiments in relation to personality-types, attitudes, and long-term and short-term interests. In relation to personality, a range of psychographic classifications exists, including the 'Big Five' phenotypes of extraversion, neuroticism, agreeableness, conscientiousness, and openness (Goldberg 1990, Goldberg 1999, Gunter & Furnham 1992; see also Christl & Spiekermann 2016, pp. 11-41).
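The comparison of a digital persona against abstract consumer profiles can be sketched as a similarity computation over trait vectors. The profile names and scores below are invented for illustration; real profiling firms' categories and scoring models are proprietary:

```python
import math

# Hypothetical abstract consumer profiles, expressed as 'Big Five' trait
# vectors (extraversion, neuroticism, agreeableness, conscientiousness,
# openness), each trait scored 0..1.
PROFILES = {
    "impulsive_spender": [0.8, 0.7, 0.5, 0.2, 0.6],
    "cautious_saver":    [0.3, 0.2, 0.6, 0.9, 0.4],
}

def cosine(a, b):
    """Cosine similarity between two trait vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(persona_traits):
    """Return the abstract profile most similar to the persona's traits."""
    return max(PROFILES, key=lambda p: cosine(PROFILES[p], persona_traits))

# A persona scoring high on extraversion and neuroticism, and low on
# conscientiousness, matches the 'impulsive_spender' profile:
# best_match([0.9, 0.8, 0.4, 0.1, 0.5]) -> 'impulsive_spender'
```

Activity (6), advert targeting, then reduces to selecting the creative pre-tested against whichever abstract profile the persona most closely matches.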

Corporations make decisions that affect individuals (5 - Individual Decision-Making). They do so on the basis of the digital personae that they have available to them, and what the corporation infers from them, including how those personae compare with abstract profiles. Many of the contexts in which corporations make decisions about people are very important to those people's interests and wellbeing, including applications for and administration of loans and insurance, in health care, and in employment.

One particular category of decision-making is critical to the digital surveillance economy. The data held about each consumer is used to select and/or customise advertisements so that they are targeted at a very precise category, possibly as small as a single individual (6 - Advert Targeting). The computational process reflects the individual's demographics, stated and demonstrated preferences, attitudes and interests at that particular time (Dwyer 2009, Davis 2014). The effectiveness of the 'target demographic' notions applied in print, radio and television is argued to pale into insignificance in comparison with narrowcasting to individual consumers whose digital personae match closely to carefully pre-tested abstract consumer profiles. In addition to the assertions of interested parties such as service-providers (e.g. NAI 2010), some empirical studies support the claims that behavioural ad targeting is highly effective (e.g. Yan et al. 2009), although some contrary evidence also exists (Lambrecht & Tucker 2013).

Where the marketer is sufficiently aware of the individual consumer's socio-demographic attributes, prior behaviour and current interests, the message in the advertisement can be designed to be highly persuasive, and to have a decisive and predictable effect on the consumer's decisions and actions (7 - Behaviour Manipulation) (Packard 1957, 1964, Wells 1975, Adomavicius & Tuzhilin 2005, Degli Esposti 2014, pp.220-221).

Moreover, the data held about consumers enables marketers to gauge the point at which buyer-resistance is likely to arise, so that they can pitch the offer just below it, in order to extract the maximum revenue from each customer (8 - Micro-Pricing) (Shapiro & Varian 1999 pp. 7-50, Acquisti 2008, Christl & Spiekermann 2016, pp. 41-44). As a result, marketers have moved beyond pre-set pricing of their offerings - typically at a price the market as a whole will bear - and instead set a price that each individual will bear. Some consumers may get a lower price, but most will get a higher one. So the technique allows marketers to extract far greater revenue from the market as a whole than would otherwise be the case (Shapiro & Varian 1999, p.32).
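The micro-pricing logic can be reduced to a simple sketch. The `wtp_multiplier` field stands in for whatever willingness-to-pay estimate the marketer's model derives from the persona; it is a hypothetical attribute invented here for illustration:

```python
def personalised_price(base_price, persona, margin=0.02):
    """Pitch the offer just below the estimated resistance point.

    The resistance point is the estimated maximum this individual will
    bear; the offer is set fractionally below it to avoid triggering
    buyer-resistance while extracting near-maximal revenue.
    """
    resistance_point = base_price * persona.get("wtp_multiplier", 1.0)
    return round(resistance_point * (1 - margin), 2)

# Against an old set price of 100.00: a few consumers pay less,
# but most pay more, and aggregate revenue rises.
personalised_price(100.0, {"wtp_multiplier": 1.3})   # 127.40
personalised_price(100.0, {"wtp_multiplier": 0.9})   #  88.20
```

The asymmetry noted by Shapiro & Varian is visible in the example: the discount offered to the price-sensitive buyer is more than recouped from the price-insensitive one.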

The most successful exponent, and the primary driver of the emergence of the digital surveillance economy, has been Google (Clarke 2006, Vaidhyanathan 2011, Newman 2011, Angwin 2016). However, multiple other large corporations have followed their own paths, which have begun in areas different from Google's origins in search-terms - Facebook in social media, Apple in music consumption, and Amazon in eCommerce transactions. Of the various acronyms in use, the most apposite is FANG (Facebook, Apple, amazoN, Google). Microsoft may eventually make up for missed opportunities and lost ground, perhaps through its cloud-based office suite.

The industry structure through which personal data expropriation, exploitation and cross-leveraging is conducted is in continual flux, and its processes are largely obscured from public view. At a simplistic level, it comprises sources, consolidators, analysts, service-providers and user organisations. But many sub-classifications exist, many overlaps occur, and organisations vie to take others over, and to be taken over. Profiling agencies such as VisualDNA, EFL and Cambridge Analytica specialise in processing data-hoards to address specific interests. See Mayer & Mitchell (2012) and the succession of industry models offered by Brinker (2016).

These applications of digitalisation are being imposed on society, not offered as a choice. The activities that make up the digital surveillance economy are justified by business enterprises on the basis of 'marketing efficiency', the convenience that their products and services offer to people, and the empirical evidence of people being very easily attracted to adopt and use them. Adoption is achieved through bandwagon effects ('viral marketing'), network effects, and features that appeal to individuals' hedonism and that encourage the suspension of disbelief and the suppression of rational decision-making in individuals' own interests. This comes at a cost to individuals, and to societies, economies, and polities.

5. The Digital Surveillance Economy as Threat

Individuals gain from the digital surveillance economy in a variety of ways. One form of benefit is the convenience and time-savings arising from not needing to think about purchases, because attractive opportunities continually present themselves. There is also entertainment value built into the 'customer experience', and perhaps a warm glow arising from the feeling that profit-making corporations understand them. A more formalised classification scheme was suggested by Wang & Fesenmaier (2002), comprising hedonic benefits (entertainment, enjoyment, amusement, fun), functional benefits (information, efficiency, convenience), social benefits (communication, relationship, involvement, trust) and psychological benefits (affiliation, belonging, identification).

The promotional literature for Web 2.0 in its foundation period 2005-2009 waxed lyrical about the benefits to consumers, e.g. "Web 2.0 ... has contributed to an unprecedented customer empowerment" (Constantinides & Fountain 2008, p.231). The analysis conducted in (Clarke 2008b), and extended in this paper, suggests, on the other hand, that the bargain has been a very poor one from the consumer's perspective.

Substantial literatures have developed on consumers' individual cost/benefit privacy calculus, summarised in Smith et al. (2011, pp. 1001-1002), and on the economics of privacy, a field associated with Acquisti, whose most highly cited paper is Acquisti & Grossklags (2005). However, the flavour of that research is driven by the business-facilitative 'fair information practices' notion first enunciated by Westin (1967) and Westin & Baker (1974), and by the consumer-hostile economic philosophy of Posner (1977, 1981, 1984), which sought to reduce privacy from the human right expressed in the International Covenant on Civil and Political Rights (ICCPR) to a merely economic right. See also Laudon (1993). The Westin/Posner/Laudon approach serves the desires of consumer marketing corporations, particularly those used to the freedoms accorded to corporations in the USA. Shifting the perception in that manner benefits business, because individuals can then be justifiably inveigled into trading their privacy away. These research streams have operated to the benefit of marketers, because they accept the very substantial information and power asymmetries, accept the many techniques used by marketers to manipulate consumer perceptions, and support the approach of dividing-and-conquering the consumer interest.

Building on the ideas in the preceding sections, a range of issues can be identified. The first sub-section below focuses on aspects of digitalisation's impacts that are relevant to individuals, and the second on social and economic concerns.

5.1 Threats to the Individual

One negative impact on people is the stimulation of what is variously referred to as 'impulse buying' and 'compulsive purchasing' of items that are unnecessary and/or unduly expensive. This arises in response to targeted ads that succeed in their aim of 'pushing that particular consumer's buttons' (Beatty & Ferrell 1998, Wells et al. 2011). A few individuals may pay lower prices than they would under the old set-price approach, but most pay more, because the attractiveness of factors other than price (in particular the way in which the offer is presented, and the time it appears) enable the seller to extract premiums from many buyers (Newman 2014). Concerns about consumer manipulation have been mainstream since at least the mid-20th century (Packard 1960), but the concerns are greatly intensified by contemporary features including the presentation quality (high pixel-density colour screens), the customised nature of the sales pitch, the immediacy of the experience, and the ease with which the deal is done - 'see ad, press buy-button'.

Further issues arise in relation to the way in which organisations make decisions about individuals, particularly in such contexts as financial, insurance and health care services. Discriminatory techniques are used in ways that harm individuals' access to services. Blacklisting of individuals (such as tenants) was originally based on prior transactions between each individual and one or more organisations. Redlining, and its electronic form, weblining, were originally techniques based on the demographics of a small area such as a suburb (Gandy 1993, Hernandez et al. 2001). Organisations are now able to conduct discriminatory practices on the basis of a far wider range of socio-demographic, behavioural and experiential data. They can therefore avoid conducting transactions with individuals who they deem on the basis of the digital personae available to the organisation to be unattractive customers (e.g. likely to be low-volume, low-margin or high-risk), or to inflate the prices quoted to them (Lyon 2003, Newell & Marabelli 2015, Ebrahimi et al. 2016).

A considerable proportion of the voluminous data that is available is likely to be irrelevant, of low quality and/or of considerable sensitivity, but quality controls are very limited and even non-existent (Horne et al. 2007, Hazen et al. 2014, Clarke 2016b). Another source of errors is the inevitably partial nature of each persona, and the possibility that it may be a composite, e.g. including some data that relates not to the individual but to some other member of the same household. Moreover, some of the data may be intentional misinformation, and some may have been wrongly associated with that identifier.

Further, digitalisation extends beyond digitisation and quantification to the automation of decision-making, unmoderated by people. To a considerable degree, transparency has already been lost, partly as a result of digitalisation, and partly because of the application not of 20th century procedural approaches and algorithmic software development tools, but of later-generation software tools whose 'rationale' is obscure, or to which the notion of rationale is not even applicable (Clarke 1991, 2016b). The absence of transparency means that unreasonable and even simply wrong decisions cannot be detected, and correction, recourse and restitution are all-but impossible.

In a highly competitive market, refusal by one provider to do business with a consumer may not matter all that much. In the real world, however - especially in Internet contexts where network effects make late entry to the market very challenging - monopolies and oligopolies exist, and micro-monopolies abound, and it is in the interests of business enterprises to create and sustain them. Serious harm to consumers arises in such circumstances as where, for example, all lenders apply the same creditworthiness technique to the same data, and where all lenders consequently reject applications based on irrelevant or erroneous data, without transparency, and without any effective opportunity for clarification, challenge or rectification. The harm may be even greater in the contexts of insurance and health care.

All users of electronic tools are subject to intensive surveillance. For city-dwellers, who enjoy superior connectivity, that surveillance is not only ubiquitous within the areas in which they live, work and are entertained, but also continual, and for some even continuous. The term 'chilling effect' appears to have originated in comments by a US judge in a freedom of expression case, Dombrowski v. Pfister, 380 U.S. 479 (1965). The essence of the concept is that intentional acts by some party have a strong deterrent effect on important, positive behaviours of some other party/ies (Schauer 1978). Since Foucault's 'virtual panopticon' notion (1975/77), the term 'chilling effect' has become widely used in the context of surveillance (Gandy 1993, Lyon 1994). Intensive surveillance chills behaviour, undermines personal sovereignty, and greatly reduces the scope for self-determination. The digital surveillance economy accordingly has impacts on the individual, in psychological, social and political terms.

An alternative formulation that has been proposed in the specific context of the digital surveillance economy uses the term 'psychic numbing'. This "inures people to the realities of being tracked, parsed, mined, and modified - or disposes them to rationalize the situation in resigned cynicism" (Hoofnagle et al., 2010). Google is confident that psychic numbing facilitates the new business model: " ... these digital assistants [that acquire data from consumers] will be so useful that everyone will want one, and the statements you read today about them will just seem quaint and old fashioned" (Varian 2014, p. 29, quoted in Zuboff 2015, p.84).

Much of the discussion to date on the risks of chilling and psychic numbing, and the resulting repression of behaviour, has related to actions by government agencies, particularly national security and law enforcement agencies, but also social welfare agencies in respect of their clients, and agencies generally in respect of their employees and contractors. However, the incidence of corporate oppression of employees, public interest advocates and individual shareholders appears to have been rising. It is open to speculation that increased social distance between corporations and the public, combined with the capacity of large corporations to ignore oversight agencies and even civil and criminal laws, may lead to much greater interference by organisations with the behaviour of individuals in the future.

In summary, the digital surveillance economy creates threats to individual consumers in the areas of compulsive buying, net higher prices, discriminatory decision-making by suppliers, decision-making by suppliers based on erroneous data, decisions by suppliers whose rationale is unclear and essentially uncontestable, a phalanx of suppliers that use much the same data and decision-criteria and hence make the same (sometimes discriminatory and erroneous) decisions, reduction in the scope for self-determination, psychic numbing, and ultimately passivity.

The degree of concern that individuals have about these issues varies greatly. Of the 12 activities in the model of the digital surveillance economy in Figure 1, at least 9 are conducted out of consumers' sight. The only three that may be apparent to consumers are overt direct data collection (activity 1A), individual decision-making (5), and targeted advertisements (6). It is very difficult for consumers to resist data collection, because web-forms verge on tyranny: most or even all items must be filled in, most people, at least in western cultures, have a guilty conscience about falsification, and credible lies require considerable effort. Targeted ads, on the other hand, are not only unwanted by a large percentage of people, but are also capable of being avoided or suppressed (Turow et al. 2009).

A significant proportion of people evidence at least some level of concern, by seeking out and using means to obfuscate and falsify data and to mislead, corrupt and subvert corporations' data-holdings (Metzger 2007). However, given the surreptitiousness with which the digital surveillance economy has been constructed and is operated, it is unsurprising that many people have very little understanding of the nature of the digital surveillance economy. Some of those who do have at least a vague sense of the issues profess to have little or no concern about them, or argue that they can do nothing about it and hence there's no point in worrying about it. Some proportion of the uninformed and unconcerned do, however, suffer experiences that change their views. A meme that has gained traction in recent years has been associated with the word 'creepiness', to refer to the feeling that someone is watching you, and knows nearly as much about you as you do yourself (Ur et al. 2012, Shklovski et al. 2014).

5.2 Broader Impacts

In a healthy society, some kinds of behaviour (such as violence, and incitement to violence, against individuals, minorities and the public generally) need to be effectively chilled. But a healthy society also depends on other kinds of behaviours not being chilled. Creativity in scientific, technological and economic contexts is just as vitally dependent on freedom of speech and action as is creativity in the artistic, cultural, social and political realms (Johnson-Laird 1988).

Considerable concerns have been repeatedly expressed, over an extended period, about the capacity of surveillance to stultify economies, societies and polities (SSN 2014). The dramatic increase in surveillance intensity that arrived with the new century gives rise to much higher levels of concern: "the modern age - which began with such an unprecedented and promising outburst of human activity - may end in the deadliest, most sterile passivity history has ever known" (Arendt 1998, p. 322, quoted in Zuboff 2015, p.82).

Art naturally preceded the hard reality. Beginning with Herbert (1965) and Brunner (1975), and in cyberpunk sci-fi from Gibson (1984) to Stephenson (1992), hypercorps (the successors to transnational corporations) dominate organised economic activity, and have associated with them polite society, the majority of the net and a great deal of information. Outside the official levels of society skulk large numbers of people, in communities in which formal law and order have broken down, and tribal patterns have re-emerged. Officialdom has not been able to sustain the myth that it was in control; society has become ungovernable: "[cyberpunk authors] depict Orwellian accumulations of power ..., but nearly always clutched in the secretive hands of a wealthy or corporate elite" (Brin 1998). The term 'corpocracy' has emerged, to refer to an economic or political system controlled by corporations (Monks 2008).

During the current decade, it has become apparent that the threats inherent in the digital surveillance economy extend beyond the social and economic dimensions to the political. Election campaigns are increasingly driven by holdings of personal data on individual voters (Tufekci 2014, Bennett 2016). Candidates in recent US presidential elections have invested heavily in service-providers that specialise in behavioural profiling, with Cambridge Analytica gaining a great deal of media coverage. Although the degree of the technique's effectiveness is contested, there is little dispute that the volume of data available as a result of digital surveillance economy processes is enabling well-resourced candidates to have a material impact on electors' perceptions (Confessore & Hakim 2017). The scope exists for skilful application of online behavioural advertising to warp election results and political processes.

The pattern that this paper has described as the digital surveillance economy undermines a great deal more than just the interests of individual consumers. Many aspects of societies and polities are threatened by the shift of power from the public sector - which is at least somewhat subject to ethical constraints, and to transparency requirements and regulatory processes that have some degree of effectiveness - to a private sector that is largely free of all of them. What can researchers do to enable understanding of these phenomena and their future trajectories?

6. A Research Agenda

The information systems literature primarily concerns itself with applications of IT, and far less attention is paid to IT's impacts and implications. Within that small literature, even fewer papers are instrumentalist in orientation. One of the few that suggests a research agenda in the area addressed in this paper (although the authors use 'datification' as the focal-point) is Newell & Marabelli (2015). However, this suggests the use of an 'ethical dilemma lens', which seems unlikely to gain traction with business enterprises, or even with regulatory agencies.

The purpose of this paper has been to lay the foundations for a research program whereby concerns about the threats inherent in the digital surveillance economy can be investigated, and appropriate responses formulated. This section makes two contributions. It first suggests a set of research questions that reflect alternative future paths that society can trace following the explosive growth of the digital surveillance economy. The second section then suggests the shape that a research program might take.

6.1 Research Questions

Future developments might follow any of a range of alternative paths. An appropriate approach to research into alternative futures is scenario analysis (Wack 1985, Schwartz 1991, van Notten et al. 2003, Postma & Leibl 2005, Clarke 2015). A scenario is "a detailed description of what the future might be like" (Starr 1971, p. 489), or "a trajectory of possible, logically connected future events" (p. 548). The description is expressed as a story-line that represents a composite or 'imagined but realistic world'. The creation process for a scenario involves the identification of existing contextual factors, including those that have the capability to act as driving forces, postulation of how those driving forces might plausibly behave, and the projection of paths that might be followed depending on the responses of influential participants (Schwartz 1991, pp. 141-151, Kosow & Gassner 2008).

Scenario analysis involves the preparation of a small set of scenarios that have the potential to provide insights into an emergent new context. The value arising from the technique derives from a combination of the depth of insights achieved within each scenario, together with the diversity among the different scenarios' paths of development. Suggestions are made below for three potential scenarios that could be used as an organising framework for research questions that arise from the preceding analysis. A range of other alternatives could be usefully considered.

Scenario (1) 'The Trajectory Is Set'

In this scenario, the contextual setting would be dominated by net technologies, the weakening of the nation-state, and the self-interest, power and momentum of powerful corporations at the centre of the digital surveillance economy. Other forces would prove ineffectual, and the plausible projection is that the new business model establishes itself as the primary determinant not only of economies but also of societies and polities.

Themes emerge from a well-developed scenario. The following are speculative questions that appear likely to require investigation:

This paper has shown that much of what is necessary for Scenario (1) to come to fruition is already in place, and hence this might be regarded as the default Scenario.

Scenario (2) 'Socio-Economic Systems are Adaptive'

In an alternative contextual setting, the activities of the participants in the digital surveillance economy would beget an effective reaction from other social and economic institutions. Possible themes that might arise from this scenario include:

Ceglowski (2015) suggests that the foundations of the current digital surveillance economy are rotten, and that the current, bloated Internet advertising industry structure will implode. This would of course influence processes and industry structure. It seems unlikely, however, that such changes would resolve the issues that this paper identifies. The rewards arising from targeted advertising based on intensive data about consumers appear to be so large that industry re-structuring is more likely to alter the allocation of benefits to corporations than to cause fundamental change in the practices.

Another way in which adaptation may occur is through intervention by parliaments or government agencies. Some possible regulatory measures are identified in Christl & Spiekermann (2016, pp. 142-143). To date, however, parliaments and regulatory agencies have been so meek that the activities of consumer marketers have been subject to very little constraint. The US Federal Trade Commission is the most important regulatory agency in the world in this arena, because all of the leading exponents of the digital surveillance economy are domiciled within its jurisdiction. A 'Do Not Track' initiative has been skilfully used by the industry for a decade as an excuse for legislative inaction (EPIC 2010, 2015). Yet, even now, the FTC's policy is quite candidly stimulatory of business, and not at all protective of consumer interests (FTC 2009, 2016).

A further possible driving force in this scenario is new forms of personal data markets, which support effective denial of consent, but encourage disclosures by consumers that are measured and compensated. A great many potential elements of consumer-friendly infrastructure have been piloted by the Privacy Enhancing Technologies (PETs) community, and a few have achieved some level of adoption. A concept that goes beyond the failed Platform for Privacy Preferences (P3P) scheme is outlined in Maguire et al. (2015), but it too fails to go beyond the technical aspects. A review of consumer-oriented social media is in Clarke (2014b). A more substantial proposal is in Spiekermann & Novotny (2015).

Scenario (3) 'The Revolution is Nigh'

Zuboff extracted the following implication from her 'surveillance capitalism' analysis: "Nothing short of a social revolt that revokes collective agreement to the practices associated with the dispossession of behavior will alter surveillance capitalism's claim to manifest data destiny ... New interventions [are] necessary ... The future of this narrative will depend upon the indignant scholars and journalists drawn to this frontier project, indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities, and indignant citizens who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, and freedom from uncertainty is no freedom" (Zuboff 2016).

The third suggested scenario would accordingly consider a contextual setting in which the public takes the matter into its own hands. It would appear likely to raise questions such as the following:

The scope for societal resistance to digital surveillance has been studied in the abstract (e.g. Martin et al. 2009). There has been discussion of the scope for, and merits of, transparent society (Brin 1998) and sousveillance (Mann et al. 2003). Meanwhile, a wide variety of technological initiatives have been launched in support of resistance against corporate excess (Goldberg 2007, Shen & Pearson 2011, and the proceedings of the PETS and SOUPS communities). Clarke (2016a) identifies the principles of obfuscation and falsification of data, messages, identities, locations and social networks. Schneier examines the battleground in greater detail, and offers a related set of principles: avoid, distort, block and break surveillance (Schneier 2015a, 2015b). Bösch et al. (2016) recommends instead: maximise, publish, centralise, preserve, obscure, deny, violate, fake.
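The obfuscation principle can be illustrated with a minimal sketch (function name, parameters and interest categories are all hypothetical): for every genuine data-item a consumer emits, several plausible decoys are emitted alongside it, so that an observer cannot reliably separate signal from noise:

```python
import random

def obfuscate_interests(true_interests, decoy_pool, ratio=3, rng=None):
    """Dilute a genuine interest-set with plausible decoys.

    For each real interest, 'ratio' decoys are drawn from a pool of
    plausible alternatives and shuffled in with the real items, so the
    resulting stream is ambiguous to a profiler.
    """
    rng = rng or random.Random()
    n_decoys = min(len(decoy_pool), ratio * len(true_interests))
    decoys = rng.sample(decoy_pool, n_decoys)
    mixed = list(true_interests) + decoys
    rng.shuffle(mixed)
    return mixed

# one real interest hidden among three decoys
mixed = obfuscate_interests(
    ["hiking"], ["opera", "golf", "chess", "knitting"], ratio=3)
```

Falsification goes a step further than this sketch, replacing real items with false ones rather than merely diluting them.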

Even prior to the emergence of the digital surveillance economy c.2005, surveys detected that "24% of Internet users have provided a fake name or personal information" (Pew 2000). That Report's words to describe the consumers' behaviour were all pejorative: "fake", "lying", and "deception". This was consistent with the perception of consumer marketers that such behaviour was a threat to their revenue. A limited amount of research has been undertaken into the phenomenon that authors variously refer to as 'consumer fabrication of information' and 'consumer lying' (e.g. Horne et al. 2007). Those studies generally adopt the assumption that such behaviour is unethical, without consideration of the implications of it being a response to unethical behaviour by marketers. The fact that a large and growing proportion of the world's user devices blocks ads (PageFair 2017) suggests that awareness and activism among consumers may now be considerably higher than it was before Web 2.0 enabled the emergence of the digital surveillance economy.

6.2 A Research Program

The analysis in the preceding sections may appear somewhat bold, futuristic and to a degree apocalyptic. There is, however, some recognition within at least the German Information Systems community of the need for a broadly-based research program - although the authors' proposals are primarily for the benefit of business and only secondarily for individuals, and they seem to have given little consideration to society and polity: "research on Digital Life requires extensive multidisciplinary activities in mixed research teams with experts from, for instance, psychology, cognitive sciences, or engineering, and co-innovation in cooperation with start-ups, spin-offs, and SMEs" (Hess et al. 2014, p.248).

This paper has proposed scenario analysis as an appropriate means of integrating the findings from multiple disciplines and professions. However, scenario analysis depends on a clear description of the current state, an appreciation of environmental and strategic factors that may play a role in future developments, and bodies of theory that suggest particular participant actions and reactions that may occur under various contingencies. In order to provide the necessary springboard, existing expertise and theories need to be identified and applied.

A first step towards addressing research questions such as those identified in the previous section is to establish what existing research already tells us about the behaviours of individuals and societies under stress, and particularly under surveillance stress. That will enable the identification of gaps in knowledge, the formulation of more specific research questions to fill those gaps, and the conception and conduct of research to provide answers.

The program can utilise the theories, insights and research techniques of many different disciplines. Some examples of relevant research, which would feed into each of the scenarios outlined above, are as follows:

7. Conclusions

This paper has presented a model of the activities within the digital surveillance economy. It has shown how consumer marketing corporations no longer have relationships with their customers as the core of their operations, and have instead become dependent on data-intensive digital personae to represent consumers.

By considering the phenomenon from the perspective of the individuals whose data is exploited, and taking into account the insights gained through a succession of conceptualisations of the changing world during the last half-century, the threats inherent in this new order have been identified. The high degree of optimism of the early post-industrial society and information society phases has been progressively tempered, turning to pessimism as surveillance society and the surveillance state have developed, the digital surveillance economy has emerged, and surveillance capitalism has been postulated.

During the last 15 years, the digital surveillance economy has come into existence, its impacts are measurable, and its implications can be at least speculated upon, while some aspects may now be sufficiently stable that preliminary empirical studies can be undertaken. Digitalisation of people harbours threats to individuals, to society, to economies and to polities. Whether those threats will give rise to the harm that doom-sayers suggest needs to be investigated. Such investigations need to be conducted sufficiently early that, to the extent that the pessimism is justified, countermeasures can be developed and implemented before the harm becomes irreversible.

Conventional, rigorous, empirical research is useless in the face of such a challenge. Research techniques need to be deployed that are future-oriented and instrumentalist. Researchers are needed who are not terrified by the prospect of embracing value-laden propositions within their research models. For progress to be made, hard questions need to be asked, and insights must be drawn from the findings and theories of multiple disciplines. This paper suggests a research agenda that enables the challenge to be taken up.


References

Acquisti A. & Grossklags J. (2005) 'Privacy and Rationality in Individual Decision Making' IEEE Security & Privacy 3, 1 (January/February 2005) 24-30, at

Acquisti A. (2008) 'Identity management, Privacy, and Price discrimination' IEEE Security & Privacy 6, 2 (March/April 2008) 18-22, at

Adomavicius G. & Tuzhilin A. (2005) 'Personalization Technologies: A Process-Oriented Perspective' Commun. ACM 48, 10 (October 2005) 83-90

Andrejevic M. (2014) 'The Big Data Divide' International Journal of Communication 8 (2014), 1673-1689, at

Angwin J. (2016) 'Google has quietly dropped ban on personally identifiable Web tracking' ProPublica, 21 October 2016, at

Balkin J.M. & Levinson S. (2006) 'The Processes of Constitutional Change: From Partisan Entrenchment to the National Surveillance State' Yale Law School Faculty Scholarship Series, Paper 231, at

Beatty S.E. & Ferrell M.E. (1998) 'Impulse Buying: Modeling Its Precursors' Journal of Retailing 74 (1998) 169-191

Bell D. (1973) 'The Coming of Post-Industrial Society' Penguin, 1973

Bell W. (2004) 'Foundations of Futures Studies: Human Science for a New Era: Values' Transaction Publishers, 2004

Bennett C. (2016) 'Voter databases, micro-targeting, and data protection law: can political parties campaign in Europe as they do in North America?' International Data Privacy Law 6, 4 (November 2016) 261-275, at

Brennen S. & Kreiss D. (2016) 'Digitalization and Digitization' International Encyclopedia of Communication Theory and Philosophy, October 2016, PrePrint at

Brin D. (1998) 'The Transparent Society' Basic Books, 1998

Brinker S. (2016) 'Marketing Technology Landscape' Chief Marketing Technologist Blog, March 2016, at

Bösch C., Erb B., Kargl F., Kopp H. & Pfattheicher S. (2016) 'Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns' Proc. Privacy Enhancing Technologies 4 (2016) 237-254, at

Brunner J. (1975) 'The Shockwave Rider' Harper & Row, 1975

Bujlow T., Carela-Español V., Solé-Pareta J. & Barlet-Ros P. (2015) 'Web Tracking: Mechanisms, Implications, and Defenses' 28 Jul 2015, at

Ceglowski M. (2015) 'The Advertising Bubble', 14 November 2015, at

Christl W. & Spiekermann S. (2016) 'Networks of Control: A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy' Facultas, Wien, 2016

Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, PrePrint at

Clarke R. (1991) 'A Contingency Approach to the Application Software Generations' Database 22, 3 (Summer 1991) 23 - 34, PrePrint at

Clarke R. (1993) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance' Journal of Law and Information Science 4,2 (December 1993), PrePrint at

Clarke R. (1994) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, PrePrint at

Clarke R. (2001) 'If e-Business is Different Then So is Research in e-Business' Invited Plenary, Proc. IFIP TC8 Working Conference on E-Commerce/E-Business, Salzburg, 22-23 June 2001, published in Andersen K.V. et al. (eds.) 'Seeking Success in E-Business' Springer, New York, 2003, PrePrint at

Clarke R. (2006) 'Google's Gauntlets' Computer Law & Security Report 22, 4 (July-August 2006) 287-297, Preprint at

Clarke R. (2008a) 'B2C Distrust Factors in the Prosumer Era' Invited Keynote, Proc. CollECTeR Iberoamerica, Madrid, June 2008, PrePrint at

Clarke R. (2008b) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at, PrePrint at

Clarke R. (2009a) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, revised version at

Clarke R. (2009b) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at

Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, PrePrint at

Clarke R. (2014a) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182 - 207, PrePrint at

Clarke R. (2014b) 'The Prospects for Consumer-Oriented Social Media' Proc. Bled eConference, June 2014, revised version published in Organizacija 47, 4 (2014) 219-230, PrePrint at

Clarke R. (2015) 'Quasi-Empirical Scenario Analysis and Its Application to Big Data Quality' Proc. 28th Bled eConference, Slovenia, 7-10 June 2015, PrePrint at

Clarke R. (2016a) 'A Framework for Analysing Technology's Negative and Positive Impacts on Freedom and Privacy' Datenschutz und Datensicherheit 40, 1 (January 2016) 79-83, PrePrint at

Clarke R. (2016b) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at

Clarke R. (2017) 'Vignettes of Corporate Privacy Disasters' Xamax Consultancy Pty Ltd, March 2017, at

CoE (2001) 'The Cybercrime Convention' Convention 185, Council of Europe, 2001, at

Cohen J.E. (1996) 'A right to read anonymously: A closer look at 'copyright management' in cyberspace' Conn. Law Review 28, 981 (1996) 1003-19, at

Confessore N. & Hakim D. (2017) 'Data Firm Says 'Secret Sauce' Aided Trump; Many Scoff' The New York Times, 6 March 2017, at

Constantinides E. & Fountain S.J. (2008) 'Web 2.0: Conceptual foundations and marketing issues' Journal of Direct, Data and Digital Marketing Practice 9, 3 (2008) 231-244, at

Davies S. (1997) 'Time for a byte of privacy please' Index on Censorship 26, 6 (1997) 44-48

Davis B. (2014) 'A guide to the new power of Facebook advertising' Econsultancy, 29 May 2014, at

Degli Esposti S. (2014) 'When big data meets dataveillance: The hidden side of analytics' Surveillance & Society 12, 2 (2014) 209-225, at

Dreborg K.H. (1996) 'Essence of Backcasting' Futures 28, 9 (1996) 813-828

Drucker P.F. (1968) 'The Age of Discontinuity' Pan Piper, 1968

Dunn E.S. (1967) 'The idea of a national data center and the issue of personal privacy' The American Statistician, 1967

Dwyer C. (2009) 'Behavioral Targeting: A Case Study of Consumer Tracking on' Proc. AMCIS, 2009, p.460

Ebrahimi S., Ghasemaghaei M. & Hassanein K. (2016) 'Understanding the Role of Data Analytics in Driving Discriminatory Managerial Decisions' Proc. ICIS 2016

Englehardt S. & Narayanan A. (2017) 'Online tracking: A 1-million-site measurement and analysis' WebTAP Project, Princeton University, January 2017, at

EPIC (2003) 'Privacy and Consumer Profiling' Electronic Privacy Information Center, 2003, at

EPIC (2010) 'Do Not Track Legislation: Is Now the Right Time?' Electronic Privacy Information Center (EPIC), Statement to a Committee of the US Congress, December 2010, at

EPIC (2015) 'Cross-Device Tracking' Electronic Privacy Information Center (EPIC), Statement to the Federal Trade Commission, December 2015, at

FTC (2009) 'Self-Regulatory Principles For Online Behavioral Advertising' Federal Trade Commission, February 2009, at

FTC (2016) 'Keeping Up with the Online Advertising Industry' Federal Trade Commission, April 2016, at

Flaherty D.H. (1988) 'The emergence of surveillance societies in the western world: Toward the year 2000' Government Information Quarterly 5, 4 (1988) 377-387

Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989

Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Peregrine, London, 1975, trans. 1977

Froomkin A.M. (1995) 'Anonymity and its Enmities' Journal of Online Law 1, 4 (1995), at

Fuchs C., Boersma K., Albrechtslund A. & Sandoval M. (2012) 'Internet and Surveillance ' Chapter 1, pp. 1-28 of Fuchs C. et al. (eds.) (2012) 'Internet and Surveillance: The Challenges of Web 2.0 and Social Media' Routledge, 2012

Funder A. (2003) 'Stasiland: Stories from behind the Berlin Wall' Granta, 2003

Gandy O.H. (1989) 'The Surveillance Society: Information Technology and Bureaucratic Social Control' Journal of Communication 39, 3 (September 1989) 61-76

Gandy O.H. (1993) 'The Panoptic Sort. Critical Studies in Communication and in the Cultural Industries' Westview, Boulder CO, 1993

Gibson W. (1984) 'Neuromancer' Grafton/Collins, London, 1984

Goldberg L.R. (1990) 'An Alternative 'Description of Personality': The Big-Five Factor Structure' Journal of Personality and Social Psychology 59 (1990) 1216-1229

Goldberg L.R. (1999) 'A broad-bandwidth, public-domain, personality inventory measuring the lower-level facets of several Five-Factor models' Ch. 1 in Mervielde I. et al. (eds.) 'Personality Psychology in Europe', vol. 7, Tilburg Uni. Press, 1999, pp. 7-28, at

Goldberg I. (2007) 'Privacy Enhancing Technologies for the Internet III: Ten Years Later' Chapter 1 of Acquisti A. et al. (eds.) 'Digital Privacy: Theory, Technologies, and Practices' Auerbach, 2007, at

Graham S. & Wood D. (2003) 'Digitizing Surveillance: Categorization, Space, Inequality' Critical Social Policy 23, 2 (May 2003) 227-248

Gray P. & Hovav A. (2008) 'From Hindsight to Foresight: Applying Futures Research Techniques in Information Systems' Communications of the Association for Information Systems 22, 12 (2008), at

Greenwald G. (2014) 'No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State' Metropolitan Books, 2014, at

Gunter B. & Furnham A. (1992) 'Consumer Profiles: An Introduction to Psychographics' Routledge, 1992

Harris S. (2010) 'The Watchers: The Rise of America's Surveillance State' Penguin, 2010

Hassan N.R. (2014) 'Useful Products in Theorizing for Information Systems' Proc. 35th Intl Conf on Information Systems, December 2014, at

Hazen B.T., Boone C.A., Ezell J.D. & Jones-Farmer L.A. (2014) 'Data Quality for Data Science, Predictive Analytics, and Big Data in Supply Chain Management: An Introduction to the Problem and Suggestions for Research and Applications' International Journal of Production Economics 154 (August 2014) 72-80, at

Herbert F. (1965) 'Dune' Chilton Books, 1965

Hernandez G.A., Eddy K.J. & Muchmore J. (2001) 'Insurance Weblining and Unfair Discrimination in Cyberspace' SMU L. Rev. 54 (2001) 1953-1971, at

Hess T., Legner C., Esswein W., Maaß W., Matt, C., Österle H., Schlieter H., Richter P. & Zarnekow, R. (2014) 'Digital Life as a Topic of Business and Information Systems Engineering?' Business & Information Systems Engineering 6, 4 (2014) 247-253

Hesse M.B. (1966) 'Models and Analogies in Science' University of Notre Dame Press, 1966

Hirst M. (2013) `Someone's looking at you: welcome to the surveillance economy' The Conversation, 26 July 2013, at

Hoofnagle C.J., King J., Li S. & Turow J. (2010) 'How different are young adults from older adults when it comes to information privacy attitudes and policies?' SSRN, April 2010, at abstract=1589864

Horne D.R., Norberg P.A. & Ekin A.C. (2007) 'Exploring consumer lying in information based exchanges' Journal of Consumer Marketing 24, 2 (2007) 90-99

Johnson-Laird P. N. (1988) 'Freedom and constraint in creativity' in R. J. Sternberg (ed.) 'The nature of creativity: contemporary psychological perspectives' Cambridge University Press, 1988, at

Kelly K. (1998) 'New Rules for the New Economy' Penguin, 1998, at

Kistermann F.W. (1991) 'The Invention and Development of the Hollerith Punched Card: In Commemoration of the 130th Anniversary of the Birth of Herman Hollerith and for the 100th Anniversary of Large Scale Data Processing' IEEE Annals of the History of Computing 13, 3 (July 1991) 245 - 259

Kosow H. & Gaßner R. (2008) 'Methods of Future and Scenario Analysis: Overview, Assessment, and Selection Criteria' German Development Institute, 2008, at

Lambrecht A. & Tucker C. (2013) 'When Does Retargeting Work? Information Specificity in Online Advertising' Journal of Marketing Research 50, 5 (October 2013) 561-76

Land F. (2000) 'The First Business Computer: A Case Study in User-Driven Innovation' J. Annals of the Hist. of Computing 22, 3 (July-September 2000) 16-26

Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt and Company, New York, 1992

Laudon K.C. (1993) 'Markets and Privacy' Proc. Int'l Conf. Inf. Sys., Orlando FL, Ass. for Computing Machinery, New York, 1993, pp. 65-75

Levine R., Locke C., Searls D. & Weinberger D. (2000) 'The Cluetrain Manifesto: The End of Business as Usual' Perseus, 2000, summary at

Lewis A. (2010) 'If you are not paying for it, you're not the customer; you're the product being sold' MetaFilter, August 2010, at

Lupton D. (2016) 'The Quantified Self: A Sociology of Self-Tracking' Polity Press, 2016

Lycett M. (2014) 'Datafication: Making Sense of (Big) Data in a Complex World' European Journal of Information Systems 22, 4 (December 2014) 381-386, at

Lynch C. (2017) 'The rise of reading analytics and the emerging calculus of reader privacy in the digital world' First Monday 22, 4 (3 April 2017), at

Lyne J. (2011) 'HTML5 and Security on the New Web' Sophos Security, December 2011, at

Lyon D. (1994) 'The electronic eye: The rise of surveillance society' Polity Press, 1994

Lyon D. (2001) 'Surveillance society' Open University Press, 2001

Lyon D. (2003) 'Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination' Routledge, 2003

Machlup F. (1962) 'Production and Distribution of Knowledge in the United States' Princeton University Press, 1962

MacInnis D.J. (2011) 'A Framework for Conceptual Contributions in Marketing' Journal of Marketing 75, 4 (July 2011) 136-154

Maguire S., Friedberg J., Nguyen M.-H.C. & Haynes P. (2015) 'A metadata-based architecture for user-centered data accountability' Electronic Markets 25, 2 (June 2015) 155-160

Majchrzak A. & Markus M.L. (2014) 'Methods for Policy Research: Taking Socially Responsible Action' Sage, 2nd Edition, 2014

Mann S., Nolan J. & Wellman B. (2003) 'Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments' Surveillance & Society 1, 3 (2003) 331-355, at

Martin A.K., Van Brakel R. & Bernhard D. (2009) 'Understanding resistance to digital surveillance: Towards a multi-disciplinary, multi-actor framework' Surveillance & Society 6, 3 (2009) 213-232, at

Martino J.P. (1993) 'Technological Forecasting for Decision Making' 3rd Edition, McGraw-Hill, 1993

Marx G.T. (1985) 'The Surveillance Society: The Threat Of 1984-Style Techniques' The Futurist, 1985

Masuda Y. (1981) 'The Information Society as Post-Industrial Society' World Future Society, Bethesda, 1981

Mayer J.R. & Mitchell J.C. (2012) 'Third-Party Web Tracking: Policy and Technology ' Proc. IEEE Symposium on Security and Privacy, 2012, at

Metzger M.J. (2007) 'Communication Privacy Management in Electronic Commerce' Journal of Computer-Mediated Communication 12, 2 (January 2007) 335-361, at

Miles M.B. & Huberman A.M. (1994) 'Qualitative Data Analysis: An Expanded Sourcebook' Sage 1994

Monks R.A.G. (2008) 'Corpocracy: How CEOs and the Business Roundtable Hijacked the World's Greatest Wealth Machine and How to Get it Back' Wiley, 2008

NAI (2010) 'Study Finds Behaviorally-Targeted Ads More Than Twice As Valuable, Twice As Effective As Non-Targeted Online Ads' National Advertising Initiative, March 2010, at

Naisbitt J. (1982) 'Megatrends' London, Macdonald, 1982

Negroponte N. (1995) 'Being Digital' Hodder & Stoughton, 1995

Newell S. & Marabelli M. (2015) 'Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datification'' The Journal of Strategic Information Systems 24, 1 (2015) 3-14, at

Newman N. (2011) 'You're Not Google's Customer -- You're the Product: Antitrust in a Web 2.0 World' The Huffington Post, 29 March 2011, at

Newman N. (2014) 'The Costs of Lost Privacy: Consumer Harm and Rising Economic Inequality in the Age of Google' William Mitchell L. Rev. 40, 2 (2014) 849-889, at

Niederman F, Clarke R., Applegate L., King J.L., Beck R. & Majchrzak A. (2016) 'IS Research and Policy: Notes from the 2015 ICIS Senior Scholar's Forum' Commun. Assoc. Infor. Syst. 40, 5 (2016) 82-92, at

van Notten P.W.F., Rotmans J., van Asselt M.B.A. & Rothman D.S. (2003) 'An Updated Scenario Typology' Futures 35 (2003) 423-443, at

O'Reilly T. (2005) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly 30 September 2005, at

Osler M., Lund R., Kriegbaum M., Christensen U. & Andersen A.-M.N. (2006) 'Cohort Profile: The Metropolit 1953 Danish Male Birth Cohort' Int. J. Epidemiol. 35, 3 (June 2006) 541-545, at

Packard V. (1957) 'The Hidden Persuaders' Penguin, 1957

Packard V. (1964) 'The Naked Society' McKay, 1964

PageFair (2017) 'The state of the blocked web - 2017 Global Adblock Report' PageFair, February 2017, at

Palmas K. (2011) 'Predicting What You'll Do Tomorrow: Panspectric Surveillance and the Contemporary Corporation' Surveillance & Society 8, 3 (2011) 338-354, at

Pew (2000) 'Trust and Privacy Online: Why Americans Want to Rewrite the Rules' Pew Internet and American Life Project, August 2000, at

Posner R.A. (1977) 'The Right of Privacy' Georgia Law Review 12 (1977) 393, at

Posner R.A. (1981) 'The Economics of Privacy' American Economic Review 71, 2 (1981) 405-409

Posner R.A. (1984) 'An Economic Theory of Privacy' in Schoeman F. (ed.) 'Philosophical Dimensions of Privacy: An Anthology' Cambridge University Press, 1984, pp. 333-345

Postma T.J.B.M. & Liebl F. (2005) 'How to improve scenario analysis as a strategic management tool?' Technological Forecasting & Social Change 72 (2005) 161-173, at

Rheingold H. (2002) 'Smart mobs: The next social revolution' Basic Books, 2002

Rosenblatt B. (2015) 'The myth of DRM-free music' Copyright and Technology (31 May 2015), at

Roszak T. (1986) 'The Cult of Information' Pantheon, 1986

Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974

Schauer F. (1978) 'Fear, Risk and the First Amendment: Unraveling the Chilling Effect' Boston University Law Review 58 (1978) 685-732, at

Schmidt E. & Cohen J. (2013) 'The New Digital Age: Reshaping the Future of People, Nations and Business' Knopf, 2013

Schneier B. (2015a) 'Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World' Norton, March 2015

Schneier B. (2015b) 'How to mess with surveillance' Slate, 2 March 2015, at

Schwartz P. (1991) 'The Art of the Long View: Planning for the Future in an Uncertain World' Doubleday, 1991

Shapiro C. & Varian H.R. (1999) 'Information Rules: A Strategic Guide to the Network Economy' Harvard Business School Press, 1999

Shen Y. & Pearson S. (2011) 'Privacy enhancing technologies: a review' HP Laboratories, 2011, at

Shklovski I., Mainwaring S.D., Skúladóttir H.H. & Borgthorsson H. (2014) 'Leakiness and Creepiness in App Space: Perceptions of Privacy and Mobile App Use' Proc. CHI 2014, April 2014, at

Smith H.J., Dinev T. & Xu H. (2011) 'Information Privacy Research: An Interdisciplinary Review' MIS Quarterly 35, 4 (December 2011) 989-1015

Spiekermann S. & Novotny A. (2015) 'A Vision for Global Privacy Bridges: Technical and Legal Measures for International Data Markets' Computer Law & Security Review 31, 2 (2015) 181-200, at

SSN (2014) 'An introduction to the surveillance society', Surveillance Studies Network, 2014, at

Starr M.K. (1971) 'Management: A Modern Approach' Harcourt Brace Jovanovich, 1971

Stephenson N. (1992) 'Snow Crash' Bantam, 1992

Tapscott D. (1996) 'The Digital Economy: Promise and Peril in the Age of Networked Intelligence' McGraw-Hill, 1996

Touraine A. (1971) 'The Post-Industrial Society: Tomorrow's Social History' Random House, 1971

Tufekci Z. (2014) 'Engineering the public: Big data, surveillance and computational politics' First Monday 19, 7 (7 July 2014), at

Turow, J., King J., Hoofnagle C.J., Bleakley A. & Hennessy M. (2009) 'Americans Reject Tailored Advertising and Three Activities that Enable It' SSRN, September 2009, at

Ur B., Leon P.G., Cranor L.F., Shay R. & Wang Y. (2012) 'Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising' Proc. 8th Symp. on Usable Privacy and Security, SOUPS '12, pp. 4:1-4:15, at

Vaidhyanathan S. (2011) 'The Googlization of Everything (And Why We Should Worry)' University of California Press, 2011

Varian H.R. (2014) 'Beyond Big Data' Business Economics 49, 1 (2014) 27-31

Wack P. (1985) 'Scenarios: Uncharted Waters Ahead' Harv. Bus. Rev. 63, 5 (September-October 1985) 73-89

Wang Y. & Fesenmaier D.R. (2004) 'Modeling Participation in an Online Travel Community' Journal of Travel Research 42, 3 (February 2004) 261-270

Weingarten F.W. (1988) 'Communications Technology: New Challenges to Privacy' J. Marshall L. Rev. 21, 4 (Summer 1988) 735

Wells W.D. (1975) 'Psychographics: A Critical Review' Journal of Marketing Research 12, 2 (May, 1975) 196-213

Wells J.D., Parboteeah V. & Valacich J.S. (2011) 'Online impulse buying: understanding the interplay between consumer impulsiveness and website quality' Journal of the Association for Information Systems 12 (2011) 32-56

Westin A.F. (1967) 'Privacy and Freedom' Atheneum 1967

Westin A.F. & Baker M.A. (1974) 'Databanks in a Free Society: Computers, Record-Keeping and Privacy' Quadrangle 1974

Yadav M.S. (2010) 'The Decline of Conceptual Articles and Implications for Knowledge Development' Journal of Marketing 74, 1 (January 2010) 1-19

Yan J., Liu N., Wang G., Zhang W., Jiang Y. & Chen Z. (2009) 'How Much can Behavioral Targeting Help Online Advertising?' Proc. WWW 2009, pp.261-270, at

Yoo Y., Henfridsson O. & Lyytinen K. (2010) 'The New Organizing Logic of Digital Innovation: An Agenda for Information Systems Research' Information Systems Research 21, 4 (December 2010) 724-735, at

Zang J., Dummit K., Graves J., Lisker P. & Sweeney L. (2015) 'Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps' Technology Science, 2015103001, 30 October 2015, at

Zimmer M. (2008) 'The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance' in Spink A. & Zimmer M. (eds.) 'Web Search: Multidisciplinary Perspectives' Springer, 2008, pp. 77-102, at

Zuboff S. (2015) 'Big Other: Surveillance Capitalism and the Prospects of an Information Civilization' Journal of Information Technology 30 (2015) 75-89, at

Zuboff S. (2016) 'The Secrets of Surveillance Capitalism' Frankfurter Allgemeine, 3 May 2016, at


Acknowledgements

My thanks for interactions and feedback on the topic to Prof. Robert Davison of City University of Hong Kong, Prof. Markus Beckmann of the Friedrich-Alexander University Nürnberg, Liam Pomfret of the University of Queensland, and renowned sci-fi author and futurist David Brin.

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.
