

Risks Inherent in the Digital Surveillance Economy:
A Research Agenda

Review Version of 19 September 2017

Journal of Information Technology 34,1 (Mar 2019) 59-80
https://doi.org/10.1177/0268396218815559

The Editor commissioned 4 Commentaries, and I prepared a Response entitled 'Future-Oriented Research Agendas, and Competing Ideologies' (12 August 2018)

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2016-17

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/EC/DSE.html


Abstract

The digitisation of data about the world relevant to business has given rise to a new phase of digitalisation of business itself. The digitisation of data about people has linked with the notions of information society, surveillance society, surveillance state and surveillance capitalism, and given rise to what is referred to in this paper as the digital surveillance economy. At the heart of this is a new form of business model that is predicated on the acquisition and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. In the words of the model's architects, users are 'bribed' and 'induced' to make their data available at minimal cost to marketers.

The digital surveillance economy harbours serious threats to the interests of individuals, societies and polities. That in turn creates risks for corporations. The new economic wave may prove to be a tsunami that swamps the social dimension and washes away the last five centuries' individualism and humanism. Alternatively, institutional adaptation might occur, overcoming the worst of the negative impacts; or a breaking-point could be reached and consumers might rebel against corporate domination. A research agenda is proposed, to provide a framework within which alternative scenarios can be investigated.


1. Introduction

During the second half of the 20th century, a digitisation revolution occurred, such that a large proportion of data is now 'born digital', and analogue data can be inexpensively converted into digital form (Negroponte 1995, Yoo et al. 2010). Digitisation laid the platform for developments in the early 21st century that are currently being referred to as 'digitalisation'. A shift has occurred from the interpretation and management of the world through human perception and cognition, to processes that are almost entirely dependent on digital data (Brennen & Kreiss 2016). What began as an opportunity for strategic differentiation has morphed into a stimulus for organisational and sectoral transformation and disruption.

Digitalisation can be applied to any kind of entity. Its application to people is currently attracting a great deal of attention, giving rise to literatures on 'datafication' (Lycett 2014), or less commonly 'datification' (Newell & Marabelli 2015). The discussion is mostly upbeat, with a focus on the opportunities for organisations that are perceived to arise from streams of digital data about individuals. This paper, on the other hand, primarily reflects the perspectives of the individuals themselves, with implications for corporate interests as the second-order consideration.

Contemporary business models are predicated on the acquisition and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear. Zuboff (2015) perceives the negative implications to be of enormous scale, and depicts the new context as 'surveillance capitalism'. The research reported in this paper adopts a narrower focus, seeking firm ground in understanding of the relevant technologies and of the activities that apply them.

In the digital surveillance economy, genuine relationships between organisations and people are replaced by decision-making based on data that has been consolidated into digital personae. This paper conceptualises the institutions and processes that serve the new business model using the notion of 'the digital surveillance economy'. The consumer is converted from a customer to a product, consumers' interests have almost no impact on the process, and they can be largely ignored. In the words of the model's architects, consumers are "bribed" and "induced" to make their data available at minimal cost to marketers (Shapiro & Varian 1999, pp. 35-36).

The aims of the research reported in this paper are to provide sufficient context that the digital surveillance economy's nature and origins can be appreciated, to delineate its key features, to identify its implications for the interests of individuals, societies and polities, and to structure a research agenda whereby deeper understanding can be achieved.

The paper commences with a brief description of the challenges facing research of this nature, and an outline of the approach adopted. The precursors to digitalisation are outlined. The nature of the digital surveillance economy is then discussed, and a model proposed that identifies the activities that give rise to it. The threats are then examined, firstly those to individuals, followed by the more abstract implications in the societal, economic and political domains. A range of possible futures exists. Because the phenomena remain fluid at this early stage in the new era, the research reported here is necessarily preliminary. A research agenda is proposed which encompasses the analysis presented in the early sections and extends it into the choice of a research approach, the articulation of broad research questions structured within three scenarios, and an extensive research program.


2. The Research Approach Adopted

The purpose of the research presented here is to provide insight, and thereby guide decisions and actions by policy-makers, by consumers and consumer advocacy organisations, by technologists, particularly those seeking to support consumers' interests, and by business enterprises. Research of this kind is confronted by a number of challenges, which need consideration prior to a research method being proposed.

The conduct of empirical research is predicated on the assumptions that the relevant phenomena already exist, and that they exhibit a sufficient degree of stability. The rigour demanded of empirical research is undermined to the extent that results cannot be reliably reproduced because the population from which samples are extracted exhibits continual changes of behaviour. This is inevitably the case with emergent phenomena, such as those that arise from the application of multiple new digital technologies. Remarkably, instability of the object of study appears to be little-discussed in the IS literature. However, for a discussion of the difficulties that the dynamic nature of the eCommerce and eBusiness eras presented to researchers, see Clarke (2001).

A further challenge arising with future-oriented, instrumentalist research is that it must deal with contexts in which stakeholders will be differentially impacted by change, and in which it is commonly the case that pareto optimality cannot be achieved and hence the interests of at least some stakeholders suffer harm. This necessarily draws the researcher beyond the merely descriptive, explanatory and predictive, and into realms characterised by value-laden contexts, value-conflicts, political negotiation processes, normative modes and prescriptive outcomes. If research is to deliver outputs relevant to government policy, consumer behaviour and business strategy, then research techniques must be applied that can cope with both unstable phenomena and contested values, and that seek such levels of rigour as are feasible in the circumstances.

The results of this research are being addressed to diverse audiences. The outcomes are intended to influence decision-makers in corporations, by highlighting the broader implications of current changes in business models. The work's more direct contributions, however, are to the development of public policy. Beyond mere policy, guidance is needed for the articulation of programs intended to achieve normative objectives expressed in economic and social terms. These commonly depend on effective cooperation among disparate organisations, across the public and private sectors and often also the voluntary sector. The intended readership also includes a proportion of consumers, and particularly of consumer advocacy organisations and technology developers. Also within the target-zone are academics, most of whose work is currently conducted from the perspective of corporations, but who may consider broadening their scope to address the challenges identified in this paper. This diversity of audience adds to the challenges of research conception, design, conduct and reporting.

With these challenges in mind, an approach was adopted utilising what MacInnis (2011) refers to as 'a propositional inventory', to "lay out areas in which empirical research is needed" (pp. 2, 10). First, a contextual review was conducted of relevant threads of development. A conceptual framework was then outlined whereby the emergent phenomena could be researched, understood and integrated. By 'conceptual framework' I mean "the researcher's map of the territory being studied, which consists of the main concepts, constructs and variables and their related propositions" (Hassan 2015 p.12, citing Miles & Huberman 1994). The analysis drew on a wide range of sources. This included relevant prior research conducted by the author.

The broad conceptual framework was articulated by means of a model of the key processes within the digital surveillance economy. This was developed from observations of consumer marketing activities, primarily in contemporary Web-based eCommerce and mobile commerce. The term 'model' is used here in the sense of a "pre-theoretical imperfect copy of the phenomenon of interest" (Hassan 2015, p. 11, citing Hesse 1966).

The process model provided a basis for identifying the impacts of the new ways of doing business. It is also valuable as a basis for tests of the conceptual framework against continually changing real-world instances. In developing the framework, attention was paid to the guidelines for evaluating conceptual articles proposed in Yadav (2010, pp. 15-16). The research questions, rather than being declared at the outset, are an outcome of the analysis, and are presented at the end of the paper. They were developed by performing a first iteration of a scenario analysis approach.

The structure of the paper reflects the research process outlined above. The following section presents a brief review of several previous phases of technological change, the uses made of them by business enterprises, and the interpretations made of them by observers. In section 4, the concept of the digital surveillance economy is discussed, and the process model is presented. Threats to the individual that arise from this approach are then identified, and broader impacts are discussed. Finally, in section 6, a research agenda is framed.


3. Precursors

This section briefly summarises earlier conceptions that provide context and that condition observers' perceptions of contemporary consumer marketing. The digitisation phase was largely complete by the end of the twentieth century. A first, critical aspect of digitisation is that selection of attributes whose states or behaviour are to be represented in data is not a value-neutral event but rather a purposive activity, with choices made that reflect the intentions of the collector. Contrary to the implications of the term 'data capture', data is not already in existence and merely waiting to be gathered, but is actively created. The collector's perception of purpose results in some attributes of some entities and their actions being represented in the data collection, and the complete omission from the collection of representations of all other attributes of those entities and actions, and of all attributes of all other entities and actions. Further key issues are that data quality is a multi-dimensional notion, and that data-items, records and record-collections evidence very high variability in quality.

Digitisation of individuals can be traced to 1890, when Hollerith built on previous work by Jacquard and Babbage, used punched cards to represent US Census data, and thereby launched the era of machine-readable personal data (Kistermann 1991). The application of computing to administrative data commenced in 1951 at the Lyons Tea Company in the U.K. (Land 2000). Within a decade, the opportunity was perceived for 'national data systems' to consolidate data about people, variously as a basis for sociological research (e.g. the Metropolit projects in Sweden and Denmark - Osler et al. 2006) and for social control (e.g. the National Data Center proposals in the USA - Dunn 1967). The second half of the century saw an explosion in the creation of data about people. During that time, the basis for decisions about such matters as lending, insurance and social welfare changed from the evaluation of applicants face-to-face, complemented by consideration of supporting documents, to a process almost entirely based on the machine-readable-data-assembly that is thought to relate to the applicant. I coined the term 'digital persona' to refer to that assembly (Clarke 1994, 2014a).

The Internet brought with it scope for pseudonymity and even anonymity (Froomkin 1995). Moreover, as the dot.com investment era boomed, and then c. 2000 imploded, 'new rules' were emerging (Kelly 1998). Until c. 2010, there was some optimism that consumer marketing would become less intrusive and manipulative, and more collaborative (Levine et al. 2000, Clarke 2008a). In the meantime, however, Web 2.0 was conceived as the means whereby the Web's architecture could be inverted from its original, consumer-driven request-response pair into a marketer's data collection and consumer manipulation framework (O'Reilly 2005, Clarke 2008b). Web 2.0 features laid the foundations for the digital surveillance economy. Within less than a decade from 2005 onwards, consumer marketing became anything but the two-sided conversation that Levine et al. had envisaged.

Meanwhile, information technologies were having impacts more broadly than just in the commercial sphere. The notions of the 'knowledge economy' (Machlup 1962, Drucker 1968), and 'post-industrial society' (Touraine 1971, Bell 1976) were absorbed into that of 'the information society' (Masuda 1981). The tone of the large majority of the relevant literature in the 1980s was very upbeat, strongly focussed on the opportunities, and with very little consideration of what threats the new socio-economic arrangements might embody, how impactful they might be, and whether the impacts might fall unevenly.

In parallel, the concept of 'surveillance society' developed (Marx 1985, Flaherty 1988, Weingarten 1988, Gandy 1989, Lyon 1994, Lyon 2001). Surveillance is usefully defined as the systematic investigation or monitoring of the actions or communications of one or more persons. The sentiments are well summed-up by this early speculation: "Technology could be offering the tools for building a surveillance society in which everything people do and are as human beings is an open book to those who assert some 'right' to monitor them" (Weingarten 1988, p.747).

Beyond the longstanding forms of visual, aural and communications surveillance, dataveillance rapidly emerged as a more cost-effective alternative (Clarke 1988). Roszak (1986) observed that the purpose of 'computerised surveillance' was "to reduce people to statistical skeletons for rapid assessment" (p.186). He provided a caricature of the quality problems of what was then referred to as 'data mining' and is now re-badged as 'big data analytics' and 'data science': "garbage in - gospel out" (p.120).

The early forms have since been joined by electronic surveillance (of space, traffic, movement, behaviour and experience) and auto-surveillance (by devices carried with, on or in the person). More detailed analysis of surveillance categories, techniques and technologies is in Clarke (2009b, 2010). A strong expression of nervousness about the future of free societies has recently been provided by Zuboff (2015, 2016), who has coined the term 'surveillance capitalism' to refer to the aim of corporations "to predict and modify human behavior as a means to produce revenue and market control" (2015, p.75).

The idea of 'surveillance society' has been joined by the political notion of the 'surveillance state'. This involves more or less ubiquitous monitoring of individuals by government agencies (Balkin & Levinson 2006, Harris 2010). The notion may seem to be not directly relevant to the present analysis. On the other hand, it is no longer limited to the USSR, China, East Germany (Funder 2003), and developing and under-developed countries ruled by despotic regimes. The Snowden revelations confirmed that government agencies in 'the free world' also abuse their powers and circumvent controls that they are nominally subject to (Greenwald 2014). Computational politics has recently added new layers of threat (Tufekci 2014). Meanwhile, the 'public-private partnership' notion may be morphing into a vision of a high-tech, corporatised-government State (Schmidt & Cohen 2014), and hence becoming a major threat to democracy. It is reasonable to expect that developments towards a surveillance state would contribute to rapidly declining trust by individuals not only in governments but also in corporations.

These threads provide the intellectual context within which the current approach to consumer marketing has developed and from which vantage-point it can be observed and interpreted.


4. The Digital Surveillance Economy

This section builds on the precursor notions discussed in the previous section, identifies the key features of the digital surveillance economy, and lays a foundation for analysis of its implications and of the reactions against it that may arise. The focus is primarily on economic and social rather than political matters, and the analysis is intended to be relevant to policy-makers, consumers and business executives.

4.1 The Concept

The 'direct marketing' mechanisms of two to five decades ago have developed into heavy reliance by corporations on the acquisition and exploitation of detailed machine-readable dossiers of information on large numbers of individuals (Davies 1997, Hirst 2013). The term 'exploitation' is subject to interpretations both positive - as in the extraction of advantage from one's own assets - and negative - as in unfair utilisation of something belonging to someone else. Both senses of the word are applicable to the use of data in the digital surveillance economy.

Surveillance that produces machine-readable data, in quantity, about more-or-less everything and everybody, has changed the game. The working definition adopted for the core concept in this paper is as follows:

The digital surveillance economy is that combination of institutions, institutional relationships and processes which enables corporations to exploit data arising from the monitoring of people's electronic behaviour, and on which consumer marketing corporations are rapidly becoming dependent.

There was no hint of a surveillance element in mid-1990s depictions of 'the digital economy', such as the highly-cited Tapscott (1996). By the beginning of the new century, however, the possibility of "a universal surveillance economy" had been raised (Rheingold 2002, p. xviii, quoted in Fuchs et al. 2012). The term 'digital surveillance economy', which this author adopts as a suitably descriptive label for the phenomenon, appears to have been first used in Andrejevic (2014, p.1678).

A notion that stands at the heart of the new approach is represented by a longstanding aphorism in media theory: 'If you're not paying, you're the product'. For an early usage in the Web context, see Lewis (2010). The foundations lie neither in classical nor neo-classical economics (which are relevant to scarce resources), but rather in information economics (which applies to abundant resources). The business school text-book that heralded the revolution was Shapiro & Varian (1999). Its significance is underlined by the fact that, since 2002, Hal Varian has had the title of Chief Economist at Google.

The Shapiro & Varian thesis is that customers' attention is the marketer's stock-in-trade. It is to be traded for revenue, market-share and profit, and cross-leveraged with business partners. The book is driven entirely by supply-side thinking, and assumes not only that demand is disorganised and powerless, but also that it will remain so. It continues to use the term 'customer', but the implication of a relationship has been jettisoned, and the word is used almost synonymously with the impersonal term 'consumer'. In their theory - and right now in the digital surveillance economy - the interests of consumers have almost no impact on the process, so they can be safely ignored. Relationship marketing, permission-based marketing, interactivity with and participation of customers, the prosumer concept, customer needs and their satisfaction, and customer alliances are all overlooked.

A further significant aspect of the Shapiro & Varian approach is that consumer rights, privacy rights, and laws to protect them might as well not exist. Regulation is merely a barrier to be overcome and to be replaced by the chimera of 'self-regulation'. The book lacks any vestige of the notions of community, society, equity, discrimination or individual wellbeing. Indeed, the key terms used in this paragraph do not even appear in Shapiro & Varian's index.

The techniques for acquiring personal data have become much more sophisticated during the nearly two decades since Shapiro & Varian wrote; but the principle was laid out very clearly: "[marketers'] strategy should be to bribe users to give them the appropriate demographics, which in turn can be passed onto advertisers ... [The game is about] inducing consumers to give [marketers] the information they want. ... we expect that many consumers will be happy to sell information about themselves for a nominal amount ..." (pp. 35-36). On the surface, the authors appear to have been oblivious to the fact that both 'bribing' and 'inducing' are pejorative terms indicating immoral behaviour, and in some contexts are criminal acts. Alternatively they regarded morality, and even the criminal law, as being merely an inconvenient regulatory hurdle that business enterprises needed to overcome.

Some social scientists drew attention to the implications at an early stage: "Digitization facilitates a step change in the power, intensity and scope of surveillance" (Graham & Wood 2003). In mid-2017, that article had just over 250 Google citations, whereas Shapiro & Varian had accelerated beyond 10,000. Much more significant than the book's citation-count, however, has been the application of its propositions in business processes and in features embedded in the information infrastructure that supports consumer eCommerce and mobile commerce.

4.2 A Process Model

Shapiro & Varian's proposals are evident in the key features of the new forms of consumer marketing. Few explicit and adequately articulated models of these processes have been located in the literature, but see Ur et al. (2012). A partial model was put forward in Degli Esposti (2014), which represented "the ability of reorienting, or nudging, individuals' future behavior by means of four classes of actions: 'recorded observation'; 'identification and tracking'; 'analytical intervention'; and 'behavioral manipulation'" (p.210). That list of activities is further articulated below, and a model provided in Figure 1.

Figure 1: Key Processes of the Digital Surveillance Economy

(1) Data Acquisition

Some data is acquired by corporations through interactions with consumers in which the consumers are conscious of the transaction taking place, and to some extent at least of the fact that data has been acquired as part of the transaction (Activity 1A - Overt Direct Collection in Figure 1). The consumer may even be conscious of which data-items have been acquired, and for what purposes, and may have given express or implied consent. An early analysis of the transition from clumsy web-server logging to more effective acquisition of data about consumer transactions is in Ansari et al. (2000). More commonly, however, consumers are at best very hazy about the terms of the deal, and consent is merely inferred by the collector in ways that might or might not be judged by a court to satisfy the requirements of contract and consumer law.

In addition, a considerable amount of data is acquired from consumers through various activities of which the consumer is unaware (1B - Covert Direct Collection). Some such data-items are arguably a byproduct of the protocols used in conducting the transaction, such as the consumer's net-location, the software that they use, and in some circumstances even their physical location. The term 'click-stream data' has been used to refer to this category of data acquisition (Montgomery et al. 2004). Consumers seldom appreciate how much data they gift to intermediaries, in particular search-engine operators, as they go about locating suppliers. Furthermore, a great deal of data-collection is achieved through active endeavours by marketers and their partners and service-providers, utilising surreptitious techniques such as cookies, web-bugs / web-beacons / tracking pixels and browser fingerprints, and through adware/spyware. Advertising service-providers such as DoubleClick (owned by Google) and Quantcast are key players in this space. Consumer tools such as Ghostery and AdBlock Plus, and their associated lists, identify many hundreds of organisations that are privy to individuals' web-traffic through such means.
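
Purely by way of illustration, the following minimal sketch (in Python, using only the standard library) shows the server side of a hypothetical tracking pixel of the kind used for covert direct collection: each request for the invisible resource yields a record that pairs a cookie-based identifier with metadata that the browser volunteers, such as net-location, the software in use and the page being viewed. The endpoint, field-names and logging are illustrative assumptions, not the practice of any particular advertising service-provider.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json, time, uuid


class PixelHandler(BaseHTTPRequestHandler):
    """Answers requests for an invisible 'pixel' embedded in pages and e-mails."""

    def do_GET(self):
        # Re-use the identifier already planted in this browser, or mint a new one.
        cookie = self.headers.get('Cookie') or 'uid=' + uuid.uuid4().hex
        record = {
            'timestamp': time.time(),
            'identifier': cookie,                          # links the hit to a persona
            'net_location': self.client_address[0],        # requesting IP address
            'software': self.headers.get('User-Agent'),    # browser and platform
            'page_viewed': self.headers.get('Referer'),    # the page carrying the pixel
        }
        print(json.dumps(record))       # a real operator would write this to a data store
        self.send_response(204)                 # 'No Content': nothing visible happens
        self.send_header('Set-Cookie', cookie)  # ensure the identifier persists
        self.end_headers()


if __name__ == '__main__':
    HTTPServer(('localhost', 8080), PixelHandler).serve_forever()
```

In practice, each such record would be joined with many other streams, as discussed under activity (3) below.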

Data about consumers' marketspace behaviour may be gained by marketers not only when the consumer is in contact with the organisation itself but also when they are interacting with other sites (1C - Covert Indirect Collection). This may be achieved using spyware or tracking pixels, which cause an unconsented transfer of data from a consumer's device to a third party. Surreptitiously-acquired data is then traded within each organisation's networks of strategic partners and/or with advertising-service providers, resulting in the data being further exploited by many more organisations (Palmas 2011, Degli Esposti 2014, p.214, Christl & Spiekermann 2016, pp.45-72).

Other forms of data gathering exist which may not be associated with browser-usage (1D - Viewer Monitoring). The data-formats and access-tools whereby consumers experience text, images, sound and video (such as email, e-readers, streaming services and the viewing of previously-downloaded files) are increasingly designed to generate data about consumers' activities and pass it around the often very large network of corporations that are involved in the process, or merely have an interest in acquiring the data. A long-term view of this form of data acquisition is provided by Wedel & Kannan (2006).

A great deal of data is also acquired from sources other than the individual. Some is from sources asserted to be 'public domain' and free for use for any purpose (2A - 'Public Domain' Collection). One important example is social media content. This includes a great deal of data about the individual persona (not only postings, but also personal profile-data and social networks), and about other personae associated with it.

There are, however, many collections of personal data whose accessibility and use individuals reasonably consider to be limited, but which corporations expropriate. For example, when consumers provide data to a telco, their reasonable expectation is that their consent is needed for it to be published in telephone subscriber listings, and that its use is constrained rather than being open to abuse for extraneous purposes. Even when they make mandated disclosures to, for example, an electoral commission, they expect access to and use of the resulting electoral roll to be constrained to the purpose for which the data was collected. Consumers lack the institutional power to prevent the massive abuses that electoral roll data is subject to. In relation to telephone directories, on the other hand, they can at least vote with their wallets, by paying for non-publication. Large proportions of the populations of many countries pay to keep their numbers unlisted (as it is called in the United States), ex-directory (in the United Kingdom), silent (in Australia), or private (in New Zealand and Canada). Few telcos publish statistics, but in various countries estimates of the proportion of the public that avoids publication range from 25% to >50%. Media reports suggest that two-thirds of UK numbers are ex-directory.

A great deal of personal data is acquired by organisations from other organisations (2B - Covert Secondary Collection). This may be by purchase, barter or other sharing, perhaps camouflaged by a term such as 'strategic partnership'. It is rare for the individual to be aware of such data transfers. Many such exchanges are conducted surreptitiously, variously because of their dubious legality and the risks of media coverage, public outrage, and responses by regulators and parliaments. An example that has gained recent media coverage is the abuse by Australian charities of the personal data of their donors, by passing the data on to other charities, inevitably gifting it to marketing services corporations in the process (Sutton 2017).

The diversity of data-streams that the digital surveillance economy draws on is enormous, and includes eCommerce and advertising transactions, search-terms (Zimmer 2008), media experiences (Cohen 1996, Rosenblatt 2015, Lynch 2017), social media contributions and accesses, and the increasingly 'quantified self' emanating from 'wellness devices' and other forms of 'wearables' (Lupton 2016). The many streams that have emerged since Web commerce began in the mid-1990s (Clarke 2002) are additional to the plethora of data-sets from pre-Internet expropriation rounds, such as the holdings of credit bureaux, mailing-list and database marketing operators, loyalty schemes, phonebook database operators, and electoral rolls (Christl & Spiekermann 2016, pp. 76-117). Many of these hold reasonably reliable socio-demographic data for vast numbers of people.

The technical means that have been developed to support the feeding-frenzy represented by activities (1A-2B) are highly diverse. In some cases, protocols and data-formats are semi-standardised, e.g. in the highly merchant-friendly and consumer-hostile HTML 5.0 specification (Lyne 2011). In other cases, they are proprietary subterfuges and spyware designs that would quite possibly be found to be technically illegal under the cybercrime provisions enacted in many countries following the Convention on Cybercrime (CoE 2001) - if cases could ever be successfully brought before courts. For an overview of data-streams in various contexts, see Christl & Spiekermann (2016, pp. 45-75). For recent reviews of the array of techniques for achieving associations of Web data-streams with individuals, see Mayer & Mitchell (2012) and Bujlow et al. (2015). For data on the extent and effectiveness of tracking using these techniques, see Englehardt & Narayanan (2017). For a recent review of the extent to which Android and iOS apps scatter data to large numbers of organisations, see Zang et al. (2015).

User organisations that pay, in some cases for copies of data, and in other cases for services based on data, are found across the length and breadth of the consumer marketing, consumer credit, insurance and healthcare industries. There are also uses within the employment life-cycle, from talent-spotting, through hiring, to loyalty-monitoring and firing. Government agencies utilise these sources, for example in relation to taxation and welfare payment administration. Political parties and candidates have recently become an additional, and in some countries, highly lucrative user segment, with inferences drawn about electors' preferences in order to target political advertising, customise messages, and manipulate voters' choices.

(2) Data Exploitation

Each organisation that acquires data by means of activities 1A-2B is in a position to combine it into a single record for each individual (3 - Digital Persona Consolidation). Commonly-used terms in the industry include 'a data-point' to refer to the pairing of a data-item with an identifier, and 'a consumer profile' to refer to the record as a whole. The term 'profile' has another established usage, which is relevant to activity (4), discussed in the next paragraph. This paper accordingly uses the term 'digital persona' to refer to the consolidated set of data that purports to be about a particular consumer. The consolidation process may be performed on the basis of some reasonably reliable identifier, such as a loginid, a stable IP-address, browser-fingerprints or GPS coordinates. In many cases, however, the data consolidation processes that allocate new data to a record rely on loose inferencing techniques. Where many streams are combined, the result is what Palmas (2011) refers to as 'panspectric surveillance'.
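
A minimal sketch of the consolidation activity, with all identifiers and data-items invented for illustration, is as follows: data-points from separate streams are keyed by whatever identifier each stream carries, an identity-resolution table (here hard-coded) stands in for the loose inferencing techniques mentioned above, and the result is a single digital persona per presumed individual.

```python
from collections import defaultdict

# Hypothetical data-points from separate streams, each an (identifier, item, value) triple.
data_points = [
    ('cookie:ab12', 'pages_viewed', 'running-shoes'),
    ('email:x@example.com', 'loyalty_tier', 'gold'),
    ('device:gps-77', 'home_suburb', 'Newtown'),
    ('cookie:ab12', 'search_term', 'marathon training plan'),
]

# Stand-in for identity resolution: maps each stream identifier to one presumed person.
identity_map = {
    'cookie:ab12': 'persona-0001',
    'email:x@example.com': 'persona-0001',
    'device:gps-77': 'persona-0001',
}

def consolidate(points, id_map):
    """Merge data-points into one digital persona per resolved identifier."""
    personae = defaultdict(dict)
    for identifier, item, value in points:
        persona_key = id_map.get(identifier, identifier)  # unresolved ids stay separate
        personae[persona_key][item] = value
    return dict(personae)

print(consolidate(data_points, identity_map))
# {'persona-0001': {'pages_viewed': 'running-shoes', 'loyalty_tier': 'gold', ...}}
```

The hard-coded mapping is the fragile step: where the inferencing is loose, data belonging to different people can be, and is, conflated into a single persona.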

The consolidated persona can be subjected to many forms of processing in order to draw inferences from it (4 - Analysis). Inferences may relate to the population, a sub-population, or an individual. One important form of analysis compares each digital persona with one or more categories that are usefully referred to as 'abstract consumer profiles' (Clarke 1993, EPIC 2003, Degli Esposti 2014, pp.215-219, Christl & Spiekermann 2016, pp.24-38, 84-86). Abstract profiles may be defined on a more or less ad hoc basis, or may apply insights from studies and experiments in relation to personality-types, attitudes, and long-term and short-term interests. In relation to personality, a range of psychographic classifications exists, including the 'Big Five' phenotypes of extraversion, neuroticism, agreeableness, conscientiousness, and openness (Goldberg 1990, Goldberg 1999, Gunter & Furnham 1992. See also Christl & Spiekermann 2016, pp. 11-41).
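
The comparison of a digital persona against abstract consumer profiles can be illustrated with a small scoring sketch: each profile is expressed as weightings over Big-Five-style traits, the persona carries inferred scores over the same traits, and the persona is assigned to the profile with which it is most similar. The trait values, profile names and the use of cosine similarity are assumptions made purely for illustration.

```python
import math

# Inferred trait scores for one digital persona (all values hypothetical, range 0..1).
persona_traits = {'extraversion': 0.8, 'neuroticism': 0.3, 'agreeableness': 0.6,
                  'conscientiousness': 0.4, 'openness': 0.9}

# Abstract consumer profiles expressed as weightings over the same traits.
abstract_profiles = {
    'impulsive-early-adopter': {'extraversion': 0.9, 'neuroticism': 0.4, 'agreeableness': 0.5,
                                'conscientiousness': 0.3, 'openness': 0.9},
    'cautious-saver':          {'extraversion': 0.3, 'neuroticism': 0.6, 'agreeableness': 0.7,
                                'conscientiousness': 0.9, 'openness': 0.4},
}

def cosine_similarity(a, b):
    """Cosine similarity over the traits that the two dictionaries share."""
    keys = a.keys() & b.keys()
    dot = sum(a[k] * b[k] for k in keys)
    norm = math.sqrt(sum(a[k] ** 2 for k in keys)) * math.sqrt(sum(b[k] ** 2 for k in keys))
    return dot / norm if norm else 0.0

best_profile = max(abstract_profiles,
                   key=lambda name: cosine_similarity(persona_traits, abstract_profiles[name]))
print(best_profile)  # -> 'impulsive-early-adopter'
```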

Corporations make decisions that affect individuals (5 - Individual Decision-Making). They do so on the basis of the digital personae that they have available to them, and what the corporation infers from them, including how those personae compare with abstract profiles. Many of the contexts in which corporations make decisions about people are very important to those people's interests and wellbeing, including applications for and administration of loans and insurance, in health care, and in employment.

One particular category of decision-making is critical to the digital surveillance economy. The data held about each consumer is used to select and/or customise advertisements such that they are targeted at a very precise category, possibly as small as a specific individual (6 - Advert Targeting). The computational process reflects the individual's demographics, stated and demonstrated preferences, attitudes and interests at that particular time (Dwyer 2009, Davis 2014). The effectiveness of the 'target demographic' notions applied in print, radio and television broadcasting is argued to pale into insignificance in comparison with narrowcasting to individual consumers whose digital personae match closely to carefully pre-tested abstract consumer profiles. For a description of the highly-developed, real-time process whereby ad space in consumers' browser-windows is auctioned off, see Trusov et al. (2016). In addition to the assertions of interested parties such as service-providers (e.g. NAI 2010), some empirical studies support the claims that behavioural ad targeting is highly effective (e.g. Yan et al. 2009), although some contrary evidence also exists (Lambrecht & Tucker 2013).
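
The real-time auctioning of advertising space can be caricatured as follows: each bidder values the impression according to how closely the consumer's digital persona matches its target profile, and the impression is sold to the highest bidder at the second-highest price, a common auction design. All bidders, valuations and match scores are invented, and the sketch makes no claim to reproduce the mechanisms described by Trusov et al. (2016).

```python
# Hypothetical bidders: (name, value per matched impression in cents, match score 0..1).
bidders = [
    ('shoe-retailer', 220, 0.93),   # this persona closely matches its target profile
    ('travel-site',   300, 0.40),
    ('loan-broker',   500, 0.15),
]

def run_auction(bidders):
    """Second-price auction: the highest bidder wins, and pays the second-highest bid."""
    bids = sorted(((value * match, name) for name, value, match in bidders), reverse=True)
    winner = bids[0][1]
    price_paid = round(bids[1][0], 1)  # the runner-up's bid sets the price
    return winner, price_paid

winner, price_paid = run_auction(bidders)
print(winner, price_paid)  # -> shoe-retailer 120.0
```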

Where the marketer is sufficiently aware of the individual consumer's socio-demographic attributes, prior behaviour and current interests, the message in the advertisement can be designed to be highly persuasive, and to have a decisive and predictable effect on the consumer's decisions and actions (7 - Behaviour Manipulation) (Packard 1957, 1964, Wells 1975, Adomavicius & Tuzhilin 2005, Degli Esposti 2014, pp.220-221). Calo (2014), building on Hanson & Kysar (1999), uses the term 'digital market manipulation' to refer to "nudging for profit ... [exploiting the fact] that individuals systematically behave in nonrational ways" (p.1001). The analysis is valuable, but the co-option of a term that has an established meaning risks unnecessary confusion. The manipulation is not of a market per se, but rather of one of the participants in a market.

The data held in each consumer's digital persona enables marketers to gauge the point at which buyer-resistance is likely to arise, so that they can pitch the offer just below it, in order to extract the maximum revenue from each consumer (8 - Micro-Pricing) (Shapiro & Varian 1999 pp. 7-50, Acquisti 2008, Christl & Spiekermann 2016, pp. 41-44). As a result, marketers have moved beyond pre-set pricing of their offerings - typically at a price that they estimate the market as a whole will bear - and instead set a price that each individual will bear. Some consumers may get a lower price, but most will get a higher one. So the technique allows marketers to extract far greater revenue from the market as a whole than would otherwise be the case (Shapiro & Varian 1999, p.32).
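
A minimal sketch of micro-pricing, with all attributes and coefficients invented for illustration: the seller estimates each consumer's willingness-to-pay from the digital persona, pitches the price just below that estimate, and never goes below cost. Real implementations would rest on far richer models, but the effect (higher prices for most, lower prices for a few) is the same in kind.

```python
BASE_PRICE = 100.0   # the old single, market-wide price, shown for comparison
UNIT_COST = 60.0     # the seller will not price below this floor

def estimated_willingness_to_pay(persona):
    """Crude linear estimate from (hypothetical) persona attributes, all on a 0..1 scale."""
    estimate = 80.0
    estimate += 40.0 * persona.get('income_band', 0.5)
    estimate += 25.0 * persona.get('urgency_signals', 0.0)    # e.g. repeat searches
    estimate -= 30.0 * persona.get('price_sensitivity', 0.5)  # e.g. coupon usage
    return estimate

def micro_price(persona, undercut=0.95):
    """Pitch the offer just below the estimated resistance point, subject to the cost floor."""
    return round(max(UNIT_COST, undercut * estimated_willingness_to_pay(persona)), 2)

affluent_and_keen = {'income_band': 0.9, 'urgency_signals': 0.8, 'price_sensitivity': 0.2}
bargain_hunter = {'income_band': 0.3, 'urgency_signals': 0.1, 'price_sensitivity': 0.9}

print(micro_price(affluent_and_keen))  # 123.5 - well above the old set price of 100.0
print(micro_price(bargain_hunter))     # 64.13 - a little above cost, below the old price
```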

The most successful exponent, and the primary driver of the emergence of the digital surveillance economy, has been Google (Clarke 2006, Vaidhyanathan 2011, Newman 2011, Zuboff 2015, Angwin 2016). However, multiple other large corporations have followed their own paths, which have begun in areas different from Google's origins in search-terms - Facebook in social media, Apple in music consumption, and Amazon in eCommerce transactions. Of the various acronyms in use for this group of supra-national organisations, the most apposite is arguably FANGS (Facebook, Apple, amazoN, Google, microSoft).

The industry structure through which personal data expropriation, exploitation and cross-leveraging is conducted is in continual flux, and its processes are largely obscured from public view. At a simplistic level, it comprises sources, consolidators, analysts, service-providers and user organisations. But many sub-classifications exist, many overlaps occur, and organisations vie to take others over, and to be taken over. Profiling agencies such as VisualDNA, EFL and Cambridge Analytica specialise in processing data-hoards to address specific interests. See Mayer & Mitchell (2012) and the succession of industry models offered by Brinker (2016).

These applications of digitalisation are being imposed on society, not offered as a choice. The activities that make up the digital surveillance economy are justified by business enterprises on the basis of 'marketing efficiency', the convenience that their products and services offer to people, and the empirical evidence of people being very easily attracted to adopt and use them. Adoption is achieved through bandwagon effects ('viral marketing'), network effects, and features that appeal to individuals' hedonism and that encourage the suspension of disbelief and the suppression of rational decision-making in individuals' own interests. This comes at a cost to individuals, and to societies, economies, and polities.


5. The Digital Surveillance Economy as Threat

Individuals gain from the digital surveillance economy in a variety of ways. Convenience and time-savings result from not needing to think about purchases, because attractive opportunities continually present themselves. There is entertainment value built into the 'customer experience', and perhaps a warm glow arising from the feeling that profit-making corporations understand them. A more formalised classification scheme was suggested by Wang & Fesenmaier (2002), comprising hedonic benefits (entertainment, enjoyment, amusement, fun), functional benefits (information, efficiency, convenience), social benefits (communication, relationship, involvement, trust) and psychological benefits (affiliation, belonging, identification).

The promotional literature for Web 2.0 during its foundation period 2005-2009 waxed lyrical about the benefits to consumers, e.g. "Web 2.0 ... has contributed to an unprecedented customer empowerment" (Constantinides & Fountain 2008, p.231). The analysis conducted in (Clarke 2008b), and extended in this paper, suggests, on the other hand, that the bargain has been a very poor one from the consumer's perspective.

Substantial literatures have developed on consumers' individual cost/benefit privacy calculus, summarised in Smith et al. (2011, pp. 1001-1002), and in relation to the economics of privacy, recently associated with Acquisti, whose most highly cited paper is Acquisti & Grossklags (2005). However, these research genres work within the business-facilitative 'fair information practices' framework that was first enunciated by Westin (1967) and Westin & Baker (1974), and the consumer-hostile economic philosophy of Posner (1977, 1981, 1984), as applied by Laudon (1993). These have sought to reduce privacy from the human right expressed in the International Covenant on Civil and Political Rights (ICCPR) to a merely economic right. The Westin/Posner/Laudon approach serves the desires of consumer marketing corporations, particularly those used to the freedoms accorded to corporations in the USA. Shifting the perception from human right to economic right benefits business, because individuals can then be justifiably inveigled into trading their privacy away. These research streams have advantaged marketers, because they are accepting of the very substantial information and power asymmetries, they assist in the articulation and extension of the many techniques used by marketers to manipulate consumer perceptions, and they thereby support the approach of dividing-and-conquering the consumer interest.

Building on the ideas in the preceding sections, a range of specific issues can be identified. The first of the following sub-sections focuses on aspects of digitalisation's impacts that are relevant to individuals, and the second on broader social and economic concerns.

5.1 Threats to the Individual

One negative impact on people is the stimulation of what is variously referred to as 'impulse buying' and 'compulsive purchasing' of items that are unnecessary and/or unduly expensive. This arises in response to targeted ads that succeed in their aim of 'pushing that particular consumer's buttons' (Beatty & Ferrell 1998, Wells et al. 2011). A few individuals may pay lower prices than they would under the old set-price approach, but most pay more, because the attractiveness of factors other than price (in particular the way in which the offer is presented, and the time at which it appears) enables the seller to extract premiums from many buyers (Newman 2014). Concerns about consumer manipulation have been mainstream since at least the mid-20th century (Packard 1960), but the concerns are greatly intensified by contemporary features including the presentation quality (high pixel-density colour screens), the customised nature of the sales pitch, the immediacy of the experience, and the ease with which the deal is done - 'see ad, press buy-button'. The value of this mechanism was attested to by the contests in the U.S., Europe and Australia over Amazon's single-click patent (e.g. Worstall 2011).

Further issues arise in relation to the way in which organisations make decisions about individuals, particularly in such contexts as financial, insurance and health care services. Discriminatory techniques are used in ways that harm individuals' access to services. Blacklisting of individuals (such as tenants) was originally based on prior transactions between each individual and one or more organisations. Redlining, and its electronic form, weblining, were originally techniques that denied services to individuals, or offered services only on punitive terms, on the basis of where they lived (Gandy 1993, Hernandez et al. 2001). Organisations are now able to conduct discriminatory practices on the basis of a far wider range of socio-demographic, behavioural and experiential data. Through analysis of the digital personae that an organisation has available, it can avoid conducting transactions with individuals who it deems to be unattractive customers (e.g. likely to be low-volume, low-margin or high-risk), or inflate the prices quoted to them (Lyon 2003, Newell & Marabelli 2015, Ebrahimi et al. 2016).

Discrimination that is based on reliable empirical evidence is of course justifiable in many circumstances, and hence the emphasis in the preceding paragraph was on equity issues. Another set of issues entirely arises where the reliability of the data is questionable. In practice, a considerable proportion of the voluminous data that is available is likely to be irrelevant, of low quality and/or of considerable sensitivity, yet quality controls are very limited and even non-existent (Horne et al. 2007, Hazen et al. 2014, Clarke 2016b). Another source of errors is the inevitably partial nature of each persona. The possibility also exists that the persona may be a composite, e.g. including some data that relates not to the individual but to some other member of the same household, or to some other person whose profile or transaction-derived data has been conflated with it. Moreover, some of the data may be intentional misinformation, and some may have been wrongly associated with that identifier.

Further, digitalisation extends beyond digitisation and quantification to the automation of decision-making, unmoderated by people. To a considerable degree, transparency has already been lost, partly as a result of digitalisation, and partly because of the application not of 20th century procedural approaches and algorithmic software development tools, but of later-generation software tools whose 'rationale' is obscure, or to which the notion of rationale is not even applicable (Clarke 1991, 2016b). The absence of transparency means that unreasonable and even simply wrong decisions cannot be detected, and correction, recourse and restitution are all-but impossible.

In a highly competitive market, refusal by one provider to do business with a consumer may not matter all that much. In the real world, however - especially in Internet contexts where network effects make late entry to the market very challenging - monopolies and oligopolies exist, and micro-monopolies abound, and it is in the interests of business enterprises to create and sustain them. Serious harm to consumers arises in such circumstances as where, for example, all lenders apply the same creditworthiness technique to the same data, and where all lenders consequently reject applications based on irrelevant or erroneous data, without transparency, and without any effective opportunity for clarification, challenge or rectification. The harm is likely to be even greater in the contexts of insurance and health care.

All users of electronic tools are subject to intensive surveillance. For city-dwellers, who enjoy superior connectivity, that surveillance is not only ubiquitous within the areas in which they live, work and are entertained, but also continual, and for some even continuous. The term 'chilling effect' appears to have originated in comments by a US judge in a freedom of expression case, Dombrowski v. Pfister, 380 U.S. 479 (1965). The essence of the concept is that intentional acts by one party have a strong deterrent effect on important, positive behaviours of some other party/ies (Schauer 1978). Since Foucault's 'virtual panopticon' notion (1975/77), the term 'chilling effect' has become widely used in the context of surveillance (Gandy 1993, Lyon 1994). Intensive surveillance chills behaviour, undermines personal sovereignty, and greatly reduces the scope for self-determination. The digital surveillance economy accordingly has impacts on the individual, in psychological, social and political terms.

An alternative formulation that has been proposed in the specific context of the digital surveillance economy uses the term 'psychic numbing'. This "inures people to the realities of being tracked, parsed, mined, and modified - or disposes them to rationalize the situation in resigned cynicism" (Hoofnagle et al., 2010). Google is confident that psychic numbing facilitates the new business model: " ... these digital assistants [that acquire data from consumers] will be so useful that everyone will want one, and the statements you read today about them will just seem quaint and old fashioned" (Varian 2014, p. 29, quoted in Zuboff 2015, p.84).

Much of the discussion to date on the risks of chilling and psychic numbing, and the resulting repression of behaviour, has related to actions by government agencies, particularly national security and law enforcement agencies, but also social welfare agencies in respect of their clients, and agencies generally in respect of their employees and contractors. However, the incidence of corporate oppression of employees, public interest advocates and individual shareholders appears to have been rising. It is open to speculation that increased social distance between corporations and the public, combined with the capacity of large corporations to ignore oversight agencies and even civil and criminal laws, may lead to much greater interference by organisations with the behaviour of individuals in the future.

In summary, the digital surveillance economy creates threats to individual consumers in the areas of compulsive buying, net higher prices, discriminatory decision-making by suppliers, decision-making by suppliers based on erroneous data, decisions by suppliers whose rationale is unclear and essentially uncontestable, a phalanx of suppliers that use much the same data and decision-criteria and hence make the same (sometimes discriminatory and erroneous) decisions, reduction in the scope for self-determination, psychic numbing, and ultimately passivity.

The degree of concern that individuals have about these issues varies greatly. Of the 12 activities in the model of the digital surveillance economy in Figure 1, at least 9 are conducted out of consumers' sight. The only three that may be apparent to consumers are overt direct data collection (activity 1A), individual decision-making (5), and targeted advertisements (6). It is very difficult for consumers to resist data collection, because web-forms approach a form of tyranny, in that most or even all items have to be filled in; most people, at least in western cultures, have a guilty conscience about falsification; and credible lies require considerable effort. Targeted ads, on the other hand, are not only unwanted by a large percentage of people, but are also capable of being avoided or suppressed (Turow et al. 2009).

A significant proportion of people evidence at least some level of concern, by seeking out and using means to obfuscate and falsify data and to mislead, corrupt and subvert corporations' data-holdings (Metzger 2007). However, given the surreptitiousness with which the digital surveillance economy has been constructed and is operated, it is unsurprising that many people have very little understanding of the nature of the digital surveillance economy. Some of those who do have at least a vague sense of the issues profess to have little or no concern about them, or argue that they can do nothing about it and hence there's no point in worrying about it. Some proportion of the uninformed and unconcerned do, however, suffer experiences that change their views. A meme that has gained traction in recent years has been associated with the word 'creepiness', to refer to the feeling that someone is watching you, and knows nearly as much about you as you do yourself (Ur et al. 2012, Shklovski et al. 2014).

5.2 Broader Impacts

In a healthy society, some kinds of behaviour (such as violence, and incitement to violence, against individuals, minorities and the public generally) need to be effectively chilled. But a healthy society also depends on other kinds of behaviours not being chilled. Creativity in scientific, technological and economic contexts is just as vitally dependent on freedom of speech and action as is creativity in the artistic, cultural, social and political realms (Johnson-Laird 1988).

Considerable concerns have been repeatedly expressed, over an extended period, about the capacity of surveillance to stultify economies, societies and polities (SSN 2014). The dramatic increase in surveillance intensity that arrived with the new century gives rise to much higher levels of concern: "the modern age - which began with such an unprecedented and promising outburst of human activity - may end in the deadliest, most sterile passivity history has ever known" (Arendt 1998, p. 322, quoted in Zuboff 2015, p.82).

Art naturally preceded the hard reality. Beginning with Herbert (1965) and Brunner (1975), and in cyberpunk sci-fi from Gibson (1984) to Stephenson (1992), hypercorps (the successors to transnational corporations) dominate organised economic activity, and have associated with them polite society, the majority of the net and a great deal of information. Outside the official levels of society skulk large numbers of people, in communities in which formal law and order have broken down, and tribal patterns have re-emerged. Officialdom has not been able to sustain the myth that it was in control; society has become ungovernable: "[cyberpunk authors] depict Orwellian accumulations of power ..., but nearly always clutched in the secretive hands of a wealthy or corporate elite" (Brin 1998). The term 'corpocracy' has emerged, to refer to an economic or political system controlled by corporations (Monks 2008).

During the current decade, it has become apparent that the threats inherent in the digital surveillance economy extend beyond the social and economic dimensions to the political. Election campaigns are increasingly driven by holdings of personal data on individual voters (Tufekci 2014, Bennett 2016). Candidates in recent US presidential elections have invested heavily in service-providers that specialise in behavioural profiling, with Cambridge Analytica gaining a great deal of media coverage. Although the degree of the technique's effectiveness is contested, there is little dispute that the volume of data available as a result of digital surveillance economy processes is enabling well-resourced candidates to have a material impact on electors' perceptions (Confessore & Hakim 2017). The scope exists for skilful application of online behavioural advertising to warp election results and political processes.

The pattern that this paper has described as the digital surveillance economy undermines a great deal more than just the interests of individual consumers. Many aspects of societies and polities are threatened by the shift of power from the public sector - which is at least somewhat subject to ethical constraints, and to transparency requirements and regulatory processes that have some degree of effectiveness - to a private sector that is largely free of all of them. What can researchers do to enable understanding of these phenomena and their future trajectories?


6. A Research Agenda

The previous sections have clarified the origins and nature of the digital surveillance economy, delineated its key features, and identified a wide range of ways in which it may negatively impact on individuals, societies and polities. Research is needed in order to deliver deeper understanding of those impacts. I contend that all such research is of relevance to the IS discipline, and that some proportion of that research is within the scope of the IS discipline itself. However, the quantum and diversity of research questions that need to be addressed are so great that the first contribution that is needed is the definition of a research agenda that encompasses the whole.

Although many articles in both the IS and other literatures use the term 'research agenda', few provide an explanation and citations to indicate what the authors mean by the term. Common features of these articles are descriptions of fundamental concepts and processes within the relevant domain, and an arrangement of key elements within a two- or three-dimensional matrix of ideas, often referred to as a 'framework'.

A highly-cited paper in the IS literature that uses the term, Wand & Weber (2002), refers to a research agenda as being "a framework to motivate research that addresses [a] fundamental question" (p.363) and "a framework to structure the way we might think about research on [the topic-area]" (p.364). For the strongly formalised topic-area that they were addressing, they proposed a framework comprising four elements: a grammar (constructs, and rules for combining the constructs into models); a method (procedures by which the grammar can be used); scripts (statements in the language generated by the grammar); and contextual factors.

Ahuja (2002) proposes "three basic research phases necessary to answer the conceptual, empirical and analytic questions" (p.30). This phased approach appears to be particularly relevant where the domain features emergent phenomena. Avgerou (2008) refers to a research agenda as providing structure to the themes and issues within the relevant domain. The only article that has been located that is directly relevant to the digital surveillance economy topic, and that uses the term 'research agenda', is Newell & Marabelli (2015). This adopts the interpretation that it is a means of structuring the issues and the various categories of tradeoffs that arise within the domain. However, those authors suggest the use of an 'ethical dilemma lens', which seems unlikely to gain traction with business enterprises, or even with regulatory agencies.

In this paper, the term 'research agenda' refers to the combination of the broad conceptual framework presented in s.4.1, the process model articulated in s.4.2, and the analysis of impacts and implications in s.5, all carried forward into the instrumentalist research work identified below. The further aspects presented in the following sub-sections comprise a research approach (s.6.1), a set of broad research questions that reflect alternative future paths that society might trace as a result of the explosive growth of the digital surveillance economy (s.6.2), and a research program whereby those broad questions can be further articulated, and addressed (s.6.3).

6.1 Research into Alternative Futures

In section 2, the research domain was characterised by two key and challenging features: emergent and unstable phenomena, and value-conflicts. Further, the research is not being conducted merely for the interest of theoreticians, and corporate executives are only a secondary audience. The primary purpose is to provide insight and guidance to practitioners who devise interventions in the real world. To meet these challenges, research techniques are needed that can accommodate those features and that are instrumentalist in nature.

Within the IS discipline, some guidance in relation to relevant research techniques is provided by Gray & Hovav (2008) and Majchrzak & Markus (2014). See also Niederman et al. (2016). Many futures research techniques have been formulated, summaries of which are provided by Martino (1993), Dreborg (1996), Bell (2004) and Glenn & Gordon (2009). Some techniques, such as environmental scanning, are primarily observational and lack the dynamic element necessary for the current purpose. Others are limited to forecasting based on historical trends, and hence are incapable of coping with the discontinuities that are inevitable in the context of unstable phenomena. This section considers five diverse approaches and techniques: critical theory research, visionary depiction, technology assessment, delphi studies and scenario analysis.

Critical theory research is not usually portrayed as a futurist technique. It does, however, directly address the value-conflict challenge, by recognising the effects of power and the tendency of some stakeholders' interests to dominate those of other stakeholders. It brings to light "the restrictive and alienating conditions of the status quo" and expressly sets out to "eliminate the causes of alienation and domination" (Myers 1997). "Typically critical studies put a particular object of study in a wider cultural, economic and political context, relating a focused phenomenon to sources of broader asymmetrical relations in society" (Alvesson & Deetz 2000, p.1). "Critical IS research ... seeks emancipation from unrecognised forms of domination and control enabled or supported by information systems" (Cecez-Kecmanovic 2005, p.19). Critical theory research is applied by a small minority of IS academics, and at this stage remains marginalised. The approach has relevance to the current purpose, but it does not necessarily concern itself with emergent and unstable phenomena, and it may be addressed to theorists rather than aiming to provide guidance to practitioners.

A second relevant category of techniques has been referred to as 'visionary depiction' (Kling & Lamb 1997). A description of an imagined future is commonly 'utopian' (strongly emphasising 'good' aspects), sometimes 'dystopian' (strongly emphasising 'bad' aspects), and occasionally balanced, ambivalent or intentionally ambiguous. Utopianism dates to Plato and Thomas More. Relevant examples include Vannevar Bush's 1945 vision of a machine that would provide convenient access to the world's literature, Toffler's works, especially 'The Third Wave' (1980), and Stonier (1983) on 'a wealth of information in the post-industrial economy'. Among dystopian visions, more relevant than Orwell's works are Dreyfus (1974), Weizenbaum's 'Computer Power and Human Reason' (1976) and Reinecke (1984). Such works focus on the harm that technology appears likely to do to social or environmental values. Whereas utopian visions tend to be either upbeat sales pitches or pollyanna-ish, dystopianism is usually at least partly polemical. A common feature of both forms is that they phrase speculations as though they were realistic or factual accounts.

Visionary depictions of a balanced nature are less exciting, and more challenging to compose. Various forms of business case development, cost/benefit assessment, and risk assessment offer a degree of discipline, but they tend to squeeze out the creativity that is a necessary element of what is essentially an exercise in imagination. Moreover, many applications of those techniques adopt the perspective of a single organisation. Technology Assessment (TA), on the other hand, is inherently multi-perspective. It is "a scientific, interactive and communicative process, which aims to contribute to the formation of public and political opinion on societal aspects of science and technology" (EPTA 2017). The scope is generally defined in terms of one or more technologies, viewed as interventions into a particular, existing physical, social and/or economic context. An early treatment of the methods of the US Office of Technology Assessment is in OTA (1977). See also Garcia (1991). The practice of TA demands considerable resources and draws on multiple disciplines and professions.

Among the few potentially relevant techniques that have been applied within the IS discipline is the delphi study. This involves a request to a group of people with assumed expertise in the relevant domain to respond to a series of propositions about the future. In second and perhaps further rounds, the group is asked to respond to their peers' previous thoughts (Okoli & Pawlowski 2004). The approach is capable of delivering some insight into opinions, but its orientation is much less to the development of insight, and much more to the achievement of consensus, whether or not a diversity of opinions exists. Its informational base may also be slim, not evidence-based, or diverse to the point of internal inconsistency.
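By way of illustration only, the consensus dynamic that the delphi technique tends to produce can be sketched in a few lines of Python. The panel ratings, the 1-7 scale and the 'move halfway towards the median' revision rule below are hypothetical assumptions, not drawn from Okoli & Pawlowski (2004) or any other cited source; the point is simply that the spread of opinion narrows round by round, whether or not genuine agreement exists.

    # Illustrative sketch only (hypothetical ratings and revision rule, not a cited method).
    # A panel rates one proposition on a 1-7 scale, sees the group median after each round,
    # and revises towards it; the shrinking spread illustrates the drive towards consensus.
    from statistics import median, pstdev

    ratings = [2, 3, 5, 6, 7]   # hypothetical first-round ratings of a single proposition
    for round_no in range(1, 4):
        print(f"Round {round_no}: ratings={ratings}, spread={pstdev(ratings):.2f}")
        centre = median(ratings)
        # each panellist moves halfway towards the group median before the next round
        ratings = [r + 0.5 * (centre - r) for r in ratings]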

A further approach to research into alternative futures is scenario analysis (Wack 1985, Schwartz 1991, van Notten et al. 2003, Postma & Leibl 2005, Bradfield et al. 2005, Clarke 2015). Since Herman Kahn's innovations in the mid-1960s, a scenario in the long-range planning context has been understood as "a detailed description of what the future might be like" (Starr 1971, p. 489), or "a trajectory of possible, logically connected future events" (p. 548). The description is expressed as a story-line that represents 'an imagined but realistic world'. The creation process for a scenario involves the identification of existing contextual factors, including those that have the capability to act as driving forces, the postulation of how those driving forces might plausibly behave, and the projection of paths that might be followed depending on the responses of influential participants (Schwartz 1991, pp. 141-151, Kosov & Gassner 2008). Critically, the technique involves the development of a set of story-lines, not just one. Each is based on a combination of a factual starting-point and known societal processes. The value derives from a combination of the depth of insights achieved within each scenario, together with the diversity among the different scenarios' paths of development. In Clarke (2015), research into the impacts of big data applications was characterised as being 'quasi-empirical' in nature, on the basis that the information deriving from the scenario analysis 'resembles', or is 'seemingly but not actually', representative of real-world phenomena.
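The structural elements of a scenario set described above - contextual driving forces, the plausible behaviours of each force, and the story-lines formed by combining them - can be made concrete in a short sketch. The Python sketch below is purely illustrative; the forces and behaviours named are invented examples and are not the scenarios proposed in this paper.

    # Illustrative sketch only: driving forces, plausible behaviours, and candidate story-lines.
    from dataclasses import dataclass
    from itertools import product

    @dataclass(frozen=True)
    class DrivingForce:
        name: str
        plausible_behaviours: tuple   # alternative ways the force might plausibly develop

    forces = [
        DrivingForce("regulatory response", ("remains meek", "becomes assertive")),
        DrivingForce("consumer behaviour", ("stays acquiescent", "turns to obfuscation")),
    ]

    # Each combination of behaviours is a candidate starting-point for one story-line;
    # analysts would then select a small, diverse subset and develop each in depth.
    for combination in product(*(f.plausible_behaviours for f in forces)):
        print(" / ".join(f"{f.name}: {b}" for f, b in zip(forces, combination)))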

Of the wide range of techniques available, scenario analysis offers the greatest prospect of surfacing feasible lines of development, supporting level-headed analysis, and delivering useful results. The following sections accordingly propose a research agenda built around a small set of scenarios.

Neither scenario analysis nor most of the other research techniques identified in this section is mainstream within the IS discipline. On the other hand, the mainstream IS research genres are empirical, and are severely limited in their ability to support the investigation of impacts and implications of changes in socio-technical systems that are emergent, and at least evolutionary if not revolutionary. They are in any case incapable of dealing with values and value-conflicts.

Some early signs exist suggesting that some futures techniques may gradually gain acceptance within the IS discipline. A proportion of the papers cited in this paper were published in IS journals, and panel session discussions have taken place at leading conferences. Moreover, the discipline has shown adaptive behaviour in the past, with interpretivism and design science becoming accepted as alternatives to the positivist approach. In addition, many theories and many research techniques have been adopted from other disciplines, and others have been formulated within the discipline itself.

For research approaches that are future-oriented and instrumentalist in nature to break through, they need to overcome the neutrality, scepticism and in some cases even hostility of the discipline's key institutions and gatekeepers. This initially requires painstaking and patient work by authors with journal editors and reviewers, and with conference program chairs and reviewers. It is likely that selection and promotion committee evaluation criteria will change even more slowly. This leads to a conclusion that in less stultifying circumstances would be counter-intuitive: entry-level and early-career researchers, even though they would appear likely to reject the old and be impatient to adopt the new, are less likely to drive this change than are established academics whose livelihoods are less dependent on conformity with current conventions.

6.2 Research Questions

Future developments might follow any of a range of alternative paths. Suggestions are made below for three potential scenarios that could be used as an organising framework for research questions that arise from the preceding analysis. The key principle followed in framing the set was the inclusion of a diversity of drivers and contexts, and therefore of paths and outcomes. The set has some similarities with a commonly-used pattern described by Schwartz (1991, p.19) as 'more of the same, but better', 'worse', and 'different but better'. A range of other alternatives could be usefully considered.

Scenario (1) 'The Trajectory Is Set'

In this scenario, the context would be dominated by net technologies, the weakening of the nation-state, and the self-interest, power and momentum of powerful corporations at the centre of the digital surveillance economy. Other forces would prove to be ineffectual, and the plausible outcome is that the new business model establishes itself as the primary determinant not only of economies but also of societies and polities.

Themes emerge from a well-developed scenario. The following are speculative questions that appear likely to require investigation:

This paper has shown that much of what is necessary for Scenario (1) to come to fruition is already in place. Research projects undertaken to address aspects of this scenario are likely to deliver results of relevance to the other scenarios as well.

Scenario (2) 'Socio-Economic Systems are Adaptive'

In an alternative contextual setting, the activities of the participants in the digital surveillance economy would beget an effective reaction from other social and economic institutions. Possible themes that might arise from this scenario include:

Ceglowski (2015) suggests that the foundations of the current digital surveillance economy are rotten, and that the current, bloated Internet advertising industry structure will implode. This would of course influence processes and industry structure. It seems unlikely, however, that such changes would resolve the issues that this paper identifies. The rewards arising from targeted advertising based on intensive data about consumers appear to be so large that industry re-structuring is more likely to alter the allocation of benefits among corporations than to cause fundamental change in the practices.

Another way in which adaptation may occur is through intervention by parliaments or government agencies. Some possible regulatory measures are identified in Christl & Spiekermann (2016, pp. 142-143). To date, however, parliaments and regulatory agencies have been so meek that the activities of consumer marketers have been subject to very little constraint. The US Federal Trade Commission is the most important regulatory agency in the world in this arena, because all of the leading exponents of the digital surveillance economy (the FANGS) are domiciled within its jurisdiction. A 'Do Not Track' initiative has been skilfully used by the industry for over a decade as an excuse for legislative inaction (EPIC 2010, 2015). Yet, even now, the FTC's policy is quite candidly stimulatory of business, and not at all protective of consumer interests (FTC 2009, 2016).

A further possible driving force in this scenario is the emergence of new forms of personal data markets, which support effective denial of consent, but encourage disclosures by consumers that are measured and compensated. A great many potential elements of consumer-friendly infrastructure have been piloted by the Privacy Enhancing Technologies (PETs) community, and a few have achieved some level of adoption. A concept that goes beyond the failed Platform for Privacy Preferences (P3P) scheme is outlined in Maguire et al. (2015), but it too fails to go beyond the technical aspects. A review of consumer-oriented social media is in Clarke (2014b). A more substantial proposal is in Spiekermann & Novotny (2015).

Scenario (3) 'The Revolution is Nigh'

Zuboff extracted the following implication from her 'surveillance capitalism' analysis: "Nothing short of a social revolt that revokes collective agreement to the practices associated with the dispossession of behavior will alter surveillance capitalism's claim to manifest data destiny ... New interventions [are] necessary ... The future of this narrative will depend upon the indignant scholars and journalists drawn to this frontier project, indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities, and indignant citizens who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, and freedom from uncertainty is no freedom" (Zuboff 2016).

The third suggested scenario would accordingly consider a contextual setting in which the public takes the matter into its own hands. It would appear likely to raise questions such as the following:

The scope for societal resistance to digital surveillance has been studied in the abstract (e.g. Martin et al. 2009). There has been discussion of the scope for, and merits of, the transparent society (Brin 1998), sousveillance (Mann et al. 2003) and equiveillance (Mann et al. 2006). Meanwhile, a wide variety of technological initiatives have been launched in support of resistance against corporate excess (Goldberg 2007, Shen & Pearson 2011). See also the proceedings of the Privacy Enhancing Technologies Symposia (PETS) and the Symposium On Usable Privacy and Security (SOUPS). Clarke (2016a) identifies the principles of obfuscation and falsification of data, messages, identities, locations and social networks. Schneier examines the battleground in greater detail, and offers a related set of principles: avoid, distort, block and break surveillance (Schneier 2015a, 2015b). Bösch et al. (2016) catalogue the opposing 'privacy dark strategies' deployed against consumers: maximise, publish, centralise, preserve, obscure, deny, violate, fake.
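To make the obfuscation principle concrete, the following sketch shows one simple form of it: genuine items are mixed with randomly chosen decoys so that an observer of the combined stream cannot reliably separate signal from noise. The queries and decoys are hypothetical, and the sketch is not a description of any of the tools or principles cited above.

    # Illustrative sketch only: obfuscation by interleaving genuine items with random decoys.
    import random

    genuine_queries = ["asthma inhaler price", "credit counselling near me"]
    decoy_pool = ["lawnmower reviews", "jazz festivals", "sourdough starter", "tide times"]

    def obfuscate(queries, decoys_per_item=2):
        """Mix each genuine query with randomly drawn decoys, then shuffle the stream."""
        stream = []
        for q in queries:
            stream.append(q)
            stream.extend(random.sample(decoy_pool, decoys_per_item))
        random.shuffle(stream)
        return stream

    print(obfuscate(genuine_queries))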

Even prior to the emergence of the digital surveillance economy c.2005, surveys detected that "24% of Internet users have provided a fake name or personal information" (Pew 2000). That Report's words to describe the consumers' behaviour were all pejorative: "fake", "lying", and "deception". This was consistent with the perception of consumer marketers that such behaviour was a threat to their revenue. A limited amount of research has been undertaken into the phenomenon that authors variously refer to as 'consumer fabrication of information' and 'consumer lying' (e.g. Horne et al. 2007). Those studies generally adopt the assumption that such behaviour is unethical, without consideration of the implications of it being a response to unethical behaviour by marketers. The fact that a large and growing proportion of the world's user devices blocks ads (PageFair 2017) suggests that awareness and activism among consumers may now be considerably higher than it was before Web 2.0 enabled the emergence of the digital surveillance economy.

6.3 A Research Program

The purpose of this paper has been to lay the foundations for a research program whereby concerns about the threats inherent in the digital surveillance economy can be investigated, and appropriate responses formulated. The term 'research program' is widely used, but seldom clearly defined. One of the foundation papers in the IS discipline did so, however, declaring that "a research program ... would explore the differing characteristics of [the object of study] ... [and] ... would attempt to explore each of the [relevant] variables in a systematic fashion" (Mason & Mitroff 1973, p.475). In a manner comparable to that of Mason & Mitroff in relation to MIS, the research program proposed here in relation to the digital surveillance economy presents a framework within which large numbers of disparate projects adopt disparate approaches to disparate research questions, utilising bodies of theory and techniques from disparate disciplines, and achieving a degree of cohesion through inter- and multi-disciplinary elements of the program.

The analysis in the preceding sections may appear somewhat bold, futuristic, and to a degree apocalyptic, and to extend across the boundaries of IS into the stamping-grounds of cognate disciplines. There is, however, some recognition within at least the German Wirtschaftsinformatik community of the need for a broadly-based research program - although in that case the authors' proposals are primarily for the benefit of business and only secondarily for individuals, and they seem to have given little consideration to society and polity: "research on Digital Life requires extensive multidisciplinary activities in mixed research teams with experts from, for instance, psychology, cognitive sciences, or engineering, and co-innovation in cooperation with start-ups, spin-offs, and SMEs" (Hess et al. 2014, p.248).

This paper has proposed scenario analysis as an appropriate means of integrating the findings from multiple disciplines and professions. However, scenario analysis depends on a clear description of the current state, an appreciation of environmental and strategic factors that may play a role in future developments, and bodies of theory that suggest particular participant actions and reactions that may occur under various contingencies. In order to provide the necessary springboard, existing expertise and theories need to be identified and applied.

A first step towards addressing research questions such as those identified in the previous section is to establish what existing research already tells us about the behaviours of individuals and societies under stress, and particularly under surveillance stress. That will enable the identification of gaps in knowledge, the formulation of more specific research questions to fill those gaps, and the conception and conduct of research to provide answers.

The program can utilise the theories, insights and research techniques of many different disciplines. Indicative examples of relevant research, which would feed into each of the scenarios outlined above, are as follows:


7. Conclusions

This paper has presented a model of the activities within the digital surveillance economy. It has shown how consumer marketing corporations no longer have relationships with their customers as the core of their operations, and have instead become dependent on data-intensive digital personae to represent consumers.

By considering the phenomena from the perspective of the individuals whose data is exploited, and taking into account the insights gained through a succession of conceptualisations of the changing world during the last half-century, threats inherent in this new order have been identified. The high degree of optimism of the early post-industrial society and information society phases was first tempered, and has since given way to pessimism, as surveillance society and the surveillance state have developed, the digital surveillance economy has emerged, and surveillance capitalism has been postulated.

During the last 15 years, the digital surveillance economy has come into existence, its impacts are measurable, and its implications can be at least speculated upon, while some aspects may now be sufficiently stable that preliminary empirical studies can be undertaken. Digitalisation of people harbours threats to individuals, to society, to economies and to polities. Whether those threats will give rise to the harm that doom-sayers suggest needs to be investigated. Such investigations need to be conducted sufficiently early that, to the extent that the pessimism is justified, countermeasures can be developed and implemented before the harm becomes irreversible.

Conventional, rigorous, empirical research is useless in the face of such a challenge. Research techniques need to be deployed that are future-oriented and instrumentalist. Researchers are needed who are not terrified by the prospect of embracing value-laden propositions within their research models. For progress to be made, hard questions need to be asked, and insights must be drawn from the findings and theories of multiple disciplines. This paper proposes a research agenda that enables the challenge to be taken up.


References

Acquisti A. (2008) 'Identity management, Privacy, and Price discrimination' IEEE Security & Privacy 6, 2 (March/April 2008) 18-22, at http://www.heinz.cmu.edu/~acquisti/papers/j2acq.pdf

Acquisti A. & Grossklags J. (2005) 'Privacy and Rationality in Individual Decision Making' IEEE Security & Privacy 3, 1 (January/February 2005) 24-30, at http://people.ischool.berkeley.edu/~jensg/research/paper/Acquisti_Grossklags05.pdf

Adomavicius G. & Tuzhilin A. (2005) 'Personalization Technologies: A Process-Oriented Perspective' Commun. ACM 48, 10 (October 2005) 83-90

Ahuja M.K. (2002) 'Women in the information technology profession: a literature review, synthesis and research agenda' European Journal of Information Systems 11 (2002) 20-34, at https://pdfs.semanticscholar.org/5c05/af60eb00cd9364643157398a3bdd1aee2b7f.pdf

Alvesson M. & Deetz S. (2000) 'Doing Critical Management Research' Sage, 2000

Andrejevic M. (2014) 'The Big Data Divide' International Journal of Communication 8 (2014), 1673-1689, at http://espace.library.uq.edu.au/view/UQ:348586/UQ348586_OA.pdf

Angwin J. (2016) 'Google has quietly dropped ban on personally identifiable Web tracking' ProPublica, 21 October 2016, at https://www.propublica.org/article/google-has-quietly-dropped-ban-on-personally-identifiable-web-tracking

Ansari S., Kohavi R., Mason L. & Zheng Z. (2000) 'Integrating E-Commerce and Data Mining: Architecture and Challenges' Proc. WEBKDD 2000, at https://arxiv.org/pdf/cs/0007026

Avgerou C. (2008) 'Information systems in developing countries: a critical research review' Journal of Information Technology 23 (2008) 133 - 146, at http://ai2-s2-pdfs.s3.amazonaws.com/c130/a316d37382ba89b05ffbaebfe85c0068e6ad.pdf

Balkin J.M. & Levinson S. (2006) 'The Processes of Constitutional Change: From Partisan Entrenchment to the National Surveillance State' Yale Law School Faculty Scholarship Series, Paper 231, at http://digitalcommons.law.yale.edu/fss_papers/231

Beatty S.E. & Ferrell M.E. (1998) 'Impulse buying: modeling its precursor' Journal of Retailing 74 (1998) 169-191

Bell D. (1973) 'The Coming of Post Industrial Society' Penguin, 1973

Bell W. (2004) 'Foundations of Futures Studies: Human Science for a New Era: Values' Transaction Publishers, 2004

Bennett C. (2016) 'Voter databases, micro-targeting, and data protection law: can political parties campaign in Europe as they do in North America?' International Data Privacy Law 6, 4 (November 2016) 261-275, at https://academic.oup.com/idpl/article/6/4/261/2567747/Voter-databases-micro-targeting-and-data

Bösch C., Erb B., Kargl F., Kopp H. & Pfattheicher S. (2016) 'Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns' Proc. Privacy Enhancing Technologies 4 (2016) 237-254, at http://www.degruyter.com/downloadpdf/j/popets.2016.2016.issue-4/popets-2016-0038/popets-2016-0038.xml

Bradfield R., Wright G., Burt G., Cairns G. & Van Der Heijden K. (2005) 'The origins and evolution of scenario techniques in long range business planning' Futures 37 (2005) 795-812, at http://www.academia.edu/download/46103896/The_origins_and_evolution_of_scenario_te20160531-12850-l6xpls.pdf

Brennen S. & Kreiss D. (2016) 'Digitalization and Digitization' International Encyclopedia of Communication Theory and Philosophy, October 2016, PrePrint at http://culturedigitally.org/2014/09/digitalization-and-digitization/

Brin D. (1998) 'The Transparent Society' Basic Books, 1998

Brinker S. (2016) 'Marketing Technology Landscape' Chief Marketing Technologist Blog, March 2016, at http://chiefmartec.com/2016/03/marketing-technology-landscape-supergraphic-2016/

Brunner J. (1975) 'The Shockwave Rider' Harper & Row, 1975

Bujlow T., Carela-Español V., Solé-Pareta J. & Barlet-Ros P. (2015) 'Web Tracking: Mechanisms, Implications, and Defenses' arxiv.org, 28 Jul 2015, at https://arxiv.org/abs/1507.07872

Calo R. (2014) 'Digital Market Manipulation' George Wash Law Rev 82 (2014) 995-1051, at http://www.gwlr.org/wp-content/uploads/2014/10/Calo_82_41.pdf

Cecez-Kecmanovic D. (2005) 'Basic assumptions of the critical research perspectives in information systems' Ch. 2 in Howcroft D. & Trauth E.M. (eds) (2005) 'Handbook of Critical Information Systems Research: Theory and Application', pp.19-27, Edward Elgar, 2005

Ceglowski M. (2015) 'The Advertising Bubble' Idlewords.com, 14 November 2015, at http://idlewords.com/2015/11/the_advertising_bubble.htm

Christl W. & Spiekermann S. (2016) 'Networks of Control: A Report on Corporate Surveillance, Digital Tracking, Big Data & Privacy' Facultas, Wien, 2016

Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, PrePrint at http://www.rogerclarke.com/DV/CACM88.html

Clarke R. (1991) 'A Contingency Approach to the Application Software Generations' Database 22, 3 (Summer 1991) 23 - 34, PrePrint at http://www.rogerclarke.com/SOS/SwareGenns.html

Clarke R. (1993) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance' Journal of Law and Information Science 4,2 (December 1993), PrePrint at http://www.rogerclarke.com/DV/PaperProfiling.html

Clarke R. (1994) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, PrePrint at http://www.rogerclarke.com/DV/DigPersona.html

Clarke R. (2001) 'If e-Business is Different Then So is Research in e-Business' Invited Plenary, Proc. IFIP TC8 Working Conference on E-Commerce/E-Business, Salzburg, 22-23 June 2001, published in Andersen K.V. et al. (eds.) 'Seeking Success in E-Business' Springer, New York, 2003, PrePrint at http://www.rogerclarke.com/EC/EBR0106.html

Clarke R. (2002) 'The Birth of Web Commerce' Working Paper, Xamax Consultancy Pty Ltd, October 2002, at http://www.rogerclarke.com/II/WCBirth.html

Clarke R. (2006) 'Google's Gauntlets' Computer Law & Security Report 22, 4 (July-August 2006) 287-297, Preprint at http://www.rogerclarke.com/II/Gurgle0604.html

Clarke R. (2008a) 'B2C Distrust Factors in the Prosumer Era' Invited Keynote, Proc. CollECTeR Iberoamerica, Madrid, June 2008, PrePrint at http://www.rogerclarke.com/EC/Collecter08.html

Clarke R. (2008b) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#, PrePrint at http://www.rogerclarke.com/EC/Web2C.html

Clarke R. (2009a) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, revised version at http://www.rogerclarke.com/ID/IdModel-1002.html

Clarke R. (2009b) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html

Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, PrePrint at http://www.rogerclarke.com/DV/RNSA07.html

Clarke R. (2014a) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182 - 207, PrePrint at http://www.rogerclarke.com/ID/DP12.html

Clarke R. (2014b) 'The Prospects for Consumer-Oriented Social Media' Proc. Bled eConference, June 2014, revised version published in Organizacija 47, 4 (2014) 219-230, PrePrint at http://www.rogerclarke.com/II/COSM-1402.html

Clarke R. (2015) 'Quasi-Empirical Scenario Analysis and Its Application to Big Data Quality' Proc. 28th Bled eConference, Slovenia, 7-10 June 2015, PrePrint at http://www.rogerclarke.com/EC/BDSA.html

Clarke R. (2016a) 'A Framework for Analysing Technology's Negative and Positive Impacts on Freedom and Privacy' Datenschutz und Datensicherheit 40, 1 (January 2016) 79-83, PrePrint at http://www.rogerclarke.com/DV/Biel15-DuD.html

Clarke R. (2016b) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at http://www.rogerclarke.com/EC/BDQAS.html

Clarke R. (2017) 'Vignettes of Corporate Privacy Disasters' Xamax Consultancy Pty Ltd, March 2017, at http://www.rogerclarke.com/DV/PrivCorp.html

CoE (2001) 'The Cybercrime Convention' Convention 185, Council of Europe, 2001, at https://www.coe.int/en/web/conventions/full-list/-/conventions/rms/0900001680081561

Cohen J.E. (1996) 'A right to read anonymously: A closer look at 'copyright management' in cyberspace' Conn. Law Review 28, 981 (1996) 1003-19, at http://scholarship.law.georgetown.edu/facpub/814/

Confessore N. & Hakim D. (2017) 'Data Firm Says "Secret Sauce" Aided Trump; Many Scoff' The New York Times, 6 March 2017, at https://www.nytimes.com/2017/03/06/us/politics/cambridge-analytica.html

Constantinides E. & Fountain S.J. (2008) 'Web 2.0: Conceptual foundations and marketing issues' Journal of Direct, Data and Digital Marketing Practice 9, 3 (2008) 231-244, at http://thinkspace.csu.edu.au/kaylaflegg/files/2014/09/4350098a-1311g82.pdf

Davies S. (1997) 'Time for a byte of privacy please' Index on Censorship 26, 6 (1997) 44-48

Davis B. (2014) 'A guide to the new power of Facebook advertising' Econsultancy, 29 May 2014, at https://econsultancy.com/blog/64924-a-guide-to-the-new-power-of-facebook-advertising/

Degli Esposti S. (2014) 'When big data meets dataveillance: The hidden side of analytics' Surveillance & Society 12, 2 (2014) 209-225, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/analytics/analytic

Dreborg K.H. (1996) 'Essence of Backcasting' Futures 28, 9 (1996) 813-828

Dreyfus H.L. (1974) 'What Computers Still Can't Do: A Critique of Artificial Reason' MIT Press, Boston Mass., 1974, 1992

Drucker P.F. (1968) 'The Age of Discontinuity' Pan Piper, 1968

Dunn E.S. (1967) 'The idea of a national data center and the issue of personal privacy' The American Statistician, 1967

Dwyer C. (2009) 'Behavioral Targeting: A Case Study of Consumer Tracking on Levis.com' Proc. AMCIS, 2009, p.460

Ebrahimi S., Ghasemaghaei M. & Hassanein K. (2016) 'Understanding the Role of Data Analytics in Driving Discriminatory Managerial Decisions' Proc. ICIS 2016

Englehardt S. & Narayanan A. (2017) 'Online tracking: A 1-million-site measurement and analysis' WebTAP Project, Princeton University, January 2017, at https://webtransparency.cs.princeton.edu/webcensus/

EPIC (2003) 'Privacy and Consumer Profiling' Electronic Privacy Information Center, 2003, at https://epic.org/privacy/profiling/

EPIC (2010) 'Do Not Track Legislation: Is Now the Right Time?' Electronic Privacy Information Center (EPIC), Statement to a Committee of the US Congress, December 2010, at https://epic.org/privacy/consumer/EPIC_Do_Not_Track_Statement_120910.pdf

EPIC (2015) 'Cross-Device Tracking' Electronic Privacy Information Center (EPIC), Statement to the Federal Trade Commission, December 2015, at https://epic.org/apa/comments/EPIC-FTC-Cross-Device-Tracking-Comments.pdf

EPTA (2017) 'What is Technology Assessment?' European Parliamentary Technology Assessment, 2017, at http://www.eptanetwork.org/about/what-is-ta

Flaherty D.H. (1988) 'The emergence of surveillance societies in the western world: Toward the year 2000' Government Information Quarterly 5, 4 (1988) 377-387

Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989

Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Peregrine, London, 1975, trans. 1977

Froomkin A.M. (1995) 'Anonymity and its Enmities' Journal of Online Law 1, 4 (1995), at https://papers.ssrn.com/sol3/Delivery.cfm/SSRN_ID2715621_code69470.pdf?abstractid=2715621&mirid=1

FTC (2009) 'Self-Regulatory Principles For Online Behavioral Advertising' Federal Trade Commission, February 2009, at https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-staff-report-self-regulatory-principles-online-behavioral-advertising/p085400behavadreport.pdf

FTC (2016) 'Keeping Up with the Online Advertising Industry' Federal Trade Commission, April 2016, at https://www.ftc.gov/news-events/blogs/business-blog/2016/04/keeping-online-advertising-industry

Fuchs C., Boersma K., Albrechtslund A. & Sandoval M. (2012) 'Internet and Surveillance ' Chapter 1, pp. 1-28 of Fuchs C. et al. (eds.) (2012) 'Internet and Surveillance: The Challenges of Web 2.0 and Social Media' Routledge, 2012

Funder A. (2003) 'Stasiland: Stories from behind the Berlin Wall' Granta, 2003

Gandy O.H. (1989) 'The Surveillance Society: Information Technology and Bureaucratic Social Control' Journal of Communication 39, 3 (September 1989) 61-76

Gandy O.H. (1993) 'The Panoptic Sort. Critical Studies in Communication and in the Cultural Industries' Westview, Boulder CO, 1993

Garcia L. (1991) 'The U.S. Office of Technology Assessment' Chapter in Berleur J. & Drumm J. (eds.) 'Information Technology Assessment' North-Holland, 1991, at pp.177-180

Gibson W. (1984) 'Neuromancer' Grafton/Collins, London, 1984

Glenn J.C. & Gordon T.J. (eds.) (2009) 'Futures Research Methodology' Version 3.0, The Millennium Project, 2009, from http://107.22.164.43/millennium/FRM-V3.html

Goldberg I. (2007) 'Privacy Enhancing Technologies for the Internet III: Ten Years Later' Chapter 1 of Acquisti A. et al. (eds.) 'Digital Privacy: Theory, Technologies, and Practices' Auerbach, 2007, at https://cs.uwaterloo.ca/~iang/pubs/pet3.pdf

Goldberg L.R. (1990) 'An alternative 'Description of personality'': The Big-Five structure' Journal of Personality and Social Psychology 59 (1990) 1216-1229

Goldberg L.R. (1999) 'A broad-bandwidth, public-domain, personality inventory measuring the lower-level facets of several Five-Factor models' Ch. 1 in Mervielde I. et al. (eds.) 'Personality Psychology in Europe', vol. 7, Tilburg Uni. Press, 1999, pp. 7-28, at http://projects.ori.org/lrg/PDFs_papers/A%20broad-bandwidth%20inventory.pdf

Graham S. & Wood D. (2003) 'Digitizing Surveillance: Categorization, Space, Inequality' Critical Social Policy 23, 2 (May 2003) 227-248

Gray P. & Hovav A. (2008) 'From Hindsight to Foresight: Applying Futures Research Techniques in Information Systems' Communications of the Association for Information Systems 22, 12 (2008), at http://aisel.aisnet.org/cais/vol22/iss1/12

Greenwald G. (2014) 'No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State' Metropolitan Books, 2014, at https://archive.org/download/pdfy-jczuhqxxCAsjKlM0/no%20place%20to%20hide%20(edward%20snowden%20and%20gleen%20greenwald)%20complete.pdf

Gunter B. & Furnham A. (1992) 'Consumer Profiles: An Introduction to Psychographics' Routledge, 1992

Hanson J.D. & Kysar D.A. (1999) 'Taking Behavioralism Seriously: The Problem of Market Manipulation' NYU Law Review 74 (1999) 630-747

Harris S. (2010) 'The Watchers: The Rise of America's Surveillance State' Penguin, 2010

Hassan N.R. (2014) 'Useful Products in Theorizing for Information Systems' Proc. 35th Intl Conf on Information Systems, December 2014, at https://pdfs.semanticscholar.org/7679/7111dd5ced079832437f08faada72ac00c52.pdf

Hazen B.T., Boone C.A., Ezell J.D. & Jones-Farmer L.A. (2014) 'Data Quality for Data Science, Predictive Analytics, and Big Data in Supply Chain Management: An Introduction to the Problem and Suggestions for Research and Applications' International Journal of Production Economics 154 (August 2014) 72-80, at https://www.researchgate.net/profile/Benjamin_Hazen/publication/261562559_Data_Quality_for_Data_Science_Predictive_Analytics_and_Big_Data_in_Supply_Chain_Management_An_Introduction_to_the_Problem_and_Suggestions_for_Research_and_Applications/links/0deec534b4af9ed874000000

Herbert F. (1965) 'Dune' Chilton Books, 1965

Hernandez G.A., Eddy K.J. & Muchmore J. (2001) 'Insurance Weblining and Unfair Discrimination in Cyberspace' SMU L. Rev. 54 (2001) 1953-1971, at http://scholar.smu.edu/smulr/vol54/iss4/6

Hess T., Legner C., Esswein W., Maaß W., Matt C., Österle H., Schlieter H., Richter P. & Zarnekow R. (2014) 'Digital Life as a Topic of Business and Information Systems Engineering?' Business & Information Systems Engineering 6, 4 (2014) 247-253

Hesse M.B. (1966) 'Models and Analogies in Science' University of Notre Dame Press, 1966

Hirst M. (2013) `Someone's looking at you: welcome to the surveillance economy' The Conversation, 26 July 2013, at http://theconversation.com/someones-looking-at-you-welcome-to-the-surveillance-economy-16357

Hoofnagle C.J., King J., Li S. & Turow J. (2010) 'How different are young adults from older adults when it comes to information privacy attitudes and policies?' SSRN, April 2010, at http://www.ssrn.com/abstract=1589864

Horne D.R., Norberg P.A. & Ekin A.C. (2007) 'Exploring consumer lying in information based exchanges' Journal of Consumer Marketing 24, 2 (2007) 90-99

Johnson-Laird P. N. (1988) 'Freedom and constraint in creativity' in R. J. Sternberg (ed.) 'The nature of creativity: contemporary psychological perspectives' Cambridge University Press, 1988, at http://mentalmodels.princeton.edu/papers/1988freedomandconstraint.pdf

Kelly K. (1998) 'New Rules for the New Economy' Penguin, 1998, at http://kk.org/books/KevinKelly-NewRules-withads.pdf

Kistermann F.W. (1991) 'The Invention and Development of the Hollerith Punched Card: In Commemoration of the 130th Anniversary of the Birth of Herman Hollerith and for the 100th Anniversary of Large Scale Data Processing' IEEE Annals of the History of Computing 13, 3 (July 1991) 245 - 259

Kling R. & Lamb R. (1997) 'Analyzing Visions of Electronic Publishing and Digital Libraries' In Newby G.B. & Peek R.P. (Eds.) 'Scholarly Publishing: The Electronic Frontier' MIT Press, Cambridge Mass., 1997

Kosov H. & Gassner R. (2008) 'Methods of Future and Scenario Analysis: Overview, Assessment, and Selection Criteria' German Development Institute, 2008, at https://www.die-gdi.de/uploads/media/Studies_39.2008.pdf

Lambrecht A. & Tucker C. (2013) 'When Does Retargeting Work? Information Specificity in Online Advertising' Journal of Marketing Research 50, 5 (October 2013) 561-76

Land F. (2000) 'The First Business Computer: A Case Study in User-Driven Innovation' J. Annals of the Hist. of Computing 22, 3 (July-September 2000) 16-26

Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt and Company, New York, 1992

Laudon K.C. (1993) 'Markets and Privacy' Proc. Int'l Conf. Inf. Sys., Orlando FL, Ass. for Computing Machinery, New York, 1993, pp. 65-75

Levine R., Locke C., Searls D. & Weinberger D. (2000) 'The Cluetrain Manifesto: The End of Business as Usual' Perseus, 2000, summary at http://www.cluetrain.com/

Lewis A. (2010) 'If you are not paying for it, you're not the customer; you're the product being sold' MetaFilter, August 2010, at http://www.metafilter.com/95152/Userdriven-discontent#3256046

Lupton D. (2016) 'The Quantified Self: A Sociology of Self-Tracking' Polity Press, 2016

Lycett M. (2014) 'Datafication: Making Sense of (Big) Data in a Complex World' European Journal of Information Systems 22, 4 (December 2014) 381-386, at http://v-scheiner.brunel.ac.uk/bitstream/2438/8110/2/Fulltext.pdf

Lynch C. (2017) 'The rise of reading analytics and the emerging calculus of reader privacy in the digital world' First Monday 22, 24 (3 April 2017), at http://firstmonday.org/ojs/index.php/fm/article/view/7414/6096

Lyne J. (2011) 'HTML5 and Security on the New Web' Sophos Security, December 2011, at http://www.sophos.com/en-us/medialibrary/PDFs/other/sophosHTML5andsecurity.pdf

Lyon D. (1994) 'The electronic eye: The rise of surveillance society' Polity Press, 1994

Lyon D. (2001) 'Surveillance society' Open University Press, 2001

Lyon D. (2003) 'Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination' Routledge, 2003

Machlup F. (1962) 'Production and Distribution of Knowledge in the United States' Princeton University Press, 1962

MacInnis D.J. (2011) 'A Framework for Conceptual Contributions in Marketing' Journal of Marketing 75, 4 (July 2011) 136-154

Maguire S., Friedberg J., Nguyen M.-H.C. & Haynes P. (2015) 'A metadata-based architecture for user-centered data accountability' Electronic Markets 25, 2 (June 2015) 155-160

Majchrzak A. & Markus M.L. (2014) 'Methods for Policy Research: Taking Socially Responsible Action' Sage, 2nd Edition, 2014

Mann S., Fung J. & Lo R. (2006) 'Cyborglogging with Camera Phones: Steps toward Equiveillance' Proc. 14th ACM Int'l Conf. on Multimedia, October 2006, pp. 177-80, at http://www.eyetap.org/papers/docs/art0905s-mann.pdf

Mann S., Nolan J. & Wellman B. (2003) 'Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments' Surveillance & Society 1(3): 331-355, at https://ojs.library.queensu.ca/index.php/surveillance-and-society/article/view/3344/3306

Martin A.K., Van Brakel R. & Bernhard D. (2009) 'Understanding resistance to digital surveillance: Towards a multi-disciplinary, multi-actor framework' Surveillance & Society 6, 3 (2009) 213-232, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/3282/3245

Martino J.P. (1993) 'Technological Forecasting for Decision Making' 3rd Edition, McGraw-Hill, 1993

Marx G.T. (1985) 'The Surveillance Society: The Threat Of 1984-Style Techniques' The Futurist, 1985

Mason R. & Mitroff I. (1973) 'A program for research on management information systems' Management science 19, 5 (1973) 475-487, at https://www.researchgate.net/profile/Richard_Mason3/publication/227443936_A_Program_For_Research_on_Management_Information_Systems/links/0fcfd51224c2f4f55c000000.pdf

Masuda Y. (1981) 'The Information Society as Post-Industrial Society' World Future Society, Bethesda, 1981

Mayer J.R. & Mitchell J.C. (2012) 'Third-Party Web Tracking: Policy and Technology ' Proc. IEEE Symposium on Security and Privacy, 2012, at http://ieeexplore.ieee.org/iel5/6233637/6234400/06234427.pdf

Metzger M.J. (2007) 'Communication Privacy Management in Electronic Commerce' Journal of Computer-Mediated Communication 12, 2 (January 2007) 335-361, at http://onlinelibrary.wiley.com/doi/10.1111/j.1083-6101.2007.00328.x/full

Miles M.B. & Huberman A.M. (1994) 'Qualitative Data Analysis: An Expanded Sourcebook' Sage 1994

Monks R.A.G. (2008) 'Corpocracy: How CEOs and the Business Roundtable Hijacked the World's Greatest Wealth Machine and How to Get it Back' Wiley, 2008

Montgomery A.L., Li S., Srinivasan K. & Liechty J.C. (2004) 'Modeling Online Browsing and Path Analysis Using Clickstream Data' Marketing Science 23, 4 (2004) 579-595, at https://www.researchgate.net/profile/Shibo_Li2/publication/227442351_Modeling_Online_Browsing_and_Path_Analysis_Using_Clickstream_Data/links/004635346a4cc11070000000.pdf

Myers M.D. (1997) 'Qualitative research in information systems' MISQ Discovery, June 1997, at http://www.academia.edu/download/11137785/qualitative%20research%20in%20information%20systems.pdf

NAI (2010) 'Study Finds Behaviorally-Targeted Ads More Than Twice As Valuable, Twice As Effective As Non-Targeted Online Ads' National Advertising Initiative, March 2010, at http://www.networkadvertising.org/pdfs/Beales_NAI_Study.pdf

Naisbitt J. (1982) 'Megatrends' London, Macdonald, 1982

Negroponte N. (1995) 'Being Digital' Hodder & Stoughton, 1995

Newell S. & Marabelli M. (2015) 'Strategic Opportunities (and Challenges) of Algorithmic Decision-Making: A Call for Action on the Long-Term Societal Effects of 'Datification'' The Journal of Strategic Information Systems 24, 1 (2015) 3-14, at http://marcomarabelli.com/Newell-Marabelli-JSIS-2015.pdf

Newman N. (2011) 'You're Not Google's Customer -- You're the Product: Antitrust in a Web 2.0 World' The Huffington Post, 29 March 2011, at http://www.huffingtonpost.com/nathan-newman/youre-not-googles-custome_b_841599.html

Newman N. (2014) 'The Costs of Lost Privacy: Consumer Harm and Rising Economic Inequality in the Age of Google' William Mitchell L. Rev. 40, 2 (2014) 849-889, at http://open.mitchellhamline.edu/wmlr/vol40/iss2/12

Niederman F, Clarke R., Applegate L., King J.L., Beck R. & Majchrzak A. (2016) 'IS Research and Policy: Notes from the 2015 ICIS Senior Scholar's Forum' Commun. Assoc. Infor. Syst. 40, 5 (2016) 82-92, at https://pure.itu.dk/portal/files/81829621/IS_Research_and_Policy_Notes_From_the_2015_ICIS_Senior_Scholar_s.pdf

van Notten P.W.F., Rotmans J., van Asselt M.B.A. & Rothman D.S. (2003) 'An Updated Scenario Typology' Futures 35 (2003) 423-443, at http://ejournal.narotama.ac.id/files/An%20updated%20scenario%20typology.pdf

Okoli C. & Pawlowski S.D. (2004) 'The Delphi Method as a Research Tool: An Example, Design Considerations and Applications' Information & Management 42, 1 (December 2004) 15-29, at http://spectrum.library.concordia.ca/976864/1/OkoliPawlowski2004DelphiPostprint.pdf

O'Reilly T. (2005) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly 30 September 2005, at http://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html

Osler M., Lund R. Kriegbaum M., Christensen U. & Andersen A.-M.N. (2006) 'Cohort Profile: The Metropolit 1953 Danish Male Birth Cohort' Int. J. Epidemiol. 35, 3 (June 2006) 541-545, at http://ije.oxfordjournals.org/content/35/3/541.long

OTA (1977) 'Technology Assessment in Business and Government' Office of Technology Assessment, NTIS order #PB-273164, January 1977, at http://www.princeton.edu/~ota/disk3/1977/7711_n.html

Packard V. (1957) 'The Hidden Persuaders' Penguin, 1957

Packard V. (1964) 'The Naked Society' McKay, 1964

PageFair (2017) 'The state of the blocked web - 2017 Global Adblock Report' PageFair, February 2017, at https://pagefair.com/downloads/2017/01/PageFair-2017-Adblock-Report.pdf

Palmas K. (2011) 'Predicting What You'll Do Tomorrow: Panspectric Surveillance and the Contemporary Corporation' Surveillance & Society 8, 3 (2011) 338-354, at http://ojs.library.queensu.ca/index.php/surveillance-and-society/article/download/4168/4170

Pew (2000) 'Trust and Privacy Online: Why Americans Want to Rewrite the Rules?' Pew Internet and American Life Project, August 2000, at: http://www.pewinternet.org/2000/08/20/trust-and-privacy-online/

Posner R.A. (1977) 'The Right of Privacy' 12 Georgia Law Review 393, at http://chicagounbound.uchicago.edu/cgi/viewcontent.cgi?article=2803&context=journal_articles

Posner R.A. (1981) 'The Economics of Privacy' American Economic Review 71, 2 (1981) 405-409

Posner R.A. (1984) 'An Economic Theory of Privacy' in Schoeman F. (ed.) 'Philosophical Dimensions of Privacy: An Anthology' Cambridge University Press, 1984, pp. 333-345

Postma T.J.B.M. & Leibl F. (2005) 'How to improve scenario analysis as a strategic management tool?' Technological Forecasting & Social Change 72 (2005) 161-173, at https://www.researchgate.net/profile/Theo_Postma/publication/222668211_How_to_improve_scenario_analysis_as_a_strategic_management_tool/links/09e41508117c39a2c7000000.pdf

Reinecke I. (1984) 'Electronic Illusions: A Skeptic's View of Our High-Tech Future' Penguin, New York, 1984

Rheingold H. (2002) 'Smart mobs: The next social revolution' Basic Books, 2002

Rosenblatt B. (2015) 'The myth of DRM-free music' Copyright and Technology (31 May 2015), at http://copyrightandtechnology.com/2015/05/31/the-myth-of-drm-free-music/

Roszak T. (1986) 'The Cult of Information' Pantheon, 1986

Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974

Schauer F. (1978) 'Fear, Risk and the First Amendment: Unraveling the Chilling Effect' Boston University Law Review 58 (1978) 685-732, at http://scholarship.law.wm.edu/facpubs/879

Schmidt E. & Cohen J. (2014) 'The New Digital Age: Reshaping the Future of People, Nations and Business' Knopf, 2013

Schneier B. (2015a) 'Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World' Norton, March 2015

Schneier B. (2015b) 'How to mess with surveillance' Slate, 2 March 2015, at http://www.slate.com/articles/technology/future_tense/2015/03/data_and_goliath_excerpt_the_best_ways_to_undermine_surveillance.html

Schwartz P. (1991) 'The Art of the Long View: Planning for the Future in an Uncertain World' Doubleday, 1991

Shapiro C. & Varian H.R. (1999) 'Information Rules: A Strategic Guide to the Network Economy' Harvard Business School Press, 1999

Shen Y. & Pearson S. (2011) 'Privacy enhancing technologies: a review' HP Laboratories, 2011, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.377.2136&rep=rep1&type=pdf

Shklovski I., Mainwaring S.D., Skúladóttir H.H. & Borgthorsson H. (2014) 'Leakiness and Creepiness in App Space: Perceptions of Privacy and Mobile App Use' Proc. CHI 2014, April 2014, at https://www.researchgate.net/profile/Irina_Shklovski/publication/265784619_Leakiness_and_Creepiness_in_App_Space_Perceptions_of_Privacy_and_Mobile_App_Use_Mobile/links/541b02540cf203f155ae72f7.pdf

Smith H.J., Dinev T. & Xu H. (2011) 'Information Privacy Research: An Interdisciplinary Review' MIS Quarterly 35, 4 (December 2011) 989-1015

Spiekermann S. & Novotny A. (2015) 'A Vision for Global Privacy Bridges: Technical and Legal Measures for International Data Markets' Computer Law & Security Review 31, 2 (2015) 181-200, at http://epub.wu.ac.at/5485/1/without_appendix__SpiekermannNovotny2015_CLSR-D-14-00096_Manuscript_20150112.pdf

SSN (2014) 'An introduction to the surveillance society', Surveillance Studies Network, 2014, at http://www.surveillance-studies.net/?page_id=119

Starr M.K. (1971) 'Management: A Modern Approach' Harcourt Brace Jovanovich, 1971

Stephenson N. (1992) 'Snow Crash' Bantam, 1992

Stonier T. (1983) 'The Wealth of Information: A Profile of the Post-Industrial Economy' Methuen, London, 1983

Sutton M. (2017) 'Charities 'creating privacy risks by collecting and distributing donor information' ABC News, 27 Jul 2017, at http://www.abc.net.au/news/2017-07-27/collection-of-personal-data-by-charities/8745186

Tapscott D. (1996) 'The Digital Economy: Promise and Peril in the Age of Networked Intelligence' McGraw-Hill, 1996

Toffler A. (1980) 'The Third Wave' Collins, London, 1980

Touraine A. (1971) 'The Post-Industrial Society: Tomorrow's Social History' Random House, 1971

Trusov M., Ma L. & Jamal Z. (2016) 'Crumbs of the Cookie: User Profiling in Customer-Base Analysis and Behavioral Targeting' Marketing Science 35, 3 (2016) 405-426

Tufekci Z. (2014) 'Engineering the public: Big data, surveillance and computational politics' First Monday 19, 7 (7 July 2014), at http://firstmonday.org/ojs/index.php/fm/article/view/4901/4097

Turow, J., King J., Hoofnagle C.J., Bleakley A. & Hennessy M. (2009) 'Americans Reject Tailored Advertising and Three Activities that Enable It' SSRN, September 2009, at https://ssrn.com/abstract=1478214

Ur B., Leon P.G., Cranor L.F., Shay R. & Wang Y. (2012) 'Smart, Useful, Scary, Creepy: Perceptions of Online Behavioral Advertising' Proc. 8th Symp. on Usable Privacy and Security, SOUPS '12, pp. 4:1-4:15, at https://pdfs.semanticscholar.org/db4c/502bafef23e7d8b1de60d628c952a4780acc.pdf

Vaidhyanathan S. (2011) 'The Googlization of Everything (And Why We Should Worry)' University of California Press, 2011

Varian H.R. (2014) 'Beyond Big Data' Business Economics 49, 1 (2014) 27-31

Wack P. (1985) 'Scenarios: Uncharted Waters Ahead' Harv. Bus. Rev. 63, 5 (September-October 1985) 73-89

Wand Y. & Weber R. (2002) 'Research Commentary: Information Systems and Conceptual Modeling-- A Research Agenda' Information Systems Research 13, 4 (December 2002) 363-376, at http://www.cs.northwestern.edu/~paritosh/papers/sketch-to-models/wand-weber-information-systems-and-conceptual-modeling-2002.pdf

Wang Y. & Fesenmaier D.R. (2002) 'Modeling Participation in an Online Travel Community' Journal of Travel Research 42, 3 (February 2004) 261-270

Wedel M. & Kannan P.K. (2016) 'Marketing Analytics for Data-Rich Environments' Journal of Marketing 80 (November 2016), 97-121, at https://www.rhsmith.umd.edu/files/Documents/Departments/Marketing/wedel-kannan-jm-2016-final.pdf

Weingarten F.W. (1988) 'Communications Technology: New Challenges to Privacy' J. Marshall L. Rev. 21, 4 (Summer 1988) 735

Weizenbaum J. (1976) 'Computer Power and Human Reason' Freeman, San Francisco, 1976

Wells J.D., Parboteeah V. & Valacich J.S. (2011) 'Online impulse buying: understanding the interplay between consumer impulsiveness and website quality' Journal of the Association for Information Systems 12 (2011) 32-56

Wells W.D. (1975) 'Psychographics: A Critical Review' Journal of Marketing Research 12, 2 (May, 1975) 196-213

Westin A.F. (1967) 'Privacy and Freedom' Atheneum 1967

Westin A.F. & Baker M.A. (1974) 'Databanks in a Free Society: Computers, Record-Keeping and Privacy' Quadrangle 1974

Worstall T. (2011) 'Amazon Loses 1-Click Patent' Forbes Magazine, 7 July 2011, at https://www.forbes.com/sites/timworstall/2011/07/07/amazon-loses-1-click-patent/#3577b51f1962

Yadav M.S. (2010) 'The Decline of Conceptual Articles and Implications for Knowledge Development' Journal of Marketing 74, 1 (January 2010) 1-19

Yan J., Liu N., Wang G., Zhang W., Jiang Y. & Chen Z. (2009) 'How Much can Behavioral Targeting Help Online Advertising?' Proc. WWW 2009, pp.261-270, at http://www2009.eprints.org/27/1/p261.pdf

Yoo Y., Henfridsson O. & Lyytinen K. (2010) 'The New Organizing Logic of Digital Innovation: An Agenda for Information Systems Research' Information Systems Research 21, 4 (December 2010) 724-735, at http://www.informatik.umu.se/digitalAssets/158/158547_yoo-et-al--2010_1-.pdf

Zang J., Dummit K., Graves J., Lisker P. & Sweeney L. (2015) 'Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps' Technology Science, 2015103001, 30 October 2015, at http://techscience.org/a/2015103001

Zimmer M. (2008) 'The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance' in Spink A. & Zimmer M. (eds.) 'Web Search: Multidisciplinary Perspectives' Springer, 2008, pp. 77-102, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.457.8268&rep=rep1&type=pdf#page=83

Zuboff S. (2015) 'Big other: Surveillance capitalism and the prospects of an information civilization' Journal of Information Technology 30 (2015) 75-89, at https://cryptome.org/2015/07/big-other.pdf

Zuboff S. (2016) 'The Secrets of Surveillance Capitalism' Frankfurter Allgemeine, 3 May 2016, at http://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html?printPagedArticle=true#pageIndex_2


Acknowledgements

My thanks for interactions and feedback on the topic to Prof. Robert Davison of the City University of Hong Kong, Prof. Markus Beckmann of the Friedrich-Alexander University Nürnberg, Liam Pomfret of the University of Queensland, Kayleen Manwaring of UNSW Law, and renowned sci-fi author and futurist David Brin.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.


