Working Paper Preliminary Version of 20 December 2016
Intended as a basis for submissions to
Bled (Feb'17) and an EM Special Issue (Mar'17)
Roger Clarke **
© Xamax Consultancy Pty Ltd, 2016
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://www.rogerclarke.com/DV/InDigR.html
The digitisation of a considerable amount of data about the world relevant to business has given rise to a new phase of 'digitalisation'. This involves a substantial shift in business models and industrial organisation, such that the interpretation and management of the world through human perception and cognition has been to a considerable extent replaced by processes that are almost entirely dependent on digital data. Digitalisation is being applied by business enterprises to many entities, including people. In addition to opportunities, this gives rise to threats to individuals, and risks to people, society and polity.
A review of the notions of information society, surveillance society and surveillance capitalism provides a basis for appreciating the nature of what is referred to here as 'the digital surveillance economy' - a new form of business model that was initiated by Google at the beginning of the 21st century. This model is predicated on the acquisition, expropriation and consolidation of very large volumes of personal data, and its exploitation in order to target advertisements, manipulate consumer behaviour, and price goods and services at the highest level that each individual is willing to bear.
In the digital surveillance economy, not only is the consumer converted from the customer to the product, but consumers' interests have almost no impact on the process, and are ignored. In the words of the model's architects, users are 'bribed' and 'induced' to make their data available at minimal cost to marketers. The industrial-era notion of a contract between producer and consumer is no longer applicable, and the essence of the industrial-era social contract is undermined.
The process of digitalisation of the individual, and the digital surveillance economy that this has given rise to, harbour great threats to the interests of individuals, and to the relationship between corporations, on the one hand, and society and polity on the other. The new economic wave may prove to be a tsunami that swamps the social dimension and washes away the last five centuries' individualism and humanism. Alternatively, institutions may achieve regulatory adaptation in order to overcome the worst of the negative impacts; or a breaking-point could be reached and consumers might rebel against corporate domination. A research framework is suggested, within which the alternative scenarios can be investigated.
During the last few decades, a succession of terms has been used to refer to the most impactful forms of information technology (IT). These have included strategic information systems, IT alignment, eCommerce, eBusiness, disruptive IT and digital transformation. Following on from these predecessor notions, the term 'digitalisation' has recently emerged as a focal point for research into IT-enabled and IT-driven change.
The work reported in this paper is motivated by the author's concern that the digitalisation movement not only offers promise, but also harbours threats that are very substantial but that are far from being understood and managed, and that are for the most part being overlooked.
The terms 'digitisation' and 'digitalisation' are frequently used very loosely, even in many sources relevant to the present analysis. In this paper, following the OED and Brennen & Kreiss (2016), the following distinction is drawn: 'digitisation' refers to the technical process of converting analogue data into digital form, whereas 'digitalisation' refers to the restructuring of social and business practices around digital technologies and digital data.
In a revolution that was largely completed by the end of the 20th century, almost all aspects of the handling of most forms of data have come to be digitised. This includes not only the conversion and creation of data, but also its (near-costless) replication, transmission, access, analysis and manipulation (Negroponte 1995). The digitisation of data, together with the extraordinarily low cost of handling it, had a great deal of impact on the accessibility of information, the economics of activities that handled information, and the structure of industry sectors in which the revenue streams of established players were undermined by digitisation, as more nimble competitors took advantage of greatly lowered cost-profiles to erode the dinosaurs' customer-base.
However, it took some time for a second round of more fundamental change to come about, and to be recognised. One term that has been applied to this is 'datafication'. Some authors merely use the term as a synonym for 'digitisation'. Its more useful application, however, is to refer to "the reliance of enterprises on data" (Lycett 2014). Citing Normann (2001), Lycett uses 'dematerialisation' to refer to the expression of digital data, 'liquifaction' to refer to the ease of manipulation and transmission of digital data, and 'density' as the "(re)combination of resources, mobilised for a particular context, at a given time and place".
The term 'digitalisation' now appears to be gaining greater prominence than 'datafication' as a means of describing a shift in business models and industrial organisation in sectors in which information performs a central role. It is, for example, the subject of active programmes within the EU and the OECD. In some respects, it is merely the latest in a long line of neologisms; but it will be argued in this paper that convergence among multiple threads of development is indeed leading to substantial change.
Digitalisation is occurring in business and government, but also in economies and societies more broadly. Inherent in digitalisation is the reduction of the attributes and behaviour of real-world entities to digital form, as data-items and data-streams associated with digital identifiers (Clarke 2009a). All categories of real-world entities can be represented in this way, including inanimate natural objects and artefacts; geo-physical, geological and biological processes that occur independently of people; economic and social processes that are created and managed by humankind through groups, communities, associations and societies, and by government agencies, corporations and not-for-profit organisations; and all kinds of living beings, vegetable and animal.
The focus of this paper is on digitalisation applied to people. This topic is currently attracting a great deal of attention. Most of the discussion is upbeat, and focussed on opportunities that are perceived to arise from streams of digital data about individuals, e.g.: "the digit[al]ization of private lives enables suppliers of digital technologies to form closer and stronger connections with their customers and to build services and devices that better match their expectations and improve their everyday lives" (EM Call for Papers, 2016). This paper considers, on the other hand, the threats that are inherent in digitalisation of individuals, the risks to people, society and polity that may arise from those threats, and the processes whereby those risks might be addressed.
There is to date a limited literature considering the risks arising from the digitalisation of individuals. The research reported here is accordingly preliminary, and designed to lay the foundations for a research programme in the area. The intention is to delineate the dimensions of a suitable framework within which more detailed work can be undertaken, including the articulation of relevant research questions.
The paper commences with a brief description of the challenges facing research of this nature, and an outline of the approach adopted. The nature and origins of the digitalisation concept are then discussed, including the technologies that are driving it. Diverse literatures are drawn on, in order to surface the threats that digitalisation of the individual embodies. These threats affect not only the individuals themselves, but also societies, economies and polities. A range of alternative future paths exists. Because the categorisation of phenomena remains fluid at this early stage in the new era, it is necessary to articulate broad research questions, identify a range of circumstances, typify them using vignettes, and work towards the construction of more detailed scenarios. A broad research programme is outlined, which would enable the research questions to be addressed.
In the research reported in this paper, the object of study is a contemporary shift in the patterns of behaviour of organisations. The purpose of the research is not merely to describe, explain or even predict, but rather to provide insight, and thereby guide decisions and actions by business enterprises and by policy-makers. These features of the research give rise to a number of challenges.
The conduct of empirical research is predicated on the assumption that the phenomena being observed exhibit a degree of stability. The rigour demanded of empirical research is undermined to the extent that the results cannot be reliably reproduced because the population from which samples are extracted exhibits continual changes of behaviour. This is inevitably the case with emergent phenomena, such as those that arise from the application of multiple new digital technologies. For a discussion of the difficulties that the dynamic nature of the eCommerce and eBusiness eras presented to researchers, see Clarke (2001).
A further challenge arising with future-oriented, instrumentalist research is that it must deal with a value-laden context, in which stakeholders will be differentially impacted by change, and in which it is commonly the case that Pareto optimality cannot be achieved and the interests of at least some stakeholders suffer harm. This necessarily draws the researcher beyond the merely descriptive, explanatory and predictive, and into the normative realm. However, if research is to deliver outputs relevant to business strategy and government policy, it is necessary to adopt research techniques that can cope with unstable phenomena and contested values, and seek such levels of rigour as are feasible in the circumstances.
The audience to whom the reports of this research are addressed must also be considered. To some extent, the outcomes are intended to influence decision-makers in large corporations, by highlighting the broader implications of current changes in business models. However, the work also contributes to public policy research, whose purpose is to support the development of policies, and the articulation of programmes, that are intended to achieve normative objectives expressed in all of social, economic and environmental terms, and that depend on effective cooperation among disparate organisations, usually across the public, private and often also the voluntary sectors. This duality of audience adds to the challenges of research conception, design, conduct and reporting.
Appropriate research techniques need to be identified. Conventional empirical research techniques are of limited applicability to research questions of the kind being addressed in this paper, because the objects of study are unstable, emergent and to some extent not yet observable in the field. Some guidance is provided by Gray & Hovav (2008) and Majchrzak & Markus (2014). See also Niederman et al. (2016) and the Wikipedia entry on futures techniques. In addition to existing techniques such as cost/benefit/risk analysis, environmental scanning, technology assessment, privacy impact assessment and Delphi rounds, some innovativeness may be needed, as with quasi-empirical scenario analysis (Clarke 2015), and even some outright inventiveness, as with instrumentalist futurism (Clarke 1997).
The approach adopted to the research reported in this paper was to conduct and present a review of the threads of development that make up the digitalisation of the individual, identify threats embedded in those threads, and outline a framework whereby the emergent phenomena can be researched, understood and managed. The research questions, rather than being declared at the outset, are an outcome of the analysis, and are presented at the end of the paper.
The analysis draws on many sources, including multiple strands of research previously conducted by the author. The self-citations provide access to a substantial set of additional references, above and beyond those in the Reference List to the present paper.
Digitalisation has been emergent for some time. This section briefly reviews the digitisation phase that laid the foundations for it and shaped it. It then traces the emergence of digitalisation of the individual since the end of the nineteenth century. The notions of Information Society and Surveillance Society are revisited, culminating in consideration of Zuboff's recent writings in relation to Digital Capitalism. It is concluded that an appropriate conceptualisation of the research domain being addressed in the current work is the Digital Surveillance Economy.
The first step in digitisation is the selection of attributes whose states or behaviour are to be represented in data. The expression of data is a more or less purposive activity, reflecting the intentions of the collector. The collector's perception of purpose results in some attributes of some entities and their actions being represented in the model, and the complete omission from the model of all other attributes of those entities, and all attributes of all other entities.
The creation of data may involve some form of sampling of the physical state of attributes, such as the use of a sensor to measure the height of water in a container, temperature, or particulate content in air. Alternatively, the creation of data may be a by-product of some actions using an artefact, such as the scanning of product codes and a payment card at a cash register. A further form taken by data creation is purposeful human actions that result in machine-readable data, including data capture via keyboard, but also by voice (where text-recognition software is used) and as images (where some form of data-extraction capability exists, such as optical character recognition for printed text, and so-called facial recognition technologies).
Data may be subject to various transformation operations, display to humans, input to software that draws inferences, copying, transmission, disclosure to other parties, and deletion. There are many quality aspects of data, and data-items, records and record-collections evidence very high variability of quality. For reviews in the context of 'big data', see Hazen et al. (2014) and Clarke (2016). Evidence in support of quality assertions may or may not exist. Such evidence is sometimes referred to as 'meta-data', indicating that its purpose is to store data about other data. Meta-data is subject to the same quality aspects as any other kind of data, but with some further attributes as well.
Digitisation began to affect individuals as long ago as 1890, when the Hollerith card was applied to the US Census, and the era of machine-readable data began (Kistermann 1991). Following the emergence of electronic computing on both sides of the Atlantic c. 1940, electronic data processing commenced in 1951 at the Lyons Tea Company in the U.K. (Land 2000). Within a decade, the opportunity was perceived for 'national data systems' to consolidate data about people, variously as a basis for sociological research (e.g. the Metropolit projects in Sweden and Denmark - Osler et al. 2006) and for social control (e.g. the National Data Center proposals in the USA - Dunn 1967).
By the 1980s, government agencies were conducting data matching programs. This involved the expropriation of personal data from two or more sources, thereby divorcing it from its original purposes and contexts, its merger, and the use of the combined records to identify individuals whose behaviour was deemed deserving of suspicion on the basis of the matched data (Clarke 1994b). Similar approaches were applied by corporations to consumers (Larsen 1992), and to discrimination among consumer segments on the basis of location, class and race (Gandy 1993). Another early form of digitalisation of individuals was profiling, which refers to the extraction of a set of characteristics of a particular category of person (Clarke 1993).
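The mechanics of a matching program of that era can be sketched in a few lines. The following is a hypothetical, minimal illustration only (the field names, datasets and tolerance threshold are invented for the example, and it is not a reconstruction of any actual agency's system): records from two sources are joined on a shared identifier, and individuals whose declared income in one dataset diverges from the figure in the other are flagged for investigation.

```python
# Minimal sketch of 1980s-style data matching: records expropriated from
# two sources are merged on a common identifier, and discrepancies are
# flagged as grounds for suspicion. All names and figures are hypothetical.

def match_records(tax_records, benefit_records, tolerance=0.10):
    """Join two datasets on 'person_id' and flag individuals whose
    declared incomes diverge by more than the given relative tolerance."""
    benefits_by_id = {r["person_id"]: r for r in benefit_records}
    flagged = []
    for tax in tax_records:
        benefit = benefits_by_id.get(tax["person_id"])
        if benefit is None:
            continue  # no counterpart record; nothing to compare
        declared_to_tax = tax["declared_income"]
        declared_to_welfare = benefit["declared_income"]
        # Relative discrepancy between the two declarations
        base = max(declared_to_tax, declared_to_welfare, 1)
        if abs(declared_to_tax - declared_to_welfare) / base > tolerance:
            flagged.append(tax["person_id"])
    return flagged

tax = [
    {"person_id": "A1", "declared_income": 52000},
    {"person_id": "B2", "declared_income": 18000},
]
welfare = [
    {"person_id": "A1", "declared_income": 20000},  # large discrepancy
    {"person_id": "B2", "declared_income": 18500},  # within tolerance
]
print(match_records(tax, welfare))  # → ['A1']
```

The sketch also makes visible the quality risks that motivated the critique in Clarke (1994b): the join assumes the identifier reliably denotes the same person in both datasets, and that both income figures are current and were collected against comparable definitions, none of which the matching process itself can verify.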
Over the course of a few decades c. 1980-2000, the basis for decisions about such matters as lending changed from face-to-face evaluation of the applicant, complemented by consideration of supporting documents, to a process based almost entirely on the data assembly that was thought to relate to the applicant, i.e. on the digital persona rather than the person (Clarke 1994a, 2014).
The Internet brought with it new potentials for corporations and then government agencies to unmask the individuals that they deal with. The original World Wide Web was structured in a manner inconvenient to business and government, because it contained little scope for organisations to extract data about the individuals communicating with the organisation's web-site. A review of early attempts to subvert the Web's original user-driven orientation is in Clarke (1999). However, as the dot.com investment era boomed, and then c. 2000-01 imploded, 'new rules' were emerging (Kelly 1998).
Web 2.0 was conceived as the means whereby the Web's architecture could be inverted into a marketer's data collection and consumer manipulation framework (O'Reilly 2005). The defining feature of Web 2.0 was originally claimed to be that businesses leverage user-generated content. The common aspect was identified a little differently in Clarke (2008b), as 'syndication':
Since then, the syndication of storage has ebbed, as consumers have been drawn into the cloud computing model, and have abandoned control of their data to service-providers (Clarke 2011). The other three aspects of Web 2.0 syndication, on the other hand, are foundational elements of consumer marketing as it is practised in the second half of the century's second decade.
Until c. 2010, there remained some optimism that consumer marketing would become less intrusive and manipulative, rather than increasing its arrogance and intensity. 'The Cluetrain Manifesto' "nailed 95 theses to the Web", arguing that the new context resulted in consumers gaining market power, and that 'markets are conversations' (Levine et al. 2000). Clarke (2006a, 2006d) discussed strategic approaches to consumer privacy. Clarke (2006c, 2008) investigated the scope for positive marketer-consumer communications. As will be presented shortly, however, the digital surveillance economy for which Web 2.0 laid the foundations has proven to be anything but a two-sided conversation.
It has been clear for many decades that secondary industry was being gradually overhauled by service industries as the engine-room of the economy and the major shaper of society. The term 'knowledge industries' was coined by Machlup (1962). Drucker (1968) took the idea further, writing that "knowledge, during the last few decades, has become the central capital, the cost centre and the crucial resource of the economy" (1968, p.9). However, his focus at that stage was on knowledge within large corporations, and he said little about society as a whole. Meanwhile, some influential authors remained as yet uncertain as to whether or not information was central to future developments. Toffler's 'Future Shock' (1970) considered information and knowledge only in relation to their capacity to overload and overwhelm the individual, and for a decade the common term for the emergent era remained 'the post-industrial society' (Touraine 1971, Bell 1976).
In 1981, however, Masuda depicted 'The Information Society as Post-Industrial Society', and the specialist journal The Information Society commenced publication. Even Toffler's 'The Third Wave' (1980) still contained little on the notion of information society (pp.187-189). It proposed that a new form of 'social memory' would arise, which would be "extensive and active" and therefore "propulsive". The 'Megatrends' of Naisbitt (1982), on the other hand, were based on the proposition that "we have ... changed to an economy based on the creation and distribution of information" (p.1). Beniger (1986) and Cawkwell (1987) further popularised the term. The tone of the large majority of the literature on information society in the 1980s was very upbeat, strongly focussed on the opportunities, and with very little consideration of what distributive effects it might have, what threats it might embody, or whether negative impacts might fall unevenly.
A striking aspect of the primary references in information society literature is that the notions of personal data, data surveillance and even privacy are barely mentioned, and in many of the books do not even appear in the index. On the other hand, a strong thread of privacy-related literature had been developing since the mid-1960s, and much of this was motivated by the increasing capabilities of organisations to gather, store, process and apply data about individuals (Miller 1969, Rosenberg 1969, Warner & Stone 1970, Wessell 1974, Flaherty 1984, Flaherty 1989).
Roszak (1986) observed that the purpose of 'computerised surveillance' was "to reduce people to statistical skeletons for rapid assessment" (p.186). He declared himself to be aghast at Toffler's misplaced enthusiasm for, and lack of scepticism about, the new form of 'social memory': "What we confront in the burgeoning surveillance machinery of our society is not a value-neutral technological process ... It is, rather, the social vision of the Utilitarian philosophers at last fully realized in the computer. It yields a world without shadows, secrets or mysteries, where everything has become a naked quantity" (pp.186-7). He provided a caricature of the quality problems of what was then referred to as 'data mining' and is now re-badged as 'big data analytics' and 'data science': "garbage in - gospel out" (p.120).
The earliest uses of the term 'surveillance society' in formal literatures appear to have been by Marx (1985), Flaherty (1988), Weingarten (1988) and Gandy (1989). The majority of the surveillance literature has been contributed by social scientists, and careful definitions of the term are in short supply. However, the sentiments are well summed-up by this early description: "Technology could be offering the tools for building a surveillance society in which everything people do and are as human beings is an open book to those who assert some 'right' to monitor them" (Weingarten 1988, p.747).
The literature has also suffered from the dominance of a single metaphor. The term 'surveillance' was adopted into English c. 1800, derived from the French word for watching, in particular of prisoners. Bentham's late 18th century notion of a (physical) 'panopticon' was co-opted by Foucault (1977) as a virtual notion. Foucault's popularity has been such that discussions of monitoring have been for several decades heavily impregnated with the implicit assumption that surveillance is first and foremost visual in nature, and only secondarily takes other forms. Highly-cited works, such as Lyon (1994) and Norris & Armstrong (1999), embodied that assumption. Despite criticism that "social construction of the surveillance notion has been seriously limited by being rooted in the visual" (Clarke 2000), the metaphor is still in vogue (e.g. Gilliom & Monahan 2012).
Surveillance is usefully defined as the systematic investigation or monitoring of the actions or communications of one or more persons. The purpose of surveillance is commonly to support management of the behaviour of individuals or populations. Surveillance may well deter particular behaviours; but it is a passive measure, and it is an inappropriate stretch of meaning to use the term to also encompass active measures to repress or constrain, or to enforce, particular behaviours.
The breadth of surveillance is far greater than physical observation, as was attested to by such contributions as Rule (1974) and Clarke (1988). In addition to visual surveillance, other forms comprise physical surveillance, physical surveillance at distance, auto-physical surveillance (by devices carried with, on or in the person), dataveillance, communications surveillance (of messages), and electronic surveillance (of space, traffic and movement) (Clarke 2010). For a framework for surveillance analysis, see Clarke (2009b).
The various usages of 'surveillance society' refer to a society in which pervasive monitoring of individuals' behaviour is routine. Justifications advanced by the proponents of surveillance initiatives are customised in order to invoke the 'ideas in good standing' of the particular time and place. Excuses have included combatting organised crime and money-laundering, conducting the 'war on drugs', fighting terrorism, protecting the public against violent crime, preventing social unrest, identifying publishers and users of child pornography, and defending copyright-dependent corporations. The notion of protecting national security is supremely vague and flexible, encompassing not only the many dimensions of national sovereignty interests, but also the protection of critical infrastructure (itself a highly extensible notion), the physical security of individuals deemed by the State to be very important, and the physical security of the general public. A common feature of surveillance initiatives claimed to protect such values is that they are not subjected to evaluation or transparency, nor to adequate controls, nor to audit, and that substantial doubts exist about their justification and proportionality, and the inclusion of appropriate safeguards and mitigation measures (APF 2013).
The media provides continual confirmation of the diversity and intensity of monitoring, and of the burgeoning industry that manufactures and operates surveillance technologies. Much of this is associated with Google and its successors in the Web advertising industry, initially because of its search engine, but also its massive Gmail archive, and its relentless tracking of consumer behaviour (Clarke 2006b, Zimmer 2008). Facebook has also successfully adopted the approach (Davis 2014).
Related to the notion 'surveillance society' is that of the 'surveillance state'. The scope of this phenomenon is not restricted to the old East Germany and developing and under-developed countries ruled by despotic regimes. The Snowden revelations confirmed what was already understood by surveillance-watchers, but that came as a surprise to segments of the media and the public that had wilfully closed their eyes to the evidence. Law enforcement and national security agencies, even in free-world countries, not only behave at the margins of legality, but frequently breach the law as they see fit. The USA and its collaborating allies, including the other members of the 'Five Eyes' group, have long had very substantial electronic surveillance networks in place, in defiance of national laws and the expectations of international diplomacy.
The relevance of the 'surveillance state' to 'surveillance society' and to business is that the trust that governments ought to earn from their citizens has been, and continues to be, undermined by serious misbehaviour; and the distrust in public sector institutions leads inevitably to distrust in corporations as well. One reason for this is the parallel and continual narrative of corporate malperformance (such as grossly inadequate data security, resulting in continual, avoidable data breaches) and of corporate misbehaviour, and in some cases outright lies, in surveillance as in other areas. Another factor is the widespread outsourcing of public sector surveillance to private sector providers, and the 'public-private partnerships' that blur the edges of the (often mandated) personal data collection practices of government agencies and the (mostly at least nominally consensual) personal data collection processes of corporations.
Further evidence of the nervousness in some parts of contemporary society is the recently-formulated notion of 'surveillance capitalism'. This is defined accessibly in Zuboff (2016), and more formally in Zuboff (2015), as a "new form of information capitalism [which] aims to predict and modify human behavior as a means to produce revenue and market control" (2015, p.75), and to do so at scale. Pioneered by Google since 2001, "this new market form has quickly developed into the default business model for most online companies and startups ..." (2015, p.81). More succinctly, Zuboff (2016) argues that "Google's profits derive from the unilateral surveillance and modification of human behavior", and, most brutally: "Capitalism has been hijacked by surveillance".
Extending somewhat beyond contemporary Google services, Zuboff depicts surveillance capitalism as "a ubiquitous networked institutional regime that records, modifies, and commodifies everyday experience from toasters to bodies, communication to thought, all with a view to establishing new pathways to monetization and profit [which] supplants the need for contracts, governance, and the dynamism of a market democracy" (pp.81-82).
Zuboff identifies the substantial change as occurring c. 2000, as Google (at that stage a company that had yet to turn a profit) sought a way to 'monetise' its dominance of the search-engine marketplace: "[Google's post-2000] advertising model ... depended upon the acquisition of user data as the raw material for proprietary analyses and algorithm production that could sell and target advertising through a unique auction model with ever more precision and success. As Google's revenues rapidly grew, they motivated ever more comprehensive data collection ... Google's business is the auction business, and its customers are advertisers" (Zuboff 2015, p.79).
"[Social media] data are acquired, datafied, abstracted, aggregated, analyzed, packaged, sold, further analyzed and sold again. These data flows have been labeled by technologists as 'data exhaust'" (Zuboff, 2015, p.79). Zuboff (2016) quotes the chief data scientist of another corporation as saying "The goal of everything we do is to change people's actual behavior at scale. When people use our app, we can capture their behaviors, identify good and bad behaviors, and develop ways to reward the good and punish the bad. We can test how actionable our cues are for them and how profitable for us".
"Surveillance capitalism preys on dependent populations who are neither its consumers nor its employees and are largely ignorant of its procedures" (Zuboff 2016). The distance between the consumers of services and the supplier "eliminate[s] the need for, or possibility of, feedback loops between the firm and its populations" (2015, p.80). So there is no possibility even of transparency, let alone recourse. Google deals with advertisers, and ignores the individuals who consume its services, and whose data it exploits.
In Zuboff's view, the social contract inherent in the industrial age has been undermined. So too has the individual contract between consumer and provider. According to Zuboff (2015, p.82), Varian argues that people agree to the invasion of privacy if they "get something they want in return ... a mortgage, medical advice, legal advice - or advice from your personal digital assistant" (Varian 2014, p.30). PEW (2014) quotes Varian as asserting that "Everyone will expect to be tracked and monitored, since the advantages, in terms of convenience, safety, and services, will be so great ... continuous monitoring will be the norm".
Zuboff strongly criticises this new approach, because, whereas industrial capitalism, typified by Henry Ford c. 1910, "helped constitute a broad middle class with steady income growth and a rising standard of living, Google and the 'big data' project represent a break with this past. Its populations are no longer necessary as the source of customers or employees. Advertisers are its customers along with other intermediaries who purchase its data analyses" (2015, p.80). This destroys the balance that enabled industrial-era society to gain sufficient benefits from industrial-era business models, because "deception-induced ignorance is no social contract" (p.86).
Moreover, the new model pays no respect to governmental regulation intended to achieve balance between corporate innovation and social needs. Zuboff (2015, p.80) argues that "Google has not been subject to any meaningful public oversight". Indeed, in a response to criticism in a leading German newspaper (Maier 2014), Google's Chair Eric Schmidt expressed his frustration with the prospect of public oversight, characterising it as 'heavy-handed regulation' and threatening that it would create 'serious economic dangers' for Europe (Schmidt, 2014).
The study reported here is more concerned with economic and social than with political matters, and its analysis is intended to be relevant to business executives and policy-makers, rather than to be broad-ranging and polemical. This paper accordingly treats 'surveillance capitalism' as an adjacent topic rather than adopting it as the frame of reference. Contemporary business models are adapting to the new highly-information-rich context, and hence the appropriate conceptualisation of the research domain for the present study utilises the word 'economy'.
The term 'surveillance economy' has emerged as a means of conveying how the 'direct marketing' mechanisms of two to five decades ago have developed into heavy reliance by corporations on the acquisition of detailed electronic dossiers of information on large numbers of individuals (Davies 1997, Hirst 2013). To date, the term 'surveillance economy' is only sparsely evident in the formal literature, although it has gained some currency in popular media. No reference was found that offered any deep analysis of the notion.
Surveillance that produces machine-readable data, in quantity, about more-or-less everybody, has changed the game. So a reasonable descriptor for that new game is 'the digital surveillance economy'. But what does that glib phrase mean, in terms amenable to sober analysis? As at December 2016, Google Scholar found a single, prior usage of the term 'digital surveillance economy'. That usage, published within the Media Studies discipline, is consistent with the interpretation adopted here: "I suggest that there is an emerging understanding on the part of users that the asymmetry and opacity of a 'big data divide' augurs an era of powerful but undetectable and un-anticipatable forms of data mining, contributing to their concern about potential downsides of the digital surveillance economy" (Andrejevic 2014, p.1678).
This section builds on the predecessor notions, identifies the key features of the digital surveillance economy, and lays a foundation for analysis of its implications and of the reactions against it that may arise.
A longstanding aphorism in media theory is 'If you're not paying, you're the product'. For an early usage in the Web context, see Lewis (2010). This notion stands at the heart of the new approach. The foundations for the new economy lie neither in classical nor neo-classical economics (which are relevant to scarce resources), but rather in information economics (which applies to abundant resources). The business school text-book that heralded the revolution was Shapiro & Varian (1999). Its significance is underlined by the fact that Hal Varian was recruited from Berkeley in 2002 to be Chief Economist at Google.
The Varian thesis is that customers' attention is the marketer's stock-in-trade. It is to be traded for profit, and cross-leveraged with business partners. Shapiro & Varian are completely driven by supply-side thinking, and assume that demand is not only disorganised and powerless, but also that it will remain so.
In their theory - and right now in the digital surveillance economy - the interests of consumers have almost no impact on the process, so they can be safely ignored. Relationship marketing, permission-based marketing, interactivity with and participation of consumers, the prosumer concept, customer needs, need satisfaction and consumer alliances are all overlooked. Consumer rights, privacy rights, and laws to protect them might as well not exist. Regulation is merely a barrier to be overcome, and to be replaced by the chimera of 'self-regulation'. The book lacks any vestige of the notions of community, society, equity, discrimination or individual wellbeing. Indeed, not a single one of the terms used in this paragraph even appears in Shapiro & Varian's index.
The techniques for acquiring personal data have become much more sophisticated during the nearly two decades since Shapiro & Varian wrote; but the principle was laid out very clearly: "[marketers'] strategy should be to bribe users to give them the appropriate demographics, which in turn can be passed onto advertisers ... [The game is about] inducing consumers to give [marketers] the information they want. ... we expect that many consumers will be happy to sell information about themselves for a nominal amount ..." (pp.35-36). On the surface, the authors appear to have been oblivious to the fact that, in most civilised countries, both 'bribing' and 'inducing' are criminal acts. Alternatively they regarded the criminal law as being merely an inconvenient regulatory hurdle that business enterprises needed to overcome.
Shapiro & Varian's work has been well-applied, and is evidenced in the following key features of the new forms of consumer marketing business:
The last of those elements is deserving of further attention, because Shapiro & Varian's segment on differential pricing (pp. 7-50) is looming as a particularly important element within the whole. The vast array of personal data that has become available to consumer marketers enables not merely the targeting of advertisements and the manipulation of behaviour through timing and shaping of offers. It enables marketers to move beyond pre-set pricing of their offerings - typically at a price the market as a whole will bear - and instead set a price that each individual will bear. Although some consumers may get a lower price, the technique allows marketers to extract far greater revenue from the market as a whole - just as Shapiro & Varian (1999) recommended: "How do you extract the most value ...? ... First, personalize or customize your product ... Second, establish pricing arrangements that capture as much of that value as possible" (p.32).
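The revenue logic of personalised pricing can be sketched in a few lines (the willingness-to-pay figures below are hypothetical, chosen purely for illustration, and are not drawn from Shapiro & Varian). A seller who knows each individual's reservation price can capture the whole of the consumer surplus that a single posted price would leave on the table:

```python
# Illustrative sketch: revenue under a single posted price versus
# perfect personalised pricing, given each consumer's willingness to pay.
# The figures are hypothetical, for illustration only.

willingness_to_pay = [12.0, 20.0, 35.0, 50.0, 80.0]

def uniform_revenue(wtp, price):
    """Revenue at one posted price: only those willing to pay that much buy."""
    return price * sum(1 for w in wtp if w >= price)

def best_uniform(wtp):
    """The revenue-maximising single price is always one of the WTP values."""
    return max(uniform_revenue(wtp, p) for p in wtp)

def personalised_revenue(wtp):
    """Perfect price discrimination: charge each buyer exactly their WTP."""
    return sum(wtp)

print(best_uniform(willingness_to_pay))          # 105.0 (price 35.0, 3 buyers)
print(personalised_revenue(willingness_to_pay))  # 197.0
```

In this toy market, personalised pricing extracts 197.0 against 105.0 under the best uniform price, even though the two lowest-value consumers pay less than the posted price would have been; this is the sense in which "some consumers may get a lower price" while the marketer's total take rises sharply.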
Further elements of digital surveillance economics are identified in Appendix 1. This distinguishes data management, commercial and technical aspects of the phenomenon, contexts in which monitoring is undertaken, and spheres of digital life that are impinged upon by contemporary business-at-consumer marketing mechanisms.
As discussed earlier, the digitalisation notion has broad applicability. Whatever the value of digitalisation in other contexts, its application to individuals is being imposed on society, not offered as a choice. Digitalisation features customisation of services to individuals' profiles, enabled by comprehensive digital surveillance of individuals' behaviour. Organisations justify their applications of digital technology on the basis of the convenience that their products and services offer people, and the empirical evidence of people being very easily attracted to adopt and use them. The digital surveillance economy features appeals to individuals' hedonism that are designed to encourage the suspension of disbelief and the suppression of rational decision-making in individuals' own interests.
Building on the ideas in the preceding sections, a range of issues can be identified. The first section focusses on those relevant to individuals, and the second on broader concerns.
Individuals gain from the digital surveillance economy in a hedonistic sense. This includes the convenience of not needing to think about purchases because attractive opportunities continually present themselves, the entertainment value built into the 'customer experience', and perhaps a warm glow arising from the feeling that profit-making corporations understand them.
A few individuals may pay lower prices than they would under the old set-price approach; but most will pay more because the attractiveness of factors other than price (in particular the way in which the offer is presented, and the time it appears) enable the seller to extract premiums from most buyers. A first negative element is therefore the very substantial degree of consumer manipulation inherent in the digital surveillance economy. These concerns have been mainstream since at least the mid-20th century (Packard 1960), but are greatly intensified during the digital era.
Further issues arise in relation to the way in which organisations make decisions about individuals, particularly in such contexts as financial services, insurance and health care services. Discriminatory techniques such as blacklisting and 'redlining' are used in ways that harm individuals' access. Redlining was originally based on residential address, but now draws on a far wider range of socio-demographic data, enabling organisations to avoid conducting transactions with individuals deemed, on the basis of their digital personae, to be unattractive customers, or to inflate the prices quoted to them.
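The mechanism can be illustrated with a deliberately simplified sketch (the attribute names, weights and thresholds are entirely hypothetical and do not describe any real organisation's scheme): a score built from socio-demographic proxies drives both outright refusal and price inflation:

```python
# Illustrative sketch of profile-based screening and surcharging.
# All fields, weights and thresholds are hypothetical.

BASE_PRICE = 100.0

def risk_score(profile):
    """Score a digital persona using crude socio-demographic proxies."""
    score = 0
    if profile.get("postcode") in {"2999", "3999"}:   # classic redlining
        score += 2
    if profile.get("employment") == "casual":
        score += 1
    if profile.get("credit_events", 0) > 0:
        score += 2
    return score

def quote(profile):
    """Decline high scorers outright; surcharge the marginal ones."""
    s = risk_score(profile)
    if s >= 4:
        return None                            # transaction refused
    return BASE_PRICE * (100 + 15 * s) / 100   # 15% surcharge per point

print(quote({"postcode": "2000", "employment": "permanent"}))  # 100.0
print(quote({"postcode": "2999", "employment": "casual"}))     # 145.0
print(quote({"postcode": "2999", "credit_events": 1}))         # None
```

The point of the sketch is the opacity: the second and third consumers have no way of knowing which proxy attributes triggered the surcharge or the refusal, which is precisely the transparency problem discussed below.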
A considerable proportion of the voluminous data that is available is likely to be irrelevant, of low quality and/or of considerable sensitivity. Further, digitalisation involves not only quantification but also automation of decision-making, unmoderated by people. To a considerable degree, transparency has already been lost, partly as a result of digitalisation, and partly because of the application not of 20th century procedural software development tools, but of later-generation software tools whose 'rationale' is obscure, or to which the notion of rationale doesn't even apply (Clarke 1991, 2016). The absence of transparency means that unreasonable and even downright wrong decisions cannot be detected, and recourse is all-but impossible.
In a highly competitive market, this may not matter all that much. In the real world, however, micro-monopolies abound, and it is in the interests of business enterprises to create and sustain them. Serious harm to consumers arises in such circumstances as where all lenders apply the same creditworthiness evaluation technique to the same data, and where an insurer rejects claims based on irrelevant or erroneous data without transparency, and hence without any effective opportunity for challenge.
The intensity of data-holdings in the private sector results in increasing attractiveness of access to those holdings by public sector organisations. Moreover, there are strong tendencies towards sharing between business and government. For example, outsourcing of what have long been regarded as public functions is rife; so-called 'public-private partnerships' blur the boundaries between the sectors; the mantra of 'open access' to government data is being used to dramatically weaken data safeguards; and during the last 15 years law enforcement and national security agencies have been granted demand powers unprecedented in free nations prior to 2001. The risks arising from the digital surveillance economy therefore extend well beyond corporate behaviour, to include government agencies generally.
The ubiquitous connectivity enjoyed by city-dwellers brings with it surveillance that is not only ubiquitous within their living space, but also continual, and for some even continuous. Intensive surveillance chills behaviour, undermines the feeling of personal sovereignty, and greatly reduces the scope for self-determination. The digital surveillance economy accordingly has impacts on the individual, in psychological, social and political terms: "psychic numbing ... inures people to the realities of being tracked, parsed, mined, and modified - or disposes them to rationalize the situation in resigned cynicism" (Hoofnagle et al., 2010). As Zuboff noted (2015, p.84), Varian and Google are confident that psychic numbing facilitates the new business model: " ... these digital assistants will be so useful that everyone will want one, and the statements you read today about them will just seem quaint and old fashioned" (Varian 2014, p. 29).
Many of the risks of chilling and oppression arise because of the likelihood of actions by government agencies. However, the incidence of corporate oppression of employees, customers, critics and individual shareholders has been rising. It is reasonable to expect that the increased social distance between corporations and the public, combined with the arrogance of large corporations in relation to oversight agencies and even to the civil and criminal laws, will see much greater interference by organisations with the behaviour of individuals in the future.
The degree of concern that individuals have about these issues varies greatly. A small proportion, but a significant number, of people are concerned and even alarmed, as a matter of principle. Some of them carry their concern into the practical world, by seeking out and using means to obfuscate and falsify data and to mislead, corrupt and subvert corporations' data-holdings. At the other extreme, a large proportion, and a very large number, of people have very little understanding of the nature of the digital surveillance economy; and some of those that have at least a vague sense of the issues profess to have little or no concern about them, or argue that they can do nothing about it and hence there's no point in worrying about it. Some proportion of the uninformed and unconcerned do, however, suffer experiences that change their views. A meme that has gained traction in recent times has been associated with the word 'creepiness' (e.g. Preston 2016).
A healthy society depends on the effective chilling of some kinds of behaviour (such as violence, and incitement to violence, against individuals, minorities and the public generally), but also on other kinds of behaviours not being chilled. Creativity in scientific, technological and economic contexts is just as vitally dependent on freedom of speech and action as is creativity in the artistic, cultural, social and political realms.
Considerable concerns have been repeatedly expressed, over an extended period, about the capacity of surveillance to stultify economies, societies and polities. The dramatic increase in surveillance intensity that arrived with the new century gives rise to much higher levels of concern: "Using surveillance to achieve one's aims, no matter how grand or how miniscule, bestows great power. ... Some interests will be served, while others will be marginalised. ... Surveillance coalesces in places where power accumulates, underpinning and enhancing the activities of those who rule and govern. The danger is that surveillance power becomes ubiquitous: embedded within systems, structures and the interests they represent. Its application becomes taken for granted and its consequences go un-noticed. ... it is important that this power ... is wielded fairly, responsibly, and with due respect to human rights, civil liberties and the law. ... Sifting through consumer records to create a profitable clientele means that certain groups obtain special treatment based on ability to pay whereas those deemed 'less valuable' fall by the wayside. ... If we are living in a society which relies on surveillance to get things done are we committing slow social suicide?" (SSN 2014).
More forcefully: "the modern age - which began with such an unprecedented and promising outburst of human activity - may end in the deadliest, most sterile passivity history has ever known" (Arendt 1998, p. 322, quoted in Zuboff 2015, p.82). The new business models will drive corporations to ignore and overwhelm governments and polities: "the waves of lawsuits breaking on the shores of the new surveillance fortress are unlikely to alter the behavior of surveillance capitalists. Were surveillance capitalists to abandon their contested practices according to the demands of aggrieved parties, the very logic of accumulation responsible for their rapid rise to immense wealth and historic concentrations of power would be undermined" (Zuboff 2015, p.86).
Art naturally preceded the hard reality. In cyberpunk sci-fi from Gibson (1984) to Stephenson (1992), hypercorps (the successors to transnational corporations) dominate organised economic activity, and have associated with them polite society (the successor to 'corporation man'), the majority of the net and a great deal of information. Outside the official levels of society skulk large numbers of people, in communities in which formal law and order have broken down, and tribal patterns have re-emerged. Officialdom has not been able to sustain the myth that it was in control; society has become ungovernable: "[cyberpunk authors] depict Orwellian accumulations of power ..., but nearly always clutched in the secretive hands of a wealthy or corporate elite" (Brin 1998).
The pattern that this paper has described as the digital surveillance economy "threatens the existential and political canon of the modern liberal order defined by principles of self-determination that have been centuries, even millennia, in the making. I am thinking of matters that include, but are not limited to, the sanctity of the individual and the ideals of social equality; the development of identity, autonomy, and moral reasoning; the integrity of contract, the freedom that accrues to the making and fulfilling of promises; norms and rules of collective agreement; the functions of market democracy; the political integrity of societies; and the future of democratic sovereignty" (Zuboff 2016).
A specific implication that Zuboff extracts from her 'surveillance capitalism' analysis is particularly important to the project presented in this paper: "Nothing short of a social revolt that revokes collective agreement to the practices associated with the dispossession of behavior will alter surveillance capitalism's claim to manifest data destiny ... New interventions [are] necessary ... The future of this narrative will depend upon the indignant scholars and journalists drawn to this frontier project, indignant elected officials and policy makers who understand that their authority originates in the foundational values of democratic communities, and indignant citizens who act in the knowledge that effectiveness without autonomy is not effective, dependency-induced compliance is no social contract, and freedom from uncertainty is no freedom" (Zuboff 2016).
The purpose of this paper has been to lay the foundations for a research programme whereby concerns about the risks inherent in digitalisation of the individual can be investigated, and appropriate responses formulated. This section makes two contributions. It first suggests a set of research questions that reflect alternative future paths that human society can trace following the explosive growth of the digital surveillance economy. The second section then suggests the shape that a research programme might take.
Future developments might follow any of a range of alternative paths. Scenario analysis is an appropriate approach to research into alternative futures (Wack 1985, Schwartz 1991, van Notten et al. 2003, Postma & Leibl 2005). Three potential scenario settings are used below as an organising framework for research questions that arise from the preceding analysis.
In this scenario, the presumption is made that net technologies combine with, firstly, the weakening of the nation-state and, secondly, the self-interest, power and momentum of the powerful corporations at the centre of the digital surveillance economy, such that the new business model establishes itself as the primary determinant not only of economies but also of societies and polities.
Key questions that need investigation are:
An alternative scenario investigates the possibility that the action arising from the digital surveillance economy begets an effective reaction:
The third scenario considers the possibility that the public takes the matter into its own hands:
The analysis in the preceding sections may appear somewhat bold, futuristic and in part apocalyptic. There is, however, some recognition within at least the German Information Systems community of the need for a broadly-based research programme - although the authors' proposals are primarily for the benefit of business and only secondarily for individuals, and they seem to have given little consideration to society and polity: "research on Digital Life requires extensive multidisciplinary activities in mixed research teams with experts from, for instance, psychology, cognitive sciences, or engineering, and co-innovation in cooperation with start-ups, spin-offs, and SMEs" (Hess et al. 2014, p.248).
Scenario analysis was suggested above as an appropriate means of integrating the findings from multiple disciplines and professions. However, scenario analysis depends on a clear description of the current state, an appreciation of environmental and strategic factors that may play a role in future developments, and bodies of theory that suggest particular actions and reactions that may occur under the various contingencies. In order to provide the necessary springboard, existing expertise and theories need to be identified and applied.
A first step towards addressing Research Questions such as those identified in the previous section is to establish what existing research already tells us about the behaviours of individuals and societies under stress, and particularly under surveillance stress. This will enable the identification of gaps in knowledge, the formulation of more specific research questions to fill those gaps, and the conception and conduct of research to provide answers.
The programme can utilise the theories, insights and research techniques of many different disciplines. Some examples of the kinds of research that may be relevant are as follows:
This paper has identified the risks inherent in the digitalisation of the individual. In order to do this, it was necessary to review the insights gained through a succession of conceptualisations of the changing world that the march of information technologies has brought about during the last half-century. The high degree of optimism in the early 'post-industrial society' and 'information society' phases has been tempered, and has become more pessimistic, as 'surveillance society' has developed, the 'digital surveillance economy' has emerged, and 'surveillance capitalism' has been postulated.
During the last 15 years, the digital surveillance economy has come into existence, its impacts are measurable, and its implications can be at least speculated upon. It harbours threats to individuals, to society, to economies and to polities. Whether those threats have the impact that doom-sayers suggest needs to be investigated. And it needs to be investigated sufficiently early that, to the extent that the pessimism is justified, countermeasures can be developed and implemented before the harm becomes irreversible.
Conventional, rigorous, empirical research is useless in the face of such a challenge. Research techniques need to be deployed that are future-oriented and instrumentalist. Researchers are needed who are not terrified by the prospect of embracing normative propositions within their research models. For progress to be made, hard questions need to be asked, and insights must be drawn from the findings and theories of multiple disciplines. This paper suggests a research agenda that enables the challenge to be taken up.
Andrejevic M. (2014) 'The Big Data Divide' International Journal of Communication 8 (2014), 1673-1689, at http://espace.library.uq.edu.au/view/UQ:348586/UQ348586_OA.pdf
APF (2013) 'Meta-Principles for Privacy Protection' Australian Privacy Foundation, 2013, at http://www.privacy.org.au/Papers/PS-MetaP.html
Bell D. (1973) 'The Coming of Post Industrial Society' Penguin, 1973
Beniger J.R. (1986) 'The Control Revolution: Technological and Economic Origins of the Information Society' Harvard Uni. Press, Cambridge MA, 1986
Brennen S. & Kreiss D. (2016) 'Digitalization and Digitization' International Encyclopedia of Communication Theory and Philosophy, October 2016, PrePrint at http://culturedigitally.org/2014/09/digitalization-and-digitization/
Brin D. (1998) 'The Transparent Society' Basic Books, 1998
Cawkwell A.E. (Ed.) (1987) 'Evolution of an Information Society' Aslib, London, 1987
Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, PrePrint at http://www.rogerclarke.com/DV/CACM88.html
Clarke R. (1991) 'A Contingency Approach to the Application Software Generations' Database 22, 3 (Summer 1991) 23 - 34, PrePrint at http://www.rogerclarke.com/SOS/SwareGenns.html
Clarke R. (1993) 'Profiling: A Hidden Challenge to the Regulation of Data Surveillance' Journal of Law and Information Science 4,2 (December 1993), PrePrint at http://www.rogerclarke.com/DV/PaperProfiling.html
Clarke R. (1994a) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994) 77-92, PrePrint at http://www.rogerclarke.com/DV/DigPersona.html
Clarke R. (1994b) 'Dataveillance by Governments: The Technique of Computer Matching' Information Technology & People 7,2 (December 1994) 46-85, PrePrint at http://www.rogerclarke.com/DV/MatchIntro.html
Clarke R. (1997) 'Instrumentalist Futurism: A Tool for Examining I.T. Impacts and Implications' Working Paper, Xamax Consultancy Pty Ltd, October 1997, at http://www.rogerclarke.com/DV/InstFut.html
Clarke R. (1999) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report' Proc. 12th Int'l Bled Electronic Commerce Conf., June 1999, PrePrint at http://www.rogerclarke.com/EC/WillPay.html
Clarke R. (2000) 'Technologies of Mass Observation' Mass Observation Movement Forum, Melbourne, October 2000, at http://www.rogerclarke.com/DV/MassObsT.html
Clarke R. (2001) 'If e-Business is Different Then So is Research in e-Business' Invited Plenary, Proc. IFIP TC8 Working Conference on E-Commerce/E-Business, Salzburg, 22-23 June 2001, published in Andersen K.V. et al. (eds.) 'Seeking Success in E-Business' Springer, New York, 2003, PrePrint at http://www.rogerclarke.com/EC/EBR0106.html
Clarke R. (2006b) 'Google's Gauntlets' Computer Law & Security Report 22, 4 (July-August 2006) 287-297, Preprint at http://www.rogerclarke.com/II/Gurgle0604.html
Clarke R. (2006c) 'A Major Impediment to B2C Success is ... the Concept 'B2C'' Invited Keynote, ICEC'06, Fredericton NB, Canada, August 2006, PrePrint at http://www.rogerclarke.com/EC/ICEC06.html
Clarke R. (2006d) 'Make Privacy a Strategic Factor - The Why and the How' Cutter IT Journal 19, 11 (October 2006), PrePrint at http://www.rogerclarke.com/DV/APBD-0609.html
Clarke R. (2008a) 'B2C Distrust Factors in the Prosumer Era' Invited Keynote, Proc. CollECTeR Iberoamerica, Madrid, June 2008, PrePrint at http://www.rogerclarke.com/EC/Collecter08.html
Clarke R. (2008b) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#, PrePrint at http://www.rogerclarke.com/EC/Web2C.html
Clarke R. (2009a) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, revised version at http://www.rogerclarke.com/ID/IdModel-1002.html
Clarke R. (2009b) 'A Framework for Surveillance Analysis' Xamax Consultancy Pty Ltd, August 2009, at http://www.rogerclarke.com/DV/FSA.html
Clarke R. (2010) 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25, PrePrint at http://www.rogerclarke.com/DV/RNSA07.html
Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, PrePrint at http://www.rogerclarke.com/EC/CCC.html
Clarke R. (2014) 'Promise Unfulfilled: The Digital Persona Concept, Two Decades Later' Information Technology & People 27, 2 (Jun 2014) 182 - 207, PrePrint at http://www.rogerclarke.com/ID/DP12.html
Clarke R. (2015) 'Quasi-Empirical Scenario Analysis and Its Application to Big Data Quality' Proc. 28th Bled eConference, Slovenia, 7-10 June 2015, PrePrint at http://www.rogerclarke.com/EC/BDSA.html
Clarke R. (2016) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at http://www.rogerclarke.com/EC/BDQAS.html
Davies S. (1997) 'Time for a byte of privacy please' Index on Censorship 26, 6 (1997) 44-48
Davis B. (2014) 'A guide to the new power of Facebook advertising' Econsultancy, 29 May 2014, at https://econsultancy.com/blog/64924-a-guide-to-the-new-power-of-facebook-advertising/
Drucker P.F. (1968) 'The Age of Discontinuity' Pan Piper, 1968
Drucker P.F. (1993) 'Post-Capitalist Society' Harper Business, 1993
Dunn E.S. (1967) 'The idea of a national data center and the issue of personal privacy' The American Statistician, 1967
Flaherty D.H. (1984) 'Privacy and Data Protection: An International Bibliography' Mansell, 1984
Flaherty D.H. (1988) 'The emergence of surveillance societies in the western world: Toward the year 2000' Government Information Quarterly 5, 4 (1988) 377-387
Flaherty D.H. (1989) 'Protecting Privacy in Surveillance Societies' Uni. of North Carolina Press, 1989
Foucault M. (1977) 'Discipline and Punish: The Birth of the Prison' Pantheon, 1977
Gandy O.H. (1989) 'The Surveillance Society: Information Technology and Bureaucratic Social Control' Journal of Communication 39, 3 (September 1989) 61-76
Gandy O.H. (1993) 'The Panoptic Sort. Critical Studies in Communication and in the Cultural Industries' Westview, Boulder CO, 1993
Gibson W. (1984) 'Neuromancer' Grafton/Collins, London, 1984
Gilliom J. & Monahan T. (2012) 'SuperVision: An Introduction to the Surveillance Society' University of Chicago Press, 2012
Gray P. & Hovav A. (2008) 'From Hindsight to Foresight: Applying Futures Research Techniques in Information Systems' Communications of the Association for Information Systems 22, 12 (2008), at http://aisel.aisnet.org/cais/vol22/iss1/12
Hazen B.T., Boone C.A., Ezell J.D. & Jones-Farmer L.A. (2014) 'Data Quality for Data Science, Predictive Analytics, and Big Data in Supply Chain Management: An Introduction to the Problem and Suggestions for Research and Applications' International Journal of Production Economics 154 (August 2014) 72-80, at https://www.researchgate.net/profile/Benjamin_Hazen/publication/261562559_Data_Quality_for_Data_Science_Predictive_Analytics_and_Big_Data_in_Supply_Chain_Management_An_Introduction_to_the_Problem_and_Suggestions_for_Research_and_Applications/links/0deec534b4af9ed874000000
Hess T., Legner C., Esswein W., Maaß W., Matt, C., Österle H., Schlieter H., Richter P. & Zarnekow, R. (2014) 'Digital Life as a Topic of Business and Information Systems Engineering?' Business & Information Systems Engineering 6, 4 (2014) 247-253
Hirst M. (2013) `Someone's looking at you: welcome to the surveillance economy' The Conversation, 26 July 2013, at http://theconversation.com/someones-looking-at-you-welcome-to-the-surveillance-economy-16357
Hoofnagle C.J., King J., Li S. & Turow J. (2010) 'How different are young adults from older adults when it comes to information privacy attitudes and policies?' SSRN, April 2010, at http://www.ssrn.com/abstract=1589864
Kelly K. (1998) 'New Rules for the New Economy' Penguin, 1998, at http://kk.org/books/KevinKelly-NewRules-withads.pdf
Kistermann F.W. (1991) 'The Invention and Development of the Hollerith Punched Card: In Commemoration of the 130th Anniversary of the Birth of Herman Hollerith and for the 100th Anniversary of Large Scale Data Processing' IEEE Annals of the History of Computing 13, 3 (July 1991) 245 - 259
Land F. (2000) 'The First Business Computer: A Case Study in User-Driven Innovation' J. Annals of the Hist. of Computing 22, 3 (July-September 2000) 16-26
Larsen E. (1992) 'The Naked Consumer: How Our Private Lives Become Public Commodities' Henry Holt and Company, New York, 1992
Levine R., Locke C., Searls D. & Weinberger D. (2000) 'The Cluetrain Manifesto: The End of Business as Usual' Perseus, 2000, summary at http://www.cluetrain.com/
Lewis A. (2010) 'If you are not paying for it, you're not the customer; you're the product being sold' Metafilter, August 2010, at http://www.metafilter.com/95152/Userdriven-discontent#3256046
Lycett M. (2013) 'Datafication: Making Sense of (Big) Data in a Complex World' European Journal of Information Systems 22, 4 (2013) 381-386, at http://v-scheiner.brunel.ac.uk/bitstream/2438/8110/2/Fulltext.pdf
Lyon D. (1994) 'The electronic eye: The rise of surveillance society' Polity Press, 1994
Lyon D. (2001) 'Surveillance society' Open University Press, 2001
Machlup F. (1962) 'Production and Distribution of Knowledge in the United States' Princeton University Press, 1962
Maier R.M. (2014) 'Angst vor Google: Von der Suchmaschine zur Weltmacht' [Fear of Google: From Search Engine to World Power] Frankfurter Allgemeine Zeitung, 3 April 2014, at http://www.faz.net/aktuell/feuilleton/debatten/weltmacht-google-ist-gefahr-fuer-die-gesellschaft-12877120.html?printPagedArticle=true#pageIndex_2
Majchrzak A. & Markus M.L. (2014) 'Methods for Policy Research: Taking Socially Responsible Action' Sage, 2nd Edition, 2014
Marx G.T. (1985) 'The Surveillance Society: The Threat Of 1984-Style Techniques' The Futurist, 1985
Masuda Y. (1981) 'The Information Society as Post-Industrial Society' World Future Society, Bethesda, 1981
Miller A.R. (1969) 'Computers and Privacy' Michigan L. Rev. 67 (1969) 1162-1246
Naisbitt J. (1982) 'Megatrends' London, Macdonald, 1982
Negroponte N. (1995) 'Being Digital' Hodder & Stoughton, 1995
Niederman F., Applegate L., Beck R., Clarke R., King J.L. & Majchrzak A. (2016) 'IS Research and Policy: Notes from the 2015 ICIS Senior Scholars' Forum' Forthcoming in Communications of the Association for Information Systems, 2016
Normann R. (2001) 'Reframing business: When the map changes the landscape' Wiley, 2001
Norris C. & Armstrong G. (1999) 'The maximum surveillance society: The rise of CCTV' Berg Publishers, 1999
van Notten P.W.F., Rotmans J., van Asselt M.B.A. & Rothman D.S. (2003) 'An Updated Scenario Typology' Futures 35 (2003) 423-443, at http://ejournal.narotama.ac.id/files/An%20updated%20scenario%20typology.pdf
Oesterle H. (2014) 'Business oder Life Engineering?' [Business or Life Engineering?] HMD Praxis der Wirtschaftsinformatik 51, 6 (December 2014) 744-761, at http://link.springer.com/article/10.1365/s40702-014-0097-x
O'Reilly T. (2005) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly 30 September 2005, at http://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html
Osler M., Lund R., Kriegbaum M., Christensen U. & Andersen A.-M.N. (2006) 'Cohort Profile: The Metropolit 1953 Danish Male Birth Cohort' Int. J. Epidemiol. 35, 3 (June 2006) 541-545, at http://ije.oxfordjournals.org/content/35/3/541.long
Packard V. (1960) 'The Hidden Persuaders' Penguin 1960
PEW (2014) 'Digital Life in 2025' PEW Research Center, 11 March 2014, at http://www.pewinternet.org/2014/03/11/digital-life-in-2025/
Postma T.J.B.M. & Leibl F. (2005) 'How to improve scenario analysis as a strategic management tool?' Technological Forecasting & Social Change 72 (2005) 161-173, at https://www.researchgate.net/profile/Theo_Postma/publication/222668211_How_to_improve_scenario_analysis_as_a_strategic_management_tool/links/09e41508117c39a2c7000000.pdf
Preston R. (2016) 'Are These Employer Data Practices Creepy? It Depends...' Forbes, 8 November 2016, at http://www.forbes.com/sites/oracle/2016/11/08/are-employer-data-practices-creepy-it-depends/
Rosenberg J.M. (1969) 'The Death of Privacy' Random House, 1969
Roszak T. (1986) 'The Cult of Information' Pantheon, 1986
Rule J.B. (1974) 'Private Lives and Public Surveillance: Social Control in the Computer Age' Schocken Books, 1974
Schmidt E. (2014) 'A chance for growth' Frankfurter Allgemeine Zeitung, 9 April 2014, at http://www.faz.net/aktuell/feuilleton/debatten/eric-schmidt-about-the-good-things-google-does-a-chance-for-growth-12887909.html
Schwartz P. (1991) 'The Art of the Long View: Planning for the Future in an Uncertain World' Doubleday, 1991
Shapiro C. & Varian H.R. (1999) 'Information Rules: A Strategic Guide to the Network Economy' Harvard Business School Press, 1999
SSN (2014) 'An introduction to the surveillance society', Surveillance Studies Network, 2014, at http://www.surveillance-studies.net/?page_id=119
Stephenson N. (1992) 'Snow Crash' Bantam, 1992
Toffler A. (1970) 'Future Shock' Pan, 1970
Toffler A. (1980) 'The Third Wave' Collins, London, 1980
Touraine A. (1971) 'The Post-Industrial Society: Tomorrow's Social History' Random House, 1971
Varian H.R. (2014) 'Beyond Big Data' Business Economics 49, 1 (2014) 27-31
Wack P. (1985) 'Scenarios: Uncharted Waters Ahead' Harv. Bus. Rev. 63, 5 (September-October 1985) 73-89
Warner M. & Stone M. (1970) 'The Data Bank Society: Organisations, Computers and Social Freedom' George Allen and Unwin, 1970
Weingarten F.W. (1988) 'Communications Technology: New Challenges to Privacy' J. Marshall L. Rev. 21, 4 (Summer 1988) 735
Wessell M.R. (1974) 'Freedom's Edge: The Computer Threat to Society' Addison-Wesley, Reading, Mass., 1974
Zimmer M. (2008) 'The Gaze of the Perfect Search Engine: Google as an Infrastructure of Dataveillance' in Spink A. & Zimmer M. (eds.) 'Web Search: Multidisciplinary Perspectives' Springer, 2008, pp. 77-102, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.457.8268&rep=rep1&type=pdf#page=83
Zuboff S. (2015) 'Big Other: Surveillance Capitalism and the Prospects of an Information Civilization' Journal of Information Technology 30 (2015) 75-89, at https://cryptome.org/2015/07/big-other.pdf
Zuboff S. (2016) 'The Secrets of Surveillance Capitalism' Frankfurter Allgemeine Zeitung, 3 May 2016, at http://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html?printPagedArticle=true#pageIndex_2
After (Oesterle 2014)
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in Computer Science at the Australian National University.
Created: 7 December 2016 - Last Amended: 21 December 2016 by Roger Clarke - Site Last Verified: 15 February 2009