
Privacy and Social Media: An Analytical Framework

Version of 18 November 2013

Published in Journal of Law, Information and Science 23,1 (April 2014) 1-23

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2012-13

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/DV/SMTD.html

The prior version of 12 May 2013 is at http://www.rogerclarke.com/DV/SMTD-13.html

The prior version of 19 April 2012 is at http://www.rogerclarke.com/DV/SMTD-120419.html


Abstract

Social media services offer users tools for interaction, publishing and sharing, but in return demand exposure of users' selves and of personal information about the members of their social networks. The Terms of Service imposed by providers are uniformly privacy-hostile. The practice of social media exhibits a great many distrust influencers, and some are sufficiently strong that they constitute distrust drivers. This paper presents an analytical framework whereby designers of social media services can overcome user distrust and inculcate user trust.


1. Introduction

Social media is a collective term for a range of services that support users in exchanging content and pointers to content, but in ways that are advantageous to the service-provider. The services emerged in conjunction with the 'Web 2.0' and 'social networking' notions, during 2004-05 (O'Reilly 2005).

As shown by Best (2006) and Clarke (2008b), there was little terminological clarity or coherence about 'Web 2.0'. Similarly, understanding of what social media encompasses remains somewhat rubbery, e.g. "Social Media is a group of Internet-based applications that build on the ideological and technological foundations of Web 2.0, and that allow the creation and exchange of User Generated Content" (Kaplan & Haenlein 2010, p.61). Those authors did, however, apply theories in the field of media research (social presence, media richness) and social processes (self-presentation, self-disclosure), in order to propose the classification scheme in Exhibit 1.

Exhibit 1: Kaplan & Haenlein's (2010) Classification of Social Media

Social network analysis is a well-established approach to modelling actors and the 'ties' or linkages among them. It generally emphasises the importance of the linkages, and underplays or even ignores the attributes of the actors and other aspects of the context within which each actor operates (Otte & Rousseau 2002). Various forms of graphing and inferencing techniques have been harnessed by social media service-providers. In a social media context, networks may be based on explicit linkages such as bookmarking, 'friending', following, 'liking' / +1 and endorsing, on communications linkages such as email and chat messages, or on implicit linkages such as tagging, visiting the same sites, purchasing the same books, etc.
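
The linkages described above amount to a typed graph over actors. As a minimal illustrative sketch (the actor names, tie types and shared-neighbour affinity rule are hypothetical examples, not any provider's actual mechanism), the kind of inferencing that providers apply to such linkages might look like:

```python
from collections import defaultdict

# A social graph as directed, typed 'ties' between actors.
# Tie types mix explicit linkages ('friend', 'like') with
# implicit ones ('co-purchase'); all are hypothetical examples.
graph = defaultdict(list)

def add_tie(source, target, tie_type):
    """Record a directed linkage of a given type between two actors."""
    graph[source].append((target, tie_type))

add_tie("alice", "bob", "friend")        # explicit linkage
add_tie("alice", "carol", "like")        # explicit linkage
add_tie("dave", "carol", "co-purchase")  # implicit linkage

def inferred_affinity(a, b):
    """Count the neighbours two actors share, regardless of tie type.
    A provider might treat a non-zero count as an inferred affinity
    between actors who have no explicit linkage to one another."""
    neighbours_a = {target for target, _ in graph[a]}
    neighbours_b = {target for target, _ in graph[b]}
    return len(neighbours_a & neighbours_b)

print(inferred_affinity("alice", "dave"))  # both link to carol, so 1
```

Even this toy version shows why implicit linkages matter to providers: 'alice' and 'dave' have never interacted, yet an affinity between them can be inferred, and then exploited.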

The motivation for social media service-providers is to attract greater traffic on their sites. They therefore actively seek network effects (Katz & Shapiro 1994) based on positive feedback loops, preferably of the extreme variety referred to as 'bandwagon effects'. These depend on the generation of excitement, and the promotion of activities of individuals to large numbers of other individuals, based on both established linkages and inferred affinities. An important element of commercially successful social media is the encouragement of exhibitionism by a critical mass of users, in order to stimulate voyeuristic behaviour (in the sense of gratification through observation) by many users. For the provider, exhibitionism of self is useful, but the exposure of others is arguably even more valuable (Adams 2013).

Another insight from social network theory is that links among content-items are important to providers not so much because they provide a customer service, but because they support social linkages (Hendler & Golbeck 2008). The major social media services place much less emphasis on enabling close management of groups such as 'cliques' (tightly-knit groups) and 'social circles' (looser groups), because activity within small sub-nets serves the provider's interests far less well than open activity. This has significant negative implications for the privacy of the many individuals who participate in social circles or cliques whose norm is to respect the confidentiality of transactions occurring within those groups.

Consumer marketing companies generally perceive the need to convey corporate image and messages, shape consumer opinions, and build purchase momentum; and social media has been harnessed to those purposes. The Kaplan & Haenlein classification scheme is a good fit to the perspectives of both service-providers and consumer-facing corporations. On the other hand, through its commitment to the 'consumer as prey' tradition (Clarke 2002a, 2008a, Deighton & Kornfeld 2009), the scheme fails to reflect the interests of the users whom social media services exploit. A classification scheme would better serve users of social media if it focussed on its features and affordances in the areas of human interaction, content broadcast and content collaboration (Clarke 2013).

Social media services offer varying mixes of features. In (Clarke 2013), these were classified as:

* interaction tools (e.g. email, chat/IM, SMS, voice and video-conferencing)

* broadcast tools (e.g. web-pages and their closed equivalent of 'wall-postings', blogs, micro-blogs, content communities for images, videos and slide-sets, and locations), and

* sharing tools (e.g. wikis, social bookmarking, approvals and disapprovals, social gaming)

A review of social media services since 2000 identifies waves that have been centred on, successively, blogs (e.g. Wordpress, Blogspot, LiveJournal), voice-calls (Skype), social gaming (Friendster, Zynga), virtual worlds (Second Life), social bookmarking (Delicious), approvals (Digg, Reddit), video communities (YouTube, Vine), image communities (Flickr, Picasa, Instagram, Pinterest), social networking (Plaxo, LinkedIn, Xing, Orkut, Facebook, Buzz, Google+), micro-blogs (Twitter, Tumblr), location (Foursquare, Latitude) and back to messaging (WhatsApp, Snapchat). Some social media services have proven to be short-lived fads. Some appear to be instances of longer-lived genres, although, even in these cases, waves of differently-conceived services have been crashing over one another in quick succession. Some aspects may mature into long-term features of future services, because they satisfy a deeper human need rather than just a fashion-driven desire.

Consumers' hedonistic desires may be well-served by contemporary social media services. On the other hand, a proportion of users understands that they are being exploited by social media service-providers. The boldness and even arrogance of many of those providers has given rise to a growing body of utterances by influential commentators, which has caused many more users to become aware of the extent of the exploitation (Petersen 2008, 2012, O'Connor 2012, Rey 2012). Consumer and privacy concerns are legion. Combined with other factors, these concerns are giving rise to doubts about whether sufficient trust exists for the first decade's momentum in social media usage to be sustained (Marks 2013, SMF 2013).

Privacy has long loomed as a strategic factor for both corporations and government agencies (Peppers & Rogers 1993, Clarke 1996, Levine et al. 1999, Cavoukian & Hamilton 2002, 2006b). Committedly privacy-friendly social media services have been conspicuous by their absence, however. Attempts at consumer-oriented social media, such as Diaspora, duuit, Gnu social and Open Social, have all faltered. Snapchat nominally supports ephemeral ('view once') messaging, but it has been subject to a formal complaint that its published claims are constructively misleading (Guynn 2013).

In Clarke (2008a), it was suggested that the long wait for the emergence of the 'prosumer' may be coming to a close. That term was coined by Toffler in 1980, to refer to a phenomenon he had presaged 10 years earlier (Toffler 1970 pp. 240-258; Toffler 1980 pp. 275-299, 355-356, 366, 397-399). The concept was revisited in Tapscott & Williams (2006), applied in Brown & Marsden (2013), and has been extended to the notion of 'produser' in Bruns (2009). A prosumer is a consumer who is proactive (e.g. is demanding, and expects interactivity with the producer) and/or a producer as well as a consumer (e.g. one who expects to be able to exploit digital content for mashups). To the extent that a sufficient proportion of consumers do indeed mature into prosumers, consumer dissatisfaction with untrustworthy social media service-providers can be expected to rise, and to influence consumers' choices.

The need therefore exists for means whereby the most serious privacy problems can be identified, and ways can be devised to overcome them. This paper commences with a review of the nature of privacy, followed by application of the core ideas to social media. The notion of trust is then considered, and operational definitions are proposed for a family of concepts. Implications for social media design are drawn from that analytical framework, including constructive proposals for addressing privacy problems to the benefit of consumers and service-providers alike. Research opportunities arising from the analysis are identified.


2. Privacy and Social Media

This section commences by summarising key aspects of privacy. It then applies them to social media.

2.1 Privacy

Privacy is a multi-dimensional construct rather than a concept, and hence definitions are inevitably contentious. Many of the conventional approaches are unrealistic, serve the needs of powerful organisations rather than those of individuals, or are of limited value as a means of guiding operational decisions. Other treatments include those of Hirshleifer (1980), Schoeman (1984), Lindsay (2005) and Nissenbaum (2009).

The approach adopted here is to adopt a definition of privacy as an interest, to retain its positioning as a human right, to reject the attempts on behalf of business interests to reduce it to a mere economic right (Posner 1981, Acquisti 2004), and to supplement the basic definition with a discussion of the multiple dimensions inherent in the construct.

The following definition is of long standing (Morison 1973, Clarke 1996, 1997, 2006a):

Privacy is the interest that individuals have in sustaining a 'personal space', free from interference by other people and organisations

A weakness in discussions of privacy throughout the world is the limitation of the scope to data protection / Datenschutz. It is vital to recognise that privacy is a far broader notion than that. The four-dimensional scheme below has been in consistent use since at least Clarke (1996, 1997), and a fifth dimension is tentatively added at the end of the discussion.

(1) Privacy of the Physical Person

This is concerned with the integrity of the individual's body. At its broadest, it extends to freedom from torture and the right to medical treatment. Issues include compulsory immunisation, imposed treatments such as lobotomy and sterilisation, blood transfusion without consent, compulsory provision of samples of body fluids and body tissue, requirements for submission to biometric measurement, and, significantly for social media, risks to personal safety arising from physical, communications or data surveillance, or location and tracking (Clarke 1999b, Jones 2003, boyd 2008, Clarke & Wigan 2011, Michael & Clarke 2013).

(2) Privacy of Personal Behaviour

This is the interest that individuals have in being free to conduct themselves as they see fit, without unjustified surveillance. Particular concern arises in relation to sensitive matters such as sexual preferences and habits, religious practices, and political activities. Some privacy analyses, particularly in Europe, extend this discussion to personal autonomy, liberty and the right to self-determination.

Intrusions into this dimension of privacy have a chilling effect on social, economic and political behaviour. The notion of 'private space' is vital to all aspects of behaviour and acts. It is relevant in 'private places' such as the home and toilet cubicles, and in 'public places', where casual observation by the few people in the vicinity is very different from systematic observation and recording. The recent transfer of many behaviours from physical to electronic spaces has enabled marked increases in behavioural surveillance and consequential threats to privacy.

(3) Privacy of Personal Communications

Individuals want, and need, the freedom to communicate among themselves, using various media, without routine monitoring of their communications by other persons or organisations. Issues include mail 'covers', use of directional microphones and 'bugs' with or without recording apparatus, and telephonic interception. The recent transfer of many behaviours from physical to electronic spaces has enabled marked increases in communications surveillance and consequential threats to privacy.

(4) Privacy of Personal Data

This is referred to variously as 'data privacy' and 'information privacy', and regulatory measures are referred to as 'data protection'. Individuals claim that data about themselves should not be automatically available to other individuals and organisations, and that, even where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. Many analyses are almost entirely limited to this dimension, including Solove (2006, 2008).

Recent developments have substantially altered longstanding balances, and created new risks.

Underpinning the dramatic escalation of privacy threats since about 1995 has been greatly intensified opposition by organisations to anonymity and even pseudonymity, and greatly increased demands for all acts by all individuals to be associated with the individual's 'real identity' (e.g. Krotoski 2012). As a result of decreased anonymity, content digitisation, cloud services, government intrusions and copyright-protection powers, consideration now needs to be given to adding one or more further dimensions (Finn et al. 2013). The following is a likely candidate fifth dimension:

(5) Privacy of Personal Experience

Individuals reasonably feel threatened by surveillance of their reading and viewing, inter-personal communications and electronic social networks, and of their physical meetings and electronic associations with other people. A significant industry has developed that infers individuals' interests and attitudes from personal data mined from a wide array of sources.

The recording of previously private library-search transactions (through web search-engine logs), book-purchases (through eCommerce logs) and reading activities (through eBook logs and licensing databases) is striking far more deeply inside the individual's psyche than ever before, enabling much more reliable inferencing about the person's interests, formative influences and attitudes. Concerns have been evident for many years, in the form of arguments for "a right to read anonymously" (Cohen 1996), and a broader "right to experience intellectual works in private - free from surveillance" (Greenleaf 2003). The diversity and intensity of threats makes it necessary to recognise the cluster of issues as constituting a fifth dimension of privacy.

The increased concerns evident in the European Union in relation to US corporate and government abuses of the personal data of Europeans (EC 2012) appear to embody recognition not only of the privacy of personal communications, data and behaviour, but to some extent also of the privacy of personal experience.

2.2 Privacy Applied to Social Media

At an early stage, commentators identified substantial privacy threats inherent in Web 2.0, social networking services and social media generally (e.g. Clarke 2004, Harris 2006, Barnes 2006, Clarke 2008b). Although privacy threats arise in relation to all categories of social media, social networking services (SNS) are particularly rich both in inherent risks and in aggressive behaviour by service-providers. This section accordingly pays particular attention to SNS.

One of the earliest SNS, Plaxo, was subjected to criticism at the time of its launch (Clarke 2004). Google had two failures - Orkut and Buzz - before achieving moderate market penetration with Google+. All three have been roundly criticised for their serious hostility to the privacy not only of their users but also of people exposed by their users (e.g. boyd 2008, Helft 2010, Waugh 2012, Bell 2012). Subsequently, Google's lawyers have argued that users of Gmail, and by implication of all other Google services "have no legitimate expectation of privacy" (RT 2013).

However, it is difficult not to focus on Facebook, not so much because it has dominated many national markets for SNS for several years, but rather because it has done so much to test the boundaries of privacy abuse. Summaries of its behaviour are in Bankston (2009), Opsahl (2010), NYT (2010), McKeon (2010), boyd & Hargittai (2010) and BBC (2011). Results of a survey are reported in Lankton & McKnight (2011).

After 5 years of bad behaviour by Facebook, Opsahl (2010) summarised the situation as follows: "When [Facebook] started, it was a private space for communication with a group of your choice. Soon, it transformed into a platform where much of your information is public by default. Today, it has become a platform where you have no choice but to make certain information public, and this public information may be shared by Facebook with its partner websites and used to target ads".

The widespread publication of several epithets allegedly uttered by Facebook's CEO Zuckerberg has reinforced the impression of exploitation, particularly "We are building toward a web where the default is social" (Shiels 2010) and "They 'trust me'. Dumb f..ks" (Carlson 2010). These were exacerbated by the hypocrisy of Zuckerberg's marketing executive and sister in relation to the re-posting of a photograph of her on Twitter, documented in Adams (2013). Some of the most memorable quotations in relation to privacy and social media have been self-serving statements by executives in the business, such as Scott McNealy's "You have zero privacy anyway. Get over it" (Sprenger 1999) and Eric Schmidt's "If you have something that you don't want [Google and its customers] to know, maybe you shouldn't be doing it in the first place" (Esguerra 2009).

The conclusion reached in 2012 by a proponent of social media was damning: "[Social networking services] profit primarily by using heretofore private information it has collected about you to target advertising. And Zuckerberg has repeatedly made sudden, sometimes ill conceived and often poorly communicated policy changes that resulted in once-private personal information becoming instantly and publicly accessible. As a result, once-latent concerns over privacy, power and profit have bubbled up and led both domestic and international regulatory agencies to scrutinize the company more closely ... The high-handed manner in which members' personal information has been treated, the lack of consultation or even communication with them beforehand, Facebook's growing domination of the entire social networking sphere, Zuckerberg's constant and very public declarations of the death of privacy and his seeming imposition of new social norms all feed growing fears that he and Facebook itself simply can not be trusted" (O'Connor 2012).

Social media features can be divided into three broad categories. In the case of human interaction tools, users generally assume that conversations are private, and in some cases are subject to various forms of non-disclosure conventions. However, social media service-providers sit between the conversation-partners and in almost all cases store the conversation - thereby converting ephemeral communications into archived statements - and give themselves the right to exploit the contents. Moreover, service-providers strive to keep the messaging flows internal, and hence exploitable by and only by that company, rather than enabling their users to take advantage of external and standards-based messaging services such as email.

The second category of social media features, content broadcast, by its nature involves publication. Privacy concerns still arise, however, in several ways. SNS providers, in seeking to capture 'wall-postings', encourage and in some cases even enforce self-exposure of profile data. A further issue is that service-providers may intrude into users' personal space by monitoring content with a view to punishing or discriminating against individuals who receive particular broadcasts, e.g. because the content is in breach of criminal or civil law, is against the interests of the service-provider, or is deemed by the service-provider to be in breach of good taste. The primary concern, however, is that the individual who initiates the broadcast may not be able to protect their identity. This is important where the views may be unpopular with some organisations or individuals, particularly where those organisations or individuals may represent a threat to the person's safety. It is vital to society that 'whistleblowing' be possible. There are contrary public interests, in particular that individuals who commit serious breaches such as unjustified disclosure of sensitive information, intentionally harmful misrepresentation, and incitement to violence, be able to be held accountable for their actions. That justifies the establishment of carefully constructed forms of strong pseudonymity; but it does not justify an infrastructure that imposes on all users the requirement to disclose, and perhaps openly publish, their 'real identity'.

The third category of social media features, content collaboration, overlaps with broadcasting, but is oriented towards shared or multi-sourced content rather than sole-sourced content. SNS providers exploit the efforts of content communities - an approach usefully referred to as 'effort syndication' (Clarke 2008b). This gives rise to a range of privacy issues. Even the least content-rich form, indicator-sharing, may generate privacy risk for individuals, such as the casting of a vote in a particular direction on some topic that attracts opprobrium (e.g. paedophilia, racism or the holocaust, but also criticisms of a repressive regime).

The current, somewhat chaotic state of social media services involves significant harm to individuals' privacy, which is likely to progressively undermine suppliers' business models. Constructive approaches are needed to address the problems.


3. An Analytical Framework

This section proposes a basis for a sufficiently deep understanding of the privacy aspects of social media, structured in a way that guides design decisions. It first reviews the notion of trust. In both the academic and business literatures, the focus has been almost entirely on the positive notion of trust, frequently to the complete exclusion of the negative notion of distrust. It is argued here that both concepts need to be understood and addressed. A framework and definitions are proposed that enable the varying impacts of trust-relevant factors to be recognised and evaluated.


3.1 Concepts

Trust originates in family and social settings, and is associated with cultural affinities and inter-dependencies. This paper is concerned with factors that relate to the trustworthiness or otherwise of other parties to a social or economic transaction. Trust in another party is associated with a state of willingness to expose oneself to risks (Schneier 2012, particularly Chapter 4). Rather than treating trust as "expectation of the persistence and fulfilment of the natural and the moral orders" (Barber 1983), Yamagishi (2011) distinguishes "trust as expectations of competence" and "trust as expectations of intention" (p.28). The following is proposed as an operational definition relevant to the contexts addressed in this paper (Clarke 2002a, 2002b):

Trust is confident reliance by one party on the behaviour of one or more other parties

The importance of trust varies a great deal, depending on the context. Key factors include the extent of the risk exposure, the elapsed time during which the exposure exists, and whether insurance is available, affordable and effective. Trust is most critical where a party has little knowledge of the other party, or far less power than the other party.

The trust concept has been applied outside its original social setting. Of relevance to the present paper, it is much-used in economic contexts: "From a rational perspective, trust is a calculation of the likelihood of future cooperation" (Kramer & Tyler 1996 p.4, citing Williamson 1993). This paper is concerned with the trustworthiness or otherwise of a party to a transaction, rather than, for example, of the quality of a tradeable item, of its fit to the consumer's need, of the delivery process, or of the infrastructure and institutions on which the conduct of the transaction depends. The determinants of trust have attracted considerable attention during the two decades after 1994, as providers of web-commerce services endeavoured to overcome impediments to the adoption of electronic transactions between businesses and consumers, popularly referred to as B2C eCommerce. The B2C eCommerce literature contains a great many papers that refer to trust, and that investigate very specific aspects of it (e.g. Lee & Turban 2001).

It is feasible for cultural affinities to be achieved in some B2C contexts. For example, consumers' dealings with cooperatives, such as credit unions / Kreditgenossenschaften, may achieve this, because 'they are us'. On the other hand, it is infeasible for for-profit corporations to achieve anything more than an ersatz form of trust with their customers, because corporations law demands that priority be given to the interests of the corporation above all other interests.

The bases on which a proxy for positive trust can be established in B2C eCommerce were analysed in Clarke (2002a). Trust may arise from a direct relationship between the parties (such as a contract, or prior transactions); or from experience (such as a prior transaction, a trial transaction, or vicarious experience). When such relatively strong sources of trust are not available, it may be necessary to rely on 'referred trust', such as delegated contractual arrangements, 'word-of-mouth', or indicators of reputation. Ignoring the question of preparedness to transact, empirical evidence from studies of eBay transactions suggests that "the quality of a seller's reputation has a consistent, statistically significant, and positive impact on the price of the good" (Melnik & Alm 2002, p.337). The fact that reputation's impact on price "tends to be small" (p.349) is consistent with the suggestion that reputation derived from reports by persons unknown is of modest rather than major importance.
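
The modest weight of reputation-derived 'referred trust' can be illustrated with a toy feedback aggregate of the kind marketplace sites display. The scoring rule below is a hypothetical sketch, not the mechanism of eBay or any actual service:

```python
def reputation_score(positives, negatives):
    """Toy referred-trust indicator: net feedback as a fraction of all
    ratings, in the range [-1.0, 1.0]. An unrated seller scores 0.0
    (no referred trust, but no accumulated distrust either)."""
    total = positives + negatives
    if total == 0:
        return 0.0
    return (positives - negatives) / total

# A seller with mostly positive feedback scores highly, but - consistent
# with the empirical finding that reputation's impact on price tends to
# be small - a buyer would treat this as one modest trust factor only.
print(reputation_score(95, 5))  # 0.9
print(reputation_score(0, 0))   # 0.0
```

The design choice of returning 0.0 for an unrated party mirrors the distinction drawn later in this paper between lack of trust (a neutral absence) and distrust (an active negative).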

Mere brandnames are a synthetic and ineffective basis for trust. The weakest of all forms is meta-brands, such as 'seals of approval' signifying some form of accreditation by organisations that are no better-known than the company that they purport to be attesting to (Clarke 2001a). The organisations that operate meta-brands generally protect their own interests with considerably more vigour than they do the interests of consumers.

Where the market power of the parties to a transaction is reasonably balanced, the interests of both parties may be reflected in the terms of the contracts that they enter into. This is seldom the case with B2C commerce generally, however, nor with B2C eCommerce in particular. It is theoretically feasible for consumers to achieve sufficient market power to ensure balanced contract terms, in particular through amalgamation of their individual buying-power, or through consumer rights legislation. In practice, however, an effective balance of power seldom arises.

Many recent treatments of trust in B2C eCommerce adopt the postulates of Mayer et al. (1995), to the effect that the main attributes underlying a party's trustworthiness are ability, integrity and benevolence - to which web-site quality has subsequently been added. See for example Lee & Turban (2001) and Salo & Karjaluoto (2007). However, it appears unlikely that a body of theory based on the assumption of altruistic behaviour by for-profit organisations is capable of delivering much of value. In practice, consumer marketing corporations seek to achieve trust by contriving the appearance of cultural affinities. One such means is the offering of economic benefits to frequent buyers, but projecting the benefits using not an economic term but rather one that invokes social factors: 'loyalty programs'. Another approach is to use advertising, advertorials and promotional activities to project socially positive imagery onto a brandname and logo. To be relevant to the practice of social media, research needs to be based on models that reflect realities, rather than embody imaginary constructs such as 'corporate benevolence'.

During the twentieth century's 'mass marketing' and 'direct marketing' eras, consumer marketing corporations were well-served by the conceptualisation of consumer as prey. That stance continued to be applied when B2C eCommerce emerged during the second half of the 1990s. That the old attitude fitted poorly to the new context was evidenced by the succession of failed initiatives documented in Clarke (1999a). These included billboards on the information superhighway (1994-95), closed electronic communities (1995-97), push technologies (1996-98), spam (1996-), infomediaries (1996-99), portals (1998-2000) and surreptitious data capture (1999-). Habits die hard, however, and to a considerable extent consumers are still being treated as quarry by consumer marketers. It is therefore necessary to have terms that refer to the opposite of trust.

The convention has been to assume that trust is either present, or it is not, and hence:

Lack of Trust is the absence, or inadequacy, of confidence by one party in the reliability of the behaviour of one or more other parties

This notion alone is not sufficient, however, because it fails to account for circumstances in which, rather than there being either a positive feeling or an absence of it, there is instead a negative element present, which represents a positive impediment rather than merely the lack of a positive motivator to transact. The concept of 'negative ratings' (Ba & Pavlou 2002) falls short of that need. This author accordingly proposes the following additional term and definition:

Distrust is the active belief by one party that the behaviour of one or more other parties is not reliable

There are few treatments of distrust in the B2C eCommerce literature, but see McKnight & Chervany (2001a, 2001b, 2006), and the notion of 'trustbuster' in Riegelsberger & Sasse (2001).

One further concept is needed, in order to account for the exercise of market power in B2C eCommerce. Most such business is subject to Terms of Service imposed by merchants. These embody considerable advantages to the companies and few for consumers. The Terms imposed by social media companies are among the most extreme seen in B2C eCommerce (Clarke 2008a, 2011). In such circumstances, there is no trust grounded in cultural affinity. The consumer may have a choice of merchants, but the Terms that the merchants impose are uniformly consumer-hostile. Hence a separate term is proposed, to refer to a degraded form of trust:

Forced Trust is the hope held by one party that the behaviour of one or more other parties will be reliable, despite the absence of important trust factors

3.2 Categories of Trust Factor

The concepts presented in the previous sub-section provide a basis for developing insights into privacy problems and their solutions. In order to operationalise the concepts, however, it is necessary to distinguish several different categories of trust factor.

The definitions in Exhibit 2 use the generic terms 'party' and 'conduct of a transaction', in order to provide broad scope. They encompass both social and economic transactions, and both natural persons as parties - variously in social roles, as consumers, as prosumers, and as producers - and legal persons, including social not-for-profit associations, economic not-for-profits such as charities, for-profit corporations, and government agencies.

Exhibit 2: The Four Categories of Trust Factor

A Trust Influencer is a factor that has a positive influence on the likelihood of a party conducting a transaction

A Distrust Influencer is a factor that has a negative influence on the likelihood of a party conducting a transaction

A Trust Driver is a factor that has such a strong positive influence on the likelihood of a party conducting a transaction that it determines the outcome

A Distrust Driver is a factor that has such a strong negative influence on the likelihood of a party conducting a transaction that it determines the outcome

A Driver is a factor that, alone, is sufficient to determine an adoption / non-adoption decision. This is distinct from the 'straw that broke the camel's back' phenomenon. In that case, the most recent Influencer causes a threshold to be reached, but only by adding its weight to other, pre-existing Influencers.

In Exhibit 3, the relationships among the concepts are depicted in the mnemonic form of a see-saw, and are supplemented by commonly used terms associated with each category.

Exhibit 3: A Depiction of the Categories of Trust Factor

Some trust factors are applicable to consumers generally, and hence the distinctions can be applied to an aggregate analysis of the market for a particular tradeable item or class of items. Attitudes to many factors vary among consumers, however; and hence the framework can be usefully applied at the most granular level, that is, to individual decisions by individual consumers. For many purposes, an effective compromise is likely to be achieved by identifying consumer segments, and conducting analyses from the perspective of each segment. Hence a driver is likely to be described in conditional terms, such as 'for consumers who are risk-averse, ...', 'for consumers who live outside major population centres, ...', and 'for consumers who are seeking a long-term service, ...'.

The commercial aspects of the relationship between merchant and consumer offer many examples of each category of trust factor. Distrust Drivers include proven incapacity of the merchant to deliver, such as insolvency. On the other hand, low cost combined with functional superiority represents a Trust Driver. Non-return policies are usually a Distrust Influencer rather than a Distrust Driver, whereas 'return within 7 days for a full refund' is likely to be a Trust Influencer. Merchants naturally play on human frailties in an endeavour to convert Influencers into Drivers, through such devices as '50% discount, for today only'.
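The interplay of Influencers and Drivers described above can be sketched as a simple decision model. This is an illustrative sketch only: the factor weights, the threshold, and the precedence given to Distrust Drivers are assumptions made here for demonstration, not elements of the framework itself.

```python
from dataclasses import dataclass

@dataclass
class Factor:
    name: str
    weight: float    # positive for trust factors, negative for distrust factors
    driver: bool     # True if strong enough to determine the outcome alone

def will_transact(factors, threshold=0.0):
    """Decide whether a party conducts the transaction.

    A Distrust Driver vetoes the transaction on its own; a Trust Driver
    settles it positively (Distrust Drivers are assumed here to take
    precedence). Otherwise the Influencers are summed, which captures the
    'straw that broke the camel's back' effect: one more small negative
    Influencer can tip the accumulated weight past the threshold.
    """
    if any(f.driver and f.weight < 0 for f in factors):
        return False
    if any(f.driver and f.weight > 0 for f in factors):
        return True
    return sum(f.weight for f in factors) > threshold

# Examples drawn from the text: insolvency as a Distrust Driver,
# a refund policy as a mere Trust Influencer.
factors = [
    Factor("return within 7 days for a full refund", +0.3, driver=False),
    Factor("proven incapacity to deliver (insolvency)", -1.0, driver=True),
]
print(will_transact(factors))  # the Distrust Driver determines the outcome: False
```

The illustrative weights matter only for Influencers; by definition, a Driver's presence alone settles the question, which is why the sketch checks Drivers before summing anything.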

Beyond commercial aspects, other clusters of trust factors include the quality, reliability and safety of the tradeable item, its fit to the consumer's circumstances and needs, and privacy. This paper is concerned with the last of these.


4. Implications

The analytical framework introduced in the previous section provides a basis for improving the design of social media services in order to address the privacy concerns of consumers and the consequential privacy risks of service-providers.

4.1 The Design of Social Media Services

The categorisation of trust factors provides designers with a means of focussing on the aspects that matter most. Hedonism and fashion represent drivers for the adoption and use of services. The first priority therefore is to identify and address the Distrust Drivers that undermine adoption of the provider's services.

Some privacy factors will have a significant negative impact on all or most users. Many, however, will be relevant only to particular customer-segments, or will only be relevant to particular users in particular circumstances or for a particular period of time. An important example of a special segment is people whose safety is likely to be threatened by exposure, perhaps of their identity, their location, or sensitive information about them. A comprehensive analysis of the categories of 'persons at risk' is in GFW (2011). Another important segment is people who place very high value on their privacy, preferring (for any of a variety of reasons) to stay well out of the public eye.

A detailed analysis of specific privacy features is in a companion paper (Clarke 2013). Key examples of features that may be Distrust Drivers include reliance on what was referred to in the above framework as 'forced trust', requirements that the user declare their commonly-used identity or so-called 'real name', and default disclosure of the user's geo-location to the service-provider.

The distrust analysis needs to extend to factors that are influencers rather than drivers, because moderate numbers of negative factors, particularly if they frequently rise into a user's consciousness, are likely to morph into an aura of untrustworthiness, and thereby cause the relationship with the user to be fragile rather than loyal.

Examples of active measures that can be used to negate distrust include transparency in relation to Terms of Service (Clarke 2005), the conduct of a privacy impact assessment, including focus groups and consultations with advocacy organisations (Wright & de Hert 2012), mitigation measures where privacy risks arise, prepared countermeasures where actual privacy harm arises, and prepared responses to enquiries and accusations about privacy harm.

Although overcoming distrust offers the greater payback to service-providers, they may benefit from attention to Trust Drivers and Trust Influencers as well. This will also be in many cases to the benefit of users. Factors of relevance include privacy-settings that are comprehensive, clear and stable; conservative defaults; means for users to manage their profile-data; consent-based rather than unilateral changes to Terms of Service; express support for pseudonymity and multiple identifiers; and inbuilt support and guidance in relation to proxy-servers and other forms of obfuscation. Some users will be attracted by the use of peer-to-peer (P2P) architecture with dispersed storage and transmission paths rather than the centralisation of power that is inherent in client-server architecture.
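One way a design team might operationalise such a review is to encode candidate features and flag likely Distrust Drivers first, since those undermine adoption outright. The following is a minimal sketch under stated assumptions: the feature names and their classifications are hypothetical, drawn loosely from the examples above, and a real review would derive them from a privacy impact assessment and consultation rather than a hard-coded list.

```python
# Hypothetical classifications for illustration only.
DISTRUST_DRIVERS = {
    "real_name_required",
    "geolocation_disclosed_by_default",
    "unilateral_terms_changes",
}
TRUST_INFLUENCERS = {
    "comprehensive_stable_privacy_settings",
    "conservative_defaults",
    "pseudonym_support",
    "consent_based_terms_changes",
}

def review_features(features):
    """Partition a service's feature set into the categories that matter
    most to a distrust analysis. Unclassified features are listed
    separately for manual assessment."""
    return {
        "distrust_drivers": sorted(features & DISTRUST_DRIVERS),
        "trust_influencers": sorted(features & TRUST_INFLUENCERS),
        "unclassified": sorted(features - DISTRUST_DRIVERS - TRUST_INFLUENCERS),
    }

report = review_features({"real_name_required", "pseudonym_support", "p2p_architecture"})
print(report["distrust_drivers"])   # ['real_name_required']
```

Because attitudes vary among consumer segments, a fuller version would carry a separate classification per segment rather than the single global sets used here.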

4.2 Research Opportunities

The copious media coverage of privacy issues in the social media services marketplace provides considerable raw material for research (Clarke 2013). A formal research literature is only now emerging, and opportunities exist for significant contributions to be made.

The analysis conducted in this paper readily gives rise to a number of research questions. For example:

An example of the kinds of trade-off research that would pay dividends is provided by Kaplan & Haenlein (2010, p.67): "In December 2008, the fast food giant developed a Facebook application which gave users a free Whopper sandwich for every 10 friends they deleted from their Facebook network. The campaign was adopted by over 20,000 users, resulting in the sacrificing of 233,906 friends in exchange for free burgers. Only one month later, in January 2009, Facebook shut down Whopper Sacrifice, citing privacy concerns. Who would have thought that the price of a friendship is less than $2 a dozen?". The vignette suggests that the presumption that people sell their privacy very cheaply may have a corollary - that people's privacy can be bought very cheaply as well.

A wide range of research techniques can be applied to such studies (Galliers 1992, Clarke 2001b, Chen & Hirschheim 2004). Surveys deliver 'convenience data' of limited relevance, testing what people say they do, or would do - and all too often merely what they say they think. Other techniques hold much more promise as means of addressing the research questions identified above, in particular field studies of actual human behaviour, laboratory experiments involving actual human behaviour in controlled environments, demonstrators, and open-source code to implement services or features.


5. Conclusions

Privacy concerns about social media services vary among user segments, and over time. Some categories of people, some of the time, are subject to serious safety risks as a result of self-exposure and exposure by others; many people, a great deal of the time, are subject to privacy harm; and some people simply dislike being forced to be open and exposed, and much prefer to lead a closed life. From time to time, negative public reactions and media coverage have affected many social media providers, including Facebook, Google and Instagram. There are signs that the occurrences are becoming more frequent and more intensive.

The analytical framework presented in this paper offers a means whereby designers can identify aspects of their services that need attention, either to prevent serious harm to their business or to increase the attractiveness of their services to their target markets. Researchers can also apply the framework in order to gain insights into the significance of the various forms of privacy concern.


References

Acquisti A. (2004) `Privacy in Electronic Commerce and the Economics of Immediate Gratification' Proc. ACM Electronic Commerce Conference (EC'04), New York, 21-29, at http://www.heinz.cmu.edu/~acquisti/papers/privacy-gratification.pdf

Adams A.A. (2013) `Facebook Code: SNS Platform Affordances and Privacy' in this issue of the Journal of Law, Information and Science

Ba S. & Pavlou P. (2002) 'Evidence of the effect of trust building technology in electronic markets: Price premiums and buyer behavior' MIS Quarterly 26, 3 (September 2002) 243-268, at http://oz.stern.nyu.edu/rr2001/emkts/ba.pdf

Bankston K. (2009) 'Facebook's New Privacy Changes: The Good, The Bad, and The Ugly' Electronic Frontier Foundation, 9 December 2009, at https://www.eff.org/deeplinks/2009/12/facebooks-new-privacy-changes-good-bad-and-ugly

Barber B. (1983) 'The logic and limit of trust' Rutgers University Press, 1983

Barnes S.B. (2006) 'A privacy paradox: Social networking in the United States' First Monday 11, 9 (September 2006), at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/1394/1312%23

BBC (2011) 'Facebook U-turns on phone and address data sharing' BBC News, 18 January 2011, at http://www.bbc.com/news/technology-12214628

Bell E. (2012) 'The real threat to the open web lies with the opaque elite who run it' The Guardian, 16 April 2012, at http://www.guardian.co.uk/commentisfree/2012/apr/16/threat-open-web-opaque-elite

Best D. (2006) 'Web 2.0: Next Big Thing or Next Big Internet Bubble?' Technische Universiteit Eindhoven, January 2006, at http://www.scribd.com/doc/4635236/Web-2-0

boyd d. (2008) 'Facebook's Privacy Trainwreck: Exposure, Invasion, and Social Convergence' Convergence: The International Journal of Research into New Media Technologies 14, 1 (2008) 13-20

boyd d. & Hargittai E. (2010) 'Facebook privacy settings: Who cares?' First Monday 15, 8 (July 2010), at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/3086/2589

Brown I. & Marsden C.T. (2013) `Regulating Code' MIT Press, 2013

Bruns A. (2009) `From Prosumer to Produser: Understanding User-Led Content Creation' In Transforming Audiences, 3-4 Sep, 2009, London

Carlson N. (2010c) 'Well, These New Zuckerberg IMs Won't Help Facebook's Privacy Problems' Business Insider, 13 May 2010, at http://www.businessinsider.com/well-these-new-zuckerberg-ims-wont-help-facebooks-privacy-problems-2010-5

Cavoukian A. & Hamilton T. (2002) 'The Privacy Payoff: How Successful Businesses Build Consumer Trust' McGraw-Hill Ryerson Trade, 2002

Chen W. & Hirschheim R. (2004) 'A paradigmatic and methodological examination of information systems research from 1991 to 2001' Information Systems Journal 14, 3 (2004) 197-235

Clarke R. (1994) 'Information Infrastructure for The Networked Nation' Xamax Consultancy Pty Ltd, November 1994, at http://www.rogerclarke.com/II/NetNation.html, Extract from Section 2.4 at http://www.rogerclarke.com/II/NetN2.html

Clarke R. (1996) 'Privacy, Dataveillance, Organisational Strategy' Keynote Address, I.S. Audit & Control Association Conf. (ISACA / EDPAC'96), Perth, 28 May 1996, at http://www.rogerclarke.com/DV/PStrat.html

Clarke R. (1997) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' Xamax Consultancy Pty Ltd, August 1997, at http://www.rogerclarke.com/DV/Intro.html

Clarke R. (1999a) 'The Willingness of Net-Consumers to Pay: A Lack-of-Progress Report' Proc. 12th International Bled Electronic Commerce Conference, Bled, Slovenia, June 7 - 9, 1999, at http://www.rogerclarke.com/EC/WillPay.html

Clarke R. (1999b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conf. Privacy and Personal Data Protection, Hong Kong, September 1999. Revised version published in Info. Techno. & People 14, 1 (2001) 206-231, at http://www.rogerclarke.com/DV/PLT.html

Clarke R. (2001a) 'Meta-Brands' Privacy Law & Policy Reporter 7, 11 (May 2001) , at http://www.rogerclarke.com/DV/MetaBrands.html

Clarke R. (2001b) 'If e-Business is Different Then So is Research in e-Business' Proc. IFIP TC8 Working Conference on E-Commerce/E-Business, Salzburg, 22-23 June 2001, at http://www.rogerclarke.com/EC/EBR0106.html

Clarke R. (2002a) 'Trust in the Context of e-Business' Internet Law Bulletin 4, 5 (February 2002) 56-59, at http://www.rogerclarke.com/EC/Trust.html

Clarke R. (2002b) 'e-Consent: A Critical Element of Trust in e-Business' Proc. 15th Bled Electronic Commerce Conference, Bled, Slovenia, 17-19 June 2002, at http://www.rogerclarke.com/EC/eConsent.html

Clarke R. (2004) 'Very Black 'Little Black Books'' Xamax Consultancy Pty Ltd, February 2004, at http://www.rogerclarke.com/DV/ContactPITs.html

Clarke R, (2005) 'Privacy Statement Template' Xamax Consultancy Pty Ltd, December 2005, at http://www.rogerclarke.com/DV/PST.html

Clarke R. (2006a) 'What's 'Privacy'?' Submission to the Australian Law Reform Commission, Xamax Consultancy Pty Ltd, July 2006, at http://www.rogerclarke.com/DV/Privacy.html

Clarke R. (2006b) 'Make Privacy a Strategic Factor - The Why and the How' Cutter IT Journal 19, 11 (October 2006), at http://www.rogerclarke.com/DV/APBD-0609.html

Clarke R. (2008a) 'B2C Distrust Factors in the Prosumer Era' Proc. CollECTeR Iberoamerica, Madrid, 25-28 June 2008, pp. 1-12, Invited Keynote Paper, at http://www.rogerclarke.com/EC/Collecter08.html

Clarke R. (2008b) 'Web 2.0 as Syndication' Journal of Theoretical and Applied Electronic Commerce Research 3,2 (August 2008) 30-43, at http://www.jtaer.com/portada.php?agno=2008&numero=2#, Preprint at http://www.rogerclarke.com/EC/Web2C.html

Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, at http://www.rogerclarke.com/EC/CCC.html

Clarke R. (2013) 'Consumer-Oriented Social Media: The Identification of Key Characteristics' Xamax Consultancy Pty Ltd, January 2013, at http://www.rogerclarke.com/II/COSM-1301.html

Clarke R. & Wigan M.R. (2011) 'You Are Where You've Been The Privacy Implications of Location and Tracking Technologies' Journal of Location Based Services 5, 3-4 (December 2011) 138-155, at http://www.rogerclarke.com/DV/YAWYB-CWP.html

Cohen J.E. (1996) 'A Right to Read Anonymously: A Closer Look at 'Copyright Management' In Cyberspace' 28 Conn. L. Rev 981 (1996) , at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=17990

Deighton J. & Kornfeld L. (2009) `Interactivity's Unanticipated Consequences for Marketers and Marketing' Journal of Interactive Marketing 23, 1 (February 2009) 4-10

Ellsberg D. (2013) 'Edward Snowden: saving us from the United Stasi of America' The Guardian, 10 June 2013, at http://www.theguardian.com/commentisfree/2013/jun/10/edward-snowden-united-stasi-america

EC (2012) `Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)` European Commission, 25 January 2012, COM(2012) 11 final 2012/0011 (COD), at http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf

Esguerra R. (2009) `Google CEO Eric Schmidt Dismisses the Importance of Privacy' Electronic Frontiers Foundation, 10 December 2009, at https://www.eff.org/deeplinks/2009/12/google-ceo-eric-schmidt-dismisses-privacy

Finn R.L., Wright D. & Friedewald M. (2013) `Seven Types of Privacy', Chapter in `European Data Protection: Coming of Age' Ed. S. Gutwirth et al.. Dordrecht: Springer Science+Business Media, 2013, at http://works.bepress.com/michael_friedewald/60

Galliers, R.D. (1992) 'Choosing Information Systems Research Approaches', in Galliers R.D. (ed., 1992) 'Information Systems Research: Issues, Methods and Practical Guidelines', Blackwell, 1992. pp. 144-162

GFW (2011) 'Who is harmed by a "Real Names" policy?' Geek Feminism Wiki, at http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy%3F, accessed 3 August 2011

Greenleaf G.W. (2003) 'IP, Phone Home: Privacy as Part of Copyright's Digital Commons in Hong Kong and Australian Law' in Lessig L. (ed) 'Hochelaga Lectures 2002: The Innovation Commons' Sweet & Maxwell Asia, Hong Kong, 2003, at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2356079

Guynn J. (2013) 'Privacy watchdog EPIC files complaint against Snapchat with FTC' Los Angeles Times, 17 May 2013, at http://www.latimes.com/business/technology/la-fi-tn-privacy-watchdog-epic-files-complaint-against-snapchat-with-ftc-20130517,0,3618395.story#axzz2krJ3636V

Harris W. (2006) 'Why Web 2.0 will end your privacy' bit-tech.net, 3 June 2006, at http://www.bit-tech.net/columns/2006/06/03/web_2_privacy/

Helft M. (2010) 'Critics Say Google Invades Privacy With New Service' The New York Times, 12 February 2010, at http://www.nytimes.com/2010/02/13/technology/internet/13google.html?_r=1

Hendler J. & Golbeck J. (2008) 'Metcalfe's law, Web 2.0, and the Semantic Web' Web Semantics: Science, Services and Agents on the World Wide Web 6, 1 (2008) 14-20, at http://smtp.websemanticsjournal.org/index.php/ps/article/download/130/128

Hirshleifer J. (1980) `Privacy: Its Origin, Function and Future' The Journal of Legal Studies 9, 4 (Dec. 1980) 649-664

Jones K.S. (2003) `Privacy: what's different now?' Interdisciplinary Science Reviews 28, 4 (December 2003) 287-292

Kaplan A.M. & Haenlein M. (2010) 'Users of the world, unite! The challenges and opportunities of social media' Business Horizons 53, 1 (Jan-Feb 2010) 59-68, slide-set at http://www.slideshare.net/studente1000/kaplan-andreas-m-haenlein-michael-2010-users-of-the-world-unite-the-challenges-and-opportunities-of-social-media-business-horizons-vol-53-issue-1-p-5968

Katz M.L. & Shapiro C. (1994) 'Systems Competition and Network Effects' The Journal of Economic Perspectives 8, 2 (Spring 1994) 93-115, at http://brousseau.info/pdf/cours/Katz-Shapiro%5B1994%5D.pdf

Kramer R.M. & Tyler T.R. (eds.) (1996) 'Trust in Organizations: Frontiers of Theory and Research' Sage Research, 1996

Krotoski A. (2012) 'Online identity: is authenticity or anonymity more important?' The Guardian, 19 April 2012, at http://www.guardian.co.uk/technology/2012/apr/19/online-identity-authenticity-anonymity/print

Lankton N. & McKnight D. H. (2011) 'What Does it Mean to Trust Facebook? Examining Technology and Interpersonal Trust Beliefs' The Data Base for Advances in Information Systems 42, 2 (2011) 32-54

Lee M.K.O. & Turban E. (2001) 'A Trust Model for Consumer Internet Shopping' Int'l J. of Electronic Commerce 6, 1 (Fall 2001) 75-91

Levine R., Locke C., Searls D. & Weinberger D. (1999) 'The Cluetrain Manifesto: The End of Business As Usual' Perseus Books, 1999

Lindsay D. (2005) 'An Exploration of the Conceptual Basis of Privacy and the Implications for the Future of Australian Privacy Law' Melb. Uni. L. Rev. 29, 1 (2005) 131-178

McKeon M. (2010) 'The Evolution of Privacy on Facebook' Self-Published, May 2010, at http://mattmckeon.com/facebook-privacy/

McKnight D.H. & Chervany N.L. (2001a) 'While trust is cool and collected, distrust is fiery and frenzied: A model of distrust concepts' Proc. 7th Americas Conference on Information Systems, 2001, pp. 883-888

McKnight D.H. & Chervany N.L. (2001b) 'Trust and Distrust Definitions: One Bite at a Time' in R. Falcone, M. Singh, and Y.-H. Tan (Eds.): Trust in Cyber-societies, LNAI 2246, Springer-Verlag Berlin Heidelberg 2001, pp. 27-54

McKnight D.H. & Chervany N.L. (2006) 'Distrust and Trust in B2C E-Commerce: Do They Differ?' Proc. ICEC'06, August 14-16, 2006, Fredericton, Canada

Mann S. (2002) 'Cyborglogs ("glogs")' Wearcam.org, 2002, at http://wearcam.org/glogs.htm

Marks G. (2013) 'Why Facebook Is In Decline' Forbes, 19 August 2013, at http://www.forbes.com/sites/quickerbettertech/2013/08/19/why-facebook-is-in-decline/

Mayer R.C., Davis J.H. & Schoorman F.D. (1995) 'An Integrative Model of Organizational Trust' Academy of Management Review 20, 3 (1995) 709-734

Melnik M.I. & Alm J. (2002) `Does a seller's ecommerce reputation matter? Evidence from eBay auctions' The Journal of Industrial Economics 50, 3 (September 2002) 337-349, at http://www.gsu.edu/~wwwsps/publications/2002/ebay.pdf

Michael K. & Clarke R. (2013) `Location and Tracking of Mobile Devices: Überveillance Stalks the Streets' Computer Law & Security Review 29, 3 (June 2013) 216-228, at http://www.rogerclarke.com/DV/LTMD.html

Morison W.L. (1973) 'Report on the Law of Privacy' Govt. Printer, Sydney 1973

Nissenbaum H. (2009) `Privacy in Context: Technology, Policy, and the Integrity of Social Life' Stanford University Press, 2009

NYT (2010) 'Facebook Privacy: A Bewildering Tangle of Options' The New York Times, 12 May 2010, at http://www.nytimes.com/interactive/2010/05/12/business/facebook-privacy.html

O'Connor R. (2012) 'Facebook is Not Your Friend' Huffington Post, 15 April 2012, at http://www.huffingtonpost.com/rory-oconnor/facebook-privacy_b_1426807.html

Opsahl K. (2010) 'Facebook's Eroding Privacy Policy: A Timeline' Electronic Frontier Foundation, 28 April 2010, at https://www.eff.org/deeplinks/2010/04/facebook-timeline/

O'Reilly T. (2005) 'What Is Web 2.0? Design Patterns and Business Models for the Next Generation of Software' O'Reilly, 30 September 2005, at http://www.oreillynet.com/lpt/a/6228

Otte E. & Rousseau R. (2002) 'Social network analysis: a powerful strategy, also for the information sciences' Journal of Information Science 28, 6 (2002) 441-453, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.95.3227&rep=rep1&type=pdf

Peppers D. & Rogers M. (1993) 'The One to One Future : Building Relationships One Customer at a Time', Doubleday, 1993

Petersen S.M. (2008) `Loser Generated Content: From Participation to Exploitation' First Monday (March 2008), at http://firstmonday.org/ojs/index.php/fm/article/view/2141/1948

Posner R.A. (1981) `The Economics of Privacy' The American Economic Review 71, 2 (May 1981) 405-409

Rey P.J. (2012) `Alienation, Exploitation, and Social Media' American Behavioral Scientist 56, 4 (April 2012) 399-420

Riegelsberger J. & Sasse M.A. (2001) 'Trustbuilders and Trustbusters: The Role of Trust Cues in Interfaces to e-Commerce Applications' I3E, 17-30, at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.17.8688&rep=rep1&type=pdf

RT (2013) `Google: Gmail users `have no legitimate expectation of privacy'' rt.com, 13 August 2013, at http://rt.com/usa/google-gmail-motion-privacy-453/

Salo J. & Karjaluoto H. (2007) 'A conceptual model of trust in the online environment' Online Information Review 31, 5 (September 2007) 604 - 621

Schneier B. (2012) 'Liars and Outliers: Enabling the Trust that Society Needs to Thrive' Wiley, 2012

Schoeman F.D. (ed.) (1984) 'Philosophical Dimensions of Privacy: An Anthology' Cambridge University Press, 1984

Shiels M. (2010) `Facebook's bid to rule the web as it goes social' BBC News, 22 April 2010, at http://news.bbc.co.uk/2/hi/8590306.stm

SMF (2013) 'The Decline of Facebook' Social Media Frontiers, 24 September 2013, at http://www.socialmediafrontiers.com/2013/09/social-media-news-decline-of-facebook.html

Solove D.J. (2006) 'A Taxonomy of Privacy' University of Pennsylvania Law Review 154, 3 (January 2006) 477-560

Solove D. (2008) `Understanding Privacy' Harvard University Press, Cambridge MA, 2008

Sprenger P. (1999) 'Sun on privacy: Get over it' Wired, 26 Jan 1999, at http://www.wired.com/politics/law/news/1999/01/17538

Tapscott D. & Williams A.D. (2006) 'Wikinomics: How Mass Collaboration Changes Everything' Portfolio, 2006

Toffler A. (1970) 'Future Shock' Pan, 1970

Toffler A. (1980) 'The Third Wave' Pan, 1980

Waugh R. (2012) ''Unfair and unwise': Google brings in new privacy policy for two billion users - despite EU concerns it may be illegal' Daily Mail, 2 March 2012, at http://www.dailymail.co.uk/sciencetech/article-2108564/Google-privacy-policy-changes-Global-outcry-policy-ignored.html

Wigan M.R. & Clarke R. (2013) 'Big Data's Big Unintended Consequences' IEEE Computer 46, 3 (June 2013), at http://www.rogerclarke.com/DV/BigData-1303.html

Williamson O. (1993) 'Calculativeness, trust, and economic organization' Journal of Law and Economics 36 (1993) 453-486

Wright D. & de Hert P. (eds.) (2012) `Privacy Impact Assessment' Springer, 2012

Yamagishi T. (2011) 'Trust: The Evolutionary Game of Mind and Society' Springer, 2011


Acknowledgements

This paper has been in gestation for a decade. Its drafting was stimulated by an invitation from Niels Christian Juul to visit Roskilde University in Denmark in June 2012 to present a seminar on privacy, trust and user-involvement, stressing the research perspective. This was followed by an invitation from Kiyoshi Murata and Andrew Adams at Meiji University to present at the Asian Privacy Scholars Network Conference in Tokyo in November 2012 and at an associated research workshop. The feedback from Niels Christian Juul, Andrew Adams and participants in those events assisted in clarification of the analysis. The further comments of the reviewers and editor materially contributed to the academic quality of the paper.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.




Created: 13 April 2012 - Last Amended: 18 November 2013 by Roger Clarke - Site Last Verified: 15 February 2009