
Key Factors in the Limited Adoption of End-User PETs

Preparatory Notes for a Panel-Session on
'Using Technologies: How Can We Better Promote Useable,
Effective Privacy-Enhancing / Anti-Surveillance Technologies?'
Politics of Surveillance Workshop
University of Ottawa, May 8-10, 2014

Emergent Draft of 26 April 2014

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2014

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.

This document is at http://www.rogerclarke.com/DV/UPETs-1405.html


Abstract

PETs don't get adopted by users. Why not? And what can we do about it? This paper considers the evidence available in the literature relating to the adoption of innovations, impediments to adoption, and usability. It argues that PETs have failed because (1) they are mostly research artefacts rather than tools designed for people to use; (2) their design has been based on the presumption that the user is 'everyman'; and (3) they are standalone rather than integrated into the infrastructure that individuals depend upon. It is proposed that designers must identify relevant user segments, assess each segment's risks, target their needs, and slide PET functionality out of sight, 'under the bonnet', so that 'privacy just happens'.


Contents

1. Introduction
2. Background
3. Usability Factors in PETs
4. Beyond Usability
5. PET Integration
6. Conclusions

1. Introduction

Privacy Enhancing Technologies (PETs) are tools, standards and protocols that directly assist in the protection of the privacy interest. It was recognised at an early stage of the Internet era that new threats to privacy would emerge, and that protections would need to be engineered.

The term 'privacy-enhanced' dates to at least the mid-1980s, in the Privacy-Enhanced Mail (PEM) specifications in the IETF RFC series: RFC 989 (February 1987), RFC 1040 (January 1988) and RFCs 1113-1115 (August 1989). These defined a 'Privacy Enhancement for Internet Electronic Mail'. The term referred, however, only to the narrow concept of message transmission security, and its requirements of confidentiality, authentication and message integrity assurance.

As relevant tools emerged during the early-to-mid-1990s, straightforward descriptors would have been 'privacy protective technologies', 'privacy-supportive technologies', or 'privacy-sympathetic technologies' (Clarke 1999). Instead, the existing adjective was adapted to 'privacy-enhancing' and prepended to 'technologies'. An early use of the phrase, at that stage without the acronym, was in CPSR (1991). The earliest use found by Google Scholar is in a later CPSR Newsletter (Agre 1995). In that year, John Borking of the Netherlands Data Protection Commissioner's Office and Ontario Information Commissioner Ann Cavoukian used it as the title of a joint report (IPCR 1995). See also Burkert (1997) and Goldberg et al. (1997). At that stage, the focus was on only one of the three categories of PET discussed below - tools for strong anonymity.

During the intervening two decades, a great deal has been published, and many tools have been made available. Two series of annual conferences exist, the Privacy Enhancing Technologies Symposium, and the Symposium On Usable Privacy and Security (SOUPS). Three substantial EU-funded research projects have published a series of contributions, Privacy in Software Agents (PISA, 2000-02), Privacy and Identity Management for Europe (PRIME, 2006-08) and PrimeLife (2009-11). In parallel, there has been an ongoing explosion in Privacy-Invasive Technologies (referred to by this author using the parallel term 'the PITs'), as corporations and government agencies seek variously to control citizens and exploit consumers, through their digital personae. Yet, despite both the well-demonstrated need for PETs, and at least some degree of availability, take-up has been exceedingly low.

The Panel that stimulated this paper was concerned with 'How Can We Better Promote Useable, Effective Privacy-Enhancing / Anti-Surveillance Technologies?'. The generic research question that I address in this paper can be expressed as 'Why have PETs not been adopted, and what guidance is available in order to achieve higher levels of adoption in the future?'.

A modest literature exists addressing the question of why organisations don't design and implement PETs, and what can be done to stimulate organisations to do so (Borking 2003, Clarke 2008a, Borking 2011, Harbach et al. 2013). Concepts such as business ethics, corporate social responsibility and 'beneficence' have been invented by academics in business schools; but there is little evidence of behaviour by organisations that suggests that such ideas are anything more than academic fluff. Organisations do things for which there are business drivers. These are primarily revenue-generation, protection and enhancement of market-share, and cost-savings, but also market power exercised by competitors and (although in recent years less frequently) institutional power exercised by regulatory agencies.

This paper does not address the 'supply-push' approach whereby organisations implement PETs for the benefit of their users. Instead, the focus here is on the 'demand-pull' dimension - or 'end-user PETs', as the PRIME Project calls them. In the absence of business drivers, and with little prospect of laws being sufficient to cause organisations to implement privacy-sensitive designs, the onus is on consumers and citizens to protect themselves. Many people recognise that organisations abuse personal data, yet very few seek out and adopt PETs. This paper seeks explanations as to why that is so, with a view to formulating remedies.


2. Background

PETs are innovations. Relevant literature on innovation diffusion, and on the adoption of information technology innovations in particular, can provide insights into the extent to which they are and are not taken up.

2.1 Diffusion of Innovations

Innovation is the step beyond invention, and is concerned with the deployment of one or more ideas in the real world. A body of work that is commonly relied upon in describing and seeking to understand adoption trajectories is the Diffusion of Innovation (DOI) theory developed by Rogers (1962). According to that theory, technological innovation is communicated through particular channels, over time, among the members of a social system. For a brief summary, see Clarke (1991).

The most relevant aspect of DOI theory for the purposes of the present discussion is its postulates as to the important characteristics of a successful innovation. These are:

- relative advantage over the idea or artefact that it supersedes;
- compatibility with existing values, past experiences and needs;
- low complexity, i.e. ease of comprehension and use;
- trialability, i.e. the scope to experiment with it on a limited basis; and
- observability, i.e. the visibility of its results to others.

This theory has been much applied during the last 50 years, including to the diffusion of information technology innovations. It is therefore important to keep these factors in mind when considering the diffusion of PETs.

2.2 Information Technology Adoption

DOI theory originated in rural settings, and applied to innovations of various kinds. In relation to the adoption of technologies specifically, a substantial amount of empirical research has been driven by the Technology Acceptance Model (TAM). The original source of TAM theory is Davis (1989), and successive refinements to the model are in Venkatesh & Davis (1996, 2000) and Venkatesh (2000). A useful exposition is in Chuttur (2009).

TAM posits that adoption of an IT innovation is dependent on an intermediating variable, 'Behavioural Intention', which is in turn dependent on 'Perceived Usefulness' and 'Perceived Ease of Use'. Key factors influencing those variables are listed in Figure 1.

Figure 1: Primary Variables in TAM Theory

These variables to some extent echo DOI theory. Moderate empirical support for various aspects of the model has been established. Although TAM has been widely criticised, e.g. by Benbasat & Barki (2007), there is a clear need for PET implementors to take into account the accumulated knowledge about technology adoption.
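To make TAM's structure concrete, the relationships are conventionally estimated as linear structural equations of roughly the following form (an illustrative rendering only; the notation is not drawn verbatim from Davis's papers):

```latex
\begin{align}
PU  &= \beta_{1}\,PEOU + \beta_{2}\,X + \varepsilon_{1} \\
BI  &= \beta_{3}\,PU + \beta_{4}\,PEOU + \varepsilon_{2} \\
Use &= \beta_{5}\,BI + \varepsilon_{3}
\end{align}
```

where PU is Perceived Usefulness, PEOU is Perceived Ease of Use, BI is Behavioural Intention, and X stands for external variables such as system characteristics and training.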

A further theory of relevance is Technology Threat Avoidance Theory (TTAT - Liang & Xue 2009). As represented in Figure 2, TTAT posits that the key factors determining whether users take action to avoid technological threats are the perception of threat - i.e. whether the user is aware that they or their devices are susceptible to a threat, and the severity that they associate with it - and the perception of avoidability - i.e. whether the user perceives a PET as being effective, its costs (including time, money, inconvenience and comprehension), and the user's 'self-efficacy', or self-confidence about implementing the PET.

Figure 2: Variables in TTAT Theory
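A comparably hedged rendering of the TTAT dependencies just described, with signs indicating the posited direction of influence of each factor:

```latex
\begin{align}
PerceivedThreat     &= f(\,Susceptibility^{(+)},\ Severity^{(+)}\,) \\
AvoidanceMotivation &= g(\,PerceivedThreat^{(+)},\ Effectiveness^{(+)},\ Cost^{(-)},\ SelfEfficacy^{(+)}\,)
\end{align}
```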

2.3 Adoption of End-User PETs

The experience has been that PETs have achieved very limited adoption.

[Evidence needed here!!! Where are the studies of adoption levels??]

[The vignette about the German ID Card can go here - but it's an organisational PET rather than an end-user PET.]

[Include the one major counter-example: TLS, in particular https?]

[The lesson is that designing and implementing an element is not enough, no matter how good it is. An innovation is not an element, but a suite of interacting elements, that together satisfy the multiple characteristics identified in s.2.2.]

It has long been observed that PETs lack user-friendliness, e.g. Cas & Hafskjold (2006). Norcie et al. (2012) identified many 'stop-points' in the installation and use of the Tor Browser Bundle. Fahl et al. (2012) sought ways to manage encryption keys in automated but secure ways, such that little or no user involvement is necessary.

It is important to learn from both product-specific guidance and past failures, in order to formulate guidance for the design and implementation of future PETs. The following section considers the main body of literature, whose focus is on the interactions between a user and their PETs.


3. Usability Factors in PETs

A range of literatures exist in the area of the usability of artefacts, including 'usability engineering' (Nielsen 1993), Human-Computer Interaction (HCI) theory, User Interface (UI) design theory, and 'the design of everyday things' (Norman 2000). Nielsen identified five "usability attributes": learnability, efficiency of use, memorability, lowness of error-rate, and satisfaction; and ISO Standard 9241-11, published in 1998, identified the key elements of usability as being effectiveness, efficiency and satisfaction, and learnability.

A body of work highly relevant to the study of PET adoption is the 'usable security' literature. As part of their study of the usability of PGP, Whitten & Tygar (1999) proposed that security software is usable if the people who are expected to use it:

- are reliably made aware of the security tasks they need to perform;
- are able to figure out how to successfully perform those tasks;
- don't make dangerous errors; and
- are sufficiently comfortable with the interface to continue using it.

Garfinkel & Miller (2005) applied the Whitten & Tygar approach to investigate the usability of Key Continuity Management as an alternative to PGP. In Clark et al. (2007), the 'core tasks' needed to install and use Tor were defined, and the following set of guidelines proposed, and tested:

A comprehensive model for user interface design for privacy is proposed in PRIME (2008), particularly pp. 32-48. This draws on Patrick et al. (2002), but is driven by privacy principles and in particular the EU Directive. It identifies four HCI requirements for privacy: comprehension, consciousness, control and consent. A second set of 'socio-cultural' characteristics to achieve adoption is declared as comprising configurability, minimised skill requirements, accountability, trust in communications infrastructure, trust in transaction partners and affordability. Thirdly, leveraging off the 'usable security' notions in Herzog & Shahmehri (2007), the following PET design guidelines were proposed:

A subsequent EU-funded project resulted in PrimeLife (2011). This asserts the need for a User-Centric Design Process, and notes the following challenges, which represent impediments to the adoption of end-user PETs:

PrimeLife offers HCI guidelines for the design of usable PETs, including a set of 11 heuristics that were adapted from prior HCI principles, and a further 4 that are PET-specific:

Surprisingly, these various sets of guidelines appear to overlook a message learnt many times over in user interface design. Any product of moderate functional power has a large number of parameters, and careful selection of a collection of settings that effectively expresses a particular person's needs is a significant task that demands education, comprehension and concentration. The well-known solution is the pre-configuration of a modest number of profiles, together with the abilities to select parameter-values in bulk and to override them in detail where desired. A further guideline is accordingly proposed:

- Provide a modest number of pre-configured profiles, each expressing the needs of an identifiable user segment, such that parameter-values can be selected in bulk and overridden in detail where desired.
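The intent of that guideline can be illustrated with a minimal sketch. The profile names, parameters and values below are invented for the purpose, not drawn from any existing PET:

```python
# Illustrative only: profile-based configuration, with bulk selection of
# parameter-values and per-parameter override.

PROFILES = {
    # Each profile pre-configures the full parameter set for a user segment.
    "everyday":      {"third_party_cookies": "allow", "scripts": "allow", "referer": "send"},
    "privacy-aware": {"third_party_cookies": "block", "scripts": "ask",   "referer": "trim"},
    "at-risk":       {"third_party_cookies": "block", "scripts": "block", "referer": "omit"},
}

def configure(profile_name, **overrides):
    """Select a pre-configured profile in bulk, then override in detail."""
    settings = dict(PROFILES[profile_name])   # bulk selection
    for key, value in overrides.items():      # detailed override where desired
        if key not in settings:
            raise KeyError("unknown parameter: %s" % key)
        settings[key] = value
    return settings

# A user might start from the 'at-risk' profile, but permit scripts case-by-case:
print(configure("at-risk", scripts="ask"))
```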

In Camp (2013), principles of 'translucent security' were proposed: high security defaults, single-click override, context-specific settings, personalised settings, and use-based settings. Camp regards it as being vital to provide the user with the choice to knowingly take risk-seeking actions. Security is only one dimension of privacy. However, these guidelines are relevant to that dimension at least, and may have relevance to other aspects of privacy, such as collection practices; assessment of data quality; determination of the justification for collection, use, disclosure and retention; openness about the handling of personal data; and subject access to data about themselves.

A case study of the German National ID card - an organisational PET but with hooks for end-user PETs - identified lack of perceived relevance, complexity and lack of control as impediments, and proposed that a PET needs to be effortlessly integrated into personal activities (Harbach et al. 2013).

An aspect that must not be overlooked is the capacity for usability factors, far from being positive adoption factors, to become impediments to adoption. By their nature, PETs block, or detect and combat, privacy-invasive features of services. PETs may therefore give rise to slow service, degraded service, or non-service, from the many organisations whose implementations have privacy-invasive features (whether those features were intentionally designed in, or were inherited from the providers and software libraries that the organisation depends on). A PET that simply 'blocks' or 'degrades' privacy-abusive services is likely to frustrate its user. Even if it 'notifies' that a service is blocked, a user may be unhappy with the PET's performance. The user may be more relaxed about the blockage if the PET also 'explains' why the service is blocked or degraded. But users are likely to want a finer degree of control over a PET than a 'take it or leave it' on/off-switch. The following is proposed as a refinement to guideline H7 above, which merely required that end-users 'have control':

- Provide graduated controls, whereby the user can determine, for particular services and features, whether the PET blocks, degrades, notifies, explains, or permits.
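By way of illustration, the graduated control proposed above might take a form like the following sketch, in which the service names, features and explanatory text are all invented:

```python
# Illustrative only: per-service, per-feature control, rather than a
# single on/off switch, with notification and explanation of interventions.
from enum import Enum

class Mode(Enum):
    BLOCK = "block"       # refuse the privacy-invasive feature outright
    DEGRADE = "degrade"   # e.g. strip trackers but deliver the content
    ALLOW = "allow"       # the user has knowingly accepted the exposure

DEFAULT_MODE = Mode.BLOCK

# User-overridable settings, keyed by (service, feature).
policy = {
    ("news.example.com", "third-party-trackers"): Mode.DEGRADE,
    ("bank.example.com", "session-cookies"): Mode.ALLOW,
}

def decide(service, feature):
    mode = policy.get((service, feature), DEFAULT_MODE)
    if mode is Mode.BLOCK:
        # Notify *and* explain, so that the blockage does not merely frustrate.
        print("Blocked %s on %s: it would disclose your browsing behaviour "
              "to a third party." % (feature, service))
    return mode

decide("ads.example.net", "third-party-trackers")  # unknown service: BLOCK, with explanation
```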


4. Beyond Usability

As shown by the above survey, the existing literature on PET adoption is primarily concerned with user interfaces and the somewhat broader notion of HCI. This section identifies and articulates several aspects of end-user PETs that have emerged in the discussion, and that, it is contended, require much more attention than they have been given to date.

4.1 The Key Research Questions

The first observation about the existing literature is that the conception of 'end-user' is generic, and vague. It appears to have been unusual for designers to undertake a careful investigation of users' needs, define a target market, and design to satisfy that specific market. Hence the first detailed research question that demands attention is:

(1) To what extent have PETs been designed without any clear conception of the kinds of users who would apply them, and to what extent have they been designed to target specific needs?

Very little of the literature considers the architectural features that end-user PETs require, nor the means whereby they would be delivered into the possession of their intended end-users, installed, configured, demonstrated and trialled. This leads to the following further research questions:

(2) To what extent have PETs been research artefacts, and to what extent have they been productised?

(3) To what extent have PETs been developed as standalone tools, and to what extent have they been packaged, and integrated with functional tools such as OS, office suites, email-clients and web-browsers?

(4) To what extent have PETs been pre-installed into consumer products, such that they arrive 'off-the-shelf', as part of the product, whether in the base product or as an alternative, enhanced product?

(5) To what extent have PETs been conceived and implemented as monolithic products, rather than in a modular manner, particularly in relation to the HCI and user interface 'skin'?

The approach adopted in the remainder of this article is as follows. The many and varied forms of PET are categorised according to a variety of criteria. The most promising of these various bases is judged to be user segments, and some specific categories of users are identified as being appropriate targets for PET adoption.

It is proposed that risk assessment be performed in respect of each user segment, in order to provide guidance for designers as to the elements that need to be incorporated into an appropriate privacy-friendly product. An example of such a risk assessment is provided. Two miniature case studies are then offered, as an indication of the rather different approach to PET construction that the analysis conducted in this paper leads towards.

4.2 Categorisation of End-User PETs

A wide array of tools has emerged that fits within the umbrella-concept of end-user PETs. This section provides a superficial review, with the primary purposes of identifying useful classification schemes, and establishing to what extent available PETs are standalone or integrated into users' working environments.

The most common organisational basis for lists of PETs is the tool's technical purpose, moderated by its technical context of use. See, for example, Goldberg (2007), BestVPN (2014) and PRISM-Break (2014). Table 1 provides a re-structured version of EPIC's longstanding 'Guide to Practical Privacy Tools'.

Table 1: A Technology-Oriented Categorisation of End-User PETs

After (EPIC 2014)

Device-Management Tools

Traffic-Related Tools

Web-Related Tools

Email-Related Tools

Other Communications-Related Tools

Other Commerce-Related Tools

It is significant that technology-oriented classifications include no category that represents a comprehensive approach to the risks that users face. Each category and instance is valuable, or at least potentially so, but they are not available in bundles, and they are not integrated with the environments in which people work and play.

Since their inception, PETs have existed in a 'self-help' world. Each individual first has to understand what they need. Then they must find, install and configure software, or access services, that address those needs. Hence, rather than a technology-based categorisation, some other basis may serve users better. An early attempt, in Clarke (2001a), distinguished the following:

- 'counter-PITs', which combat specific privacy-invasive technologies;
- 'savage PETs', which deliver strong, in principle uncrackable, anonymity; and
- 'gentle PETs', which seek a balance between privacy and accountability through protected pseudonymity.

However, the approach that would really serve users would be based on their needs, or the risks that they face. A vignette that may throw light on this is the Bradley Manning case. No comprehensive analysis of the technical aspects of the Manning case has come to light, and it is known that the source of the information that led to Manning's arrest was a correspondent whom Manning naively trusted. However, early media reports suggested that Manning used the Tor network to obfuscate the path that the leaked documents took to reach Wikileaks, but did not appreciate that he also needed to obfuscate the net-location from which the documents were introduced to the Tor network, and to encrypt the content to Wikileaks' public key.

Two inferences can be drawn. The first is that a whistleblower faces particular risks, and a single PET of the kinds that have been conceived to date cannot provide all of the necessary safeguards. A comprehensive approach is needed to achieve that. However, not every potential user of PETs is a whistleblower. The second inference is that, in order to deliver value, motivate adoption, and overcome impediments, the most important categorisation basis for end-user PETs is the nature of the particular end-user and their needs.

4.3 Market Segmentation

PETs are of the nature of infrastructure rather than functional tools like an office suite or accounting package or service. The focus needs to be not on 'use-cases', but on 'user-cases', more commonly referred to as user segments.

A seemingly obvious but often overlooked observation is that PETs are not for everyday users. They are for people who perceive the need to use technology to protect their privacy against incursions by organisations and other individuals, and against unsafe and hostile technologies. It is important that Internet architecture and governance be re-conceived, in order to design security in instead of grafting it on. But that endeavour is subject to massive opposition from repressive governments and subversion by national security agencies. Effective end-user PETs are therefore vital, at least for the next two decades, and probably for all time.

Which categories of individual are the most likely adopters? Some people have a preference for privacy, which may be sufficiently strong to motivate effort to achieve privacy in the context of Internet usage. Some people suffer from irrational fears about threats in their environment, including in Internet contexts. However, there are also many readily-definable categories of people for whom paranoia is entirely justified, because they may indeed be subject to electronic surveillance, may interact with others who may be subject to surveillance, may be subject to interference with their traffic or their own devices, and/or may be at risk of physical harm if their location becomes known to some person or organisation. Previous work on generic categories of persons-at-risk includes lists by Clarke (2001b) and GFW (2011).

Drawing on those sources, the following categories are proposed as being in need of PET safeguards:

Table 2: User Segmentation for PETs
aka Categories of Persons-At-Risk


4.4 Risk Assessment

The conventional security model considers stakeholders as perceiving value in assets, which are subject to being harmed as a result of threats impinging on vulnerabilities, without adequate safeguards being in place to deal with that particular eventuality (e.g. Clarke 2013). The process of risk assessment involves analysing a particular context in order to determine what additional safeguards are needed, in order to achieve a security profile that suitably balances those risks against the costs involved in combatting them.

It is important to appreciate that risk assessments for the various user segments identified in the previous section will result in considerably different sets of safeguards. For example, a victim of domestic violence faces a very specific threat - the particular person who it is feared may do them harm. The victim's location is the critical asset that needs to be protected, and a range of strategies can be adopted to achieve that aim, depending on the specific circumstances. In contrast, a dissident, especially one living in a country with a particularly repressive government, faces various threats from an attacker that is far larger and has far greater access to resources and expertise.
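A hedged sketch of how the conventional model might be expressed per user segment follows. The segment entries are purely illustrative, not completed risk assessments:

```python
# Illustrative only: the asset/threat/vulnerability/safeguard model,
# assessed separately for each user segment.
from dataclasses import dataclass, field

@dataclass
class Risk:
    asset: str
    threat: str
    vulnerability: str
    safeguards: list = field(default_factory=list)

    def unaddressed(self):
        """A risk remains live until at least one safeguard addresses it."""
        return not self.safeguards

SEGMENTS = {
    "victim of domestic violence": [
        Risk("current location", "a specific, known individual",
             "location leakage via phone metadata and social media",
             safeguards=["silent number", "location services disabled"]),
    ],
    "dissident": [
        Risk("identity and network of contacts", "a well-resourced state actor",
             "traffic analysis of unprotected communications"),  # not yet addressed
    ],
}

for segment, risks in SEGMENTS.items():
    gaps = [r for r in risks if r.unaddressed()]
    print("%s: %d risk(s) lacking safeguards" % (segment, len(gaps)))
```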

Searches have found few documents that adopt the approach proposed here. Two partial exceptions are Wikileaks (2011) and Lee (2013). The Wikileaks document specifically addressed people wishing to obscure the fact that they were the source of a document that had come into Wikileaks' possession. The organisation operated a 'secure drop box' and a secure chat venue, suggested use of the Tor network, and listed aspects of their site-design intended to provide protection for people making submissions.

Lee presents detailed guidance for whistleblowers seeking to protect themselves against government agencies that seek to unmask them. Lee's analysis nominally begins with a discussion of the 'threat model' that confronts whistleblowers, but unfortunately it is brief and discursive rather than analytical and thorough, and does not represent a risk assessment. As a result, it is unclear what specific risks Lee envisages a whistleblower as facing, nor how each of the recommended tools addresses those risks. The risks that can be inferred from the document are outlined in Table 3.

Table 3: An Indicative Risk Assessment for a Whistleblower

Inferred from Lee (2013)

Lee provides information about the use of the following suite of technologies, complemented by carefully-designed processes:

Table 4: PETs for a Whistleblower

After Lee (2013)

This outline demonstrates firstly that a risk assessment of the circumstances facing a particular category of user is capable of delivering a requirements statement for a comprehensive PET product, and secondly that the features of PET products differ for each of the various user-segments identified in Table 2 above.


5. PET Integration

This section offers a couple of examples of ways in which PETs can be integrated with functional tools, and thereby become available to relevant market segments. These fall a long way short of the needs identified for even the least complex user segments discussed above. The purposes of the examples are to provide insights into the challenges and opportunities, and to do so in the context of important elements of a comprehensive solution.

5.1 A Privacy-Friendly Browser

During the last two decades, the Web has become an indispensable tool for achieving access to information. It is also currently a primary means of conducting economic transactions electronically, and most people use it for social transactions as well. All browsers have:

(a) unintended security vulnerabilities, arising from defects in large and complex code-bases; and

(b) designed-in security vulnerabilities, i.e. intentional features that expose the user's data and behaviour.

All user segments need to be able to, at the very least, use one or more browsers with features that address those vulnerabilities. At one stage, the world was given the impression that Mozilla was consumer-friendly, but its current products are all vulnerable-by-design, with optional add-ons that partially address some of the problems, and their directions of development are increasingly hostile to users.

There are many sources of plugins that address particular problems, but they are all piecemeal, and it is difficult for a user to find even a moderately protected browser that is available for download. One such offering is WhiteHat Aviator, at https://www.whitehatsec.com/aviator/. This comprises Chromium, pre-installed with a set of plug-ins that address many aspects of problem (b) above, the designed-in security vulnerabilities. Chromium is widely-regarded as the browser that has the least-worst profile in relation to unintended security vulnerabilities, which addresses much of problem (a). Might this represent a privacy-friendly browser solution?

On the other hand, Chromium is an open version of Google's Chrome browser. Chrome is quite simply a trojan horse, designed to ensure that its users' data is captive to Google Inc. Hence Chromium, much more so than other browsers, has an additional undesirable characteristic:

(c) designed-in data flows to the browser's supplier, whose business model depends on capturing and exploiting users' data.

It is therefore essential to configure Chromium very carefully, and probably to reconfigure it from time to time, in order to address the risks arising from dependence upon an open-source fork made available by a self-interested and voracious supplier.
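By way of illustration, that kind of careful configuration can be scripted rather than performed by hand. The following minimal sketch launches Chromium with a handful of privacy-relevant command-line switches; the binary path, profile directory and local Tor proxy are assumptions, and switch availability varies between Chromium versions, so each should be verified against the installed build:

```python
# Illustrative only: launching Chromium with privacy-relevant switches.
import subprocess

CHROMIUM = "/usr/bin/chromium"             # assumed install path
PROFILE_DIR = "/home/user/.pet-profile"    # assumed isolated profile location

flags = [
    "--user-data-dir=%s" % PROFILE_DIR,    # keep settings apart from any Google-linked profile
    "--no-first-run",                      # suppress first-run setup and promotion
    "--disable-sync",                      # no synchronisation of data to Google accounts
    "--disable-background-networking",     # curtail background connections to the supplier
    "--proxy-server=socks5://127.0.0.1:9050",  # route traffic via a local Tor SOCKS proxy
]

subprocess.call([CHROMIUM] + flags)
```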

Normal human beings aren't aware of these problems, wouldn't understand them even if they were aware of them, and wouldn't understand or be capable of implementing such safeguards as are available.

[ https://www.whitehatsec.com/aviator/help/help.html has Help, but there seems to be no list of the actual plug-ins and associated settings ...]

[ I can't see what OS versions are supported.]

[Experiments are being / have been conducted with? A Gedankenexperiment has been outlined? It would be a good idea to do?] a further version, effectively Chromium++, which is configured so as to address [which type (c) problems?], but unfortunately not [which other type (c) problems?].

5.2 Convenient End-to-End Encryption

Another element within the set of safeguards needed by persons-at-risk is the ability to transmit content in such a way that access to it is only feasible by those intended to have access to it. Various forms give rise to somewhat different risks, in particular:

End-to-end encryption can be effected by a user, but currently it requires a moderate amount of understanding and expertise. One approach is to have a readily downloadable and installable pre-configured product that could be easily invoked by the user, with all cryptographic processes, including key management, administered transparently, securely and reliably by the software. This has proven to be an elusive goal.

Another approach would be for this capability to be provided as a service rather than as an application. For contemporary desktops and portables, this could be delivered within a web-browser context. A server-operator could deliver software, in particular in the form of Javascript, but with the cryptographic processes including key management performed on the client, and in such a way that the server-operator was demonstrably unable to access the content other than by a brute-force search of the key-space.
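The essential property - that the server-operator relays only ciphertext - can be demonstrated in a few lines. The sketch below uses the Python PyNaCl library purely as a stand-in for the client-side Javascript cryptography that such a service would actually deliver:

```python
# Illustrative only: end-to-end encryption in which an intermediary
# never sees plaintext. Uses PyNaCl (libsodium bindings).
from nacl.public import PrivateKey, Box

# Key generation occurs on each client; private keys never leave the device.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts using their private key and the recipient's public key.
sealed = Box(sender_key, recipient_key.public_key).encrypt(b"message content")

# The server-operator relays 'sealed' but holds no key capable of opening it,
# other than by brute-force search of the key-space.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(sealed)
assert plaintext == b"message content"
```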

Such a tool is investigated in Godsiff (2014), and an implementation has recently been reported by an MIT team in the form of Mylar (Popa et al. 2014).


6. Conclusions

The adoption of end-user PETs has been low. Attention to usability factors is a necessary condition, but is not sufficient to achieve more widespread use. Table 5 presents principles for effective PET design that are apparent from the analysis conducted in this paper:

Table 5: Principles for Effective PET Design

Many aspects of this paper are tentative, and require deeper study in order to test and articulate the ideas proposed. Case studies need to be undertaken, and those previously done need to be revisited with the above set of principles in mind. Existing tools need to be assessed, to establish the extent to which they are capable of being integrated into users' products or can be adapted to achieve that aim, or require redesign.

The many user segments identified in s.4.3 have an urgent need for PETs to reach a level of maturity and a degree of mainstreaming that has to date eluded them. Application of the proposals in this paper should contribute to much-improved adoption levels of end-user PETs.


References

Agre P. (1995) 'Looking Down the Road: Transport Informatics and the New Landscape of Privacy Issues' CPSR News 13, 3 (Fall 1995), at http://cpsr.org/prevsite/publications/newsletters/issues/1995/Fall1995/agre.html

Benbasat I. & Barki H. (2007) 'Quo vadis, TAM?' Journal of the Association of Information Systems 8, 4 (April 2007) 211-218

BestVPN (2014) 'The Ultimate Privacy Guide' BestVPN, 2014, at https://www.bestvpn.com/the-ultimate-privacy-guide/

van Blarkom G.W., Borking J.J. & Olk J.G.E. (eds.) (2002) 'Handbook for Privacy and Privacy-Enhancing Technologies' PISA project, 2002, at http://www.andrewpatrick.ca/pisa/handbook/handbook.html

Borking J.J. (2003) 'The Status of Privacy Enhancing Technologies' in 'Certification and Security in E-Services: From E-Government to E-Business' Kluwer, 2003

Borking J. (2011) 'Why Adopting Privacy Enhancing Technologies (PETs) Takes so Much Time' in Gutwirth S., Poullet Y., De Hert P. & Leenes R. (eds.) 'Computers, Privacy and Data Protection: An Element of Choice' Springer Netherlands, 2011, pp. 309-341

Burkert H. (1997) 'Privacy-Enhancing Technologies: Typology, Critique, Vision' in Agre P.E. & Rotenberg M. (Eds.) (1997) 'Technology and Privacy: The New Landscape' MIT Press, 1997

Camp L.J. (2013) 'Beyond usability: Security Interactions as Risk Perceptions' Proc. Workshop on Risk Perception in IT Security and Privacy, July 24-26, 2013, Newcastle, UK, at http://cups.cs.cmu.edu/soups/2013/risk/RiskWksp_Translucent.pdf

Cas J. & Hafskjold C. (2006) 'Access in ICT and Privacy in Europe, Experiences from technology assessment of ICT and Privacy in seven different European countries' EPTA, Geneva, 2006, p. 41

Chuttur M.Y. (2009) 'Overview of the Technology Acceptance Model: Origins, Developments and Future Directions' Sprouts: Working Papers on Information Systems 9, 37 (2009), at http://sprouts.aisnet.org/9-37

Clark J., van Oorschot P.C. & Adams C. (2007) 'Usability of Anonymous Web Browsing: An Examination of Tor Interfaces and Deployability' Proc. Symposium On Usable Privacy and Security, July 18-20, 2007 Pittsburgh, at http://cups.cs.cmu.edu/soups/2007/proceedings/p41_clark.pdf

Clarke R. (1990) 'Open Applications Architecture: A User-Oriented Reference Model for Standardization of the Application Platform' Computer Standards & Interfaces 11 (1990) 15-27, PrePrint at http://www.rogerclarke.com/SOS/OAA-1990.html

Clarke R. (1991) 'A Primer in Diffusion of Innovations Theory' Xamax Consultancy Pty Ltd, May 1991, at http://www.rogerclarke.com/SOS/InnDiff.html

Clarke R. (1999) 'The Legal Context of Privacy-Enhancing and Privacy-Sympathetic Technologies' Xamax Consultancy Pty Ltd, Presentation at AT&T Research Labs, Florham Park NJ, 5 April 1999, at http://www.anu.edu.au/people/Roger.Clarke/DV/Florham.html

Clarke R. (2001a) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001), PrePrint at http://www.rogerclarke.com/DV/PITsPETs.html

Clarke R. (2001b) 'Research Challenges in Emergent e-Health Technologies' Xamax Consultancy Pty Ltd, July 2001, at http://www.rogerclarke.com/EC/eHlthRes.html#PAR

Clarke R. (2008a) 'Business Cases for Privacy-Enhancing Technologies' Chapter 7 in Subramanian R. (ed.) 'Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions' IDEA Group, 2008, pp. 135-155, PrePrint at http://www.rogerclarke.com/EC/PETsBusCase.html

Clarke R. (2008b) 'Dissidentity' Identity in the Information Society 1, 1 (December, 2008) 221-228, at http://www.rogerclarke.com/DV/Dissidentity.html

Clarke R. (2013) 'Why Isn't Security Easier for SMEs and Consumers?' Xamax Consultancy Pty Ltd, November 2013, at http://www.rogerclarke.com/EC/SSACS-13.html

CPSR (1991) 'CPSR Co-sponsors Meeting on Encryption, Privacy and Communications' CPSR News 9, 2 (Winter-Spring 1991), archived at http://web.archive.org/web/20040705115955/http://www.cpsr.org/publications/newsletters/issues/1991/WinSpr1991/crypt.html

Davis F.D. (1989) 'Perceived usefulness, perceived ease of use, and user acceptance of information technology' MIS Qtly 13 (1989) 319-339

EPIC (2014) 'EPIC Online Guide to Practical Privacy Tools', Electronic Privacy Information Center, Washington DC, various versions since 1997, at http://www.epic.org/privacy/tools.html

Fahl S., Harbach M., Muders T., Sander U. & Smith M. (2012) 'Helping Johnny 2.0 to Encrypt His Facebook Conversations' Proc. Symposium On Usable Privacy and Security, July 11-13, 2012, Washington, DC, at http://cups.cs.cmu.edu/soups/2012/proceedings/a11_Fahl.pdf

Garfinkel S. & Miller R.C. (2005) 'Johnny 2: A User Test of Key Continuity Management with S/MIME and Outlook Express' Symposium On Usable Privacy and Security, July 6-8, 2005 Pittsburgh, at http://cups.cs.cmu.edu/soups/2005/2005proceedings/p13-garfinkel.pdf

GFW (2011) 'Who is harmed by a "Real Names" policy?' Geek Feminism Wiki, undated, apparently of 2011, at http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy%3F

Godsiff J. (2014) 'End-to-End Encryption Functionality Delivered by a Server-Operator: A Javascript Implementation' Computer Science Honours Thesis, Forthcoming, Australian National University, July 2014

Goldberg I., Wagner D. & Brewer E. (1997) 'Privacy-enhancing Technologies for the Internet' Proc. 42nd IEEE Spring COMPCON, February 1997, at http://www.dtic.mil/dtic/tr/fulltext/u2/a391508.pdf

Goldberg I. (2002) 'Privacy-enhancing technologies for the Internet, II: Five years later', Proc. Workshop on Privacy Enhancing Technologies 2002, Lecture Notes in Computer Science 2482, Springer-Verlag, 2002, pp. 1-12, at http://web.cs.dal.ca/~abrodsky/7301/readings/Go02.pdf

Goldberg I. (2007) 'Privacy Enhancing Technologies for the Internet III: Ten Years Later' Chapter 1 of Acquisti A. et al. (eds.) 'Digital Privacy: Theory, Technologies, and Practices' Auerbach, 2007, at https://cs.uwaterloo.ca/~iang/pubs/pet3.pdf

Harbach M., Fahl S., Rieger M., Smith M. (2013) 'On the Acceptance of Privacy-Preserving Authentication Technology: The Curious Case of National Identity Cards' in Cristofaro E. & Wright M. (eds.) 'Privacy Enhancing Technologies' Lecture Notes in Computer Science. Springer Berlin Heidelberg, pp. 245-264

Herzog A. & Shahmehri N. (2007) 'Usable set-up of runtime security policies' Proc. Int'l Symposium on Human Aspects of Information Security and Assurance, July 2007

IPCR (1995) 'Privacy-Enhancing Technologies: The Path to Anonymity' Information and Privacy Commissioner (Ontario, Canada) and Registratiekamer (The Netherlands), 2 vols., August 1995, Vol. II at http://www.ipc.on.ca/images/Resources/anoni-v2.pdf

Lee M. (2013) 'Encryption Works: How to Protect Your Privacy in the Age of NSA Surveillance' Freedom of the Press Foundation, 2 July 2013, at https://pressfreedomfoundation.org/encryption-works

Liang H. & Xue Y. (2009) 'Avoidance of Information Technology Threats: A Theoretical Perspective' MIS Quarterly 33, 1 (March 2009) 71-90

Nielsen J. (1993) 'Usability engineering' Morgan Kaufmann, 1993

Norcie G., Caine K. & Camp J. (2012) 'Eliminating Stop-Points in the Installation and Use of Anonymity Systems: A Usability Evaluation of the Tor Browser Bundle' Proc. 12th Privacy Enhancing Technologies Symposium, Vigo, Spain, 10-13 July 2012, at http://petsymposium.org/2012/papers/hotpets12-1-usability.pdf

Norman D. (2000) 'The Design of Everyday Things' MIT Press, 2000

Patrick A.S., Kenny S., Holmes C. & van Breukelen M. (2002) 'Human Computer Interaction' Chapter 12 in van Blarkom et al. (2002), at http://www.andrewpatrick.ca/pisa/handbook/handbook.html

Popa R.A., Stark E., Helfer J., Valdez S., Zeldovich N., Kaashoek M.F. & Balakrishnan H. (2014) 'Building web applications on top of encrypted data using Mylar' Proc. 11th USENIX Symposium on Networked Systems Design and Implementation (NSDI'14), at http://css.csail.mit.edu/mylar/mylar.pdf

PRIME (2008) 'HCI Guidance for Privacy' Privacy and Identity Management for Europe, 1 February 2008, at https://www.prime-project.eu/prime_products/reports/arch/pub_del_D06.1.f_ec_wp06.1_v1_final.pdf

PrimeLife (2011) 'Towards Usable Privacy Enhancing Technologies: Lessons Learned from the PrimeLife Project' Privacy and Identity Management in Europe for Life, June 2011, at http://primelife.ercim.eu/images/stories/deliverables/d4.1.6-towards_usable_pets-public.pdf

PRISM-Break (2014) 'Opt out of global data surveillance programs', PRISM-Break, 2014, at http://prism-break.org/en/protocols/

Rogers E.M. (1962) 'Diffusion of Innovations' The Free Press, New York, originally published in 1962, 3rd Edition 1983

Spiekermann S. & Cranor L.F. (2009) 'Engineering Privacy' IEEE Transactions on Software Engineering 35, 1 (January/February 2009), at http://ssrn.com/abstract=1085333

Venkatesh V. (2000) 'Determinants of perceived ease of use: integrating control, intrinsic motivation, and emotion into the technology acceptance model' Information Systems Research 11, 4 (2000) 342-365

Venkatesh V. & Davis F.D. (1996) 'A model of the antecedents of perceived ease of use: Development and test' Decision Sci. 27 (1996) 451-481

Venkatesh V. & Davis F.D. (2000) 'A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies' Management Science 46, 2 (February 2000) 186-204

Whitten A. & Tygar J. (1999) 'Why johnny can't encrypt: A usability evaluation of pgp 5.0' Proc. 8th USENIX Security Symposium, vol. 99, 1999, at http://gaudior.net/alma/johnny.pdf

Wikileaks (2011) 'Submissions' Wikileaks, undated, apparently of September 2011, at http://wikileaks.org/Submissions.html




Resources

Privacy Enhancing Technologies (PETs) Symposium
http://petsymposium.org/

Symposium On Usable Privacy and Security (SOUPS)
http://cups.cs.cmu.edu/soups/
Past SOUPS (2005-13) are linked to from that page

Privacy in Software Agents (PISA, 2000-02)
http://www.cbpweb.nl/downloads_artikelen/art_jbo_2001_pisa.pdf
http://link.springer.com/chapter/10.1007/3-540-44702-4_8

Privacy and Identity Management for Europe (PRIME, 2006-08)
https://www.prime-project.eu/

PrimeLife (2009-11)
http://primelife.ercim.eu/


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.


