
Roger Clarke's 'Privacy Regulatory Mechanisms'

A Comprehensive Framework for Regulatory Regimes
as a Basis for Effective Privacy Protection

Version of 11 January 2021

Proc. 14th Computers, Privacy and Data Protection Conference (CPDP'21)
Brussels, 27-29 January 2021

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2020

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.



Effective regulatory regimes comprise custom-designed complexes of multiple mechanisms. This paper presents a generic framework within which possible measures can be identified and evaluated, and combinations can be devised that can serve the public need. The framework distinguishes 'systemic governance' (including natural mechanisms and infrastructural regulation), 'self-governance' (at organisational and industry sector levels) and 'government' (both formal regulation and variants such as meta- and co-regulation).

Examples from the privacy arena are provided, primarily from the author's local jurisdiction. Suggestions are made of alternative ways to apply the framework to design effective privacy protection schemes. Co-regulation is suggested as the most appropriate form for dealing with emergent challenges, particularly those deriving from advanced information technologies.


1. Introduction

The CPDP 2021 theme is 'Enforcing Rights in a Changing World'. There are many mechanisms that can enable the exercise of control over behaviours that unreasonably harm privacy interests. Many contributions to the Theme may focus on one or another of those mechanisms.

This paper adopts a different approach. It argues that effective protections for privacy depend firstly on appreciation of the entire field of regulation, and secondly on the fashioning of blends of multiple mechanisms in order to achieve the desired outcomes.

The paper begins by standing back from the tumult, and presenting a broad view of the field. It identifies specific examples of measures that are too easily overlooked. It culminates in some suggestions about appropriate combinations of measures to deliver privacy protection.

The primary objective of the research reported here is the development and exposition of a framework to assist in understanding regulatory regimes, and in identifying opportunities for the conception, design and refinement of arrangements to fulfil the needs of the public for privacy protection. By framework is meant a structure for the themes and issues in past and future research within a particular domain. Common features of frameworks are descriptions of fundamental concepts and processes, and arrangements of key elements, often within a two- or three-dimensional matrix of ideas.

Regulatory contexts vary considerably, across such dimensions as ethnic and lingual cultures, systems of law, jurisdictions and industry sectors. On the other hand, there are also commonalities among countries' approaches to regulation, arising in part from 16th-19th century colonialism and in part from 20th-21st century globalisation. Examples include British common law and Napoleonic code law systems, international treaties, multilateral and bilateral trade agreements, international standards, and supranational instruments such as EU Directives and Regulations. To the extent practicable, the framework presented in this paper is intended to be agnostic to the differences, and hence broadly applicable to most circumstances.

The framework is presented in the following section, and comprises four segments. Together, they provide a structured view of spaces that are to be subjected to regulation, which can then be applied to the specific needs of privacy protection.

2. A Framework for Regulatory Analysis

A framework needs to be sufficiently comprehensive to embrace all aspects of the structures and processes that exist within a broad field of view. A framework is inherently multi-dimensional, encompassing some components at a meta-level, some that are relatively static in nature, and others that are more dynamic.

The first sub-section articulates the concept, nature and purposes of a regulatory regime, and defines the criteria whereby the appropriateness of any particular instantiation can be evaluated. The second presents the layers within which regulatory measures are usefully organised. The third identifies the various actors within the regulatory space; and the fourth examines the dynamics within the space, as each of those actors seeks to satisfy its own interests.

2.1 Regulation of Economic and Social Systems

The concept of regulation is frequently thought of as a matter of law and policy. Its foundations go far deeper, however. General systems theory grew out of observations of biology, where natural processes are subject to other natural processes whose effect is to limit, control or regulate them, giving rise to homeostasis - the tendency of natural systems to maintain the status quo (von Bertalanffy 1940, 1968).

During the industrial revolution, a significant breakthrough occurred when Watt invented the steam or 'fly-ball' governor, such that a man-made process was subjected to automated control by another man-made process, rather than by a natural process. Together, these threads gave rise to the insights of cybernetics, whereby sensors provide feedback that enables a controller to monitor a process, effectors enable the controller to influence the process, and successive levels of nested controllers enable complex systems to be managed (Wiener 1948).
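The cybernetic loop of sensor, controller and effector can be sketched in a few lines of code. The following is an illustrative toy only, not anything from the paper: the names, the target speed of 100, the gain, and the linear 'process' are all assumptions chosen to make the feedback idea concrete.

```python
# Hypothetical sketch of the feedback loop described above: a sensor
# feeds a controller, which drives an effector to hold a process near a
# target. All names and numbers are illustrative assumptions.

def run_governor(read_speed, set_valve, target=100.0, gain=0.05, steps=50):
    """Minimal controller in the spirit of Watt's governor: repeatedly
    compare the sensed speed against the target and nudge the valve."""
    valve = 0.2
    for _ in range(steps):
        error = target - read_speed()                 # feedback (sensor)
        valve = min(1.0, max(0.0, valve + gain * error / target))
        set_valve(valve)                              # influence (effector)
    return valve

# Toy process: speed responds directly to the current valve opening.
state = {"valve": 0.2}

def read_speed():
    return 200.0 * state["valve"]

def set_valve(v):
    state["valve"] = v

run_governor(read_speed, set_valve)
# After 50 corrective steps the sensed speed sits close to the target.
```

Nesting such controllers, with one loop supplying the target for another, is the route from this toy to the management of complex systems that Wiener described.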

The present analysis is not concerned with biological phenomena or manufacturing processes, but with economic and social systems. In these contexts, the motivation for active regulatory measures arises when some class of entities behaves in a manner that has materially negative impacts on other entities. Figure 1 provides a preliminary graphical representation of key entities involved in the regulatory arena, and key relationships among them. The left-hand side of Figure 1 depicts an unregulated state, in which one entity has a negative effect on the interests of a second entity. In the regulated state depicted on the right-hand side, the second entity has been converted into a 'beneficiary' of the existence, power and actions of a third entity, commonly called a 'regulator', which influences the behaviour of the first entity, referred to as the 'regulatee'.

Figure 1: Abstract Model of the Entities Involved in Regulatory Schemes

Regulators include tightly-controlled government agencies and relatively independent commissions. Regulatees include corporations, unincorporated business enterprises, government agencies, cooperatives, incorporated and unincorporated associations, and individuals. Beneficiaries include not only all of those categories but also social values such as trust in social and economic institutions, and environmental values.

A simple, useful, but incomplete definition of regulation in economic and social contexts is "instruments used ... to influence or control the way people and businesses behave in order to achieve economic, social or environmental policy objectives" (ANAO 2007). A valuable aspect of that interpretation is its generality, in that ways to influence behaviour are expressed more broadly than merely formal legal mechanisms. A weakness of this approach, however, is that it is restricted firstly to explicit human actions ("instruments"), and secondly to only those instruments that are applied with the intention to achieve influence ("used to").

Each of the participants in a regulatory regime naturally has its own objectives. For example, an organisation subject to regulatory requirements may adopt a 'responsible citizen' or 'corporate social {and environmental} responsibility' (CSR/CSER) attitude, with an objective of efficiently achieving compliance with regulatory requirements (Sethi 1975, Wood 1991, Hedman & Henningsson 2016).

Alternatively, a 'cowboy' in the same sector may have the objective of avoiding, subverting and ignoring regulatory requirements in order to minimise their negative impacts on the organisation's interests. Similarly, a regulatory agency may adopt the stance of a 'watchdog', interpreting its legal authority as widely as possible, and seeking to withstand the depredations wrought by lobbying against its activities; or it may stolidly administer the scheme's enabling legislation; or it may perceive itself to have a minimalist, window-dressing role on behalf of government, and may even facilitate industry behaviour irrespective of the harm that it may cause (Drahos & Krygier 2017).

The assumption adopted in this analysis is that the purpose of a regulatory regime is to exercise control over harmful behaviours. That objective is, however, subject to constraints. The most salient of these are that the measures imposed be effective in achieving their aims, and that they be efficient, i.e. that they impose no higher costs on organisations than are justified by the harm being avoided. Many additional factors are involved, however. Criteria for the evaluation of regulatory regimes are discussed in Gunningham et al. (1998). See also Hepburn (2006) and ANAO (2007). An articulated set of attributes of regulatory arrangements that draws on the above sources is presented in Clarke & Bennett Moses (2014) and summarised in Table 1. This facilitates the evaluation of regulatory arrangements, the adaptation of existing schemes, and the development of new schemes.

Table 1: Criteria for the Evaluation of a Regulatory Regime

Adaptation of Clarke & Bennett Moses (2014, Table 2)





This section has articulated the nature and purposes of a regulatory regime, and thereby laid the foundations for examination of the elements and processes that they entail. The following section presents a model of the layers within which regulatory processes are conventionally organised.

2.2 Regulatory Layers

Theoretical works on regulation refer to an 'enforcement pyramid', with persuasive measures at the bottom, escalating upwards to aggressive sanctions such as licence revocation (Ayres & Braithwaite 1992). "There is a heavy presumption in favour of starting at the base of the pyramid because dialogue is a low-cost, respectful and time-efficient strategy for obtaining compliance. The responses of the regulatee to interventions drawn from the base of the pyramid are the ones that determine if, how far and when the regulator escalates up the pyramid" (Drahos & Krygier 2017, p.5).

For the present purpose, rather than individual instruments or measures, it is more useful to focus on categories of mechanisms. This paper accordingly proposes the model in Figure 2. This distinguishes layers, based on the degree of formalism and compulsion. Each layer is outlined below, and keyed across from the diagram to the text using numerals (1) through (7).

Figure 2: A Hierarchy of Regulatory Mechanisms

The foundational layer, (1) Natural Regulation, is a correlate of the natural control processes that occur in biological systems. It comprises natural influences, by which is meant processes that are intrinsic to the relevant socio-economic system (Clarke 1995, 2014c). Examples of natural regulation include the exercise of countervailing power by those affected by an initiative, activities by competitors, reputational effects, and cost/benefit trade-offs. The postulates that an individual who "intends only his own gain" is led by "an invisible hand" to promote the public interest (Smith 1776), and that economic systems are therefore inherently self-regulating, have subsequently been bolstered by transaction cost economics (Williamson 1979). Limits to inherent self-regulation have also been noted, however, such as 'the tragedy of the (unmanaged) commons' notion (Hardin 1968, 1994, Ostrom 1999). Similarly, whereas neo-conservative economists commonly recognise 'market failure' as the sole justification for interventions, Stiglitz (2008) adds 'market irrationality' (which justifies the use of circuit-breakers to stop bandwagon effects in stock markets) and 'distributive justice' (in such forms as safety nets and anti-discrimination measures).

An appreciation of pre-existing natural controls is a vital precursor to any analysis of regulation, because the starting-point always has to be 'what is there about the natural order of things that is inadequate, and how will intervention improve the situation?'. For example, the first of 6 principles proposed by the Australian Productivity Commission was "Governments should not act to address 'problems' through regulation unless a case for action has been clearly established. This should include evaluating and explaining why existing measures are not sufficient to deal with the issue" (PC 2006, p.v). That threshold test is important, in order to ensure a sufficient understanding of the natural controls that exist in the particular context. In addition, regulatory measures can be designed to reinforce natural controls. For example, approaches that are applicable in a wide variety of contexts include adjusting the cost/benefit/risk balance perceived by the players, by subsidising costs, levying revenues and/or assigning risk.

In the privacy context, the sorry history of applications of economic analysis can be traced through Westin (1971), Westin & Baker (1974), Posner (1977), Laudon (1993) and Varian (1996). From this line of thinking, the 'privacy calculus' proposition emerged (Culnan & Armstrong 1999). This postulates that "Customers will continue to participate in this social contract [i.e. exchanging personal information for intangible benefits] as long as the perceived benefits exceed the risks" (p. 106). Dinev and Hart (2006), applying the notion in the Internet Commerce context, expressly adopted the naive economic assumption-set: "Internet users' behavioral intentions should be consistent with expectancy theory, which holds that individuals will behave in ways that maximize positive outcomes and minimize negative outcomes" (p.62). This is consistent with the position that "Implicit in most of the neoclassical economics literature on privacy is the assumption that consumers are rationally informed agents with stable privacy preferences" (Acquisti et al. 2013, p. 253).
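The 'privacy calculus' proposition reduces, in effect, to a single inequality. A deliberately naive sketch makes the assumption-set explicit; every name and weight below is an illustrative assumption, not an empirical value from any of the cited studies:

```python
# A deliberately naive rendering of the 'privacy calculus' (Culnan &
# Armstrong 1999): disclose personal data whenever perceived benefits
# exceed perceived risks. All names and weights are illustrative
# assumptions.

def will_disclose(perceived_benefits, perceived_risks):
    """Expectancy-theory style rule: participate in the 'social
    contract' iff expected positives outweigh expected negatives."""
    return sum(perceived_benefits.values()) > sum(perceived_risks.values())

benefits = {"personalisation": 3.0, "convenience": 2.0}   # sums to 5.0
risks = {"profiling": 2.5, "data_breach": 1.5}            # sums to 4.0

print(will_disclose(benefits, risks))  # True: 5.0 > 4.0
```

The rule's tidiness is precisely its weakness: observed disclosure behaviour departs from any such inequality, which is what gave rise to the 'privacy paradox' literature.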

Unsurprisingly, this unworldly view led to a wide variety of conflicts between the predictions from naive theoretical models and observed human behaviour, and to many different flavours of the so-called 'privacy paradox' (e.g. Jorstad 2001, Awad & Krishnan 2006, Barnes 2006).

Naive economic approaches deliver no value to policy-makers who are actually seeking solutions to privacy problems. Any model that seeks to support analysis of privacy behaviours needs to reflect a wide variety of complexities: the multiple dimensions of privacy (Clarke 1997, Finn et al. 2013, Koops et al. 2016, Clarke 2017); the diversity of individuals' values and contexts; the dominance of hedonism over rationalism in personal decision-making; the ease with which hedonistic impulses can be excited; and the fuzzy notion of impressionistic satisficing within narrow frames, rather than unrealistic assumptions about rational optimisation of a multi-variable objective function taking into account many sources of up-to-date information. A useful model must also take into account asymmetries among participants. One factor is information asymmetry; another is the power asymmetry arising from the market power of corporations and the institutional power of government agencies, which greatly exceeds the capacity to influence achieved by ill-organised consumers and citizens, and by poorly-resourced civil society.

As a result of such forces, Natural Regulation has proven to be almost entirely ineffective in the privacy arena. Markets demonstrably fail, there has been bandwagon adoption of the consumer-manipulative business models of the digital surveillance economy (Clarke 2019), and distributive injustice is evidenced in such forms as algorithmic bias (Bennett Moses & Chan 2018).

Natural Regulation cannot be simply ignored in the privacy context, because its inadequacy must be demonstrated in order to justify the adoption of active measures. Moreover, opportunities may exist to stimulate greater effectiveness of some natural processes. On the other hand, the substantial failure of natural controls means that considerable emphasis needs to be placed on elements within the other layers of the regulatory hierarchy.

All of the other layers in Figure 2 represent interventions into natural processes, and comprise 'instruments' and 'measures', generally designed with an intention to achieve some end. That end is desirably to curb harmful behaviours and excesses, but in some cases the purpose is to give the appearance of doing so, in order to hold off stronger or more effective interventions. Such 'counter-regulatory' phenomena are discussed in a later section.

The second-lowest layer, referred to in this paper as (2) Infrastructural Regulation, is exemplified by artefacts like the mechanical steam governor. It comprises particular features of the infrastructure that embed features that serve particular interests, reinforcing positive aspects and inhibiting negative aspects of the relevant socio-economic system. Those features may be byproducts of the artefact's design, or they may be retro-fitted onto it, or architected into it. (The first steam-engines did not embody adequate controls over excessive steam-pressure. The first steam-governor was a retro-fitted feature. In subsequent iterations, however, controls became intrinsic to the design of steam-engines). Beyond mechanical controls, information technology provides many opportunities, and the two have merged in the form of robotics. For example, dam sluice-gate settings can be automatically adjusted in response to measures of catchment-area precipitation events or changes in feeder-stream water-flows. One popular expression for infrastructural regulation in the context of IT is '[US] West Coast Code' (Lessig 1999, Hosein et al. 2003).
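The dam sluice-gate example above can be rendered as a control rule embedded in the artefact itself, which is the essence of Infrastructural Regulation. The following sketch is purely illustrative; the thresholds, units and the linear rainfall-to-inflow assumption are inventions for the purpose of the example:

```python
# Illustrative sketch of infrastructural regulation: a behavioural
# constraint built into the artefact rather than imposed by law.
# Thresholds and names are assumptions for illustration only.

def sluice_gate_opening(rainfall_mm_per_hr, inflow_m3_per_s,
                        max_safe_inflow=500.0):
    """Return a gate opening in [0.0, 1.0], widening automatically as
    catchment rainfall or feeder-stream inflow rises."""
    # Anticipate additional inflow from current rainfall
    # (a crude linear assumption for the sake of the sketch).
    anticipated = inflow_m3_per_s + 2.0 * rainfall_mm_per_hr
    opening = anticipated / max_safe_inflow
    return min(1.0, max(0.0, opening))

# Dry conditions: gate largely closed; storm conditions: gate fully open.
print(sluice_gate_opening(0.0, 50.0))     # gate mostly closed
print(sluice_gate_opening(100.0, 400.0))  # gate fully open
```

The regulatory effect here requires no statute and no regulator: the constraint operates, or fails to operate, according to how the infrastructure was designed.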

In the privacy arena, mainstream examples of Infrastructural Regulation include validation processes for captured data prior to its storage and use, the application of cryptography to protect data in transit and data at rest, user authentication and access control processes, and permissions management.

At the uppermost layer of the regulatory hierarchy, (7) Formal Regulation exercises the power of a parliament through statutes and delegated legislation such as Regulations. In common law countries at least, statutes are supplemented by case law that provides clarification of the application of the legislation in particular contexts. Formal regulation demands compliance with requirements that are expressed in more or less specific terms, and is complemented by sanctions and enforcement powers. Lessig underlined the distinction between infrastructural and legal measures by referring to formal regulation as '[US] East Coast Code'.

In the privacy field, an index of the vast array of national laws is provided by Greenleaf (2019). For detailed analyses of Asian laws, see Greenleaf (2014), and for African laws see Greenleaf & Cottier (2018). Successful supra-national instruments exist (CoE 1981, GDPR 2016). On the other hand, the OECD document, particularly in recent years (OECD 2013), and APEC throughout its life (APEC 2005), have been endeavours to ratchet down privacy protections, and advantage business and government over consumers and citizens. A somewhat broader view of enforcement mechanisms is in Wright & De Hert (2016).

Regulation of the formal kind imposes considerable constraints and costs. Several intermediate forms exist. These reduce the imposts, but at considerable risk of also reducing the effectiveness of the regulation. The lowest layer of these intermediate forms is (3) Organisational Self-Regulation. Examples include internal codes of conduct and 'customer charters', and self-restraint associated with expressions such as 'business ethics' and 'corporate social responsibility' (Parker 2002).

In the privacy arena, the dominant such mechanism is privacy policy statements (FTC 1997, Clarke 2006). These are necessary in jurisdictions that lack effective formal regulatory arrangements, in particular in the USA, because they create a basis for possible enforcement through consumer or other laws. They have become widespread in many other jurisdictions, even though they are in many cases largely redundant.

The mid-point of the hierarchy is (4) Industry Sector Self-Regulation. In many sectors, schemes exist that express technical or process standards. There are also many codes of conduct, or of practice, or of ethics, and some industries feature agreements or Memoranda of Understanding (MoUs) that are claimed to have, and may even have, some regulatory effect.

However, by their nature, and under the influence of trade practices / anti-monopoly / anti-cartel laws, such instruments seldom apply to all relevant players, and in any case their provisions are rarely binding. To the extent that they have any direct impact, it is only on those organisations that choose to adopt them. This seldom includes the 'cowboys' in the industry, which tend to be responsible for a disproportionate amount of the harm that the industry causes (Sethi & Emelianova 2006). Another mechanism used in some fields is accreditation ('tick-of-approval' or 'good housekeeping') schemes. These are best understood by describing them as meta-brands. The conditions for receiving the tick, and for retaining it, are seldom materially protective of the interests of the nominal beneficiaries (Clarke 2001a, Moores & Dhillon 2003).

The effectiveness of the two self-regulatory layers, perceived from the viewpoint of the entities that are meant to be beneficiaries of regulatory arrangements, generally falls well short of the promise. Activities conducted under the 'self-governance' label may provide some limited safeguards and even the prospect of mitigation of some of the harmful impacts, but they are primarily motivated by the avoidance of harm to the regulatees rather than the assurance of protections for beneficiaries. Braithwaite (2017) notes that "self-regulation has a formidable history of industry abuse of privilege" (p.124). The conclusion of Gunningham & Sinclair (2017) is that 'voluntarism' is generally an effective regulatory element only when it exists in combination with 'command-and-control' components.

In the privacy arena, Industry Sector Self-Regulation has been of some benefit to corporations, but has largely failed consumers. An example is the chimera of 'meta-brands' such as 'privacy seals'. These have mostly come and gone in rapid succession. The few that have lasted any length of time have been notable for their almost complete failure to exercise the nominal enforcement mechanisms that they embody - although on occasions they have sued companies for breach of their own intellectual property rights. TRUSTe has been well-documented as a sham (FTC 2014, Connolly et al. 2014, Connolly et al. 2015). It re-branded in 2017 as TrustArc.

Other, intermediate forms have emerged that have been claimed to offer greater prospects of achieving the regulatory objective of protecting against inappropriate behaviour and excesses. These are clustered into layer (6) Meta- and Co-Regulation. In many areas, convincing arguments can reasonably be made by regulatees to the effect that government is poorly placed to cope with the detailed workings of complex industry sectors and/or the rate of change in industries' structures, technologies and practices. Hence, the argument proceeds, parliaments should legislate the framework, objectives and enforcement mechanisms, but delegate the articulation of the detailed requirements.

During the last four decades, several forms have emerged that are intermediate between (often heavy-handed) formal regulation and (mostly ineffective and excusatory) self-regulation. In Grabosky (2017), the notion of 'enforced self-regulation' is traced to Braithwaite (1982), and the use of the term (6a) Meta-Regulation, in its sense of 'government-regulated industry self-regulation', to Gupta & Lad (1983). See also Parker (2007).

In the privacy arena, my searches have yet to uncover a good exemplar of Meta-Regulation.

In parallel, the notion of (6b) Co-Regulation emerged (Ayres & Braithwaite 1992, Clarke 1999). Broadly, these approaches involve enactment of a legislative framework, but delegation of the details. Where the delegation is to the industry alone, it is (6a) Meta-Regulation; where it is to an at least moderately institutionalised negotiation process with all relevant parties involved, it is (6b) Co-Regulation.

For co-regulation to be more than nominal, the participants necessarily include at least the regulatory agency, the regulatees and the intended beneficiaries of the regulation, and the process must reflect the needs of all parties, rather than be dominated by institutional and/or market power. In addition, meaningful sanctions, and enforcement of them, are intrinsic elements of a scheme of this nature.

In the privacy arena, my searches have yet to uncover a good exemplar of Co-Regulation.

The remaining layer in the hierarchy, (5) Pseudo Meta- and Co-Regulation, is made necessary by the paucity of exemplars of effective implementations of the two notions. The promises of enforced self-regulation, meta-regulation and co-regulation have seldom been delivered. Commonly, the nominal beneficiaries are effectively excluded from the negotiations, and terms are not meaningfully enforced, and may even be unenforceable (Balleisen & Eisner 2009).

In the privacy arena, Australia provides two exemplars of such failures. During 1999, the Australian Attorney-General of the day formed a 'Core Consultative Group' to negotiate amendments to the Privacy Act 1988 (Cth). These were to extend its scope from the federal public sector alone to encompass the private sector. The consultation process had credibility, because the National Privacy Principles that were to form the heart of the Bill had been the result of robust negotiations that started with a set whose drafting was driven by a civil society group (APCC 1994). However, with the discussions well-advanced, the Attorney-General abruptly disestablished his Co-Regulation experiment (or perhaps it had always been a smokescreen). He instead pushed through Cabinet and the Parliament an amendment Act that had been negotiated in a parallel process involving solely public servants and industry associations.

The amended Australian Privacy Act contains scores of exemptions, many of them very broad. One of those exemptions is a nominally Meta-Regulatory mechanism. Media activities are excluded from the Act's scope by means of the provision that "a media organisation is exempt ... if ... [it] is publicly committed to observe standards that ... deal with privacy ... and ... have been published" (s.7B(4), inserted in 2000). Media organisations are free to choose or create any 'standards' that they like, provided that they purport to "deal with privacy". There are no requirements, or external benchmarks, or tests of credibility, and there is no requirement for consultation with the affected public. Eight years later, a Law Reform Commission enquiry considered the media exemption, but failed to recommend any change beyond suggesting that, to qualify for the media exemption, an organisation should be required to "deal adequately with privacy" (ALRC 2008, at paras. 42.80-89, emphasis in original). Despite being almost entirely devoid of any negative impact on media corporations, the recommendation was ignored by the government (Clarke 2012).


Real-world regulatory regimes sometimes contain elements from just one of the layers discussed in this section, but they usually embody ideas from more than one of them. The purpose of Figure 2 is to provide designers of regulatory schemes with a visual reminder of the need to remain aware of the possibilities, and to seek a combination of elements that is appropriate to the particular context. In addition, their design needs to satisfy the criteria identified in Table 1: "in the majority of circumstances, the use of multiple rather than single policy instruments, and a broader range of regulatory actors, will produce better regulation [by means of] the implementation of complementary combinations of instruments and participants ..." (Gunningham & Sinclair 2017, p.133).

In the privacy protection field, this hierarchy of regulatory layers provides a basis for appreciating the options, for evaluating existing regimes, for devising adaptations, and for conceiving new regimes. The following two sections identify the entities involved in regulatory regimes, and their behaviours and interactions.

2.3 Regulatory Players

A preliminary model was presented in Figure 1 above, identifying three categories of entity involved in regulatory schemes, referred to in this paper as regulators, regulatees and beneficiaries. This section expands that preliminary model in order to identify the much fuller set of players that may take to the field across all of the regulatory layers identified in Figure 2. In Figure 3, the three central players remain unchanged. They have been joined, however, by many other entities.

Figure 3: Players in Regulatory Schemes

Considering firstly the upper areas of Figure 3, any one regulatee may be subject to multiple regulators (e.g. relating to the corporations law, tax, occupational health and safety, and product-specific aspects such as food, chemicals or financial advice). Even in the privacy arena, an organisation is likely to be subject to more than one regulatory regime. In some jurisdictions, an organisation exists with sufficient powers and resources in relation to the handling of personal data to warrant the term 'regulatory agency', while in others the term 'oversight agency' is more appropriate. At one extreme, the USA lacks such an organisation at federal level. The Federal Trade Commission (FTC) desultorily fills some of the gap, through somewhat mild enforceable undertakings (second-strike threats, sometimes with audit strings attached). At the other extreme, each nation within the EU has a regulator at national level. The EU arrangements are complex, however, with some legislation at EU level, in particular the General Data Protection Regulation (GDPR), and national legislation required to be consistent with EU Directives. Countries that have sub-jurisdictions (states, provinces, cantons, Länder) in many cases have agencies at the lower levels as well.

Central though data protection agencies may seem to privacy protection, additional regulators are also relevant to most organisations that handle personal data. Some gain jurisdiction by virtue of the organisation's business lines (e.g. in the health care and telecommunications sectors) and others in respect of one or more categories of the organisation's customers, suppliers, employees or usees (e.g. through laws relating to consumer rights, employment and credit reporting).

Each regulator is created, empowered and resourced by a parliament, and that parliament can vary the regulator's terms of reference, and can further empower, neutralise or disestablish it. In some cases, a regulator may be accountable directly to the relevant parliament, but the more common model is for the regulator to report to a designated Minister through a higher-level agency.

Regulators need to research, consult, draw on the services of consultants, negotiate, draft and promulgate. Later they need to investigate, and may enforce, sue and prosecute. Inevitably, their reports to the higher-level agency or the parliament involve a degree of defence against attacks from aggrieved regulatees and their associations, and, in most cases less powerfully, from aggrieved beneficiaries. Other policy agencies, and other regulators, may also influence the regulator's behaviour.

In some sectors, a further role is evident, sitting astride the regulator / regulatee boundary. For example, stock exchanges play an intermediary role in relation to listed corporations, registrars, brokers and traders; and bank industry clearing associations perform similar functions in relation to participants in payments systems.

Turning attention to the lower block in Figure 3, regulatees' compliance with relevant regulatory schemes is, at least in principle, subject to audit, although the extent to which this is implemented and effective is variable, and in some schemes the auditor is so closely associated with the regulatee as to be indistinguishable from any other consultant.

In the privacy arena, the inclusion of privacy compliance within annual audit programs, and even within the scope of internal audit, remains unusual, five decades after data protection obligations began to be imposed on organisations.

Formal Industry Standards have a degree of influence on industry activities, usually very substantial in the case of technical standards, whereas process standards sometimes have rather less impact. Some regulatees contribute to the formation of such Standards. Regulatees club together in industry associations, and may use an industry 'tick-of-approval' scheme.

In the privacy arena, some well-meaning attempts have been made to produce standards, in particular in Canada for Privacy Impact Assessment (PIA). More recently, corporate interests have worked on such documents, but with the purpose of 'dumbing-down', routinising and neutralising the notion of PIAs, as explicated in Clarke (2009), and reducing the impact on corporations and government agencies of PIA processes and of the recommendations arising from PIA processes. This watering-down has been greatly assisted by the very poorly-conceived notion of Data Protection Impact Assessment (DPIA) in GDPR Art.35 (Clarke 2017b).

Ombudsman schemes handle complaints from beneficiaries, and may have some influence over aspects of regulatee behaviour. In some circumstances, beneficiaries may themselves have the capacity to sue and to achieve recompense through courts, tribunals or other schemes.

Meanwhile, consultants, technology providers and service-providers abound. Reflecting market-scale, these primarily target regulatees, to some extent regulators, and to a much more limited extent beneficiaries of regulation (e.g. comparator web-sites, rating-schemes, and in some jurisdictions contingency-fee litigation practices and financiers).

The model in Figure 3 is largely applicable to the privacy arena. In any particular jurisdiction there would of course be considerable benefit in customising it to the local context and terminology.

The scope for beneficiaries to force regulators to perform their functions, and to themselves enforce data protection and other privacy laws, varies enormously between jurisdictions. In Australia, for example, the complaints systems of data protection commissioners are dysfunctional and loaded against complainants. At federal level, the OAIC blocks almost all complainants from escalating their complaints even to tribunal-, let alone court-level, with the result that, after 32 years of nominal regulation of the federal public sector, and 20 years of nominal regulation of the private sector, there is almost no jurisprudence in the area (Greenleaf 2001, 2012).

Law reform commissions at federal level and in three States, recognising that the courts have failed to develop a privacy tort, have made well-argued and clearly articulated recommendations for the establishment of a privacy right of action. The lobbying of public servants and media companies, led by the Murdoch group, has ensured that the public continues to have no ability to seek court consideration of even the gravest of privacy injustices. This is exacerbated by the failure of Australian governments and the parliament to fulfil the nation's obligations following its accession to the ICCPR in 1980. Almost alone among democracies, Australia lacks any generic human rights guarantees, as they are not entrenched at constitutional level nor even enacted by statute (Mann 2018).

The situation in Australia stands in stark contrast to that in other countries. In the USA, the established mechanism of class actions offers some scope. In the EU, the capacity of individuals to exercise their rights has been enhanced since 2018 by means of GDPR Art. 80. This enables an incorporated not-for-profit to act on behalf of individual data subjects. Max Schrems' None Of Your Business (NOYB) and La Quadrature du Net were well-prepared to fulfil the Article's requirements. The provision has promise as a means of altering the power imbalances between regulatees and beneficiaries, and providing access to remedies.

Given the scale of organisations and their activities, many players are heavily dependent on IT, and on information systems that manage the relevant data, support decision processes, and in some cases automate decision-making and reporting. To fulfil those needs, technology providers and technology-service providers address the needs of players in the regulatory space. Given the mania for outsourcing and virtualisation of organisations, many regulatees have limited in-house expertise even in aspects very close to the organisation's mission, let alone in regulatory compliance matters. In many circumstances, suppliers devise and even perform the business processes that are subject to regulation, and in many contexts deep appreciation of the industry context and compliance obligations may be vested in them rather than in regulatees themselves.

Given their deep understanding of relevant regulatory mechanisms, service-providers may also be channels for suggesting adaptations to regulatory schemes. Their primary focus is likely to be the avoidance of undue process inefficiencies that affect the regulatee's interests. In some circumstances, however, they may serve, or also serve, the achievement of the objectives of the scheme as a whole, i.e. satisfaction of the criteria presented in Table 1 above.

2.4 Regulatory Play

The dynamics of a regulatory scheme are driven by the motivations and behaviours of the players. This section outlines the primary factors arising in relation to the three main categories - regulatees, regulators and beneficiaries.

Regulatees that are subject to formal regulation adopt various stances (Greenaway et al. 2015). The 'responsible citizen' approach involves a positive attitude to compliance, whereas some organisations treat it as low-priority administrative overhead, and the 'cowboy' segment of the industry flouts the rules. In addition to in-house staff, regulatees draw on well-developed consultancy services.

In the privacy arena, consultancies develop and sell ways in which their clients can establish and refine their privacy strategy, identify privacy issues in particular business-lines, conduct privacy impact assessment, and inexpensively comply with formal regulation, including software products featuring data management capabilities. In many circumstances, of course, the advice extends to how the client can mitigate, circumvent and even nullify the impacts of privacy laws.

A profession of 'privacy officers' has emerged, to service compliance needs, particularly within corporations. A professional association exists, IAPP, which, despite its nominally international scope, is very strongly oriented toward the US conceptions of privacy and privacy protection. These have always been, and remain, out of step with the rest of the world. Given the very awkward mapping of US concepts onto the very different structures, processes and substantive laws in Australian jurisdictions, the importation of US values, attitudes and techniques into Australia was ineffectual and counter-productive. The clash with EU values, attitudes and legal frameworks appears even more fraught.

Formal regulation imposes considerable constraints and costs (Fisher & Harindranath 2004). As a result, regulatees and their industry associations invest a great deal of time, effort and money in order to avoid, minimise and dilute formal regulation. Political influence may be used to capture the regulator, relevant government agencies, one or more Ministers, a political party and/or the parliament (Shapiro 2012). Tools commonly used at the level of industry associations and by very large corporations include lobbying of Ministers in parallel with negotiations with regulators and other government agencies, codes of conduct, meta-brands, and industry-funded complaints schemes. An examination of the dynamics underlying the failure of industry self-regulation in a particular sector is in King & Lennox (2000).

In the privacy arena, the success of such lobbying activities likely varies considerably among jurisdictions. As noted earlier, in Australia, data protection law regulating the private sector was negotiated by industry associations, in the express absence of consumer and privacy advocates. That relationship has continued for the subsequent two decades, resulting in considerably greater protection for business against the ravages of privacy law than for consumers against privacy-invasive corporate behaviour. One example is the express authorisation of direct marketing, not merely in the Act, but as 'Australian Privacy Principle' No. 7.

Reference was made earlier to Regulators playing roles depicted as activist watchdog, passive administrator or industry-friendly facilitator. Because regulators are generally constituted by statute, the scope for them to determine which of those roles they play depends to a considerable extent on the intention of the parliament. This may be determined by the Minister or the agency that drives the legislation through. A regulatory initiative may be fully committed to control over negative impacts on beneficiaries. More commonly, however, under pressure of lobbying from associations representing large volumes of commercial activity, profit, investment and jobs, the regulatory design to at least some degree compromises the definition of, or the achievement of, the nominal regulatory objectives.

Under some regulatory regimes, individual organisations are able to buy off the regulator's attention by giving 'enforceable undertakings' to stop breaching the law, or by entering into 'consent orders', perhaps coupled with the belated inclusion of reviews of compliance within their audit programs. The public wonders why breaches of the law by the politically weak are prosecuted, whereas miscreants that are large or powerful are forgiven.

In the privacy arena, the USA leads the world in prioritising business interests over those of individuals. The credibility of a regulator is destroyed when it fails to enforce 'enforceable undertakings' that have clearly been breached. The US Federal Trade Commission (FTC) is the negative exemplar par excellence (EPIC 2011). The Australian Privacy Commissioner also uses this approach, and has a similar track record to the FTC of failing to enforce even such protections as the very weak data protection statute establishes.

Even where an independent privacy regulator is established, it may not survive. In Australia, for example, Privacy Commissioners have not merely been moved within an Information Commissioner's organisation, but have also been subjugated to the Information Commissioner. This has occurred in each of the large States of NSW, Victoria and Queensland, and at federal level. For some years now, the Australian Privacy Commissioner role has not even been filled. All three positions in the Office of the Australian Information Commissioner (OAIC) - Information Commissioner, Freedom of Information Commissioner, and Privacy Commissioner - have been performed since 2015 by the same person. The appointee is styled 'Information Commissioner' and appears to commit the large majority of their limited time to that role and to FoI, and only a small minority to privacy.

A recent phenomenon is the incursion of other regulators into the privacy arena. This has been quite marked in Australia. The Australian Competition and Consumer Commission (ACCC) has responsibilities in relation to both corporate abuse of monopoly positions in markets, including in advertising, and consumer interests. Inherent in digital surveillance economy business models is the abuse of market power in relation to consumer data (Clarke 2019). Considerable international interest has been evident in the ACCC's recommendations to the Australian government in relation to Google's and Facebook's appropriation of Australian news reports (Kemp & Greenleaf 2020).

In many cases, a high-level agency is provided with substantial delegations from the parliament, e.g. in relation to the resourcing of the regulator, appointments to key positions within it, and the approval of codes, with the result that it can vary the parameters set by the parliament, possibly tightening them, but more likely, under lobbying pressure, easing the constraints on regulatees.

In Australia, for example, the Attorney-General's Department (AGD), which is now primarily concerned with national security issues and only secondarily with civil law reform, holds the power of the purse over the OAIC. The Office's priorities and pronouncements unsurprisingly pay more attention to the AGD's expectations than to the interests of the public.

Consultancies, in the strategic, legal, compliance, marketing, public relations and government relations areas, provide services to regulatees and their associations in relation to the most effective pressure-points among Ministers and agencies, and the techniques for achieving compromise of regulatory designs or processes. The practicalities of gaming regulatory systems are matched by literatures on the political economy of regulation (e.g. Libecap 2008), and on game-theoretic analyses of interactions between a regulator and the (frequently more powerful) regulatees (e.g. Madani 2010).

Beneficiaries are in most cases less well-resourced and less well-informed than the other players. Adjustments for power and information asymmetries are possible, such as class actions, legal aid, representative complaints, test cases and an adequately resourced champion, of the nature of a 'public defender's office'. In most contexts, however, these measures are absent, defective and very thinly resourced. Ombudsman arrangements may exist, although it is common for the bases of complaint to be limited, and uncommon for such complaints organisations to have significant powers to force change and achieve restitution, let alone power to impose effective sanctions for serious or repeated breaches.

In the privacy arena, experiences are highly variable across jurisdictions. Whereas conflicts in European countries can be brought before the courts for adjudication, in Australia complaint-handling is slow and unsympathetic to complainants, representative complaints are stunted, escalation to judicial processes is blocked, and no scope exists to initiate cases in the courts.

Dysfunctional complaints-handling schemes such as those just described have been referred to as an 'expectations management' mechanism, to condition complainants into accepting that the process can achieve very little and that they would be well-advised not to bother pursuing matters (Gilad 2008). Such realities of 'regulatory play' need to be understood by anyone seeking to analyse existing regimes, propose adaptations, or devise new ones.

2.5 The Framework as a Whole

The four segments presented in this section together provide a framework for identifying structural and process features of regulatory regimes, and articulating themes and issues within the relevant field. The first segment provides a gross model of the space, including the function that regulation performs, the central players, the relationships among them, the processes whereby regulation is achieved, and the criteria whereby the appropriateness or otherwise of a regulatory regime can be evaluated. The second segment presents a more detailed model of the processes, the third provides a detailed articulation of the players, and the fourth delves more deeply into the interactions among the players.

The combined understanding of regulatory space, layers, players and plays enables the performance of the 'sense-making' activities that necessarily precede the conception, design, development and deployment of new regimes, and the adaptation of existing regimes.

The following section suggests ways of drawing on and applying the concepts and insights provided by the framework, in order to develop combinations of regulatory mechanisms that are effective in protecting privacy, while also satisfying the many other criteria identified in Table 1.

3. Application to Privacy Protection

The exposition of the framework in the previous section included various examples of specific regulatory mechanisms of relevance to the privacy regulation arena. Individual mechanisms are incapable of fulfilling the need, however. This section suggests two ways in which the framework can be applied to privacy protection, providing a holistic view, and enabling an architecture to be conceived that combines multiple mechanisms in a mutually reinforcing manner.

Consideration is given first to approaches that reflect adversarial relationships, and then to an alternative, more collaborative way of working.

3.1 Constructive Tension

Organisations have long been aggressive in their acquisition, use and interchange of personal data. The majority of most populations are unaware of the extent of data use and trafficking, and are complacent and trusting. Even among those who are aware and concerned, most feel powerless. Small minorities of people, on the other hand, take umbrage, and take action. The hierarchy of regulatory mechanisms in Figure 2 can be applied to assist activists to achieve insights, and develop and articulate strategies.

A first approach is to take full advantage of such protections as exist in layers (7) Formal Regulation and (6) Meta- and Co-Regulation. All such processes are bureaucratic, and many seem to value procedural correctness more highly than addressing social ills. Effective utilisation of them accordingly requires a disciplined approach, access to considerable expertise, and a moderate degree of social organisation. Some scope also exists for inviting privacy-abusive organisations to tie themselves up in knots dealing with multiple regulators, and explaining how their practices are contrived to successfully utilise legal loopholes.

The intermediate 'self-governance' layers (3)-(5), by design, afford very little privacy protection. Some scope exists, however, for complaints that invite the complainee to disclose information, to respond inappropriately, or to act or argue inconsistently in different contexts, and to thereby embarrass themselves when the matter is escalated to a tribunal or court.

Some scope exists to build on aspects of layer (1) Natural Regulation. For example, the public can affect the economics of data capture and use, by demanding payment for the use of personal data. The caveat is that the price must be much higher than the minuscule amounts accepted by university students who are hoodwinked by academics' experimental designs. Another approach is the consistent provision of inconsistent data. This denies organisations the chance to perform straightforward matching between newly-acquired data and already-stored data, and between data acquired from different sources. This makes organisational processes much more difficult, error-prone and expensive, and hence the inferences drawn from them more easily contestable.
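The effect of 'consistent provision of inconsistent data' on organisational matching can be illustrated with a minimal sketch (the records and field values are invented for illustration):

```python
# Illustrative sketch (hypothetical records): exact-match linkage between
# two data sources fails when an individual supplies inconsistent details.

source_a = {"name": "Jane T. Citizen", "dob": "1970-01-02", "postcode": "2600"}
source_b = {"name": "Jane Citizen",    "dob": "02/01/1970", "postcode": "2600"}

def exact_match(rec1, rec2, keys=("name", "dob")):
    """Straightforward matching: every key field must be byte-identical."""
    return all(rec1[k] == rec2[k] for k in keys)

# The records describe the same person, but trivial variations in name
# format and date convention defeat the join, forcing the organisation
# into more expensive, error-prone (and hence contestable) fuzzy matching.
print(exact_match(source_a, source_b))  # False
```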

The last 40 years of data abuse by organisations has resulted in many people feeling it necessary to routinely mislead and lie. For example, a simple 'white lie' approach much used to track the flow of address data around the direct mail economy was to use slightly different mis-spellings of name and address with different suppliers. Such petty plays have been progressively formalised. In Clarke (2015, 2016), the relevant risk management strategies are identified as avoidance, obfuscation and falsification. These need to be applied to data, to messages, to identities, to locations, and to social networks. Similarly, Schneier (2015a, 2015b) advocates the following strategies to undermine surveillance: avoid, block, distort, break. Bösch (2016), acknowledging Hoepman, proposes eight privacy design strategies: minimise, hide, separate, aggregate, inform, control, enforce, demonstrate.
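The 'white lie' tactic of supplier-specific mis-spellings can be sketched as a deterministic variant generator (the variant scheme and supplier names are hypothetical):

```python
import hashlib

def seeded_variant(name, supplier):
    """Derive a small, supplier-specific variation of a name. The form of
    the name later appearing on unsolicited mail then reveals which
    supplier passed the address on."""
    digest = hashlib.sha256(f"{name}:{supplier}".encode()).digest()
    initial = chr(ord("A") + digest[0] % 26)  # deterministic per supplier
    first, _, last = name.partition(" ")
    return f"{first} {initial}. {last}"

# Record the variant handed to each supplier.
variants = {seeded_variant("Jane Citizen", s): s
            for s in ("BookClub", "Charity", "Insurer")}

# When mail arrives bearing a particular spelling, look up its origin
# (barring collisions among the small set of variant forms).
leaked_form = seeded_variant("Jane Citizen", "Charity")
print(variants.get(leaked_form))
```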

Another approach again is to apply the principles of social engineering (Mitnick & Simon 2002), in order to serve the interests of consumers and citizens. For example, skilful handling of call-centre staff can extract information that undermines the claims made by the organisation to a mediator or arbitrator.

However, the layer in which the majority of the opportunities are to be found is (2) Infrastructural Regulation. Firstly, advantage can be taken of features that are designed into operational and regulatory mechanisms, variously to protect privacy, to protect security, and to protect the interests of other players. Examples of such features include non-trivial but memorisable passwords, and ensuring that data in transit is cryptographically protected, e.g. by means of SSL/TLS (on the web through https) and PGP (for email-traffic). These are useful protections against third parties, although they have limited value in addressing the second-party risks arising from the organisations that people deal with.
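Whether a given service actually provides such transport protection can be checked directly; a minimal sketch using Python's standard library (the host name is an example only):

```python
import socket
import ssl

def tls_details(host, port=443):
    """Open a TLS connection to host and report the negotiated protocol
    version and cipher suite, confirming that data in transit to that
    server is cryptographically protected."""
    context = ssl.create_default_context()  # certificate verification on by default
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version(), tls.cipher()[0]

# Usage (requires network access), e.g.:
#   version, cipher = tls_details("www.example.com")
```

Note that this confirms only the transport protection against third parties; as observed above, it says nothing about what the second party does with the data once received.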

Activists are strongly motivated to achieve many more privacy-supportive adaptations to the infrastructure as a whole. Ideally these are designed in, but they can also be retro-fitted, and in some circumstances they can be smuggled in. Since the mid-1990s, the widespread deployment of Privacy Invasive Technologies (the PITs) has stimulated the emergence of Privacy Enhancing Technologies (PETs) (ICPR 1995, Goldberg et al. 1997, Burkert 1997, Clarke 2001a). Various categorisations of PETs are identified in Clarke (2016).

PETs have been largely driven by the supply-side, which has primarily comprised computer scientists, particularly those associated with the PET Workshops and Symposia, since 2000, and the Symposia on Usable Privacy and Security (SOUPS), since 2005. Despite the large numbers of workable implementations, adoption-rates for almost all of them have been low. The antidote for the poor take-up is emphasis on real-world use rather than laboratory demonstrations, the differentiation of user-segments, concentration on segments with both the need and the capacity to apply the tool, analysis of the needs of each user-segment, design specifically to address those needs, integration of PETs as elements within products and services rather than as standalone tools, modular architecture to enable alternative user interfaces, and the application of the accumulated body of knowledge on usability (Clarke 2014a).

So-called 'Identity Management' (IdM) is a particularly important area in which those design principles need to be much more effectively applied. IdM is assumed by government and business, and by much of the academic literature, to exist only on the supply-side, i.e. within governments and corporations, which think that they 'provision' individuals with their identities. Demand-side management of identities and of identity-associated data is crucial to improving not only privacy protections but also the very low trustworthiness levels of organisations (Koch & Woerndl 2001, Clarke 2004, pp.18-25).
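What demand-side identity management might look like can be sketched with a hypothetical data structure (not an implementation of any existing IdM product): the individual, rather than any 'provider', holds the mapping between relationships and identities.

```python
import secrets
from dataclasses import dataclass, field

@dataclass
class DemandSideWallet:
    """Hypothetical sketch: the individual holds the master record, and
    each organisation sees only a relationship-specific identity."""
    identities: dict = field(default_factory=dict)

    def identity_for(self, organisation):
        # One distinct identifier per relationship, created on first use
        # and known in full only to the individual.
        if organisation not in self.identities:
            self.identities[organisation] = f"id-{secrets.token_hex(8)}"
        return self.identities[organisation]

wallet = DemandSideWallet()
bank_id = wallet.identity_for("bank")
shop_id = wallet.identity_for("shop")
# Distinct organisations hold distinct identifiers, so their records
# cannot be trivially matched; only the individual retains the mapping.
assert bank_id != shop_id
assert wallet.identity_for("bank") == bank_id
```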

Another approach is to take advantage of systems and features developed by organisations for their own benefit. Arrangements have long existed whereby government agencies provide evidence of new identities to, for example, undercover operatives and protected witnesses. The technologies and techniques have naturally leaked into the grey and black economies. Yet, despite the prevalence of, and recent widespread discussion about, domestic violence, victims who are at risk from vengeful ex-partners in most cases do not have access to adequate protections for their data, message-traffic, contact-points, locations and social networks.

Identity-obfuscation technologies and techniques naturally find use for criminal purposes; but they are also vital to the perhaps 5% of the population of relatively free and relatively law-abiding countries who are 'persons-at-risk'. The primary focus of that concept is individuals whose health and safety are subject to direct threat by other people (Clarke 2014a). Beyond physical and psychological needs, however, it is in society's interests for many categories of people to have such support. The grounds include cultural reasons (for deviants from inevitably conservative social norms), political reasons (for dissidents against the preferred viewpoints of the dominant political class), and economic reasons (for the different-thinkers who are the source of inventions and innovations).

The examples in this section demonstrate that the hierarchy of regulatory layers provides a useful framework for the adversarial approach to privacy protection. The following section considers its usefulness in collaborative contexts.

3.2 Co-Regulation

In the privacy arena, the power imbalance between data-using organisations and the individuals to whom the data relates is so overbearing that formal legal protections are essential. That is reflected in the expansion of formal data protection laws over the last 50 years to cover a large proportion of the world's population (Greenleaf 2019). However, dependence on statutory instruments alone is not appropriate. Many of the requirements of a regulatory scheme noted in Table 1 cannot be satisfied by formal law, including Efficiency, Flexibility and Adaptability, and particularly Articulation, which requires that "The requirements and scope are expressed in terms that are clear, operationalised, specific and unambiguous".

Co-Regulation, as indicated earlier, refers to a regulatory model in which the stakeholders have significant input to a set of requirements, and even draft them, but do so within a statutory context that exercises control over the process, and makes the requirements enforceable (Hepburn 2006). A useful term to distinguish such instruments from mere industry codes is 'Statutory Codes'. Elements from layer (7) Formal Regulation are essential, to establish generic legal protections. On the other hand, layer (6b) Co-Regulation is much more appropriate for detailed provisions in specific contexts, such as industry sectors that involve interests and trade-offs that are materially different from those in every other sector, such as consumer credit and healthcare.

Co-regulation can also be the most effective approach in dealing with the ravages of specific technologies, particularly during a technology's early years of dynamism and opacity. The criterion of 'technology neutrality' has been advanced in recent years. The motivation is economic, to avoid imposing penalties on innovative technologies. The approach involves expressing requirements in abstract functional terms rather than in detailed, highly prescriptive or procedural terms. Unfortunately, 'technology neutrality' is frequently used as a mantra, or as an excuse to escape regulation, and is applied to circumstances in which it is entirely inappropriate. Some common generic requirements are applicable to all of coal-fired, hydro-electric and nuclear power stations, and wind, tide and solar-power generation facilities; but each also needs to have further obligations imposed that are technology-specific in nature.

The regulatory hierarchy in general, and co-regulation in particular, have been previously applied to privacy protection in the specific contexts of the impact of drones on behavioural privacy (Clarke 2014b) and the impact of inferencing by means of AI/ML techniques (Clarke 2019b). In each case, regulation that is both effective and efficient needs to incorporate multiple elements from various layers. The specifications for an effective scheme to achieve co-regulation, as specified in Table 3 of (Clarke 2019b), appear to be readily applicable to many privacy-invasive technologies, and in many sectors.

4. Conclusions

This paper has presented a four-part framework for undertaking regulatory analysis. A series of examples has provided evidence of the relevance to the privacy arena of the models of the regulatory hierarchy, of regulatory players and of regulatory plays. Each needs consideration in its own right. On the other hand, all need to be also seen in the context of the overall scheme within which they play a role.

Two approaches have been presented to demonstrate the value of the four-part framework as a means of gaining a comprehensive view of the field. That in turn provides a basis for appreciating the options, for evaluating existing regimes, for devising adaptations, and for conceiving new regulatory regimes comprising appropriate combinations of measures to deliver privacy protection.

Reference List

Acquisti A., John L.K. & Loewenstein G. (2013) 'What Is Privacy Worth?' Journal of Legal Studies 42, 2 (June 2013) 249-274

ALRC (2008) 'For Your Information: Australian Privacy Law and Practice' Report 108, August 2008, Ch.42 - Journalism Exemption, at

ANAO (2007) 'Administering Regulation: Better Practice Guide' Australian National Audit Office, March 2007, at

APEC (2005) 'APEC Privacy Framework' Asia-Pacific Economic Cooperation, December 2005, at

APCC (1994) 'Australian Privacy Charter' Australian Privacy Charter Council, December 1994, at

Awad N.F. & Krishnan M.S. (2006) 'The Personalization Privacy Paradox: An Empirical Evaluation of Information Transparency and the Willingness to be Profiled Online for Personalization' MIS Quarterly 30, 1 (March 2006) 13-28, at

Ayres I. & Braithwaite J. (1992) 'Responsive Regulation: Transcending the Deregulation Debate' Oxford Univ. Press

Balleisen E.J. & Eisner M. (2009) 'The Promise and Pitfalls of Co-Regulation: How Governments Can Draw on Private Governance for Public Purpose' Ch. 6 in Moss D. & Cisternino J. (eds.) 'New Perspectives on Regulation' The Tobin Project, 2009, pp.127-149, at

Barnes S.B. (2006) 'A privacy paradox: Social networking in the United States' First Monday 11, 9 (September 2006), at

Bennett Moses L. & Chan J. (2018) 'Algorithmic prediction in policing: assumptions, evaluation, and accountability' Policing and Society 28, 7 (2018) 806-822, at

von Bertalanffy L. (1940) 'Der Organismus als physikalisches System betrachtet' Die Naturwissenschaften 28 (1940) 521-53

von Bertalanffy L. (1968) 'General System Theory: Foundations, Development, Applications' New York: George Braziller, 1968

Bösch C. et al. (2016) 'Tales from the Dark Side: Privacy Dark Strategies and Privacy Dark Patterns' Proc. PETS 4 (2016) 237-254, at

Braithwaite J. (1982) 'Enforced self-regulation: A new strategy for corporate crime control' Michigan Law Review 80, 7 (1982) 1466-507

Braithwaite J. (2017) 'Types of responsiveness' Chapter 7 in Drahos (2017), pp. 117-132, at

Braithwaite J. & Drahos P. (2000) 'Global Business Regulation' Cambridge University Press, 2000

Burkert H. (1997) 'Privacy-Enhancing Technologies: Typology, Critique, Vision' in Agre P.E. & Rotenberg M. (Eds.) (1997) 'Technology and Privacy: The New Landscape' MIT Press, 1997

Clarke R. (1995) 'A Normative Regulatory Framework for Computer Matching' Journal of Computer & Information Law XIII,4 (Summer 1995) 585-633, PrePrint at

Clarke R. (1997) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms' Xamax Consultancy Pty Ltd, August 1997, at

Clarke R. (1999) 'Internet Privacy Concerns Confirm the Case for Intervention' Commun. ACM 42, 2 (February 1999) 60-67, PrePrint at

Clarke R. (2001a) 'Introducing PITs and PETs: Technologies Affecting Privacy' Privacy Law & Policy Reporter 7, 9 (March 2001) 181-183, 188, PrePrint at

Clarke R. (2001b) 'Meta-Brands' Privacy Law & Policy Reporter 7, 11 (May 2001), PrePrint at

Clarke R. (2004) 'Identity Management: The Technologies, Their Business Value, Their Problems, Their Prospects' Xamax Consultancy Pty Ltd, March 2004, at

Clarke R. (2006) 'A Pilot Study of the Effectiveness of Privacy Policy Statements' Proc. 19th Bled eCommerce Conf., Slovenia, 5-7 June 2006, PrePrint at

Clarke R. (2009) 'Privacy Impact Assessment: Its Origins and Development' Computer Law & Security Review 25, 2 (April 2009) 123-135, PrePrint is at

Clarke R. (2012) 'Privacy and the Media - A Platform for Change?' Uni of WA Law Review 36, 1 (June 2012) 158-198, PrePrint at

Clarke R. (2014a) 'Key Factors in the Limited Adoption of End-User PETs' Politics of Surveillance Workshop, University of Ottawa, May 8-10, 2014, at

Clarke R. (2014b) 'The Regulation of the Impact of Civilian Drones on Behavioural Privacy' Computer Law & Security Review 30, 3 (June 2014) 286-305, PrePrint at

Clarke R. (2015) 'Freedom and Privacy: Positive and Negative Effects of Mobile and Internet Applications' Interdisciplinary Conference on 'Privacy and Freedom', Bielefeld University, 4-5 May 2015, at

Clarke R. (2016) 'Can We Productise Secure eWorking Environments?' Workshop for 11th IFIP Summer School on Privacy and Identity Management, 21-26 August 2016, Karlstad, Sweden, at

Clarke R. (2017a) 'An Instrumentalist's View of Koops et al.'s Typology of Privacy' Panel Notes, CPDP, Brussels, January 2017, Xamax Consultancy Pty Ltd, at

Clarke R. (2017b) 'The Distinction between a PIA and a Data Protection Impact Assessment (DPIA) under the EU GDPR' Panel Notes, CPDP, Brussels, January 2017, Xamax Consultancy Pty Ltd, at

Clarke R. (2019a) 'Risks Inherent in the Digital Surveillance Economy: A Research Agenda' Journal of Information Technology 34,1 (Mar 2019) 59-80, PrePrint at

Clarke R. (2019b) 'Regulatory Alternatives for AI' Computer Law & Security Review 35, 4 (Jul-Aug 2019) 398-409, PrePrint at

Clarke R. & Bennett Moses L. (2014) 'The Regulation of Civilian Drones' Impacts on Public Safety' Computer Law & Security Review 30, 3 (June 2014) 263-285, PrePrint at

CoE (1981) 'Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data' Council of Europe Treaty No. 108, January 1981, at

Connolly C., Greenleaf G. & Waters N. (2014) 'Privacy self-regulation in crisis? TRUSTe's 'deceptive' practices' Privacy Laws & Business International Report 132 (December 2014) 13-17, at

Connolly C., Greenleaf G. & Waters N. (2015) 'Privacy groups win changes to APEC CBPR system' Privacy Laws & Business International Report 133 (February 2015) 32-33, at

Culnan M. & Armstrong P.K. (1999) 'Information Privacy Concerns, Procedural Fairness and Impersonal Trust: An Empirical Investigation' Organization Science 10, 1 (January-February 1999) 104-115, at

Dinev T. & Hart P. (2006) 'An Extended Privacy Calculus Model for E-Commerce Transactions' Information Systems Research 17, 1 (March 2006) 61-80

Drahos P. (ed.) (2017) 'Regulatory Theory: Foundations and Applications' ANU Press, 2017, at

Drahos P. & Krygier M. (2017) 'Regulation, institutions and networks' Ch. 1 in Drahos (2017), at

EPIC (2011) 'EPIC v. FTC (Enforcement of the Google Consent Order)' Electronic Privacy Information Center, 2011, at

FTC (1997) 'Individual Reference Services: A Report to Congress' Federal Trade Commission, December 1997, at

FTC (2014) 'In the Matter of Apperian, Inc.' Federal Trade Commission, June 2014, at

Finn R. L., Wright D. & Friedewald M. (2013) 'Seven types of privacy' in Gutwith S. et al. (eds) 'European Data Protection: Coming of Age' , Springer, 2013, pp. 3-32, at

Fisher J. & Harindranath G. (2004) 'Regulation as a barrier to electronic commerce in Europe: the case of the European fund management industry' Euro. J. Info. Syst. 13, 4 (2004) 260-272

GDPR (2016) 'EU General Data Protection Regulation', April 2016, at

Gilad S. (2008) 'Accountability or Expectations Management? The Role of the Ombudsman in Financial Regulation' Law & Policy 30, 2 (April 2008) 227-253, at

Goldberg I., Wagner D. & Brewer E. (1997) 'Privacy-enhancing Technologies for the Internet' Proc. 42nd IEEE Spring COMPCON, February 1997, at

Grabosky P. (2017) 'Meta-Regulation' Chapter 9 in Drahos (2017), pp. 149-161, at

Greenaway K.E., Chan Y.E. & Crossler R.E. (2015) 'Company information privacy orientation: a conceptual framework' Info Systems J 25, 6 (2015) 579-606

Greenleaf G. (2001) ''Tabula Rasa': Ten Reasons Why Australian Privacy Law Does Not Exist' UNSW Law Journal 24, 1 (2001) 262-269, at

Greenleaf G. (2012) 'A Critique of Australia's Proposed Privacy Amendment (Enhancing Privacy Protection) Bill 2012' UNSW Law Research Paper No. 2012-35, at

Greenleaf G. (2014) 'Asian Data Privacy Laws: Trade and Human Rights Perspectives' Oxford University Press, 2014, Updates 2014-17 at, and 2017-18 at

Greenleaf G. (2019) 'Global Tables of Data Privacy Laws and Bills' 6th Ed, January 2019, Supplement to Privacy Laws & Business International Report 157, at

Greenleaf G. & Cottier B. (2018) 'Data privacy laws and bills: Growth in Africa, GDPR influence' Privacy Laws & Business International Report 152 (2018) 11-13

Gunningham N. & Sinclair D. (2017) 'Smart Regulation', Chapter 8 in Drahos (2017), pp. 133-148, at

Gunningham N., Grabosky P, & Sinclair D. (1998) 'Smart Regulation: Designing Environmental Policy' Oxford University Press, 1998

Gupta A. & Lad L. (1983) 'Industry self-regulation: An economic, organizational, and political analysis' The Academy of Management Review 8, 3 (1983) 416-425

Hardin G. (1968) 'The Tragedy of the Commons' Science 162 (1968) 1243-1248, at

Hardin G. (1994) 'Postscript: The tragedy of the unmanaged commons' Trends in Ecology & Evolution 9, 5 (May 1994) 199

Hedman J. & Henningsson S. (2016) 'Developing Ecological Sustainability: A Green IS Response Model' Information Systems Journal 26, 3 (2016) 259-287

Hepburn G. (2006) 'Alternatives To Traditional Regulation' OECD Regulatory Policy Division, undated, apparently of 2006, at

Hosein G., Tsavios P. & Whitley E. (2003) 'Regulating Architecture and Architectures of Regulation: Contributions from Information Systems' International Review of Law, Computers and Technology 17, 1 (2003) 85-98

IPCR (1995) 'Privacy-Enhancing Technologies: The Path to Anonymity' Information and Privacy Commissioner (Ontario, Canada) and Registratiekamer (The Netherlands), 2 vols., August 1995, Vol. II at

Jorstad E. (2001) 'The Privacy Paradox' William Mitchell Law Review 27, 3, Article 16, at

Kemp K. & Greenleaf G. (2020) 'Competition and consumer watchdog spurs Australian privacy changes' Privacy Laws & Business International Report, October 2020

King A.A. & Lennox M.J. (2000) 'Industry self-regulation without sanctions: The chemical industry's responsible care program' Academy of Management J. 43, 4 (August 2000) 698-716, at

Koch M. & Woerndl W. (2001) 'Community Support and Identity Management' Proc. Europ. Conf. on Computer-Supported Cooperative Work (ECSCW2001), Bonn, Germany, September 2001, at

Koops B.J., Newell B.C., Timan T., Škorvánek I., Chokrevski T. & Galič M. (2016) 'A Typology of Privacy' University of Pennsylvania Journal of International Law 38, 2 (2016), at

Laudon K.C. (1993) 'Markets and Privacy' Stern School of Business, Working Paper Series IS-93-21, July 1993. Revised version published in Communications of the ACM 39, 9 (September 1996) 92-104

Lessig L. (1999) 'Code and Other Laws of Cyberspace' Basic Books, 1999

Libecap G.D. (2008) 'State Regulation of Open-Access, Common-Pool Resources' Ch. 21 in Ménard C. & Shirley M.M. (Eds.) 'Handbook of New Institutional Economics' Springer, 2008

Madani K. (2010) 'Game theory and water resources' Journal of Hydrology 381 (2010) 225-238, at

Mann M. (2018) 'Privacy in Australia: Brief to UN Special Rapporteur on Right to Privacy' Australian Privacy Foundation, at

Mitnick K.D. & Simon W.L. (2002) 'The Art of Deception: Controlling the Human Element of Security' Wiley, 2002

Moores T.T. & Dhillon G. (2003) 'Do privacy seals in e-commerce really work?' Communications of the ACM 46, 12 (December 2003) 265-271

OECD (2013) 'The OECD Privacy Framework' Organisation for Economic Co-operation and Development, 2013, at

Ostrom E. (1999) 'Coping with Tragedies of the Commons' Annual Review of Political Science 2 (June 1999) 493-535, at

Parker C. (2002) 'The Open Corporation: Effective Self-regulation and Democracy' Cambridge University Press, 2002

Parker C. (2007) 'Meta-Regulation: Legal Accountability for Corporate Social Responsibility?' in McBarnet D., Voiculescu A. & Campbell T. (eds) 'The New Corporate Accountability: Corporate Social Responsibility and the Law' Cambridge University Press, 2007

PC (2006) 'Rethinking Regulation' Report of the Taskforce on Reducing Regulatory Burdens on Business, Productivity Commission, January 2006, at

Posner R.A. (1977) 'The Right of Privacy' Georgia Law Review 12 (1977) 393, at

Schneier B. (2015a) 'Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World' Norton, March 2015

Schneier B. (2015b) 'How to mess with surveillance' Slate, 2 March 2015, at

Sethi S.P. (1975) 'Dimensions of Corporate Social Performance: An Analytical Framework' California Management Review 17, 3 (1975) 58-64

Sethi S.P. & Emelianova O. (2006) 'A failed strategy of using voluntary codes of conduct by the global mining industry' Corporate Governance 6, 3 (2006) 226-238, at

Shapiro S.A. (2012) 'Blowout: Legal Legacy of the Deepwater Horizon Catastrophe: The Complexity of Regulatory Capture: Diagnosis, Causality, and Remediation' Roger Williams Uni. L. Rev. 17, 1 (Winter 2012) 221-257, at

Smith A. (1776) 'The Wealth of Nations' W. Strahan and T. Cadell, London, 1776

Stiglitz J. (2008) 'Government Failure vs. Market Failure' Principles of Regulation - Working Paper #144, Initiative for Policy Dialogue, February 2008, at

Varian H.R. (1996) 'Economic Aspects of Personal Privacy' University of California, Berkeley, December 1996, at

Westin A.F. (1967) 'Privacy and Freedom' Atheneum 1967

Westin A.F. (Ed.) (1971) 'Information Technology in a Democracy' Harvard University Press, Cambridge, Mass., 1971

Westin A.F. & Baker M.A. (1974) 'Databanks in a Free Society: Computers, Record-Keeping and Privacy' Quadrangle 1974

Wiener N. (1948) 'Cybernetics, or Control and Communication in the Animal and the Machine' MIT Press, Cambridge, Massachusetts, 1948, 1961

Williamson O.E. (1979) 'Transaction-cost economics: the governance of contractual relations' Journal of Law and Economics 22, 2 (October 1979) 233-261

Wood D.J. (1991) 'Corporate Social Performance Revisited' Academy of Management Review 16, 4 (1991) 691-718

Wright D. & De Hert P. (Eds.) (2016) 'Enforcing Privacy: Regulatory, Legal and Technological Approaches' Springer, 2016


The models presented in this paper were first published in a Working Paper in the author's own repository in late 2018. They have since been applied in several contexts, including the regulation of drones and of AI, and research and practice in the RegTech area. The author expresses his appreciation for the constructive comments provided by two CPDP reviewers, some of which have been reflected in this version and others of which will be in subsequent versions.

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor associated with the Allens Hub for Technology, Law and Innovation in UNSW Law, and a Visiting Professor in the Research School of Computer Science at the Australian National University.

Created: 1 October 2020 - Last Amended: 22 January 2021 by Roger Clarke - Site Last Verified: 15 February 2009