
Regulatory Regimes for Disruptive IT:
A Framework for Their Design and Evaluation

Review Draft of 1 May 2025

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2025

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.

This document is at http://rogerclarke.com/EC/FRR.html


Abstract

The pervasiveness and the impactfulness of information technology (IT) have been growing steeply for decades. Recent forms of IT are highly obscure in their operation. At the same time, IT-based systems are being permitted greater freedom to draw inferences, make decisions, and even act in the real world, without meaningful supervision. There are prospects of serious harm arising from misconceived, mis-designed or mis-implemented projects.

Organisations developing and applying IT need to be subject to obligations to take degrees of care, prior to deploying impactful initiatives, that are commensurate with the risks involved. They also need to be subject to accountability mechanisms that act as strong disincentives against reckless behaviour by executives and professionals alike. This article presents a framework for evaluating the efficacy of regulatory regimes for impactful IT-based systems, designing new regimes, and adapting existing ones. The framework has been matured over several decades and applied in multiple contexts.

The article commences by defining regulation and the kinds of entities and behaviour to which it is applied, and identifying the criteria for an effective regulatory mechanism. This is followed by presentation of models of the layers of regulatory measures from which regimes are constructed, and the players in the processes of regime formation and operation. Observations are also provided concerning the nature of the principles and rules that need to be established in order to provide substance within the regulatory frame. An evaluation form is provided as an Appendix. Supplementary Materials contain pilot applications of the evaluation form in several diverse contexts. A companion article applies the framework to a technology of current concern.




1. Introduction

Computing and related information technologies (IT) have been applied to support organisational activities for 75 years (Land 2012). During that time, there has been an enormous increase in both the scale and scope of functions that are supported and the transformational and disruptive impacts of IT applications. The focus has shifted progressively, from 'data processing' to support records management, through 'information systems' to support management planning and control, to 'decision support systems' serving the needs of executives. IT applications have escaped the confines of a single organisation and serve pairs, chains and networks of organisations. Many systems are now extra-organisational (Clarke 1992), in that individuals are direct participants, not only as employees, but also as consumers and citizens. Most data is now 'born digital', the scale of data-holdings is massive, and the scramble for competitive advantage from it borders on the frantic. Digitisation has spawned digitalisation, whereby interpretation and management of the world is now far less through human perception and cognition, and instead heavily dependent on computer-performed manipulation of digital data (Brennen & Kreiss 2016).

In this new context, IT-based systems are generating new data by drawing inferences from available data and models, and are being delegated the power to make decisions. They are also increasingly acting directly in the real world, with increasing degrees of autonomy. This is accompanied by decreasing degrees of transparency, exacerbated during the early 2020s by the uncontrolled application of machine-learning forms of artificial intelligence (AI/ML) and Generative AI, which deliver inferences, decisions and even actions based on data alone, devoid of rationale. Significant public concern exists about uncontrolled IT applications, giving rise to demands for protections and accountability mechanisms, and for them to be imposed far earlier in the technology adoption cycle than has been the case in the past.

For protection and accountability objectives to be achieved, contemporary IT-based systems, particularly those driven by organisations of large scale and scope, need to be subjected to regulatory regimes. Those regimes must take into account the desires of organisations to have the opportunity to experiment, of governments to encourage productivity gains, and of affected parties not to bear the costs of ill-judged IT applications, together with the risks arising from impactful and opaque technologies.

The research reported in this article is motivated by the author's dual experiences over recent decades. On the one hand, in a consultancy role, the author has advised organisations on strategy and policy in relation to their adoption and exploitation of disruptive forms of IT. On the other hand, the author's activities in research, and in public interest advocacy, have had a stronger emphasis on public policy aspects, resulting in contributions in the area of responsible application of technologies as diverse as drones, big-data analytics, AI generally, and AI/ML and Generative AI in particular. The perspective brought to this work is that of an information systems professional, consultant, and researcher. Although this path has drawn the author into the field of technology law, his interest in the law is as an instrument of policy rather than as the object of study.

The research reported here presents a framework within which regulatory regimes can be observed and critiqued, their efficacy can be evaluated, existing schemes can be adapted, and new regimes can be designed. In Section 2, the nature and objectives of regulation are considered, and the range of different objects that may be subjected to regulation is identified. In this, and most of the following sections, the text draws on previous refereed works by the author, citing both those and key underlying sources. A range of diverse examples is provided throughout. Section 3 presents the required attributes of a regulatory regime that satisfies policy objectives. In Section 4, the many forms that regulation can take are presented by means of a layered model. The entities that are involved in a regulatory regime are described in Section 5, together with examples of patterns of behaviour among the players. Section 6 discusses the nature of the principles on which the specifics of each particular regime need to be built. In Section 7, indications are provided of how a regime can be articulated and implemented so as to achieve policy objectives. An approach to using the framework to evaluate an existing regulatory regime is outlined in Section 8, and supported by an evaluation template. Pilot uses of it are available as Supplementary Materials. A comprehensive application of the framework is not feasible within the space constraints of a single article. However, a companion article considers the emergent regime for what the European Commission refers to as AI.


2. Regulation, and the Object of Regulation

In biology, natural processes are subject to other natural processes whose effect is to limit, control or regulate them. A definition arising from the biological sciences in about 1900 is:

Regulation is the process whereby a living organism ... adapts its structure in order to accommodate disturbances or damage that it undergoes, so that it develops as an integrated whole (OED 5a)

This gave rise to general systems theory, which posits the tendency of natural systems to maintain the status quo (von Bertalanffy 1940, 1968). During the industrial revolution, Watt applied a fly-ball governor to a steam engine. This subjected the behaviour of a mechanical artefact to automated control by an element within the same artefact. From such innovations, the insights of cybernetics emerged, whereby sensors deliver data that enables a controlling device to monitor a process, and to use effectors to influence the process. Successive levels of nested controllers enable the management of complex digital and physical systems (Wiener 1948).
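To make the cybernetic loop concrete, the following minimal sketch, written in Python with purely illustrative names and numbers, shows a sensor reading being compared against a set-point by a controller, which then drives an effector; nesting such loops one inside another yields the layered control just described.

    # Minimal sketch of a cybernetic control loop: a sensor reports the state of a
    # process, a controller compares it with a set-point, and an effector adjusts
    # the process. All names and values are illustrative only.

    def run_control_loop(process_state: float, set_point: float, gain: float = 0.5,
                         steps: int = 20) -> float:
        """Steer a one-variable process towards a set-point by proportional control."""
        for _ in range(steps):
            reading = process_state              # sensor: observe the process
            error = set_point - reading          # controller: compare with the goal
            adjustment = gain * error            # controller: decide on a correction
            process_state += adjustment          # effector: act on the process
        return process_state

    if __name__ == "__main__":
        # A process starting at 10.0 is steered towards a set-point of 25.0
        print(round(run_control_loop(10.0, 25.0), 2))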

The present analysis is concerned with neither biological phenomena nor manufacturing processes, but with socio-technical systems. The OED definition can be usefully co-opted to provide a suitable definition for the present purpose:

Regulation is the process whereby a socio-technical system adapts its structure and processes in order to accommodate disturbances or damage that it undergoes, so that it operates and adapts as an integrated whole

Key entities involved in the regulatory arena, and key relationships among them are represented in Figure 1. In an unregulated state, one entity has a negative effect on the interests of a second entity. In a regulated state, the second entity has been converted into a 'Beneficiary' of the existence, power and actions of a third entity, commonly called a 'Regulator', which influences the behaviour of the first entity, referred to as the 'Regulatee'. Regulators may be tightly-controlled government agencies or relatively independent bodies. Regulatees include all forms of incorporated and unincorporated bodies, and individuals. Beneficiaries include all of those categories but also social constructs such as trust in social and economic institutions, and environmental values.

Figure 1: Abstract Model of the Entities Involved in Regulatory Schemes

From Clarke (2021, Fig. 1)

A definition of regulatory measures in economic and social contexts is "instruments used ... to influence or control the way people and businesses behave in order to achieve economic, social or environmental policy objectives" (ANAO 2007). This nicely conveys the idea that there are multiple ways of influencing behaviour. The definition used in this article, however, broadens the ANAO definition in two further ways. The first is that 'instruments' is generalised to 'mechanisms', in order to encompass the natural as well as the human-made. The second, reflecting that change, is that the assumption of intentionality is removed, replacing a teleological approach with a utilitarian one:

A Regulatory Regime is a set of mechanisms that influence or control the way entities behave within a socio-technical system, and that thereby contribute to the achievement of economic, social and/or environmental policy objectives

That definition is expressed in an abstract manner. This enables it to be applied in rather different ways, depending on the context, on the particular policy objectives, and on the object that is to be subject to the regulatory regime. The object that is the focal point of a regulatory regime may be conceived in a variety of ways, as indicated by the categories in Table 1. For example, the focus may be Closed-Circuit TV (CCTV) generally, the application of it specifically to Automated Number Plate Recognition (ANPR), or to a category of organisations applying it, such as law enforcement agencies or parking stations.

Table 1: Categories of Objects of a Regulatory Regime

Drawing on these definitions of terms and scope, the focus now switches to the question of what characteristics a regulatory regime needs to satisfy.


3. Criteria for the Evaluation of a Regulatory Regime

The policy objectives of any particular regulatory regime require articulation, and must reflect the context, including socio-technical, economic and socio-political factors. Broadly speaking, however, the foreground purpose of a regulatory regime is to exercise control over harmful behaviours. The primary Beneficiaries will commonly be perceived to be those who suffer harm and bear risks. However, in many circumstances considerable weight may also be placed on harm to the reputation of an industry sector, particularly where serious misbehaviour by a minority of industry participants undermines the trustworthiness of the industry as a whole.

The control objective is, however, generally subject to constraints and/or balanced against other, conflicting purposes. In addition to effectiveness in exercising control, the set of measures needs to be efficient, i.e. to impose no higher costs on organisations than are justified by the harm being avoided. It must also be sufficiently flexible to accommodate variations in practice, and adaptable as circumstances change. Particularly during the early stages of new or substantially-changed patterns, phased regulation may be appropriate, with 'light-handed' measures imposed at first, to enable early adoption, learning, and the demonstration of benefits, but subject to controls being exercised increasingly vigorously as the patterns mature. This represents a cautious approach to implementation of 'the precautionary principle' in the ICT arena, which is particularly important in the case of changes in patterns driven by, or enabled by, technological innovations of a disruptive nature (Wingspread 1998, Som et al. 2004, Som et al. 2009).

Criteria discussed in Gunningham et al. (1998), Hepburn (2006) and ANAO (2007) were drawn on in expressing the set of attributes of regulatory arrangements in Table 2, an early version of which was originally presented in Clarke & Bennett Moses (2014) and further developed in Clarke (2021).

Table 2: Criteria for the Evaluation of a Regulatory Regime

After Clarke (2021, Table 1)

These criteria facilitate the evaluation of existing regulatory regimes, the adaptation of existing schemes, and the development of new schemes. Given that framing, consideration needs to be given to what mechanisms are available that can be used to build a regulatory regime that has the requisite characteristics.


4. Layers of Regulatory Mechanisms

As previously indicated, a regulatory regime comprises a set of mechanisms that influence or exercise control over behaviours that have the potential to do harm. Many mechanisms exist, some natural and others human-designed. To provide some structure to an examination of the many options available, this section presents a series of layers, and provides examples of mechanisms within each layer. This part of the framework was originally laid out in an unpublished working paper in 2018, and has been subsequently applied in the contexts of privacy protection (Clarke 2021) and electronic markets (Clarke 2022a).

The approach adopted here reflects, but varies, the Braithwaite-Drahos model (Ayres & Braithwaite 1992, Drahos 2017, Drahos & Krygier 2017). The highest layers of Figure 2 depict the formal alternatives; beneath them are shown the self-governance alternatives; and below those, two forms of systemic governance.

Figure 2: A Hierarchy of Regulatory Mechanisms

From Clarke (2020, Figure 1)

For reviews of the upper five layers, see Baldwin et al. (2011) and Drahos (2017). Layer (7), Formal Regulation, is the realm of legislatures and regulatory agencies, identified by the keywords 'government' (of behaviour) and 'compliance' by Regulatees. An instrument may be expressed in statute, or in a Schedule to a statute, or in a subsidiary instrument such as Regulations whose promulgation may be delegated by the parliament to a government agency. Other categories of instrument include treaties that bind the State, and decisions by courts and tribunals, which influence subsequent decisions (depending on the jurisdiction and on court hierarchies).

Two softer forms are allocated to Layer (6). The idea of 'Meta-Regulation' is that the Regulatee is subject to a formal requirement to satisfy some broad regulatory principles, with the onus placed on the Regulatee to demonstrate that its practices satisfy those principles (Gupta & Lad 1983, Parker 2007). Another description of it is 'enforced self-regulation', an expression traced by Grabosky (2017) to Braithwaite (1982). In 'Co-Regulation', which is discussed later in this article, both Regulator and Regulatees, and, highly desirably, also Beneficiaries, are involved in the setting of the standards against which practices will be assessed. In its original conception, it was intrinsic to the approach that the resulting Rules were enforced by the Regulator (Ayres & Braithwaite 1992). In some jurisdictions, further 'soft law' categories may exist, such as binding self-regulation and formal undertakings.

Formal, legal instruments may mandate the performance of a particular practice, or at the other extreme, may absolutely prohibit it. The functions of a great many laws, however, lie somewhere between the extremes. Table 3 presents a set of 6 modalities. It is important to distinguish 'pre-conditions', which are threshold tests that result in permission or otherwise for a particular activity to be performed, from 'post-conditions', which apply to those activities that do proceed.

Table 3: Modalities of Law

Adapted from Clarke & Greenleaf (2018, Table 2)

The middle three layers relate to the various forms of 'self-governance' of behaviour by individual organisations and by sectors as a whole. Examples at Layer (3), Organisational Self-Regulation, include internal codes of conduct and 'customer charters', and self-restraint associated with expressions such as 'business ethics' and 'corporate social responsibility' (Parker 2002). In Layer (4), Industry Sector Self-Regulation, schemes exist that express technical or process standards, codes of conduct or practice or ethics, Memoranda of Understanding (MoUs), and accreditation ('tick-of-approval' or 'good housekeeping') schemes. By their nature, and under the influence of trade practices / anti-monopoly / anti-cartel laws, such instruments seldom apply to all players, and are rarely binding. They seldom affect the 'cowboys' in the industry, which tend to be responsible for a disproportionate amount of the harm that the industry causes (Sethi & Emelianova 2006). Similarly, 'good housekeeping' meta-brands are rarely materially protective of the interests of the nominal beneficiaries (Clarke 2001, Moores & Dhillon 2003).

Layer (5), Pseudo Meta- and Co-Regulation, is grouped with the self-regulatory layers because the nature of these approaches has been debased. For example, the Australian Privacy Act cl.7B(4) exempts "a media organisation" for acts while "engaged in" "journalism", provided that the organisation is "publicly committed to observe standards that deal with privacy ..." and that "have been published in writing". There are no minimum standards, so there is no notion of non-compliance with any substantive requirement. Similarly, many standards in, for example, the field of telecommunications services, are negotiated in the absence of advocates for the beneficiaries, and some are by design unenforceable.

The three self-regulatory layers, perceived from the viewpoint of the entities that are meant to be Beneficiaries of regulatory regimes, are in many cases close to valueless, merely window-dressing to camouflage industry practices, or a fig-leaf to ward off formal regulation. In the words of Braithwaite (2017), "self-regulation has a formidable history of industry abuse of privilege" (p.124). Gunningham & Sinclair (2017) conclude that 'voluntarism' is generally an effective regulatory element only when it exists in combination with 'command-and-control' components.

In the most fundamental Layer (1), Natural Regulation, are features and processes that are intrinsic to the relevant socio-economic system and that have a regulatory effect. Examples include competition for limited resources forcing up market value, the exercise of countervailing power by those affected by an initiative, independent activities by competitors, cost/benefit trade-offs, and reputational factors.

The postulates of an "invisible hand" that promotes the public interest (Smith 1776), and that economic systems are therefore inherently self-regulating, were later bolstered by transaction cost economics (Williamson 1979). Limits to those ideas include 'the tragedy of the (unmanaged) commons' (Hardin 1968, 1994, Ostrom 1999). "Similarly, whereas neo-conservative economists commonly recognise 'market failure' as the sole justification for interventions, Stiglitz (2008) adds 'market irrationality' (which justifies the use of circuit-breakers to stop bandwagon effects in stock markets) and 'distributive justice' (in such forms as safety nets and anti-discrimination measures)" (Clarke 2021).

The first of six principles proposed by the Australian Productivity Commission expressed the importance of an adequate appreciation of pre-existing natural controls as a precursor to any analysis of regulation: "Governments should not act to address 'problems' through regulation unless a case for action has been clearly established. This should include evaluating and explaining why existing measures are not sufficient to deal with the issue" (PC 2006, p.v). A further consideration is that regulatory measures can be designed to reinforce natural controls. For example, the cost/benefit/risk balance perceived by the players can be adapted as part of the design of a regulatory regime, by subsidising costs, levying fees and/or assigning risk.

Layer (2), Infrastructural Regulation, is all-too-often overlooked. The mechanical steam governor is a physical artefact that represents a regulatory instrument. IT can be harnessed to the same purpose. An early description of 'intrinsic controls' over computer matching is in Clarke (1995a), and Clarke (2014b) identified a range of 'natural controls' in relation to drones. 'West Coast Code' (Lessig 1999, Hosein et al. 2003) has been a popular way to refer to infrastructural features that reinforce positive aspects of the relevant socio-economic system, and/or preclude or inhibit negative aspects. Beyond 'code' alone lie standards, protocols, hardware and service-layers, default settings, authentication of messages and data, message encryption, pseudonymous identities and obfuscatory routing. So it is more appropriate to refer to architectural and infrastructural features than to 'code' (Greenleaf 1998).

Such features may be incidental, architected into infrastructure, or retro-fitted onto it. There has been increasing recognition of the value of designing-in such features. Internal and external audit processes are now facilitated by embedded data-sampling routines. Stock and commodities exchanges conduct real-time anomaly detection and reporting. Most recently, Fintechs and RegTechs have been instigating new forms of innovation in these areas.
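As an indication of what such designed-in features can look like, the following sketch, which is illustrative only and not drawn from any particular product, embeds a data-sampling routine of the kind used to support audit, and a simple real-time anomaly-flagging routine of the kind used by exchanges; the field-names, sampling rate and threshold are assumptions.

    # Illustrative sketch of two infrastructural controls: embedded sampling of
    # transactions for later audit, and simple real-time anomaly flagging.
    # The field-names, sampling rate and threshold are assumptions.
    import random

    def sample_for_audit(transactions, rate=0.01, seed=42):
        """Embedded data-sampling routine: retain a random 1% of transactions for auditors."""
        rng = random.Random(seed)
        return [t for t in transactions if rng.random() < rate]

    def flag_anomalies(transactions, limit=10_000):
        """Real-time anomaly detection: report transactions whose value exceeds a limit."""
        return [t for t in transactions if t["value"] > limit]

    if __name__ == "__main__":
        rng = random.Random(1)
        txns = [{"id": i, "value": rng.randint(1, 20_000)} for i in range(1000)]
        print(len(sample_for_audit(txns)), "sampled for audit;",
              len(flag_anomalies(txns)), "flagged as anomalous")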

In Table 4, categories of regulation by architecture and infrastructure are identified. These encompass technological features that enable particular activities ('Enablement') and design features that prevent or intentionally fail to enable particular activities ('Preclusion'). Between those two extremes, technologies may only enable activities if they are actively consented to ('Willing Participation', or in US parlance, 'opt-in'), and technologies may enable activities unless the individual performs an action of the nature of denial or circumvention ('Default Participation' / 'opt-out').

Table 4: Modalities of Regulation by Architecture and Infrastructure

Adapted from Clarke & Greenleaf (2018, Table 1)
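The following sketch suggests how the four modalities in Table 4 might manifest as design choices within software; the modality names follow the table, while the consent and objection flags are hypothetical.

    # Sketch of the four modalities of regulation by architecture as software design
    # choices. Modality names follow Table 4; the consent/objection flags are hypothetical.
    from enum import Enum

    class Modality(Enum):
        ENABLEMENT = "always permitted by design"
        WILLING_PARTICIPATION = "opt-in: permitted only with express consent"
        DEFAULT_PARTICIPATION = "opt-out: permitted unless the person objects"
        PRECLUSION = "prevented, or intentionally not enabled"

    def is_permitted(modality: Modality, consented: bool = False, objected: bool = False) -> bool:
        """Decide whether a feature may operate for a given person under each modality."""
        if modality is Modality.ENABLEMENT:
            return True
        if modality is Modality.WILLING_PARTICIPATION:
            return consented                     # opt-in: silence means 'no'
        if modality is Modality.DEFAULT_PARTICIPATION:
            return not objected                  # opt-out: silence means 'yes'
        return False                             # Preclusion

    if __name__ == "__main__":
        print(is_permitted(Modality.WILLING_PARTICIPATION))   # False: no express consent
        print(is_permitted(Modality.DEFAULT_PARTICIPATION))   # True: no objection recorded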

Each regulatory regime comprises multiple regulatory mechanisms, which desirably work together towards the achievement of policy objectives. The next element within the framework proposed in this article provides a much more granular view of the entities that exist within a regulatory space, and the kinds of behaviours those entities display.


5. Players and Plays in Regulatory Regimes

In Section 2, the notion of regulation was introduced by distinguishing three broad categories of entity. A considerably more detailed model is of course necessary to enable the design and evaluation of regulatory regimes for complex real-world environments. Figure 3 identifies a sufficiently rich set of categories of players, and key inter-relationships among them. The model comprises three clusters, associated with the central roles of Regulator, Regulatee and Beneficiary. The model first appeared in a Working Paper of 2017, was first presented in Clarke (2020), and has subsequently been applied in two further contexts in Clarke (2021) and Clarke (2022).

Figure 3: Players in Regulatory Schemes

Adapted from Clarke (2020, Figure 2)

The topmost segment of the model reflects the fact that any particular Regulatee is likely to be subject to multiple Regulators. Mainstream examples serve public policy objectives in the fields of corporate behaviour, taxation, occupational health and safety, and product-specific aspects such as food, chemicals or financial advice. Generally, a Regulator is established, empowered, resourced and subject to disestablishment by a parliament. A Regulator may be accountable directly to the relevant parliament, or to a designated Minister through a high-level, portfolio agency. Regulators may use other specialist agencies or commercial third-parties, variously to negotiate, draft, promulgate, investigate, sue and prosecute. Regulators need to routinely defend themselves against overt and covert attacks from aggrieved Regulatees and their supporters, perhaps from aggrieved Beneficiaries or their advocates, and from policy agencies.

In some industry sectors, an intermediary regulatory role exists, in what is referred to by Ayres & Braithwaite (1992) as a 'tripartite' arrangement. For example, stock exchanges perform control functions in relation to listed corporations, and perhaps registrars, brokers and traders. Financial services industry clearing associations perform similar functions in relation to participants in payments systems. Some forms of occupational registration or certification are run not by regulatory agencies but by professional associations, particularly in medicine, law and engineering.

In the second segment of Figure 3, further players are shown as being associated with Regulatees. Industry associations represent the interests of Regulatees, and may run accreditation schemes or meta-brands (Clarke 2001) and industry complaints schemes. In many sectors, associations have established formalised Industry Standards, with technical Standards having various degrees of standing in law, and process Standards having some influence on practices. Auditors and consultants provide compliance-related services. Given the scale of organisations and their activities, all players are heavily dependent on IT, and on information systems that manage the relevant data, support decision processes, and in some cases automate decision-making and reporting. The term 'RegTech' has emerged recently to refer to specialist IT services of this nature (Arner et al. 2017, Clarke 2020).

Beneficiaries are most commonly individuals, associations, communities, unincorporated business enterprises, or incorporated small businesses. Generally, Beneficiaries lack market power. Moreover, most regulatory regimes provide Beneficiaries with only very little additional power. Beneficiaries may, however, have some capacity to sue and to achieve recompense through courts, tribunals, ombudsmen or other schemes, and some access to assistance to do so.

Each of the many entities identified in Figure 3 naturally has its own objectives, and its own interests to protect and advance. An organisation subject to regulatory requirements may adopt a 'responsible citizen' or 'corporate social {and environmental} responsibility' (CSR/CSER) attitude, with an objective of efficiently achieving compliance with regulatory requirements (Sethi 1975, Wood 1991, Hedman & Henningsson 2016). Alternatively, a 'cowboy' in the same sector may have the objective of avoiding, circumventing or ignoring regulatory requirements in order to minimise their negative impacts on the organisation's interests.

A Regulator, meanwhile, may act as a 'watchdog', interpreting its legal authority as widely as possible, and seeking to withstand the depredations wrought by lobbying against its activities. Alternatively, it may stolidly administer the scheme's enabling legislation; or it may perceive itself to have a minimalist, window-dressing role on behalf of government, and may even facilitate industry behaviour irrespective of the harm that it may cause (Drahos & Krygier 2017). The conventional term to describe the extreme form of dysfunction is 'regulatory capture' (Shapiro 2012).

The model of players and plays in Figure 3 provides a practical basis for describing, interpreting and analysing the overall efficacy of a regulatory regime, including its comprehensiveness, effectiveness and efficiency, and assessment of its likely impact and implications. The framework also enables policy-makers, executives, practitioners and academics to perform the 'sense-making' activities that necessarily precede the conception, design, development and deployment of new schemes, and the adaptation of existing schemes.

The preceding sections have described the architecture of a regulatory regime, comprising policy objectives, regulatory mechanisms both natural and human-made, and entities playing many different roles. The following section examines how the architectural frame is provided with substantive content, in the form of principles and rules.


6. Principles and Rules

When a regulatory regime is devised, a great deal of attention is given to the structural elements of the entities involved, their powers and their duties, and the process aspects whereby various practices are influenced, and corrective actions are undertaken. At the heart of each regime, however, lies a set of substantive propositions, which, even if not clearly enunciated, can be inferred from the instruments that together define the scheme. A conventional term used for such propositions is 'principles'.

During the course of the decade-long project of which the present article forms a part, 'guidelines' were published for 'big data analytics' (Clarke 2018), and mapped to a business process (Clarke & Taylor 2018). These were tools for use within Layers (3) and (4), Organisational and Industry Self-Regulation. In a more substantive regulatory context, a set of 50 Principles for Responsible AI was derived from a diverse set of 30 such publications (Clarke 2019a), and subsequently applied to the particular technology of Generative AI (Clarke 2025a).

The term 'principles' has become engrained in discussions since at least the UK Financial Services Authority's travails around the time of the global financial crisis of 2007-09, and the serious weaknesses in banking regulation that the crisis exposed (RBA 2023). According to Black (2007), "principles-based regulation [PBR] means moving away from reliance on detailed, prescriptive rules and relying more on high-level, broadly stated rules or principles to set the standards by which regulated firms must conduct business. The term 'principles' can be used simply to refer to general rules, or also to suggest that these rules are implicitly higher in the implicit or explicit hierarchy of norms than more detailed rules: they express the fundamental obligations that all should observe".

In practical terms, a Principle is intended to be understood at a sufficient level of generality that it can be applied in a variety of contexts, and operationalised into more precise Rules that, in specific contexts, are meaningful, volitional and capable of being used as criteria to distinguish compliant from non-compliant behaviour.
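As a purely hypothetical illustration of that relationship, not drawn from any actual regime, the following sketch expresses a Principle in abstract terms and operationalises it into context-specific Rules, each of which is a testable criterion for distinguishing compliant from non-compliant practice.

    # Hypothetical illustration of the Principle-to-Rule relationship: the Principle is
    # abstract, while each Rule is a context-specific, testable criterion.
    from dataclasses import dataclass
    from typing import Callable

    PRINCIPLE = "Personal data is retained no longer than is necessary for the declared purpose."

    @dataclass
    class Rule:
        context: str
        test: Callable[[dict], bool]   # returns True if the observed practice is compliant

    RULES = [
        Rule("direct marketing records", lambda practice: practice["retention_days"] <= 90),
        Rule("billing records",          lambda practice: practice["retention_days"] <= 7 * 365),
    ]

    def assess(practices_by_context: dict) -> dict:
        """Apply each context-specific Rule to the observed practice for that context."""
        return {r.context: r.test(practices_by_context[r.context]) for r in RULES}

    if __name__ == "__main__":
        observed = {"direct marketing records": {"retention_days": 180},
                    "billing records": {"retention_days": 1000}}
        print(assess(observed))   # marketing: non-compliant; billing: compliant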

The FSA's implementation of PBR was pronounced a failure (Black 2012). The reason for this, however, was not the enunciation of Principles, but the failure to carry through to operationalisation of the abstract expression in specific contexts, and to the enforcement of compliance with those Rules. The following section outlines an approach to architecting regulatory regimes that addresses these weaknesses.


7. Articulation Through Co-Regulation

In Black (2008), a number of different forms of principles-based regulation are canvassed and critiqued. One variant has been referred to for many years as 'co-regulation'. This section proposes how a co-regulatory approach can harness the Principles-based approach, but deliver operationally-defined statements that can be applied as Rules, and enforced. An early description of the nature of a co-regulatory scheme is in Clarke (1999). As Internet ecology began to demonstrate a degree of maturation, that article proposed that privacy concerns established the case for a carefully-negotiated form of regulation. The proposition attracted attention, and continues to do so. However, the 11 September 2001 terrorist attacks had their intended effect of causing democracy to eat itself, and a quarter-century later little progress has been made in bringing order to the wild-west behaviour of digital corporations. The applicability of co-regulation to privacy more generally is discussed in Clarke (2021).

"Co-regulation is a cross-over point between self-governance and external governance" (Hepburn 2006). A key requirement has been declared as being to "integrate structures of private governance effectively within a larger institutional setting -- to embed those structures within a broader framework of public oversight" (Balleisen & Eisner 2009, p.129). Those authors' actual proposals, on the other hand, merely add a gloss to conventional self-regulation. Most other discussions in the literature also reduce co-regulation to a slightly refined form of non-regulation, usually self-regulation with some modest additional factor in play, such as moral suasion or potential reputational harm to regulatees. For example, a longstanding European Community definition of co-regulation refers to it as a mechanism whereby largely powerless parties outside government are "entrusted" with the attainment of policy objectives. A more recent article in the 'tech platform' space also indicates the degradation of the concept: "Co-regulation ... may look more like facilitating industry self-regulation, such as promoting certification to demonstrate compliance with industry best practices or obtaining informal commitments from companies to voluntarily disclose or adhere to certain reporting standards" (Cannon & Chung 2015, p.94, emphases added)

In Figure 2, Co-Regulation is assigned to Layer (6), whereas pale imitations of it are referred to as Pseudo Co-Regulation, classified as a form of Self-Regulation, and allocated to Layer (5). Substantive forms of Co-Regulation involve the establishment of a Code or Standard within a legislative context that makes the requirements enforceable. Regulatees, appropriately, have significant input to the requirements. An essential feature, however, is that advocates for the interests of the Beneficiaries of the measures have a high degree of influence over the expression in the Code or Standard (Clarke & Greenleaf 2018). In 2019, a proposal was formulated for a co-regulatory framework for the regulation of AI (Clarke 2019b). This declared five key features, as further refined in Table 5.

Table 5: A Comprehensive Co-Regulatory Framework

Adapted from Clarke (2019b, Table 3)


8. The Evaluation Process

The purpose of this article was declared as being to support the design of new regulatory regimes, and the evaluation and adaptation of existing schemes. The preceding sections directly support a design activity, by identifying relevant considerations and alternative approaches to addressing them. This section considers the use of the framework as a tool for evaluating existing regimes.

One approach to conducting an evaluation is to consider the intellectual content of the preceding sections, and adapt and extend the framework to reflect the particular context and particular object of the scheme. A more structured approach is also possible. In particular, the criteria expressed in Table 2 can be used as a checklist, and the features of the regulatory regime assessed, resulting in a score against each of the 16 items. A regulatory regime is a complex multidimensional construct, and any attempt to devise an authoritative scoring scheme would be doomed to failure. On the other hand, some broad indication of any particular regime's efficacy appears feasible. At the very least, the process of scoring brings with it a degree of discipline and documentation, and hence a firm foundation for discussion and debate.

In Appendix 1, several possible scoring schemes are identified. A simple binary or ternary tick could be applied. Alternatively, scores on a, say, 6-point scale could be assigned for each criterion. If preferred, differential weightings could be applied, perhaps favouring the Product and Outcome criteria over those that relate to the Process of developing the scheme. A template is provided in Appendix 2, designed to support either or both of a ternary tick-list and a differentially-weighted scheme normalised to a total of 100, and containing keywords to guide the scoring of each criterion. The template was piloted using three regimes of convenience. These were selected based on diversity among the objects of regulation, and on being sufficiently well-known to the author that they could be scored with relatively modest further research effort.
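As an indication of the arithmetic involved, the following sketch, in Python, combines a ternary rating for each of the 16 criteria with a set of weights normalised to a total of 100; the weights and ratings shown are placeholders rather than those of the actual template.

    # Sketch of the template's arithmetic: a ternary rating per criterion, combined with
    # weights normalised to 100. The weights and ratings below are placeholders only.
    TERNARY = {"Strong": 1.0, "Adequate": 0.5, "Weak": 0.0}

    def evaluate(ratings, weights):
        """Return a 0-100 score from per-criterion ternary ratings and weights summing to 100."""
        assert abs(sum(weights.values()) - 100) < 1e-6, "weights must be normalised to 100"
        return sum(weights[c] * TERNARY[r] for c, r in ratings.items())

    if __name__ == "__main__":
        criteria = [f"criterion_{i}" for i in range(1, 17)]       # 16 criteria, as in Table 2
        weights = {c: 100 / 16 for c in criteria}                 # placeholder: equal weights
        ratings = {c: ("Strong" if i % 3 == 0 else "Adequate")
                   for i, c in enumerate(criteria)}
        print(round(evaluate(ratings, weights), 1))               # 68.8 for this example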

The regulatory regime for a large-scale welfare benefits fraud and waste scheme was assessed on the basis of a recently-completed deep case study of the Australian 'Robodebt' project (Clarke et al. 2024). The extraordinarily low scores help explain the public opprobrium that the project earned. The Robodebt scoresheet is provided as Supplementary Materials. The second pilot considered the severe test of regulatory regimes for licensed taxis that arose from the anti-authoritarian emergence of 'ride-sharing' services. Moderate scores up to about 2010 were replaced by seriously low scores during and after that explosion. This drew on a previously-published case study (Clarke 2020). See the Uber scoresheet. As a more positive third pilot, the Australian Spam Act, operational since 2003, was evaluated. See the Spam Act scoresheet.

Experience arising from the three pilots has been reflected in modifications to the guidance and the evaluation template. This has provided a qualified indication that the framework and the guidance are at least usable, and to a lesser extent that they appear likely to be useful as a means of evaluating regulatory regimes. A comprehensive application of the framework is not feasible within the space constraints of a single article. However, a companion article (Clarke 2025b) considers the emergent regime established by the European Union's Artificial Intelligence Act (AI Act).


9. Conclusions

This article's purpose has been to present a framework whereby existing regulatory regimes, in particular for contexts in which disruptive information technologies are involved, can be studied, understood and improved, and new regulatory regimes can be designed. Definitions have been provided for 'regulation' and 'regulatory regime', and for the object to be subjected to the regime. The first element of the framework is a set of criteria for judging the effectiveness of a regulatory regime (presented in Section 3 and Table 2). A layered model of regulatory mechanisms is presented in Section 4, with attention also paid to the differing modalities of law (Table 3) and of regulation by architecture and infrastructure (Table 4). This is complemented by a model of the players and plays in the field (Section 5 and Figure 3). A discussion of substantive general statements ('Principles') and of operationalised forms of them applicable to particular circumstances ('Rules') is provided in Section 6. A co-regulatory approach is proposed in Section 7 as the appropriate way to achieve the articulation of Principles into Rules.

During the framework's gestation period, elements and versions of it have been applied in a variety of contexts. This commenced with Clarke (1995a), which had as its focus the dataveillance technique of computer matching, and Clarke (1995b), which was concerned with the regulation of obscene material in on-line information services. Multiple aspects of the model are also evident in an early foray into the new approaches that would be needed to the regulation of financial services in digital marketspaces (Clarke 1997). Some years later, an express set of principles for the regulation of surveillance was proposed in Clarke (2012), and applied to surveillance by the media in Clarke (2014a).

The first meta-description of aspects of the model presented in this article appeared in two contributions on the regulation of civilian drones, in Clarke & Bennett Moses (2014) and Clarke (2014b), and extended in Clarke (2016). The model was further matured, and applied to artificial intelligence (AI), in Clarke (2019b). Digital platforms were then considered in light of the model, in Greenleaf et al. (2019) and Clarke (2020). In Clarke (2022a), the broad field of electronic markets was addressed, with a lengthy section expanding the digital platform analysis to the specific case of the Uber 'ride-sharing' platform. Clarke (2021) applied the model to privacy protection, and most recently Clarke (2022b) had as its focus the application of AI to surveillance.

The model presented in this article has been applied in multiple contexts, at various stages in its maturation. It has proven to be effective in supporting those analyses, and has been refined to reflect what has been learnt from those projects. Of the nine articles published in the last decade (cited in the previous paragraph), which used moderately mature versions of the framework, six have appeared in leading journals and two others in refereed conference proceedings. Between them, they have gained over 600 Google citations to date, suggesting some degree of exposure and potential value to other researchers. An additional application of the framework is in a companion article (Clarke 2025b). Further applications are needed, in diverse contexts, by additional researchers and consultants, in order to test, demonstrate, refine, and adapt the framework.


Reference List

ANAO (2007) 'Administering Regulation: Better Practice Guide' Australian National Audit Office, March 2007, at http://www.anao.gov.au/~/media/Uploads/Documents/administering_regulation_.pdf

Arner D.W., Barberis J. & Buckley R.P. (2017) 'FinTech, RegTech, and the Reconceptualization of Financial Regulation' 37 Nw. J. Int'l L. & Bus. 371 (2017), at http://scholarlycommons.law.northwestern.edu/njilb/vol37/iss3/2

Ayres I. & Braithwaite J. (1992) 'Responsive Regulation: Transcending the Deregulation Debate' Oxford Univ. Press, 1992

Baldwin R., Cave M. & Lodge M. (2011) 'Understanding Regulation: Theory, Strategy, and Practice' Oxford University Press, 2nd ed., 2011

Balleisen E.J. & Eisner M. (2009) 'The Promise and Pitfalls of Co-Regulation: How Governments Can Draw on Private Governance for Public Purpose' Ch. 6 in Moss D. & Cisternino J. (eds.) 'New Perspectives on Regulation' The Tobin Project, 2009, pp.127-149, at https://libros.metabiblioteca.org/server/api/core/bitstreams/b5c6b2cb-b6ae-4919-b5d6-b332b26e7ff4/content#page=127

Black J. (2007) 'Principles based regulation: risks, challenges and opportunities' Presented at the Supreme Court of New South Wales, 28 Mar 2007, at http://eprints.lse.ac.uk/62814

Black J. (2008) 'Forms and Paradoxes of Principles Based Regulation' Capital Markets Law Journal 3,4 (2008) 425-457, PrePrint at https://www.lse.ac.uk/law/research/working-paper-series/2007-08/WPS2008-13-Black.pdf

Black J. (2012) 'The Rise, Fall and Fate of Principles Based Regulation' Ch.1, pp. 3-34, in K. Alexander and N. Moloney (eds) 'Law Reform and Financial Markets' Edward Elgar, 2012, PrePrint at https://eprints.lse.ac.uk/32892/1/WPS2010-17_Black.pdf

Braithwaite J. (1982) 'Enforced self-regulation: A new strategy for corporate crime control' Michigan Law Review 80,7 (1982) 1466-507, at https://johnbraithwaite.com/wp-content/uploads/2016/03/Enforced_Self_1982.pdf

Braithwaite J. (2017) 'Types of responsiveness' Chapter 7 in Drahos (2017), pp. 117-132, at http://press-files.anu.edu.au/downloads/press/n2304/pdf/ch07.pdf

Brennen S. & Kreiss D. (2016) 'Digitalization and Digitization' International Encyclopedia of Communication Theory and Philosophy, October 2016, PrePrint at http://culturedigitally.org/2014/09/digitalization-and-digitization/

Cannon B. & Chung H. (2015) 'A Framework for Designing Co-Regulation Models Well-Adapted to Technology-Facilitated Sharing Economies' Santa Clara High Tech. L.J. 31,23 (2015), at: http://digitalcommons.law.scu.edu/chtlj/vol31/iss1/2

Clarke R. (1992) 'Extra-Organisational Systems: A Challenge to the Software Engineering Paradigm' Proc. IFIP World Congress, Madrid (September 1992), at http://www.rogerclarke.com/SOS/PaperExtraOrgSys.html

Clarke R. (1995a) 'A Normative Regulatory Framework for Computer Matching' Journal of Computer & Information Law XIII,4 (Summer 1995) 585-633, PrePrint at http://www.rogerclarke.com/DV/MatchFrame.html#IntrCtls

Clarke R. (1995b) 'The Regulation of Obscene Material on On-Line Information Services' Submission to a Senate Committee by the Australian Computer Society's Economic, Legal and Social Implications Committee, August 1995, at https://www.rogerclarke.com/II/OLIS.html

Clarke R. (1997) 'Regulating Financial Services in the Marketspace: The Public's Interests' Invited Address to the Conference of the Australian Securities Commission Conference on 'Electronic Commerce: Regulating Financial Services in the Marketspace', The Wentworth Hotel, Sydney, 4-5 February 1997, at http://www.rogerclarke.com/EC/ASC97.html

Clarke R. (1999) 'Internet Privacy Concerns Confirm the Case for Intervention' Communications of the ACM 42,2 (February 1999) 60-67, Special Issue on Internet Privacy, PrePrint at http://www.rogerclarke.com/DV/CACM99.html

Clarke R. (2001) 'Meta-Brands' Privacy Law & Policy Reporter 7, 11 (May 2001), PrePrint at http://www.rogerclarke.com/DV/MetaBrands.html

Clarke R. (2012) 'The Regulation of Surveillance' Xamax Consultancy Pty Ltd, February 2012, at http://www.rogerclarke.com/DV/SReg.html

Clarke R. (2014a) 'Surveillance by the Australian Media, and Its Regulation' Surveillance & Society 12,1 (Mar 2014) 89-107, at http://library.queensu.ca/ojs/index.php/surveillance-and-society/article/view/aus_media , PrePrint at http://www.rogerclarke.com/DV/MSR.html

Clarke R. (2014b) 'The Regulation of the Impact of Civilian Drones on Behavioural Privacy' Computer Law & Security Review 30, 3 (June 2014) 286-305, PrePrint at http://www.rogerclarke.com/SOS/Drones-BP.html#RN

Clarke R. (2016) 'Appropriate Regulatory Responses to the Drone Epidemic' Computer Law & Security Review 32,1 (Jan-Feb 2016) 152-155, PrePrint at http://www.rogerclarke.com/SOS/Drones-PAR.html

Clarke R. (2018) 'Guidelines for the Responsible Application of Data Analytics' Computer Law & Security Review 34, 3 (May-Jun 2018) 467- 476, PrePrint at http://www.rogerclarke.com/EC/GDA.html

Clarke R. (2019a) 'Principles and Business Processes for Responsible AI' Computer Law & Security Review 35, 4 (2019) 410-422, PrePrint at http://www.rogerclarke.com/EC/AIP.html

Clarke R. (2019b) 'Regulatory Alternatives for AI' Computer Law & Security Review 35, 4 (Jul-Aug 2019) 398-409, PrePrint at http://www.rogerclarke.com/EC/AIR.html

Clarke R. (2020) 'RegTech Opportunities in the Platform-Based Business Sector' Proc. Bled eConference, June 2020, pp. 79-106, PrePrint at http://rogerclarke.com/EC/RTFB.html

Clarke R. (2021) 'A Comprehensive Framework for Regulatory Regimes as a Basis for Effective Privacy Protection' Proc. 14th Computers, Privacy and Data Protection Conference (CPDP'21), Brussels, 27-29 January 2021, PrePrint at http://rogerclarke.com/DV/RMPP.html

Clarke R. (2022a) 'Research Opportunities in the Regulatory Aspects of Electronic Markets' Electronic Markets 32,1 (Jan-Mar 2022) 179-200, PrePrint at http://rogerclarke.com/EC/RAEM.html

Clarke R. (2022b) 'Responsible Application of Artificial Intelligence to Surveillance: What Prospects?' Information Polity 27,2 (Jun 2022) 175-191, Special Issue on 'Questioning Modern Surveillance Technologies', PrePrint at http://rogerclarke.com/DV/AIP-S.html

Clarke R. (2025a) 'Principles for the Responsible Application of Generative AI' Xamax Consultancy Pty Ltd, January 2025, at http://rogerclarke.com/EC/RGAI-C.html

Clarke R. (2025b) 'Feted AI Legislation (FAIL): An Evaluation of the EU AI Act against a Normative Framework for Regulatory Regimes' Working Paper, Xamax Consultancy Pty Ltd, April 2025, at http://rogerclarke.com/EC/RRE-AIA.html

Clarke R. & Bennett Moses L. (2014) 'The Regulation of Civilian Drones' Impacts on Public Safety' Computer Law & Security Review 30, 3 (June 2014) 263-285, PrePrint at http://www.rogerclarke.com/SOS/Drones-PS.html

Clarke R. & Greenleaf G.W. (2018) 'Dataveillance Regulation: A Research Framework' Journal of Law and Information Science 25, 1 (2018), PrePrint at http://www.rogerclarke.com/DV/DVR.html

Clarke R., Michael K. & Abbas R. (2024) 'Robodebt: A Socio-Technical Case Study of Public Sector Information Systems Failure' Australasian J. of Infor. Syst. 28 (September 2024) 1-42, at https://ajis.aaisnet.org/index.php/ajis/article/view/4681/1481

Clarke R. & Taylor K. (2018) 'Towards Responsible Data Analytics: A Process Approach' Proc. Bled eConference, 17-20 June 2018, PrePrint at http://www.rogerclarke.com/EC/BDBP.html

Drahos P. (ed.) (2017) 'Regulatory theory: Foundations and applications' ANU Press, 2017, at https://press.anu.edu.au/publications/regulatory-theory

Drahos P. & Krygier M. (2017) 'Regulation, institutions and networks' Ch. 1 in Drahos (2017), at http://press-files.anu.edu.au/downloads/press/n2304/pdf/ch01.pdf

Grabosky P. (2017) 'Meta-Regulation' Chapter 9 in Drahos (2017), pp. 149-161, at http://press-files.anu.edu.au/downloads/press/n2304/pdf/ch09.pdf

Greenleaf G. (1998) 'An Endnote on Regulating Cyberspace: Architecture vs Law?' University of New South Wales Law Journal 21,2 (1998) 593, at https://ssrn.com/abstract=2188160

Greenleaf G.W. et al. (2019) 'Regulation of digital platforms as part of economy-wide reforms to Australia's failed privacy laws' Australian Privacy Foundation, Submission to the Australian Government on implementation of the ACCC's Digital Platforms Inquiry - Final Report, at https://www.rogerclarke.com/DV/SSRN-id3443337.pdf

Gunningham N., Grabosky P. & Sinclair D. (1998) 'Smart Regulation: Designing Environmental Policy' Oxford University Press, 1998

Gunningham N. & Sinclair D. (2017) 'Smart Regulation', Chapter 8 in Drahos (2017), pp. 133-148, at http://press-files.anu.edu.au/downloads/press/n2304/pdf/ch08.pdf

Gupta A. & Lad L. (1983) 'Industry self-regulation: An economic, organizational, and political analysis' The Academy of Management Review 8, 3 (1983) 416-25

Hardin G. (1968) 'The Tragedy of the Commons' Science 162 (1968) 1243-1248, at http://cescos.fau.edu/gawliklab/papers/HardinG1968.pdf

Hardin G. (1994) 'Postscript: The tragedy of the unmanaged commons' Trends in Ecology & Evolution 9, 5 (May 1994) 199

Hedman J. & Henningsson S. (2016) 'Developing Ecological Sustainability: A Green IS Response Model' Information Systems Journal 26,3 (2016) 259-287

Hepburn G. (2006) 'Alternatives To Traditional Regulation' OECD Regulatory Policy Division, undated, apparently of 2006, at http://www.oecd.org/gov/regulatory-policy/42245468.pdf

Hosein G., Tsavios P. & Whitley E. (2003) 'Regulating Architecture and Architectures of Regulation: Contributions from Information Systems' International Review of Law, Computers and Technology 17, 1 (2003) 85-98

Land F. (2012) 'Remembering LEO' in A. Tatnall (ed.) 'Reflections on the History of Computing: Preserving Memories and Sharing Stories', AICT-387, Springer, 2012, pp.22-42, at https://inria.hal.science/hal-01526811/document

Lessig L. (1999) 'Code and Other Laws of Cyberspace' Basic Books, 1999

Moores T.T. & Dhillon G. (2003) 'Do privacy seals in e-commerce really work?' Communications of the ACM 46, 12 (December 2003) 265-271

Ostrom E. (1999) 'Coping with Tragedies of the Commons'  Annual Review of Political Science 2 (June 1999) 493-535, at https://www.annualreviews.org/doi/full/10.1146/annurev.polisci.2.1.493

Parker C. (2007) 'Meta-Regulation: Legal Accountability for Corporate Social Responsibility?' in McBarnet D, Voiculescu A & Campbell T (eds), The New Corporate Accountability: Corporate Social Responsibility and the Law, 2007

PC (2006) 'Rethinking Regulation' Report of the Taskforce on Reducing Regulatory Burdens on Business, Productivity Commission, January 2006, at http://www.pc.gov.au/research/supporting/regulation-taskforce/report/regulation-taskforce2.pdf

RBA (2023) 'The Global Financial Crisis' Reserve Bank of Australia, May 2023, at https://www.rba.gov.au/education/resources/explainers/the-global-financial-crisis.html

Sethi S.P. (1975) 'Dimensions of Corporate Social Performance: An Analytical Framework' California Management Review 17,3 (1975) 58-64

Sethi S.P. & Emelianova O. (2006) 'A failed strategy of using voluntary codes of conduct by the global mining industry' Corporate Governance 6, 3 (2006) 226-238

Shapiro S.A. (2012) 'Blowout: Legal Legacy of the Deepwater Horizon Catastrophe: The Complexity of Regulatory Capture: Diagnosis, Causality, and Remediation' Roger Williams Uni. L. Rev. 17, 1 (Winter 2012) 221-257, at http://docs.rwu.edu/rwu_LR/vol17/iss1/11

Smith A. (1776) 'The Wealth of Nations' W. Strahan and T. Cadell, London, 1776

Som C., Hilty L.M. & Ruddy T.F. (2004) 'The Precautionary Principle in the Information Society' Human and Ecological Risk Assessment 10 (2004) 787-799, at https://shorturl.at/QuWhc

Som C., Hilty L.M. & Koehler A.R. (2009) 'The Precautionary Principle as a Framework for a Sustainable Information Society' Journal of Business Ethics 85 (2009) 493-505, at https://philpapers.org/archive/SOMTPP

Stiglitz J. (2008) 'Government Failure vs. Market Failure' Principles of Regulation, Working Paper #144, Initiative for Policy Dialogue, February 2008, at http://policydialogue.org/publications/working_papers/government_failure_vs_market_failure/

Williamson O.E. (1979) 'Transaction-cost economics: the governance of contractual relations' Journal of Law and Economics 22, 2 (October 1979) 233-261

Wingspread (1998) 'Wingspread Statement on the Precautionary Principle' 1998, at http://sehn.org/wingspread-conference-on-the-precautionary-principle/

Wood D.J. (1991) 'Corporate Social Performance Revisited' Academy of Management Review 16,4 (1991) 691-718


Appendix 1: Scoring Schemes for a Regulatory Regime

No scoring scheme can ever be satisfactory, in the sense of being authoritative, and fair to, and acceptable to, all parties. This Appendix nominates several possible approaches to scoring any particular regulatory regime:

A. Use of the Criteria as a Tick-List

A1: Score each subjectively as a Yes or a No, delivering a score in the range 0/16 to 16/16. The value of the approach is undermined by the likely tendency of most assessments to fall in the grey area between the two extremes.

A2: Use a Strong / Adequate / Weak measure, scoring 1, 0.5 and 0 respectively.

B. Equally-Weighted Criteria

Score each subjectively on a scale of 0 to 6 on each of the criteria in Table 2, with up to 4 bonus-points assigned on the basis of the overall percentage: say, 4 above 80%, 3 for 65-80%, 2 for 55-65%, 1 for 40-55%, and 0 below 40%. This would implicitly weight all criteria equally heavily.
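A minimal sketch of this scheme's arithmetic, under the reading that each criterion is scored 0 to 6, the total is expressed as a percentage of the maximum, and bonus-points are then assigned by band; the example scores are arbitrary.

    # Sketch of Scheme B as read above: 16 criteria scored 0-6, converted to a percentage
    # of the maximum, then a 0-4 bonus-point band. The example scores are arbitrary.
    def scheme_b(scores):
        assert len(scores) == 16 and all(0 <= s <= 6 for s in scores)
        pct = 100 * sum(scores) / (6 * len(scores))
        if pct > 80:
            bonus = 4
        elif pct > 65:
            bonus = 3
        elif pct > 55:
            bonus = 2
        elif pct > 40:
            bonus = 1
        else:
            bonus = 0
        return pct, bonus

    if __name__ == "__main__":
        print(scheme_b([4] * 16))   # approximately 66.7%, hence 3 bonus-points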

C. Product and Outcomes Criteria more heavily weighted than Process

Score subjectively on each of the criteria in Table 2, but weighted differentially, for example under-weighting the Process in comparison with Product and Outcomes:

General Guidance on Scoring

Because of the diversity of areas of application, and to make some allowance for the inevitable vagaries of inherently politicised design processes, the top score in each area should be regarded as achievable rather than a counsel of perfection, and scores of zero should be reserved for the complete absence of any contribution.

Regulatory Layers and Mechanisms

Evaluation of the regulatory regime against each criterion should take into account relevant aspects of section 4, including the dependence of all self-regulatory mechanisms on incentives and disincentives to deliver on nominal undertakings. The scoring is for the regime as a whole, not for elements within it, particularly if otherwise positive features are standalone and lack support or reinforcement.

Players and Plays in Regulatory Regimes

Evaluation of the regulatory regime against each criterion should take into account relevant aspects of section 5, particularly the aspects that are likely to determine the effectiveness of the Regulator and associated entities in achieving policy objectives.

Principles and Rules

Evaluation of the regulatory regime against each criterion should take into account relevant aspects of section 6, specifically the extent to which Principles are clearly enunciated, and Rules are articulated in a manner that makes them readily applicable to compliance assessment.

Articulation Through Co-Regulation

Evaluation of the regulatory regime against each criterion should take into account relevant aspects of section 7, particularly the extent to which Regulatees and Beneficiaries achieve reflection of their justifiable interests in the Regime's architecture and the Principles and Rules.


Appendix 2: An Evaluation Template


Appendix 3A: RoboDebt


Appendix 3B: Uber


Appendix 3C: Spam Act


Acknowledgements

TEXT


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professorial Fellow associated with UNSW Law & Justice, and a Visiting Professor in Computing in the College of Systems & Society at the Australian National University.


