
Roger Clarke's 'Responsible AI'

Responsible AI Technologies, Artefacts, Systems and Applications
The 50 Principles
Cross-Referenced to the Source Documents

Version of 15 April 2019

This is a supporting document for 'Responsible AI: A Business Process and a Set of Principles'

See here for a PDF version of this Appendix, without cross-references

See here for an HTML version of this Appendix, without cross-references

This supersedes the version of 20 February 2019

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2019

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at

The following Principles apply to each entity responsible for each phase of AI research, invention, innovation, dissemination and application. The Principles were derived by consolidating elements from 30 international sources.

The cross-references following each Principle are to the 'Ethical Principles and IT' sources (Clarke 2018 - E) and 'Principles for AI' sources (Clarke 2019 - P).

1. Assess Positive and Negative Impacts and Implications

1.1 Conceive and design only after ensuring adequate understanding of purposes and contexts
(E4.3, P5.21, P6.1, P15.7, P17.5)

1.2 Justify objectives
(E3.25, P18.E1)

1.3 Demonstrate the achievability of postulated benefits
(Not found in any of the documents, but a logical prerequisite)

1.4 Conduct impact assessment, including risk assessment from all stakeholders' perspectives
(E7.1, P3.12, P4.1, P4.2, P5.21, P11.8, P17.5, P18.P0, P19.2, P20.1, P22.R6, P22.R7)

1.5 Publish sufficient information to stakeholders to enable them to conduct impact assessment
(E7.3, P3.7, P4.1, P7.3, P7.4, P7.7, P22.R5)

1.6 Conduct consultation with stakeholders and enable their participation in design
(E5.2, E7.2, E8.3, P3.7, P7.6, P7.7, P11.8, P19.2, P20.7, P22.R5)

1.7 Reflect stakeholders' justified concerns in the design
(E5.2, E8.3, P3.7, P11.8, P19.2)

1.8 Justify negative impacts on individuals ('proportionality')
(E3.21, E7.4, E7.5, P20.1, P22.E3)

1.9 Consider alternative, less harmful ways of achieving the same objectives
(E3.22, P22.R7)

2. Complement Humans

2.1 Design as an aid, for augmentation, collaboration and inter-operability
(P4.5, P9.1, P9.8, P14.2, P14.4, P22.E1)

2.2 Avoid design for replacement of people by independent artefacts or systems, except where those artefacts or systems are demonstrably more capable than people, and even then ensure that the result is complementary to human capabilities

3. Ensure Human Control

3.1 Ensure human control over AI-based technology, artefacts and systems
(E4.2, E6.1, E6.8, E6.19, P1.4, P2.1, P4.2, P5.16, P7.4, P8.5, P9.3, P12.1, P13.5, P15.4, P22.E1, P22.R1)

3.2 In particular, ensure human control over autonomous behaviour of AI-based technology, artefacts and systems
(E8.1, P7.4, P10.2, P11.4, P17.12, P18.P4, P22.R1)

3.3 Respect people's expectations in relation to personal data protections (E5.6, P8.3, P16.5, P18.P7, P19.4, P20.4, P21.5, P22.R3), including:
* their awareness of data-usage (E3.6)
* their consent (E3.7, E3.28, E5.3, P3.11, P4.6)
* data minimisation (E3.9)
* public visibility and design consultation and participation (E3.10, E7.2), and
* the relationship between data-usage and the data's original purpose (E3.27)

3.4 Respect each person's autonomy, freedom of choice and right to self-determination
(E2.1, E5.3, P3.3, P9.7, P11.3, P18.E3, P18.P6, P22.E1, P22.E2)

3.5 Ensure human review of inferences and decisions prior to action being taken
(E3.11, P17.2, P22.R1)

3.6 Avoid deception of humans
(E4.4, E6.20, P2.5, P18.E4, P20.2, P22.E1, P22.E3)

3.7 Avoid services being conditional on the acceptance of AI-based artefacts and systems
(P4.5, P18.E2, P22.E1, P22.E3, P22.R1, P22.R4)

4. Ensure Human Safety and Wellbeing

4.1 Ensure people's physical health and safety ('nonmaleficence')
(E2.2, E3.1, E4.1, E4.3, E5.4, E6.2, E6.9, E6.13, E6.14, E6.18, P1.2, P1.3, P2.1, P3.2, P3.6, P3.9, P3.12, P4.3, P4.9, P5.6, P7.4, P8.2, P9.4, P10.2, P11.4, P13.5, P14.1, P15.3, P17.8, P18.E2, P19.3, P20.2, P21.3, P22.E2)

4.2 Ensure people's psychological safety (E3.1, E6.9, E6.13, P18.E2, P22.E2), by avoiding negative effects on their mental health, emotional state, inclusion in society, worth, and standing in comparison with other people (E5.4, E6.3)

4.3 Contribute to people's wellbeing ('beneficence')
(E2.3, E3.20, E5.5, P3.1, P3.4, P5.1, P5.14, P5.15, P7.6, P8.1, P11.6, P12.2, P13.1, P15.1, P18.E1, P19.1, P19.7, P22.R1)

4.4 Implement safeguards to avoid, prevent and mitigate negative impacts and implications
(E3.24, E7.6, P5.21, P10.4, P22.R2)

4.5 Avoid violation of trust
(E3.3, P18.P0, P22.E4, P22.R2)

4.6 Avoid the manipulation of vulnerable people (E4.4, P4.5, P4.9, P22.E2), e.g. by taking advantage of individuals' tendencies to addictions such as gambling (E6.3, P18.P3), and to letting pleasure overrule rationality

5. Ensure Consistency with Human Values and Human Rights

5.1 Be just / fair / impartial, treat individuals equally (E2.4, E2.5, E3.2, E3.16, E3.29, P3.4, P8.4, P16.4, P18.E4, P18.P3, P19.5, P20.5, P21.1, P21.2, P22.E3, P22.R5), and avoid unfair discrimination and bias, not only where they are illegal, but also where they are materially inconsistent with public expectations (ICCPR Arts. 2.1, 3, 26 and 27, E3.16, P3.4, P4.5, P11.5, P15.2, P17.4, P18.P5)

5.2 Ensure compliance with human rights laws
(E4.2, P3.5, P3.9, P4.3, P8.1, P16.5, P18.E0, P19.5, P20.3, P22.R1)

5.3 Avoid restrictions on, and promote, people's freedom of movement
(ICCPR 12, P5.13)

5.4 Avoid interference with, and promote, privacy, family, home or reputation
(ICCPR 17, E5.6, P3.11, P5.12, P7.4, P9.6, P13.3, P15.5)

5.5 Avoid interference with, and promote, the rights of freedom of information, opinion and expression (ICCPR 19, P4.6), of freedom of assembly (ICCPR 21, P5.13), of freedom of association (ICCPR 22, P5.13), of freedom to participate in public affairs, and of freedom to access public services (ICCPR 25, P5.13)

5.6 Where interference with human values or human rights is outweighed by other factors, ensure that the interference is no greater than is justified ('harm minimisation')
(E3.9, E7.6, P1.3, P3.12, P22.R7)

6. Deliver Transparency and Auditability

6.1 Ensure that the fact that a process is AI-based is transparent to all stakeholders
(E4.4, P4.8, P16.3, P18.P6, P20.6, P22.E4, P22.R4)

6.2 Ensure that data provenance, and the means whereby inferences are drawn from it, decisions are made, and actions are taken, are logged and can be reconstructed
(E6.6, P2.4, P4.8, P5.7, P6.4, P6.6, P7.2, P9.2, P11.1, P11.2, P13.2, P16.3, P17.7, P18.P10, P19.6, P21.4, P22.E4, P22.R4, P22.R7)

6.3 Ensure that people are aware of inferences, decisions and actions that affect them, and have access to humanly-understandable explanations of how they came about
(E3.12, P2.4, P16.3, P17.1, P18.P6, P19.6, P22.E4, P22.R4)

7. Embed Quality Assurance

7.1 Ensure effective, efficient and adaptive performance of intended functions
(E6.2, E6.11, P1.6, P4.2, P8.7, P15.6, P21.3, P22.R2)

7.2 Ensure data quality and data relevance
(P10.3, P11.2, P17.7, P18.P2, P22.R3)

7.3 Justify the use of data, commensurate with each data-item's sensitivity
(E3.26, E7.4, P18.P7)

7.4 Ensure security safeguards against inappropriate data access, modification and deletion, commensurate with its sensitivity
(E3.15, P16.5, P17.9, P18.P8, P19.3, P22.R2)

7.5 Deal fairly with people (faithfulness, fidelity)
(E2.5, E3.2, P22.E3)

7.6 Ensure that inferences are not drawn from data using invalid or unvalidated techniques
(E3.5, P6.7)

7.7 Test result validity, and address the problems that are detected
(E3.5, P6.7, P9.2, P17.6, P18.P8, P18.P10)

7.8 Impose controls in order to ensure that the safeguards are in place and effective
(E7.7, P10.2, P18.P8)

7.9 Conduct audits of safeguards and controls
(E7.8, P9.3, P18.P1)

8. Exhibit Robustness and Resilience

8.1 Deliver and sustain appropriate security safeguards against the risk of compromise of intended functions arising from both passive threats and active attacks, commensurate with the significance of the benefits and the potential to cause harm
(E4.3, E6.11, P1.4, P1.5, P4.9, P5.6, P7.4, P9.5, P18.P8, P22.R2)

8.2 Deliver and sustain appropriate security safeguards against the risk of inappropriate data access, modification and deletion, arising from both passive threats and active attacks, commensurate with the data's sensitivity
(E3.15, E6.5, E6.10, P3.11, P9.6, P17.9, P18.P8, P22.R2)

8.3 Conduct audits of the justification, the proportionality, the transparency, and the harm avoidance, prevention and mitigation measures and controls
(E7.8, E8.4, P20.7, P22.R7)

8.4 Ensure resilience, in the sense of prompt and effective recovery from incidents
(P18.P8, P22.R2)

9. Ensure Accountability for Legal and Moral Obligations

9.1 Ensure that the responsible entity is apparent or can be readily discovered by any party
(E4.5, E6.4, P2.3, P3.8, P4.7, P7.5, P12.3, P17.3, P18.P1, P20.8, P22.E3)

9.2 Ensure that effective remedies exist, in the form of complaints processes, appeals processes, and redress where harmful errors have occurred
(ICCPR 2.3, E3.13, E3.14, E7.7, P3.11, P4.7, P6.2, P7.7, P9.9, P10.5, P11.9, P17.5, P18.P1, P20.7, P22.E3, P22.R7)

10. Enforce, and Accept Enforcement of, Liabilities and Sanctions

10.1 Ensure that complaints, appeals and redress processes operate effectively
(ICCPR 2.3, E7.7, P20.7, P22.E3)

10.2 Comply with external complaints, appeals and redress processes and outcomes (ICCPR 14), including, in particular, provision of timely, accurate and complete information relevant to cases


Clarke R. (2018) 'Ethical Principles and Information Technology' Xamax Consultancy Pty Ltd, rev. September 2018, at

Clarke R. (2019) 'Principles for AI: A 2017-19 SourceBook' Xamax Consultancy Pty Ltd, rev. April 2019, at

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.


Created: 11 July 2018 - Last Amended: 15 April 2019 by Roger Clarke - Site Last Verified: 15 February 2009