
A Generic Theory of Authentication
to Support IS Practice and Research

Early Draft of 9 January 2023

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2022-23

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.

This document is at


This paper addresses a yawning gap in IS theory and practice. In the information systems (IS) discipline and profession, the concept of authentication is commonly limited in scope to the checking of assertions relating to identity. The effective conduct of organised activities depends on the authentication of many other categories of assertion. The paper presents a generic theory of authentication, which includes assertions about identities and entities, but extends far beyond that narrow focus to encompass assertions relating to facts, content, value and attributes. The theory is based on a declared set of metatheoretic assumptions, but is expressed in terms designed to speak to IS practitioners and to IS researchers whose work is intended to do the same.


1. Introduction

Use of the term 'authentication' in contexts relevant to information systems (IS) practice and research is almost always limited to what is referred to here as '(id)entity authentication'. For example, in the large family of standards documents published by the Internet Engineering Task Force (IETF 2022), a device, or a process running in a device, 'authenticates itself' to another device or process. This is commonly done by declaring an identifier and demonstrating that the device or process has access to a secret (such as a password) that only that device or process is expected to know. The US government standards document extends beyond artefacts to encompass human entities and their identities, defining authentication as "Verifying the identity of a user, process, or device, often as a prerequisite to allowing access to resources in an information system" (NIST 2006, p.6). Magnusson (2022) provides a straightforward, commercial explanation of the NIST notion, using the definition "the process of verifying a user or device before allowing access to a system or resources".
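The IETF-style mechanism described above — declaring an identifier and demonstrating access to a secret — can be sketched as follows. This is an illustrative sketch only, not any particular protocol, and all names (SECRETS, respond, verify) are invented for the example:

```python
import hmac
import hashlib

# Secrets known to the verifier, keyed by declared identifier (invented data)
SECRETS = {"device-42": b"s3cret-key"}

def respond(secret, challenge):
    # The claimant proves knowledge of the secret by computing an HMAC
    # over a challenge, without ever transmitting the secret itself
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(identifier, challenge, response):
    # The verifier recomputes the HMAC and compares in constant time
    secret = SECRETS.get(identifier)
    if secret is None:
        return False
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = b"nonce-001"  # in practice, a fresh random nonce per attempt
assert verify("device-42", challenge, respond(b"s3cret-key", challenge))
assert not verify("device-42", challenge, respond(b"wrong-key", challenge))
```

The design point is that the verifier establishes only that the claimant has access to the secret, which is precisely the narrow '(id)entity authentication' sense of the term that this paper argues is too limited.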

There are contexts in which it is important to authenticate entities of various kinds (including humans, other living specimens such as livestock, artefacts and natural objects) and identities (including processes running within artefacts, and particular presentations or roles of humans and of artefacts). On the other hand, (id)entity is far from the only thing that needs to be authenticated (Clarke 2001, 2003). Hence, limiting the scope of the term has serious drawbacks. Examples of other things that need to be authenticated include statements of fact, claims of value, and declarations of authority.

The proposition that authentication needs to be interpreted broadly is unusual in the IS literature, to the extent that I have been able to identify very few sources. An inspection of the first 200 hits using a Google Scholar search on <authentication "information systems"> detected almost no uses other than in relation to (id)entity authentication. One exception was a small number of articles on watermarking for image provenance authentication. Another was a single throwaway sentence in a mainstream IS journal: "Authentication can be used to verify either the content of the message, the origin of the message, or the identity of the user" (Altinkemer & Wang 2011, p.394). Other examples found in the AIS electronic Library (AISeL) were Mattke et al. (2019) and Thomas & Negash (2023) who refer to transaction and asset authentication in the context of blockchain implementations, and Lausen et al. (2020) who refer to the authentication of claims made by financial professionals in relation to their previous employment and licences held.

In the IS literature, even the common, narrow interpretation as '(id)entity authentication' is little-discussed. For example, the string <authentication> is found in Title or Abstract in only 37 of over 17,000 refereed papers in AISeL, with the positivist term <verification> used in a further 13. (All hits provided by the AISeL search-engine require inspection, because of generic uses, specific usages distinct from the present one, and an over-generous synonym-table that treats 'authentic' and 'authenticity' as equivalents of 'authentication'.) I contend that the narrowness of the conventional conception is a major deficiency in IS practice, which IS researchers have failed to address and for which they must therefore bear partial responsibility.

This paper's purpose is to express a general theory of authentication that encompasses not only (id)entity authentication but also the many other circumstances in which the reliability of claims is assessed. The theory is intended for use to support both IS practice and research. To achieve that end, the analysis applies a previously-published, pragmatic metatheoretic model, comprising a working set of assumptions in each of the areas of ontology, epistemology and axiology (Clarke 2021). Partly because of its metatheoretic nature, and partly because of the paucity of IS literature on the topic, the resources that the analysis depends on are largely outside the IS mainstream.

The paper commences by briefly presenting the underlying pragmatic metatheoretic model. It then discusses the abstract notion of 'authentication', placing it within that model, and defining it as a process that establishes a level of confidence in an assertion. A number of categories of assertion are then outlined, and descriptions provided of the processes and criteria appropriate to each category. Implications of this theory of authentication are drawn for IS practice and for IS research.

2. The Underlying Pragmatic Metatheoretic Model

In this work, an IS is treated as "a set of interacting artefacts and human activities that performs one or more functions involving the handling of data and information" (Clarke 1990), or "a system which assembles, stores, processes and delivers information ... a human activity (social) system which may or may not involve the use of computer systems" (Avison & Fitzgerald 2006, p. 23). During the intervening decades, information technology (IT) has become so pervasive that the legitimacy of a technical view of IS is limited to specific kinds of highly-automated systems, and neither a wholly technical nor a wholly social view can provide a sufficient basis for understanding of the large majority of IS.

Key features of the necessary socio-technical system view are that organisations comprise people using technology, that each affects the other, and that effective design depends on integration of the two (Mumford 2006, Abbas & Michael 2022). Adopting the socio-technical view, "[t]he social and the technical should be apportioned comparable emphases" and treated "as two mutually interacting components" (Sarker et al. 2019, pp.697-698). Yet those authors' study of MISQ and ISR works concludes that "about 87% of the studies reviewed focused solely on instrumental outcomes" (p.704), by which the authors mean "higher productivity" for the benefit of the system sponsor (p.698), and that "by losing sight of humanistic goals, the IS discipline risks facilitating the creation of a dehumanized and dystopian society" (p.705).

A further commitment that I bring to this work is that IS research is concerned with advancing IS practice. However, this is even more unfashionable than the socio-technical view of IS: "while everyone pays lip service to the importance of applied research, when it comes time to publish it, the mainstream journals are generally unwilling participants ... Papers are routinely rejected from our journals because they fail to make a sufficient theoretical contribution" (Hirschheim 2019, pp.1349, 1351). The approach I have adopted, the model I have proposed, and the analyses based on that model, may therefore be relevant only to those readers who subscribe to this interpretation of the IS discipline.

The analysis of authentication presented in this paper builds on previous work that proposed a pragmatic metatheoretic position and model to support IS practice and research (Clarke 2021). This section provides a recapitulation of key aspects of that work. The model is referred to as 'metatheoretic' (Myers 2018, Cuellar 2020), on the basis that it draws on relevant areas of philosophy in which IS practitioners and theorists alike make 'metatheoretic assumptions', often implicitly, and sometimes consciously. Where the assumptions are both conscious and intentional, a more appropriate term for them is 'metatheoretic commitments'.

The model is also 'pragmatic', as that term is used in philosophy, that is to say it is concerned with understanding and action, rather than merely with describing and representing. The author's intention is instrumentalist, in this case not economically motivated but to achieve change in the worldviews of IS practitioners and researchers, and hence changes in behaviour and in the management of data. So the model needs to speak to IS practitioners, and to those IS academics who intend the results of their research to do the same. Figure 1 supports the textual explanations in the following sub-sections with a visual depiction of the key elements of the model.

Figure 1: A Pragmatic Metatheoretical Model

2.1 Ontology

Ontology is the branch of philosophy concerned with the study of existence. The approach adopted here is that a reality exists, outside of and independently of the human mind, where 'Phenomena' exist - a position commonly referred to as 'realism'. Humans cannot directly know or capture those Phenomena. They can, however, sense and measure those Phenomena, can create data reflecting them, and can construct models of them - an assumption related to the ontological assumption referred to as 'idealism'.

The pragmatic meta-model adopted in this paper, and depicted in Figure 1, accordingly distinguishes a 'Real World' from an 'Abstract World'. The Real World comprises 'Things' and 'Events', which have 'Properties'. These can be sensed by humans and artefacts with varying reliability. Authentication is a process whereby that reliability can be assessed.

2.2 Epistemology

Two contrasting conceptions of knowledge exist. The proposition of the first, 'empiricism', is that knowledge is derived from sensory experience, and is a body of facts and principles accumulated by humankind over the course of time, that are capable of being stored in the equivalent of a warehouse (Becker & Niehaves 2007, p.202). This works well in circumstances where the Things are inanimate, and their handling is essentially mechanical. Examples include aircraft guidance systems and robotic production-lines.

The other, 'apriorist' view is that knowledge is internal and personal, and the concept is not applicable outside the mind of an individual human (Becker & Niehaves 2007, p.202). Within this school of thought, knowledge is the matrix of impressions within which an individual situates newly acquired information. In order to cater for the two extremes of empiricism and apriorism, the term 'Knowledge' is usefully avoided, except when qualified by one of two adjectives:

A pragmatic metatheoretic approach must support modelling not only in contexts that are simple, stable and uncontroversial, but also where there is no expressible, singular, uncontested 'truth'. The assumption adopted is that both the empiricist and apriorist epistemological views are applicable, but in different circumstances.

Humans create Abstract Worlds. Some are imaginary, ranging from the variously tenable to simple fantasy. Those of relevance here are empirical, in the sense of being intended to model relevant aspects of the Real World. In Figure 1, Abstract Worlds are depicted as being modelled at two levels. The Conceptual Model level endeavours to reflect the modeller's perception of the Things, the Events and their Properties. The notion of an 'Entity' at the Conceptual Model level corresponds to a category of Things, and 'Transaction' to a category of Events. In the dialect used by ontologists, the term 'universal' corresponds to a category, and 'particular' refers to an instance. For example, in biology, the notion 'species' (e.g. African Elephant) is a universal, and the notion 'specimen' is a particular. An example that is more commonly relevant to IS is the category cargo-containers, which is a universal or Entity, whereas a specific cargo-container is a particular or 'Entity-Instance'. The ideas and terms used in this paper, and articulated further below, are similar to, but not necessarily identical with, related ideas and terms in the well-developed and diverse sub-discipline of conceptual modelling.

The abstract concept of 'Identity' corresponds to a particular presentation of a Thing, as arises when it performs a particular role. A 'Role' is a pattern of behaviour adopted by an Entity. An Entity may adopt one Identity in respect of each Role, or may use the same Identity when performing multiple Roles. An 'Identity-Instance' is a particular occurrence of an Identity. The Role of CEO is an Identity performed by different human Entities over time. The Role of Company Director is an Identity performed by multiple human Entities at the same time.

A 'Relationship' is a linkage between two elements within the metatheoretic model, represented in Figure 1 using an arrow. Each element at the Conceptual Level has 'Attributes' which contain 'Attribute-Values'. For example, the Entity cargo-containers has Attributes such as colour, owner, type (with Attribute-Values such as refrigerated and half-height), and various kinds of status (e.g. dirty or clean; and empty or loaded). A Relationship has the Attribute of cardinality, reflecting how many of each of the elements it links can be involved, with the Attribute-Values that can be adopted typically being zero, one or many. Transactions give rise to changes in the Attribute-Values of (Id)Entities.

The other level, referred to here as the Data Model, enables the operationalisation of the relatively abstract ideas in the Conceptual Model level. This moves beyond a design framework to fit with data-modelling and data management techniques and tools, and to enable specific operations to be performed to support organised activity. Central to this level is the notion of 'Data'. The term, used variously as a plural and as a generic noun, refers to a quantity, sign, character or symbol, or a collection of them, that is in a form accessible to a person and/or an artefact. A 'Data-Item' is a storage-location in which a discrete 'Data-Item-Value' can be represented. The term 'value', in this context, is a somewhat generalised form of "a numerical measure of a physical quantity" (OED I 4). For example, Entity-Attributes of cargo-containers may be expressed at the Data Model level as Data-Items and Data-Item-Values of Colour = Orange, Owner = MSK (indicating Danish shipping-line Maersk), Type = Half-Height, Freight-Status = Empty. A collection of Data-Items all of which relate to a single (Id)Entity- or Transaction-Instance is referred to as a 'Record'. A Data-Item or group of Data-Items that enables Record-Instances to be distinguished is referred to here as a 'Record-Key'.
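The mapping from Conceptual Model Attributes to Data Model Data-Items and Data-Item-Values can be made concrete with a minimal sketch. The class and field names below are invented for illustration; only the example Data-Item-Values are taken from the text:

```python
from dataclasses import dataclass

@dataclass
class ContainerRecord:
    """A 'Record': Data-Items all relating to a single Entity-Instance."""
    container_id: str     # the 'Record-Key', distinguishing Record-Instances
    colour: str           # Data-Items holding Data-Item-Values that
    owner: str            # operationalise the Entity's Attributes
    type: str
    freight_status: str

# One Record-Instance, using the paper's example Data-Item-Values
record = ContainerRecord(
    container_id="MSKU1234565",   # hypothetical Record-Key value
    colour="Orange",
    owner="MSK",                  # Danish shipping-line Maersk
    type="Half-Height",
    freight_status="Empty",
)
assert record.owner == "MSK"
```

The Record-Key (here, container_id) is what permits Record-Instances to be distinguished from one another, and is later central to cross-level assertions that link a Record to a particular Thing in the Real World.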

'Real-World Data' or 'Empirical Data' is data that purports to represent some Property of a Real-World Phenomenon. That is contrasted with 'Synthetic Data', which is Data that bears no direct relationship to any real-world phenomenon, such as the output from a random-number generator, or data created as a means of testing the performance of software under varying conditions. A special case of Synthetic Data is data generated from Empirical Data by some perturbation or substitution process.

Authentication is a process applied to assertions. 'Assertions' are putative expressions of knowledge about one or more elements of the metatheoretic model ("a positive statement; a declaration, averment" OED 5). The signifiers that make up assertions may involve:

  1. particular elements of the Real World, e.g.
    'A physical item was delivered to a location at a date and time';
  2. particular elements of an Abstract World.
    In that case, the elements involved may:
    1. be solely at the Conceptual Model level, e.g.
      'A delivery Transaction caused changes in the states of Entities representing stock-holdings and a customer order';
    2. be solely at the Data Model level, e.g.
      'A delivery Transaction-Record caused changes in Data-Items in the Entity-Records reflecting a particular stock-item and a particular customer order';
    3. link elements at both the Conceptual and Data Model levels, e.g.
      'A delivery Transaction caused a change in the state of an Entity-Instance representing stock-holdings and its corresponding Transaction-Record caused changes in Data-Items in the Entity-Record reflecting a particular customer order';
  3. a mapping between particular elements in both the Abstract and Real Worlds, e.g.
    'The Data-Record that contains a particular Record-Key [all of which is in the Abstract World] relates to a particular Thing [which exists in the Real World]'.
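The three contexts in the taxonomy above can be represented as tagged records, so that an Authentication process can later select Evidence appropriate to each context. All names here are invented for illustration, not drawn from the model itself:

```python
from dataclasses import dataclass

@dataclass
class Assertion:
    """A putative expression of knowledge, tagged with its context."""
    context: str    # "real-world", "abstract", or "cross-level"
    statement: str

# One invented example per context in the taxonomy
assertions = [
    Assertion("real-world",
              "A physical item was delivered to a location at a date and time"),
    Assertion("abstract",
              "A Transaction-Record caused changes in Data-Items"),
    Assertion("cross-level",
              "The Record with Record-Key K relates to a particular Thing"),
]
assert {a.context for a in assertions} == {"real-world", "abstract", "cross-level"}
```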

Beyond Data, the epistemological aspects of the pragmatic model comprise assumptions made about information, knowledge and wisdom. The term 'information' is used in many ways. Frequently, even in refereed sources, it is used without clarity as to its meaning, and often in a manner interchangeable with Data. The pragmatic model adopted in this paper uses the term 'Information' for a sub-set of Data: that Data that has value (Clarke 1992b). Data has value in only very specific circumstances. Until it is in an appropriate context, Data is not Information, and once it ceases to be in such a context, Data ceases to be Information.

2.3 Axiology

Axiology is the branch of philosophy that deals with 'values', in the sense of "the relative worth, usefulness, or importance of a thing" (OED II 6a). The values dominant in many organisations are operational and financial. However, many contexts arise in which there is a pressing need to recognise broader economic interests, and values on other dimensions as well, particularly social and environmental concerns.

At the outset, the position was declared that IS practice and research of necessity adopt a socio-technical system view. Organised activities depend on people, artefacts, and effective interactions among them. IS also affect people, including those participating in the system (conventionally called 'users') and some who are not participants in the system but are affected by it (usefully referred to as 'usees': Berleur & Drumm 1991, p.388; Clarke 1992a; Fischer-Huebner & Lindskog 2001; Baumer 2015). Examples of usees include people with records in shared industry databases, such as those for police suspects, tenants and insurees; and the conversation-partners of people whose voice and/or electronic communications are subjected to surveillance. Because of the presence and significance of users and usees in IS, human values are prominent.

Several approaches exist to the question of how to determine what values to apply:

Generally, the various stakeholders are likely to have at least some differences among their value-sets. In simple contexts, the value-conflicts that arise may all be of an economic nature, particularly between the system-sponsor's operational and financial objectives and the financial interests of users. However, a range of factors can give rise to much greater complexity in the assessment of utility. Important examples are where:

This section has drawn on the pragmatic metatheory described in Clarke (2021) to identify aspects relevant to the generic notion of authentication.

3. Authentication in Theory

This section proposes a working definition of the generic notion of 'authentication'. It first outlines the elements evident in dictionary definitions. It then considers the concept within the context of the pragmatic metatheoretic model outlined in the previous section, and other sources of insight.

3.1 Definition

The richest dictionary source records a wide range of interpretations of the verb 'authenticate' and of the object of the action or process of authentication (OED 1, 3a, 3b, 4a, 4b, 5, 6). The interpretations can be summarised as follows:

to { validate, approve, prove, confirm, establish as genuine/authentic, verify }

{ something/anything, a statement, an account, truth, existence, a reputed fact, a document, an artefact, an artwork, a user identity, a process identity }

IS practice and research have a narrower scope than a dictionary needs to encompass. There are, however, considerable variations in the contexts that need to be addressed.

This paper establishes a metatheoretic foundation for authentication in IS practice and research. The fundamental assumption of this work, relating to the necessity of adopting a socio-technical view of IS, has the corollary that it is untenable to assume that each, or any, assertion can be resolved by reference to a singular, accessible truth. It could be feasible to do so if the more extreme forms of positivism are adopted; but design science research, interpretivism and critical theory research are highly unlikely to be able to accommodate an uncompromising assumption of the existence of accessible truth. That leads to a preference to avoid language that implies truth, such as 'verification'. Instead, there is a need for fuzziness, or at least degrees of likelihood or reliability, quite possibly contingent on various factors.

The following is proposed as an operational definition:

Authentication is a process that establishes a degree of confidence in the reliability of an assertion.
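The operational definition can be illustrated by a minimal sketch in which authentication returns not a boolean but a degree of confidence, aggregated from items of Evidence. The function name, the Evidence items and the weights are all invented assumptions for the example:

```python
def authenticate(evidence, weights):
    """Combine weighted items of Evidence into a confidence score in [0, 1].

    evidence: dict mapping Authenticator name -> bool (did the check pass?)
    weights:  dict mapping Authenticator name -> relative weight
    """
    total = sum(weights.values())
    passed = sum(weights[name] for name, ok in evidence.items() if ok)
    return passed / total if total else 0.0

# Invented example: two of three checks succeed
confidence = authenticate(
    evidence={"credential-check": True, "challenge-response": True,
              "independent-source": False},
    weights={"credential-check": 0.3, "challenge-response": 0.5,
             "independent-source": 0.2},
)
assert 0.0 <= confidence <= 1.0
assert abs(confidence - 0.8) < 1e-9
```

The point of the sketch is the shape of the result: a degree of confidence rather than a claim of verified truth, consistent with the preference above for avoiding language that implies a singular, accessible truth.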

3.2 Theoretical Contexts in which Assertions Can Be Made

In section 2.2 above, a taxonomy of assertion-types was proposed, depending on the level, within the pragmatic metatheoretic model, of the elements involved in the assertion. Applying that model, there are three contexts in which an assertion may be made.

(1) At the Real-World Level

Assertions are expressed in terms of Things that are postulated to exist, and Events that are postulated to occur, and that have impacts on the Properties of Things. Hence 'A delivery of Things called stock-items was received by a category of Things called customers' or 'There are no relevant Things (stock-items) in the Thing (a storage-bin) that is allocated to that particular category of stock-items'. The nature of IS is such that this level is an external point-of-reference, not part of IS.

(2) At the Abstract, Informational Level

Some assertions are expressed in terms of (Id)Entities and Transactions and their Attributes. For example, 'A Transaction has occurred evidencing the delivery of a particular stock-item thereby affecting the stock-count Attribute of that Entity-Instance'. Other assertions are expressed in terms of the Data Model level, such as 'A Transaction-Record gives rise to changes in (Id)Entity-Records'.

Reasoning can be applied to assertions in order to infer further assertions. Classical logic, most relevantly in the form of the propositional calculus, supports only conclusions of right or wrong / true or false. Many-valued and fuzzy logics, on the other hand, recognise that propositions can have degrees of truth (Gottwald 2001). For example, three-valued logics permit an intermediate value, such as 'undefined' or 'not proven', and other forms of logic may recognise both 'no information' and 'conflicting information'.

Some logics support qualitative data, perhaps on a nominal but more likely an ordinal scale (e.g. unborn, young, old, dead), and others require discrete quantitative values, on an ordinal and possibly an interval or cardinal scale (with equal distances between consecutive values). A ratio scale requires a natural zero (cf. Kelvin for temperature, Stevens 1946), although assumptions are sometimes made (e.g. that a zero can be contrived at one end of a spectrum, such as 'False'; or that a mid-point such as 'Ambivalent' or even 'Don't Know' can suffice).
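A three-valued logic of the kind mentioned above can be sketched in a few lines, here following Kleene's strong three-valued conjunction, with None standing in for the intermediate value ('undefined' or 'not proven'):

```python
def k3_and(a, b):
    """Kleene strong three-valued conjunction over {True, None, False}."""
    if a is False or b is False:
        return False       # one definite False decides the conjunction
    if a is None or b is None:
        return None        # otherwise any 'undefined' keeps it undefined
    return True

# The intermediate value propagates unless a definite False is present
assert k3_and(True, True) is True
assert k3_and(True, None) is None
assert k3_and(False, None) is False
```

This shows why such logics matter for authentication: a conjunction of assertions, some of which are 'not proven', need not collapse to a forced true/false verdict.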

Assertions and inferences from them are of value to real-world decision-making. On the other hand, this Level can provide only limited assistance, because it simply assumes that the assertions and inferences are representative of the Real World. Logics of all kinds can do nothing to test the relationship between Data and the Real World. Further, where partial information exists, logics do little to enable an assessment of the likelihood or the degree of confidence in the reliability of assertions. This limitation applies not only to assertions at either the Conceptual or the Data Model level, but also to the third sub-category, which involves interaction, mapping or relationships between the Conceptual and Data Models within the Abstract World.

(3) Across the Real-World and Abstract Levels

In Figure 1, arrows represent Relationships, interactions or mappings between the Real-World and Abstract levels. Assertions can be made that straddle the levels. For example, 'A physical stock-count of Things in a particular storage-bin identified a mis-match between that count and the relevant Data-Item-Value'. This presents supporting evidence for the reliability or otherwise of an assertion of stock-holding counts (and, by inference, stock-holding values).
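The stock-count example of a cross-level assertion can be sketched as a comparison of a Real-World observation against the stored Data-Item-Value. The function name and the tolerance parameter are illustrative assumptions; the tolerance acknowledges that observations themselves have limited reliability:

```python
def stock_assertion_holds(physical_count, recorded_count, tolerance=0):
    """Compare a physical stock-count (Real World observation) against the
    Data-Item-Value held in the Record (Abstract World).

    Returns True if the cross-level assertion survives the check, i.e. the
    observation and the stored value agree within the stated tolerance.
    """
    return abs(physical_count - recorded_count) <= tolerance

# Agreement supports the assertion; a mis-match is evidence against it
assert stock_assertion_holds(physical_count=40, recorded_count=40)
assert not stock_assertion_holds(physical_count=38, recorded_count=40)
# A non-zero tolerance reflects known measurement error in the observation
assert stock_assertion_holds(physical_count=38, recorded_count=40, tolerance=2)
```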

The reliability of these kinds of assertions is critical to the effectiveness of an IS, because without that reliability the IS lacks connection with and relevance to the Real World.

4. Authentication in Practice

The previous section described a theory about the authentication of assertions. This section discusses its practical application.

4.1 Authentication Processes

Assertions at the Abstract, Informational Level discussed in section 3.2(2) can be subjected to evaluation. One form this can take is a check of compliance with the rules of logic, to detect whether a flaw exists in the chain of argument. Another is a check of the language used, to ensure that no misunderstandings have arisen from ambiguous language or conflation of terms.

More valuably, assertions that straddle the Real World and the Abstract World can be authenticated. Data-Item-Values can be compared with available observations of the Real World, purpose-designed observation and measurement can be conducted, and quality checks can be devised to ensure that the observations are of sufficient quality.

4.2 Evidence to Support Authentication Strength

In section 3.1, Authentication was defined as a process that establishes a degree of confidence in the reliability of an assertion. What degree of confidence is desirable, and what degree is achieved, vary widely, depending on the circumstances. This sub-section considers the nature of evidence by means of dictionary definitions and the perspectives of courts of law and law enforcement investigators.

The term 'Evidence' is used here to mean data that assists in determining the level of confidence in an assertion's reliability. This is closely related to OED III, 6: "Grounds for belief; facts or observations adduced in support of a conclusion or statement; the available body of information indicating whether an opinion or proposition is true or valid". An individual item of Evidence is usefully referred to as an 'Authenticator'. This broader use has yet to find its way into the OED, which indicates a strong form (a human guarantor) and recognises IT usage only in respect of user authentication. A common form of Authenticator is a 'Document', by which is meant content of any form and expressed in any medium, often text but possibly tables, diagrams, images, video or sound. Content on paper, or its electronic equivalent, continues to be a primary form.

Some Authenticators carry the imprimatur of some authority, such as a registrar or notary. These are usefully referred to as a 'Credential' ("any document used as a proof of identity or qualifications", OED B2). Common examples of Credentials are Documents issued by some kind of authority that has (or, in many cases, is merely assumed to have) undertaken some form of Authentication prior to issuing it, such as a birth certificate, certificate of naturalisation, marriage certificate, passport, driver's licence (and, in some jurisdictions, non-driver's 'licence'), employer-issued building security card, credit card, club membership card, statutory declaration, affidavit, or letter of introduction.

The term 'Token' refers to a recording medium on which useful data is stored. Tokens are usefully applied to the storage of machine-readable copies of (Id)Entifiers. Examples include 'identity cards' (especially 'photo-id'), turnaround documents, tickets issued to Natural Persons required to wait in a queue, machine-readable visual images (such as bar-codes or QR-codes) and machine-readable data-storage (such as a magnetic-stripe, solid-state memory, or transmission from an RFID-tag).

Tokens may also contain Authenticators generally, and Credentials in particular, typically using smartcards and 'dongles' designed for use in conjunction with computing devices. In that case, security features are necessary, in order to provide confidence in the validity of the Token and its contents, such as hidden graphic features to guard against forged Tokens, and cryptographic features to guard against manipulation of the content. Measures may also be needed in an endeavour to tie the Token in some manner with the particular Entity that is intended to be its exclusive user.

Where the subject of the Assertion is a passive natural object, artefact or even animal, the Authentication process is limited to checking the elements of the Assertion against Evidence that is already held, or is acquired from, or accessed at, some other source that is considered to be both reliable and independent of any party that stands to gain from masquerade or misinformation. On the other hand, humans, organisations (through their human agents) and active artefacts can participate in the Authentication process, in particular in a 'challenge-response' sequence. This involves a request or 'challenge' to the relevant party for an Authenticator, and an answer or action in response. Examples of Authenticators relevant to each of the important categories of Assertion are provided in the following section.
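The distinction between the two modes above can be sketched as follows: passive subjects are checked against independently held Evidence, while active parties can take part in a challenge-response sequence. The function names, the register and the challenge data are all invented for illustration:

```python
# Evidence from a source assumed reliable and independent (invented data)
INDEPENDENT_REGISTER = {"artwork-17": "oil on canvas, 1884"}
EXPECTED_RESPONSES = {"registered postal address?": "1 Example St"}

def authenticate_passive(subject_id, claimed_description):
    # Passive subject (natural object, artefact, animal): check the
    # Assertion's elements against independently held Evidence
    return INDEPENDENT_REGISTER.get(subject_id) == claimed_description

def authenticate_active(challenge, respond):
    # Active party (human, organisation, active artefact): issue a
    # challenge and evaluate the answer or action given in response
    return respond(challenge) == EXPECTED_RESPONSES.get(challenge)

assert authenticate_passive("artwork-17", "oil on canvas, 1884")
assert authenticate_active("registered postal address?",
                           lambda c: "1 Example St")
```

The sketch makes the structural difference plain: only the active case involves interaction with the subject of the Assertion itself.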

In legal proceedings, distinctions are drawn among testimony, documentary evidence, and physical evidence. The term 'probative' means "having the quality or function of proving or demonstrating; affording proof or evidence; demonstrative, evidential" or "to make an assertion likelier ... to be correct" (OED 2a). In the law of evidence, "'probative value' is defined to mean the extent to which the evidence could rationally affect the assessment of the probability of the existence of a fact in issue" (ALRC 2010, 12.21).

At law, a court is required to treat some kinds of assertion as 'rebuttable presumptions', to be treated as being reliable unless and until case-specific evidence is presented that demonstrates otherwise. This makes clear on which party the 'onus of proof' lies. In civil jurisdictions, the standard of proof is 'preponderance of the evidence' or 'preponderance of the probabilities', whereas in criminal jurisdictions the threshold of proof is 'beyond a reasonable doubt'.

The law also recognises the economic constraints on evidence-collection and authentication generally: "a decision is better if it is less likely to be erroneous, in light of the actual (but unknown) outcome of the decision that would be known if there were perfect information. The quality of the decision takes into account the magnitude of consumer harm from making the erroneous decision in addition to the probability of doing so. Decision theory similarly can be used to rationally decide how much information to gather. It does so by balancing the costs and benefits of additional imperfect information in terms of making better decisions" (Salop 2017, pp.12-13).

The legal perspective on evidentiary value in evaluating assertions is conditioned by the context: the court's decision on the reliability of an assertion is a binding determination. This is significantly different from the context of investigations by a law enforcement agency. An investigator seeks patterns or relationships in data, which at best point firmly towards the resolution of a case, and which desirably at least close off unproductive lines of enquiry or lead the investigator towards more promising ones. A degree of protection against spurious results is desirable, but the disincentives have to do with resource efficiency rather than with a wrong result in a civil or, especially, a criminal trial. Linked with this looser evidentiary standard is the concept of 'confirmation bias', which describes the tendency to notice evidence that supports a hypothesis rather than evidence that conflicts with it, and the even more problematic tendency to actively look for evidence that will support rather than refute a currently favoured proposition (Nickerson 1998).

See also Townsley M., Mann M. & Garrett K. (2011) 'The missing link of crime analysis: A systematic approach to testing competing hypotheses' Policing (Oxford) 5, 2 (2011) 158-171, at

4.3 Authentication Quality Management

A range of risk factors impinge on the quality of Authentication processes. Of especial importance is the need to achieve an appropriate balance between the harm arising from false positives / inclusions, which are Assertions that are wrongly accepted; and false negatives / exclusions, which are Assertions that are wrongly rejected.
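The balance between the two error types can be illustrated with a small sketch. The scores, costs and function names below are hypothetical; the point is that an acceptance threshold is chosen by weighting each error type by the harm it causes:

```python
def error_rates(genuine, impostor, threshold):
    """At a given acceptance threshold, compute the rate of false
    positives (impostor scores wrongly accepted) and false negatives
    (genuine scores wrongly rejected)."""
    fpr = sum(1 for s in impostor if s >= threshold) / len(impostor)
    fnr = sum(1 for s in genuine if s < threshold) / len(genuine)
    return fpr, fnr

def best_threshold(genuine, impostor, harm_fp, harm_fn):
    """Choose the acceptance threshold that minimises expected harm,
    weighting each error type by the (illustrative) cost of making it."""
    candidates = sorted(set(genuine) | set(impostor))
    def harm(t):
        fpr, fnr = error_rates(genuine, impostor, t)
        return harm_fp * fpr + harm_fn * fnr
    return min(candidates, key=harm)

# Hypothetical confidence scores emitted by some Authentication mechanism
genuine = [0.9, 0.8, 0.75, 0.6, 0.55]
impostor = [0.5, 0.4, 0.35, 0.2]
# If a successful masquerade (false positive) is judged ten times as
# harmful as a wrongful rejection, the threshold is set accordingly.
threshold = best_threshold(genuine, impostor, harm_fp=10.0, harm_fn=1.0)
```

Raising `harm_fp` pushes the threshold up (more false rejections, fewer masquerades accepted); raising `harm_fn` pushes it down, which is the trade-off the text describes.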

Sources of poor quality include accidental mistakes, and intentional acts, which comprise intentional false positives (e.g. masquerade or 'spoofing') and intentional false negatives (e.g. avoidance, undermining or subversion of (Id)Entification).

Where quality shortfalls occur, additional considerations come into play, including the following:

Quality is a substantially greater challenge where other parties are motivated to contrive false positives or false negatives. Safeguards are needed to limit the extent to which such parties may succeed in having Assertions wrongly accepted or wrongly rejected, in order to gain advantages for themselves or others.

The level of assurance of an Authentication mechanism depends on the extent of protections against abuse, and hence on whether it can be effectively repudiated by the entity concerned. It is conventional to distinguish multiple quality-levels of Authentication, such as unauthenticated, weakly authenticated, moderately authenticated and strongly authenticated. Organisations generally adopt 'risk management' approaches, which accept lower levels of assurance in return for processes that are less expensive, more practical, easier to implement and use, and less intrusive (Altinkemer & Wang 2011).

5. Categories of Assertion

The analysis to date has considered the generic concept of authentication of a generic notion of assertions. In practice, there are many kinds of assertions, and a theory to support IS practice and research needs to encompass them and their differences. This section provides a preview, identifies key categories, and describes them in sufficient detail to enable application of the generic theory of authentication advanced in this paper, and to demonstrate its effectiveness and usefulness in a variety of contexts.

5.1 A Preview

A recitation of categories is necessarily lengthy and laborious. This preview uses the archetypal cartoon in Figure 2, which captured the mood at the dawn of the cyberspace era.

Figure 2: A Dog's (Id)Entities

Steiner P., The New Yorker, 5 July 1993

Several elements relevant to this paper can be discerned in the cartoon:

The cartoonist denies that the cartoon was intended to convey any deep insights. Despite that, it encapsulates the ideas that the Internet user has only a limited amount of information available, and needs to be cautious about assumptions they make. In particular situations, there are various things worth authenticating before reliance is placed on those assumptions. Examples include where confidences are uttered to the conversation-partner, credit card details are provided to them, a file is opened or a hyperlink clicked on, and information is replayed onwards, such as investment tips and rumours about celebrities.

Examples of Assertions that would benefit from some kind of checking or confirmation include:

The following sub-sections provide a less informal treatment of these ideas, commencing with the more generic and simpler categories, and leaving those with greater complexity and difficulty until last.

5.2 Fact Assertions

The fundamental form is an assertion of fact. A common example is of the following form:

A Real World Thing is reliably represented by an Abstract World Entity-Instance-Attribute-Value or an Entity-Record-Data-Item-Value, e.g.

'The number of Widgets Class A that we have in stock is 37'
[as recorded in the Current-Stock-Count Data-Item in the stock file].

The function of authentication is to establish a degree of confidence in the reliability of an assertion. In the case of an assertion about stock-holdings, the assertion may be accepted at face-value (because the inventory IS is regarded as reliable), or (particularly if the stock-item in question is subject to pilfering or breakages) the bin-contents may be checked for at least rough equivalence with the number recorded in the stock-file.
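As a minimal sketch of this 'rough equivalence' check, the function below accepts the stock Assertion when the physical bin-count falls within a tolerance band of the recorded value (the function name and the 10% tolerance are illustrative assumptions):

```python
def authenticate_stock_assertion(recorded_count: int,
                                 physical_count: int,
                                 tolerance: float = 0.10) -> bool:
    """Accept the Assertion 'the Current-Stock-Count Data-Item reliably
    represents the bin contents' if the physical count is within a
    tolerance band of the recorded value (rough equivalence, not an
    exact match, since pilfering and breakages are anticipated)."""
    if recorded_count == 0:
        return physical_count == 0
    return abs(physical_count - recorded_count) <= tolerance * recorded_count

# The Assertion 'Current-Stock-Count = 37' is accepted if a bin-check
# finds, say, 35 Widgets (within 10%), but rejected if it finds only 20.
assert authenticate_stock_assertion(37, 35)
assert not authenticate_stock_assertion(37, 20)
```

The tolerance parameter expresses the degree of confidence the organisation requires before relying on the Assertion; a stock-item prone to pilfering might warrant a tighter band or an exact count.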

Multiple other forms of fact assertions exist, such as the assignment of an instance to a category. In the case of 'This customer is a frequent-buyer who qualifies for the loyalty discount', the claim might be regarded as legitimate simply because the customer's face is recognised by the employee at the checkout, or the customer has logged in online and their profile has the loyalty-program indicator set. Alternatively, some information might be sought, from the customer or elsewhere, to provide adequate assurance that the claim is justifiable.

In relation to well-structured data, a framework for assessing data quality is presented in Clarke (2016, pp.77-80). It draws on a range of sources, importantly Huh et al. (1990), Wang & Strong (1996), Mueller & Freytag (2003) and Piprani & Ernst (2008). Each of the first group of 7 'data quality factors' can be assessed at the time of data acquisition and subsequently, whereas those in the second group of 6 'information quality factors' can only be judged at the time of use.

In the case of less structured, textual data, complex patterns arise, as evidenced by the prevalence of propaganda, misinformation, rumour-mongering, 'false news', 'alternative facts', 'fact checkers' and 'explainers'. Meanwhile, the scope for both varied interpretations and abuse of image, video and audio has exploded in the digital era.

5.3 Content Integrity Assertions

In some circumstances, a material risk may exist that a message containing an assertion has been garbled or subjected to interference. For example, word-of-mouth messages are subject to the 'Chinese whispers' phenomenon, whereby small changes in the wording arise at each step in the communication chain, resulting in a materially different interpretation of the assertion from what the originator intended. Messages over wired and wireless telecommunications infrastructures also involve risk of accidental corruption and falsification. Risks arise with the transmission of email and other message-forms, and with downloads of documents, images, video-files, transactions and software.

A common technique used for authentication is the computation of a hash of the message on despatch and on arrival, and a check that the two hashes are the same (NIST 2015). A simplistic form of hash is a count of the number of characters in the message; a less simplistic one converts every character to a unique number and sums those numbers. A more complex mechanism for content integrity authentication for software utilises encryption (Sander 2021), and another, for transactions, uses encryption coupled with multiple, independent storage locations (Dai & Vasarhelyi 2017).
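The despatch-and-arrival hash check can be sketched as follows, here using SHA-256 as the hash function (the function names are illustrative):

```python
import hashlib

def despatch(message: bytes) -> tuple[bytes, str]:
    """The sender computes a cryptographic hash of the message and
    transmits it alongside the message."""
    return message, hashlib.sha256(message).hexdigest()

def verify_on_arrival(message: bytes, claimed_hash: str) -> bool:
    """The receiver recomputes the hash from the message as received;
    a mismatch indicates garbling or interference along the channel."""
    return hashlib.sha256(message).hexdigest() == claimed_hash

msg, digest = despatch(b"Pay the sum of $100 to A. N. Other")
assert verify_on_arrival(msg, digest)                                     # intact
assert not verify_on_arrival(b"Pay the sum of $900 to A. N. Other", digest)  # altered
```

Note that an unkeyed hash of this kind guards against accidental corruption; against deliberate falsification, the hash itself must be protected, for example by a digital signature, as in the encryption-based mechanisms cited above.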

5.4 Liquid-Asset Value Assertions

Commerce, whether electronic, mobile or entirely physical, depends on the reliable transfer of value to the seller, most commonly of money ("Any generally accepted medium of exchange which enables a society to trade goods without the need for barter", OED 1; "Means of payment considered as representing value or purchasing power", OED 2a), whether in the form of currency or an irrevocable transfer into an account with a trusted third party.

Examples of Value Authentication for liquid assets include:

5.5 Non-Liquid-Asset Value Assertions

Commerce depends on confidence by each transacting party in the value being transferred by the other. The previous section addressed Value Authentication where the value is being transferred in the form of money. That encompassed transactions in which money is traded for a non-monetary tradable item, and money traded for money, in particular currency-conversion transactions. This section addresses the remaining categories, viz. a non-monetary tradable item traded for money, and barter transactions. Examples of tradable items include consumer durables, livestock, artworks, motor vehicles and 'collectibles', but also invisibles such as shares, loan agreements, and insurance.

The Value Assertion is that the tradable item has economic value, or has Attributes that underpin economic value. Examples of Authenticators applicable to Assertions relating to such tradable items include warranty cards for consumer durables, valuation reports in relation to real estate, inspectors' reports on livestock, and pedigree certificates for pets and breeding stock. Trading in shares and other invisibles may be heavily dependent on the reputation of the other party, and the existence of effective regulatory protections and recourse.

In the case of valuable artworks, Authenticators may be sought that purport to document the work's origins, nature, provenance (in particular, chain of ownership), and current ownership. The extent to which such metadata generates confidence in the reliability of an Assertion of the artwork's authenticity depends on the credibility of the Authenticator. Evidence that makes the Authenticator appear to be a Credential from a highly-reputed intermediary (such as letterhead, a physical signature, a company seal, or a digital signature) is likely to generate far more confidence than a verbal assurance or a poor-quality, uncertified photocopy. Another mainstream example of strong evidence is an unbroken series of contracts relating to transfers between successive owners.

In a great deal of conventional commerce, Value Authentication is a primary means whereby trust is achieved, with little dependence on the (Id)Entity of the other party. In eCommerce, on the other hand, an aberration has arisen: during the early decades of Internet-facilitated commerce, the sole practical payment mechanism was the transmission of credit card details, which carry an Identifier of the cardholder.

Payment mechanisms that do not have an Identifier associated with them have been conceived, designed, prototyped, implemented, and trialled, but have not yet been widely adopted. One reason for slow adoption of nymous value-transfer is the opposition of tax collection agencies, which rely on the ability to associate the arrival of funds that represent taxable income with the (Id)Entity that received them. There is, of course, scope for techniques that deliver pseudonymous value-transfer, with the protections against (Id)Entification readily surmountable by the relevant tax agency, but not by other organisations and individuals. Chaumian eCash operated in this manner during the period 1995-98 (Chaum 1985), whereas, since 2014, many digital currencies have been implemented that are somewhat resistant to visibility to taxation agencies.

5.6 Attribute Assertions

An Attribute Assertion is of the form:

A Real World Thing has a Property that is appropriately represented by an Entity-Instance-Attribute-Value and/or an Entity-Record-Data-Item-Value.

Job-applicants make Attribute Assertions in relation to their qualifications, experience and previous employment. Their claims of having a particular qualification can be tested against a testamur or by look-up of an educational institution's database. To guard against masquerade, the association between the person and the qualification-evidence may also need to be tested. An assertion that a person qualifies for a trade discount at a retail outlet may be authenticated by evidence of a trade qualification or a company letterhead.

If the Real World Thing is a gem, its claimed weight in carats can be tested by independent measurement. Alternatively, some kind of Credential may be inspected, such as a jeweller's valuation. A party that accepts a Credential as being sufficient evidence is referred to as a Relying Party. A Relying Party may have recourse against the issuer of the Credential, under consumer protection or contract law.

5.7 Assertions Involving (Id)Entity

It was noted at the beginning of this paper that assertions that involve an Entity or Identity are the sole focus of almost all discussions of authentication in both IS practice and research. Discussion of these categories has been deferred until very late in the analysis. They are complex to describe and to understand, and challenging to implement effectively. In the case of human Entities, they are also inevitably invasive of personal space. Because of those characteristics, it is highly advantageous to all concerned for authentication activities to focus on other kinds of assertions if they are capable of satisfying the need.

However, various circumstances exist in which assertions involving (Id)Entity do require authentication. The primary category is:

An Assertion that a particular Thing in the Real World is appropriately associated with a particular (Id)Entity-Instance at the Conceptual Level of an Abstract World, and/or with one or more particular (Id)Entity-Records at the Data Level

This is performed by demonstrating a reliable match between a Property of the Thing and an Attribute-Value of the (Id)Entity-Instance and/or a Data-Item-Value of an (Id)Entity-Record-Key. The complexities are such that this is the topic of two separate full-length papers on (Id)Entities and (Id)Entification (Clarke 2022) and their Authentication (Clarke 2023).

Other categories involving (Id)Entity include:

Many circumstances exist in which the Credential identifies the person. This is not actually necessary, however. All that is needed is some means whereby the Credential is reliably associated with the (Id)Entity presenting the Credential. For example, a challenge for information can be sufficient to establish that a person qualifies for entry to secure premises, without even knowing their (Id)Entity, let alone authenticating it. Rather than asking 'who goes there?', night-guards are instructed to issue a challenge that requires a specific counter-sign or password (DA 1971).

Even where the process of Attribute Authentication involves the provision of an (Id)Entifier, there may be no need to record anything more than the fact that Authentication was performed. In this way, the Transaction is not identified. One example of this is the inspection of so-called 'photo-id' and the birthdate displayed on it, without recording the (Id)Entifier displayed on the card. In addition, IS can be designed to authenticate each party's eligibility to conduct a Transaction (e.g. is a member of an appropriate category, or has a particular Attribute) without disclosing the party's (Id)Entity. An important example of a technique expressly designed for this purpose is U-Prove digital certificates, which authenticate an (Id)Entity's Attributes without disclosing any (Id)Entifiers (Brands 2000).
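The 'photo-id' example can be sketched as follows: the inspecting system derives and records only the boolean outcome of the Authentication, never the (Id)Entifier displayed on the card (a minimal illustration; the names and the audit-log arrangement are assumptions for the example):

```python
from datetime import date

def admit_if_of_age(birthdate: date, threshold_years: int = 18) -> bool:
    """Inspect the birthdate displayed on a 'photo-id' and return only
    whether the holder qualifies; the card's (Id)Entifier is never read
    into the system."""
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age >= threshold_years

# Only the fact that Authentication was performed, and its outcome,
# are recorded; the Transaction itself remains unidentified.
audit_log = [{"authenticated": admit_if_of_age(date(1990, 5, 1))}]
```

This is a procedural approximation of the design principle; cryptographic schemes such as U-Prove achieve the same outcome with far stronger assurance, because the Relying Party receives a proof of the Attribute rather than sight of the whole document.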

6. Implications and Conclusions

This paper has presented a generic theory of authentication to support IS practice and practice-oriented research. It reflects a pragmatic metatheoretic model comprising assumptions about the Real World, and a two-layer view of the Abstract, Informational World. It defines authentication as a process whereby a level of confidence is achieved in an assertion. It provides a framework for the design of processes, with exemplars of its application. It identifies categories of assertions that are relevant to IS design.

The generic theory of authentication is relevant to IS practice, uses terms familiar to practitioners, is understandable by practitioners, and can provide guidance in design and refinement of business processes that perform authentication. Among its important implications for practice is the focus on identifying assertions that are relevant to the need, whose reliability can provide the necessary assurance, and that can be the subject of practical, effective, efficient and inexpensive authentication processes. It encompasses the authentication of assertions of (id)entity, but recognises the challenges, expense and intrusiveness inherent in that undertaking, and encourages IS designers to pause and consider whether other categories of assertion are a more appropriate focus.

The theory has multiple implications for research. It outlines a large domain in which theory has been lacking. It identifies many sub-domains in which articulation of the theory is needed, which can contribute to more efficient and effective designs by IS practitioners. It invites the development of contingency theories that define the circumstances under which each of the various categories of assertion needs to be prioritised for consideration by designers. It draws the attention of academics away from the humdrum 'normal science' of filling gaps in abstruse theories dealing in constructs that are meaningless to IS practitioners, managers and executives, and that contribute little or nothing to humankind.

Reference List

Abbas R. & Michael K. (2022) 'Socio-Technical Theory: A review' In S. Papagiannidis (Ed), 'TheoryHub Book', TheoryHub, 2022, at

ALRC (2010) 'Uniform Evidence Law' Australian Law Reform Commission, Report 102, August 2010, at

Altinkemer K. & Wang T. (2011) 'Cost and benefit analysis of authentication systems' Decision Support Systems 51 (2011) 394-404

Avison D. & Fitzgerald G. (2006) 'Information Systems Development - Methodologies, Techniques & Tools' McGraw Hill, 4th ed., 2006

Baumer E.P.S. (2015) 'Usees' Proc. 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI'15), April 2015, at

Becker J. & Niehaves B. (2007) 'Epistemological perspectives on IS research: a framework for analysing and systematizing epistemological assumptions' Info Systems J 17 (2007) 197-214

Berleur J. & Drumm J. (Eds.) (1991) 'Information Technology Assessment' Proc. 4th IFIP-TC9 International Conference on Human Choice and Computers, Dublin, July 8-12, 1990, Elsevier Science Publishers (North-Holland), 1991

Brands S.A. (2000) 'Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy' MIT Press, 2000

Chaum D. (1985) 'Security Without Identification: Transaction Systems To Make Big Brother Obsolete' Communications of the ACM 28, 10 (October 1985) 1030-1044, at

Clarke R. (1990) 'Information Systems: The Scope of the Domain' Xamax Consultancy Pty Ltd, 1990, at

Clarke R. (1992a) 'Extra-Organisational Systems: A Challenge to the Software Engineering Paradigm' Proc. IFIP World Congress, Madrid, September 1992, PrePrint at

Clarke R. (1992b) 'Fundamentals of 'Information Systems'' Xamax Consultancy Pty Ltd, September 1992, at

Clarke R. (1994) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7,4 (December 1994) 6-37, PrePrint at

Clarke R. (2001) 'The Fundamental Inadequacies of Conventional Public Key Infrastructure' Proc. Conf. ECIS'2001, Bled, Slovenia, 27-29 June 2001, PrePrint at

Clarke R. (2003) 'Authentication Re-visited: How Public Key Infrastructure Could Yet Prosper' Proc. 16th Bled eCommerce Conf., June 2003, PrePrint at

Clarke R. (2009) 'A Sufficiently Rich Model of (Id)entity, Authentication and Authorisation' Proc. IDIS 2009 - The 2nd Multidisciplinary Workshop on Identity in the Information Society, LSE, London, 5 June 2009, at

Clarke R. (2016) 'Big Data, Big Risks' Information Systems Journal 26, 1 (January 2016) 77-90, PrePrint at

Clarke R. (2021) 'A Platform for a Pragmatic Metatheoretic Model for Information Systems Practice and Research' Proc. Austral. Conf. Infor. Syst. (ACIS), December 2021, PrePrint at

Clarke R. (2022) 'A Reconsideration of the Foundations of Identity Management' Proc. 35th Bled eConference, June 2022, pp.1-30, PrePrint at

Clarke R. (2023) 'The Authentication of Assertions of (Id)Entity' Working Paper, Xamax Consultancy Pty Ltd, January 2023, at XXXXXXXXXXX

Cuellar M.J. (2020) 'The Philosopher's Corner: Beyond Epistemology and Methodology - A Plea for a Disciplined Metatheoretical Pluralism' The DATABASE for Advances in Information Systems 51, 2 (May 2020) 101-112

DA (1971) 'Guard Duty' FM 22-6, Department of the [US] Army, September 1971, at

Dai J. & Vasarhelyi M.A. (2017) 'Toward Blockchain-Based Accounting and Assurance' Journal of Information Systems 31,3 (Fall 2017) 5-21, at

Fischer-Huebner S. & Lindskog H. (2001) 'Teaching Privacy-Enhancing Technologies' Proc. IFIP WG 11.8 2nd World Conf. on Information Security Education, Perth, Australia

Gottwald S. (2001) 'A Treatise on Many-Valued Logics', January 2001, at

Hirschheim R. (2019) 'Against Theory: With Apologies to Feyerabend' Journal of the Association for Information Systems 20, 9 (2019) 1340-1357, at

Huh Y.U., Keller F.R., Redman T.C. & Watkins A.R. (1990) 'Data Quality' Information and Software Technology 32, 8 (1990) 559-565

IETF (2022) 'RFCs' Internet Engineering Task Force (IETF), December 2022, at

Lausen J., Clapham B., Siering M. & Gomber P. (2020) 'Who Is the Next 'Wolf of Wall Street'? Detection of Financial Intermediary Misconduct' Journal of the Association for Information Systems 21, 5 (2020)

Magnusson A. (2022) 'The Definitive Guide to Authentication' Strong DM, September 2022, at

Mattke J., Maier C., Hund A. & Weitzel T. (2019) 'How an Enterprise Blockchain Application in the U.S. Pharmaceuticals Supply Chain is Saving Lives' MIS Quarterly Executive 18, 4 at 6

Miscione G., Ziolkowski R., Zavolokina L. & Schwabe G. (2018) 'Tribal Governance: The Business of Blockchain Authentication' Proc. Hawaii International Conference on System Sciences (HICSS-51), 3-6 January 2018, at

Mueller H. & Freytag J.-C. (2003) 'Problems, Methods and Challenges in Comprehensive Data Cleansing' Technical Report HUB-IB-164, Humboldt-Universität zu Berlin, Institut fuer Informatik, 2003, at

Mumford E. (2006) 'The story of socio-technical design: reflections on its successes, failures and potential' Info Systems J 16 (2006) 317-342, at

Myers M.D. (2018) 'The philosopher's corner: The value of philosophical debate: Paul Feyerabend and his relevance for IS research' The DATA BASE for Advances in Information Systems 49, 4 (November 2018) 11-14

Nickerson R.S. (1998) 'Confirmation Bias: A Ubiquitous Phenomenon in Many Guises' Review of General Psychology 2, 2 (1998), at

NIST (2006) 'Minimum Security Requirements for Federal Information and Information Systems' Federal Information Processing Standard (FIPS) PUB 200, National Institute of Standards and Technology Gaithersburg, March 2006, at

NIST (2015) 'hashing' Computer Security Resource Center, [US] National Institute of Standards and Technology, 2015, at

Piprani B. & Ernst D. (2008) 'A Model for Data Quality Assessment' Proc. OTM Workshops (5333) 2008, pp 750-759

Salop S.C. (2017) 'An Enquiry Meet for the Case: Decision Theory, Presumptions, and Evidentiary Burdens in Formulating Antitrust Legal Standards' Georgetown University Law Center, 2017, at

Sander R. (2021) 'What is a Code Signing Certificate? How does it work?' IEEE Computer Society, August 2021, at

Sarker S., Chatterjee S., Xiao X. & Elbanna A. (2019) 'The Sociotechnical Axis of Cohesion for the IS Discipline: Its Historical Legacy and its Continued Relevance' MIS Qtly 43, 3 (September 2019) 695-719

Stevens S.S. (1946) 'On the Theory of Scales of Measurement' Science 103, 2684 (7 June 1946), at

Thomas D. & Negash S. (2023) 'Emerging Technology IS Course Design: Blockchain for Business Example' Communications of the Association for Information Systems, 52 (2023)

Wang R.Y. & Strong D.M. (1996) 'Beyond Accuracy: What Data Quality Means to Data Consumers' Journal of Management Information Systems 12, 4 (Spring, 1996) 5-33


This paper builds on a long series of publications on the topic of authentication, including Clarke (1994, 2003, 2009).

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor associated with the Allens Hub for Technology, Law and Innovation in UNSW Law, and a Visiting Professor in the Research School of Computer Science at the Australian National University.


Created: 15 December 2022 - Last Amended: 9 January 2023 by Roger Clarke