Version of 30 May 2024
© Xamax Consultancy Pty Ltd, 2023-24
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://rogerclarke.com/ID/GT-IdEM-Annex.html
It is an Annex to http://rogerclarke.com/ID/GT-IdEM.html
This is an Annex to an article whose overall purpose is to present a theory of Id/Entity Management (Id/EM). The theory is derived from a Pragmatic Metatheoretic Model of the field addressed by IS practice, extended to embody theoretical treatments of Id/Entities, the Authentication of Assertions, and the critical elements of Authorization and Access Control. The theory has many features that evidence some divergence from conventional Identity Management (IdM) practices.
This Annex considers how the theory can be applied to achieve improvements in the quality of Authorization and Access Control.
No single Industry Standard dominates the IdM field, and no single IdM product or service dominates the market. The approach that has been adopted is to consider several key sources that have had, and continue to have, substantial impact on industry practices. A few references are made to the IETF Security Glossary (IETF 2000, 2007), but the primary focus is on standards documents that have significant influence and whose guidance has been available for sufficiently long to have been reflected in contemporary products, services and practices. The primary sources considered in this section are ISO/IEC 24760-1 (2019), NIST's Digital Identity Guidelines (NIST 2017), the US government's Personal Identity Verification standard (FIPS-201-3 2022), NIST's guide to Attribute Based Access Control (NIST800-162 2014) and the FIDO Alliance specifications.
The various standards differ considerably in their purposes, approaches, language and structures. This section accordingly adopts a thematic approach, commencing with foundational matters, and then moves on to specific aspects.
The Id/EM theory presented in this article is based on express metatheoretic assumptions that adopt a truce between the two alternative ontological assumptions, the 'materialist' assumption (that phenomena exist in a real world) and the 'idealist' approach (that everything exists in the human mind). It adopts the notion that a Real World of Phenomena exists, on one plane, which humans cannot directly know or capture, but which humans can sense and measure, such that they can construct manipulable Abstract-World models of those Phenomena, on a plane distinct from that of the Real World. Terms are then adopted and defined that apply in, and only in, respectively the Real World and the Abstract World.
A further ontological commitment is to a clear distinction between two categories of Real-World Phenomena: Physical Things and Virtual Things. This is depicted in Figure 4. That distinction is then carried across into an epistemological commitment to the Abstract World, in the conceptual-model elements of Entities and Identities and the data-model elements of Entity Records and Identity Records. Entities and Identities need to be teased apart, each with one or more (Id)Entifiers to distinguish (Id)Entity Instances. A human Entity must be able to map to multiple human Identities, such that a human who acts using one Identity (such as prison warder, undercover agent or protected witness) is able to keep that Identity distinct from other Identities (such as householder, parent and football coach).
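The m:n cardinality can be made concrete in a data model. The following is a minimal sketch in Python; the record structures, field names and example values are illustrative assumptions, not drawn from the Id/EM papers themselves.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EntityRecord:
    """Abstract-World record modelling a Physical Thing (e.g. a human)."""
    entifier: str        # distinguishes this Entity Instance from others

@dataclass(frozen=True)
class IdentityRecord:
    """Abstract-World record modelling a Virtual Thing arising from a Role."""
    identifier: str      # distinguishes this Identity Instance from others

# Because the Entity-performs-Identity association is m:n, it is held in a
# separate relation, not as a foreign key on either record.
performs = {
    ("E-001", "I-householder"),       # one Entity with several Identities ...
    ("E-001", "I-football-coach"),
    ("E-002", "I-football-coach"),    # ... and one Identity performed by more
}                                     # than one Entity (e.g. delegation)

def identities_of(entifier: str) -> set:
    return {i for (e, i) in performs if e == entifier}

def entities_behind(identifier: str) -> set:
    return {e for (e, i) in performs if i == identifier}

assert identities_of("E-001") == {"I-householder", "I-football-coach"}
assert entities_behind("I-football-coach") == {"E-001", "E-002"}
```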
Conventional IdM practice, on the other hand, is not founded on a carefully-constructed meta-model. It conflates Physical Things with Virtual Things, and Entities with Identities. For example, the IETF's Glossary defines identification as "an act or process that presents an identifier to a system so that the system can recognize a system entity and distinguish it from other entities" (IETF 2000, p. 83, IETF 2007, p. 144, emphases added).
ISO 24760-1 carries the conflation of ideas over into international standards. Even after the 2019 revision, it defines "identification" as a "process of recognizing an entity" (all quotations from p. 1, emphases added), and verification as a "process of establishing that identity information ... associated with a particular entity ... is correct" (p. 3, emphases added). This is despite the document having earlier distinguished 'entity' (albeit somewhat confusingly, as "item relevant for the purpose of operation of a domain [or context] that has recognizably distinct existence") from 'identity' ("set of attributes ... related to an entity ...").
The NIST (2017) treatment of Real-World and Abstract-World elements, and of 'entity' and 'identity', is incomplete, inconsistent and confusing in several ways, as the following paragraphs illustrate.
FIPS-201-3 (2022), the US government's Standard for Personal Identity Verification, defines identity as "The set of physical and behavioral characteristics by which an individual is uniquely recognizable" (FIPS-201-3, p. 98, emphases added). Firstly, this is a Real-World definition, as distinct from the widely-used Abstract-World conception of an Identity, and its definition is in terms not of physical characteristics (in Id/EM terms, Properties) but of Attributes and/or Data-Items. Secondly, it is about a Physical Thing ("an individual"), but it refers to it as an "identity", which more usefully represents a Virtual Thing that arises from a Role performed by a Physical Thing.
These multiple layers of fog in the NIST documents could be overcome, the clarity of communication to practitioners greatly enhanced, and the effectiveness of implementations within government agencies and business enterprises much improved, by revising the NIST terms and definitions either to reflect the suite used in Id/EM theory, or those of some other scheme that features completeness, consistency and coherence.
Standards and associated practices also feature mis-handling of relationship cardinality. Entities generally have multiple Identities, and Identities may be performed by multiple Entities, both serially and simultaneously. A common example is the practice of parents sending their young children to an automated teller machine or EFTPOS terminal with the parent's payment-card and PIN. Another is a common practice among aged parents, who depend on their adult children to perform Internet Banking tasks on their behalf. Similarly, in many organisations, employees share login-ids to save time switching between users. An enterprise model is inadequate if it does not encompass these common activities. It is also essential that there be a basis for distinguishing, on the one hand, authorised delegations that may breach terms of contract with the card-issuer or organisational policies and procedures, from criminal behaviour on the other.
The ISO 24760-1 standard accepts that an Entity may have multiple Identities, e.g. "An entity can have more than one identity" (p. 1), and "This document considers any set of attributes that describe a particular entity as an identity for the entity ... Different sets of ... attributes form different identities for the same human entity" (p. 8). It also expressly recognises that the relationship is not 1:n but m:n, stating that "Several entities can have the same identity" (p. 1, see also pp. 8-9). Yet it fails to reflect those statements in the remainder of the document. Any Authorization scheme built on a model that fails to actively and consistently support those realities is doomed to embody confusions, and evidence errors and insecurities.
In NIST (2017), it is challenging to interpret the intentions regarding the cardinality of the relationship between 'subject' (presumably equivalent to Entity) and 'identity' (cf. Identity). In a single passage, the NIST document acknowledges the existence of 1-to-many relationships: "a subject can represent themselves online in many ways. An individual may have a digital identity for email, and another for personal finances. A personal laptop can be someone's streaming music server yet also be a worker-bot in a distributed network of computers performing complex genome calculations" (p. iv). Generally, however, the document appears to implicitly assume a one-to-one relationship, i.e. that a Real-World human can only have a single Abstract-World Identity, and an Identity is only adopted by one human.
For example, there is a reference to "the subject's [singular] real-life identity [equivalent to Entity] ... " (p. iv); the term 'identity' is defined as "An attribute or set of attributes that uniquely describe a subject [Entity] within a given context" (p. 47); the term 'digital identity' is described as "the [i.e. singular] online persona [Identity] of a subject [Entity]" and defined as "the unique representation [Identity] of a subject [Entity] ..." (p. 9); and there is a reference to verifying "a subject's [Entity] association with their [singular] real-world identity [Entity]" (p. 9, all emphases added). To the extent that NIST's guidance is interpreted by its users as limiting each human or artefactual Entity to a singular Identity, it is unworldly, and hence that aspect of the guidance needs to be ignored or worked around.
NIST (2022) is concerned specifically with US federal employees and contractors, and does not appear to contemplate the possibility of anything other than a 1:1 relationship between an identity and an entity. Its application within that context will map poorly to the needs of many government agencies. It is even more out of touch with the realities of private sector activities.
FIDO (2018), with its focus on the authentication of humanly-used devices, can support multiple processes running in a single device, because registration is based not on a physical device-identifier, but rather on an asymmetric cryptographic key-pair. FIDO's scope, however, does not extend to the modelling of the human user and their relationship with the device, leaving that to the service-provider.
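The key-pair approach to registration can be sketched as follows. This is an illustration of the general technique, using the third-party 'cryptography' Python package's Ed25519 primitives; it is not an implementation of the FIDO protocols, whose message formats are considerably more elaborate.

```python
# Sketch of registration and challenge-response authentication based on an
# asymmetric key-pair, in the style FIDO uses. Requires the 'cryptography'
# package; all variable names are illustrative.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Registration: the device generates a key-pair and discloses only the
# public key; no physical device-identifier is involved.
device_private_key = Ed25519PrivateKey.generate()
registered_public_key = device_private_key.public_key()   # held by the server

# Authentication: the server issues a fresh challenge; the device signs it;
# the server checks the signature against the registered public key.
challenge = os.urandom(32)
signature = device_private_key.sign(challenge)
try:
    registered_public_key.verify(signature, challenge)
    print("assertion authenticated")
except InvalidSignature:
    print("assertion rejected")
```

Because each registration generates a fresh key-pair, several processes on one device can each register independently, which is the property noted in the text above.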
Id/EM theory enables the detection of conceptual conflations, ambiguities and imprecisions in standards and guidance documents. Further, it provides a suite of terms and definitions, situated within a coherent and comprehensive model and associated terminology, that can together enable adaptation of the standards and guidance documents to overcome the deficiencies discussed in this section.
The model in this article has comprehensive scope, encompassing the full range of Id/Entities (humans and artefacts alike), the IS Resources and Physical Resources to which access may be granted, the Actions that may be performed, and the threats of Imposture and Permission Breach.
As regards the first aspect, the scope of ISO 24760-1 is similar, in that it defines identity as "set of attributes ... related to an entity", and entity as "item ... that has recognizably distinct existence", notes that "An entity can have a physical or a logical embodiment", and the examples provided include "a person, an organization, a device, a group of such items, a human subscriber to a telecom service, a SIM card, a passport, a network interface card, a software application, a service or a website" (p. 1). Because that standard does not extend to authorization (which it refers to as "establishing entitlements for the entity to access resources and interact with services", p. 16), the Resources and Actions aspects are out-of-scope.
NIST declares its scope as being humans only: "The guidelines cover identity proofing and authentication of users (such as employees, contractors, or private individuals) interacting with government IT systems over open networks" (NIST 2017, p. iii). It later qualifies the statement with "That said, these guidelines are written to refer to generic subjects wherever possible to leave open the possibility for applicability to devices" (p. 5). It wanders from this commitment, e.g. in its definition of authentication as "Verifying the identity of a user, process, or device ..." (p. 41, emphasis added), yet the specific term 'digital authentication' reverts to "the process of establishing confidence in user identities presented digitally to a system" (p. 45), where 'user identities' by implication encompass only human users. Later, it defines subject as "A person, organization, device, hardware, network, software, or service" (p. 55, emphases added), encompassing both humans and artefacts.
All references in NIST (2017) are to 'access to a digital service', and are non-specific about the nature of the service, defining access as "To make contact with one or more discrete functions of an online, digital service" (p. 55). Such examples as are provided are limited to Data and Processes, without mention of Physical Resources or Actions in the Real World. Recent literature, on the other hand, is concerned with features of cyber-physical systems such as avionics, electricity grids and medical devices, aiming to combat opportunistic attacks on physical detector and actuator designs (e.g. Akhuseyinoglu & Joshi 2020). Several studies focus on the limited capacity of early-generation IoT devices to support effective security measures (e.g. Liu et al. 2020), with a particular focus on Wireless Body Area Network (WBAN) contexts supporting communications from and to implanted medical devices (Bu et al. 2019, Arfaoui et al. 2020).
The terms 'masquerade' and 'attack' are used, but generally the assumption appears to be made that safeguards against malbehaviour are by definition successful, and hence the model does not encompass imposters within the system. There is also no notion of limitation of the purpose, use, task or function being performed by an authorized user, and hence Permission Breach is also out-of-scope of the NIST guidance.
FIDO is motivated by the desire to overcome Man in the Middle attacks, and is almost entirely concerned with the Authentication of client-devices to server-devices without the need for transmission of an Authenticator such as a password. On the client side, a user must demonstrate presence by gesture, making clear that the scope does not extend to autonomous artefacts. A service may specify stronger Authentication of the human user than a mere demonstration of presence, e.g. by locally-validated password/PIN, by challenge-response procedures, or using a locally-validated biometric. Those specifications, however, lie outside the scope of FIDO itself.
The comprehensiveness of Id/EM theory provides a means whereby the many weaknesses in existing Standards and services can be recognised and addressed.
Id/EM theory features careful attention to details concerning each element of a comprehensive model. This sub-section considers several specific elements and clusters of elements, in each case reviewing standards and guidance documents in light of the theory.
The foundational concepts of Id/Entity were discussed in an earlier sub-section. Two further, closely-related elements are the Identifier and the process of Identification.
IETF (2007) defines identification as "An act or process that presents an identifier to a system so that the system can recognize a system entity and distinguish it from other entities" (p. 144). This is in some ways consistent with the Id/EM definition, but it fails to separate references to the Real World and the Abstract World, and conflates Physical Things with Virtual Things, and Entity with Identity.
The ISO 24760-1 standard was noted above as confusing the notions of Entity and Identity. An even stranger aspect of the standard is that, having defined 'identifier' as "attribute or set of attributes ... that uniquely characterizes an identity ... in a domain [or context]" (p. 1), it defines 'identification' without reference to 'identifier'. Moreover, the definition of 'identification' as "process of recognizing an entity ... in a particular domain" (p. 3) is immediately followed by "Note 1 to entry: The process of identification applies verification ..." (p. 3). Given that 'verification' is defined as "process of establishing that identity information ... associated with a particular entity is correct" (p. 3), the definitions not only conflate the notions of Identity and Entity but also the processes of Identification and Authentication.
NIST (2017) does not discuss 'identification' or 'identifier', despite the document's scope extending to "the online persona of a subject" (p. iv). The document does define 'pseudonymous identifier', as "A meaningless but unique number that does not allow the [Relying Party (RP)] to infer anything regarding the subscriber but which does permit the RP to associate multiple interactions with the subscriber's claimed identity" (p. 57), but the term is used only twice, there is no indication of its purpose, and no recognition of the related but stronger notion of anonymity.
In FIPS-201-3 (2022), identification is defined as "The process of discovering the identity (i.e., origin or initial history) of a person or item from the entire collection of similar persons or items" (p. 98). This has some broad consistency with the approach in Id/EM theory. NIST defines an identifier as "Unique data used to represent a person's identity and associated attributes. A name or a card number are examples of identifiers" (p. 98). This appears to exclude biometrics, and no term is suggested for the means of distinguishing a person from other people, as distinct from an identity from other identities.
In Id/EM theory, Authentication is a process that uses Evidence to establish a degree of confidence in the reliability of an Assertion. The theory rejects the epistemological assumption of 'accessible truth' in favour of a relativistic interpretation, with degrees of reliability (or of 'strength', as security theory expresses it). Truth / verification / proof / validation notions are applicable within tightly-defined mathematical models, but not in the Real World in which Identity Management is applied. The complexities inherent in schemes that inter-relate humans and artefacts are such that socio-technical perspectives are essential to understanding and to effective analysis and design of IS.
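That relativistic stance can be expressed computationally: each item of Evidence contributes a degree of confidence, which is tested against the level of assurance the particular transaction demands. The sketch below assumes invented evidence weightings and a noisy-OR combination rule; both are illustrative devices, not part of the theory.

```python
# Hedged sketch: authentication as accumulation of confidence in an Assertion,
# rather than a binary proof. Weights and thresholds are invented values.
EVIDENCE_WEIGHT = {               # indicative 'strengths', not standardised
    "password": 0.3,
    "hardware-token": 0.5,
    "biometric-1-to-1": 0.4,
}

def confidence(evidence_presented: list) -> float:
    """Combine independent items of Evidence into a degree of confidence.
    The noisy-OR combination means confidence approaches, but never
    reaches, certainty, mirroring the rejection of 'accessible truth'."""
    doubt = 1.0
    for item in evidence_presented:
        doubt *= 1.0 - EVIDENCE_WEIGHT.get(item, 0.0)
    return 1.0 - doubt

def authenticate(evidence: list, required_assurance: float) -> bool:
    return confidence(evidence) >= required_assurance

# A low-value transaction may accept a password alone; a high-value one may not:
assert authenticate(["password"], required_assurance=0.25)
assert not authenticate(["password"], required_assurance=0.6)
assert authenticate(["password", "hardware-token"], required_assurance=0.6)
```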
The IETF Security Glossary, on the other hand, defines 'authenticate' to mean "verify (i.e., establish the truth of) an attribute value claimed by or for a system entity or system resource" (2000, p. 15, unchanged in 2007, p. 26, emphasis added). Hence authentication is the process of verifying a claim. Yet an adjacent passage states that "An authentication process consists of [the] Identification step: Presenting the claimed attribute value (e.g., a user identifier) to the authentication subsystem [and the] Verification step: Presenting or generating authentication information ... that acts as evidence to prove the binding between the attribute and that for which it is claimed" (IETF 2007 p. 26, emphases added). The inconsistency between the two aspects has become embedded in the language adopted by standards and guidance documents.
The problem is also inherent in ISO 24760-1, where 'verification' is defined as "process of establishing that identity information ... associated with a particular entity ... is correct", 'authentication' is defined as "formalized process of verification ... that, if successful, results in an authenticated identity ... for an entity" (p. 3, emphases added), and 'identity proofing' (synonym: 'initial entity authentication') is defined as "verification ... based on identity evidence ..." (p. 5, emphases added).
The ISO standard at one point appears to acknowledge that the 'accessible truth' postulate is inappropriate, by observing that authentication involves tests "to determine, with the required level of assurance, their correctness" (p. 3, emphases added). On the other hand, a qualification to an absolute term like 'correctness' is incongruous. The inconsistency survived the standard's authoring, review and approval processes, but it is unclear what practitioners who consult the standard make of it. At the very least, the ambiguities appear likely to sow seeds of doubt and cause confusion.
In NIST (2017), truth-related notions such as 'proofing', 'verification' and 'determination of validity' occur throughout the document. For example, the string 'verif' occurs more than 100 times in the document's 60 pp. The implication is that something approaching infallibility is achievable, an assumption of perfection that is also evidenced by common but naive terms such as 'single source of truth'. However, several more circumspect expressions are also used in relation to the authentication process, such as 'reasonable risk based assurance' (p. 2), and 'levels of assurance' and 'establishing confidence in' (in the definition of Digital Authentication on p. 45).
Misinterpretations are invited by the conflicting definitions of the general term 'authentication' ("verifying the identity of a user, process, or device ...", p. 41, emphasis added) and of the specific term 'digital authentication' ("the process of establishing confidence in user identities presented digitally to a system", p. 45). The former uses a truth-related word, and the latter a practical and relativistic term. One encompasses both humans and artefacts, whereas the other refers only to users, which by implication means only human users. This is not just a minor editorial flaw, because the word 'authentication' appears 219 times in the document. The specific term accounts for 21 of them, but from the various contexts it appears likely that many others are intended to invoke the more specific concept.
NIST refers to the "classic paradigm" for authentication factors (what you know/have/are), without consideration of the Real-World nature of "what you are" and "what you have" compared with the Abstract-World nature of "what you know", and without distinguishing humans from active artefacts (NIST 2017, p. 12). Further, NIST's definition of biometric characteristics ("unique personal attributes that can be used to verify the identity of a person who is physically present at the point of verification", on p. 13) again conflates the notions of Physical Thing (modelled as an Entity) and Virtual Thing (modelled as an Identity). The quality of uniqueness is merely asserted without discussion, despite the assertion's shaky foundations and the challenges involved in gathering biometric data, and in comparing biometric samples gathered at different times under different conditions. Further confusion is created by a declaration that "In this volume, authenticators always contain a secret", followed three lines later by "Authentication factors classified as something you know are not necessarily secrets" (p. 13).
There are many circumstances in which an Authenticator pre-exists its use to evaluate any particular Assertion, having been designed for purposes additional to, and even different from, the Authentication process. Examples include registries of people in controlled professions such as health care, driving licensing registries, and tertiary education testamurs. NIST appears to be concerned only with Authenticators generated specifically for the Id/Entity Authentication process.
In the glossary, meanwhile, NIST defines biometrics as "automated recognition of individuals based on their biological and behavioral characteristics" (p. 43, emphasis added). The term 'recognition' depicts biometrics as a tool for Identification (selecting 1-among-many) rather than for Authentication (involving the less unreliable process of 1-to-1 comparison) or, in NIST's terms, verification. Furthermore, the text does not link the notion of biometrics to the term 'subject', and does not consider any equivalent to biometrics in the case of artefacts. The many and varied ambiguities arising from the imprecisions of the NIST text are bound to result in differing interpretations of various passages by different readers.
NIST (2017) defines the general term 'authentication' as "verifying the identity of a user, process, or device ..." (p. 41, emphasis added). It is clear that the undefined term 'user' is intended to refer only to humans, in that the examples provided are "employees, contractors, or private individuals" (p. iii) and "employees and contractors" (p. 4). The contexts of all of the 48 occurrences of the term are consistent with the interpretation that computing devices and processes running in them are not within-scope, e.g. "this revision of these guidelines does not explicitly address device identity", and "specific requirements for issuing authenticators to devices when they are used in authentication protocols with people" are also excluded (p. 5).
In Id/EM theory, Credential means an Authenticator that carries the imprimatur of an Authentication Authority. This is consistent with dictionary definitions of the term.
The IETF sent the industry down an inappropriate path, defining the term as "A data object that is a portable representation of the association between an identifier and a unit of authentication information" (RFC4949 2007, p. 84). This is consistent with IT industry usage of 'token', rather than with the notion of Credential.
The ISO 24760-1 standard invites confusion in this area. Despite defining the term 'evidence of identity', the document fails to refer to it when it defines credential, which is said to be a "representation of an identity ... for use in authentication ... A credential can be a username, username with a password, a PIN, a smartcard, a token, a fingerprint, a passport, etc." (p. 4). This muddles all of Evidence, Entity, Identity, Attribute, Identifier and Entifier, and omits any sense of a Credential being evidence of high reliability, having been issued or warranted by a Credential Authority.
The confusions generated by the standard are increased by the definition of 'identity proofing' as involving "a verification of provided identity information and can include uniqueness checks, possibly based on biometric techniques" (p. 5, emphases added). A biometric provides evidence concerning not the particular Identity that is being presented, but the particular underlying Entity. From a data modelling viewpoint, it associates data with the wrong record, and from a privacy viewpoint, it cuts through all of a person's many Identities, and enables access to and use of personal data irrelevant to the particular context.
Early versions of NIST's guidance adopted the same approach to 'credential' as IETF. The confusion between credential and token was acknowledged in 2017, but the new version limits its use of credential solely to electronic means of associating an authenticator with a user (p. 18). It also omits the notion of an authority. The NIST document makes the claim that "a credential binds an authenticator to the subscriber, via an identifier" (pp. 15, 44). It defines a credential as "an object or data structure" that has the mandatory characteristic that it "authoritatively binds an identity ... to [an] authenticator" (NIST 2017 p. 44, emphasis added). Unbreakable association may be achievable with artefacts; but the only way to implement it with humans is to reduce the human to an artefact, e.g. through chip-implantation. A more appropriate statement would be that 'a credential provides a considerable degree of confidence in an assertion, because it entails a combination of an assurance from a third party in which reliance is placed, together with a technical design that reliably associates the assertion with the party that is being authenticated'. Further uncertainty arises from ambiguity in the use of the expression "bound to". It appears to refer to binding between four different pairs of concepts, in the definition on p. 43 and in passages on pp. 10 and 14.
Another difficulty with the NIST approach is that a Credential may contain multiple items of information (e.g. a testamur evidencing not only the award of a degree, but also the units studied and the results achieved in each ? any or all of which may be relevant in any particular context). The Credential is the testamur. The various Data-Items it contains may be Authenticators for particular Assertions as to Fact, or as to Identity.
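The testamur example suggests a simple data structure: a Credential carrying an Authority's imprimatur and containing multiple Data-Items, any of which may serve as an Authenticator for a particular Assertion. The following Python sketch uses invented names and values throughout.

```python
from dataclasses import dataclass, field

@dataclass
class Credential:
    """An Authenticator carrying the imprimatur of an Authentication
    Authority. It may contain several Data-Items, any or all of which
    may be relevant in a particular context."""
    issuing_authority: str      # the Authority whose imprimatur it carries
    imprimatur: str             # e.g. a seal or digital signature
    data_items: dict = field(default_factory=dict)

testamur = Credential(
    issuing_authority="Example University",      # invented for illustration
    imprimatur="registrar-digital-signature",
    data_items={
        "degree": "BSc",                  # supports a broad Assertion as to Fact
        "unit:COMP1100": "Distinction",   # supports a narrower Assertion
    },
)

# A relying party draws on whichever Data-Item matches the Assertion at hand:
claimed_item, claimed_value = "degree", "BSc"
authenticated = testamur.data_items.get(claimed_item) == claimed_value
```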
In Id/EM theory, a Token is "a recording medium on which useful Data is stored ...".
IETF (2007) deprecated the (early) use of the term 'token' in the 2004 version of NIST 800-63, recommending the use of the terms 'NIST hard token' for "a hardware device that contains a protected cryptographic key" and 'NIST one-time password device token' for "a personal hardware device that generates one-time passwords" (p. 308).
ISO 24760-1, on the other hand, adopts the same approach to a Token as Id/EM theory: "The identity information represented by a credential can ... be printed on human-readable media, or stored within a physical token" (p. 6). Immediately afterwards, however, it conflates the data with the storage-medium: "A credential can be ... a token" (p. 6).
In NIST (2017), the previous editions' conflation of authenticators and tokens is acknowledged and removed (pp. 6, 42). However, no definition is provided for the term 'token', and hence ambiguity and the scope for confusion remain.
Authorization is defined in Id/EM theory as a process whereby an Authorization Authority decides whether or not to declare that an Actor has one or more Permissions in relation to a particular IS Resource or Physical Resource.
IETF (2007) defines authorization firstly as an Id/EM Permission, but secondly as "a process for granting approval to a system entity to access a system resource" (p. 28). The Id/EM definition is similar, but more precise (because it distinguishes the prior act of determining what Permissions an Actor has, from the later, operational activity of Access Control, which provides a Session in which those Permissions can be exercised).
ISO 24760-1 makes almost no mention of authorization, appearing to treat it as being out-of-scope of 'A framework for identity management'.
NIST (2017) does not define the term authorization. An ambiguous sentence appears to say that the determination of the claimant's authorizations or access privileges is outside the guidelines' scope (p. 10). However, no cross-reference to authorization guidance is provided. The document makes use of the word 'authorization' to refer to the output of a process: "authorizations or access privileges" (p. 10). Its main use, however, is as part of the compound noun 'authorization decision', which refers to the final action in a process that determines what permissions a user is to be granted (pp. 9, 10, 16). An ungrammatical definition is provided of (the verb) 'authorize' to mean (the noun) "a decision to grant access, typically automated by evaluating a subject's attributes" (p. 42).
As noted earlier, another NIST document defines "Access control or authorization [as] the decision to permit or deny a subject access to system objects (network, data, application, service, etc.)" (NIST800-162 2014, p. 2). Id/EM theory separates the two notions, with Authorization being restricted to the making of the decision, and Access Control applying to its application, which may occur many times in respect of the outcome of any one Authorization decision-process.
The vagueness and ambiguity of the NIST guidance invites variable interpretations and hence variability in the implementations of IdM schemes. More precise definitions within a coherent model and associated terminology could greatly reduce the variability in interpretations and implementations, improve access control quality, and facilitate federation among schemes.
The primary factors influencing the making of decisions by Authorization Authorities are the Identity of the requesting party (identity-based access control, IBAC), the Role being performed (RBAC), the Attributes of the party (ABAC), and the Task being undertaken (TBAC).
Users of IdM or Id/EM schemes are confronted by both technical and economic challenges. IBAC approaches do not scale well. RBAC schemes sacrifice precision for economy. ABAC schemes are complex, require customisation, and are expensive. TBAC approaches appear to have achieved little market penetration.
In some contexts, RBAC offers a satisfactory trade-off among the objectives, while in others it manifestly falls short of the need. One issue is that RBAC was conceived at a time when most IS operated inside organisational boundaries. On the other hand, the notions of inter- and multi-organisational systems were already strongly in evidence, and extra-organisational systems extending out to individuals have been operational since c.1980 (Clarke 1992a). Yet the IETF Glossary of 2000, even after revision in 2007, defines role to mean "a job function or employment position" (IETF 2007, p. 254).
Authorization is treated by ISO 24760-1 as being out-of-scope.
The NIST exposition on ABAC adopts the narrow view that "a role has no meaning unless it is defined within the context of an organization" (NIST800-162 2014, p. 26). Further, although the document suggests that ABAC supports "arbitrary attributes of the user and arbitrary attributes of the [IS resource]" (p. vii), the only examples provided for actors' attributes in the entire 50-page document are position descriptions internal to an organisation: "a Nurse Practitioner in the Cardiology Department" (pp. viii, 10), "Non-Medical Support Staff" (p. 10).
An assumption implicit in many interpretations of the NIST document is that a one-to-one relationship exists between organisational identity and organisational role. It is common for an employee to have a primary organisational Identity, in the form of a job-title and associated job description. However, employees and on-site contractors commonly have additional Roles, e.g. as a fire warden, a mentor to a junior assistant, and member of an interview panel. Particular Permissions are needed for each Role (e.g. for access to messages intended only for Fire Wardens, and for access to the personal details of mentees and of job-applicants).
A scheme that limits Role to organisational positions or even part-time functions is inherently incapable of coping with boundary-spanning applications of IT. From the earliest period of online inter-organisational systems (IOS), onwards to multi-organisational systems (MOS), and outwards to extra-organisational systems, large numbers of individuals with no more formalised role than customer, client, supplier, registrant or enquirer have been granted Permissions. Such systems, particularly when dealing with sensitive data or enabling direct action on the Real World, need far more sophistication in Access Control Lists than most RBAC implementations offer. Furthermore, to the extent that Attributes rather than Roles are used as the organising concept, Attributes must also reflect the whole context. Id/EM theory accommodates these ideas.
Id/EM theory defines a Permission as (a) an entitlement granted to an Id/Entity-Instance (b) to be provided with the capability to perform a specified Action (c) in relation to a specified IS Resource or Physical Resource, and (d) for a particular Purpose, Use, Function or Task. This is a relatively expansive notion. It goes beyond the scope of the Identity-based IBAC approach, and Role-based RBAC. It extends aspects of the ABAC notion, by defining the 'environmental variables' recognised in that approach to extend to the placing of limitations on the reasons why an Actor may be granted access. It can also be seen as broadening the notion of Task in TBAC.
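The four components lend themselves to direct expression as a record. A minimal sketch follows, with illustrative field values; the Purpose component (d) is the element absent from conventional IBAC and RBAC schemes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Permission:
    """A Permission in the Id/EM sense, with all four components explicit.
    Conventional schemes typically record only the first three; the purpose
    field extends the 'environmental variables' recognised in ABAC."""
    id_entity_instance: str    # (a) to whom the entitlement is granted
    action: str                # (b) the Action that may be performed
    resource: str              # (c) the IS Resource or Physical Resource
    purpose: str               # (d) the Purpose, Use, Function or Task

# Invented example values:
p = Permission(
    id_entity_instance="I-nurse-0412",
    action="read",
    resource="patient-record/8831",
    purpose="care-episode/2024-117",   # exercise outside this Purpose
)                                      # would constitute Permission Breach
```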
A common weakness in authorization schemes is inadequate attention to the granularity of Data and Processes and/or Actions that are the subject of the Permission. Many Permissions are provided at a gross level, with entire records accessible, well in excess of the Data justified on the basis of the 'need-to-know principle' and the 'principle of least privilege'. Excessive scope of Permissions invites Permission Breach, in the forms of misappropriation of data and performance of functions for purposes other than those for which the Permissions were intended. This may be done out of self-interest (commonly: curiosity, electronic stalking of celebrities, or the identification and location of individuals), as a favour for a friend, or for a fee. The threats of insider attack and data breach are inadequately controlled by RBAC. Nor do the standards acknowledge the need for proactive and reactive safeguards and mitigation measures. This is hardly a new insight. See, for example, Clarke (1992b).
A further concern arises with implied powers. "The RBAC object oriented model (Sandhu 1996) organises roles in a set/subset hierarchy where a senior role inherits access permissions from more junior roles" (p. 135). Permission inheritance is a serious weakness. A superior does not normally have the need to access the data available to more junior roles, and hence should not normally inherit the corresponding Permissions. Access to such data is necessary for the performance of review and audit, but not for the performance of supervision.
The inheritance problem is one instance of a more general issue. RBAC approaches are unable to accommodate the reason for access as a determinant of Permission. A related concern, from an organisation's own perspective, is the need to limit access according to the general Function being performed (e.g. at a managerial or professional level) and/or the specific Task being undertaken (particularly at the level of operational or administrative staff-members). This may be looming as a larger issue in enterprise management, as the dream of 'data as the new oil' becomes complemented and perhaps even supplanted by notions of 'data risk' and 'toxic data'. There are already contexts in which a legal justification for access needs to be demonstrated. For example, in data protection law, access to personal data must be limited based on the general Purpose and/or specific Use.
Currently, it appears to be uncommon for IdM schemes to embody even a proxy approach to safeguard against inappropriate access. For example, each User could be required to provide a brief declaration of the reason for each exercise of a Permission. In many IS, this need be no more than a Case-Id or other reference-number to a formal organisational register. Once that declaration is logged, along with the Username, Date-Time-Stamp, Record(s) accessed and Process(es) performed, a sufficient audit trail exists. That, plus the understanding that log-analysis is undertaken, anomalies are investigated, and sanctions for misuse exist and are applied, could act as a substantial deterrent safeguard against Users abusing their Permissions, and as an enabler of ex post facto detection, investigation and correction safeguards.
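The proxy safeguard reduces to a simple logging discipline. The sketch below assumes the existence of a formal organisational register of Case-Ids; the function and store names are invented for illustration.

```python
import datetime

AUDIT_LOG = []   # in practice an append-only, tamper-evident store

def exercise_permission(username: str, record_id: str, process: str,
                        reason: str) -> None:
    """Require a declared reason (e.g. a Case-Id referencing a formal
    organisational register) before each exercise of a Permission, and
    log it with the Username, Date-Time-Stamp, Record and Process."""
    if not reason:
        raise PermissionError("a reason (e.g. a Case-Id) must be declared")
    AUDIT_LOG.append({
        "username": username,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "record": record_id,
        "process": process,
        "reason": reason,
    })
    # ... perform the access itself here ...

exercise_permission("jsmith", "patient-record/8831", "read",
                    reason="CASE-2024-0117")
```

The deterrent effect depends less on the log itself than on the declared practices around it: routine log-analysis, investigation of anomalies, and applied sanctions, as described above.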
Inadequately specific Permissions invite Permission Breach by authorized Users. In addition, they enable Imposters to more easily gain access to relatively broad Permissions. Id/EM theory encompasses this, whereas none of IETF, NIST, ISO or FIDO embody such limitations.
In Id/EM theory, Access Control is defined as "a process that utilises previously recorded Permissions to establish a Session that enables a User to exercise the appropriate Permissions".
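The definition separates the operational step from the prior Authorization decision. The following sketch illustrates that separation, with an invented Account store holding previously recorded Permissions; one Authorization decision can support many subsequent Sessions.

```python
from dataclasses import dataclass

# Permissions recorded earlier, by an Authorization Authority, in the Account:
ACCOUNT_PERMISSIONS = {
    "I-nurse-0412": {"read:patient-record", "append:care-notes"},
}

@dataclass
class Session:
    """An operational context in which a User exercises recorded Permissions."""
    user: str
    permissions: frozenset

def access_control(user: str) -> Session:
    """Access Control: utilise *previously recorded* Permissions to establish
    a Session. No Authorization decision is made here; that occurred earlier,
    possibly once for many subsequent Sessions."""
    granted = ACCOUNT_PERMISSIONS.get(user)
    if granted is None:
        raise PermissionError(f"no Permissions recorded for {user!r}")
    return Session(user=user, permissions=frozenset(granted))

session = access_control("I-nurse-0412")
assert "read:patient-record" in session.permissions
```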
ISO 24760-1 states that "identity-based decisions can concern access to applications or other resources" (p. v), and mentions that the function of what Id/EM theory refers to as Enrolment is "to enable the entity to access resources and interact with services provided by a domain" (p. 15). However, it defers access control matters to ISO/IEC 29146 'A framework for access management'.
The sole occurrence of the expression 'Access Control' in NIST (2017) is in an Abbreviations list for ABAC (Attribute Based Access Control) (p. 58). It is implicit, however, that permissions granted to a subscriber are exercised in a 'session', defined as "a persistent interaction between a subscriber and an endpoint ..." (p. 53). Hence, access control can be inferred as the step that enables a session to come into existence, or the function of managing a subscriber's access to a session.
The lack of guidance in NIST (2017) on the use of Permissions granted in an Authorization process creates uncertainties. These would have been overcome if a cross-reference had been provided to an appropriate document on the subject. A variety of ambiguities and uncertainties would remain, however. These could be addressed by adapting the model and terminology in the NIST document by reference to those in the theory presented in this article.
The terminology used in expressing Id/EM theory was devised to achieve completeness, consistency and coherence. If a similarly careful approach were adopted to a revision of the ISO and/or NIST model, terminology and guidance, many interpretation difficulties would be surfaced and addressed. The next revisions would then deliver greater clarity and invite fewer misunderstandings.
Id/EM theory includes a generic process model, depicted in Figure 6, to distinguish two Phases, of respectively four and three steps, with Data stored in an Account intermediating between the core Authorization and Access Control activities, and with all terms defined clearly and consistently, thereby minimising syntactic and semantic ambiguities.
Depictions of conventional authorization theory exhibit many variants and inconsistencies in architecture, process flow, terminology and definitions. Even at the most abstract level, considerable differences exist in interpretations of the notions of identity management and access control. The descriptions of identification, authentication and authorization functions evidence many overlaps. An early example of the conflation of the identification and authentication processes is "An authentication process consists of two basic steps: Identification step: Presenting the claimed attribute value (e.g., a user identifier) [and] Verification step ..." (IETF 2007 p. 27, emphases added). Similar problems are evident in the international standard, as depicted in Figure 1. It states that "The process of identification applies verification to claimed or observed attributes" (ISO24760-1, p. 3, emphases added).
The NIST document provides limited guidance about the process, with the diagrammatic version of the Digital Identity Model in Figure 1 showing three players (a 'relying party' seeking to authenticate a subject claiming to have permissions, a service that authenticates that claim, and a credential service provider) and three presentations of the subject (as the claimant of permissions, but prior to that as an applicant and, once pre-authenticated, as a subscriber). The textual description goes somewhat further towards a process definition, saying "the left side of the diagram shows the enrollment, credential issuance, lifecycle management activities, and various states of an identity proofing and authentication process" and "the right side of [the diagram] shows the entities and interactions involved in using an authenticator to perform digital authentication" (pp. 10-11). The preliminary phase is referred to as Enrollment and Identity Proofing, and the second phase is called Digital Authentication, with little clarification of the phases' sub-structure (pp. 10-12). Moreover, authorization is sometimes described in ways that suggest it occurs at the time a user is provided with a session in which they can exercise their permissions; whereas its primary usage has to do with a preparatory act: the making of a decision about what permissions a user is to be granted, resulting in data being created to enable effective and efficient performance during the operational phase.
Id/EM theory brings order to the use of terms and distinctions between the many functions that need to be performed, and avoids the ambiguities, uncertainties and divergent interpretations that arise from the NIST presentation.
FIDO (2022) 'User Authentication Specifications Overview' FIDO Alliance, 8 December 2022, at https://fidoalliance.org/specifications/
FIPS-201-3 (2022) 'Personal Identity Verification (PIV) of Federal Employees and Contractors' [US] Federal Information Processing Standards, January 2022, at https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.201-3.pdf
IETF (2000) 'Internet Security Glossary', Internet Engineering Task Force, RFC2828, May 2000, at http://www.ietf.org/rfc/rfc2828.txt
IETF (2007) 'Internet Security Glossary, Version 2', Internet Engineering Task Force, RFC4949, August 2007, at http://www.ietf.org/rfc/rfc4949.txt
ISO 22600-1:2014 'Health informatics - Privilege management and access control - Part 1: Overview and policy management' International Standards Organisation TC 215 Health informatics, 2014
ISO 22600-2:2014 'Health informatics - Privilege management and access control - Part 2: Formal models' International Standards Organisation TC 215 Health informatics, 2014
ISO/IEC 24760-1 (2019) 'A Framework for Identity Management - Part 1: Terminology and concepts' International Standards Organisation SC27 IT Security techniques, 2019, at https://standards.iso.org/ittf/PubliclyAvailableStandards/c077582_ISO_IEC_24760-1_2019(E).zip
ISO/IEC 24760-2 (2017) 'A Framework for Identity Management - Part 2: Reference architecture and requirements' International Standards Organisation SC27 IT Security techniques, 2017
ISO/IEC 27000 (2018) 'Information technology - Security techniques - Information security management systems - Overview and vocabulary' International Standards Organisation, 2018, at https://standards.iso.org/ittf/PubliclyAvailableStandards/c073906_ISO_IEC_27000_2018_E.zip
ISO/IEC 24760-3 (2019) 'A Framework for Identity Management - Part 3: Practice' International Standards Organisation SC27 IT Security techniques, 2019
ISO/IEC 29146 (2024) 'Information technology - Security techniques - A framework for access management' International Standards Organisation, 2024, at https://www.iso.org/obp/ui/en/#iso:std:iso-iec:29146:ed-2:v1:en
NIST800-63-3 (2017) 'Digital Identity Guidelines' National Institute of Standards and Technology, 2017, at https://doi.org/10.6028/NIST.SP.800-63-3
NIST800-63-3A (2017) 'Digital Identity Guidelines: Enrollment and Identity Proofing' National Institute of Standards and Technology, 2017, at https://doi.org/10.6028/NIST.SP.800-63a
NIST800-63-3B (2017) 'Digital Identity Guidelines: Authentication and Lifecycle Management' National Institute of Standards and Technology, 2017, at https://doi.org/10.6028/NIST.SP.800-63b
NIST800-63-3C (2017) 'Digital Identity Guidelines: Federation and Assertions' National Institute of Standards and Technology, 2017, at https://doi.org/10.6028/NIST.SP.800-63c
NIST800-63-4 (2022) 'Digital Identity Guidelines: Initial Public Draft' Special Publication SP 800-63-4 ipd, US National Institute of Standards and Technology, December 2022, at https://doi.org/10.6028/NIST.SP.800-63-4.ipd
NIST800-162 (2014) 'Guide to Attribute Based Access Control (ABAC) Definition and Considerations' NIST Special Publication 800-162, National Institute of Standards and Technology, updated to February 2019, at https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-162.pdf
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professorial Fellow associated with UNSW Law & Justice, and a Visiting Professor in the School of Computing at the Australian National University.