Roger Clarke
Version of 30 September 2007, slide-set added 27 October 2007
Prepared for an Invited Keynote at the 2nd RNSA Workshop on the Social Implications of National Security - From Dataveillance to Überveillance ..., 29 October 2007, University of Wollongong
Revised version published as 'What is Überveillance? (And What Should Be Done About It?)' IEEE Technology and Society 29, 2 (Summer 2010) 17-25
© Xamax Consultancy Pty Ltd, 2007
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://www.rogerclarke.com/DV/RNSA07.html
The slide-set to accompany the presentation is at http://www.rogerclarke.com/DV/RNSA07-071029.ppt
Mere surveillance is passé. The idea was worth discussing as recently as a quarter-century ago, but no longer. Technologists have delivered, and marketers have promoted (and exaggerated), a host of additional capabilities.
A new term that might better describe the current circumstances is 'überveillance'. This paper provides both a theoretical and an empirical context within which to assess alternative interpretations of that notion. It culminates in a set of Principles whereby the balance that has been lost in recent years can be restored.
Corporate marketers have promoted a vast array of technologies as means to monitor the behaviour of all manner of things. Parliaments have suspended their disbelief and permitted government agencies to buy technologies and install systems. Some corporations have imposed similar schemes on their employees, and on their customers.
There is enormous diversity among the schemes that have been installed or proposed. Indeed, there are many objectives, and considerable specialisation is occurring, with the result that surveillance is going through divergence and even splintering.
But there are also signs of convergence and coordination, and this creates both some degree of promise and a vastly increased level of threat to society. The workshop committee has selected 'überveillance' as the theme around which the new direction can be examined.
This keynote commences by underlining key aspects of the surveillance notion. It then briefly scans the range of surveillance schemes. The intention is to lay a foundation for a typology of schemes, for comparison and contrast, and ultimately for a critical appreciation of the benefits, disbenefits and risks that are inherent in the process of inter-relating surveillance schemes.
Three alternative interpretations of the notion of 'überveillance' are then discussed, translating 'über' variously as 'all', as 'exaggerated' and as 'supra'. Finally, themes arising from these discussions are developed into a small set of Principles that must be applied in order to avoid the over-reaction to the threat of 'terrorism' causing our societies to eat themselves.
The number of different surveillance schemes is so great that a comprehensive survey requires substantial resources. This section commences by re-visiting a couple of key concepts, as a prelude to vignettes of a number of rather different kinds of surveillance.
In my work in this area over the last 20 years, I've referred to surveillance as "the systematic investigation or monitoring of the actions or communications of one or more persons". This requires some adjustment, in particular to take account of the monitoring of spaces, and of objects other than humans. The primary concern of this paper is the surveillance of people and their behaviour, whether directly or indirectly.
The original forms of physical surveillance were typified by visual observation, and symbolised by Bentham's panopticon.
Watching and listening have come to be aided by equipment of various kinds which offers enhancement of optical and aural signals, e.g. through telescopes and directional microphones. This has enabled physical surveillance at distance.
A development in recent years has been the emergent phenomenon of what might be called auto-physical surveillance. This is enabled by means of devices that are attached to the person (whether loosely but reliably, as with a mobile phone, or tightly as with an anklet, or even embedded). Rather than the modern connotation of 'automated', the prefix 'auto-' is intended here to convey its original meaning of 'self-'.
Progressively, surveillance ceased to be constrained to the observation of ephemera. The recording of signals meant that data trails could be built up, and that retrospective analysis could be undertaken of those trails. As the number of such trails increased, information originating from different times and places could be interwoven, enabling additional inferences to be drawn.
The monitoring of data-flows, and the analysis of data-holdings, are economically efficient because they can be automated. Furthermore, they are inherently surreptitious, so the watched are far less aware of the watchers than is the case with physical surveillance, even at distance. As a result, dataveillance (a convenient contraction of 'data surveillance') has been used to augment, and increasingly to substitute for, physical surveillance (Clarke 1988). The volume of monitoring undertaken has also grown, because its inexpensiveness enables more of it to be done within the same budget. The natural limitations on the number of men who can be hired to wear trench-coats and watch doorways have been overcome.
As telecommunications improved, a further capability was added. The data became available very shortly after it was collected, which meant that the trail was warm and real-time tracking could be conducted. This increased the chances of being able to intercept a target. It also created the possibility of predictive tracking, by inferring a target's intended destination.
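The step from a warm data trail to predictive tracking can be illustrated with a toy linear extrapolation over timestamped position fixes. This is a minimal sketch of the inference involved, using a function name and data layout invented for illustration; real trackers use richer motion models:

```python
def predict_position(fixes, t_future):
    """Naive predictive tracking: extrapolate from the last two
    timestamped fixes, assuming constant velocity.
    fixes: list of (t, x, y) tuples in chronological order."""
    (t1, x1, y1), (t2, x2, y2) = fixes[-2], fixes[-1]
    dt = t2 - t1
    vx, vy = (x2 - x1) / dt, (y2 - y1) / dt  # velocity between last fixes
    lead = t_future - t2
    return (x2 + vx * lead, y2 + vy * lead)

# A target moving east at one unit per time-step:
trail = [(0, 0.0, 0.0), (1, 1.0, 0.0), (2, 2.0, 0.0)]
print(predict_position(trail, 5))  # → (5.0, 0.0)
```

The point of the sketch is that prediction requires nothing beyond the trail itself: once the data is warm, inferring an intended destination is a trivial computation.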
As telecommunications developed, first telegraphic, then telephonic and later facsimile transmissions became vehicles for electronic surveillance. In recent decades, this has been extended to all forms of Internet communications, particularly those that depend on wired connections, but also the various unwired channels.
Until recently, electronic communications supported the equivalent of speech. Generally, the law permitted connections monitoring or traffic analysis (who is talking with whom) although it subjected such activities to controls. Because of the enormous intrusiveness and the risks involved in granting powers to law enforcement agencies, much greater obstacles were placed in the way of communications surveillance (who is saying what to whom).
Since the advent of the Web in the early-to-mid-1990s, however, electronic communications also support the equivalents of buying books and going to the library. The monitoring being conducted by employers and governments is now far more intrusive, because what might be described as experience surveillance provides access not merely to what a person is saying, but also to what they are thinking about and researching.
Within each of the categories discussed above, it is important to distinguish two sub-categories: targeted personal surveillance, of a previously identified individual, and much broader mass surveillance, conducted over groups or spaces in order to identify individuals of interest.
Physical surveillance was applied to a location or place. Enhancements enabled the watcher and their equipment to be separated by some distance from that place, but the locus of the surveillance remained the same. Three different categories of place have been discernible, which might be described as private, controlled and public.
The notion of private places corresponds to locations in which an individual, or two, or perhaps a few, could reasonably expect not to be subject to surveillance by other parties. This has seemed to have a central core of the marital bedroom, a more qualified zone comprising the rest of the home and even more so its visible exterior (gardens and patios), and some further outposts such as the insides of toilet cubicles.
Organisations that exercise substantial control over particular places have asserted the right to conduct surveillance where, when and how they wish. The contestability of claims in relation to controlled places increases from, for example, the rooms from which nuclear power stations and air traffic are controlled, via the footpaths outside government agencies and the faces presenting to ATMs, to railway stations and cinema precincts.
One interpretation of public place is 'everywhere that is neither of the other two'. The numerous subscribers to the 'original sin' philosophy of life tend to assert that all forms of surveillance of public places are legitimate, on the grounds that privacy inherently doesn't exist in public places, or no longer exists in public places, or should not exist in public places.
Yet people have always had reasonable expectations of privacy in public places. That applies all the more to people who are not well-known. More generally, people, whether well-known or not, have a reasonable expectation of privacy when they are behaving in a manner that is intended to be private, e.g. when in the company of family, rather than projecting themselves (or their 'public persona') to some kind of 'public'. Because parliaments have been slow to protect such behaviours, the courts are being forced to develop a tort through case law.
Electronic surveillance broke the nexus with a single location. Initially, it was feasible to re-define it to a multi-location phenomenon, as in the monitoring of both ends of a phone conversation. But first dataveillance and then new forms of electronic surveillance forced further re-thinking. It is now necessary to define the actions or communications that are subject to surveillance as occurring in 'space' rather than 'place', and to conceive of the space as being either physical or figurative (as in abstractions such as 'cyberspace'). With that change, the old concepts of private, controlled and public places have given way to private, controlled and public spaces.
The purposes and potential benefits of surveillance are discussed in section 2 of Wigan & Clarke (2006). This paper focusses primarily on its negative impacts.
As an intrinsic part of this presentation, a collection of vignettes was prepared. These describe a wide array of instances of surveillance, with considerable differences in purpose, style and intensity. Partly because of the length of the text and partly in order to make them accessible independently of this paper, they are to be found in 'Surveillance Vignettes' (Clarke 2007a).
The diversity that is evident in that collection of vignettes suggests the need to be clear about the dimensions across which applications vary. Drawing on the outline of the surveillance concept in section 2 above, the following can be distinguished:
That which is subjected to surveillance may be a specified individual, specified groups of individuals, specified objects, specified groups of objects, or a specified space.
The beneficiaries of surveillance may be the individual who is the subject of the surveillance, an individual who has a direct interest in the subject of the surveillance, or another party with an interest in the behaviour of the subject.
The surveillance may be conducted by the individual who is the subject of the surveillance, an individual who has a direct interest in the subject of the surveillance, another party with an interest in the behaviour of the subject, or a third party that is in some sense acting on behalf of one of the above.
The primary purpose of the surveillance may be to assist with the health or safety of the subject of surveillance, to detect or collect evidence of behaviour that conforms or does not conform with some norm, or to encourage conformant behaviour and/or deter non-conformant behaviour.
The means whereby the surveillance is conducted may be physical surveillance (visual and aural), physical surveillance at distance, auto-surveillance, retrospective analysis, dataveillance, real-time tracking, predictive tracking, traffic analysis, communications surveillance or experience surveillance; and each of them may be targeted personal surveillance or much broader mass surveillance.
The locus of the surveillance may be defined in physical space, or in some virtual space. A common form of virtual space is that enabled by electronic communication networks, but another is the web of ideas inherent in published text, uttered words and recorded behaviour.
The timeframe in which surveillance is conducted may be defined across a single span of time, or recurrent spans (such as a particular span within each 24-hour cycle), or scattered across time (e.g. triggered by particular conditions detected in published text, uttered words and recorded behaviour), or continuous and unremitting.
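The dimensions listed above can be summarised as a simple record structure. The following sketch uses illustrative field names and values of this author's own choosing; it is a thinking aid for comparing schemes, not an established schema:

```python
from dataclasses import dataclass

@dataclass
class SurveillanceScheme:
    """One point in the design space sketched above (illustrative only).
    'scope' separates the targeted/mass aspect out of the means dimension."""
    subject: str      # e.g. 'specified individual', 'specified space'
    beneficiary: str  # e.g. 'the subject', 'a third party'
    operator: str     # who conducts the surveillance
    purpose: str      # e.g. 'safety', 'evidence-collection', 'deterrence'
    means: str        # e.g. 'dataveillance', 'real-time tracking'
    scope: str        # 'personal' (targeted) or 'mass'
    locus: str        # 'physical space' or 'virtual space'
    timeframe: str    # 'single span', 'recurrent', 'triggered', 'continuous'

# A hypothetical instance, for comparison and contrast:
speed_camera = SurveillanceScheme(
    subject='specified space', beneficiary='another party',
    operator='government agency', purpose='deter non-conformant behaviour',
    means='physical surveillance at distance', scope='mass',
    locus='physical space', timeframe='continuous')
```

Filling in such a record for each scheme in a collection makes the divergence noted earlier concrete: schemes that share a means may differ on every other dimension.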
The public and political acceptability, the legality, and the effectiveness of a particular instance of surveillance differ greatly depending on the design choices that it evidences. An approach to developing an ethical framework for surveillance is in Michael, McNamee & Michael (2006).
The theme of the workshop originates in the work of the Editors, M.G. Michael and Katina Michael, with the first published use in lecture notes (Michael 2006). The notion is emergent rather than established, and it continues to evolve. A useful working definition that they offer is "an above and beyond omnipresent 24/7 surveillance where the explicit concerns for misinformation, misinterpretation, and information manipulation, are ever more multiplied and where potentially the technology is embedded into our body" (Michael & Michael 2006, p. 361).
In this section, this author approaches the idea afresh, and considers several possible interpretations of the term, including, but not restricted to, the Michael & Michael quotation above.
The word appears not to have existed until Michael & Michael coined it. Its stem, '-veillance', is clearly co-opted from 'surveillance'. Originally, this derived from the French 'surveiller', whose contemporary senses include 'to keep an eye on' (e.g. luggage), to supervise (e.g. people), to monitor (e.g. people, an object or a space), and to invigilate (to watch candidates in an examination).
Judging by the entry in the Oxford English Dictionary, the word was co-opted into English in 1799, originally in a report on the French Revolution. The relationship was readily recognised with Bentham's panopticon proposal, which originated in 1787 but was current for 25 years. During the 200 years since then, the English word 'surveillance' has come to be used primarily with sinister associations. It has been subject to a number of adaptations and extensions, including this author's own neologism 'dataveillance', of 1988, which the Michaels explicitly identify as one of the inspirations for their own work.
The prefix 'über' is drawn directly from German. Its several senses are investigated in the following sub-sections.
An apocalyptic vision would see 'überveillance' as referring to surveillance that applies across all space and all time (omnipresent), and that supports some organisation that is all-seeing and even all-knowing (omniscient), at least relative to some person or object. (The apocalyptic theme is a key thread in Michael's work. See Michael 1999 and 2003).
An effective way to do this would be to embed the surveillance mechanism within the person or thing to be monitored, and endow it with the capacity to monitor itself continuously, and report to a monitoring authority, whether periodically, by exception, or continuously. Applying the dictum that 'information is power', this leads easily to a feeling of inevitability of the surveillance organisation becoming an all-powerful (omnipotent) being.
On the one hand, this is the stuff of science fiction, and the dystopian genre within sci-fi at that. On the other, most of the elements needed to realise the nightmare already exist.
Remarkable as it may seem, some categories of people are being inveigled, coerced and even mandated to submit to such a 'pan-electricon', particularly as a condition of employment, or in return for reduced constraints on the space within which the individual is permitted to move. Aspects of the 'digital persona' in contexts such as these are investigated in Clarke (2005b).
If the word 'überveillance' achieves broad currency, this may well be the primary interpretation that our children and grandchildren have of it. It remains somewhat speculative at this stage, however, and is sufficiently forbidding that many people are likely to remain 'in denial'. The following two alternative interpretations may therefore be of greater immediate value in investigating the idea and what we need to do about it right now.
One interpretation of 'überveillance' questions the extent to which surveillance is undertaken. This can be along various dimensions, as discussed in section 4 above. For example, surveillance may be excessive because it has too broad a scope, or is instigated for reasons that are minor in comparison with its negative impacts. In either case, its justification is exaggerated.
Surveillance has costs and disbenefits, and its benefits need to be balanced against them. The costs and disbenefits may be incurred by the organisation conducting the surveillance, or by others, particularly the individuals subjected to it.
The term 'costs' is used here in the financial sense, and includes all forms of expenditure, in particular on the conduct of the surveillance, on the infrastructure to support it, and on the analysis of the resulting data stream(s). It encompasses at least some of the costs of actions taken as a result of surveillance, in particular those actions that transpire to have been unjustified because they arose from 'false positives'.
The notion 'disbenefits' is used to encompass non-financial impacts that are negative, whether for the society, economy or polity as a whole, or only for some individuals or groups. The enormous scope of disbenefits arising from surveillance is exemplified by the list in Exhibit 1.
Exhibit 1: The Disbenefits of Surveillance (from Clarke 1988)
A crucial question in any organic system is the extent to which natural controls exist. If natural controls are in place and not seriously impeded, then the system may be best left to find its own equilibrium. If, on the other hand, the controls are retarded in a significant way, then some intervention may be needed, in order to overcome the impediments, or to stimulate the control aspects. In some settings, however, the system may be doomed to spiral out of control. In that case, the architecture is in need of overhaul if the system is to survive.
To what extent is surveillance an organic system, and which of those archetypes best describes it?
In Clarke (1995a), 'intrinsic controls' over the particular dataveillance technique of data matching were examined, and a number of such controls were identified.
That paper concluded that "the intrinsic factor which might be expected to exercise the most significant degree of control over computer matching is economics: surely government agencies will not apply the technique in circumstances in which it is not worthwhile. The primary means whereby the economic factor will influence decision-making about computer matching programs is cost/benefit analysis". The various forms of cost/benefit analysis are described in Clarke & Stevens (1997) and Clarke (2007b).
A mere decade later, that sentiment seems quaint. In the present decade, government agencies have barely adopted so much as a pretence of conducting cost/benefit analyses. They have become thoroughly politicised, and 'business cases' dominate. A 'business case' differs from a cost/benefit analysis in two important ways. It is one-dimensional, because it adopts the view of the sponsor, rather than reflecting the varying perspectives of multiple stakeholders. Secondly, it is essentially designed as a justification of a policy position that has already been adopted, rather than as an analytical tool.
In the surveillance arena, there has not only been little evidence of cost/benefit analysis being applied, there has seldom even been a compelling business case. The proponents of surveillance successfully avoid scrutiny of their proposals, especially since the windfall of the terrorist strikes in New York and Washington DC in 2001, and what marketers refer to as 'mid-life kickers' in Bali in 2002, in Madrid in 2004, and in London in 2005. Since 2001, surveillance has been implemented as an imperative, driven by those worst forms of policy-formation: knee-jerk reaction, bandwagon effect and sacred cow.
The biometrics industry provides a valuable case study. Most biometrics technologies cannot and do not deliver on their promises, partly because the environments in which they are applied are complex and messy, and partly because most biometrics technologies are technically flawed. It is therefore not in the interests of the providers of technologies and services to provide truthful information or to submit to evaluation.
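The gap between promise and delivery can be quantified with simple base-rate arithmetic. The figures below are illustrative assumptions chosen for this sketch, not measured vendor performance, but the structure of the calculation holds for any mass-screening scheme in which targets are rare:

```python
# Why mass-scale biometric screening disappoints: base-rate arithmetic.
# All figures below are illustrative assumptions, not vendor data.
population = 10_000_000      # people screened per year
targets = 10                 # genuine persons of interest among them
false_match_rate = 0.01      # 1% of innocents wrongly flagged
true_match_rate = 0.90       # 90% of targets correctly flagged

false_alarms = (population - targets) * false_match_rate
true_hits = targets * true_match_rate
precision = true_hits / (true_hits + false_alarms)

print(f"{false_alarms:,.0f} false alarms for {true_hits:.0f} genuine hits")
print(f"Fraction of alarms concerning a real target: {precision:.5%}")
```

Even with the generous accuracy assumed here, roughly a hundred thousand innocents are flagged for every handful of genuine hits, which is why the costs of acting on 'false positives' belong in any honest evaluation.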
Surely 'the truth will out', user organisations will discover that 'the emperor has no clothes', and the mythologies of surveillance will become common knowledge?
Instead, an extraordinary phenomenon has emerged, that has not been evident in other contexts - alliances of vendors and user organisations. The US national security community has contrived the publication of tests and reports that have been quite grossly twisted and biased, in order to provide biometrics vendors with breathing-space, and with credibility that their products do not warrant. The most extreme instance is in the laughably inadequate technology falsely projected as 'facial recognition'. The Face Recognition Vendor Test (FRVT) projects have been breathtaking in their misrepresentation of reality. They were jointly sponsored by a group including the Federal Bureau of Investigation (FBI), the National Institute of Standards and Technology (NIST) and the Department of Homeland Security.
In Australia, the corruption has been mirrored in the Biometrics 'Institute'. This organisation has a grand-sounding name, but its function is to provide a forum for the alignment of organisations whose interests, in an organic system, would be at considerable variance from one another. Government agencies and suppliers have conspired, and continue to conspire, to project biometrics technologies as things that they are not: effective, reliable, and safe for human consumption.
Corporations, unlike governments and government agencies, are subject to the constraints of return on investment (ROI). This somewhat tempers their enthusiasm for monitoring. For these reasons, the financial sector has long resisted imposing strong authentication on its customers. It also appears that the full power of consumer profiling and 'customer relationship management' technology may not yet have been unleashed on Australian consumers.
But ROI has proven inadequate to ensure rational designs. The private sector too makes decisions that are far from balanced, because knee-jerk and bandwagon outweigh rationality. In addition, there has been increasing pressure from Governments, using such 'motherhood and apple pie' sentiments as 'money-laundering', 'counter-terrorism', 'homeland' and 'critical infrastructure protection'. The 2006-07 rounds of 'Anti-Money-Laundering and Counter-Terrorism Financing' (AML-CTF) legislation represent one of the most extreme forms of exaggeration to date, with business enterprises now obligatorily enlisted as spies against their customers.
Another possible interpretation of 'überveillance' derives from the use of 'über' to imply 'meta', 'supra' or 'master'-surveillance.
This could involve the consolidation of multiple surveillance threads in order to develop what would be envisaged by its proponents to be superior information. This might be performed ad hoc, as occurs in 'intelligence assessment' agencies active in foreign affairs and national security.
The challenges are enormous, however. In particular, the data-flows are typically highly variable and highly unreliable. The bases on which they are conceived and implemented vary greatly between the streams. There may be very considerable differences between the aims of each individual operator and the would-be 'master'. The challenges of diversity in data sources, data meaning and data quality were investigated in the context of data matching programs in Clarke (1995b).
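The fragility described above can be seen even in a toy record-linkage example. The records and matching rules below are illustrative inventions; real data matching programs confront far messier variation in data meaning and data quality:

```python
def naive_match(rec_a, rec_b):
    """Exact-match linkage: treats any formatting difference between
    streams as a different person -- the fragility discussed above."""
    return rec_a == rec_b

def normalised_match(rec_a, rec_b):
    """Slightly hardened linkage: case-fold and strip punctuation and
    whitespace before comparing field-by-field."""
    clean = lambda s: ''.join(ch for ch in s.lower() if ch.isalnum())
    return all(clean(a) == clean(b) for a, b in zip(rec_a, rec_b))

# The same (hypothetical) person, as rendered by two different agencies:
taxpayer = ('CLARKE, Roger', '1 Example St')
licensee = ('Clarke Roger', '1 Example St.')

print(naive_match(taxpayer, licensee))       # the link is missed
print(normalised_match(taxpayer, licensee))  # the link is made
```

Normalisation rescues this contrived case, but it also manufactures spurious links between genuinely different people, which is precisely the trade-off between missed matches and false matches that bedevils consolidation of surveillance streams.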
In order to overcome the difficulties inherent in consolidating very different streams of information, there could be endeavours to achieve coordination among the various surveillance sources. An example of such an approach is the creation of an organisation whose express purpose is to draw surveillance organisations closer together. A prime example was the creation of the U.S. Department of Homeland Security (which in the process changed the sense of 'DHS' from Human Services to something differently protective and much more sinister).
An approach that might seem superior to both consolidation and coordination is centralisation. This involves the conception of an architecture intended from the outset to develop a set of feeds into a single 'master', with all of the subsidiary surveillance processes serving the centrally-determined objectives. Stafford Beer naively thought that a centrally-planned cyber-economy could be consistent with an open society and a democratic polity. The experience of Beer's Cybersyn project (1970-73) could have delivered the coup de grace to such Promethean idealism if Chile had not been seen to be acting against the interests of the American way of profit. Its elected President was eliminated, and with him Beer's experiment.
During the 1970s and 1980s, such 'central planning' approaches were derided. To some extent this was due to their totalitarian nature, demanding as they do a controlled and inherently static society. But the primary reason was that they had been demonstrated not only behind the Iron Curtain, in Cuba and under East Asian Communist regimes, but also in France, to lead to economic systems that were ineffective, inefficient and in most cases downright stagnant.
Up to a point, systems generally exhibit efficiencies of scale, and efficiencies of scope. Beyond that point, they become unwieldy, excessively complex, and inherently unmanageable. Systems of the complexity of societies are well beyond the flex-point. They accordingly exhibit substantial inefficiencies of scale and of scope. General systems theory recognises that, for large-scale systems to have the flexibility and adaptability that they need for survival, they need to comprise loosely coupled elements, and to be subject to control through the interplay of those elements rather than through any form of centrally-determined control.
The picture painted in the preceding sections may seem bleak. Surveillance is rampant. Human values have been trampled. Osama bin Laden and Al Qaeda, or rather the effigies that have been made of them, have triumphed. The limited and sporadic attacks in their names have struck at the moral weaknesses and contradictions inherent in the 'Western', 'democratic' world. That world has turned inwards on itself. It is spiralling towards self-destruction through the denial of the very freedoms on which it was supposed to be built. Our world needs an antidote to 'national security extremism', and it needs it fast.
This section distils a few key messages about what we need to do in order to ensure survival of society, the economy and the polity, in the face of rampant 'control freaks'. It enunciates a small set of Principles that will contribute to the restoration of Australian society by bringing the surveillance mania back under control. The intention is to generate countervailing power against the extremism of the national security agencies. In this context, a variant of the label 'countervaillance', namely 'counterveillance', is appropriate. Exhibit 2 lists the Principles, and the remainder of the section provides brief descriptions of them.
The position adopted in developing these Principles is not itself extremist. It is common ground across society that terrorists are killing people from time to time, that there are (small numbers of) disaffected individuals who will be attracted to violent 'solutions', that religious fundamentalism is a threat to open societies, that countermeasures are needed, and that both general alertness and capable public security institutions are needed.
Where this set of Principles might be seen by some to be radical is in the following:
Surveillance of the intensive kinds that are drastically altering our society is heavily dependent on technologies. The assertions of technologists and marketers must be viewed with scepticism, and subjected to testing. That testing must not be warped, and must not be conducted by participants in the field of play (such as the FBI, NSA, NIST, and, in Australia, the Defence Science & Technology Organisation - DSTO). Normal science and technology must be resumed. Rather than 'Government policy' driving and twisting outcomes, rational consideration of technologies and their applications is essential.
Some years ago, I called for a moratorium on biometric implementations in Australia (Clarke 2003). I did not do so idly. I argued that "[a] ban must be imposed on the application of biometrics technologies until and unless a comprehensive and legally enforced regulatory regime has been established". My rationale was not only that applications of biometrics had quite gross, negative impacts, but also that a moratorium might well be the only means of saving an industry that has promised much for years and delivered very little.
There are enormous impediments to the adoption of 'advanced technologies'. In the majority of cases, their dysfunctions are considerable, and the extent to which they achieve their primary objectives is in serious doubt. The identification and authentication schemes for the APEC meeting in Sydney were as much of a farce as the traffic control system that let The Chaser's convoy through beyond the point of embarrassment.
The antidote to inappropriate deployments of inadequate technologies is openness. The public needs facts about the context in which surveillance schemes are to be deployed. They need a statement of the scheme's objectives. They need to know sufficient about the design features that they can apply reasonable tests to the scheme's feasibility, and assess its effectiveness under varying circumstances. They need the opportunity to apply systemic reasoning, in order to evaluate whether the design features can give rise to the claimed benefits.
No measure should be implemented unless its negative impacts are demonstrated to be outweighed by its benefits. It seems extraordinary that a case has to be mounted in support of such a straightforward contention. Yet national security and law enforcement agencies (NS&LEAs) have been permitted to make untested assertions about both threats to public safety and the benefits of surveillance measures in addressing those threats. The sacred cow of blind trust in NS&LEAs has to be put to death. Those organisations must be required to present their arguments, and defend them in public.
A further critical aspect of an open society is the ability of the public to participate in the debate. This enables testing of the information and arguments. But it also brings the many perspectives of a complex society to bear on the information and the declared objectives.
Another form of normal service that needs to be resumed is the application of established techniques to the available information, in order to provide a basis for comparison among financial costs and benefits, qualitative factors, and risks (especially remote ones).
The technique of Privacy Impact Assessment (PIA) has been making headway during the last few years, and has now attracted support from such inherently conservative institutions as the Senate, the Privacy Commissioner and, in September 2007, the Australian Law Reform Commission (ALRC). An even broader notion of social impact assessment is crucial to the survival of an open society.
One of the key features of the vignettes was the existence of positive instances of surveillance, both for individuals and society. Surveillance is not itself evil. The problem has been the presumptuousness of its proponents, the lack of rational evaluation, and the exaggerations and excesses that have been permitted.
Proponents of surveillance have Design Principles that guide the creation of their systems. An alternative or complementary set of Design Principles is required, which guides the conception of schemes that do not threaten free society from within. Key examples include the following:
Nymity encompasses both anonymity and pseudonymity, and is addressed in depth in Clarke (1999a). Genuine anonymity precludes the link being discovered between an identity and the entity or entities using it. It carries with it the risk of non-accountability. With pseudonymity, the link can be made, but its effectiveness depends on legal, organisational and technical protections, to ensure that the link is not made unless pre-conditions are fulfilled.
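The distinction can be illustrated with a minimal sketch. The following Python fragment is purely hypothetical (the class name, the use of a warrant flag as the pre-condition, and the pseudonym-derivation step are all this author's illustrative assumptions, not a specification from the paper): a registry issues pseudonyms for day-to-day use, and permits the link back to the identity to be made only when the stated pre-condition is fulfilled.

```python
import hashlib
import secrets

class PseudonymRegistry:
    """Toy model of protected pseudonymity: the pseudonym-identity
    link exists, but is revealed only when a pre-condition (here,
    a valid warrant) is satisfied."""

    def __init__(self):
        # The link table is held by the registry operator alone.
        self._links = {}

    def issue(self, identity: str) -> str:
        # Derive the pseudonym from random bytes, so it carries no
        # information about the identity it stands for.
        pseudonym = hashlib.sha256(secrets.token_bytes(16)).hexdigest()[:12]
        self._links[pseudonym] = identity
        return pseudonym

    def reveal(self, pseudonym: str, warrant_valid: bool) -> str:
        # The link may be made, but only if the pre-condition holds.
        if not warrant_valid:
            raise PermissionError("pre-condition not met: no valid warrant")
        return self._links[pseudonym]

registry = PseudonymRegistry()
p = registry.issue("Alice")
# Transactions are conducted under the pseudonym; the link stays sealed.
try:
    registry.reveal(p, warrant_valid=False)
except PermissionError:
    print("link refused without a warrant")
print(registry.reveal(p, warrant_valid=True))
```

In genuine anonymity, by contrast, no `_links` table would exist at all, so no warrant could ever re-establish the connection; the legal, organisational and technical protections discussed above are what stand in for that missing table's absence.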
Restoring sanity to the processes whereby schemes are evaluated and designed is crucial, but far from sufficient. The depredations of the last 5 years are so great that rollback of the great majority of anti-freedom provisions enacted by Parliaments is necessary. The valuable Parliamentary Library catalogue of the actions of the federal parliament is frightening for its sheer length, even without consideration of its depth.
This is not to suggest that every provision of every amendment act must be overturned. National security and law enforcement agencies were, as they claimed, confronted by a variety of barriers that were accidental and inappropriate and needed to be overcome. On the other hand, inadequately brisk processes for the issue of warrants are not properly solved by creating extra-judicial warrants, but rather by a faster, online judiciary. And although telephonic interception warrants based on old, fixed-line numbering are inappropriate in the modern era of mobile phones, the balanced solution is person-based interception warrants, not the removal of controls.
A neologism can be a mere linguistic device intended to bring some intellectual richness to a discussion. The English word 'surveillance' derives from the French 'surveiller', or 'watch over', which in turn derives from the French sur- and the Latin vigilare. So 'überveillance' takes a somewhat ambiguous Romance stem and imposes on it an abrupt and authoritarian Germanic prefix.
There are multiple flavours of 'überveillance', none of them comforting to someone who lives in the real Australian world of moderate daily dangers from cancer, heart conditions and road traffic, and of minuscule dangers from terrorism.
Unfortunately, as this paper has shown, all of the interpretations of 'überveillance' are descriptive of another reality, and one that has become rapidly more pervasive in the few years since the turn of the present century. We are confronted by the twin extremisms of religious fundamentalists in Muslim garb, on the one hand, and men in short haircuts chanting the mantra 'national security', on the other.
We need to ensure that the national security fundamentalists, who have ruled our lives for the last 5 years, are treated with the same seriousness as the terrorist threat within Australia, and are encouraged to return to the professionalism of the 1980s and 1990s, and respect for the free society that Australians believe they live in. This country wants neither 'unter-veillance' nor 'überveillance'. It wants balance.
Masters A. & Michael K. (2007) 'Lend me your arms: the use and implications of humancentric RFID' Electronic Commerce Research and Applications 6, 1 (March 2007) 29-39
Michael M.G. (1999) 'The Genre of the Apocalypse: What are they saying now?' Bulletin of Biblical Studies 18 (July-December) 115-126
Michael M.G. (2003) 'The Canonical Adventure of the Apocalypse of John in the Early Church (A.D. 96 - A.D. 377)' Unpublished PhD Thesis, Australian Catholic University, 2003
Michael M.G. (2006) 'Consequences of Innovation' Unpublished Lecture Notes No. 13 for IACT405/905 - Information Technology and Innovation, School of Information Technology and Computer Science, University of Wollongong, Australia, 2006
Michael K., Johnston K. & Michael M.G. (2007) 'Consumer Awareness in Australia on the Prospect of Humancentric RFID Implants for Personalized Applications' Invited Industry Presentation, at the IEEE International Conference on Mobile Business, at http://merc.mcmaster.ca/mBusiness2007/
Michael K., McNamee A. & Michael M.G. (2006) 'The emerging ethics of humancentric GPS tracking and monitoring' Proc. Int'l Conf. on Mobile Business, 25th-27th July 2006, Copenhagen, Denmark, pp. 34-44
Michael K., McNamee A., Michael M.G. & Tootell H. (2006) 'Location-based intelligence - modelling behaviour in humans using GPS' Proc. Int'l Symposium on Technology and Society, 8th-11th June 2006, New York City, pp. 1-8
Michael K. & Masters A. (2006) 'Realised applications of positioning technologies in defense intelligence' in H. Abbass & D. Essam (eds) 'Applications of Information Systems to Homeland Security and Defense' IDG Press, ch. 7, pp. 167-195
Michael K. & Michael M.G. (2005) 'Microchipping people: the rise of the electrophorus' Quadrant XLIX, 3 (March 2005) 22-33
Michael K. & Michael M.G. (2006a) 'Towards chipification: the multifunctional body art of the net generation' Proc. Conf. Cultural Attitudes Towards Technology and Communication, 28th June - 1st July 2006, Tartu, Estonia, pp. 622-641
Michael M.G. & Michael K. (2006b) 'National Security: The Social Implications of the Politics of Transparency' Prometheus 24, 4 (December 2006) 359-363
Michael M.G. & Michael K. (2007) 'Überveillance: 24/7 x 365 People Tracking and Monitoring' Proc. 29th International Conference of Data Protection and Privacy Commissioner, at http://www.privacyconference2007.gc.ca/Terra_Incognita_program_E.html
Michael K. & Michael M.G. (2008) 'Innovative Automatic Identification and Location-Based Services: From Bar Codes to Chip Implants' IGI Press, Forthcoming, 350pp.
Perusco L. & Michael K. (2006) 'Control, trust, privacy and security: evaluating location-based services' IEEE Technology & Society Magazine 26, 1 (Spring 2007) 4-16
Perusco L., Michael K. & Michael M.G. (2006) 'Location-based services and the privacy-security dichotomy' Proc. 3rd Int'l Conf. on Mobile Computing and Ubiquitous Networking, 11-13th October 2006, London, England, pp. 91-98
This segment provides access to this author's previous papers on surveillance, indexed on his web-site.
Clarke R. (1988) 'Information Technology and Dataveillance' Commun. ACM 31,5 (May 1988) 498-512, and re-published in C. Dunlop and R. Kling (Eds.), 'Controversies in Computing', Academic Press, 1991
Clarke R. (1994a) 'The Digital Persona and its Application to Data Surveillance' The Information Society 10,2 (June 1994)
Clarke R. (1994b) 'Human Identification in Information Systems: Management Challenges and Public Policy Issues' Information Technology & People 7,4 (December 1994) 6-37
Clarke R. (1995a) 'Computer Matching by Government Agencies: The Failure of Cost/Benefit Analysis as a Control Mechanism' Information Infrastructure & Policy 4,1 (March 1995) 29-65
Clarke R. (1995b) 'A Normative Regulatory Framework for Computer Matching' Journal of Computer & Information Law XIII,4 (Summer 1995) 585-633
Clarke R. (1997) 'Chip-Based ID: Promise and Peril' Invited Address to a Workshop on 'Identity cards, with or without microprocessors: Efficiency versus confidentiality', at the International Conference on Privacy, Montreal, 23-26 September 1997
Clarke R. (1999a) 'Identified, Anonymous and Pseudonymous Transactions: The Spectrum of Choice' Proc. User Identification & Privacy Protection Conf., Stockholm, 14-15 June 1999
Clarke R. (1999b) 'Person-Location and Person-Tracking: Technologies, Risks and Policy Implications' Proc. 21st International Conference on Privacy and Personal Data Protection, pp.131-150, Hong Kong, September 1999. Revised version in Information Technology & People 14, 2 (Summer 2001) 206-231
Clarke R. (2001) 'While You Were Sleeping ... Surveillance Technologies Arrived' Australian Quarterly 73, 1 (January-February 2001)
Clarke R. (2005a) 'Have We Learnt To Love Big Brother?' Issues 72 (June 2005)
Clarke R. (2003) 'Why Biometrics Must Be Banned' Presentation at the Cyberspace Law & Policy Centre Conference on 'State Surveillance after September 11', Sydney, 8 September, Xamax Consultancy Pty Ltd, 2003
Clarke R. (2005b) 'Human-Artefact Hybridisation and the Digital Persona' Background Information for an Invited Presentation to the Ars Electronica 2005 Symposium on Hybrid - Living in Paradox, Linz, Austria, 2-3 September 2005
Clarke R. (2007a) 'Surveillance Vignettes' Xamax Consultancy Pty Ltd, September 2007
Clarke R. (2007b) 'Business Cases for Privacy-Enhancing Technologies' Chapter in Subramanian R. (Ed.) 'Computer Security, Privacy and Politics: Current Issues, Challenges and Solutions' IDEA Group, 2007
Clarke R. & Stevens K. (1997) 'Evaluation Or Justification? The Application Of Cost/Benefit Analysis To Computer Matching Schemes' Proc. Euro. Conf. Infor. Syst. (ECIS'97), Cork, Ireland, 19-21 June 1997
Wigan M. & Clarke R. (2006) 'Social Impacts of Transport Surveillance' Proc. RNSA Workshop on Social Implications of Information Security Measures upon Citizens and Business, Uni. of Wollongong, 29 May 2006, in Michael K. & Michael M.G. (Eds.) 'The Social Implications of Information Security Measures on Citizens and Business' Research Network Secure Australia, 2006, Chapter 2, pp. 27-44. Revised version published as Wigan M. & Clarke R. 'Social Impacts of Transport Surveillance' Prometheus 24, 4 (December 2006) 389-403
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., a Visiting Professor in the E-Commerce Programme at the University of Hong Kong, and a Visiting Professor in the Department of Computer Science at the Australian National University.
His primary consultancy expertise is in eBusiness and information infrastructure. He is also a longstanding privacy researcher, consultant, and advocate. He has been a Board member of the Australian Privacy Foundation since its inception in 1987, and is currently its Chair.
Created: 10 August 2007 - Last Amended: 27 October 2007 by Roger Clarke - Site Last Verified: 15 February 2009