Roger Clarke **
Version of 2 March 1997
Invited Address to CAUSE in Australasia '97, Melbourne, 13-16 April, 1997
© Xamax Consultancy Pty Ltd, 1997
Available under an AEShareNet licence or a Creative Commons licence.
This paper is at http://www.rogerclarke.com/II/EncoCyberCulture.html
Many of the challenges presented by the information infrastructure are not readily amenable to legislative and other hierarchical solutions. They require gentler, community-based measures as an adjunct to, and even an alternative for, formal regulatory action.
Communities in cyberspace need means of achieving cohesion and maintaining relationships, while avoiding unduly dysfunctional behaviour by community-members and outsiders. This paper's purpose is to investigate the means whereby such a 'cyberculture' can be brought about.
It commences by considering formal and semi-formal authority in cyberspace. It then discusses the processes and structures of electronic communities, including a series of mini-case studies of community behaviour in some recent contexts. Examples are provided of existing and emergent mechanisms whereby civilised behaviour can be encouraged. Some inadequacies in existing technologies are identified, and an approach suggested whereby future products, services, protocols and architectures can better support culture in cyberspace.
A culture exists when a group of people exhibits cohesion through the sharing of values, language, rituals and icons. 'CyberCulture' is used in this document to refer to the concept of a group or groups of people achieving cohesion by means of the information infrastructure.
For all practical purposes, 'information infrastructure' currently means the Internet. That may well change; but if the telcos persist with their broadcast-style 'cable-TV' philosophy, with high-bandwidth down and only low-bandwidth up the line, the Internet may remain as the only basis for CyberCulture to develop.
A series of questions present themselves. Are present Internet services adequate to support the development of culture? If not, are enhanced and new services in the offing that will support culture? Is the notion of a single culture relevant; or will we see the emergence of multiple cultures?
The Internet is at the crossroads between community and commerce. Can it be matured fast enough for the infrastructure to support both, with minimal disturbance by each of the other?
The Internet (aka 'the Electronic Frontier') is resisting formal authority. Is anarchy a tenable organisational form in the long-term?
Formal authority in the narrowly legalistic and jackbooted form that we regard as normal in physical spaces, may well prove unworkable in the virtual medium. Can we make do without it? Can CyberCulture deliver a sufficient set of equilibrating mechanisms?
This paper is a voyage of investigation. The community of netizens is struggling to come to terms with itself. This author is also struggling, and hence no apologies are offered for the many imprecisions of expression, for the many tentative, insufficiently analysed and argued statements, and for the many personal judgements, that are embedded in this paper. It is hoped that, by starting, he, and the community, will improve their understanding.
The audience to which the paper is addressed is participants in Internet communities, providers of Internet services, and executives and managers responsible for policies relating to information, information technology and information infrastructure, within universities, the public sector and private industry.
The paper commences with a brief review of formal and semi-formal authority in the Internet context, followed by comments on communities and community authority.
A series of mini-case studies is then assessed, in a search for commonalities in emergent CyberCulture. These include content-regulation, Spam and Cookies. Lessons are drawn from these cases, and suggestions made as to ways in which community-based control mechanisms can be encouraged.
The origins of the Internet lie in an experiment with a view to establishing an architecture that would be robust in the face of nuclear attack. Until the ARPANet, all networks were essentially centralised, and vulnerable to the disablement of a few key nodes and arcs (whether by military action or by accident). ARPANet, and its successors based on TCP/IP protocols (or, more generally, the Internet Protocol Suite), are far less vulnerable because the network identifies damage and congestion, and routes around it.
Internet architecture is therefore inherently non-centralised, and no single authority exists that 'runs' the Internet. Associated with this has been a state-of-mind that people commonly refer to as 'anarchic'. This is not used in the sense of 'chaos' or 'disorder', but simply in the political sense of the absence of direct or coercive authority.
Some people, notably John Perry Barlow, extend the argument to claim that the Internet is inherently a lawless 'electronic frontier'. The loose community centred on the cypherpunks e-list believes fervently that existing authority based on nation-states is undermined by the Internet. This position is described in Eric Hughes's 'A Cypherpunk's Manifesto' of March 1993, and summed up by Tim May's "national borders are not even speedbumps on the information superhighway".
A loose group who style themselves as 'cypherpunks' are dedicated to the proposition that this is a very good thing, and that anyone and anything that seeks to sustain the nation-state's powers, and to deny individual freedoms in relation to the use of the net, is to be strenuously fought against. For a review, see the account published in 1993 (revised 1996).
Authority in Cyberspace
Prior to the advent of the Internet, national governments have been facing difficulties with 'trans-jurisdictionality', i.e. business activities that cross national, or even just State, boundaries. Where elements of a transaction are quarantined in jurisdictions that do not recognise international conventions, the behaviour can be effectively 'extra-jurisdictional', in the sense that it is incapable of prosecution in any court of law.
Electronic commerce is currently lifting the art of regulatory avoidance to new planes. The term 'supra-jurisdictionality' usefully conveys the way in which business conducted in virtual marketspaces may be subject to no existing legal jurisdictions at all. The 'wild west' was eventually tamed, but it may be that the 'electronic frontier' will be even less capable of subjugation by formal legal architectures than is the kind of business currently conducted at least partly in regulatory havens.
Part of the difficulty that law enforcement agencies face is the lack of authentication mechanisms, and hence the inability to provide evidence that reliably associates an act in electronic contexts with a particular person or organisation. The technology exists to implement 'digital signatures'. In addition to technical and political difficulties, there is a considerable amount of resistance against the imposition of authentication, and especially against its embedment within Internet architecture.
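The 'digital signatures' mentioned above can be illustrated with a deliberately tiny, textbook-style RSA example. This is a sketch only: the primes, exponents and hash-reduction below are toy values chosen purely to show the sign/verify mechanics, and bear no resemblance to a deployable scheme.

```python
import hashlib

# Toy RSA-style signature, for illustration only. Real systems use very
# large keys and standardised padding; these small fixed values are
# utterly insecure, but make the mechanics visible.
P, Q = 61, 53          # toy primes
N = P * Q              # public modulus (3233)
E = 17                 # public exponent
D = 2753               # private exponent: (E * D) mod lcm(P-1, Q-1) == 1

def digest(message: bytes) -> int:
    """Hash the message and reduce it into the signing modulus."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes) -> int:
    """The signer uses the private exponent D."""
    return pow(digest(message), D, N)

def verify(message: bytes, signature: int) -> bool:
    """Anyone can verify with the public pair (E, N)."""
    return pow(signature, E, N) == digest(message)

sig = sign(b"I agree to these terms")
print(verify(b"I agree to these terms", sig))   # True
print(verify(b"I agree to other terms", sig))   # almost certainly False
                                                # (the toy modulus permits collisions)
```

The point of contention in the paper is not whether such mechanics work, but whether their imposition within Internet architecture is acceptable to the net-community.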
Such difficulties are seen by some people as signalling the ebb-tide of the power of the nation-state. Formal authority as we have known it during the last few generations in so-called 'advanced western nations' may not be the way of the future. The next section considers the extent to which semi-formal authority may be sustainable.
The Internet may not be the subject of a centralised authority; but it would not exist if there were no authority at all. The mechanism whereby the Internet is coordinated is a set of standards, and procedures whereby new and amended standards are drafted, discussed and adopted. The mechanism is run by an organisation called the Internet Engineering Task Force (IETF). Background is provided by ISOC's brief history of the Internet.
The IETF provides a description of its modus operandi (they prefer a shorter term from a different cultural tradition: 'tao'). The most critical element is a couple of series of documents, known generically as RFCs (originally 'Requests For Comments'). There are two main kinds: FYIs ('For Your Information' documents, which are descriptive and explanatory in nature) and STDs (the actual standards).
There are many other instances of semi-formal authority functioning successfully in the context of the Internet; for example:
In some cases, authority-schemes within the Internet are worn rather lightly, and honoured in the breach as much as the execution. But in other cases, they function, and need to function if the overall aim is to be achieved. Any claim that the Internet can or does entirely reject the precepts of 'law and order' is seriously suspect.
On the other hand, 'control' is not a primary sensation that the netizen experiences when strolling the netspaces, or surfing the web. It is necessary to examine the concept of 'community' in the Internet context.
To date, there appears to have been only limited discussion of 'culture' in the context of the Internet. Terms like 'electronic commerce', 'digital libraries' and 'electronic publishing' are much more common. A term which is close to the idea of CyberCulture is 'electronic or virtual community'. The most useful reference is still Howard Rheingold's 1994 book, 'The Virtual Community'. See also a recent Conference in Sydney on 'Creative Collaboration in Virtual Communities'.
There is a wide variety of bases whereby communities can come into being and sustain themselves. These include:
An example well-known to the author is a community of researchers in the Information Systems discipline, ISWorld Net. Some of the early history from its foundation in July 1994 until early 1996 has been chronicled. The community's population numbers 5-10,000 worldwide. The two primary media for participation are an e-list for announcements, which has 3-5,000 subscribers, and a set of some 200 community-service web-pages established and maintained by some 100 volunteers.
The community is driven by a leader/visionary, but the contributions are highly dispersed among the volunteers. The majority of the service-value has been provided by perhaps 2% of the overall community, but hundreds more have at least posted to the e-list, and many hundreds have accessed and in many cases bookmarked the web-pages. There are well over 1,000 hotlinks to ISWorld Net web-pages from other pages around the world.
The volunteer force is very heavily English-speaking, and virtually all of the content is in English. The great majority of volunteers are in North America, with modest numbers in Australia, the United Kingdom, New Zealand, and a scatter of Continental European and 'advanced' Asian countries.
Strenuous attempts have been made to ensure that the community services do not contain undue cultural biases. Given the strongly 'internationalist', but particularly Anglo-Saxon-American, style of the world's I.S. discipline, the attempt has achieved some success. There is, however, only limited and slow penetration in Continental Europe and advanced Asian nations (due to cultural concerns) and in less developed countries (due to slow emergence of the discipline there, mis-match between services and needs, cultural differences, and, importantly, infrastructure).
The initiative has been free-standing since its inception; but the possibility exists that it will forge an alliance, or formally join with, a more conventional professional association in the near future. If so, it will be negotiating from a position of strength, because of the enormous volume of electronic traffic it generates, and the extent to which it is perceived to be the life-blood of the disciplinary virtual community.
No-one appears to have yet proposed general principles regarding the behaviour of virtual communities, let alone explained how to contrive control mechanisms whereby they can be sustained.
In order to establish some basic guidelines, it is necessary to gather together some experiences of actual behaviour in particular contexts. This section provides brief reports on each of a number of such behaviours.
Some time ago, this author prepared a document which identified a range of what he called 'netethiquette cases'. This term was coined to draw the idea of etiquette in the context of the Internet together with the question of 'good, old-fashioned ethics'.
It is noteworthy that, in most of the 25-30 classes, and scores of specific examples, at least some form of countermeasure had quickly emerged, whereby the dysfunctional behaviour was being addressed, in many cases in a constructive manner.
Another area in which evidence exists of adaptation to need is in coping with insufficient capacity on the net as a whole, and on particular segments of the net.
A range of options is available whereby individual users, service providers, carriers and infrastructure designers can improve the efficiency with which bandwidth is used. Examples of the kinds of measures that are available include not sending attachments to e-lists, stopping unwanted browser downloads, offering limited-graphic alternative home-pages, use of thumbnail gifs, use of ASCII rather than space-wasting formats like PDF, progressive image formats (such as interlaced GIFs, progressive JPEG and the new PNG), caching at servers, and caching on workstations.
Guidance is provided by such sources as the Bandwidth Conservation Society, and Noel Jackling's list, distributed on the Australian link list in May 1996.
Such measures have, however, achieved little success. Legions of 'newbies' still tumble onto the net with almost no conception of conserving resources; the self-discipline of experienced users is very limited; bandwidth supply continues to grow fast enough to keep abreast of demand; and incentives for less profligate usage are difficult to create in the current context.
The problem appears more likely to be effectively addressed by emergent technologies and standards that support differential priorities according to the needs of the particular services (especially asynchronous services like email, as against essentially synchronous ones like video-conferencing). Some of these are to be trialled as part of the Internet II initiative.
Some of the transactions that people undertake generate a record that contains the identity of the parties. In many other cases, either no record is kept, or no identifying data is kept, or the data contains an incomplete or indirect identifier for the person involved. This author has written on the importance of anonymity and pseudonymity in consumer transactions.
The Internet offers (semi-accidentally) a range of ways in which a person can operate anonymously, using a consistent 'nom de net', or using continually changing aliases. One reason is that email protocols permit pretty much any string to be placed in the Reply-To field. Another is that many services were established largely as open services, with limited requirements to identify oneself; for example, ftp supports substantial anonymity, and gopher and the web have reasonably gentle identification norms.
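The point about email protocols accepting arbitrary header strings can be demonstrated with Python's standard email library (all addresses below are hypothetical): nothing in the message format authenticates what the sender asserts.

```python
from email.message import EmailMessage

# Mail headers are simply client-supplied text: nothing in the message
# format verifies that 'From' or 'Reply-To' names a real mailbox.
msg = EmailMessage()
msg["From"] = "anyone@example.org"            # an unverified claim
msg["Reply-To"] = "nom-de-net@example.net"    # any string the sender likes
msg["To"] = "list@example.com"
msg["Subject"] = "Greetings from a pseudonym"
msg.set_content("The headers above are asserted, not authenticated.")

print(msg["Reply-To"])   # nom-de-net@example.net
```

Authentication, where it exists at all, has to be layered on top of the message format, which is precisely why its embedment in the architecture is so contentious.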
In addition to the net's inherent openness, some specific services have been implemented that provide assistance to people seeking active obscuration of their identities. The most widely-discussed of such mechanisms is so-called 'anonymous remailers', which strip off the identifier of the sender of an email message, and forward it on to the intended recipient(s).
In practice, many of these would be better described as 'pseudonymous', because the server maintains a linkage between the inbound and outbound messages, in the form of an audit trail. If this linkage can be acquired by an investigator (e.g. under legal compulsion such as a search warrant, or because the agency involved, like the Finnish Police and many other law enforcement agencies around the world, including Australia's, does not need a warrant), then the expected anonymity is not delivered.
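The pseudonymity weakness described above can be sketched in a few lines, with hypothetical names: the remailer strips the sender's address from the forwarded copy, but its internal audit trail preserves exactly the linkage an investigator would demand.

```python
import itertools

class PseudonymousRemailer:
    """A minimal sketch of a first-generation remailer, not a real service."""

    def __init__(self):
        self._counter = itertools.count(1)
        self.audit_trail = {}          # alias -> real sender: the weakness

    def forward(self, message: dict) -> dict:
        alias = f"anon{next(self._counter)}@remailer.example"
        self.audit_trail[alias] = message["From"]
        out = dict(message)
        out["From"] = alias            # identity stripped from the copy sent on
        return out

remailer = PseudonymousRemailer()
sent = remailer.forward({"From": "alice@example.org", "Body": "hello"})
print(sent["From"])                        # anon1@remailer.example
print(remailer.audit_trail[sent["From"]])  # alice@example.org
```

Later generations of remailer, as Goldberg, Wagner & Brewer describe, are designed so that no single server holds this mapping.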
In a recent paper on privacy-protective technology, Goldberg, Wagner & Brewer document several generations of remailer technology, which are progressively addressing the original scheme's various weaknesses.
During the mid-1990s, there have been moves throughout the world to apply censorship laws to the Internet. The developments are traced in a companion document by this author. The history of attempts at Internet regulation leads to the following conclusions:
Spam is a term for unsolicited electronic communications. The history and consequences of spam are considered in a companion document by the author. This leads to a number of inferences:
A cookie is a record that is written onto the local drive of the web-browser, as a result of a command issued by a web-server. Each record has a long key, which is likely to be unique to a given application. When the user accesses a relevant page at a later date, the web-server causes the web-browser to read the record and transmit it to the web-server.
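The exchange described above can be sketched with Python's standard cookie parser; the key value below is a hypothetical example.

```python
from http.cookies import SimpleCookie

# Server side: issue a long, application-unique key in a Set-Cookie header.
server_cookie = SimpleCookie()
server_cookie["visitor_id"] = "a1b2c3d4e5f6"       # hypothetical key
set_cookie_header = server_cookie["visitor_id"].OutputString()
print("Set-Cookie:", set_cookie_header)            # visitor_id=a1b2c3d4e5f6

# Browser side: the record is stored on the local drive; on a later visit
# to a relevant page, the browser echoes it back in a Cookie header.
browser_jar = SimpleCookie()
browser_jar.load(set_cookie_header)
print("Cookie:", browser_jar["visitor_id"].value)  # a1b2c3d4e5f6
```

The privacy concern is that this round-trip happens without any action by, and usually without the knowledge of, the person at the browser.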
The history and implications of cookies are considered in a companion document by the author. This leads to the following observations:
During recent years, discussions of ethics have been fashionable. MBA schools, for example, have been stressing ethics as a replacement buzz-word for worn-out and discredited phrases like 'corporate social responsibility' and 'corporate citizenship'.
A first observation drawn from the above cases is the lack of evidence that the study of ethics is of any relevance to instrumentalists. That is to say: ethics is a useful basis for armchair discussions about behaviour; but it has little or no impact on behaviour itself, and provides little guidance to us as we endeavour to encourage and reinforce reasonable behaviour, and disincentivise dysfunctional behaviour.
A clear conclusion from the case studies is that Internet services and products contain loopholes that can be easily exploited by commercial interests, and by people interested in making nuisances of themselves.
Individuals, in their roles as community-members and as consumers, need the means to make themselves as available and as unavailable to others as they see fit. This implies the need for enhancements, and in some cases re-conceptualisation, of products and services, of protocols and of architecture, to serve the community need.
The case studies provided many instances in which adaptation was evident, as a result of behaviour being perceived as being dysfunctional. Evidence of adaptation was apparent from both net-communities on the one hand, and corporations and individuals seeking to exploit the Internet for commercial gain on the other.
Various kinds of adaptation are occurring at several levels:
To date there has been little evidence of the commercial sector establishing and policing codes of practice that embody balance among the various interests. This may simply be a matter of corporate inertia; for example, the Australian Direct Marketing Association (ADMA) has codes relating to unsolicited mail and tele-marketing, but, far from having a code on unsolicited e-mail, does not yet even have a web-address or an email-address.
The cases suggest that measures that depend on understanding and performance by individuals are feasible, provided that the people concerned are reasonably 'net-savvy', and perceive themselves as belonging to a community. The feasibility declines steeply as the numbers of people involved increases.
Another notable omission from the case studies is any serious usage of the established rational principles whereby risks can be identified, their nature assessed, and the harm mitigated through pre-emptive actions.
Members of the net-community need to undertake such studies. Where problems are being caused by commercial interests, members of net-communities also need to communicate the concerns to the perpetrators. Only in this way will corporations, industry associations and governments perceive the concerns to be risks to their own revenues, customer-image and freedom from excessive regulation.
At first, people regarded the Internet as the basis for a new form of community. During the last couple of years, the tendency has been to think in terms of many communities rather than just one.
There is a need to apply some of the principles of community. Perhaps the term itself is insufficiently rich to convey the meanings; and hence additional or alternative terms should be invoked, such as 'neighbourhoods'.
Meanwhile, what this paper has loosely referred to as 'commercial interests' have recognised the Internet's potential as a means of reaching out to prospects, and converting them into customers. The rhetoric of some businesspeople has implied that the 'community' notion was naive and transitory, that the Internet is in the process of conversion into a fully commercial form, and that communities have no place (space?) on the mature net.
That perspective is just as naive as the view that the Internet could deliver a utopian, advertisement-free, funded-by-magic, eternally-friendly medium. In the physical world, commerce co-exists with community; businesspeople (or, more commonly perhaps, their spouses) invest considerable efforts in non-revenue-earning community activities.
The cases suggest that community and commerce are locked in mortal combat. It is important that all players quickly mature beyond simple-minded adversarial stances. There is plenty of scope for Internet products and services to be enhanced in ways that support co-existence of community and commerce, and reconciliation between the two mind-sets, and the two kinds of roles that many people will play in cyberspace.
This paper has been long on generalities, and its thesis can be too easily dismissed as being as impractical as laudatory speeches about motherhood. This section draws together a few specific suggestions as to how Internet users can work towards a mature form of CyberCulture.
There is a burning need to build on our understanding of culture more generally, and map culture onto the new medium. People in cars don't see people; they see other cars. We need to find ways to help people see other people when they use the net, analogous to enlarging windscreens and providing driver-to-driver voice-communications.
We need to establish a theory or theories that will enable us to describe, explain and predict human behaviour in Internet contexts. The author has attempted a preliminary outline of what such a theory of CyberCulture might look like. We need anthropologists to invest more time observing people's behaviour and interpreting what they see.
Laws are likely to be inadequate to control human behaviour on the net. But then that inadequacy already exists in physical communities; otherwise there would be no murders, extortion or defamation.
We need to establish codes of behaviour and guidelines, whose primary purpose is education rather than retribution and sanction. These need to be implemented through Internet Services Providers (because they provide the means whereby people gain access to the net); but they need to be communicated, by behaviour, by example, and by reminder, by net-users to one another.
The inference was drawn from the case studies that all Internet products and services, and indeed the underlying protocols and architecture, contain loopholes that can be readily exploited by commercial interests, and by community-members interested in making nuisances of themselves.
Without demeaning the valuable work of the engineers who invented and sustain the net, we need to identify the inadequacies, and work constructively towards enhancements and replacements. We need to invest in 'instrumentalist anthropology'. By this I mean that we need to send in the anthropologists and tell them not merely to observe the natives, but to analyse their requirements and devise alternative conceptual designs.
The following sub-sections suggest a few aspects of existing products that are ripe for improvement. This is, of course, a pitifully inadequate catalogue. A project needs to be resourced and co-ordinated that extends far beyond the meagre capabilities of one author in one small segment of one paper.
Outgoing mail needs to be filtered, in order to detect and draw attention to potentially dysfunctional messages, such as those including multiple 'flame'-words, especially when addressed to multiple people or listservers; and lengthy messages or attachments being sent to multiple people or listservers.
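Such an outgoing-mail filter might be sketched as follows; the word-list and thresholds are illustrative assumptions, not established norms.

```python
# A sketch of a client-side filter that warns before sending potentially
# dysfunctional mail. The flame-word list and limits are hypothetical.
FLAME_WORDS = {"idiot", "moron", "clueless"}

def warnings_for(message: str, recipients: list[str]) -> list[str]:
    """Return human-readable warnings; an empty list means 'send freely'."""
    words = message.lower().split()
    flames = sum(1 for w in words if w.strip(".,!?") in FLAME_WORDS)
    alerts = []
    if flames >= 2 and len(recipients) > 1:
        alerts.append("multiple flame-words addressed to multiple people")
    if len(message) > 10_000 and len(recipients) > 1:
        alerts.append("lengthy message sent to multiple recipients")
    return alerts

print(warnings_for("You idiot, any moron knows this!",
                   ["a@x.org", "list@x.org"]))
# ['multiple flame-words addressed to multiple people']
```

The intent is educative rather than prohibitive: the sender is alerted, not blocked.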
We need default email templates that contain 'Dear XXX' openings and 'Regards ... <my-name>' closings, and which are designed to replace the barrenness of computer interfaces with an atmosphere of human communication.
In most listserver software, the Reply-To parameter defaults to reply-to-list. This results in pollution of the list by respondents who are unaware of the impact of the 'reply' function. Many list-managers are unaware of the existence of the parameter, let alone its default setting. Listservers need to have conservative defaults, and list-manager interfaces that walk the 'newbie' list-manager through the decisions that they need to make in order to establish a neighbourly e-list.
People who wish to subscribe to or unsubscribe from e-lists frequently send their messages to the list rather than the administrative address. Such messages (whether syntactically correct or not) need to be detected by the listserv, and deflected from the list to the list-management software and/or the list-manager.
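The deflection of administrative requests might be sketched as a simple pattern test on the opening line of each post; the patterns are illustrative only.

```python
import re

# Posts that look like administrative requests are routed to the
# list-management address instead of being broadcast to the list.
ADMIN_PATTERN = re.compile(
    r"^\s*(un)?sub(scribe)?\b|^\s*signoff\b", re.IGNORECASE)

def route(post_body: str) -> str:
    """Return 'list' for ordinary posts, 'list-admin' for admin requests."""
    stripped = post_body.strip()
    first_line = stripped.splitlines()[0] if stripped else ""
    return "list-admin" if ADMIN_PATTERN.search(first_line) else "list"

print(route("unsubscribe me please"))       # list-admin
print(route("Has anyone read Rheingold?"))  # list
```

Catching even syntactically incorrect requests (as the text suggests) would need looser patterns than these, at the cost of occasional false deflections that the list-manager can simply wave through.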
There appear to be limited options between unmoderated and fully moderated. Filters are needed, such that messages that fail some basic filtering tests are deflected to the list-manager for approval. Examples of such tests include multiple occurrences of 'flame'-words, occurrences of 'spam'-indicative strings (e.g. 'special offer'), attachments, length, and postings by non-subscribers.
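A graduated moderation filter of the kind suggested above might look like this; the word-lists and limits are illustrative assumptions.

```python
# A sketch of graduated moderation: rather than a binary
# moderated/unmoderated switch, posts failing basic tests are held
# for the list-manager's approval. All values are hypothetical.
FLAME_WORDS = {"idiot", "moron"}
SPAM_STRINGS = ("special offer", "make money fast")
MAX_LENGTH = 20_000

def needs_approval(post: dict, subscribers: set[str]) -> list[str]:
    """Return the reasons a post should be deflected to the list-manager."""
    body = post["body"].lower()
    reasons = []
    if sum(body.count(w) for w in FLAME_WORDS) >= 2:
        reasons.append("multiple flame-words")
    if any(s in body for s in SPAM_STRINGS):
        reasons.append("spam-indicative string")
    if post.get("attachments"):
        reasons.append("attachment")
    if len(body) > MAX_LENGTH:
        reasons.append("excessive length")
    if post["from"] not in subscribers:
        reasons.append("posting by non-subscriber")
    return reasons

post = {"from": "stranger@example.org",
        "body": "Special offer! Act now.",
        "attachments": []}
print(needs_approval(post, {"member@example.org"}))
# ['spam-indicative string', 'posting by non-subscriber']
```

An empty reasons-list means the post flows straight to the list; anything else queues it, keeping the list-manager's workload proportional to the misbehaviour.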
Even within the World-Wide Web, there are elements that are mechanistic rather than neighbourly. We need to embody within browsers humanised metaphors such as 'enquiring' rather than mechanistic terms such as 'searching'; and 'visiting' rather than 'fetching'; and 'drawing attention to' rather than 'hotlinking' or 'pointing to'.
This section has suggested a few potential improvements in those services with which the author is most familiar. Similar, and deeper, analyses are required of all Internet services, and particularly those that are most open to dysfunctional behaviour, such as newsgroups, MUDs, IRC and web-chat.
The analysis of cyberculture that we all need, such as that outlined in this author's tentative paper, would give rise to a fresh approach to Internet services. Rather than assuming that the first rush of products and protocols, which we are still using, is in any sense 'right' or 'natural', we need to analyse our needs, conceptualise designs that meet them, and design, construct and deploy them.
One need is for a suitably rich set of integrated communications alternatives. The telephone is inherently 'synchronous', in the sense of requiring both parties to be involved at the same time. We need asynchronous communications tools (which are efficient for 'leaving messages'), synchronous ones (when the benefits of a conversation justify the interruption to the called party), and means of moving between the two.
Another need is for the idea of 'threads' (in the sense of the word used in email and newsgroup archives) to be much more directly and intuitively supported. The need is for the composition of a message, and the commencement of a conversation, to stimulate the availability of the stack of previous interactions between the parties.
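Standard mail headers already carry enough information to support this. The sketch below, with hypothetical message-ids, reassembles the stack of previous interactions by walking In-Reply-To links back to the start of a conversation.

```python
def thread_of(messages: list[dict], msg_id: str) -> list[str]:
    """Walk In-Reply-To links back to the start of the conversation."""
    by_id = {m["Message-ID"]: m for m in messages}
    chain = []
    current = by_id.get(msg_id)
    while current is not None:
        chain.append(current["Message-ID"])
        current = by_id.get(current.get("In-Reply-To"))
    return list(reversed(chain))     # oldest first

archive = [
    {"Message-ID": "<1@example.org>"},
    {"Message-ID": "<2@example.org>", "In-Reply-To": "<1@example.org>"},
    {"Message-ID": "<3@example.org>", "In-Reply-To": "<2@example.org>"},
]
print(thread_of(archive, "<3@example.org>"))
# ['<1@example.org>', '<2@example.org>', '<3@example.org>']
```

The gap is not in the protocols but in the tools: composing a reply should surface this chain automatically and intuitively, rather than leaving it buried in archives.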
Another fresh start that we need is to conceive the roles of text, image, sound and video in network-mediated inter-personal communications. The successive failures of video-phone and video-conferencing services to grab the public's imagination could be because, as they have been hitherto conceived, they fail to address a real human need.
The sense of community on the net may prove to have been a short, unsustainable burst of goodwill, fated to be superseded by apathy and crass commercialism; a modern counterpoint to utopias in places as promising as Gauguin's Pacific Islands, and as unlikely as Bolivia, and to the idealism that was the popular communism of the 1920s.
For the sense of community to be sustained, in parallel with commercial applications, it is essential that the pioneers turn their efforts to the specification of follow-on products, services, protocols and architecture that support human communication and human use of electronic tools, and discourage unneighbourly behaviour.
The purpose of this paper has been to argue the need for, and possibility of, much more culturally aware design features in Internet services. Further, the attempt has been made to provide some pointers towards the ways in which improved functionality, and reduced dysfunctionality, can be achieved.
There is a fundamental need for net-based services of all kinds to be impregnated with icons and rituals that reflect human relationships and culture. The explosion of first desk-top publishing and then the web has seen the skills of computer science complemented by those of visual design specialists. We now need to blend in the insights of anthropology, of culturally oriented social psychology, and of normal, thinking, feeling people.
This paper is an outgrowth from a substantial amount of prior work in the area, together with ongoing projects. It is intended to be a living document, to be further developed for re-presentation at appropriate venues in more mature form. In addition to references provided in the body of the paper, sources have included:
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.
Created: 18 December 1996 - Last Amended: 2 March 1997 by Roger Clarke - Site Last Verified: 15 February 2009