Managing Information Technology's Organisational Impact, II
Elsevier / North-Holland, Amsterdam, 1991

edited by
Roger Clarke
Australian National University
Julie Cameron
Organisational and Technological
Solutions Pty Ltd
Sydney, Australia

Production Edited and Typeset by ANUCOM Research Centre,
Department of Commerce, Australian National University, Canberra
using MS Word, MacDraw, Apple File Exchange and OmniPage
on Macintosh SE/30, IIsi, Apple Scanner and Apple LaserWriter
from IBM-compatible and Macintosh diskettes and emailed originals

© Elsevier / North-Holland, 1991

There is a serious shortage of literature which deals not directly with IT artefacts, but with their impact on organisations, and the management of that impact. The papers in this collection are not concerned with 'technology' in the narrow sense of hardware, systems software and application software, but instead tackle the larger and more challenging issue of 'technology-in-use'.

The primary target audience for this volume is IT-aware user managers and user-aware IT managers, and students in courses which address the needs of such people. Most of the papers were presented at an international conference held in Adelaide in October 1991. It was the second in a series under the banner Shaping Organisations Shaping Technologies (SOST), a title chosen to reflect the need for a revised sequence, whereby technology ceases to be the driving factor, and is instead shaped by the needs of organisations. The first such Conference was held in May 1989, in Terrigal, near Sydney, and gave rise to the first volume in the series*.

All papers were subjected to rigorous refereeing, which is particularly important where a significant proportion of the collection is contributed by practising managers and other non-academics. The response of the authors to the abundant constructive criticism was very positive, and the quality of the final product is high. An analysis of the authors' backgrounds shows 26% to be practising IT managers, 8% IT consultants, 11% user managers, 11% observers and regulators, and 44% IT academics.

The Editors are pleased with the results, and express their thanks to the Conference Sponsors - the Australian Computer Society, IFIP TC9 and Microsoft, to Peter Campbell and other staff of the ANUCOM Research Centre, and most of all to the authors.

Roger Clarke					Julie Cameron
Department of Commerce,			Organisational and
Australian National University		Technological Solutions
Canberra					Pty Ltd, Sydney
13 November 1991


An Argument for 'Smart' Financial Transaction Cards in the Australian Payments System - Michael Walters

The Extended Parliamentary Network - Geoff Harber and Roger Webb

Case Study: The Australian Taxation Office Electronic Lodgment Service - John Ryan

Business Transformation Through Office Automation? - Ingo Rheinbay


Information Planning in a Large Government Agency - Graham Coote and Carol-Ann Gough

Integrating Information Technology with Corporate Strategy - Richard Warneminde

Generating Business Ideas Based on Information Technology - Hubert Österle


The Changing Face of Data Administration - Paul Loring and Catherine De Garis

An Archivist's View of the Long-Term Management of Electronic Data - Kathryn Dan

Data Ownership - Marcus Wigan


Is the Information Systems Community Wrong to Ignore Formal Specification Methods? - Paul A. Swatman and Paula M.C. Swatman

The Role of Negotiation Strategies in Software Development Project Selection - Michael McCrae

User Satisfaction Surveys: A New Zealand Study - Pak Yoong


Selection Criteria for CASE Tools - Geoff Beckworth

Software Metrics for the Management of CASE-Based Development - June Verner and D. Ross Jeffery


The Elastic Limits of Vulnerability: The End of the Good Times for the IT Industry - Simon Davies

Service Continuity Planning - Klaus Brunnstein

The Computerized Supervisor - A New Meaning to 'Personal Computing' - James Rule and Peter Brantley

General Systems Theory Can Bridge the Gaps of Knowledge Among IT-Security Specialists - Louise Yngström

Australian Federal Privacy Laws and the Role of the Privacy Commissioner in Monitoring the Data Matching Program - Paul Kelly



This first section of the collection brings together papers which deal with specific information technology forms or capabilities. In all cases, however, the primary concern of the authors is not to discuss the technology itself, but rather to show how the technology is being, or is capable of being, applied and fashioned to the needs of the particular organisation or cultural setting.

The first paper, 'An Argument for 'Smart' Financial Transaction Cards in the Australian Payments System', by Michael Walters, begins with an international history of the various kinds of payments cards, including definitions and a classification scheme, and a description of the functions the cards perform. The manner in which magnetic-stripe cards have been applied to financial transactions is discussed, the various kinds of special-purpose banking terminal described, and weaknesses in the technology identified. The functionality of chip-cards is then explained, and the history traced of both technical developments and financial applications.

The remaining sections of the paper provide background to Australia's present payments system, propose a structure and process whereby smartcards could be applied, and present a financial evaluation of the proposal. The author concludes that, despite up-front costs of the order of a half-billion US dollars, the payback period would be less than 18 months.

The paper is a goldmine of information, and is readily accessible by the layman. Although some of the details of the proposal and evaluation are specific to the Australian context, the general argument is applicable in most countries. Walters warns, however, that considerably more detailed evaluation is needed, not only in relation to financial costs and benefits, but also consumer acceptance, distribution of the payback among the various parties involved, and social factors such as dispute resolution, employment impact, and regulatory and privacy issues.

Geoff Harber and Roger Webb's paper on 'The Extended Parliamentary Network' describes the provision of services to the electorate offices of 250 parliamentarians, dispersed throughout the Australian continent, through the application of the PTT's Integrated Services Digital Network (ISDN).

Within the precincts of Parliament House in Canberra, a substantial set of electronic services is available to elected representatives and their staff, including Videotex noticeboard services; on-line indexes to, and full-text retrieval from, primary sources and up-to-the-minute domestic and international news from Australian Associated Press (AAP); a library catalogue; email; and administrative data processing systems.

The use of information technology by staff in parliamentarians' local offices (and by some parliamentarians) is increasingly sophisticated, and the standard PC configuration in electorate offices is currently 80386-based. A trial project to provide remote access to Parliament House services, even though it used only 2400-baud dial-up communications and primitive interfacing software, was rated by the participants as 'successful', and the linkage as either 'essential' or 'highly desirable'.

The paper describes the manner in which ISDN is being used, the criteria whereby it was chosen over other alternatives, and the services which are being provided. The next step is to assess whether the extended parliamentary network will be merely a tool of administrative efficiency, or will be moulded by parliamentarians into an active component of the political process.

In the paper entitled 'Case Study: The Australian Taxation Office Electronic Lodgment Service', John Ryan describes an application of electronic data interchange (EDI). The agency responsible for collection of all income tax in Australia is part-way through a 'modernisation program' which is renewing its organisational structures and processes, and its information technology, in an integrated manner. Lodgment of taxation returns by thousands of tax agents on behalf of millions of taxpayers has recently become possible by magnetic media and electronic means. In the first full year of operation a 30% penetration was achieved, with 24 software producers and 4800 tax agents actively participating.

The paper describes the mechanics of electronic lodgment, reviews the development process, identifies problems encountered, discusses the impact on staff, and outlines the future directions of the scheme. The scheme has significantly reduced the efforts and costs of both the Office and tax agents, improved the turnaround time of tax returns, and enabled the agency to divert large numbers of staff from administrative to audit responsibilities. Such a successful application of information technology has potentially dramatic impacts, and proactive management has been vital, to ensure that the benefits are gained without undue negative implications for any of the parties involved.

Ingo Rheinbay, of the State Bank of N.S.W. in Sydney, asks the question 'Business Transformation Through Office Automation?'. He surveys the popular management literature on Office Automation and its benefits, separately classifying document management, group work support, decision-making support systems and transaction processing. Both financial and indirect benefits are discussed.

In identifying the hurdles which have to be overcome to reap these benefits, Rheinbay stresses the importance of organisational factors, change management and investment in human capital. He concludes that the label 'office automation' is not an appropriate descriptor, because it implies that existing practices are merely converted from manual to computerised form. In fact, successful application of OA involves changes in work practices and organisation, demanding a more dynamic term such as 'business transformation', and active, imaginative management measures.


A primary theme of this series of volumes is that effective use of IT does not just happen: changing technology changes organisations; and careful planning is needed to ensure that IT benefits the organisation and its stakeholders.

Successful organisations appreciate the added-value potential of IT-based systems. They exploit indirect and innovative business uses of IT which were not part of the purpose of the original system, but which emerge from it. For example, a business function of mutual interest to the organisation and its client, like an airline reservation system, can be expanded into a car and hotel booking service, provided that executives recognise serendipity when they see it. Among the important factors in the strategic planning of IT are the reasons and motivations for planning, the methodology and processes used, the implementation of the resulting plans, the integration of IT with the organisation, and the ability to understand the strengths, weaknesses, opportunities and threats associated with different approaches to applying IT. The three papers in this section deal with various aspects of these issues.

The first paper, by Graham Coote and Carol-Ann Gough, provides a case study of 'Information Planning in a Large Government Agency'. As is occurring in the private sector, government organisations are increasingly recognising the commercial value of their data. The Government of Queensland, following an IT review, has implemented information technology planning throughout the service. This paper describes the rationale and methodology used. It also analyses the effectiveness of the process, by measuring the results against the organisation's critical success factors. This final phase of the cycle is extremely valuable, but so often forgotten.

Richard Warneminde's 'Integrating Information Technology With Corporate Strategy' draws attention to the common perception of information technologists as 'outsiders' who come in and 'install' systems in 'my' territory, and argues that this is a primary reason for unsuccessful systems. IT professionals require a far greater understanding of change management factors such as power, and organisational and professional cultures.

As an added complication, as a technology matures, the users begin to understand its strengths and weaknesses. They become less dependent on technicians. This may lead to healthy co-operation, or unhealthy rivalry. Experience is suggesting that co-location of IT and users in development teams and placement of IT 'support' in user areas is very effective. The dangers of mutual misunderstanding can be addressed by people with hybrid IT-and-business skills-sets, but these people are often not recognised or valued by organisations. The paper suggests some practical ways of enabling IT professionals to cope with these potential conflicts, to 'belong' and to gain acceptance. An addendum contains some additional points raised during the workshop which followed the paper's presentation.

'Generating Business Ideas Based on Information Technology' is an exciting combination of strategic planning and lateral thinking, using IT as the catalyst. Hubert Österle proposes a new method of exploring business concepts and discovering business opportunities, working in combination with clients. Applied this way, IT can move beyond its original role as a tool for automating existing systems, and even beyond its more recent function as a tool for implementing pre-decided corporate strategy, and become an 'enabler' of corporate planning.

New IT has the potential to produce new 'products' - new ways of doing business. But frequently that potential is unrealised because the opportunities inherent in IT remain undetected. The paper proposes an analysis of the client relationship by juxtaposing the business functions of each organisation, and considering the relevance of current and emerging technologies. The process described is practical, and could be applied in most organisations. It could be used as a part of the corporate strategic planning cycle to generate new ideas. It requires a facilitator who is business- and IT-literate, the commitment of key managers from both partners, especially from the marketing and client servicing functions, and a pair of business-oriented IT managers.

Together, these papers offer a positive report on the result of a previous phase of IT strategic planning; a positive approach to dealing with the recurrent problem of culture clash between IT professional and user; and a positive approach to using IT as a means of generating strategic opportunities.

Addendum to Richard Warneminde's Paper
Following the presentation of this paper, the delegates formed workgroups of five to discuss issues arising from it. This addendum presents the key points arising from the workgroup reports.

IT Strategic Planning

IT People and Change

IT People as Outsiders

IT People

The new capabilities being delivered by IT are significantly increasing the potential of existing data collections. The development of telephone directory services provides a readily appreciated example. A conventional telephone book contains the names, addresses and telephone numbers of the customers of a telephone company, but it provides access only on the basis of name, initial(s) and, in large directories, at least some information about where the customer lives. 'Electronic White Pages' services offer considerably more flexible access to the same data.

Electronic data management does more than just increase the usefulness of individual databases. It also provides the opportunity to compare data from different sources to assess accuracy or to provide electronic dossiers about individuals. For example, the electoral roll can be compared with the telephone book to assess accuracy of name and address; and information about purchases can be linked to information about age, income, occupation and other data to assist marketing strategies.
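The kind of cross-source comparison described above can be sketched in a few lines of code. The following is a purely illustrative sketch, not drawn from any of the papers in this volume: two invented record collections (an 'electoral roll' and a 'telephone directory') are matched on a crudely normalised name-and-address key. All record contents, field names and function names here are hypothetical.

```python
def normalise(name, address):
    """Reduce a name and address to a crude, case-insensitive comparison key."""
    return (name.strip().lower(), address.strip().lower())

def match(roll, directory):
    """Return pairs of records that share a normalised name-and-address key."""
    index = {normalise(r["name"], r["address"]): r for r in roll}
    matches = []
    for d in directory:
        key = normalise(d["name"], d["address"])
        if key in index:
            matches.append((index[key], d))
    return matches

# Invented example data: one person appears in both sources,
# with trivial differences in capitalisation and spacing.
roll = [{"name": "J. Citizen", "address": "1 High St"}]
directory = [{"name": "j. citizen", "address": "1 High St "},
             {"name": "A. Nother", "address": "2 Low St"}]

print(len(match(roll, directory)))  # one record found in both sources
```

Even this toy version shows why the practice raises the concerns discussed below: the match is made mechanically, on data collected for unrelated purposes, and real schemes must also contend with misspellings, moves and namesakes.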

Because the value of data captured and stored electronically becomes greater than that of data held on paper, it is important that both the data and the technology be understood and managed. In addition to technical issues, the interests of other stakeholders, particularly the data subject, must be recognised, including information privacy, data ownership and the economic value of data. Organisations are placing a new emphasis on 'data management' in addition to 'database management'. The three papers in this section are written from completely different perspectives, but each deals with implications of the growth in importance of electronic data.

Paul Loring and Catherine De Garis' paper on 'The Changing Face of Data Administration' is written by public-sector data administration professionals. Particular attention is paid to organisational issues, including the inter-relationships within an IT Branch between data management, database management and project management; the relationship to any enterprise modelling activities which may be undertaken at corporate level; and the relationships with system owners.

The paper argues that the potential benefits cannot be realised by technical measures alone, partly because data administration is a change agent and therefore automatically unwelcome; and partly because data administration is by definition a coordinative role, and constrains the freedoms of project and user staff alike. The authors believe that the function will not be successful unless clear definition is provided of the roles and functions of, and the inter-relationships between, all IT and other professionals who are responsible for and who use shared organisational data schemas and databases.

'An Archivist's View of the Long-Term Management of Electronic Data' is presented by Kathryn Dan, an archives professional with the Australian Commonwealth Government. The paper provides background information which is not part of the mainstream of the information systems profession, on such matters as data form, data value, erasability and manipulability, and long-term accessibility.

The system-dependence of electronic data means that an organisation undergoing technological change may not be able to access even relatively recent data unless the preservation or conversion is properly planned. Yet the evaluation, assessment and classification of organisational information is rarely undertaken at the time that data capture is planned. The paper is very timely, because the increasing use of electronic mail and EDI, and the concomitant reduction in the reliance on paper, underline the importance to all organisations of establishing and using standards for the storage of important electronic data.

Marcus Wigan's paper on 'Data Ownership' argues that the legal framework within which data is used has not kept pace with the changes wrought by IT. It discusses the nature of data, and shows that the conventional concept of 'ownership' ill fits the needs of contemporary culture, let alone the emerging information society.

With all data, economic considerations are increasingly important, including the rights to deny access to data, to charge a fee for access to it, and to sell those rights. Difficulties are highlighted in deciding, in respect of any piece of data, in which entity those rights vest. The paper refers to pointer- or web-based schemes, which enable the ownership of these rights to be recorded and automatically recognised.

Further difficulties arise in establishing the basis for determining the price of 'public data' (such as the contents of land information systems and census returns). Reference is made to the increasing prevalence of government agencies exercising rights such as 'crown copyright', and effectively denying public access to public data, by applying the 'user-pays' principle to recover costs.

The paper also discusses the privacy issues arising in the case of data which relates to an individual, including justification for and methods of collection, integrity, and justification for and methods of access and usage. Such issues are of course compounded by communication technologies like EDI and EFTS, and storage technologies like CD-ROM and 'lasercards'.

Together, these papers make clear that the conversion of data from traditional into electronic, machine-readable forms is bringing about enormous change, and that the institutional environment is far from ready to cope with that change. Much more serious efforts are needed to improve understanding of the changes, and so enable meaningful public discussion of, and action concerning, them.


Successful IT systems satisfy user needs, and hence the relationship between users and developers is critical. The quality of systems depends heavily on the clarity of the specifications, the effectiveness of the selection process, and the measurement of user satisfaction. The three papers in this section address these issues.

In 'Is the Information Systems Community Wrong to Ignore Formal Specification Methods?', Paul and Paula Swatman argue that the quality of a system is dependent on the user and the developer having available a means of precisely expressing their mutual understanding of the requirements. They contend that many of the errors in system development could be avoided by using formal specification methods, because of the reduction in ambiguities, contradictions and omissions. Moreover, it is argued, the level of mathematical skill required to be able to effectively use formal specification methods is well within the range of most of the people who would be involved. The paper challenges the conventional notion that formal methods are incompatible with fuzzy user requirements implemented by non-mathematicians.

Michael McCrae's 'The Role of Negotiation Strategies in Software Development Project Selection' examines an important aspect of the decision-making process involved in the selection of software. The choice among alternative IT products is frequently not based on a rational analysis of their ability to meet precisely expressed specifications. Even in formal tendering processes, the relative strengths of the negotiating positions of purchaser and supplier, and the negotiating skills of the various players, have major effects on outcomes. Although, as the author acknowledges, the negotiation model presented is a limited one, it reminds us that negotiation is a process of repeated offers and counter-offers by and to people and organisations.

In the final paper, 'User Satisfaction Surveys - A New Zealand Study', Pak Yoong considers the extent to which IT Managers are undertaking surveys of their users' satisfaction with the services provided to them. As the user community becomes more dispersed and diverse, obtaining feedback about existing systems, and thereby designing better quality goods and services, is increasingly difficult, even in organisations of only moderate size. Preliminary analysis of the two-stage sample survey suggests that a significant number of enterprises are already using the technique, and that, as in a similar survey in Canada, the respondent IT managers appeared to have strong customer-orientation.

The three papers in this section address important aspects of the relationship between users and IT professionals. Each of them accepts that perceptions and politics are integral to the concept of IT-in-use; but they each rise to the challenge of developing rational approaches to their chosen topic.


One particularly vital component of information technology is the means whereby new applications are developed. There has been a long series of improvements in the reliability and productivity of software development, and in the extent to which end-users can participate in the process. This section contains two papers dealing with the current fashion - Computer Aided Software Engineering (CASE) products.

CASE embodies two important advances. One is that it addresses not only the later phases (physical design and construction), but also the earlier, more fundamental and less structured phases (analysis and logical design). The other is that it directly addresses a major problem for development professionals - the loss of information during the translation of users' requirements, through the logical and physical design specifications, to their expression in compilable code. CASE products embody a significant improvement in the degree of integration among the techniques, tools and products of the successive phases, and some products currently enable the generation of code directly from logical design specifications.

The first paper, 'Selection Criteria for CASE Tools', by Geoff Beckworth, documents the author's experiences in evaluating products for use in a computer science department. The needs of a teaching-and-research institution are significantly different from those of a business, but one of the messages of the paper is that because the products vary so much in their philosophy and features, a clear appreciation of the organisation's needs is a critical prerequisite to an effective selection process. Other valuable contributions are the general background provided to CASE products, the discussion of the evaluation process, the features list drawn from the literature and the department's own requirements analysis, and a table providing a features comparison of the twenty products considered.

If managers are to make rational decisions about resource allocation, then the process of software development, and the products resulting from it, need to be measured. 'Software Metrics for the Management of CASE-Based Development', by June Verner and D. Ross Jeffery, applies established measurement concepts to environments in which CASE tools are used.

Background is provided to metrics for conventional (pre-CASE) software development. It is argued that CASE products make it (at least in principle) much simpler and cheaper to collect data about the size and complexity of task, the effort and elapsed time invested, and the product quality. On the other hand, suppliers of CASE products need to be convinced of the need to incorporate measurement features in their products, and managers need to be convinced of the benefits of applying them. The paper provides a valuable discussion of the particular metrics applicable to CASE-based development, and an assessment of progress in the field.


Increasing dependency on IT results in increasing vulnerability. What happens when a system goes down due to internal malfunction or external accident? What fallback arrangements are in place, and what inadequacies and additional risks do they entail? How long can the stakeholders survive an outage? What is the contingency plan if the system is accidentally or deliberately infected by a virus or worm, particularly one with side-effects? Do we understand enough about the possibility of failures due to amendments to computer programs, and interfacing to new systems, to ensure that the system still does what we expect it to do? Do we monitor our systems for faults, performance degradation and failures? Do we investigate and deal with the problems that we detect? In respect of most systems, the answers to these questions would give cause for concern, and yet we continue to increase our IT dependency. And we are increasingly entrusting very complex and critical industrial, transportation, military, economic and health procedures to risk-prone interlinked IT systems.

We need to recognise the limitations of IT, and the vulnerabilities that it creates, and we need to do so in advance of its introduction. In many countries, because of the potential risks to the environment, proposals for many classes of new industrial, commercial and even residential developments are subject to environmental impact studies prior to governments committing to them: it has been recognised that the level of vulnerability should be assessed before approval to proceed is given.

In a similar manner, the development and potential application of major new information technologies should be subject to prior impact studies. Conditions of use could then be considered in advance, and communities could be proactive in precluding harmful applications, and imposing appropriate controls on beneficial uses. Currently, far too few IT proposals are subjected even to rigorous cost/benefit analysis, let alone risk assessment. The time has come, at least in respect of proposals with major implications, for IT impact studies, similar to environmental impact studies, to be a pre-condition to public discussion.

A framework for the discussion of vulnerability is set by Simon Davies' paper on 'The Elastic Limits of Vulnerability: The End of the Good Times for the IT Industry'. The paper argues that the level of negative impact of IT will only be tolerated until public reaction becomes strong enough to influence the course of developments. In some countries, it is claimed, these limits are fast approaching.

The need is identified for the IT industry to regulate itself effectively now, or before long submit to regulation imposed from outside. But to keep parliaments at bay, such codes must have teeth, and must actually bite organisations and individuals who fail to comply. From the tone of the paper, it is clear that the author has serious doubts about whether the industry is mature enough for such self-regulation to be viable.

The remaining papers in this section deal with particular aspects of vulnerability arising from the use of IT. Klaus Brunnstein addresses the issue of 'Service Continuity Planning', from the perspective of the organisation responsible for IT-based systems, rather than from the viewpoint of the systems' clients or other external stakeholders. The paper describes the types, origins and extent of risk of computer outages, and provides factual evidence of the enormous costs to organisations of IT failures. At the end of his conference presentation, Klaus highlighted organisational vulnerabilities by an adaptation of Murphy's Law: "an IT outage will occur if it is possible, not predictable and in circumstances bound to produce maximum damage!".

Because of the level of risk inherent in complex and important systems, it is increasingly necessary for organisations to undertake service continuity planning, to ensure graceful degradation, interim fallback, medium-term fallback, and re-installation and recovery. The paper clearly lays out the alternative classes of fallback service, and the circumstances for which they are appropriate, and provides a survey of existing backup services throughout Europe and North America.

James Rule and Peter Brantley contribute 'The Computerized Supervisor - A New Meaning to 'Personal Computing''. Based on a survey of 184 enterprises in New York, they found that although the use of computers to monitor work and the workforce is more likely to occur in large organisations, it is widespread, and it applies to a wide variety of people, including professionals such as veterinary surgeons and psychotherapists. In many cases, the use of data for monitoring employees resulted from the availability of data originally collected for other purposes. For example, job tracking includes data about the time taken to complete jobs, and hence about the performance of organisational units and of individual workers.

Rule and Brantley see no natural limit to workplace surveillance. Their study has provided solid evidence that the alarmist literature on the subject is not without foundation. Urgent attention must be paid to the establishment of a balance between employer and employee rights.

In her paper, 'General Systems Theory Can Bridge the Gaps of Knowledge Among IT-Security Specialists', Louise Yngström reviews a course offered since 1985 at her institution in Stockholm. The course addresses the need for managers with distinctly different educational backgrounds, and distinctly different motivations, to share a common philosophy and language when addressing IT security issues. It also confronts the inherent conflict between the functionality and security of information systems - security in the sense of 'closed doors' is inappropriate to systems which cannot fulfil their purpose unless they are relatively open.

The course uses General Systems Theory (GST), and particularly its current incarnation as General Living Systems Theory, as its unifying frame of reference. The paper provides a tightly written background to the primary literature in GST, demonstrating its relevance to the security function.

In Australia, the term 'data matching' is used to refer to what in the United States would be called 'computer matching' and 'front-end verification'. Following the long-delayed passage of the Privacy Act in 1988, data matching programs are now monitored by the Australian Privacy Commissioner, and the extent to which data matching is already used is finally becoming apparent to members of the public.

Paul Kelly's paper on 'Australian Federal Privacy Laws and the Role of the Privacy Commissioner in Monitoring the Data Matching Program' describes the regulatory regime being applied to one major scheme approved recently by the Commonwealth Parliament. The Department of Social Security matches data from the four benefit-paying agencies with data provided by the Australian Taxation Office, supplemented by reference to the electoral rolls and the Health Insurance Commission (Medicare) enrolment data. The scheme involves routinised surveillance of data concerning about 70% of the country's population, drawn from all of the Australian Government's primary client-oriented agencies.

The paper refers to some of the social and legal issues arising in relation to data matching, including the need for citizens to prove the data is wrong and hence 'prosecute their innocence'; and the exchange among government agencies of information collected for different purposes, at the risk of misinterpretation. Its primary focus is, however, the way in which the Privacy Commissioner is addressing the vulnerability of the large number of people affected by the scheme.
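The core mechanics of such a scheme can be pictured as a join between agency files followed by discrepancy detection. The following is a minimal illustrative sketch only, not the procedure used by the Department or the Commissioner; all field names and the tolerance threshold are hypothetical, and real programs match on combinations of partial identifiers such as name, address and date of birth rather than a single clean key.

```python
# Illustrative sketch of a data-matching run: records from two
# agencies are joined on a shared (hypothetical) identifier, and
# income discrepancies beyond a tolerance are flagged for review.

def match_records(benefit_records, tax_records, tolerance=1000):
    """Flag clients whose declared income differs between agencies."""
    tax_by_id = {r["client_id"]: r for r in tax_records}
    discrepancies = []
    for b in benefit_records:
        t = tax_by_id.get(b["client_id"])
        if t is None:
            continue  # no counterpart record; nothing to compare
        if abs(b["declared_income"] - t["assessed_income"]) > tolerance:
            discrepancies.append(
                (b["client_id"], b["declared_income"], t["assessed_income"]))
    return discrepancies

benefits = [{"client_id": 1, "declared_income": 5000},
            {"client_id": 2, "declared_income": 12000}]
tax = [{"client_id": 1, "assessed_income": 5200},
       {"client_id": 2, "assessed_income": 30000}]
print(match_records(benefits, tax))  # only client 2 exceeds the tolerance
```

Even this toy version makes one of the paper's concerns concrete: a flagged discrepancy is merely a difference between two records collected for different purposes, yet the onus then falls on the citizen to explain it.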

Together, these papers provide a further development in the important, ongoing analysis of the nature and dimensions of the vulnerability of contemporary society to information technology. There is evidence of a trend toward diminishing and managing vulnerability, rather than just describing it. And none too soon, because the point at which people react against the threats inherent in IT may be approaching very quickly. It is increasingly urgent for the purveyors of IT to recognise its real and perceived implications and treat those implications as part of the process of proposal assessment, system development and system use. If IT professionals and business managers do not seek out appropriate balances, they risk rejection by the populace, and regulation by unsympathetic parliaments.


The role that information technology is playing, or should play, in the redefinition of the concept of 'city' was of particular interest in the city in which the SOST'91 Conference was held. This is because a new technopolis (referred to as the 'Multi-Function Polis') is under development twenty kilometres to Adelaide's north. This short paper reports on a Panel Session held on the topic of 'New Cities in the Information Age', in which several people were invited to each make five provocative statements as a basis for open discussion.

From the Chair, Julie Cameron provided a framework for the session. She claimed that the concepts of 'new cities' and 'information age' are not well defined or well understood. The concept of a new city can range from a city created from nothing, to an existing city which has incorporated new uses for information technology into its existing infrastructure. The impact of new IT on an existing city is likely to be gradual. But a new city could be planned so as to fully incorporate all new technologies, including IT, into its infrastructure. Life in this kind of new city would be radically different from urban life today.

The concept of an information age can be defined as a time when all the capabilities of IT technology are commonly used in everyday life. On the basis of our current conceptions of IT, this would mean:

Like any other technology, IT can be used to assist or harm. A car is an invaluable means of transport which has revolutionised our lifestyle. It makes travel over long distances possible and gives independence unthinkable a mere century ago. When improperly used, however, it kills. So society has confined the use of the car to roads and controls its use by enforcing road laws. A large number of deaths still occur as a result of its use, but society has decided that the car's value and usefulness outweigh the deaths that its use causes.

IT has not yet reached even the phase of evolution in which cars were preceded by men carrying red flags. We are unable to predict all of the ways in which IT will be used; we have only a primitive roadway system, and the rules are very new and highly inadequate. In the case of IT, the 'deaths' are intellectual and slow, as in the loss of privacy. The challenge for the IT industry and the community is to foster the benefits while identifying the sources of harm and minimising the 'deaths'.

Klaus Brunnstein argued that "the old concept of 'cities' will get new facets, but radical changes are not likely". He noted, however, that where information and communication technologies (ICT) are applied radically, "ICT's shortcomings would severely damage city organisation and services - with the proliferation and spreading of ICT, growing risks and growing social impact are inevitable".

"As cities are essentially distributed structures of individuals, groups and larger organisations, distributed ICT (including individual workstations, leisure stations and information stations) are better adapted than centralised concepts".

Individuals' roles in, and time distribution between, home, work and leisure will change, although probably in different ways for different people. "Apart from technofans, most people will only slowly adopt new ICT services, as their maturity is delayed by problems with hardware, software and networks".

Jim Rule argued that "information will be increasingly recognised as a commodity, and conflicts will arise over rights to this commodity". Who owns information about people? The person concerned? The organisation which collects it? The government agency which regulates its use? 'Ownership' will prove difficult to apply, and a new conception of rights with respect to information will slowly emerge.

During discussion, it was underlined that the design of many systems fails to reflect the interests of all of the various classes of stakeholder. In particular, systems continue to be designed for the 'users', generally those within companies or government agencies. Even such extra-organisational systems as ATM and EFT/POS networks fail to adequately reflect the wishes and needs of consumers. And 'usees', the people who are external to but affected by systems, are all but ignored during the development stage.

Roger Clarke felt that IT enables the notion of a 'distributed city', in which the limiting technologies are not those of information but of logistics. He doubted the wisdom of conceiving of a technopolis as a single-site city.

"New minorities of disenfranchised, IT-poor are emerging, including those who are not connected, those who have disconnected themselves, those who have been disconnected and those who are temporarily out of contact due to malfunction, natural disasters and man-made disturbances".

Like Brunnstein, he anticipated that vital IT-based services would be distrusted, because they are fragile. But they are also unpredictable, "because so many decision criteria are hidden in the machine, and because they are living systems, subject to the vagaries of human mediation and interpretation, and continual tinkering. As a result, 'high-tech' will lose its present gloss, and gain pejorative overtones, as in 'high-dependence' and 'high-risk'".

Like Rule, he expressed concern about the management of personal data, suggesting that "there are limits to public tolerance of privacy-invasive applications of IT".

Finally, he argued that, contrary to conventional views, IT begets not order but anarchy. It does this by shortening the cycles of institutional renewal and social revolution:

Simon Davies saw few positive benefits to counter the forthcoming negative impacts, and his views verged on the apocalyptic: "IT has changed from an adventurous electronic frontier to a lawless cowboy territory where self-interest and greed are the predominant guiding forces. Within 10 years, the general community will liken the IT industry to the armaments industry. Open systems will all but destroy the independence and sovereignty of developing nations. ... Thousands of kids are leaving the real world to live on the net at the expense of their health and wellbeing. ... Techno-terrorism, together with the counter-measures taken to address it, will become the biggest single threat to civil rights in the history of humankind".

It is difficult to draw concrete conclusions from such a wide-ranging discussion. What is apparent, however, is that enormous challenges exist for the management of information technology's impact, not only within organisations, but also at the level of cities and beyond.


Last Amended: 15 October 1995

These community service pages are a joint offering of the Australian National University (which provides the infrastructure), and Roger Clarke (who provides the content).
The Australian National University
Visiting Fellow, Faculty of
Engineering and Information Technology,
Information Sciences Building Room 211
Xamax Consultancy Pty Ltd, ACN: 002 360 456
78 Sidaway St
Chapman ACT 2611 AUSTRALIA
Tel: +61 6 288 6916 Fax: +61 6 288 1472