
A Career Retrospective

First-Cut Emergent Draft of 13 May 2024
with Theme 1 added, 6 February 2025

Notes quickly thrown together prior to an interview by ACS's Media and Communications Manager,
Paul Wallbank, as part of a series on ancient Fellows of the ACS

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2024-25

Available under an AEShareNet Free
for Education licence or a Creative Commons 'Some
Rights Reserved' licence.

This document is at http://rogerclarke.com/SOS/ACR.html


Introduction

The intention in this document is to recapitulate one person's ICT-related experiences across more than 50 years of information systems professional, consultancy and research work. The motivation is to provide some insight into the rapid rate of change in ICT, and its impacts and implications, and thereby to underline the need for all IS professionals to continually monitor technological developments, and to review, refine and replace the assumptions they make in their applications of technology.

Some personal context is necessary to set the scene for the outlines of the technologies I was using in each period, 1967 to 2024. Inevitably, these personal aspects will intrude at least as much as they provide relevant background; so it will be necessary to prune some of that back once the first draft is complete. An Appendix is included, offering links to reviews I've published over the years on a wide range of information and communication technologies.

Before plunging into some detail, here's a very brief summary of my career:


Phase 1. Preparatory and Professional Life – 1967-83 (First 17 Years)

I was born in Portsmouth, but brought out to Kingaroy at 18 months of age by my 10-quid-migrant parents. With my mother's parents ailing, we returned to the UK 2 years later, and I did my first year of primary school back in Portsmouth. My father negotiated a second escape from a then-still-very-dismal UK, a second 10-quid passage, to provincial Bundaberg.

After matriculation in 1966 ('Wyndham Year' in NSW, when a year was added on to secondary school), the just-17-year-old me came down to Sydney from Bundaberg High School to take up a cadetship in the ASX20 sugar company, CSR. I left after a year, to switch from part-time Chem Eng to part-time Commerce. I spent 2 years in 1968-69 in a company secretary's office in a financial institution in the Sydney CBD. That was followed by 1-1/2 years in 1970-71 as an assistant accountant at Dow Chemical and then Angus & Robertson Book Publishers. That early experience exposed me to multiple systems, variously manual and accounting-machine supported, but all very much data and information systems, in such areas as debenture (secured-loan) and treasury (cash-flow) management, plus royalties, inventory and cost-accounting sub-systems.

The first program I wrote was in a Maths I class at UNSW in mid-1967. At that stage, Fortran had no version-number, and we were required to use a (very poorly selected and structured) sub-set called IITRAN. I'd been looking to get into systems analysis in a computing context, and joined the then industrial conglomerate Wormald in that role in mid-1971. I was given my chance by the Data Processing Manager, Neville Clissold, who'd done the Programmer-in-Training (PIT) scheme some years earlier. The PIT scheme was a federal government initiative that was necessary in the years prior to Colleges of Advanced Education and Universities developing computing and information systems courses. It ran mostly in Canberra and Melbourne, in Health, Defence, the Commonwealth Bureau of Census and Statistics (CBCS, later ABS) and the Postmaster-General's Department (PMG).

Wormald had a GE405, with 32K of 6-bit words of memory, 5 tape-drives, a punched-card reader and a line-printer. (My first Mac, part of the second shipment in April 1984, was a great deal more powerful). The GE405 already ran a batch debtors system for over a dozen subsidiaries, and a creditors system for half-a-dozen. During my 2-1/2 years there, I developed several flavours of inventory system: for the Steelbilt filing-cabinet factory, for the electronic security division, and piping-and-fittings asset management for the sprinkler division and for the Gladstone power station construction. The GE ran somewhat faster after the company paid a lot of money for a 'golden screwdriver' job to 'install' an additional 32K 6-bit words. It's amazing what can be done with 64K. It's also appalling to 1970s computing professionals how undisciplined and profligate coders have since become.

I also wrote a payroll system that handled weekly and fortnightly pays, including deductions, but intentionally leaving some of the more esoteric and small-scale corners (such as 'dirt money' for some categories of plumbing work) to be handled manually, and similarly leaving all HR aspects well alone. All of those systems were in COBOL (the USASI 68 version).

We also used Honeywell Time-Sharing services on ASR33 teletypes connected to servers placed strategically in disparate time-zones, in New York, Amsterdam and Tokyo, communicating at 110bps (or 0.0001Mbps, on the scale applicable in 2024). One application I recall was a quality-assurance routine to ensure that sprinkler piping designs delivered at least the minimum water pressure requirement to any target zone in the network. Those projects were done in Fortran II.
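
The Fortran II source is long gone, so the following sketch is purely illustrative of the kind of check involved: given a network of pipe segments, each with an assumed pressure loss, walk outwards from the supply and flag any zone that falls below the minimum. The topology, loss figures and the threshold are all invented for the example, and real hydraulic calculation is considerably more involved than subtracting static losses.

    # Illustrative only: a toy re-imagining of the pressure-check idea, in Python.
    # The pipe network, the loss figures and the 350 kPa threshold are all invented.

    MIN_PRESSURE_KPA = 350.0          # hypothetical minimum requirement at any zone
    SOURCE_PRESSURE_KPA = 700.0       # hypothetical pressure at the supply main

    # Each segment: (upstream node, downstream node, pressure loss across the segment)
    SEGMENTS = [
        ("main",   "riser",  120.0),
        ("riser",  "zone-A",  90.0),
        ("riser",  "zone-B", 180.0),
        ("zone-B", "zone-C",  95.0),
    ]

    def delivered_pressures(source_kpa, segments):
        """Walk outwards from the source, accumulating losses along each branch."""
        pressure = {"main": source_kpa}
        remaining = list(segments)
        while remaining:
            progressed = False
            for seg in list(remaining):
                up, down, loss = seg
                if up in pressure:
                    pressure[down] = pressure[up] - loss
                    remaining.remove(seg)
                    progressed = True
            if not progressed:
                raise ValueError(f"Segments disconnected from the source: {remaining}")
        return pressure

    if __name__ == "__main__":
        for node, kpa in delivered_pressures(SOURCE_PRESSURE_KPA, SEGMENTS).items():
            status = "OK" if kpa >= MIN_PRESSURE_KPA else "BELOW MINIMUM"
            print(f"{node:8s} {kpa:6.1f} kPa  {status}")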

That was a hectic period for me, because I was still running a transport platoon in the then CMF (now Army Reserve), and when I retired from that in early 1972, aged 22, I did an Honours year full-time while doing 50-hour weeks finishing the design of a couple of systems, supervising a programmer, writing a lot of code myself, and running systems testing and implementation.

In late 1973, I joined a rather calmer data processing outfit, a container shipping company called ACTA, back in the Sydney CBD. This was my first experience designing for direct access to records on disk, rather than serial access to records on tape. My primary responsibility was a large inventory system for freight containers (20x8x8s and all the many other variants). It managed a long-lived, mobile and world-wide inventory of individual assets, rather than ephemeral stock quantities of commodities. (Later, during my consultancy career, I had further experience of highly varied categories of inventory. One was a system to support abattoirs and meat-packing for export. It took an hour or two for the penny to drop that the bill-of-materials logic works in the opposite direction to the conventional explosion of a desired finished good into the component parts needed to assemble it: a carcass is progressively broken down into many saleable products, rather than many components being assembled into one product).

During the 2-1/2 years with ACTA, I did two other quite different projects. One was an evaluation of alternatives to support an online system in the company's container packing and unpacking depot, including the conceptual and logical design of the system. A Data General (DG) minicomputer won the contract. (By 1975 the dominance of mainframe architecture was under serious challenge, particularly for small- and mid-sized systems with a need for real-time or online services, rather than just batch-processing).

During that time, by then aged 25, I also developed my first program-generator. The Honeywell 2000 at ACTA had only a limited language-set available, so I used COBOL to write a program to generate COBOL programs. COBOL lacked string-handling primitives, so I had to write a set of compiled-in copy routines (because subprograms were too inefficient on the H2000 configuration) in order to dismember and assemble text character-by-character. It worked quite well, although the payback only came a few years later, when I applied it in a software house with a small team writing many programs very quickly, greatly reducing both coding and testing time, because all of the main processing had been pre-coded and pre-tested.
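
That generator is also long gone, and it was of course written in COBOL rather than in a modern scripting language. Purely as a sketch of the parameter-driven idea, the fragment below assembles a COBOL program skeleton from a short parameter list; the template, program name and field list are hypothetical, and the real tool emitted complete, pre-tested programs from far richer inputs.

    # A minimal sketch of parameter-driven program generation, in Python rather
    # than COBOL. The template and the parameters are hypothetical.

    COBOL_TEMPLATE = """\
           IDENTIFICATION DIVISION.
           PROGRAM-ID. {program_id}.
           DATA DIVISION.
           WORKING-STORAGE SECTION.
    {fields}
           PROCEDURE DIVISION.
           MAIN-PARA.
               DISPLAY "{program_id} GENERATED SKELETON".
               STOP RUN.
    """

    def cobol_field(name, picture, level="01"):
        """Render one WORKING-STORAGE entry, column-aligned in the COBOL style."""
        return f"       {level}  {name.upper():<20} PIC {picture}."

    def generate_program(program_id, fields):
        """Assemble a COBOL source skeleton from a parameter list."""
        rendered = "\n".join(cobol_field(n, pic) for n, pic in fields)
        return COBOL_TEMPLATE.format(program_id=program_id.upper(), fields=rendered)

    if __name__ == "__main__":
        print(generate_program("stocklist", [("item-code", "X(8)"), ("qty-on-hand", "9(6)")]))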

I also did a Masters thesis in the area of structured design and development in a COBOL environment, demonstrating that only a small number of limitations had to be placed on coders' freedoms in order to achieve cleanly-structured source-code that satisfied Edsger Dijkstra's specifications for structured, GOTO-less programming. COBOL had previously been regarded as inadequately-structured and verbose. With my two innovations of structured coding rules and parameter-driven program generation for all mainstream functions, even USASI-68 COBOL, and particularly ANSI-74, became a lean and mean development environment – always provided that the project team was trained, disciplined and managed.

As was quite normal at the time, I also moonlighted for several clients, initially in conjunction with my boss at Wormalds, and later freelance. About the most interesting project was a contract with Reckitt & Colman, via a mate in their financial planning area. Financial modelling tools had long existed, primarily accessible via time-sharing services, but they were expensive and cumbersome. In 1974, I developed a forerunner of the Visicalc spreadsheet, which burst onto the scene in 1979, drove sales of the Apple II, and was a spur for IBM to establish its PC Division.

My program was a tool for project cash-flow / net-present-value calculations to support the evaluation of project proposals. On the basis of a set of inputs, a pre-compiled COBOL program printed a spreadsheet showing the project's financial implications. Given the 20-20 hindsight provided by the runaway success of Visicalc, I should have invested a little more time in generalising the tool, and re-developing it in a more suitable language and in an interactive environment on a minicomputer such as a DEC PDP or DG. Micros were still only available in cut-down and mostly kit form, and without an OS or a development environment. (It was quite some years before I was aware of the Xerox Alto, manufactured on a small scale from March 1973. But it took another 8 years until the Xerox Star, 2 more to the Apple Lisa, and 1 more after that to the Apple Macintosh). The breakthrough Altair 8800 was released in the US only in January 1975. In innovation, timing is everything. Dan Bricklin is nearly 2 years younger than me, but a great deal richer.
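
For readers unfamiliar with the underlying arithmetic, the core of such a tool is a discounted cash-flow calculation. The sketch below is a stylised re-creation only: the cash-flows and discount rate are invented, and the 1974 COBOL program printed a far fuller year-by-year spreadsheet rather than a single figure.

    # A stylised re-creation of the project-evaluation arithmetic. The inputs
    # are invented; the original tool printed a full spreadsheet from them.

    DISCOUNT_RATE = 0.10                                      # hypothetical cost of capital
    CASH_FLOWS = [-50_000, 12_000, 18_000, 22_000, 25_000]    # year-0 outlay, then inflows

    def npv(rate, flows):
        """Net present value: each year's cash flow discounted back to year 0."""
        return sum(flow / (1 + rate) ** year for year, flow in enumerate(flows))

    if __name__ == "__main__":
        print(f"{'Year':>4} {'Cash flow':>12} {'Discounted':>12}")
        for year, flow in enumerate(CASH_FLOWS):
            discounted = flow / (1 + DISCOUNT_RATE) ** year
            print(f"{year:>4} {flow:>12,.0f} {discounted:>12,.0f}")
        print(f"NPV at {DISCOUNT_RATE:.0%}: {npv(DISCOUNT_RATE, CASH_FLOWS):,.0f}")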

From 1971 to 1976, I was active with the then Australian Institute of Systems Analysts (AISA), which had Sydney and Melbourne groups. I ran monthly events across Sydney for some years, across the wide range of analysis, design and project management techniques, extending into the software development domain and the hardware field. At this stage, the older 'system monitors' that performed limited housekeeping functions to enable application software to run were giving way to much more substantial systems software that has matured into what operating systems (OS) are today. Mainframe companies ran their own proprietary and monolithic beasts of operating systems, but from 1969 onwards, Unix had laid the foundations for smaller, much more cleanly structured and non-proprietary operating systems. But I didn't get to use such OSs until 1979 onwards, on Kienzle, Prime and DEC VAX minis.

In mid-1974, I joined the Australian Computer Society (ACS), by then already qualified as a full (if still somewhat junior, 24yo) professional member. ACS had been established almost a decade earlier, through federation of pre-existing associations in many of the State capitals plus Canberra. It was large and active, with as many professional members as it has today, and already had nearly 100 Fellows. It ran events across the full gamut of what we still called 'the computer industry'. The marriage with telecommunications, and even with local communications much more sophisticated than large numbers of device-to-device cables, was still emergent.

Apart from involvement in the running of events and the very busy and popular annual conference at Terrigal, my main contribution to ACS during this period was to establish and run an ACS NSW Branch SIG on Privacy. I'd become involved in privacy issues in 1973, simply because other students in a unit at UNSW had prattled on about computers as though they were Armageddon personified. I took time in the class to debunk the nonsense they were talking; but there were a few points that I couldn't debunk, and they formed the kernel of a vast amount of work I've done, much of it in spare moments, for the more than 50 years since then.

The SIG's initial work was to document the real threats to the interests of individuals that computing and emergent data communications gave rise to, and distinguish them from the enormous range of imagined threats that doom-sayers were coming up with. Armed with that evidence, the next task was to provide input to the NSW Attorney-General, John Maddison, urging that his planned data protection legislation be suitably targeted, and no more harmful to the then-fledgling computer industry than was actually warranted by real threats. The groundbreaking Morison Report of 1973, together with those lobbying efforts, resulted in a statute in 1975 that created a Committee with powers to research, investigate complaints, and publish guidance to business and government. So successful was the lobbying that it was to be another 13 years before any substantive regulation emerged (and even then it was only grudgingly enacted in the aftermath of the Australia Card fiasco).

In early 1976, having finished a Masters degree, and by then lecturing, always part-time, I was contemplating two possibilities. One was to enrol in a part-time PhD and write a thesis on 'Generalised Activity Networking Applied to Application Software Development Projects'. The other was to travel overseas, and work for a while in the UK first, then probably Germany (because I'd done pretty well with the language in 4 years of it at school). What actually happened was Option 2, but delayed by 18 months. One reason for the delay was to see what part of the world I'd need to be in to catch up with my fast-moving future wife. The other was that Bill Orme, the Executive Officer of the NSW Privacy Committee, talked me into lead-authoring one of the world's earliest 'Guidelines for the Operation of Personal Data Systems'.

In mid-1977, my now-wife and I spent 3 months crossing North America, landing in London at the end of summer. I took a contract with the London Stock Exchange (LSX) as a 'design controller'. The sub-text beneath that euphemism was that the designers of the front end of a vast new batch system had written a highly-ambiguous functional description. Coding and test-design were already commencing, a chaos of incompatible interpretations was about to ensue, and specifications for the front-end modules had to be rapidly written, reviewed and negotiated, in flight, generating a new, comprehensive and clean design, but causing no more re-work down the production-chain than was absolutely necessary. Without the fairly substantial diversity of projects for which I'd done analysis, design, coding, testing and project management, I'd have been floundering. As it was, I just had to work hard, efficiently and long, and stay calm and focussed. People in 'The City' were good at the calm and focussed bit, but depended on Australians for the rest.

The system, called TALISMAN, was to perform overnight batch processing for all transactions conducted on the LSX that day, and be completed by 5am, to enable the next day's operations to start with a clean database. The first run on normal volumes of live data took 24 hours; but, the system being in PL/I and designed using Jackson methods, it only took a week to analyse the execution-profile, re-write a few components in Assembler, and fit easily within the 10-hour window each night. I had the foresight to leave at the end of 1978, a few months before it went live. But it went in without fuss. Without that, the 'Big Bang' event in 1986, the world-leading switchover from solely floor trading to solely online trading, could not have happened.

On 2 January 1979, I commenced a job not in Germany, but with Brodmann Software Systeme AG on the outskirts of Zürich – an altogether much more attractive location in both geographic and financial terms. (The story of how I managed to get a rare-as-hen's-teeth work permit is worth relating, but not here). I joined a 30-person 'software house' / Softwarebetrieb, as we called them then. We produced bespoke software for clients, but the business was shifting its emphasis from custom-made software to packages, and from mainframes to minis, in the worlds of Prime/Primos, DEC VAX/VMS and Unix variants. Much of the work was on dumb or semi-intelligent terminals directly connected to local servers. The applications were primarily online, but with some batch functions.

For 3-1/2 very enjoyable, if at times intense, years, I worked my German up to a level at which I could emerge from the backroom and run projects again – although many clients were uncomfortable in 'high German' / Schrifttüütsch, and I had to achieve some level of competence in understanding the main half-dozen variants of the 'old German' dialects that are the mainstream across the more economically-developed two-thirds of the country.

The primary language remained COBOL, but effective structured analysis, clear specifications, and efficient coding practices were essential elements of a successful software house. I refined my COBOL program-generator tool-set, and we used that on a number of projects. We specified, and had coded at a nearby educational institute, a clean editor, far more usable and convenient, and supporting better formatting of the main languages, than mainstream products like vi, Emacs and their look-alikes under proprietary OSs. A spin-off article from that work was published in the then Australian Computer Bulletin in February 1982. I was then asked to evaluate a much larger-scale program-generation suite called DELTA, which was produced in the nearby town of Dübendorf, and I then led the project to implement it for Peter Brodmann.

During this period, I also started publishing refereed journal articles on technical topics, including a summary gleaned from my earlier work, in 'A Background to Program-Generators for Commercial Applications'. More notable was one on 'Teleprocessing Monitors [TPMs] and Program Structure'. A TPM was a generic term for a fearfully obscure piece of systems software that each mainframe provider had to invent in order to enable its highly complex, monolithic and strongly batch-oriented OS to be used to support online transactions. My explanation of the functions a TPM performed, and, of necessity, also of the differences between batch and online program structures, won me the 1982 ANCCAC Prize for the year's 'best' (most understandable and/or useful?) article in the then Australian Computer Journal. (It was to be more than three decades before I won a couple more 'best paper' awards).

In the 5 years since I'd left the NSW Privacy Committee in May 1977, I'd remained active in the privacy space, including meetings with the architect of the UK Act, Paul Sieghart. In particular, I remained in correspondence with Justice Michael Kirby throughout his time in 1978-1980 chairing the OECD Committee that produced the original OECD Guidelines. The drafts produced for the Committee by a Swedish consultant were greatly improved upon, resulting in a document agreeable enough to a committee that had to appease strong US interest, and more than a little European interest, in protecting corporations' interests in free trade in personal data, but one that delivered what could be described as 'adequate' data protection laws.

In mid-1982, we returned to Sydney from our 5-year 'working holiday' in Europe, I established a $2 company, and set about building a contracting and consultancy career. Initially, my focus was on software development and maintenance, with an emphasis on program-generators and '4th generation languages', including work for the Australian distributor of DELTA, Rob Charlton, a long-term colleague in ACS and on the soccer-field. I began running commercial seminars to project into the market some of the techniques I'd worked on over the years. Among other clients, I also worked on business development in the software-package space, with Col Hoschke at software house Wachers.

I'd maintained contact with the IS Department at UNSW, and Cyril Brookes invited me to take up a half-time Senior Lectureship for the 1983 calendar year. It was partly to develop case studies, one of which was of the world-leading enterprise data model of the UNSW Library. Other tasks included tutoring a COBOL development course, and designing and lecturing a 30-hour unit on the exploding world of large-scale networks. That was at a time when proprietary solutions such as IBM's SNA and DECNET were on the wane, and the ISO's tightly-defined OSI standards were vying with the IETF's Internet Protocol Suite for intellectual and market dominance. (In 2002-2009, fully 20 years later, it was interesting to reprise some of that, teaching a Masters seminar at the University of Hong Kong on Internet Infrastructure for eCommerce).


Phase 2. From Town to Gown – 1984-1995 (Middle 11 Years)

1984 – IS undergraduate service units within the Accounting major, an IS sub-major

Later Honours, and additional units at undergraduate and graduate level

1984 onwards – ICT infrastructure – Lisa, Mac, Mac Labs, networked Labs, accounting software, technology innovations in teaching (digital cameras, email, Internet email from 1990, Web from 1993, ...)

Managerial responsibilities at various levels, including 1-1/2 years as Head of Department

Ongoing consultancy activities, primarily on ICT strategy and policy matters

1985-87 ? Australia Card campaign, incl. ACS Policy Statements

strong record in research and publications across application software technology and its management; all facets of 'eCommerce'; information infrastructure, culture and pain-points; data privacy and dataveillance

1987 sabbatical at Uni Bern included running a Masters seminar in which students interfaced an IBM PC with a mid-range IBM System/6. This was at that stage a very new development and not yet feasible with mainframes – 10 years after the Apple II, 7 years after the first IBM PC, and 3 years after Apple Macintosh

1990 paper on software architecture

a history of the Internet in Australia (published 2004)

the early Web in Australia (published 2013)

IS Academic Directory

ISWorld

AIS – foundation Councillor

Program Committees for scores of international conferences, including over 30 years with the Bled eConference, Slovenia, 1991-2023

ACS:

IFIP TC8 Conference Editor – 1984

LAN Conferences 1985-87

Canberra Branch Committee Member 1985-90

Chair, national Economic, Legal and Social Implications Committee, 1985-95

Director, Community Affairs Board, 1989-92

Fellow (FACS), since 1986

APF 1987-


Phase 3. Going (Back) to Town – 1995-20.. (Last 30+ Years)

consultancy in eCommerce, eBusiness, ePublishing, electronic service delivery, etc.

expert evidence in a series of leading cases

Victorian Data Protection Advisory Council – Member, 1995-97

Australian Government Public Key Authority – Board-Member, 1999-2000

AEShareNet Limited – Chair, 2000-2006

Visiting Professorships at ANU, since 1995 (in Computer Science), at the University of Hong Kong, 2002-09 (in Engineering), and at UNSW, since 2003 (in Technology and Law)

Visiting Scholar at the European Business School, Rheingau, 2002-03 and at University of Koblenz-Landau 2007, 2013

Cyberspace Law & Policy Centre, UNSW Law, 2007-2012

Allens Hub for Technology, Law and Innovation, UNSW Law, 2018-2024

Fellow of the international Association for Information Systems (FAIS), since 2012

Supervision of multiple PhD students

APF throughout

PI

EFA 2001-05

2010-2020 – ISOC-AU

2019- – ACS


Themes

The majority of the above 'Career Retrospective' naturally adopts a largely chronological sequence. Some aspects of my career (well, anyone's really) are better examined thematically. This section picks out a few abstract aspects to address. (The whole of the web-site is structured on the basis of detailed themes, so there's little point in repeating what's already there).


Theme 1: Quality: Doing IT Well

The early history of what we now call Information Technology (IT) or Information and Communications Technology (ICT) was about 'Computing', and quite literally so. At the beginning of the 'electronic component' era (in the 1930s), 'analogue' computing had a brief life, but 'digital' computing, built over 'binary digital' componentry, quickly came to dominate (from 1940). The applications were also at first solely quantitative, for such purposes as cryptology and ballistics calculations. The first use for administrative purposes came in the UK in 1951 (but you need to ignore single-minded US histories of computing and consult more open-minded sources to find it). My first use of a computer came 16 years later, in 1967 (using a crippled sub-set of very early Fortran to invert matrices), and then not again for a further 4 years, until 1971, when I became an analyst-programmer and project leader on inventory, payroll and similar systems, using COBOL.

Developing software for a bare computer is highly inefficient, and highly error-prone. Assemblers were isomorphic with (had a very close structural correspondence to) the underlying machine language. Escape from those strictures was achieved from the mid-1950s onwards, by conceiving of languages in which solutions to categories of problems could be conveniently expressed, hence 'FORmula TRANslator' (Fortran) and 'COmmon Business Oriented Language' (COBOL). 'Compilers' then generated machine-executable code from those problem-oriented expressions.

(Programming languages, and flavours of them, have proliferated. In addition, approaches have emerged whereby software can be developed in a more abstract manner than by specifying solutions to problems. These include problem declaration (e.g. using program generators), problem-domain description (e.g. rule-based expert systems), and automated analysis of data-heaps that purport to document a problem-domain (e.g. AI/ML using artificial neural networks and generative AI). I described these 'generations' of software development tools in a 1991 publication).

We've built many layers of software since then. Each layer has, intentionally, buffered the 'programmer' further from the raw device. The result is that a vast array of assumptions exists about what the programmer is to do, and is not to do. It's quite rare for any software developer to be sufficiently aware of most of those assumptions. The scope for mis-matches in expectations is vast, and mysterious behaviour of programmed devices is the norm. The first programmer, Ada Lovelace, engraved the principle in 1843: 'Computers don't do what you want them to do. They do what you told them to do'. (One expression she used was "The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform" -- italics in the best available source, from 1949).

At the time I became active in the industry in 1971, quality was seen as crucial. We were all busy developing theory and practice to deliver it. We had our focus on clear thinking about problem analysis, solution design, and coding habits, code inspection and debugging techniques, plus multi-layered test design and implementation (at program, sub-system, user and client-acceptance levels). Professionals across the industry progressively retrofitted quality features backwards up what came to be called 'the waterfall model' of software development, adding in such early-phase features as design walkthroughs, coding for correctness first and run-time efficiency later, test-harnesses, and logging of intermediate results.

By the end of the 1970s, the scope and scale of systems had expanded: what we had previously built whole were becoming mere sub-systems of production support systems (later referred to by the odd name of enterprise resource planning -- ERP) and of financial management information systems (FMIS), which served the needs of multiple business functions at once. Quality control now required additional capabilities, including data dictionaries matured to encompass the programs and manual procedures that created, modified and deleted data, coordination between process structures and data structures, tight specification of change requests, and change-bundling into successive releases to enable change control and configuration management (CCCM). I provided an outline of CCCM in a 1990 conference paper. I extended that with a note on software escrow, in 1994.

The 1980s saw a shift in emphasis to database management systems (DBMS) and, generally in parallel with that, data modelling, to establish a solid framework around which data schema specification and process design could proceed. These brought with them more discipline in relation to the semantics of data-items and records, and the values that were valid within each data-item. However, the ongoing growth in scale and scope risked development teams abandoning the well-established principles of modularisation and drifting into monolithism -- applications that were vast and amorphous, with unplanned and unmanaged inter-dependencies. Isolating the reasons for malperformance is inherently infeasible in systems of that nature.
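
As a small, invented illustration of that discipline (not drawn from any particular system of mine), a data-dictionary entry can pin down both the meaning of a data-item and the values that are valid within it, with validation enforced wherever records are created or modified:

    # Illustrative only: a toy data-dictionary with value constraints, enforced
    # at the point of record creation. Item names, rules and bounds are invented.

    from dataclasses import dataclass

    # Each entry records the data-item's meaning and the values that are valid for it.
    DATA_DICTIONARY = {
        "container-status": {
            "definition": "Operational state of a freight container",
            "valid_values": {"EMPTY", "PACKED", "IN-TRANSIT", "UNDER-REPAIR"},
        },
        "gross-weight-kg": {
            "definition": "Gross weight of container and contents",
            "range": (0, 30_480),   # hypothetical upper bound
        },
    }

    def validate(item, value):
        """Reject any value that falls outside the dictionary's rules for the item."""
        entry = DATA_DICTIONARY[item]
        if "valid_values" in entry and value not in entry["valid_values"]:
            raise ValueError(f"{item}: {value!r} not in {sorted(entry['valid_values'])}")
        if "range" in entry and not (entry["range"][0] <= value <= entry["range"][1]):
            raise ValueError(f"{item}: {value!r} outside range {entry['range']}")
        return value

    @dataclass
    class ContainerRecord:
        status: str
        gross_weight_kg: int
        def __post_init__(self):
            validate("container-status", self.status)
            validate("gross-weight-kg", self.gross_weight_kg)

    if __name__ == "__main__":
        print(ContainerRecord("PACKED", 18_500))    # accepted
        # ContainerRecord("LOST", 18_500)           # would raise ValueError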

I published a short review of electronic interaction technologies as they stood in 1987, in a 2013 paper, and a longer review of IT infrastructure as it stood in 1985, in a 2018 paper. (If you're interested in a longer-term view of IT in Australia, some background is in early sections of a 2008 paper).

By around 1990, a sound framework existed for reliable software development at scale. It comprised three dimensions -- data, process and state-transition triggers -- with mature modelling techniques for each dimension, including reliable and auditable cross-references among them. The approach was referred to as 'structured analysis and design'.

As with any well-engineered approach, the emphasis on quality came at an up-front cost. It had always been challenging to sell management on investing in early-phase quality features in order to avoid spiralling costs later in the project (just as investment in security safeguards to avoid explosive costs of breach-handling later is still an all-too-difficult sell). Longstanding warnings did not appeal to managers and executives demanding new features right now.

The project management dictum has always been:

'You can have it Fast, Good or Cheap. Choose two'.

But in the early 1990s, a new approach was being trumpeted by consultants. At its best, it sought to run parallelisable tasks in parallel, in a disciplined manner, to achieve earlier payback. But what so-called Rapid Application Development quickly turned into was RADical coding, with corners cut, quality short-changed, and errors ignored or, where recognised, tolerated rather than addressed. The result was that Fast and Cheap were demanded, and Good was not merely compromised, but abandoned.

Around this time, after 20 years in the software development space and as a Fellow of the ACS, my career was moving on to more strategic and policy aspects of IT and its applications. Since the mid-1990s, I've watched aghast as myriad excuses for unprofessional indiscipline have been popularised and glorified. Three examples that capture the then zeitgeist were 'Move fast and break things', 'permanent beta' and 'minimum viable product (MVP)' -- the last conveniently linked in the mind with the positive US expression 'most valuable player'. Along with those RADical ideas has come deep compromise, and even abandonment, of readability and maintainability of code, and of supplier responsibility to provide and to update documentation.

There is now no expectation that providers of software, and of services reliant on software, will deliver and maintain information that enables errors to be detected, understood and addressed. Users are expected to invest their own time and money, and swap notes with one another via a wide range of intermediary posting sites (so-called 'crowdsourced documentation'), in the hope that somewhere a useful crumb might be found.

Moreover, there is widespread acceptance by users that software and services will frequently malfunction or break down, that support services will be difficult to access, that first-level responders will have a limited set of pre-written answers and will be discouraged from referring issues to second-level support, and that many problems simply fester and are never addressed. An earlier, 2008 rendition of some of this history is in this section of a historical review of the IS discipline in Australia.

The RAD mind-set, since re-badged 'Agile', continues to be seriously harmful to user organisations, employees and others who have to deal with systems, and consumers and citizens reliant on remote services that evidence continual malfunctions.

In short, quality has fled the majority of mainstream software, and is reserved for systems whose malfunctions are seriously impactful, will be recognised and investigated, and for which responsibility will be assigned for the consequences, with large-scale financial and reputational harm to the organisation deemed to have failed in its quality obligations.

Occasionally, the RADical mind-set leaks into safety-critical systems development as well, costing lives first, and hence untold billions in losses to the organisations that perpetrate the failures and to those that insure them. A related nightmare scenario is leakage of the RADical mind-set into the layers of software that underpin safety-critical systems. Endeavours by software-library providers to reach into broader markets with lower-cost product-variants threaten to compromise the quality of those libraries, inevitably resulting in unauditable incidents, and unsolvable crashes, of software and equipment alike.

Up to this point, I've limited the discussion to IT technical-quality factors. By about 2000, substantial expansion in the scale and scope of administrative systems had occurred. In the 1950s and 1960s, almost all IT applications were operated entirely within a single organisation. During the 1970s to 1990s, communications capabilities were linked with computing, enabling inter-organisational systems to be developed, become more sophisticated, and support efficient interactions. As early as 1992, I documented the extension of systems beyond the boundaries of organisations to reach individuals. The early examples of extra-organisational systems included ATMs, then EFT/POS terminals. Over time, personal computers, portables, tablets, then mobile phones, became means for people to interact with organisations' computers -- and for organisations to off-load some of the effort and the infrastructure costs to unpaid customers.

Human participants in systems are conventionally termed 'users'. However, many individuals who are not participants are affected by large-scale systems. It is indicative of a blind spot in IS theory and practice that no term for this category of people has reached the mainstream. A candidate term of long standing (since the early 1990s) is 'usees', implying people whose interests are impacted, and who are in effect used by the system.

As organisations, their employees, external users, and usees, all became increasingly intertwined and interdependent, purely technical perspectives on the quality notion became increasingly untenable. All systems have both technical and social aspects, and designs must achieve quality of technological use, not just of technological application. During the early decades of direct interactions between people and devices, attention to the human-computer interface (HCI) sufficed. The notion had to be quickly expanded, however, to the whole user experience (UX).

But even that falls far short of the need. The notion of socio-technical systems goes back many decades, and it is essential that the principles it espouses be applied. As expressed here, "Socio-technical systems theory and related design methods are built on open systems thinking and acknowledge that systems comprise technical components working in combination with social and/or human elements. Analyses must therefore consider the interactions between the social and technical elements". The socio-technical approach considers technology in combination with such human and organisational elements as adjustments to business processes, training, monitoring of behaviour, and change management techniques to cause users to adjust their behaviour.

In recent years, as expressed in a 2024 paper, "applications of IT have migrated far beyond the conversion of data into information. They now draw inferences from clusters of information, support decision-makers, and even conduct automated decision-making based on information and inferences, support individuals taking actions driven by those decisions, and in some cases [autonomous devices] directly implement decisions. ... IT-based systems are capable of substantial impacts and implications, ranging from the highly beneficial to the very harmful, sometimes foreseen and managed, but in many cases unintended, unanticipated, or foreseen but unaddressed. The growth in scale and scope of IT-based systems has been accompanied by greater potential value to at least some participants, but also by considerable increases in the likelihood and seriousness of harm".

With scale, scope and impact all far higher than in the past, it would be reasonable to expect that quality would again have become a high priority in the design, deployment and adaptation of IT-based systems. But, in contemporary large-scale systems, there's little evidence of quality assurance measures. IT Services executives and staff set themselves up as arbiters of what systems should look like. IS staff avoid the complexities of participative techniques, and build buffers between themselves and remote clients. An example of this in some contexts has been 'engagement' specialists, who actively block communications between user representatives and data analysis and system design staff. Operational systems are nominally accompanied by incident reporting and management systems; but many user reports languish, and barriers exist to 'escalation' of issues to relevant staff-members.

A large literature on IT project and system failures has accumulated, with no sign of improvement. Meanwhile, the frequency, scale and scope of failures appear to keep growing, at least linearly with the frequency, scale and scope of the systems themselves. Failure is commonly measured in terms of negative ROI or financial waste and opportunity cost. Too often overlooked is the harm done to staff-members, both internal users and those in the IT arena. Impacts on external users and usees are commonly even less in focus. Far too few careful examinations are undertaken of the human costs of grossly inadequate quality assurance in IS development and maintenance. As a contribution to that literature, see this case study of the Robodebt scheme, 2014-23, published in 2024, accompanied by a more substantial working paper.

Reflecting the disenchantment I feel about the dismal record of the IS discipline and profession in relation to quality, the third Theme below is concerned with IT's impacts and implications.


Theme 2: Strategic Impacts and Implications: Doing IT Usefully for User Organisations

TEXT


Theme 3: Broader Impacts and Implications: Doing IT Appropriately for Everyone Else

TEXT


Appendix: ICT History

In a variety of papers over an extended period, I've provided information about many aspects of information and communications technologies, over various timeframes and at various points-in-time.

Timelines

Snapshots

Future-Gazing


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professorial Fellow associated with UNSW Law & Justice, and a Visiting Professor in the Research School of Computer Science at the Australian National University.


