Panellist's Statement for ICIS'2002, Barcelona, December 2002
Panel on Information Privacy in a Globally Networked Society: Implications for I.S. Research

Roger Clarke

Principal, Xamax Consultancy Pty Ltd, Canberra

Visiting Professor, Baker & McKenzie Cyberspace Law & Policy Centre, University of N.S.W.

Visiting Fellow, Department of Computer Science, Australian National University

Version of 26 April 2003

© Xamax Consultancy Pty Ltd, 2002-2003

This document is at http://www.anu.edu.au/people/Roger.Clarke/DV/ICIS2002.html

The accompanying PowerPoint slide-set is at http://www.anu.edu.au/people/Roger.Clarke/DV/ICIS2002.ppt


The Panel

The key question that the panel addressed is:

In what ways do information privacy matters challenge IS researchers as they go about their normal business?

Panellist's Statement

My thesis is that, in contexts in which privacy is a significant factor, research quality is extraordinarily difficult to attain. As a consequence, publication will only be achieved when fashion and topicality convince journal referees and editors to accept a paper that falls below their normal expectations.

My argument is based on the following considerations:


Quality Challenges in Attitudinal Research

Attitudinal surveys are capable of producing data whose quality is high when judged against criteria such as their amenability to powerful analytical techniques. But to the extent that quality depends on correspondence of the measures to particular real-world phenomena, the data that most surveys produce are merely fodder for exercises in statistical analysis. As training for new academics, such surveys may be justifiable, but they produce no information relevant to the real world, and should therefore fail a critical test of publishability.

Attitudinal survey design must confront many sources of uncontrollable measurement bias and response bias. The phrasing of questions has major impacts on respondents, and the impacts vary between respondents. The sequence of questions also leads respondents to particular understandings of the meanings of words used. Questions about sensitive topics cause respondents to choose their answers carefully, with a view towards self-protection at least as much as towards honesty. The context that each respondent perceives for the questions is likely to include factors that are extraneous to the designer's intention, that may vary during the course of the data collection, and that may even be unknown to the researcher.

The notion of non-response bias refers to refusals being other than random, which is likely to result in the distribution of sample responses being different from that for the population. Yet many researchers make the implicit assumption that very similar distributions would have been achieved across the responding and the non-responding groups. The non-response bias problem also arises at the level of individual questions.
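
To make the scale of the distortion concrete, the following minimal sketch (in Python, with purely illustrative figures that are not drawn from any actual survey) simulates a population in which refusal is correlated with the attitude being measured, and compares the respondent sample against the population.

    # Illustrative sketch only: how non-random refusal can distort a survey.
    # All figures are hypothetical assumptions, not empirical estimates.
    import random

    random.seed(1)

    POPULATION = 100_000
    TRUE_CONCERN_RATE = 0.50         # assumed share of people with high privacy concern
    REFUSAL_IF_CONCERNED = 0.80      # assumed refusal probability for concerned people
    REFUSAL_IF_NOT_CONCERNED = 0.40  # assumed refusal probability for the rest

    responses = []
    for _ in range(POPULATION):
        concerned = random.random() < TRUE_CONCERN_RATE
        refusal_p = REFUSAL_IF_CONCERNED if concerned else REFUSAL_IF_NOT_CONCERNED
        if random.random() >= refusal_p:     # this person agrees to respond
            responses.append(concerned)

    observed = sum(responses) / len(responses)
    print(f"concern rate in the population:  {TRUE_CONCERN_RATE:.0%}")
    print(f"concern rate among respondents:  {observed:.0%}")
    # With these assumed refusal rates, the respondent sample reports roughly
    # 25% concern even though the population rate is 50%.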

There is massive over-use of proxy sampling frames whose characteristics are very different from those of the target population. Most commonly, students are used as a convenience sample, under the pretext that the research is exploratory. Students are in most circumstances unrepresentative of the population that is ostensibly being researched. In many cases, they are also captive, and the proportion that answers other than honestly is likely to be high. Although some of these pseudo-responses may be easily filtered, they often are not; and some pseudo-responses are difficult to detect.

Likert scales are a commonly-used device. They usually involve very short statements, with little context provided to encourage a common understanding of the terms used. The lists of statements are frequently long and boredom-inducing. Worse still, the responses are actually qualitative, 'category ordinal' in nature; but they are assumed to be quantitative, 'ranked ordinal' data. Some researchers then apply to them more powerful statistical techniques that are only applicable to data on a cardinal scale. It is not uncommon for this to be done without even a discussion of the possibility that respondents did not interpret the verbally-labelled, numbered options as being equally spaced.
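
The following sketch, using hypothetical responses, illustrates the point: two numeric codings that preserve the same category order, but assume different distances between the categories, reverse the ranking of two groups' means, whereas an ordinal summary such as the median is unaffected.

    # Hypothetical Likert responses from two groups, recorded as categories 1..5
    # (1 = strongly disagree ... 5 = strongly agree). Both codings below respect
    # the category order; only the assumed spacing between categories differs.
    from statistics import mean, median

    group_a = [4, 4, 4, 4, 4]
    group_b = [3, 3, 5, 5, 5]

    codings = {
        "equal spacing   (1,2,3,4,5)": {1: 1, 2: 2, 3: 3, 4: 4, 5: 5},
        "unequal spacing (1,2,3,6,7)": {1: 1, 2: 2, 3: 3, 4: 6, 5: 7},
    }

    for label, code in codings.items():
        mean_a = mean(code[r] for r in group_a)
        mean_b = mean(code[r] for r in group_b)
        print(f"{label}: mean A = {mean_a:.2f}, mean B = {mean_b:.2f}")

    # The ranking of the two groups flips between the codings. An ordinal
    # summary does not depend on the assumed spacing:
    print(f"median A = {median(group_a)}, median B = {median(group_b)}")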


Quality Challenges in Privacy-Related Research

Research in which privacy factors arise is yet more problematical. This includes not only surveys whose express purpose is to sample attitudes to privacy, but also research designs in which privacy is an intervening, moderating or confounding variable. The involvement of privacy is frequently overlooked. For example, it is quite astonishing that a high proportion of the burgeoning literature on trust in the context of B2C fails to control for privacy, fails to meaningfully consider it, or even completely overlooks it.

The non-response bias problem is an especial challenge. It seems reasonable to assume that distributions of responses from people who are willing to answer questionnaires about privacy topics will be different from those that would arise if it were possible to get responses from those who decline to participate. Moreover, it would seem reasonable to assume that a significant proportion of those who decline do so because they place a high value on privacy. Hence there is likely to be a systematic bias in the data that is gathered, with the level of privacy concern in the population consistently under-stated by the respondent sample. The scale of the bias may be very substantial: in one of the rare instances in which the refusal rate is quoted, almost 4,250 people had to be approached for every 1,000 responses achieved (OFPC 2001). Yet discussion of this problem is almost entirely absent from conference papers and journal articles in the information systems discipline.
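
A back-of-envelope calculation based on that refusal figure shows how little a survey can, by itself, pin down. The observed concern rate used below (60%) is a hypothetical figure for illustration; only the approach-to-response ratio comes from OFPC (2001).

    # Refusal figure quoted above (OFPC 2001): roughly 4,250 approaches were
    # needed per 1,000 responses. The 60% observed concern rate is hypothetical.
    approached = 4250
    responded = 1000
    response_rate = responded / approached      # about 23.5%

    observed_concern = 0.60                     # assumed rate among respondents

    # Without knowledge of the non-respondents, the population rate could lie
    # anywhere between these two extremes:
    lower = observed_concern * response_rate                        # no non-respondent is concerned
    upper = observed_concern * response_rate + (1 - response_rate)  # every non-respondent is concerned

    print(f"response rate: {response_rate:.1%}")
    print(f"population concern rate lies somewhere between {lower:.1%} and {upper:.1%}")
    # At a response rate of about 24%, the survey by itself confines the
    # population figure only to a band roughly 76 percentage points wide.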

Among those who do provide responses, there is enormous scope for variation in the understanding of questions that involve privacy. The laws of most countries have no definition of the term 'privacy', because it is so highly open-textured. It has multiple dimensions, at least those of privacy of the person, of personal behaviour, of personal communications, and of personal data (Clarke 1997). Hence respondents may make very different interpretations of the most carefully-phrased question. Yet it is unusual for researchers to provide respondents with any kind of tutorial, or even a glossary, and unusual for papers to discuss the steps taken to overcome the measurement and response bias arising from such difficulties, or to assess their impact.

Beyond the definitional aspects, people's reactions are subject to situational relativity. A person who has a current health condition that is embarrassing to them might well be more likely to place a high value on health care data relative to other data, or to other interests. A person's attitudes to the disclosure of details on a doctor's certificate supporting an employee's absence from work are likely to vary depending on whether they are interviewed in the context of their role as an affected employee or as a supervisor.

Some of these variations may be controllable, or sufficiently uncommon that their effects might be lost in the 'noise'. Other relativities, however, are likely to result in outright biases. Intrusiveness into the lives of pilots and train-drivers is likely to be more widely supported shortly after a plane or train crash. Media reports (which for the most part reflect propaganda, public relations campaigns and controlled information flows from governments, government agencies and corporations) are likely to condition responses during the days and weeks that follow their publication. An extreme case of this is evident in the enormous politicisation of privacy-related matters in the U.S.A., the U.K. and a few other countries following the assault on civil rights unleashed since 12 September 2001, and justified as responses to the terrorist assaults on New York and Washington DC the previous day.

Privacy attitudes are also subject to enormous cultural variation. For example, much of Western Europe places high value on the protection of personal data against corporations and government agencies, and regards statutory legal measures as essential. Scandinavian countries, however, especially Denmark, evidence something of a truce between data protection and openness. In the U.S.A., the public's attitudes are highly dependent on the media, and the American press is dominated by the interests of big business, and the kind of libertarian idealism that opposes government regulation and naïvely assumes that people are powerful enough to resist business and government agency intrusions. In East Asian countries, subservience to authority is highly-valued, to the extent that the Hong Kong Privacy Commissioner had to create a Zhongwen character to enable 'privacy' to be rendered in written Chinese.

Of course, the nation-state is far from an adequate proxy for culture. There is a spectrum of opinion within each country. There is a significant linguistic dimension to culture. And the religio-philosophical dimension varies in its intensity from minor to determinative. The conventional Hofstede analysis appears paltry as a means of controlling for such complex patterns.

A final area of difficulty for research in domains in which privacy is a significant factor is the unwillingness of the elders of the information systems discipline to recognise relevance to public policy as a criterion. The scientific tradition demands rigour of process, and 'hard', quantitative data. Interpretivism lacks firm ground in both process and data; but it has made headway during the last two decades, as the inherent ambiguity and multi-valuedness of information has been accepted as a characteristic of organisational contexts. But the preference remains strong for researchers to seek explanatory and predictive power, and to leave normative questions to other disciplines. Critical theory, with its explicit recognition of the inbuilt biases attributable to convention and to control of the public agenda by the politically powerful, is making only slow progress towards acceptability. Applied research, which applies known tools in new contexts, is acceptable. But instrumentalist research, which seeks solutions to problems, is still perceived to be 'unclean', especially where the context is public policy rather than management or strategy.

Privacy-related research evidences a combination of the least fashionable features: it deals in muddy concepts, soft data, uncertainty of process, politically-alive issues, and contentious public policy questions.


Conclusions

When privacy infects a research domain, or is expressly the topic of research, the quality that is capable of being attained is significantly lower than that which is achievable in other areas. The intrinsic quality of research can be improved by the use of techniques that provide reasonably-deep-but-reasonably-broad rather than broad-but-shallow data. Focus groups are a valuable tool for these purposes, but are shunned in academic circles. Deep research methods such as field studies and case studies are weak, however, because attitudes are so highly variable, and the applicability of outcomes is very limited without sufficient breadth to complement the depth.

Publication will be feasible in marginal conferences and journals, and in specialised conferences and journals. Publication in the mainstream of information systems depends on change in the notions of quality applied by senior editors, much greater emphasis on relevance even when at the cost of rigour, and acceptance of a focus on public policy as being as legitimate as information technology applications, management and strategy.

I argued some years ago that a researcher whose career depends on publications is well-advised not to adopt economic, legal and social implications of information systems as their sole specialisation (Clarke 1988). The outlook has improved marginally during the intervening 15 years, but the publication of privacy-related research will continue to depend on ingenuity and opportunism.


References

Clarke R. (1988) 'Economic, Legal and Social Implications of Information Technology', MIS Qtly 12, 4 (December 1988) 517-519, at http://www.anu.edu.au/people/Roger.Clarke/DV/ELSIC.html

Clarke R. (1997) 'Introduction to Dataveillance and Information Privacy, and Definitions of Terms', August 1997, at http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html#Priv

OFPC (2001) 'Privacy and the Community' Office of the Federal Privacy Commissioner, Sydney, July 2001, at http://privacy.gov.au/publications/rcommunity.html#3.5


Panellist Biography

Roger Clarke is a consultant in e-business, information infrastructure, and dataveillance and privacy. During his 30 years in the IS profession and discipline, he has migrated from technical matters and the management of commercial software development, to an emphasis on strategic and policy aspects of information and information technology. He regrets that the discipline is still resisting the policy perspective.

He holds degrees in IS from UNSW, and a doctorate from the ANU. Following 5 years in Europe, he spent 1984-95 as a senior academic. He continues to publish in refereed outlets and to supervise postgraduate candidates, and sustains his academic associations through several visiting positions.

