Roger Clarke's 'Critical Analysis of Works'

The Challenges Involved in Establishing a Research Technique

Version of 23 July 2019, rev. 16 February 2020

Published in Australasian Journal of Information Systems 24 (March 2020)

A Post Publication Review and Response appeared in Vol. 24 (November 2020)

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2018-20

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.

This document is at http://www.rogerclarke.com/SOS/CAPW-C.html


Abstract

Many research techniques are well-accepted within the Information Systems (IS) discipline. From time to time, however, a researcher investigates a question that requires a novel approach. It is then incumbent on the researcher to justify that approach.

The IS discipline has accumulated a large corpus of published works. A project is being undertaken whose purpose is to establish the conceptual foundations for a research technique for the critical analysis of published works, and to develop methodological guidance for its application. This article discusses the challenges that have confronted that undertaking.


Contents


1. Introduction

Since the Information Systems (IS) discipline emerged c. 1965, a substantial body of publications has accumulated. For example, as at mid-2019, the AIS eLibrary contained over 16,000 refereed works, John Lamp's directory of IS journals identified almost 700 active venues that publish IS works, of the order of 7,500 articles had been published in the `Basket of 8' journals alone, and the five major IS conferences were publishing an additional 1,500 refereed papers per annum.

The accumulated IS literature evidences two particularly strong desires. A great deal of emphasis is placed on empirical research involving the observation and measurement of some aspect of IS practice. In addition, many papers are published that seek to establish or extend theories about IS practices, which can then provide a basis for further empirical research.

Important though these approaches are, progress in the discipline depends on some further elements as well. Meta-discussions are needed, in order to clarify such aspects as the discipline's scope, the meanings of key terms, and the suitability of particular research techniques for particular purposes. The project reported on in this article is a contribution to one such meta-discussion.

The starting-point for the project reported on here is a conviction that critical thought about the existing IS literature has a vital role to play. Criticism is, after all, central to the notion of science, in that all propositions must be in principle refutable, all propositions must be regarded as provisional, and all propositions must be subjected to testing (Popper 1963). For this reason, criticism is inherent in the review process whereby works are evaluated as a condition of publication in refereed venues.

The IS discipline has to date demonstrated considerable nervousness about works that contain criticism of prior publications, and especially about works whose specific purpose is to criticise prior publications. This project is a response to a particular instance of this nervousness. I conducted a critical analysis of the articles in a Special Issue published by a leading, research-domain-specific journal that publishes many papers relevant to the IS discipline. I submitted the paper to the same journal that published the Special Issue. It was rejected, with a key factor being that the research technique adopted in the research was regarded by the reviewers as illegitimate. It was subsequently also rejected by an IS 'Basket of 8' journal, but with an invitation to split the work into two articles, further develop the methodological component, and re-submit the substantive article as and when the methodological article achieves acceptance.

Although the conduct and publication of critiques of prior works are vital to progress in the discipline, the research methodology literature does not appear to currently provide a sufficient basis for work of this nature. A project has accordingly been undertaken to identify the necessary conceptual foundations, and develop a research technique for the critical analysis of published works.

The present article provides an overview of the project as a whole, with a focus on the challenges that have been encountered, and the difficulties that need to be overcome in order to establish the technique. The article commences by discussing methodological considerations in the establishment of a research technique. The nature and role of criticism in research and the relevance of critical theory research are considered. The article then identifies multiple existing research techniques that have relevance to this specific purpose, in order to enable conceptualisation of the new technique. The key contribution of this article is the identification of challenges that need to be overcome in order to establish a new technique within the IS discipline.


2. Methodological Considerations

A research technique needs to reflect the accumulated knowledge about research methods generally, and to be applicable to some particular category of research questions. This section addresses those foundational aspects of the project.

2.1 Definition of Terms and Scope

The IS literature evidences considerable looseness in its use of key terms relevant to this work. The term 'research methodology' is particularly badly abused. This article uses the following terms carefully, in a manner consistent with Kaplan (1964, pp. 18-19):

The first challenge in devising a new research technique is to achieve clarity about the kinds of research that it is intended to address. The objective of the present project is defined as:

the development of a research technique for the critical analysis of the content of published works relevant to the IS discipline.

The term 'published works' (short form 'works') is intended to encompass entries in a wide variety of 'publishing venues', including at least refereed articles in journals, refereed papers in conference proceedings, refereed chapters in academic books, refereed academic books as a whole, and completed postgraduate dissertations. The works and venues might reasonably be extended to papers in workshops and symposia, and for some purposes may include PrePrints or Working Papers published by institutions with appropriate standing. In some circumstances, a case could be made for encompassing research reports commissioned by government agencies, foundations, industry associations and perhaps individual corporations. Some other forms of publication may be relevant, depending on the specific research purpose. For instance, technical media or corporate white papers might be included if the research purpose were to assess the impact of academic work on thinking among consultancies or within industry sectors.

There are multiple forms of research question to which the proposed technique might be applied. Table 1 contains several examples, intended to provide a sense of the scope of application.

Table 1: Sample Research Questions

Methodological Critique:

1. Does each work describe the research method at a sufficient level of detail to enable the reader to evaluate the research quality?

2. Does each work consider the suitability of the data to the data analysis techniques that are applied to the data, or does it merely assume suitability?

3. Does each work provide access to sufficient supporting materials to enable audit?

Critique of the Theoretical Base:

4. Does each work reflect major prior publications in the area?

5. Is the scope of publications that are cited in each work's literature review sufficiently comprehensive?

6. To what extent does each work appropriately represent and apply the publications that it cites?

Substantive Critique:

7. In each work, what unstated assumptions can be detected?

8. Does each work address the issue of heterogeneity and instability in the relevant phenomena?

9. Does each work evidence an understanding of the power relationships within the research domain?

10. To what extent does each work reflect the interests of sub-dominant stakeholders?

_________

The term 'content' refers to the text of the published work, encompassing sentences, tables, diagrams and formulae, and taking into account the context within which the work is situated. Depending on the nature of the work, the context is likely to include aspects of the research domain, the academic discipline and cultural factors.

Which publishing venues are 'relevant to the IS discipline' depends on the nature of the project. Beyond mainstream IS journals and conferences, for example, venues that publish IS works may be relevant, as may venues used by particular reference and cognate disciplines, e.g. in management, social sciences, or computer sciences.

The scope-definition for particular studies will vary considerably. Some might consider the complete corpus of 'published works relevant to the IS discipline'. The resource-intensiveness of the technique is such, however, that it is far more likely that each project's focus will be on a particular population segment, with a sampling frame and sample selected from within that segment. A segment might be defined to be a particular venue (e.g. the complete set of ICIS Proceedings), or a time-span within one or more venues (e.g. the last 10 years of the `Basket of 8' IS journals), or a focussed collection (e.g. one or more journal special issues, narrowly-specialised conferences, or academic books). Other approaches would be to define the population segment on the basis of the research domain that the works address, the theoretical lens that they use, or the research technique that they apply. A single work, the smallest sample-size, would appear appropriate only in very particular circumstances, such as for a paper that is widely regarded as highly significant, in particular on the basis of citation-count or some other measure of influence, or as being highly original.

2.2 Meta-Methodology

A second set of challenges arises in relation to the desirable characteristics of a method for the creation of a method - hence 'meta-methodology'.

A research technique is a socio-technical artefact as that term is used in design science (Niedermann & March 2012, Gregor & Hevner 2013, p.337). The design science approach is accordingly applicable to the project, and the process needs to reflect Peffers' Design Science Research Methodology (DSRM), with problem identification and definition of objectives followed by design and development (Peffers et al. 2007). However, as discussed in the later parts of this article, the later steps of DSRM - demonstration and evaluation - are challenges that the present project has not yet met.

On the other hand, a research technique is a very particular kind of socio-technical artefact, and it is desirable that the process adopted in this project be founded on generic guidance in relation to the establishment of a new research technique. Sources on the nature of meta-methodology have proven elusive, however. For example, scans of the large library of Sage Publications, and searches on terms such as <meta-methodology> and <research technique design>, came up largely empty-handed. Within the IS literature, it is uncommon to encounter such guidance. For example, a foundational article on positivist case study technique (Benbasat et al. 1987) commences with a set of assertions of 'Key Characteristics of Case Studies', without any explanation of their source. A corresponding authority on interpretive case studies (Walsham 1995) discusses differences in the epistemological and ontological stances of the two approaches, and the role of theory, but offers little in relation to the process adopted and appropriate criteria for assessing the quality of the methodological guidance that the article offers.

Articles that provide guidance in relation to various forms of action research (Avison 2002, Davison et al. 2004) limit the criteria for judging the quality of a research technique to rigour and relevance. Klein & Myers (1999) offers principles for conducting "interpretive research of a hermeneutic nature" and presents quality criteria for that particular category of research technique, but it does not declare the process and the quality criteria that guided the authors' work in constructing that guidance.

In the area of systematic literature reviews, Okoli & Schabram (2010) adopt the criteria used by Fink (2005): there must be a process that is systematic, is the subject of an explicit explanation, and is reproducible by others, and the scope of works examined must be comprehensive. Boell & Cecez-Kecmanovic (2014) adds to that list "engagement with the literature [by means of] an ongoing hermeneutic process of developing understanding" (p.259).

In IS-cognate literature, Bonoma (1985), a work cited in Benbasat et al. (1987), posited two criteria for "sound" research methods in social science. One is "'data integrity' ... those characteristics of research that affect error and bias in research results. It is an amalgam of what is variously referred to as 'internal validity' ..., 'statistical conclusion validity', and 'reliability'". The other is what the author calls "'currency' ... the characteristics of research that affect the contextual relevance of findings across measures, methods, persons, settings, and time. It is an amalgam of what is variously termed 'external validity' ... and 'pragmatic' or 'ecological validity'" (p.200). In practice, the author notes, there are conflicts between the two sets of objectives, and trade-off is unavoidable, i.e. all research techniques are, by their nature, inevitably flawed. "Ideally, a researcher can simultaneously pursue high levels of data validity and generalizability by adopting triangulation strategies which provide replication and/or corroboration of findings across methods within a single research project" (p.201). The author concedes, however, that 'intra-project triangulation' is very challenging, and that 'inter-project triangulation' is more common. The appropriateness of particular techniques is related to the current state of knowledge in the particular field. For example, the author positions case research very low on the data integrity scale but high on the currency scale, and indicates its appropriateness where "theoretical development is scant or uncertain" (p.201).

Alvesson & Sandberg (2011) proposed a research technique for 'problematisation', the first part of which is for "identifying and challenging assumptions underlying existing literature" (p.247). They shift the conversation beyond merely identifying "gaps in the literature" (which 'underproblematizes' the prior literature, by reinforcing rather than challenging existing theories). They also note the disadvantages of disruptive approaches that focus on what is wrong with existing knowledge (which 'overproblematize' the prior literature). In seeking a middle road, they propose a typology of five sets of assumptions that can be subjected to challenge. In their terms, 'paradigmatic assumptions' encompass ontological, epistemological and methodological aspects. Political-, moral- and gender-related factors are 'ideology assumptions'; 'in-house assumptions' are those of a school of thought; 'field assumptions' are shared with other schools of thought in the particular field; and 'root metaphor assumptions' are associated with the imagery common in the relevant area of research. The quality criteria that they apply to their own work appear to be change without bloodshed, and the generation of interesting new research questions.

On the basis of these varied exemplars, I posit the following as appropriate guidance in relation to the process of developing the new research technique:

As regards the product that this undertaking seeks to deliver, I posit that a body of knowledge about a research technique comprises the following elements: contextual information; conceptual foundations; guidance in relation to the process to be used; exemplars of processes that have been articulated in order to address particular kinds of research questions; and expositions of the technique in use. The body of this article addresses the first three of those elements and includes preliminary discussion of the last two.


3. Criticism

The challenge addressed in this section is to clarify key features of the category of research questions for which the new technique is being devised. The focus of this project is the critical analysis of published works. Two aspects of the notion of criticism are particularly relevant. The first is the role that criticism plays in scientific research generally and in IS research in particular. The second is the genre of critical theory research.

3.1 The Role of Criticism in Research

The purpose of undertaking content analysis may be simply exposition, that is to say the identification, extraction and summarisation of content, without any significant degree of evaluation. There are benefits in undertaking content analysis in a positive frame of mind, and in assuming that all that has to be done is to present existing information in brief and readily-accessible form (as, for example, much of the present article does).

Alternatively, the researcher may bring a questioning and even sceptical attitude to the work. A common purpose of literature reviews is to depict the current state of theory in an area. The purpose may be to draw inferences for the particular context relevant to the project. Alternatively, it may be to identify gaps in the present body of theory. Gap-identification is a gentle form of criticism in the sense in which the term is used in this article. Alvesson & Sandberg (2011) distinguishes three modes of gap-spotting: confusion-spotting (where competing explanations exist); neglect-spotting (an area that is overlooked or under-researched, or where empirical support is lacking); and application-spotting (extending and complementing existing literature by drawing further inferences that can be tested).

The notion of criticism goes further than merely identifying previously under-researched corners of theory. A critic asks hard questions that necessarily cut to the core of academic work. Is it reasonable to assume that all relevant published literature is of high quality? that the measurement instruments and research techniques have always been good, well-understood by researchers, and appropriately applied? that there have been no material changes in the relevant phenomena? and that there have been no material changes in the intellectual contexts within which research is undertaken?

The term 'criticism' is often used in a pejorative sense, implying that the critic is merely finding fault, is being destructive rather than constructive, and is failing to propose improvements to overcome the faults while sustaining the merits. The sense in which the term is used here, however, is related to `literary criticism', and embodies both positive and negative sentiments.

As the term is used in this article:

Criticism presents an analysis of relevant features of a body of work, both positive and negative, including its framing, the analysis undertaken, and the inferences drawn.

Such terms as `constructive criticism' and `critique' might be preferred. They have the advantages of playing down the negative aspects, of indicating that the process is systematic, and of bringing focus to bear on the contribution made both by the works under analysis and by the critique itself.

The justification for applying a sceptical eye to a body of work is that criticism plays a vital role in the scientific process. The conventional Popperian position is that the criterion for recognising a scientific theory is that it deals in statements that are empirically falsifiable, and that progress depends on scrutiny of theories and attempts to demonstrate falsity of theoretical statements: "The scientific tradition ... passes on a critical attitude towards [its theories]. The theories are passed on, not as dogmas, but rather with the challenge to discuss them and improve upon them" (Popper 1963, p.50).

However, senior members of a discipline commonly behave in ways that are not consistent with the Popperian position. This might be explained by the postulates of 'normal science', which view the vast majority of research work as being conducted within a 'paradigm' (Kuhn 1962) or 'disciplinary matrix' (Kuhn 1977) - "the common possession of the practitioners of a professional discipline [including its] symbolic generalizations, models, and exemplars" (p.296) - and subject to its conventions. In more practical terms, the problem may arise because senior members of any discipline have a strong psychic investment in the status quo, and - no matter how cogent and important the argument - react negatively to propositions perceived to be disruptive, or even revolutionary. Firmly-worded criticism is normal in reviewers' reports on as-yet unpublished submissions, and may be acceptable if uttered by a senior member of a discipline about a contrarian idea, whereas it commonly attracts opprobrium if made about the contemporary wisdom, by an outsider or a relative newcomer.

In an influential commentary, Webster & Watson (2002) recommended that "In contrast to specific and critical reviews of individual papers, tell the reader what patterns you are seeing in the literature ... Do not fall into the trap of being overly critical" (p.xviii). On a literal reading, the authors merely warn against unduly strong or negative expression - a recommendation that the research technique under development needs to adopt. Unfortunately, the quoted words are capable of being interpreted as valuing politeness among researchers more highly than scientific insight and progress.

Similarly, Straub (2009, p.viii) advised authors that "papers should be in apposition [the positioning of things side by side or close together] rather than in opposition". Clearly, where a new theory subsumes an old one (as was the case in the example provided, of Einsteinian relativity absorbing Newtonian physics as a special case), the dictum 'apposition rather than opposition' is appropriate. Where, on the other hand, the new theory actively contradicts, or is fundamentally at odds with, existing theory, it would be intellectually dishonest to represent the contrary or disruptive theory as though it were a soul-mate to, or merely a refinement of, existing theory. Further, a reader might all-too-easily interpret the advice as expressing a moral judgement that 'criticism is a bad thing'. My contention is that scientific behaviour demands the opposite: it is an obligation of researchers to 'think critically' and to 'apply their critical faculties'. Politeness of expression and focus on the message rather than the messenger are elements of academic discourse, but they are second-order concerns rather than fundamental to science.

3.2 Critical Theory Research

Positivism and interpretivism are well-established approaches to research in IS. They have been joined by design science. And they have an odd bedfellow, in the form of what is variously termed 'critical research' and 'critical theory research'. The term 'critical' in this context is somewhat different from the sense of 'analysis of the merits and faults of a work' discussed in the previous section.

Both positivism and interpretivism are concerned with description and understanding of phenomena and behaviours. Sometimes the focus is on natural phenomena, but frequently the interest is in natural phenomena that have been subjected to an intervention. Importantly for the present project, however, both positivism and interpretivism involve strenuous avoidance of moral judgements and of 'having an agenda'. Design research, in contrast, is expressly purposeful and value-laden, in that the features designed into the artefact embody judgements about what is good in the particular context, and whose interests the goodness is to serve.

Critical theory research is concerned with description and understanding of phenomena and behaviours, but, like design science, it 'has an agenda'. Chua (1986) distinguished critical research from the positivist and interpretivist perspectives on the basis that the relations among real-world objects are transformed through subjective interpretation, conflict among actors' interests is endemic to society, interpretations reflect ideology, and theory must take ideologies into account. In addition, and most contentiously, critical research embodies an assumption that theory must strive to overcome ideological dominance. See also Alvesson & Deetz (2000). Beyond mere 'gap-spotting', Sandberg & Alvesson (2011) champions 'problematization', a term popularised by Foucault in the 1960s. It is concerned with 'the defamiliarisation of common sense', with "a central goal ... to try to disrupt the reproduction and continuation of an institutionalized line of reasoning" (p.32). Among the four forms that the authors discuss is 'critical confrontation'.

The IS literature reflects these sources, with critical research described as recognising the effects of power and the tendency of some stakeholders' interests to dominate those of other stakeholders. It brings to light "the restrictive and alienating conditions of the status quo" and expressly sets out to "eliminate the causes of alienation and domination" (Myers 1997). "Critical IS research specifically opposes technological determinism and instrumental rationality underlying IS development and seeks emancipation from unrecognised forms of domination and control enabled or supported by information systems" (Cecez-Kecmanovic 2005, p.19). In Myers & Klein (2011), three elements of critical research are identified:

Elements of critical theory research are relevant to the present purpose. However, the sense of 'criticism' or 'critique' adopted by critical theory researchers is strongly value-laden and partisan, and hence wilfully disruptive. That contrasts with the definition of 'criticism' adopted here, which expressly involves "analysis of relevant features of a body of work, both positive and negative".


4. Relevant Research Techniques

The previous section established what is distinctive about the intended new research technique, and hence clarified the justification for devising it. The next challenge is to identify and evaluate prior techniques whose purposes are related to it, in order to co-opt aspects that are relevant and avoid aspects that are not.

This section outlines a series of categories of existing research techniques that have at least superficial relevance to the present purpose. Consideration is given firstly to qualitative research techniques and then to traditional and systematic literature reviews. The third group comprises techniques referred to generically as `content analysis'. A separate section outlines a particular and rather different form of content analysis. The final category is a form of literature review compatible with critical theory research.

4.1 Qualitative Research Techniques

A significant proportion of research involves the appraisal of content previously uttered by people. Qualitative research techniques such as ethnography, grounded theory and phenomenology involve the disciplined examination of content. That content may be uttered in natural settings by people within the research domain (e.g. in emails or social media postings) or it may be generated by researchers who inject themselves into otherwise natural settings by undertaking field research. Alternatively, the content may be captured in a contrived setting (laboratory experiments), or in a partially natural and partially contrived setting (e.g. interviews conducted in the subject's workplace). The content may originate as text, or in some other form, such as communications behaviour in verbal form - possibly transcribed into pseudo-text; as natural non-verbal behaviour ('body-signals'); or as non-verbal, non-textual communications behaviour (such as ticks in boxes in structured questionnaires).

The issues arising with analysis of these kinds of content are very different from those associated with the analysis of carefully-considered, formalised content, uttered by researchers, in published works that have been at least strongly edited and in most cases subjected to peer-review.

4.2 Literature Review

A context that is more closely related to the present purpose is literature reviews that examine substantial bodies of prior research. Traditional `narrative' reviews have been criticised as being insufficiently rigorous (Oakley 2003, p.23). There are now expectations of structure, transparency and replicability, and the term `systematic' review is commonly applied (Webster & Watson 2002, Okoli & Schabram 2010, Bandara et al. 2011). Examples particularly relevant to the present project include Galliers & Whitley (2002, 2007), which analysed themes in the ECIS conference series, and Clarke (2012) and Clarke & Pucihar (2015), which reviewed the corpus of Bled Conference papers. Grover & Lyytinen (2015) and Tarafdar & Davison (2018) reported on meta-analyses of articles in the `Basket of 8' IS journals.

Although such approaches have relevance to the present project, their focus is on exposition, application, or at most interpretation, and the research techniques used were not devised in order to support critical analysis. Relevantly, Boell & Cecez-Kecmanovic (2015) argue that the recipes provided for 'systematic literature reviews' are not necessarily appropriate, particularly where the research technique that is applied allows for emergent and changing understanding of the phenomena, and where even the research question itself may be emergent.

4.3 Content Analysis

The term `content analysis' refers to a cluster of techniques that seek to classify content into a manageable number of categories. Two definitions are:

"a research technique for making replicable and valid inferences from texts (or other meaningful matter) to the contexts of their use" (Krippendorff 2013, p. 24)

"the semantic analysis of a body of text, to uncover the presence of strong concepts" (Indulska et al. 2012, p.4, citing Weber 1990)

Many authors have attempted to categorise the various forms that content analysis takes (see, for example, Hsieh & Shannon 2005, Drisko & Maschi 2016). The following four-way classification scheme is suitable for the present purpose:

(1) Pre-defined (a priori) categories

A methodologically strong approach is to derive the categories from existing theory. To the extent that the declared or inferred content of the text does not fit well to the predefined categories, there may be a need to consider possible revisions of the coding scheme, or even of the theory on which the research design was based. It may be feasible to draw inferences based on counts of the occurrences of categories and/or on the intensity of the statements in the text, such as the confidence inherent in the author's choice of language (e.g. "this shows that" cf. "a possible explanation is that"). However, as with any theory-driven research, the evidence extracted from the text may have a self-fulfilling-prophecy quality about it, i.e. there is an inevitable tendency to find more evidence in support of a theory than in conflict with it, and contextual factors may be overlooked. In order to enable auditability, it is important that not only the analysis be published, but also the raw material, the coding scheme, and the coding, at a sufficiently high level of granularity.
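The article prescribes no software, but the a priori approach described above can be sketched, purely for illustration, as a dictionary-based coder. The category names and indicator terms below are hypothetical stand-ins for a coding scheme that would in practice be derived from existing theory:

```python
import re
from collections import Counter

# Hypothetical a priori coding scheme: each category (derived, in a real
# study, from existing theory) maps to indicator terms fixed before coding.
CODING_SCHEME = {
    "method_detail": ["sample", "instrument", "procedure", "protocol"],
    "data_suitability": ["normality", "assumption", "distribution"],
    "auditability": ["appendix", "repository", "supplementary"],
}

def code_text(text):
    """Count occurrences of each a priori category's indicator terms."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter({
        category: sum(words.count(term) for term in terms)
        for category, terms in CODING_SCHEME.items()
    })

counts = code_text(
    "The survey instrument and the coding procedure are described; "
    "the appendix reports the normality assumption tests."
)
# counts: method_detail=2, data_suitability=2, auditability=1
```

Publishing the scheme, the raw text and the resulting counts alongside the analysis is what makes the coding auditable, as the paragraph above requires.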

(2) Emergent categories

This approach is particularly appropriate when there is an absence of suitable theories to guide the establishment of a priori categories; but it also has relevance where the researcher is quite specifically challenging existing theories in the area. The process of necessity involves assumptions, and hence external validity of conclusions arising from this approach is likely to be limited. Depending on the degree of generality of the conclusions claimed by the author, full disclosure of the text selection, coding and inferencing procedures may be merely desirable or vital.

(3) Summative content analysis

This involves "counting and comparisons, usually of keywords or content, followed by the interpretation of the underlying context" (Hsieh & Shannon 2005, p.1277). The first step is to explore usage, by "identifying and quantifying certain words or content in text with the purpose of understanding the contextual use of the words or content" (p.1283).

Because of the complexity and variability of language use, and the ambiguity of a large proportion of words and phrases, a naive approach to counting words is problematic. At the very least, a starting-set of terms needs to be established and justified. A thesaurus of synonyms and perhaps antonyms and qualifiers is needed. Allowance must be made for both manifest or literal meanings, on the one hand, and latent, implied or interpreted meanings (in semiotic terms, `pragmatics'), on the other. Counts may be made not only of the occurrences of terms, but also of the mode of usage (e.g. active versus passive voice, dis/approval indicators, associations made).
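A minimal sketch of such thesaurus-mediated counting follows. The thesaurus entries are invented for illustration; a real study would need a far richer mapping, together with handling of the latent and contextual meanings that simple string-matching cannot capture.

```python
from collections import Counter

# Hypothetical thesaurus (illustrative only): surface terms are normalised
# to a canonical concept before counting, so that synonyms accumulate.
THESAURUS = {
    "privacy": "privacy", "confidentiality": "privacy",
    "risk": "risk", "threat": "risk", "hazard": "risk",
}

def summative_counts(text):
    """Count canonical concepts rather than raw surface terms."""
    lowered = text.lower()
    counts = Counter()
    for term, concept in THESAURUS.items():
        counts[concept] += lowered.count(term)
    return counts
```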

(4) Programmatic content analysis

Automated processing enables much larger volumes of text to be analysed. The coding scheme may be defined manually, cf. directed content analysis / a priori coding. Alternatively, some techniques involve purely computational approaches to establishing the categories, cf. 'machine-intelligent' (rather than human-intelligent) emergent coding. The processing nonetheless depends on prior data selection, data scrubbing and data-formatting. In addition, interpretation of the results involves at least some degree of human activity. Debortoli et al. (2016) distinguish three alternative approaches to programmatic coding.

Given that the 'big data analytics' movement is highly fashionable, vast volumes of data are available, and there is a comfort factor involved in office-based work much of which is automated, it would appear reasonable to anticipate that programmatic analysis techniques may be a growth-area in the coming few years - although that growth may be tempered as their limitations come to be better appreciated (Clarke 2016a, 2016c, 2018).

Content analysis techniques exhibit varying degrees of structure and rigour, from impressionistic to systematic, and they may involve qualitative and/or quantitative assessment elements. Qualitative data may be gathered on a nominal scale (whereby differences are distinguished, but no ordering is implied) or on an ordinal scale (such as 'unimportant', 'important', 'very important'). Quantitative data, on the other hand, may be on an ordinal scale, or on a cardinal or interval scale (i.e. an ordinal scale but with equal distances between values, e.g. degrees Celsius), or on a ratio scale (i.e. a cardinal scale, but with the additional feature of a natural zero, e.g. degrees Kelvin). Data collected on the higher-level scales, especially on a ratio scale, can be subjected to more powerful inferencing techniques.

Quantification generally involves measurement. This may be by counting, by comparing a real-world attribute against some standard, or by 'sensing' a real-world attribute and using the impulse to create one or more data-items whose contents are intended to correspond with the state of the attribute. Although there is a common perception that quantification produces data that reliably represents the real world, many conditions need to be fulfilled for this to be so. Units of measure have arbitrary boundaries, and the configuration and calibration of measuring instruments is challenging. Moreover, some quantification techniques are justified by conventions that are poorly supported by theory. A significant example is the frequently-encountered but unjustified assumption that 'Likert-scale' data is not merely ordinal, but is on an interval scale (i.e. the spaces between the successive terms are identical), and even ratio (i.e. the scale also features a natural zero), in order to justify the application of powerful statistical techniques to the data.

Many authors, particularly those who work in the positivist tradition, implicitly equate qualitative data with subjectivity and quantification with rigour. This has the effect of deprecating qualitative analysis, or at least relegating it to pre-theoretical research, which by implication should be less common than research driven by strong theories. However, the majority of authors who make those assumptions spend only limited time considering the extent to which their own assumptions, and the processes underlying the act of quantification, may be arbitrary or themselves 'subjective'. Positivism embodies an implicit assumption that computational analysis necessarily leads to deep truth. The assumption needs to be tested in each particular circumstance, yet such testing is seldom evident.

Although quantification has advantages in terms of the scope for applying powerful analytical techniques, the degree of analytical rigour that quantification can actually deliver is heavily dependent on a number of factors. Critical among them are the text selection; the express judgements and implicit assumptions underlying the choice of terms that are analysed; the sophistication and comprehensiveness of the thesaurus applied; and the significance imputed to each term. In practice, a considerable proportion of research that uses content analysis techniques involves primarily qualitative data.

As with any research technique, all aspects need to be subject to quality controls. The coding scheme and the performance of the coding process are particularly critical (Krippendorff 1980, Weber 1990, Stemler 2001). Approaches include coding by individuals with strong experience in both the review of papers and the subject-matter, parallel coding by multiple individuals, review of individuals' coding by other parties, and publication of both the source materials and the detailed coding sheets, in order to enable audit by other parties.
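One widely-used quality control for parallel coding by multiple individuals is Cohen's kappa, which corrects raw inter-coder agreement for the agreement expected by chance. The following is a minimal sketch (it does not handle the degenerate case in which expected agreement equals 1):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders who each assigned
    a category to the same sequence of passages."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Agreement expected if both coders assigned categories independently
    # at their observed marginal rates.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```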

The various approaches to content analysis outlined above appear to be accepted within the IS discipline, but their use has been somewhat limited. For example, in a survey of the articles published in six leading IS journals during the 1990s, Mingers (2003) found that the use of content analysis as a research technique was evident in only four of the journals, and even in those four in only 1-3% of all articles published during that time. In July 2019, of the over 16,000 refereed papers indexed in the AIS electronic library, 13 had the term 'content analysis' in the title (none published more recently than 2013, and only 5 of them in journals), and 72 in the Abstract (only 6 of them since 2013, but the most recent 12 in journals). In recently-published papers, the most common forms of text that have been subjected to content analysis are social media, consumer reviews, and other message content, with other categories including newspaper articles and corporations' 'letters to shareholders'. There appears to have been very limited application of these techniques to published works.

Many forms of content analysis are primarily descriptive, and at best interpretive. Their focus is on concepts, themes, patterns and relationships. Moreover, the semantics and pragmatics of the source-materials may not be fully reflected, and only limited attention may be given to contextual factors. Critical analysis depends on interpretation, and it must delve deeply not only into the tenable meanings of the content, but also into the context and the assumptions inherent in the works being studied. It may also be necessary to follow trails such as cited works and terms used, in order to place the work within a genre, or within a `school of thought'.

4.4 Critical Discourse Analysis

A recent proposal endeavours to address these weaknesses. Wall et al. (2015) describe an approach that they say they have based on Habermasian strains of critical discourse analysis (CDA). Their starting-point is that "the information systems (IS) discipline is subject to ideological hegemony" (p.258). By `ideological hegemony' the authors mean "the conscious or unconscious domination of the thought patterns and worldviews of a discipline or subdiscipline that become ingrained in the epistemological beliefs and theoretical assumptions embedded in scientific discourse (Fleck, 1979; Foucault, 1970; Kuhn, 2012)". They argue that "ideologies can be harmful to individuals who are disadvantaged or marginalized by them, and they can be problematic to scientific research because they represent blind spots" (p.258), and hence that "review papers [should] ... challenge ideological assumptions by critically assessing taken-for-granted assumptions" (p.257). The authors propose a seven-step process for critical discourse analysis (pp. 265-9).

All of the forms of content analysis outlined in this and the previous sub-section have relevance to the present purpose. However, most do not extend beyond exposition to critique. The approach of Wall et al. (2015), on the other hand, adopts the 'critical theory research' tradition, and exhorts the righting of perceived wrongs to disadvantaged or marginalised individuals that arise from ideological assumptions adopted by designers. The scope of the critical analysis of published works research technique is intended to encompass not only critical theory research, but also research undertaken within the positivist and interpretivist traditions. It is therefore necessary to take care in applying the ideas embodied within critical discourse analysis, in order to avoid embedding commitment to social causes.

4.5 A Hermeneutic Approach to Literature Review

The emphasis on 'systematic' literature reviews (SLR) noted earlier has itself been subjected to criticism, in that it "suppresses aspects of quality in research and scholarship that are at least as important as clarity, countability and accountability - such as intertextual connectivity, critique, interest, expertise, independence, tacit knowledge, chance encounters with new ideas, and dialogic interactions between researcher, 'literature' and 'data'" (MacLure 2005, p.394). More substantially, Boell & Cecez-Kecmanovic (2015) argue that the literature on the SLR technique has effectively appropriated the term 'systematic'. It has attached to that notion the requirements that a literature review must necessarily begin with a clearly-defined research question, and that its results must evidence or deliver rigour, objectivity, replicability, and absence of bias. Boell & Cecez-Kecmanovic contend firstly that these are not necessarily delivered by SLR, and secondly that, even if they are attainable, they are, for some forms of research, undesirable and harmful.

Further, in Boell & Cecez-Kecmanovic (2014) it is argued that a constructively loose and iterative process is needed, to avoid undue constraints and unlock insight and creativity: "Highly structured approaches downplay the importance of reading and dialogical interaction between the literature and the researcher; continuing interpretation and questioning; critical assessment and imagination; argument development and writing - all highly intellectual and creative activities, seeking originality rather than replicability [MacLure, 2005, Hart, 1998]" (p.258).

To address these issues, Boell & Cecez-Kecmanovic "propose hermeneutic philosophy as a theoretical foundation and a methodological approach for studying literature reviews as inherently interpretive processes in which a reader engages in ever exp[a]nding and deepening understanding of a relevant body of literature. Hermeneutics does not assume that correct or ultimate understanding can be achieved, but instead is interested in the process of developing understanding" (p.259, emphasis added). Their framework, reproduced in Figure 1, comprises two intertwined cycles: a search and acquisition circle, and a wider analysis and interpretation circle (p.263). Rather than a carefully-planned and closed-ended process, this approach embodies "questioning and critical assessment ... of previous research" (p.258). "Critical assessment ... not only reveals but also ... challenges the horizon of possible meanings and understanding of the problem and the established body of knowledge" (p.267).

The hermeneutic framework proposed is oriented towards the review of a body of literature. This will generally be selected because it has as its focus a particular body of theory, or perhaps a particular category of real-world phenomena or behaviour. Such a body of literature is within the scope of the notion of 'published works' that is the focus of the present project. On the other hand, the technique being devised here needs to encompass an even broader range of 'published work' populations and sampling frames, such as publishing venue(s) or research technique(s).


5. Conceptualisation of the Critical Analysis of Published Works

Having addressed some of the preliminary challenges, the aim now is to draw from the substantial methodological resources identified above, and establish a framework for the specific category of research questions for which the critical analysis of published works is the appropriate technique.

The preceding discussion has identified structured and pre-determinable techniques relevant to some forms of 'analysis of published works'. It has also highlighted a far more loose-limbed, adaptive and interpretive approach, which is imbued with strong, counter-cultural fervour, associated with the critical discourse analysis approach to content of Wall et al. (2015), and the hermeneutic approach to literature review of Boell & Cecez-Kecmanovic (2014). The diversity of flavour among contexts and purposes means that, in order to encompass all relevant variants, the new research technique needs to be described in a relatively abstract manner. Researchers then need to apply the abstract principles in order to customise a research method appropriate to each particular project.

Drawing on the preceding discussion, a set of strongly-desirable characteristics of the new research technique is presented in Table 2.

Table 2: Key Characteristics of the Critical Analysis of Published Works

The Focus of the Research

1. Formal Content

The analysis is concerned with works that present carefully-considered textual content, in venues that generally apply formal review and editorial filters to the works that they publish

2. Content and Method

The analysis is concerned with the content of the works, including methodological aspects of the content

3. Critical Thought

The analysis applies critical thought to the content. This means that the analysis goes beyond exposition, description, explication, consolidation and interpretation of existing theory, and beyond the mere detection of gaps in existing theory, to the drawing of implications from theory, the identification of conflicts among stakeholder interests, and consideration of the impacts of ideologies on the research domain

4. Constructive Critique

The analysis is concerned with both the positive and negative features of the work, including its framing, the analysis undertaken, and the inferences drawn

5. Content rather than Intent

The analysis is of the content of works. This may include the author's express intentions, or implied intentions where clear evidence exists, but not intentions that are merely inferred by the reviewer

The Framing of the Research

6. Appropriate Degree of Structuredness

Depending on the purpose of the research:

7. Designed-in Methodological Rigour

The analysis is subjected to such structuredness, consistency, comprehensiveness, transparency and auditability constraints as are feasible and desirable in the particular context

The Research Process

8. Phased Reading

Preliminary 'orientational reading' is followed by deeper 'analytical reading', in order to provide confidence in the appropriateness of the selection, coding and interpretation of passages. By 'passage' is meant a text-segment of any length, including a word, word-group, phrase, clause, sentence, paragraph or paragraphs. (For example, the Macquarie Dictionary defines a passage as "an indefinite portion of a writing, speech, or the like ...")

9. Coding Rigour

Sources of categorisations are documented, and appropriate techniques are applied, which may fall anywhere within a wide range, from directed content analysis using a priori coding, via emergent coding, to critical discourse analysis

10. Appropriate Quantification

11. Analytical Rigour

12. Iteration

The analysis involves openmindedness and introspective questioning by the researcher about their analysis and assumptions, including loops back to earlier phases of the process, possibly resulting in adaptation of the project's framing, re-design of coding schemes, re-coding and/or reconsideration of inferences drawn, as appropriate in the context

The Presentation of the Research

13. Declaration of the Final Form of the Research Question

14. Declaration of the Research Method

15. Declaration of Researcher Perspective

The values underlying the researcher's approach are identified, and where appropriate are compared and contrasted with those that the researcher perceives to be embedded in the works

16. Qualitative Critique

A textual critique is presented, supported by quantitative evidence to the extent appropriate, analysing the positive and negative aspects of the selected works, and perhaps, by implication, of the sampling frame or the population

17. Inferential Rigour

18. Careful Expression

The text is expressed in a measured manner, avoiding unduly strong expression, unduly positive or negative expression, and overly colourful speech

19. Avoidance of argumentum ad hominem

20. Expression in Apposition Unless the Evidence Warrants Expression in Opposition

The text is no more confronting to or disruptive of existing theory than the evidence arising from the analysis warrants. In particular, where the conclusions and implications qualify, deepen or subsume existing theory, they are presented in apposition rather than in opposition


6. Applications of the Technique

The process of developing the new research technique has broadly followed the first three phases of Peffers' Design Science Research Methodology (DSRM - Peffers et al. 2007), viz. problem identification, objectives definition, and design and development. The next phase (demonstration) has been commenced but not brought to completion.

The first aspect of demonstration is the identification of specific contexts to which the technique is to be applied, and the second is the application of the framework in order to instantiate research methods appropriate to specific research questions. Four trials have been conducted and reported on, three of them published and one in second-round review. Each of them applies the theory of researcher perspective (Clarke & Davison 2020) to a particular set of published works. Of the sample research questions in Table 1, this most closely relates to item 10. The first trial, reported in Clarke (2015), considered a modest set of papers from the ACIS conference and the AJIS journal. The second, reported in Clarke (2016b), examined a larger set of papers presented at the Bled conference. The third, in Clarke (2020), examined a set of papers in the journal 'Electronic Markets'. The article in review reports on studies of Basket of 8 articles.

The pilot projects undertaken to date attest to the practicality of the technique when articulated for, and applied in, a specific critical theory research context. The small set is not sufficiently rich to illustrate or demonstrate the usefulness of the guidance across all of the various project-types. Far less do the projects conducted to date represent Peffers' fifth phase (evaluation), which involves observation and measurement of the new artefact's effectiveness in addressing stated objectives.


7. Conclusions

This article has outlined the elements of a project to establish adequate conceptual foundations and guidance in relation to the implementation of a technique for critical analysis of the content of published works relevant to the IS discipline. It has identified and drawn on a wide range of sources on techniques that bear some kind of relationship to the present purpose. It has proposed a set of key characteristics of a process of this nature, and indicated the scope of application. Articulation to apply to specific categories of critical content analysis, and application to particular sets of published works, are ongoing parts of the overall project.

The description of the process undertaken in the project to date has identified a series of challenges that are likely to confront each endeavour to create a new research technique. It is necessary to achieve clarity about the category of research questions to which the technique is intended to be applicable. It is vital to reflect and build on the accumulated knowledge about research techniques generally; but the methodology literature is less helpful in this regard than might be imagined, and the same is true of exemplars within the IS literature that are highly-cited in relation to particular research techniques.

A further challenge involves identifying the key features of the category of research questions that are in focus. In the present example, this involves the notion of (constructive) criticism and aspects of critical theory research. Armed with this level of understanding, it becomes possible to identify pre-existing techniques, and assess the extent to which elements of them do and do not provide insights relevant to the new technique.

The foundations having been laid, it becomes possible to undertake conceptualisation of the new technique, through the specification of key characteristics that it needs to have. The remaining challenges (only partly discussed in this article) are the articulation of the technique into forms that are directly relevant to particular forms of research question, application of the technique to a sufficient set of samples, and demonstration that the technique enables research questions to be satisfactorily answered.

The work reported here is limited to a single new research technique, and the trial applications relate to a single form of the many kinds of research questions that are within-scope of the technique. It is accordingly infeasible to generalise from this instance to all, or even to other individual instances of, new research techniques. This article does, however, provide researchers with a somewhat stronger starting-point for their own endeavours than has previously been available, because it identifies key considerations, points to a range of relevant sources, and provides an articulated example of their use to develop a particular kind of new research technique.


References

Alvesson M. & Deetz S. (2000) 'Doing Critical Management Research' Sage, 2000

Alvesson M. & Sandberg J. (2011) 'Generating Research Questions through Problematization' Academy of Management Review 36, 2 (2011) 247-271

Avison D.E. (2002) 'Action research: a research approach for cooperative work' Proc. Conf. Cooperative Systems for Cooperative Work, Rio de Janeiro, September 2002

Bandara,W., Miskon S. & Fielt E. (2011) 'A Systematic, Tool-Supported Method for Conducting Literature Reviews in Information Systems' Proc. ECIS 2011, Paper 221,at http://eprints.qut.edu.au/42184/1/42184c.pdf

Benbasat I., Goldstein D.K. & Mead M. (1987) 'The Case Research Strategy in Studies of Information Systems' MIS Quarterly 11, 3 (September 1987) 369-386

Boell S.K. & Cecez-Kecmanovic D. (2014) 'A Hermeneutic Approach for Conducting Literature Reviews and Literature Searches' Communications of the Association for Information Systems 34, 12, at http://tutor.nmmu.ac.za/mgerber/Documents/ResMeth_Boell_2014_Literature%20Reviews.pdf

Boell S.K. & Cecez-Kecmanovic D. (2015) 'On being 'systematic' in literature reviews in IS' J. Infor. Techno. 30 (2015) 161-173

Bonoma T.V. (1985) 'Case Research in Marketing: Opportunities, Problems,and a Process' Journal of Marketing Research 22, 2 (May 1985) 199-208

vom Brocke J. & Simons A. (2008) 'Towards a Process Model for Digital Content Analysis - The Case of Hilti' Proc. Bled Conference, June 2008, at http://aisel.aisnet.org/bled2008/2

Cecez-Kecmanovic D. (2001) 'Doing Critical IS Research: The Question of Methodology' Ch.VI in 'Qualitative Research in IS: Issues and Trends: Issues and Trends' (ed. Trauth E.M.), pp. 141-163, Idea Group Publishing, 2001, at https://pdfs.semanticscholar.org/37b1/e4c060b93fcfa04d81f03b750e746ba42f2d.pdf

Cecez-Kecmanovic D. (2005) 'Basic assumptions of the critical research perspectives in information systems' Ch. 2 in Howcroft D. & Trauth E.M. (eds) (2005) 'Handbook of Critical Information Systems Research: Theory and Application', pp.19-27, Edward Elgar, 2005

Chua W.F. (1986) 'Radical Developments in Accounting Thought' The Accounting Review 61, 4 (October 1986) 601-632

Clarke R. (2012) 'The First 25 Years of the Bled eConference: Themes and Impacts' Proc. 25th Bled eConference Special Issue, PrePrint at http://www.rogerclarke.com/EC/Bled25TA.html

Clarke R. (2015) 'Not Only Horses Wear Blinkers: The Missing Perspectives in IS Research' Keynote Presentation, Proc. Austral. Conf. in Infor. Syst. (ACIS 2015), at https://arxiv.org/pdf/1611.04059, Adelaide, December 2015, PrePrint at http://www.rogerclarke.com/SOS/ACIS15.html

Clarke R. (2016a) 'Big Data, Big Risks' Information Systems Journal 26, 1 (January 2016) 77-90, PrePrint at http://www.rogerclarke.com/EC/BDBR.html

Clarke R. (2016b) 'An Empirical Assessment of Researcher Perspectives' Proc. Bled eConf., Slovenia, June 2016, at http://www.rogerclarke.com/SOS/BledP.html

Clarke R. (2016c) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at http://www.rogerclarke.com/EC/BDQAS.html

Clarke R. (2018) 'Guidelines for the Responsible Application of Data Analytics' Computer Law & Security Review 34, 3 (May-Jun 2018) 467- 476, https://doi.org/10.1016/j.clsr.2017.11.002, PrePrint at http://www.rogerclarke.com/EC/GDA.html

Clarke R. (2020) 'Researcher Perspectives in Electronic Markets' Electronic Markets 30,1 (March 2020), PrePrint at http://www.rogerclarke.com/EC/RPEM.html

Clarke R. & Davison R.M. (2020) 'Through Whose Eyes? The Critical Concept of Researcher Perspective' Journal of the Association for Information Systems 21,2 (March 2020), PrePrint at http://www.rogerclarke.com/SOS/RP.html

Clarke R. & Pucihar A. (2013) 'Electronic Interaction Research 1988-2012 through the Lens of the Bled eConference' Electronic Markets 23, 4 (December 2013) 271-283, PrePrint at http://www.rogerclarke.com/EC/EIRes-Bled25.html

Davison R.M., Martinsons M.G. & Kock N. (2004) 'Principles of canonical action research' Info Systems J (2004) 14, 65-86

Debortoli S., Mueller O., Junglas I. & vom Brocke (2016) 'Text Mining For Information Systems Researchers: An Annotated Topic Modeling Tutorial' Communications of the Association for Information Systems 39, 7, 2016

Fink A. (2005) 'Conducting Research Literature Reviews: From the Internet to Paper' Sage Publications, 2nd ed., 2005

Galliers R.D. & Whitley E.A. (2002) 'An Anatomy of European Information Systems Research - ECIS 1993 - ECIS 2002: Some Initial Findings' Proc. ECIS 2002, June 2002, Gdansk, Poland, PrePrint at http://personal.lse.ac.uk/whitley/allpubs/ECIS2002.pdf

Galliers R.D. & Whitley E.A. (2007) 'Vive les differences? Developing a profile of European information systems research as a basis for international comparisons' Euro. J. of Infor. Syst. 16, 1 (2007) 20-35

Gregor S. & Hevner A. (2013) 'Positioning Design Science Research for Maximum Impact' MIS Quarterly 37, 2 (June 2013 ) 337-355, at https://ai.arizona.edu/sites/ai/files/MIS611D/gregor-2013-positioning-presenting-design-science-research.pdf

Grover V. & Lyytinen K. (2015) `New State of Play in Information Systems Research: The Push to the Edges' MIS Quarterly 39:2 (2015) 271-296

Hsieh H.-S. & Shannon S.E. (2005) 'Three Approaches to Qualitative Content Analysis' Qualitative Health Research 15, 9 (November 2005) 1277-1288, at http://www33.homepage.villanova.edu/edward.fierros/pdf/Hsieh%20Shannon.pdf

Indulska M., Hovorka D.S. & Recker J.C. (2012) 'Quantitative approaches to content analysis: Identifying conceptual drift across publication outlets' European Journal of Information Systems 21, 1, 49-69, at http://eprints.qut.edu.au/47974/

Kaid L.L. (1989) 'Content analysis' In Emmert P. & Barker L.L. (Eds.) 'Measurement of communication behavior', pp. 197-217, Longman, 1989

Kaplan A. (1964) 'The conduct of inquiry: methodology for behavioral science' Chandler, 1964

Klein H.K. & Myers M.D. (1999) 'A set of principles for conducting and evaluating interpretive field studies in information systems' MIS Quarterly 23, 1 (March 1999) 67-94

Krippendorff K. (1980) 'Content Analysis: An Introduction to Its Methodology' Sage, 1980

Krippendorff K. (2013) 'Content analysis: An introduction to its methodology' Sage, 3rd ed., 2013

Kuhn T.S. (1962) 'The Structure of Scientific Revolutions' University of Chicago Press, 1962

Kuhn T.S. (1977) 'Second Thoughts on Paradigms' Ch. 12 of Kuhn T.S. 'The Essential Tension: Selected Studies in Scientific Tradition and Change' University of Chicago Press, 1977, pp. 293-319

MacLure M. (2005) '`Clarity bordering on stupidity': where's the quality in systematic review?' Journal of Education Policy 20, 4 (2005) 393-416, at http://www.esri.mmu.ac.uk/respapers/papers-pdf/Paper-Clarity%20bordering%20on%20stupidity.pdf

Mingers J. (2003) 'The paucity of multimethod research: a review of the information systems literature' Information Systems Journal 13, 3 (2003) 233-249

Myers M.D. (1997) 'Qualitative research in information systems' MISQ Discovery, June 1997, at http://www.academia.edu/download/11137785/qualitative%20research%20in%20information%20systems.pdf

Myers M.D. & Klein H.K. (2011) 'A Set of Principles for Conducting Critical Research in Information Systems' MIS Quarterly 35, 1 (March 2011) 17-36, at https://pdfs.semanticscholar.org/2ecd/cb21ad740753576215ec393e499b1af12b25.pdf

Niederman F. & March S. (2012) 'Design Science and the Accumulation of Knowledge in the Information Systems Discipline' ACM Transactions on MIS 3, 1 (2012), 1

Oakley, A. (2003) Research evidence, knowledge management and educational practice: early lessons from a systematic approach, London Review of Education, 1, 1: 21-33, at http://www.ingentaconnect.com/contentone/ioep/clre/2003/00000001/00000001/art00004?crawler=true&mimetype=application/pdf

Okoli C. & Schabram K. (2010) 'A Guide to Conducting a Systematic Literature Review of Information Systems Research' Sprouts: Working Papers on Information Systems, 10, 26 (2010), at http://sprouts.aisnet.org/10-26

Peffers K., Tuunanen T., Rothenberger M. & Chatterjee S. (2007) 'A design science research methodology for information systems research' Journal of Management Information Systems, 24, 3 (Winter 2007) 45-77

Popper K. (1963) 'Conjectures and Refutations: The Growth of Scientific Knowledge' Harper & Row, 1963

Sandberg J. & Alvesson M. (2011) 'Ways of constructing research questions: gap-spotting or problematization?' Organization 18(1) 23-44

Stemler S. (2001) 'An overview of content analysis' Practical Assessment, Research & Evaluation 7, 17 (2001), at http://PAREonline.net/getvn.asp?v=7&n=17

Straub D. W. (2009) 'Editor's comments: Why top journals accept your paper' MIS Quarterly, 33, 3 (September 2009) iii-x, at http://misq.org/misq/downloads/download/editorial/3/

Tarafdar M. & Davison R.M. (2018) 'Research in Information Systems: Intra-disciplinary and Inter-disciplinary Approaches' Forthcoming, Journal of the Association for Information Systems

Wall J.D., Stahl B.C. & Salam A.F. (2015) 'Critical Discourse Analysis as a Review Methodology: An Empirical Example' Communications of the Association for Information Systems 37, 11 (2015), at https://www.dora.dmu.ac.uk/xmlui/bitstream/handle/2086/11180/2015_Critical_Discourse_Analysis_Review_Methodology_CAIS%20(1).pdf

Walsham G. (1995) 'Interpretive Case Studies in IS Research: Nature and Method' European Journal of Information Systems 4, 2 (May 1995) 74-81

Webster J. & Watson R.T. (2002) 'Analyzing The Past To Prepare For The Future: Writing A Literature Review' MIS Quarterly 26, 2 (June 2002) xiii-xxiii, at http://intranet.business-science-institute.com/pluginfile.php/247/course/summary/Webster%20%20Watson.pdf

Weber R.P. (1990) 'Basic Content Analysis' Sage, 1990


Acknowledgements

The ongoing project of which this work is a part has benefitted from feedback by many colleagues. This includes feedback received during presentations of the emergent argument and analysis in a Keynote at the Australasian Conference on Information Systems (ACIS'15) in Adelaide in 2015, and at the Bled Conference in 2016 and 2017; the contributions of my co-authors on the evaluation of 'Basket of 8' works; and valuable comments received from a reviewer and from participants in the Information Systems Foundations 2018 Workshop: Theorising Society in Information Systems, ANU, Canberra, 13-14 September 2018.


Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.


