Revision of 11 September 2018
For presentation at the
Information
Systems Foundations 2018 Workshop: Theorising Society in Information
Systems
ANU, Canberra, 13-14 September 2018
© Xamax Consultancy Pty Ltd, 2018
Available under an AEShareNet licence or a Creative Commons licence.
This document is at http://www.rogerclarke.com/SOS/CAPW.html
The slide-set is at http://www.rogerclarke.com/SOS/CAPW.pdf
The Information Systems (IS) discipline has accumulated a large corpus of published works. Constructive criticism, both of individual works and of collections of works, is an essential element of a living discipline. On the other hand, there is resistance among IS editors and reviewers to papers whose primary contribution is a critique of prior publications. A project has been undertaken that is intended to provide conceptual foundations and methodological guidance to enable papers of that nature to achieve publication in quality IS venues. This paper provides an overview of the challenges involved in establishing a sufficient basis for a new research technique, and the approach being adopted to address them.
Since the Information Systems (IS) discipline emerged c. 1965, a substantial body of publications has accumulated. For example, in mid-2018, the AIS eLibrary contains over 15,000 refereed works, John Lamp's directory of IS journals identifies almost 700 active venues that publish IS works, of the order of 7,000 articles have been published in the `Basket of 8' journals alone, and the five major IS conferences publish an additional 1,500 refereed papers per annum.
The accumulated IS literature evidences two particularly strong desires. A great deal of emphasis is placed on empirical research involving the observation and measurement of some aspect of IS practice. In addition, many papers are published that seek to establish or extend theories about IS practices, which can then provide a basis for further empirical research.
Important though these approaches are, progress in the discipline depends on some further elements as well. Meta-discussions are needed, in order to clarify such aspects as the discipline's scope, the meanings of key terms, and the suitability of particular methods for particular purposes. This paper is a contribution to one such meta-discussion.
The starting-point for the project reported on here is a conviction that critical thought about prior IS literature has a vital role to play. Criticism is, after all, central to the notion of science, in that all propositions must be in principle refutable, all propositions must be regarded as provisional, and all propositions must be subjected to testing (Popper 1963). For this reason, criticism is inherent in the review process whereby works are evaluated as a condition of publication in refereed venues.
The IS discipline has to date demonstrated considerable nervousness about works that contain criticism of prior works, and especially about works whose specific purpose is to criticise prior works. This project is quite specifically a response to a particular instance of this nervousness. I conducted a critical analysis of the articles in a Special Issue published by a leading, research-domain-specific journal that publishes many papers relevant to the IS discipline. I submitted the paper to the same journal that published the Special Issue. It was rejected, with a key factor being that the research technique adopted in the research was regarded by the reviewers as being illegitimate. It was subsequently also rejected by an IS 'Basket of 8' journal, but with an invitation to split the work into two articles, further develop the methodological component, and re-submit the substantive article as and when the methodological article achieves acceptance.
The author contends that the conduct and publication of critiques of prior IS works is vital to progress in the discipline. The research methodology literature does not currently appear to provide a sufficient basis for work of this nature. A project has accordingly been undertaken to identify the conceptual foundations for a research technique for the critical analysis of published works, and to propose operationalisation of the principles in the form of a customised research method for the conduct of one category of such research.
The present paper provides an overview of the project as a whole. It commences by discussing methodological considerations in the establishment of a research technique. The nature of the proposed technique is then juxtaposed against two existing techniques. The nature and role of criticism in research and the relevance of critical theory research are considered. The paper then utilises existing research techniques in the areas of content analysis and interpretive literature review to enable conceptualisation of the critical analysis of published works. A sample application of the technique is used as a means of operationalising the principles in the context of a category of projects.
A project of this nature inevitably involves a considerable number of elements. In order to produce a paper of conventional length, each section of necessity provides an overview only. Further details on most aspects are available in a small number of papers that have achieved publication to date, together with the author's Working Papers.
This paper uses several terms carefully, as follows:
In these terms, the purpose of the present project is the establishment of a new research technique. More specifically, the objective is
the development of a research technique for the critical analysis of the content of published works relevant to the IS discipline.
This is a socio-technical artefact as that term is used in design science (Niedermann & March 2012, Gregor & Hevner 2013, p.337), and hence the design science approach is broadly relevant to the project. The process adopted accordingly broadly followed the sequence of Peffers' Design Science Research Methodology (DSRM), with problem identification and definition of objectives followed by design and development (Peffers et al. 2007).
The term 'published works' is intended to encompass entries in a wide variety of 'publishing venues', including at least refereed articles in journals, refereed papers in conference proceedings, refereed chapters in academic books, and refereed academic books as a whole. The works and venues might reasonably be extended to papers in workshops and symposia, and for some purposes may include PrePrints or Working Papers published by institutions with appropriate standing, and possibly also research reports commissioned by government agencies, foundations, industry associations and perhaps individual corporations. Some other forms of publication may be relevant, depending on the specific research purpose. For example, technical media or corporate white papers might be included, if the research purpose were to, for example, assess the impact of academic work on thinking among consultancies or within industry sectors.
The term 'content' refers to the text of the published work, taking into account the broader context within which the work is situated. Depending on the nature of the work, the broader context is likely to include aspects of the research domain, the academic discipline and relevant cultural factors. Methodological aspects of the work are generally treated as out-of-scope for the present project, except where, and to the extent that, content describing or discussing the research method and data (effectively 'meta-data' about the research method) may be relevant to the analysis.
Which publishing venues are 'relevant to the IS discipline' depends on the context of the specific project. Beyond mainstream IS journals and conferences, for example, venues used by particular reference and cognate disciplines may also be relevant, e.g. in management, social sciences, or computer sciences.
It might be feasible to conduct studies of the complete corpus of 'published works relevant to the IS discipline'. The resource-intensiveness of the technique is such, however, that it is far more likely that each project's focus will be on a particular population segment, with a sampling frame and sample selected from within that segment. A segment might be defined to be a particular venue (e.g. the complete set of ICIS Proceedings), or a time-span within one or more venues (e.g. the last 10 years of the `Basket of 8' IS journals), or a focussed collection (e.g. one or more journal special issues, narrowly-specialised conferences, or academic books). Another approach would be to define the population segment on the basis of the research domain that the works address, or the research technique that they apply.
The smallest unit of study would be a single work. This appears likely to be relevant only in a small number of circumstances, such as papers that are judged to be highly significant, in particular on the basis of citation-count or some other measure of influence, or to be highly original.
In principle, the project needs to be founded on generic guidance in relation to the establishment of a new research technique. Sources of the nature of meta-methodology are proving elusive, however, even within Sage Publishing's vast list, and the search continues. I have made the presumption that a body of knowledge about a research technique comprises contextual information, conceptual foundations, guidance in relation to the process to be used, exemplars of processes that have been articulated in order to address particular kinds of research questions, and subsequent expositions and reviews of the technique in use. The body of this paper addresses in turn all but the last of those elements.
A significant proportion of research involves the appraisal of content previously uttered by other people. This section briefly reviews categories of research technique whose focus is adjacent to the topic addressed in this paper.
Qualitative research techniques such as ethnography, grounded theory and phenomenology involve the disciplined examination of content. However, the content is of a kind materially different from that of refereed papers. The text may be generated in natural settings (field research), in contrived settings (laboratory experiments), or in a mix of the two settings (e.g. interviews conducted in the subject's workplace). The materials may originate as text, or as communications behaviour in verbal form - possibly transcribed into pseudo-text, as natural non-verbal behaviour ('body-signals'), or as non-verbal, non-textual communications behaviour (such as ticks in boxes in structured questionnaires). In other cases, text that arises in some naturalistic setting is exploited by the researcher (e.g. emails, social media postings).
The issues arising with analysis of these kinds of content are very different from those associated with the analysis of carefully-considered, formalised content in works that have been at least strongly edited and in most cases subjected to peer-review.
A context that is more closely related to the present purpose is literature reviews that examine substantial bodies of prior research. Traditional `narrative' reviews have been criticised as being insufficiently rigorous (Oakley 2003, p.23). There are now expectations of structure, transparency and replicability, and the term `systematic' review is commonly applied (Webster & Watson 2002, Okoli & Schabram 2010, Bandara et al. 2011). Examples particularly relevant to the present project include Galliers & Whitley (2002, 2007), which analysed themes in the ECIS conference series, and Clarke (2012) and Clarke & Pucihar (2015), which reviewed the corpus of Bled Conference papers. Grover & Lyytinen (2015) and Tarafdar & Davison (2018) reported on meta-analyses of articles in the `Basket of 8' IS journals.
Although such approaches have relevance to the present project, their focus is on description and interpretation, and the research techniques used were not devised in order to support critical analysis. Relevantly, Boell & Cecez-Kecmanovic (2015) argue that the recipes provided for 'systematic literature reviews' do not necessarily deliver quality, particularly where the research technique that is applied allows for emergent and changing understanding of the phenomena, and where even the research question itself may be emergent.
The focus of this project is the critical analysis of publications. Two aspects of the notion of criticism are particularly relevant. The first is the role that criticism plays in scientific research generally and in IS research in particular. The second is the critical theory research genre.
The purpose of undertaking content analysis may be simply exposition, that is to say the identification, extraction and summarisation of content, without any significant degree of evaluation. There are benefits in undertaking content analysis in a positive frame of mind, and in assuming that all that has to be done is to present existing information in brief and readily-accessible form (as, for example, much of the present paper does).
Alternatively, the researcher may bring a questioning and even sceptical attitude to the work. A common purpose of literature reviews is to depict the current state of theory in an area. The purpose may be to draw inferences for the particular context relevant to the project. Alternatively, it may be to identify gaps in the present body of theory. Gap-identification is a gentle form of criticism in the sense in which the term is used in this paper.
The notion of criticism goes rather further than merely identifying previously unresearched corners of theory. A critic asks hard questions that necessarily cut to the core of academic work. Is it reasonable to assume that all relevant published literature is of high quality? that the measurement instruments and research techniques have always been good, well-understood by researchers, and appropriately applied? that there have been no material changes in the relevant phenomena? and that there have been no material changes in the intellectual contexts within which research is undertaken?
The term 'criticism' is often used in a pejorative sense, implying that the critic is merely finding fault, is being destructive rather than constructive, and is failing to propose improvements to overcome the faults while sustaining the merits. The sense in which the term is used here, however, is related to `literary criticism', and embodies both positive and negative sentiments.
As the term is used in this paper:
Criticism presents an analysis of both the merits and faults of a body of work, including its framing, and the expression of the analysis and the inferences drawn.
Such terms as `constructive criticism' and 'critique' might be preferred. They have the advantages of playing down the negative aspects, of indicating that the process is systematic, and of bringing focus to bear on the contributions made both by the works being subjected to analysis and by the critique itself.
The justification for applying a sceptical eye to a body of work is that criticism plays a vital role in the scientific process. The conventional Popperian position is that the criterion for recognising a scientific theory is that it deals in statements that are empirically falsifiable, and that progress depends on scrutiny of theories and attempts to demonstrate falsity of theoretical statements: "The scientific tradition ... passes on a critical attitude towards [its theories]. The theories are passed on, not as dogmas, but rather with the challenge to discuss them and improve upon them" (Popper 1963, p.50).
However, senior members of a discipline commonly behave in ways that are not consistent with the Popperian position. This might be explained by the postulates of 'normal science', which view the vast majority of research work as being conducted within a 'paradigm' and subject to its conventions (Kuhn 1962). In more practical terms, the problem may arise because senior members of any discipline have strong psychic investment in the status quo, and - no matter how cogent and important the argument - react negatively against propositions perceived to be at least disruptive, and even revolutionary. Firmly-worded criticism is normal in reviewers' reports on as-yet unpublished submissions, and may be acceptable if uttered by a senior about a contrarian idea, whereas it commonly attracts opprobrium if made by an outsider or a relative newcomer about the contemporary wisdom.
In an influential commentary, Webster & Watson (2002) recommended that "In contrast to specific and critical reviews of individual papers, tell the reader what patterns you are seeing in the literature ... Do not fall into the trap of being overly critical" (p.xviii). On a literal reading, the authors merely warn against unduly strong or negative expression - a recommendation that the research technique under development needs to adopt. Unfortunately, the quoted words are capable of being interpreted as valuing politeness among researchers more highly than scientific insight and progress.
Similarly, Straub (2009, p.viii) advised authors that "papers should be in apposition [the positioning of things side by side or close together] rather than in opposition". Clearly, where a new theory subsumes an old one (as was the case in the example provided, of Einsteinian relativity absorbing Newtonian physics as a special case), the dictum 'apposition rather than opposition' is appropriate. Where, on the other hand, the new theory actively contradicts or is fundamentally at odds with, existing theory, it would be intellectually dishonest to represent the contrary or disruptive theory as though it were a soul-mate to, or merely a refinement of, existing theory. Further, a reader might all-too-easily interpret the advice as expressing a moral judgement that 'criticism is a bad thing'. My contention is that scientific behaviour demands the opposite: it is an obligation of researchers to 'think critically' and to 'apply their critical faculties'. Politeness of expression and avoidance of argumentum ad hominem (criticism of the messenger rather than of the message) are elements of academic discourse, but they are second-order rather than fundamental to science.
Positivism and interpretivism are well-established schools of research in IS. They have been joined by design science. And they have an odd bedfellow, in the form of what is variously termed 'critical research' and 'critical theory research'. The term 'critical' in this context is somewhat different from the sense of 'analysis of the merits and faults of a work' discussed in the previous section.
Both positivism and interpretivism are concerned with description and understanding of phenomena and behaviours. Sometimes the focus is on natural phenomena, but frequently the interest is in natural phenomena that have been subjected to an intervention. Importantly for the present project, however, both positivism and interpretivism involve strenuous avoidance of moral judgements and of 'having an agenda'. Design research, in contrast, is expressly purposeful and value-laden, in that the features designed into the artefact embody judgements about what is good in the particular context, and whose perspective the goodness is to serve.
Critical theory research is concerned with description and understanding of phenomena and behaviours, but, like design science, it `has an agenda'. It recognises the effects of power and the tendency of some stakeholders' interests to dominate those of other stakeholders. It brings to light "the restrictive and alienating conditions of the status quo" and expressly sets out to "eliminate the causes of alienation and domination" (Myers 1997). "Critical research generally aims to disrupt ongoing social reality for the sake of providing impulses to the liberation from or resistance to what dominates and leads to constraints in human decision-making. Typically critical studies put a particular object of study in a wider cultural, economic and political context, relating a focused phenomenon to sources of broader asymmetrical relations in society ..." (Alvesson & Deetz 2000, p.1). "Critical IS research specifically opposes technological determinism and instrumental rationality underlying IS development and seeks emancipation from unrecognised forms of domination and control enabled or supported by information systems" (Cecez-Kecmanovic 2005, p.19).
In Myers & Klein (2011), three elements of critical research are identified:
This section outlines two categories of existing research technique that are directly relevant to the present purpose. The first category comprises techniques referred to generically as `content analysis'. The second category is a form of literature review compatible with critical theory research.
The term `content analysis' refers to a cluster of techniques that seek to classify text, or specific aspects of text, into a manageable number of categories. Two definitions are:
Content Analysis is the semantic analysis of a body of text, to uncover the presence of strong concepts (Indulska et al. 2012, p.4, citing Weber 1990)
Content Analysis is the interpretation of the content of text data through the systematic classification process of coding and identifying themes or patterns (Hsieh & Shannon 2005, p.1278)
The Hsieh & Shannon (2005) paper indicates a 7-step process which the authors attribute to Kaid (1989). See also vom Brocke & Simons (2008):
As with any research technique, all aspects need to be subject to quality controls. Krippendorff (1980), Weber (1990) and Stemler (2001) emphasise steps 3-5 in relation to the coding scheme and its application, highlighting the importance of achieving reliability. Approaches include coding by individuals with strong experience in both the review of papers and the subject-matter, parallel coding by multiple individuals, review of individuals' coding by other parties, and publication of both the source materials and the detailed coding sheets, in order to enable audit by other parties.
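The parallel-coding checks described above can be sketched computationally. The following is a minimal illustration, not drawn from the paper: it computes Cohen's kappa, one common chance-corrected measure of agreement between two coders, over an invented set of category assignments.

```python
# Minimal sketch of one inter-coder reliability check: Cohen's kappa,
# a chance-corrected measure of agreement between two coders.
# The category labels and codings below are invented for illustration.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Agreement between two coders, corrected for chance agreement."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, assuming the coders act independently
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

coder_a = ["theory", "method", "theory", "domain", "method", "theory"]
coder_b = ["theory", "method", "domain", "domain", "method", "method"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.52
```

A kappa near 1 indicates strong agreement; a value near 0 indicates agreement no better than chance, signalling that the coding scheme or the coders' training needs revision.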
Content analysis techniques exhibit varying degrees of structure and rigour, from impressionistic to systematic, and they may involve qualitative and/or quantitative assessment elements. Data may be on nominal, ordinal, cardinal or ratio scales. Data collected on higher-level scales, especially on a ratio scale, is able to be subjected to more powerful inferencing techniques. Qualitative data, on the other hand, may be gathered on a nominal scale (whereby differences are distinguished, but no ordering is implied) or on an ordinal scale (such as 'unimportant', 'important', 'very important').
Quantification generally involves measurement, most fundamentally by counting. This raises questions about the arbitrariness of boundaries, and about configuration and calibration of the measuring instrument(s). Some quantification techniques involve sleight of hand. A significant example is the frequently-encountered but unjustified assumption that 'Likert-scale' data is not merely ordinal, but is cardinal (i.e. the spaces between the successive terms are identical), and even ratio (i.e. the scale also features a natural zero), in order to justify the application of powerful statistical techniques to the data.
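The Likert-scale point can be shown with a small, hypothetical example (the response categories and numeric codings are invented): treating ordinal responses as cardinal makes the summary statistic depend on the arbitrary choice of numeric codes, whereas an ordinal summary such as the median category does not.

```python
# Hypothetical illustration: a mean of numeric Likert codes shifts with
# the arbitrary coding chosen, while the median category - an ordinal
# summary - is unaffected. Categories and codings are invented.
responses = ["unimportant", "important", "important", "very important"]
order = ["unimportant", "important", "very important"]

coding_a = {"unimportant": 1, "important": 2, "very important": 3}
coding_b = {"unimportant": 1, "important": 2, "very important": 5}  # equally valid ordinally

mean_a = sum(coding_a[r] for r in responses) / len(responses)  # 2.0
mean_b = sum(coding_b[r] for r in responses) / len(responses)  # 2.5
median_cat = sorted(responses, key=order.index)[len(responses) // 2]
print(mean_a, mean_b, median_cat)  # → 2.0 2.5 important
```

Both codings preserve the ordering of the categories, yet the means differ; only the assumption of equal spacing between categories would justify preferring one mean over the other.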
Many authors implicitly equate quantification with rigour, and qualitative data with subjectivity. They accordingly deprecate qualitative analysis, or at least relegate it to pre-theoretical research, which by implication should be less common than research driven by strong theories. However, the majority of authors spend only limited time considering the extent to which the assumptions and the processes underlying the act of quantification may be arbitrary or themselves 'subjective'. Positivism embodies an implicit assumption that computational analysis necessarily leads to deep truth. The assumption needs to be tested in each particular circumstance, yet such testing is seldom evident.
A positivist approach to categorising content analysis "along a continuum of quantification" distinguishes "narrative reviews, descriptive reviews, vote counting, and meta-analysis" (King & He 2005, p.666):
As regards the approach taken to establishing a coding scheme for content, a four-way classification scheme is relevant to the present purpose. The first three forms are usefully discussed in Hsieh & Shannon (2005).
Firstly, the categories may be defined a priori, in particular arising from existing theory. To the extent that the declared or inferred content of the text does not fit well to the predefined categories, there may be a need to consider possible revisions of the coding scheme, or even of the theory on which the research design was based. It may be feasible to draw inferences based on counts of the occurrences of categories and/or on the intensity of the statements in the text, such as the confidence inherent in the author's choice of language (e.g. "this shows that" cf. "a possible explanation is that"). However, as with any theory-driven research, the evidence extracted from the text may have a self-fulfilling-prophecy quality about it, i.e. there is an inevitable tendency to find more evidence in support of a theory than in conflict with it, and contextual factors may be overlooked. In order to enable auditability, it is important that not only the analysis be published, but also the raw material, the coding scheme, and the coding, at a sufficiently high level of granularity.
Alternatively, the categories may emerge from the data. This approach is of value, for example, when there is an absence of suitable theories to guide the establishment of a priori categories; but it also has relevance where the researcher is quite specifically challenging existing theories in the area. The process of necessity involves assumptions, and hence external validity of conclusions arising from this approach is likely to be limited. Depending on the degree of generality of the conclusions claimed by the author, full disclosure of the text selection, coding and inferencing procedures may be merely desirable or vital.
A third approach has been dubbed summative content analysis. It involves "counting and comparisons, usually of keywords or content, followed by the interpretation of the underlying context" (Hsieh & Shannon 2005, p.1277). The first step is to explore usage, by "identifying and quantifying certain words or content in text with the purpose of understanding the contextual use of the words or content" (p.1283).
Because of the complexity and variability of language use, and the ambiguity of a large proportion of words and phrases, a naive approach to counting words is problematic. At the very least, a starting-set of terms needs to be established and justified. A thesaurus of synonyms and perhaps antonyms and qualifiers is needed. Allowance must be made for both manifest or literal meanings, on the one hand, and latent, implied or interpreted meanings (in semiotic terms, `pragmatics'), on the other. Counts may be made not only of the occurrences of terms, but also of the mode of usage (e.g. active versus passive voice, dis/approval indicators, associations made).
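As a hedged illustration of the first, quantitative step of summative content analysis, the following sketch counts occurrences of a starting-set of terms via a small synonym thesaurus. The concepts, thesaurus entries and sample text are invented; the second step, interpretation of the underlying context, remains a human task.

```python
# Sketch of summative content analysis, step one: counting term
# occurrences via a small synonym thesaurus that maps surface forms
# to concepts. Concepts, thesaurus and sample text are invented.
import re
from collections import Counter

THESAURUS = {
    "privacy": "privacy", "confidentiality": "privacy",
    "surveillance": "surveillance", "monitoring": "surveillance",
    "dataveillance": "surveillance",
}

def concept_counts(text):
    """Map each recognised surface term to its concept and count them."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(THESAURUS[t] for t in tokens if t in THESAURUS)

sample = ("Dataveillance extends surveillance; monitoring of individuals "
          "raises privacy and confidentiality concerns.")
print(concept_counts(sample))  # → Counter({'surveillance': 3, 'privacy': 2})
```

Even this toy version makes the analytical choices visible: the starting-set of terms, the synonym mappings, and the decision to count manifest occurrences only, ignoring latent meanings.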
The degree of analytical rigour that quantification can actually deliver depends a great deal on a number of factors. Critical among them are:
Publication of details of text selection and the analytical process is in all cases important, and essential where a degree of rigour and external validity is claimed.
A fourth approach is usefully described as programmatic content analysis. This enables much larger volumes of text to be analysed. The coding scheme may be defined manually, cf. directed content analysis / a priori coding. However, some techniques involve purely computational approaches to establishing the categories, cf. 'machine-intelligent' (rather than human-intelligent) emergent coding. The processing depends, however, on prior data selection, data scrubbing and data-formatting. In addition, interpretation of the results involves at least some degree of human activity.
Debortoli et al. (2016) distinguish three alternative approaches to programmatic coding:
Given that the 'big data analytics' movement is highly fashionable, vast volumes of data are available, and there is a comfort factor involved in office-based work much of which is automated, it would appear reasonable to anticipate that programmatic analysis techniques may be a growth-area in the coming few years - at least until their limitations are better appreciated (Clarke 2016a, 2016c, 2018).
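As a minimal, hypothetical illustration of the data-preparation that underlies programmatic approaches, the following sketch computes TF-IDF term weights, the kind of weighting that commonly precedes dictionary-based, supervised or unsupervised machine coding. The documents are invented, and real analyses would add tokenisation, stemming and stop-word handling.

```python
# Hypothetical sketch of TF-IDF term weighting, a common preparatory
# step before programmatic coding. It favours terms distinctive of
# individual documents; terms in every document weigh zero.
import math
from collections import Counter

def tf_idf(docs):
    """Per-document term weights: term frequency x inverse document frequency."""
    tokenised = [doc.lower().split() for doc in docs]
    df = Counter(t for doc in tokenised for t in set(doc))  # document frequency
    n = len(docs)
    return [{t: count / len(doc) * math.log(n / df[t])
             for t, count in Counter(doc).items()}
            for doc in tokenised]

docs = ["privacy risk process", "risk management process",
        "privacy by design process"]
w = tf_idf(docs)
print(w[0]["process"])  # → 0.0 ('process' occurs in every document)
```

The zero weight for ubiquitous terms shows why such weighting is used: purely computational coding would otherwise be dominated by high-frequency but undiscriminating vocabulary.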
The four approaches to content analysis outlined above appear to be accepted within the IS discipline, but their use has been somewhat limited. For example, in a survey of the articles published in six leading IS journals during the 1990s, Mingers (2003) found that the use of content analysis as a research technique was evident in only four of the journals, and even in those four in only 1-3% of all articles published during that time. In August 2018, of the over 15,000 refereed papers indexed in the AIS electronic library, 13 had the term 'content analysis' in the title (none new since February 2017), and 71 in the Abstract (previously 69). In recently-published papers, the most common forms of text that have been subjected to content analysis appear to be social media and other message content, with other categories including newspaper articles and corporations' 'letters to shareholders'. There appears to have been very limited application of these techniques to published works.
The four forms of content analysis outlined above are primarily descriptive, and at best interpretive. Their focus is on concepts, themes, patterns and relationships. Moreover, the extent to which the semantics of the source-materials are reflected is quite low. A technique for `critical analysis' must delve much more deeply not only into the tenable meanings of the content, but also into the context and the assumptions inherent in the work. It may also be necessary to follow trails such as cited works and terms used, in order to place the work within a genre, or within a `school of thought'.
A recent proposal endeavours to address these weaknesses. Wall et al. (2015) describe an approach to reviews that they base on Habermasian strains of critical discourse analysis (CDA). Their starting-point is that "the information systems (IS) discipline is subject to ideological hegemony" (p.258). By `ideological hegemony' the authors mean "the conscious or unconscious domination of the thought patterns and worldviews of a discipline or subdiscipline that become ingrained in the epistemological beliefs and theoretical assumptions embedded in scientific discourse (Fleck, 1979; Foucault, 1970; Kuhn, 2012). [They argue that] ideologies can be harmful to individuals who are disadvantaged or marginalized by them, and they can be problematic to scientific research because they represent blind spots" (p.258), and hence that "review papers [should] ... challenge ideological assumptions by critically assessing taken-for-granted assumptions" (p.257).
The authors propose a seven-step process (pp. 265-9):
The emphasis on 'systematic' literature reviews (SLR) noted earlier has itself been subjected to criticism, in that it "suppresses aspects of quality in research and scholarship that are at least as important as clarity, countability and accountability - such as intertextual connectivity, critique, interest, expertise, independence, tacit knowledge, chance encounters with new ideas, and dialogic interactions between researcher, 'literature' and 'data'" (MacLure 2005, p.394).
More substantially, Boell & Cecez-Kecmanovic (2015) argue that the literature on the SLR technique has effectively appropriated the term 'systematic', attaching to it the requirements that a literature review must begin with a clearly-defined research question, and that its results must deliver rigour, objectivity, replicability and absence of bias. Boell & Cecez-Kecmanovic contend firstly that SLR does not necessarily deliver these qualities, and secondly that, even where they are attainable, they are, for some forms of research, undesirable and harmful.
Further, in Boell & Cecez-Kecmanovic (2014) it is argued that a constructively loose and iterative process is needed, to avoid undue constraints and unlock insight and creativity: "Highly structured approaches downplay the importance of reading and dialogical interaction between the literature and the researcher; continuing interpretation and questioning; critical assessment and imagination; argument development and writing - all highly intellectual and creative activities, seeking originality rather than replicability [MacLure, 2005, Hart, 1998]" (p.258).
To address these issues, Boell & Cecez-Kecmanovic "propose hermeneutic philosophy as a theoretical foundation and a methodological approach for studying literature reviews as inherently interpretive processes in which a reader engages in ever exp[a]nding and deepening understanding of a relevant body of literature. Hermeneutics does not assume that correct or ultimate understanding can be achieved, but instead is interested in the process of developing understanding" (p.259). Their framework, reproduced in Figure 1, comprises two intertwined cycles: a search and acquisition circle, and a wider analysis and interpretation circle (p.263).
Rather than a carefully-planned and closed-ended process, this approach embodies "questioning and critical assessment ... of previous research" (p.258). "Critical assessment ... not only reveals but also ... challenges the horizon of possible meanings and understanding of the problem and the established body of knowledge" (p.267).
The hermeneutic framework proposed is oriented towards the review of a body of literature. This will generally be selected because it has as its focus a particular body of theory, or perhaps a particular category of real-world phenomena or behaviour. Such a body of literature is within the scope of the notion of 'published works' that is the focus of the present project. On the other hand, the set of 'published works' under consideration may be defined according to other criteria, such as publishing venue(s) or research technique(s).
The preceding discussion has identified highly structured and pre-determinable techniques relevant to some forms of 'content analysis of published works', most usefully exemplified by Hsieh & Shannon (2005). It has also highlighted a far more loose-limbed, adaptive and interpretive approach, associated with the critical discourse analysis approach to content of Wall et al. (2015), and the hermeneutic approach to literature review of Boell & Cecez-Kecmanovic (2014).
The new research technique must accordingly be described in a relatively abstract manner, so as to encompass all relevant variants. Researchers then need to apply the abstract principles in order to customise a research method appropriate to each particular project.
Based on the preceding discussion, a set of strongly-desirable characteristics can be derived. These are presented in Table 1.
1. Formal Content
The analysis is concerned with works that present carefully-considered textual content, in venues that generally apply formal review and editorial filters to the works that they publish.

2. Content Rather Than Method
The analysis is primarily concerned with the content of the works, including methodological aspects, but not the details of the research techniques that were adopted in conducting the research.

3. Criticism
The analysis applies critical thought to the content, rather than limiting itself to the description, explication, consolidation and interpretation of existing theory, the drawing of implications from it, and the detection of gaps in it.

4. Content not Intent
The analysis is of the content of works, not of the inferred intentions of their authors.

5. Appropriate Degree of Structuredness
The degree of structure applied depends on the purpose of the research.

6. Designed-in Methodological Rigour
The analysis is subjected to such structuredness, consistency, comprehensiveness, transparency and auditability constraints as are feasible and desirable in the particular context.

7. Phased Reading
Preliminary 'orientational reading' is followed by deeper 'analytical reading', in order to provide confidence in the appropriateness of the selection, coding and interpretation of passages. By 'passages' is meant a text-segment of any length, including word, word-group, phrase, clause, sentence, paragraph or paragraphs. (For example, the Macquarie Dictionary defines a passage as "an indefinite portion of a writing, speech, or the like ...")

8. Self-Questioning, Reflection and Open-Mindedness
The analysis involves ongoing self-questioning, and openness to the ideas communicated in, and the ideas stimulated by, the works.

9. Coding Rigour
Sources of categorisations are documented, and appropriate techniques applied, ranging from directed content analysis using a priori coding, via emergent coding, to critical discourse analysis.

10. Quantification

11. Analytical Rigour

12. Iteration
The analysis involves introspective questioning by the researcher about their own analysis and assumptions, resulting in loops back to earlier phases of the process, possibly resulting in adaptation of the project's framing, re-design of coding schemes, re-coding, reconsideration of inferences drawn, etc., as appropriate in the context.

13. Declaration of the Final Form of the Research Question

14. Declaration of the Research Method

15. Declaration of Researcher Perspective
The values underlying the researcher's approach are identified, and where appropriate are compared and contrasted with those that the researcher perceives to be embedded in the works.

16. Qualitative Critique
A textual critique is presented, by which is meant analysis of the merits and faults of the selected works, and perhaps by implication of the segment and even of the corpus.

17. Inferential Rigour

18. Careful Expression
The text is expressed in a measured manner, avoiding unduly strong expression, unduly positive or negative expression, and overly colourful speech.

19. Avoidance of argumentum ad hominem
Criticisms are made of the work and not of the author of the work. Where relevant, a default assumption is made of unconscious hegemonic participation, and attribution of intent on the part of the authors is avoided except where reliable evidence exists.

20. Expression in Apposition Unless the Evidence Warrants Expression in Opposition
The text is no more confronting to existing theory than the evidence arising from the analysis warrants. In particular, where the conclusions and implications qualify, deepen or subsume existing theory, they are presented in apposition rather than in opposition.
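Characteristics 9 (Coding Rigour) and 10 (Quantification) can be given a concrete flavour with a small sketch. The code below is illustrative only and forms no part of the technique itself: the code-set, keywords and passages are hypothetical, and genuine directed content analysis depends on human analytical reading, not keyword matching. A script of this kind can at most support the tallying and audit-trail aspects of a priori coding.

```python
from collections import Counter

# Hypothetical a priori code-set for directed content analysis:
# each code is defined in advance, with indicative keywords.
CODE_SET = {
    "sponsor-perspective": ["profit", "competitive advantage", "adoption"],
    "user-perspective": ["usability", "user satisfaction", "trust"],
    "societal-perspective": ["privacy", "equity", "public interest"],
}

def code_passage(passage: str) -> list[str]:
    """Return every a priori code whose indicative keywords appear."""
    text = passage.lower()
    return [code for code, keywords in CODE_SET.items()
            if any(kw in text for kw in keywords)]

def tally(passages: list[str]) -> Counter:
    """Quantification: count coded passages per category."""
    counts = Counter()
    for passage in passages:
        counts.update(code_passage(passage))
    return counts

# Hypothetical passages, for demonstration only.
passages = [
    "The system delivered competitive advantage to the firm.",
    "Privacy concerns were treated only as a constraint on adoption.",
]
print(tally(passages))
```

In a real project the keyword lists, and each coding decision, would be documented alongside the passages themselves, so that the coding can be reviewed, moderated and audited.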
The process adopted in this project has broadly followed the sequence of Peffers' Design Science Research Method(ology) (DSRM - Peffers et al. 2007). The final steps of DSRM are demonstration and evaluation. This section of the present paper makes a contribution to the demonstration step.
There are multiple forms of research question to which the Critical Analysis of Published Works can be applied. Table 2 contains several examples, intended to provide a sense of the scope of application.
Do the works describe the research method at a sufficient level of detail to enable the reader to evaluate the research quality, and do they provide access to sufficient supporting materials to enable audit?
Do the works consider the suitability of the data to the data analysis techniques that are applied to the data, or do they merely assume its suitability?
Is the scope of works cited in the works' literature review sufficiently comprehensive?
To what extent do the works reflect major prior works in the area?
To what extent do the works that cite particular leading works appropriately represent and apply them?
To what extent do the works reflect the interests of sub-dominant stakeholders?
Do the works evidence an understanding of the power relationships within the research domain?
What unstated assumptions can be detected in the works?
The Key Characteristics in Table 1 are framed in a manner that is intended to cope with the diversity of potential applications indicated in Table 2. In each particular case, a research method needs to be crafted, and the 'key characteristics' can be used to articulate a research method applicable to a particular project, or category of projects, involving the critical analysis of published works.
To demonstrate this process, I draw on a recent program of work I have conducted in the area of 'researcher perspective'. By this term I mean the stakeholder viewpoint(s) from which the researcher observes the relevant phenomena. Several collections of works have been examined, trialling successive iterations of the analysis process. The first set of works was a cross-sectional sample of the author's national IS conference (ACIS) and journal (AJIS), reported in Clarke (2015). The second was a longitudinal sample of papers from the annual Bled conference (Clarke 2016b). The third was a journal Special Issue on personal data markets (Clarke 2017). The fourth (as-yet unpublished) studied a longitudinal sample of articles from the `Basket of 8' IS journals (Clarke et al. 2018). (In each of these analyses, over 90% of works have adopted a single stakeholder's perspective as an objective, relegating the interests of other stakeholders to the role of constraints. Further, the single perspective has been, in about 90% of cases, that of the system's sponsor).
The 'researcher perspective' project is a member of a broader category of projects involving the critical analysis of published works.
The genesis of the developing research technique can be seen in the pilot study performed in support of a Conference Keynote, documented in s.5.2 of Clarke (2015). The second iteration is described in s.4 of Clarke (2016b), with the coding protocol in Appendix 2. The version originally used for the third project is in the first Annex to Clarke (2017), and the subsequently revised version is in the second Annex. The specification for the most recent of the four iterations of the 'researcher perspective' project is provided in s.3.2 of Clarke et al. (2018), with the detailed coding protocol in Appendix 2; s.3 of that paper is particularly relevant.
A description of the process that is intended to be more readily comprehensible to readers not familiar with that specific project is provided in Table 3.
1. Specify the Research Question
Determine and document the purpose of the research project
2. Specify the Segment, Sampling Frame and Sample
Determine and document the part of the corpus of works relevant to IS that is to be subjected to examination
3. Specify the Sources of Categorisations
Determine and document the means whereby code-sets are to be defined, and the protocol whereby coding is to be undertaken
4. Review each Work in its own Terms
Perform 'orientational reading' of each work, in order to "gain an overall impression" (Boell & Cecez-Kecmanovic, 2014, p.265)
5. Identify Key Passages
Perform deep 'analytical reading' of each work (Boell & Cecez-Kecmanovic 2014, p.265). Record or highlight relevant passages
6. Perform the Coding
Apply the coding protocol, including appropriate quality assurance measures
7. Record Coding Rationale and Key Passages
Record sufficient data about, and quotations from, each work to enable review and moderation of the coding, to provide contextual information and thereby to support the formation and review of inferences and to enable the research to be subjected to audit
8. Review all Aspects of the Analysis
Review the approach adopted to quality assurance, and to the reliability of selection, coding, interpretation and presentation. Reflect on whether any aspect of the resulting text is unreasonable or biased: "By intentionally expressing, questioning, and reflecting upon their subjective experiences, beliefs, and values, critical researchers expose their [own] ideological and political agendas" (Cecez-Kecmanovic 2001, p.147)
9. Re-Work the Analysis as Necessary
Revise any selections, coding, interpretations, inferences or presentation about which doubt exists, and carry the effects through the entire analytical process
10. Communicate the Research Method and Researcher Perspective
Review the resulting text to ensure that the research method, and the researcher perspective are clearly communicated
11. Communicate the Results
Review the resulting text to ensure its ability to communicate effectively to the intended audience. Consider whether graphical support can assist in conveying the nature of the works (Boell & Cecez-Kecmanovic 2014, p.266) or the results of the analysis and coding, e.g. counts of relevant passages and indicators of their intensity, such as qualifications to statements
12. Avoid Undue Attribution of Intent
Review the resulting text to ensure that, in the absence of reliable evidence, attribution of intent on the part of authors is avoided, and the behaviour is assumed to be "unconscious hegemonic participation" (Wall et al. 2015, p.261)
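Step 6 calls for "appropriate quality assurance measures" without prescribing them. One commonly used measure, offered here purely as an illustrative sketch rather than as part of the technique, is inter-coder agreement: two researchers code the same works independently, and a chance-corrected statistic such as Cohen's kappa indicates whether the coding protocol is being applied consistently. The coder data below are hypothetical.

```python
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two coders
    who each assigned one code per work. Assumes expected agreement < 1."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of works on which the coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement if the two coders assigned codes independently.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders independently assign a researcher-perspective code to 6 works
# (hypothetical data).
a = ["sponsor", "sponsor", "user", "sponsor", "societal", "user"]
b = ["sponsor", "sponsor", "user", "user", "societal", "user"]
print(round(cohens_kappa(a, b), 3))  # -> 0.739
```

Low kappa values would trigger the loop back to earlier phases envisaged in Step 9: re-examination of the coding protocol, moderation discussions and, where necessary, re-coding.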
This paper has outlined the elements of a project to establish adequate conceptual foundations and guidance in relation to the implementation of a technique for critical analysis of the content of published works relevant to the IS discipline. It has identified and drawn on a wide range of sources on techniques that bear some kind of relationship to the present purpose. It has proposed a set of key characteristics of a process of this nature. It has provided an example of a research method articulated in order to support a particular kind of project.
It is contended that research of this nature is important to progress within the discipline, because constructive criticism lies at the very heart of scientific method. Criticism is, however, challenging and unpopular, particularly to the gate-keepers of a discipline. Both the conceptual foundations and the guidance in relation to the technique must accordingly be sufficiently robust to withstand the negative reactions that the proposition engenders, and to support the performance, reporting and publication of critiques that will assist in improvements to the quality of IS research, and the advancement of the discipline's standing.
The projects completed to date attest to the practicality of the technique when articulated for, and applied in, a specific critical theory research context. The small set is not sufficiently rich to illustrate or demonstrate the usefulness of the guidance across all of the various project-types. Far less do the projects conducted to date represent the evaluation step called for in design science, which involves observation and measurement of the new artefact's effectiveness in addressing stated objectives.
All of the conceptual aspects outlined in this paper require further and better connection with existing literature, and deeper consideration. The guidance relating to the technique needs to reflect the ongoing engagement with prior literature. Multiple instantiations of the technique need to be articulated in respect of exemplar research projects. As indicated by the set suggested in the previous section, these projects need to encompass diverse senses of the term `criticism', some performed within the terms of the works that are being subjected to examination, and some adopting a broader view of the field of research, in line with the norms of critical theory research.
The scope of this project was defined to include methodological critique only to the extent that the research method applied in the work under study is described in the text. The conformance of the research process underpinning each work with the principles and practices recommended by authorities on the relevant research techniques is not addressed here. It is highly desirable that a comparable technique be developed for the critical analysis of research methods in IS.
Alvesson M. & Deetz S. (2000) 'Doing Critical Management Research' Sage, 2000
Bandara,W., Miskon S. & Fielt E. (2011) 'A Systematic, Tool-Supported Method for Conducting Literature Reviews in Information Systems' Proc. ECIS 2011, Paper 221,at http://eprints.qut.edu.au/42184/1/42184c.pdf
Boell S.K. & Cecez-Kecmanovic D. (2014) 'A Hermeneutic Approach for Conducting Literature Reviews and Literature Searches' Communications of the Association for Information Systems 34, 12, at http://tutor.nmmu.ac.za/mgerber/Documents/ResMeth_Boell_2014_Literature%20Reviews.pdf
Boell S.K. & Cecez-Kecmanovic D. (2015) 'On being 'systematic' in literature reviews in IS' J. Infor. Techno. 30 (2015) 161-173
vom Brocke J. & Simons A. (2008) 'Towards a Process Model for Digital Content Analysis - The Case of Hilti' Proc. Bled Conference, June 2008, at http://aisel.aisnet.org/bled2008/2
Cecez-Kecmanovic D. (2001) 'Doing Critical IS Research: The Question of Methodology' Ch.VI in 'Qualitative Research in IS: Issues and Trends: Issues and Trends' (ed. Trauth E.M.), pp. 141-163, Idea Group Publishing, 2001, at https://pdfs.semanticscholar.org/37b1/e4c060b93fcfa04d81f03b750e746ba42f2d.pdf
Cecez-Kecmanovic D. (2005) 'Basic assumptions of the critical research perspectives in information systems' Ch. 2 in Howcroft D. & Trauth E.M. (eds) (2005) 'Handbook of Critical Information Systems Research: Theory and Application', pp.19-27, Edward Elgar, 2005
Clarke R. (2012) 'The First 25 Years of the Bled eConference: Themes and Impacts' Proc. 25th Bled eConference Special Issue, PrePrint at http://www.rogerclarke.com/EC/Bled25TA.html
Clarke R. (2015) 'Not Only Horses Wear Blinkers: The Missing Perspectives in IS Research' Keynote Presentation, Proc. Austral. Conf. in Infor. Syst. (ACIS 2015), at https://arxiv.org/pdf/1611.04059, Adelaide, December 2015, PrePrint at http://www.rogerclarke.com/SOS/ACIS15.html
Clarke R. (2016a) 'Big Data, Big Risks' Information Systems Journal 26, 1 (January 2016) 77-90, PrePrint at http://www.rogerclarke.com/EC/BDBR.html
Clarke R. (2016b) 'An Empirical Assessment of Researcher Perspectives' Proc. Bled eConf., Slovenia, June 2016, at http://www.rogerclarke.com/SOS/BledP.html
Clarke R. (2016c) 'Quality Assurance for Security Applications of Big Data' Proc. EISIC'16, Uppsala, 17-19 August 2016, PrePrint at http://www.rogerclarke.com/EC/BDQAS.html
Clarke R. (2017) 'Content Analysis in Support of Critical Theory Research: How to Deliver an Unwelcome Message Without Being Shot' Proc. 30th Bled eConference, June 2017, PrePrint at http://www.rogerclarke.com/SOS/CACT.html
Clarke R. (2018) 'Guidelines for the Responsible Application of Data Analytics' Computer Law & Security Review 34, 3 (May-Jun 2018) 467- 476, https://doi.org/10.1016/j.clsr.2017.11.002, PrePrint at http://www.rogerclarke.com/EC/GDA.html
Clarke R., Davison R. & Jia W. (2018) 'Researcher Perspective in the 'Basket of 8' IS Journals: A Weakness in Disciplinary Scope?' Working Paper, Xamax Consultancy Pty Ltd, March 2018, at http://www.rogerclarke.com/SOS/RP8-180307.html
Clarke R. & Pucihar A. (2013) 'Electronic Interaction Research 1988-2012 through the Lens of the Bled eConference' Electronic Markets 23, 4 (December 2013) 271-283, PrePrint at http://www.rogerclarke.com/EC/EIRes-Bled25.html
Debortoli S., Müller O., Junglas I. & vom Brocke J. (2016) 'Text Mining For Information Systems Researchers: An Annotated Topic Modeling Tutorial' Communications of the Association for Information Systems 39, 7, 2016
Galliers R.D. & Whitley E.A. (2002) 'An Anatomy of European Information Systems Research - ECIS 1993 - ECIS 2002: Some Initial Findings' Proc. ECIS 2002, June 2002, Gdansk, Poland, PrePrint at http://personal.lse.ac.uk/whitley/allpubs/ECIS2002.pdf
Galliers R.D. & Whitley E.A. (2007) 'Vive les differences? Developing a profile of European information systems research as a basis for international comparisons' Euro. J. of Infor. Syst. 16, 1 (2007) 20-35
Gregor S. & Hevner A. (2013) 'Positioning Design Science Research for Maximum Impact' MIS Quarterly 37, 2 (June 2013) 337-355, at https://ai.arizona.edu/sites/ai/files/MIS611D/gregor-2013-positioning-presenting-design-science-research.pdf
Grover V. & Lyytinen K. (2015) `New State of Play in Information Systems Research: The Push to the Edges' MIS Quarterly 39:2 (2015) 271-296
Hsieh H.-F. & Shannon S.E. (2005) 'Three Approaches to Qualitative Content Analysis' Qualitative Health Research 15, 9 (November 2005) 1277-1288, at http://www33.homepage.villanova.edu/edward.fierros/pdf/Hsieh%20Shannon.pdf
Indulska M., Hovorka D.S. & Recker J.C. (2012) 'Quantitative approaches to content analysis: Identifying conceptual drift across publication outlets' European Journal of Information Systems 21, 1, 49-69, at http://eprints.qut.edu.au/47974/
Kaid L.L. (1989) 'Content analysis' In Emmert P. & Barker L.L. (Eds.) 'Measurement of communication behavior', pp. 197-217, Longman, 1989
King W.R. & He J. (2005) 'Understanding the Role and Methods of Meta-Analysis in IS Research' Communications of the Association for Information Systems 16, 32
Krippendorff K. (1980) 'Content Analysis: An Introduction to Its Methodology' Sage, 1980
Kuhn T.S. (1962) 'The Structure of Scientific Revolutions' University of Chicago Press, 1962
MacLure M. (2005) '`Clarity bordering on stupidity': where's the quality in systematic review?' Journal of Education Policy 20, 4 (2005) 393-416, at http://www.esri.mmu.ac.uk/respapers/papers-pdf/Paper-Clarity%20bordering%20on%20stupidity.pdf
Mingers J. (2003) 'The paucity of multimethod research: a review of the information systems literature' Information Systems Journal 13, 3 (2003) 233-249
Myers M.D. (1997) 'Qualitative research in information systems' MISQ Discovery, June 1997, at http://www.academia.edu/download/11137785/qualitative%20research%20in%20information%20systems.pdf
Myers M.D. & Klein H.K. (2011) 'A Set of Principles for Conducting Critical Research in Information Systems' MIS Quarterly 35, 1 (March 2011) 17-36, at https://pdfs.semanticscholar.org/2ecd/cb21ad740753576215ec393e499b1af12b25.pdf
Niederman F. & March S. (2012) 'Design Science and the Accumulation of Knowledge in the Information Systems Discipline' ACM Transactions on MIS 3, 1 (2012), 1
Oakley A. (2003) 'Research evidence, knowledge management and educational practice: early lessons from a systematic approach' London Review of Education 1, 1 (2003) 21-33, at http://www.ingentaconnect.com/contentone/ioep/clre/2003/00000001/00000001/art00004?crawler=true&mimetype=application/pdf
Okoli C. & Schabram K. (2010) 'A Guide to Conducting a Systematic Literature Review of Information Systems Research' Sprouts: Working Papers on Information Systems, 10, 26 (2010), at http://sprouts.aisnet.org/10-26
Peffers K., Tuunanen T., Rothenberger M. & Chatterjee S. (2007) 'A design science research methodology for information systems research' Journal of Management Information Systems, 24, 3 (Winter 2007) 45-77
Popper K. (1963) 'Conjectures and Refutations: The Growth of Scientific Knowledge' Harper & Row, 1963
Stemler S. (2001) 'An overview of content analysis' Practical Assessment, Research & Evaluation 7, 17 (2001), at http://PAREonline.net/getvn.asp?v=7&n=17
Straub D. W. (2009) 'Editor's comments: Why top journals accept your paper' MIS Quarterly, 33, 3 (September 2009) iii-x, at http://misq.org/misq/downloads/download/editorial/3/
Tarafdar M. & Davison R.M. (2018) `Research in Information Systems: Intra-disciplinary and Inter-disciplinary Approaches' Forthcoming, Journal of the Association for Information Systems
Wall J.D., Stahl B.C. & Salam A.F. (2015) 'Critical Discourse Analysis as a Review Methodology: An Empirical Example' Communications of the Association for Information Systems 37, 11 (2015), at https://www.dora.dmu.ac.uk/xmlui/bitstream/handle/2086/11180/2015_Critical_Discourse_Analysis_Review_Methodology_CAIS%20(1).pdf
Webster J. & Watson R.T. (2002) 'Analyzing The Past To Prepare For The Future: Writing A Literature Review' MIS Quarterly 26, 2 (June 2002) xiii-xxiii, at http://intranet.business-science-institute.com/pluginfile.php/247/course/summary/Webster%20%20Watson.pdf
Weber R.P. (1990) 'Basic Content Analysis' Sage, 1990
This work has benefitted from feedback by colleagues, including during presentations of the emergent argument and analysis in a Keynote at the Australasian Conference in Information Systems (ACIS'15) in Adelaide in 2015, and at the Bled Conference in 2016 and 2017, and the contributions of my co-authors on the evaluation of 'Basket of 8' works. Further valuable comments were received from a reviewer.
Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in Cyberspace Law & Policy at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.
Created: 23 June 2018 - Last Amended: 11 September 2018 by Roger Clarke