
Data Risks in the Cloud

Published in Journal of Theoretical and Applied Electronic Commerce Research (JTAER) 8, 3 (December 2013) 60-74

Roger Clarke **

© Xamax Consultancy Pty Ltd, 2013

Available under an AEShareNet Free for Education licence or a Creative Commons 'Some Rights Reserved' licence.


Cloudsourcing involves considerably greater risks to data than do insourcing or conventional outsourcing. A generic data risk assessment identifies key concerns in relation to harm arising from threats impinging on vulnerabilities in the cloud. Guidance is provided as to appropriate safeguards to address those risks. Most services lack those safeguards, implying that individuals and user organisations need to be far more careful in their use of cloud services.


1. Introduction

In conventional outsourcing, a supplier hosts equipment that supports a relevant stack of software, and stores and maintains data. In cloud computing, the supplier changes the focus of the offer from the equipment to the processes. Those processes may be run in any of a wide range of devices, and the location of those devices is determined by the needs of the supplier, not those of the customer. The supplier scales the number of processes and the processing speed and storage capacity to meet the customer's varying needs over hourly, daily, monthly and annual cycles, to reflect growth and decay factors, and in the face of demand uncertainty. The supplier may offer a tariff based on usage, because instead of unused capacity being locked up in hosts that have been pre-allocated to a customer, the supplier can make more efficient use of the available computing resources.

Three categories of cloudsourcing are conventionally distinguished. Infrastructure as a Service (IaaS) refers to the provision of a bare (but virtualised) machine, without software or with little more than a specific operating system and version. Amazon's EC2 and Rackspace were early movers in a marketspace that is becoming densely populated. Platform as a Service (PaaS), on the other hand, offers a configured platform on which organisations can develop and/or install their applications. Examples include Microsoft Windows Azure, Google Apps and a range of services offering specific application development and/or execution environments. The third category, Software as a Service (SaaS), makes specific application software available. SaaS is targeted at organisations of all sizes as an alternative to applications running on the organisation's own hosts, or on their employees' workstations. Examples include Salesforce, Google Gmail, Zoho, Google Apps, MS Office 365, Dropbox and MYOB LiveAccounts. SaaS is also offered directly to consumers, e.g. Zoho, Gmail, Google Docs, Dropbox.

SaaS commonly involves complete dependence on an autonomous external service-provider. At the other extreme, cloudsourcing may be subject to reasonably tight control by the user-organisation, e.g. if it is used solely as a means of replicating data, in such forms as a backup service or a multi-server environment to achieve regional distribution. Between these two extremes are various arrangements in which the dependence on the service-provider may be mitigated by maintaining up-to-date local copies of data, and even by the retention of some degree of local processing capability, e.g. as a fallback arrangement when the service is inaccessible.

A great many risks arise in relation to all forms of insourcing, outsourcing and cloudsourcing. Some risks relate to IT infrastructure, some to the services that the technology enables, and some to the data that it maintains. This article focusses on data risks. From this perspective, cloudsourcing encompasses various configurations. In particular:

An organisation may use different configurations to support different business functions. It may even switch between different configurations, e.g. to cope with peaks in demand for access or processing. The purpose of this article is to investigate the data risks that arise in all of the variants of cloudsourcing identified above.

2. Research Method

Cloudsourcing has been the subject of active marketing during the period since 2006. Marketers have a natural tendency to overplay the benefits and underestimate the disbenefits and risks. Many exaggerations and misrepresentations have been swallowed by uncritical reporters in the trade press. Moreover, the enthusiasm has at times spilt over into universities and even the academic literature.

During the course of a 4-year research program into cloudsourcing, the author has developed templates to support organisations in identifying benefits, disbenefits and risks (Clarke 2010). Risks faced by consumers were examined in Svantesson & Clarke (2010), and the templates were adapted and applied to the consumer segment in Clarke (2011). The templates were further developed in Clarke (2012b), which examined cloud computing from the perspective of outsourcing theory and security theory. The purpose of this paper is to examine the specific issue of data risks that arise from cloudsourcing, and the extent to which adopters and service-providers appear to be managing them.

The research reported on in this paper commenced with inspection of the relevant refereed literature and summarisation of the known information. The emergence of formal literature lags well behind technological and market phenomena. Reasons for this include the slowness of the cycle of research, writing, review, revision and publication, together with the difficulties of studying small populations of diverse and unstable new phenomena. To complement the slim body of relevant refereed work, this project included searches for media reports on the experiences of adopters of cloudsourcing. It is of course highly desirable that stronger approaches be developed to the gathering of empirical data about the performance and malperformance of cloudsourcing service-providers. The industry is still immature, however, and the collection of reliable data is a challenging undertaking. It is untenable to delay investigations of this nature, because information is needed now, to enable informed decision-making. The author accordingly contends that the best available evidence needs to be used, and the conclusions from the research qualified in order to reflect the inadequacies in the available data.

The paper commences by reviewing the literature on data security, both that dealing with it in a generic sense, and in the particular context of cloudsourcing.

3. Data Security

As a framework for the analysis, the conventional security model was adopted. This is based on the proposition that Threats impinge on Vulnerabilities, resulting in Harm to Assets, and that a network of Safeguards (sometimes referred to as 'Controls') needs to be devised, implemented and maintained in order to manage risk appropriately. Exhibit 1 provides a schematic representation. This model is closely related to, but differs in several respects from, those in Firesmith (2004) and CC (2012, pp. 38-39).

Exhibit 1: The Conventional Security Model

The well-established process of Security Risk Assessment was applied. This considers in turn the Assets, Harm, Threats, Vulnerabilities and existing Safeguards, in order to guide the development of a strategy and plan to assure protection that is reasonable in the circumstances (NIST 2002, ISO 2008, NIST 2011, ISM 2012). An appropriate trade-off needs to be made among costs and benefits that can be estimated with some degree of confidence, and abstract threats whose impact is uncertain. The application of the Risk Assessment process in this paper differs from its common usage in that it is intentionally generic rather than focussed on the specific context of a particular organisation.
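The assessment cycle described above can be sketched as a simple data structure. The following Python fragment is purely illustrative: the asset, threat and vulnerability names, the 1-5 scales, and the halving heuristic for safeguards are assumptions for the sketch, not prescriptions drawn from the cited standards.

```python
from dataclasses import dataclass, field

# Minimal sketch of the conventional security model: Threats impinge
# on Vulnerabilities, causing Harm to Assets, moderated by Safeguards.
# All names, scales and weightings below are illustrative only.

@dataclass
class Risk:
    asset: str          # what is at stake, e.g. "customer database"
    threat: str         # e.g. "provider storage failure"
    vulnerability: str  # e.g. "no local replica"
    likelihood: int     # 1 (rare) .. 5 (frequent) -- assumed scale
    impact: int         # 1 (minor) .. 5 (severe)  -- assumed scale
    safeguards: list = field(default_factory=list)

    def exposure(self) -> float:
        # A common heuristic: residual exposure falls as safeguards are
        # applied; here each safeguard halves the raw likelihood*impact.
        return (self.likelihood * self.impact) / (2 ** len(self.safeguards))

risks = [
    Risk("accounting data", "service outage", "single provider",
         likelihood=4, impact=3, safeguards=["local fallback copy"]),
    Risk("personal data", "unauthorised replication", "unencrypted storage",
         likelihood=2, impact=5),
]

# Rank residual exposures to guide the risk management plan
for r in sorted(risks, key=lambda r: r.exposure(), reverse=True):
    print(f"{r.asset}: {r.threat} -> exposure {r.exposure():.1f}")
```

The point of the sketch is the ordering it produces, not the absolute numbers: a generic assessment of this kind surfaces the threat-vulnerability combinations that most need treatment before a context-specific trade-off is made.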

The primary focus of the paper is on additional risks that arise where cloudsourcing is adopted. When applying the generic analysis reported on in this paper to a particular context, it is also necessary to consider the extent to which risks that apply to insourcing, and to forms of outsourcing other than the cloud, may be avoided or mitigated by cloudsourcing.

Organisations' perceptions of cloud computing risks have been investigated in a number of articles (e.g. Troshani et al. 2011). The literature offers various approaches to identifying and structuring data risks confronting organisations. An upbeat article that is very widely cited is Armbrust et al. (2010). It identified ten security 'Obstacles' and 'Opportunities', of which two were data-related. They argued that 'Data Lock-in' could be addressed by standardising APIs, and by compatible software to enable 'Surge or hybrid Cloud Computing'; and that 'Data Confidentiality and Auditability' issues could be addressed by deploying encryption, VLANs and firewalls. The paper's analysis of data risk was inadequate, using the inappropriate concept of 'data theft', and failing to encompass the outright loss of data.

Paquette et al. (2010) considered the use of cloud computing by US government agencies specifically, and proposed a four-element framework: access, availability, infrastructure and integrity. Hardy & Williams (2010), on the other hand, used a six-element risk framework, comprising continuity, compliance, auditability, reputation, intellectual property and content risks. Subashini & Kavitha (2011) provided a comprehensive discussion of security issues in SaaS offerings, but the structure imposed on the ideas was very muddy. They identified a range of risks in relation to unauthorised access to data in storage, variously by hackers, by unintended members of the organisation's own staff, by other users of the service-provider's facilities, and by the service-provider's staff - but, remarkably, the authors overlooked access by the service-provider and by governments. Other relevant issues identified were interception during transmission, jurisdictional location (as a matter of legal compliance), data integrity, and data availability - but, like Armbrust et al., they overlooked outright data loss.

Both Clarke (2010, 2012b) and Ackermann et al. (2011, 2012) built on the analysis of Avizienis et al. (2004). The Ackermann Security Risk Items and Dimensions are reproduced in Exhibit 2. Although the Ackermann model has some advantages for researchers, it conflates threats, vulnerabilities and safeguards, and does not provide useful guidance to organisations that are considering the adoption of cloudsourcing. It also combines IT security, service security, and data security into one melange. Ackermann's Risk Items 10-12 and 15-19 are service security risks, and 20-21, 25-27 and 29-31 are IT security matters. This paper is concerned with the 15 Items in the Ackermann list that are Data Risks.

Exhibit 2: The Ackermann Security Risk Items and Dimensions

Extract from Ackermann et al. (2012)

A range of initiatives have been commenced within the cloudsourcing service-provider industry sector (CSA 2009) and beyond it (ENISA 2009). There are also more pragmatically conceived approaches, such as a 'Cloud Computing Bill of Rights' (Urquhart 2010). An early but fairly comprehensive analysis is in Molnar & Schechter (2010). There are also some signs of new technologies that may deny access to data by cloudsourcing service-providers, such as Eben Moglen's FreedomBox (Dwyer 2011).

Current models and current services are widely recognised in the trade press as falling short of the need. The following section of this paper applies Security Risk Assessment in order to develop a framework for understanding cloudsourcing data risk. Unlike Ackermann et al., the focus of this work is on the delivery of value to practitioners rather than to researchers.

4. A Generic Data Risk Assessment of Cloudsourcing

To support user evaluation of cloudsourcing proposals, it is necessary to achieve clarity about the nature of relevant Assets and the Harm that they may suffer. Threat and Vulnerability Analysis can then be grounded in that understanding, and existing Safeguards can be evaluated and additional and enhanced Safeguards conceived.

4.1 Assets

The term data is used here to refer to any symbol, sign or measure that is in a form capable of being directly captured by a person or a machine. It may represent some phenomenon in the real world, either by resulting from a measurement of it, or from being postulated as indicating something about it. Alternatively, it may be synthetic data that has no such direct relationship, such as the data used in a Monte Carlo simulation. The term 'data' is used in this paper in preference to the term 'information', because it is more useful to limit the term information to data that has value, in particular value arising from relevance to a context such as a resource-allocation decision.

Data is subject to a range of quality factors, which bear on its value as an asset. One of these is its capacity to be relevant to some future decision. Other data quality factors include accuracy, precision, completeness and timeliness. Over time, the quality of any particular item of data may diminish. One reason may be because the real-world phenomenon is subject to change over time, but the recorded data does not reflect that change. Data may also lose quality as a result of processing that takes place in the interim, particularly through the alteration of the data, or the alteration or deletion of other data associated with it. The term 'data integrity' is commonly used to refer to the condition in which data quality is sustained.
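One concrete way of detecting a loss of data integrity of the kind just described is for the data controller to compute a digest before the data leaves its hands, and to re-verify it on retrieval. A minimal sketch using Python's standard hashlib follows; the ledger entry is a made-up example value.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return a SHA-256 digest of the data, recorded before upload."""
    return hashlib.sha256(data).hexdigest()

def integrity_intact(data: bytes, recorded_digest: str) -> bool:
    """On retrieval, re-compute the digest and compare it with the record.
    A mismatch signals modification, corruption or faulty conversion."""
    return digest(data) == recorded_digest

original = b"ledger entry: 2013-06-30, AUD 1,250.00"
stored_digest = digest(original)   # kept locally, not with the provider

assert integrity_intact(original, stored_digest)
assert not integrity_intact(b"ledger entry: 2013-06-30, AUD 9,250.00",
                            stored_digest)
```

Note that the digest must be held separately from the data itself: a digest stored alongside the data at the same provider can be recomputed by whoever alters the data.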

In order to understand data as an asset, it is important to take into account the distinction between data and the medium on which it is recorded. Another factor is the ready replicability of data, particularly in digital form. For these reasons, data is not an asset of the same kind as real estate or chattels (i.e. goods, made of atoms). Intellectual property laws, in particular copyright, create baskets of rights in relation to data, and those rights can be owned and sold; but the data itself is not an asset to which the notion of 'ownership' applies. Rather than data ownership, it is more appropriate to apply such concepts as data possession and data control. Data protection laws are commonly cast in terms of a 'data controller'. A data controller may be a corporation, a government agency or a not-for-profit organisation, and may be of any size (conventionally, micro, small, medium or large); or it may be an individual.

When determining the value of a data asset, organisations are almost entirely concerned with economic factors. For individuals, on the other hand, there may be an economic dimension, but more commonly their predominant concerns are social factors and psychological values, including hedonism or pleasure-value. Individuals may often be less concerned about data quality factors than organisations, although low-quality data in the hands of an organisation may cause individuals difficulties. Organisations generally want data to be persistent, whereas a significant proportion of the data that individuals are interested in is ephemeral.

Reflecting the uses to which organisations and individuals put data, the following sources of value can be distinguished:

Central though data is to this analysis, it is not the only Asset relevant to an evaluation of cloudsourcing. Data controllers, and parties to which the data relates, have a range of assets that mis-handling of data can affect. This aspect is further discussed in the following sub-section.

4.2 Harm

Reflecting the literature outlined earlier, Exhibit 3A identifies a set of five categories of Harm to Data that need to be taken into account when selecting among in-, out- and cloudsourcing options.

Exhibit 3A: Categories of Harm To Data

As will be shown in the following sub-sections, cloudsourcing creates additional risk exposures. The degree of harm varies greatly depending on a variety of factors. For example, a 5-minute period of inaccessibility of accounting data is of a completely different order of magnitude of harm from the unauthorised replication of a large database of sensitive personal data that is of sufficient richness to support identity fraud.

Because data serves important purposes, and has a number of different values associated with it, it is necessary to also consider categories of Harm arising to other Assets of value to the data controller and to other parties. For example, an airline may be negatively affected by loss of data by a company that maintains its aircraft; and an individual to whom data relates may be harmed by ill-informed decision-making by a corporation or government agency. The most direct impacts will, however, generally be on the data controller itself. Exhibit 3B identifies the kinds of harm that can be caused to an organisation when its data is subject to a security incident.

Exhibit 3B: Categories of Harm to Data Controllers

Compliance is an important aspect that has been inadequately treated in many discussions of cloudsourcing. The scope and significance of negative impacts on legal compliance is indicated in Exhibit 3C.

Exhibit 3C: Categories of Harm to Data Controllers' Compliance Obligations

4.3 Threats

A wide variety of threats exist, conventionally divided into three categories:

Rather than conducting analysis based on, say, the threat location or threat vector, the most promising approach for the current purpose is to consider Threats using the framework provided by the categories of Harm to Data in Exhibit 3A above. Within each major category, the threats are clustered according to the party responsible for them.

Exhibit 4: Categories of Threat

In all cases, the data-controller may suffer harm from accidents due to errors in the design or performance of business processes, and from attacks by insiders through abuse of the privileges granted to them as users. The incidence of business process error is likely to be higher with cloudsourcing than the alternatives, because the fit of the application to the organisation's needs is likely to be lower, and the application is likely to be less adaptable as those needs change.

The second bracket of categories refers to actions within the realm of the service-provider(s). Storage error might, with low probability, result in data modification, or unauthorised access or replication (e.g. as a result of errors in the permissions lists). A much more likely eventuality is that data may be inaccessible by the organisation for periods of time due to outages attributable to the service-provider's storage facilities (e.g. Mellor 2010). More severe consequences are likely to arise from loss of the data due to unrecoverable hardware failure. This has occurred with nominally reputable providers like Amazon (Blodget 2011). Far from being unusual, data loss appears to be a frequently-occurring problem (IBD 2013), and in the rankings published by the Cloud Security Alliance has been raised to the Number 2 security threat (Kar 2013).

This threat is particularly significant in the case of SaaS. After an organisation has adopted SaaS for, say, its office applications, a single server, database, network or power outage renders unavailable the office applications, office documents, mail-archives, appointments and address-books of every staff-member, not merely those staff-members local to the point-of-failure - cloudsourcing's effect is 'one out, all out' (Needleman 2011).

Data inaccessibility may arise from a brief failure of the service as a whole (e.g. due to power outage or loss of connectivity), or through a temporary network malfunction or overload. Disturbances of this kind might alternatively result in modification (e.g. recovery to an earlier database state). There is also the possibility of longer-term inaccessibility. This might arise from a suspension of service while a liquidator undertakes sale or barter of the service, or merely the data, as a means of recovering monies owed to the service-provider's creditors. (Undertakings previously given in relation to data are commonly ignored during and after a change in ownership of the service-provider, its business, or its database). Outright data loss has arisen in a variety of cases, where the service-provider simply closes its doors, or withdraws the service (e.g. Google's Postini cloud backup service - Tung 2012).

Network malfunction might give rise to unauthorised modification of data, or to unauthorised access to or replication of data (e.g. through delivery to an inappropriate location). Interception of traffic between the data-controller and the service-provider could also result in unauthorised access or replication. These exposures are broader than in the case of conventional outsourcing, because of the likelihood of geographical dispersion of the hosts that are providing the virtualised servers.

Abuse of Privilege by the service-provider is an ever-present possibility, which could result in any of the various forms of harm to the data. A specific instance of such abuse is Verizon's scanning of users' data (Gallagher 2013).

A further threat arises from the possibility that the data may be formatted in a manner that is compatible with a particular service, but not with any alternative services. This could give rise to delays in accessing the data, unauthorised data modification due to faulty conversion to a new format, or even complete inability to access the data, equivalent to loss of the data.

The final group of threats relates to parties other than the data-controller and the service-provider(s). A break-in may be followed by unauthorised data access (effectively small-scale copying) or unauthorised replication (large-scale copying). Unauthorised modification could occur. A special case of modification that has recently been in evidence, and that gives rise to data inaccessibility, is 'data-napping', whereby the hacker encrypts the data, and extorts a fee in return for the decryption key (Hicks 2012). A malicious hacker may, on the other hand, simply delete the data and perhaps seek out backups and delete them as well. Cloud computing has also given rise to a specialised form of hacking, which is referred to as 'isolation failure' or a 'guest-hopping attack'. A party that has processes running in the same host may be able to gain access to the data associated with another party, enabling any of the actions described earlier in this paragraph to be undertaken.

Where services are insourced, the threats indicated in Exhibit 4 as 2nd party threats are the direct responsibility of the data controller. With any form of outsourcing, the data controller loses direct control over the data, becomes dependent on one or more service-providers, is subject to increased threats to the extent that the data is transmitted further and more often, and is subject to the additional threat of abuse of privilege by service-providers and their employees. With cloudsourcing, the threats expand further, in that the locations of storage and processing are no longer known to the data controller, and hence transparency, oversight and auditability are undermined.

4.4 Vulnerabilities

The technical vulnerabilities inherent in outsourcing are exacerbated by cloudsourcing. Exhibit 5A identifies the key factors involved.

Exhibit 5A: Key Technical Vulnerabilities in Cloudsourcing

These factors inevitably give rise to reliability issues. In Clarke (2012a), over 100 media reports about cloud service outages were assessed in order to gain an understanding of the frequency, length, consequences and redress aspects of cloudsourcing service reliability and data security. Beyond short-term inaccessibility, data loss occurred in as many as 20% of the 49 outages documented in media reports during the period 2005-11.

Beyond the technical vulnerabilities are operational and commercial factors. Exhibit 5B provides a summary of some of the most prominent such vulnerabilities that arise from cloudsourcing.

Exhibit 5B: Key Operational and Commercial Vulnerabilities in Cloudsourcing

In the case of insourcing, the data controller has the capacity to directly manage these risks. With any form of outsourcing, the control becomes indirect, and dependent on contractual terms and the service-provider's conformance with those terms. With cloudsourcing, the terms are generally looser, and dictated by the service-provider rather than being customised to meet the data-controller's needs. The physical and jurisdictional location of the service-provider, the contract and the data also tend to become more remote from the data-controller.

A factor that is of major consequence in some circumstances is the scope for interference by governments. A standard is under development to facilitate law enforcement agency access to data in the cloud (ETSI 2012). However, government actions may or may not be authorised by law, and may or may not be mediated by the judiciary through court orders or warrants. In some cases, governments may also claim extra-territorial reach. This is particularly so with the USA, under its PATRIOT and Foreign Intelligence Surveillance Amendment (FISA) legislation (EP 2012). The US asserts that all data stored by any US corporation, no matter where in the world it is stored, is subject to US government demand powers. This is far from a mere theoretical possibility, as demonstrated by the (probably unlawful) closure of Megaupload's services by New Zealand law enforcement agencies at the behest of the US (Galvin 2012).

4.5 Key Data Risks in the Cloud

On the basis of the analysis conducted in the preceding sub-sections, it is possible to identify some threat-vulnerability combinations that are of particular concern to organisations and individuals considering the adoption of cloudsourcing. Exhibit 6 provides a graphical overview of them using the structure introduced in Exhibit 4 above.

Exhibit 6: Key Data Risks in the Cloud

The reasons for highlighting these aspects are as follows:

4.6 Safeguards

Some safeguards are natural, such as the technical challenges that confront casual hackers, and the costs involved in mounting some kinds of attacks. Some safeguards are mainstream, and engrained in corporate and individual behaviour, such as locking doors and authenticating people who seek access to data.

There are also incentives that encourage the implementation of safeguards. In particular, cloud service-providers need to provide a sufficient appearance of reliability that they can attract and retain customers. Some of their clientele are technically and commercially capable, and others hire technically and commercially competent consultancies to assess suppliers' capabilities. It would therefore seem reasonable to expect that some basic level of security would be a feature of all cloud services. Unfortunately, that expectation is somewhat undermined by the limited studies undertaken to date of terms of service, standard practices and security incidents.

This section considers the extent to which technical and operational safeguards appear to be comprehensive and effective, and the degree to which legal safeguards are able to fill the gaps. The primary focus is on the key data risks identified in Exhibit 6.

(1) Technical Safeguards

Safeguards tend to be fairly specific in the threat/vulnerability combinations that they address. For example, the threat of Data Interception in Transit can be addressed by channel encryption - subject to the qualification of its susceptibility to man-in-the-middle attacks. But channel encryption does nothing to mitigate the risks of breaches of security by the service-provider, insiders, hackers and governments. Similarly, encryption of data in storage represents a safeguard against unauthorised access by a hacker to the complete data-set, but not against abuse of privilege by the data-controller or its agents, or by a service-provider, resulting in loss, inaccessibility or unauthorised modification.

Moreover, if the data is to be processed in the cloud rather than merely stored and recovered, then it needs to be decrypted on the service-provider's device, which exposes the data to unauthorised access during the period when it is being processed, and also exposes the encryption and decryption keys. This creates vulnerability to third parties, but also to actions by the service-provider, including insecure key management processes.
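The distinction between channel encryption and storage encryption can be illustrated with a deliberately simplified sketch: if the data controller encrypts before upload with a key held only locally, the provider stores only ciphertext. The XOR 'cipher' below is a toy stand-in for a real algorithm such as AES, used here only so the sketch is self-contained; it must not be taken as a usable implementation.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher (e.g. AES). XOR with a random key
    # of equal length, never reused, is in effect a one-time pad.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"client list: Acme Pty Ltd, ..."
key = secrets.token_bytes(len(plaintext))  # held only by the data controller

ciphertext = xor_cipher(plaintext, key)    # this is all the provider stores
assert xor_cipher(ciphertext, key) == plaintext  # recoverable locally

# The trade-off noted above: to *process* the data in the cloud, the
# provider would need the plaintext (and hence the key), re-creating
# the very exposure that encrypted storage was meant to remove.
```

The sketch also makes the key-management point visible: whoever holds `key` can read the data, so passing it to the service-provider for processing collapses the safeguard.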

Another key safeguard against unauthorised modification, access and replication is access control. Passwords are an increasingly weak form of authentication, but the alternatives to date remain expensive or inconvenient. A significant challenge exists in making user authentication processes convenient for people who are authorised, yet very difficult for people who are not.
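A basic mitigation for the weakness of passwords is for the verifier to store only salted, slow-to-compute hashes, so that a breach of the credential store does not directly reveal the passwords themselves. A sketch using the standard library's PBKDF2 follows; the iteration count is an assumption for illustration, and production guidance favours substantially higher values.

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, iterations: int = 100_000):
    """Derive a salted hash; store (salt, derived), never the password."""
    salt = secrets.token_bytes(16)
    derived = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, derived

def verify_password(password: str, salt: bytes, derived: bytes,
                    iterations: int = 100_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    # constant-time comparison resists timing attacks
    return hmac.compare_digest(candidate, derived)

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("wrong guess", salt, stored)
```

Measures of this kind raise the cost of offline guessing after a breach, but they do not alter the underlying point: a password remains a single, phishable secret, which is why stronger authentication remains desirable despite its expense and inconvenience.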

A rich set of tools exists to enable service-providers to resist the efforts of hackers. On the other hand, data centres that support cloudsourcing are honey-pots of data that attract hacker-bees. There is, and will continue to be, an arms race of safeguards, followed by countermeasures which demand further safeguards, etc. Inevitably, some attempted break-ins will succeed.

Various applications of the redundancy principle are relevant. However, replication of the data across multiple locations, and even across multiple service-providers, while mitigating the risk of loss and inaccessibility, increases the risk of unauthorised access. Multi-sourcing remains very challenging at this stage, although some progress has been made in inter-operability protocols and standards, e.g. SNIA (2012).
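The redundancy principle mentioned above can be sketched as writing each object to several independent backends and reading from whichever still responds. In this toy fragment the 'providers' are in-memory dictionaries standing in for real storage services; real backends, their APIs and their failure modes are assumed away.

```python
# Toy multi-provider replication. Writing to every backend mitigates
# loss and inaccessibility; note that it also widens the surface for
# unauthorised access, as observed in the text above.

providers = {"provider_a": {}, "provider_b": {}, "provider_c": {}}

def put(key: str, value: bytes) -> None:
    for store in providers.values():
        store[key] = value              # replicate to every backend

def get(key: str) -> bytes:
    for store in providers.values():
        if key in store:                # first reachable copy wins
            return store[key]
    raise KeyError(f"{key} lost on all providers")

put("report.doc", b"quarterly figures")
del providers["provider_a"]["report.doc"]   # simulate loss at one provider
assert get("report.doc") == b"quarterly figures"  # still recoverable
```

A real multi-sourcing arrangement must additionally handle partial writes, version conflicts and provider-specific formats, which is precisely why the text describes it as very challenging at this stage.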

(2) Organisational Safeguards

Many risk exposures arise from human behaviour. Conventional organisational measures used to address them include staff-selection and training, and careful design of the manual aspects of business processes and of the human-computer interfaces. Cloudsourcing generally creates more challenges in implementing and sustaining these forms of organisational control, because the services tend to be less well-customised and hence a poorer fit to the organisation's and the individual users' needs.

Application of the redundancy principle in this area results in process controls such as split responsibilities, dual-entry, reviews, approvals, and reconciliations. These also tend to suffer in a cloudsourced environment because of the relatively unresponsive nature of applications to the organisation's changing needs.

A crucial aspect of organisational controls is oversight and audit of the processes and outcomes, and particularly of the controls. But oversight and audit are undermined in the cloudsourced environment, because of the lack of transparency, resulting in only a limited amount of information being available to enable the checking to be performed.
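Where provider-side transparency is limited, the data controller can at least keep a tamper-evident record of its own operations. A common technique is a hash chain, in which each log entry commits to the previous one, so that retrospective alteration is detectable. The sketch below is illustrative only, and does not describe any provider's audit facility.

```python
import hashlib
import json

def append_entry(log: list, event: dict) -> None:
    """Append an event, chaining its hash to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + body).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": entry_hash})

def chain_intact(log: list) -> bool:
    """Re-walk the chain; any altered entry breaks every later link."""
    prev_hash = "0" * 64
    for entry in log:
        body = json.dumps(entry["event"], sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256((prev_hash + body).encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"op": "upload", "object": "payroll.csv"})
append_entry(log, {"op": "delete", "object": "payroll.csv"})
assert chain_intact(log)

log[0]["event"]["object"] = "other.csv"   # retrospective tampering
assert not chain_intact(log)
```

Such a record does not restore the transparency that cloudsourcing removes, but it gives the auditor one trustworthy anchor: the data controller's own account of what it asked the service to do, and when.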

A key application of the redundancy principle is local replication of data, and fallback procedures to enable some continuity of business and customer service during outages. However, this is generally difficult to achieve with cloudsourcing.

(3) Legal Safeguards

Enforceable legal obligations are only a fallback safeguard, albeit an important one. For them to be effective, a number of conditions must be satisfied. These are summarised in Exhibit 7.

Exhibit 7: Legal Safeguards in the Cloud

It is highly unusual for cloudsourcing services to satisfy the requirements in Exhibit 7. A security guru put it this way: "Today's internet feudalism ... is ad hoc and one-sided. We give companies our data and trust them with our security, but we receive very few assurances of protection in return, and those companies have very few restrictions on what they can do" (Schneier 2012).

5. Implications

This survey of data risk in the cloud has suggested that many threat/vulnerability combinations exist for which the existing safeguards appear to be far from adequate. Some user organisations conduct security risk assessments and institute risk management plans that protect their interests. Many organisations that have adopted cloudsourcing, however, especially SaaS, have done so without careful consideration of the risks involved, and without a clear understanding of what Harm will arise to what Assets when what contingencies occur. It is no surprise that many governments and corporations remain sufficiently concerned about the security of cloudsourcing that they have taken conservative approaches, e.g. by applying it only to relatively small, non-core applications, or even by avoiding adoption at this stage in the maturation of cloudsourcing. Under current conditions, that would appear to be the appropriate approach.

In the academic arena, a considerable amount of research is being conducted, but much of it is conducted within the frame of reference set by the service-provider sector. It is insufficiently sceptical, and insufficiently reflective of the interests of user organisations. An important implication of the research reported in this article is that academics need to become more attuned to the needs of data-controllers, and to focus much more on ways to assess and manage the data risks inherent in cloudsourcing.

6. Conclusions

Much more needs to be done by industry associations and service-providers, by researchers, and probably also by parliaments and regulators. One approach to identifying the minimum requirements of cloudsourcing's management of data risks is to consider the responsibilities of company directors and their equivalents, and of senior executives of corporations, government business enterprises, and government agencies. They have legal obligations relating to the fulfilment of the organisation's mission, the definition and pursuit of strategic advantage, risk assessment and risk management, compliance, and business continuity. There are many circumstances in which directors could be readily found to be in breach of their responsibilities by adopting cloud computing, at least without a substantial risk management plan in place that brings the levels of data risk back within reasonable bounds. While that statement remains true, cloudsourcing remains unready for 'prime time'.

References


Ackermann T., Miede A., Buxmann P. & Steinmetz R. (2011) 'Taxonomy of Technological IT Outsourcing Risks: Support for Risk Identification And Quantification' Proc. ECIS 2011, Paper 240, at

Ackermann T., Widjaja T., Benlian A. & Buxmann P. (2012) 'Perceived IT Security Risks of Cloud Computing: Conceptualization and Scale Development' Proc. 33rd Int'l Conf. on Infor. Syst., Orlando, December 2012, at

Armbrust M., Fox A., Griffith R., Joseph A.D., Katz R., Konwinski A. & Zaharia M. (2010) 'A view of cloud computing' Communications of the ACM, 53, 4 (April 2010) 50-58

Avizienis A., Laprie J.C., Randell B. & Landwehr C. (2004) 'Basic Concepts and Taxonomy of Dependable and Secure Computing' IEEE Trans. Dependable and Secure Computing 1,1 (2004) 11- 33

Blodget H. (2011) 'Amazon's Cloud Crash Disaster Permanently Destroyed Many Customers' Data' Business Insider, 28 April 2011, at

Buldyrev S.V., Parshani R., Paul G., Stanley H.E. & Havlin S. (2010) 'Catastrophic cascade of failures in interdependent networks' Nature 464 (15 April 2010) 1025-1028, at

CC (2012) 'Common Criteria for Information Technology Security Evaluation - Part 1: Introduction and general model' Common Criteria, CCMB-2012-09-001, Version 3.1, Revision 4, September 2012, at

Clarke R. (2010) 'Computing Clouds on the Horizon? Benefits and Risks from the User's Perspective' Proc. 23rd Bled eConference, Slovenia, June 2010, PrePrint at

Clarke R. (2011) 'The Cloudy Future of Consumer Computing' Proc. 24th Bled eConference, June 2011, PrePrint at

Clarke R. (2012a) 'How Reliable is Cloudsourcing? A Review of Articles in the Technical Media 2005-11' Computer Law & Security Review 28, 1 (February 2012) 90-95, PrePrint at

Clarke R. (2012b) 'A Framework for the Evaluation of CloudSourcing Proposals' Proc. 25th Bled EConference, June 2012, PrePrint at

CSA (2009) 'Security Guidance for Critical Areas of Focus in Cloud Computing' Cloud Security Alliance, April 2009, at

Dignan L. (2011) 'Amazon outage ends cloud innocence' ZDNet, 23 April 2011, at

DSD (2012) 'Cloud Computing Security Considerations' Defence Signals Directorate, Canberra, September 2012, at

Dwyer J. (2011) 'Decentralizing the Internet So Big Brother Can't Find You' New York Times, 15 February 2011, at

ENISA (2009) 'Cloud Computing: Benefits, risks and recommendations for information security' European Network and Information Security Agency, November 2009, at

EP (2012) 'Fighting cyber crime and protecting privacy in the cloud' European Parliament, October 2012, at

ETSI (2012) 'Lawful Interception (LI); Cloud/Virtual Services (CLI)' ETSI DTR 101 567 V0.0.5, European Telecommunications Standards Institute, April 2012, at

Firesmith D. (2004) 'Specifying Reusable Security Requirements' Journal of Object Technology 3, 1 (Jan-Feb 2004) 61-75, at

Gallagher S. (2013) 'How Verizon found child pornography in its cloud: Scanned files using hashes of known child pornography images' Ars Technica, 6 March 2013, at

Galvin N. (2012) 'Megaupload closure hits legitimate users' The Sydney Morning Herald, 23 January 2012, at

Gliddon J. (2013) 'CommBank rules out public cloud storage' 25 February 2013, at,commbank-rules-out-public-cloud-storage.aspx

Golden B. (2011) 'Cloud Computing and the Truth About SLAs' Networkworld, 8 November 2011, at

Hardy C.A. & Williams S.P. (2010) 'Managing Information Risks and Protecting Information Assets in a Web 2.0 Era' Proc. Bled eConf., June 2010

Heiser J. (2011) 'Yes, Virginia, there are single points of failure' Gartner Blog, 30 May 2011, at

Hicks S. (2012) 'Russian hackers hold Gold Coast doctors to ransom' ABC News, 11 December 2012, at

Hilvert J. (2012) 'Conroy warns of cloud uncertainties' itNews, 15 February 2012, at,conroy-warns-of-cloud-uncertainties.aspx

IBD (2013) 'Cloud Computing Users Are Losing Data, Symantec Finds' Investor's Business Daily, 16 January 2013, at

ISM (2012) 'Information Security Manual' [Australian] Defence Signals Directorate, at

ISO (2008) 'Information Technology - Security Techniques - Information Security Risk Management' ISO/IEC 27005:2008

Kar S. (2013) 'CSA Report: Top Nine Cloud Security Threats in 2013' Cloud Times, 7 March 2013, at

Mellor C. (2010) 'NetApp and TMS involved in Virgin Blue outage' The Register, 28 September 2010, at

Molnar D. & Schechter S. (2010) 'Self Hosting vs. Cloud Hosting: Accounting for the security impact of hosting in the cloud ' Proc. 9th Workshop on the Economics of Information Security (WEIS 2010), Harvard University, June 2010, at

Needleman R. (2011) 'Was brief Google Docs outage a tremor or a tsunami?' cnet News, 7 September 2011, at

NIST (2002) 'Risk Management Guide for Information Technology Systems' NIST SP 800-30, [US] National Institute of Standards and Technology, July 2002, at

NIST (2011) 'Managing Information Security Risk: Organization, Mission, and Information System View' NIST SP 800-39, [US] National Institute of Standards and Technology, March 2011, at

Paquette S., Jaeger P.T. & Wilson S.C. (2010) 'Identifying the security risks associated with governmental use of cloud computing' Government Information Quarterly 27, 3 (July 2010) 245-253

Schneier B. (2012) 'When It Comes to Security, We're Back to Feudalism' Wired, 26 November 2012, at

SNIA (2012) 'Cloud Data Management Interface (CDMI)' Technical Position, Storage Networking Industry Association, 4 June 2012, at

Subashini S. & Kavitha V. (2011) 'A survey on security issues in service delivery models of cloud computing' Journal of Network and Computer Applications 34, 1 (January 2011) 1-11

Svantesson D. & Clarke R. (2010) 'Privacy and Consumer Risks in Cloud Computing' Computer Law & Security Review 26, 4 (July 2010) 391-397

Tay L. (2012) 'ANZ builds up 'cypher cloud' strategy' 27 November 2012, at,anz-builds-up-cypher-cloud-strategy.aspx

Troshani I., Rampersad G. & Wickramasinghe N. (2011) 'Cloud Nine? An Integrative Risk Management Framework for Cloud Computing' Proc. Bled eConference, June 2011

Tung L. (2012) 'Google clips Exchange backup service' itNews, 23 January 2012, at,google-clips-exchange-backup-service.aspx

Urquhart J. (2010) 'The 'Cloud Computing Bill of Rights': 2010 edition' Cnet News, 7 June 2010, at

Winterford B. (2011b) 'Transparency: a core tenet of the cloud' itNews, 20 September 2011, at,transparency-a-core-tenet-of-the-cloud.aspx

Author Affiliations

Roger Clarke is Principal of Xamax Consultancy Pty Ltd, Canberra. He is also a Visiting Professor in the Cyberspace Law & Policy Centre at the University of N.S.W., and a Visiting Professor in the Research School of Computer Science at the Australian National University.


Created: 1 April 2013 - Last Amended: 7 April 2013 by Roger Clarke - Site Last Verified: 15 February 2009