
Key management procedures in Cloud computing

Cloud computing infrastructures require the management and storage of many different kinds of keys; examples include session keys to protect data in transit (e.g., SSL keys), file encryption keys, key pairs identifying cloud providers, key pairs identifying customers, authorization tokens and revocation certificates. Because virtual machines do not have a fixed hardware infrastructure and cloud-based content tends to be geographically distributed, it is more difficult to apply standard controls, such as hardware security module (HSM) storage, to keys on cloud infrastructures. For example:

  1. HSMs are by necessity strongly physically protected (from theft, eavesdropping and tampering). This makes it very difficult for them to be distributed across the multiple locations used in cloud architectures (i.e., geographically distributed and highly replicated). Key management standards such as PKCS#10 and associated standards such as PKCS#11 do not provide standardized wrappers for interfacing with distributed systems.
  2. Key management interfaces which are accessible via the public Internet (even if only indirectly) are more vulnerable, as security depends both on the communication channel between the user and the cloud key storage and on the mutual remote authentication mechanisms used.
  3. New virtual machines needing to authenticate themselves must be instantiated with some form of secret. The distribution of such secrets may present problems of scalability. The rapid scaling of certification authorities issuing key pairs is easily achieved if resources are determined in advance, but dynamic, unplanned scaling of hierarchical trust authorities is difficult to achieve because of the resource overhead in creating new authorities (registration and certification overhead in authenticating new components, distributing new credentials, etc.).
  4. Revocation of keys within a distributed architecture is also expensive. Effective revocation essentially implies that applications check the status of the key (usually via its certificate) within a known time constraint, which determines the window of risk. Although distributed revocation mechanisms exist, it is a challenge to ensure that different parts of the cloud receive an equivalent level of service, so that they are not exposed to different levels of risk. Centralized solutions such as OCSP are expensive and do not necessarily reduce the risk unless the CA and the CRL are tightly bound.
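The window of risk in point 4 can be made concrete with a minimal sketch. The names here (`RevocationChecker`, `fetch_status`) are our own illustrations, not part of any standard; a real deployment would query an OCSP responder or CRL distribution point instead of the stand-in callback.

```python
import time

def fetch_status(key_id, responder):
    # Stand-in for an OCSP/CRL lookup; `responder` is a hypothetical
    # callback returning "good" or "revoked" for a key identifier.
    return responder(key_id)

class RevocationChecker:
    """Caches revocation status, refreshing it after `window` seconds.

    `window` is the window of risk: a key revoked after the last
    refresh may still be accepted until the cached entry expires.
    """
    def __init__(self, responder, window=300):
        self.responder = responder
        self.window = window
        self.cache = {}  # key_id -> (status, fetched_at)

    def is_usable(self, key_id, now=None):
        now = time.time() if now is None else now
        entry = self.cache.get(key_id)
        if entry is None or now - entry[1] > self.window:
            status = fetch_status(key_id, self.responder)
            entry = (status, now)
            self.cache[key_id] = entry
        return entry[0] == "good"
```

Shrinking `window` narrows the exposure but multiplies load on the central status service, which is exactly the cost/risk trade-off described above.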


Trust challenges of Cloud computing

The Security and Privacy challenges discussed above are also relevant to the general requirement upon Cloud suppliers to provide trustworthy services. If Cloud providers find adequate solutions to address the data privacy and security specificities of their business model, they will in a certain way have met the requirement of offering trusted services. Yet there are a few other challenges which, if tackled properly, would enhance users' confidence in the application of Cloud computing and would build market trust in Cloud service offerings.

Continuity and Provider Dependency - The increasing complexity of Cloud architectures and the resulting lack of transparency also increase the security risk. In many Cloud implementations, the centralized management and control introduces several so-called single points of failure. These could threaten the availability of Cloud users’ data or computing capabilities indirectly, as a small incident in the Cloud could have an exponential impact.

Compliance with applicable regulations and good practices - While privacy is one regulatory area particularly relevant to Cloud computing, it is certainly not the only one. Once the law applicable to a Cloud service is determined, the provider will need to comply with regulations beyond privacy, such as general civil law and contract law, consumer protection law, e-commerce regulation and fair trade practices law.

Change in Cloud ownership and “Force Majeure” - The Cloud market is still immature, and the state of the global economy may affect some of the Cloud industry players in the coming months or years. Accordingly, users of the Cloud must be confident that the services externalized to the Cloud provider, including any important assets (personal data, confidential information), will not be disrupted, as discussed above (“Continuity and Provider Dependency”).

Trust enhancement through assurance mechanisms – By definition, the Cloud-computing concept cannot guarantee full, continuous and complete control of Cloud users over their assets. For these reasons, the establishment of appropriate “checks and controls” to ascertain that Cloud providers meet their obligations becomes very relevant for Cloud users (for example, through adherence to generally accepted standards).

Despite security, privacy and trust concerns, the benefits offered by Cloud computing are too significant to ignore. Thus, rather than discarding cloud computing because of the risks involved, the Cloud participants should work to overcome them so that they can maximize the benefits (e.g. reduced cost, increased storage, flexibility, mobility, etc.). Cloud users should become Risk Intelligent by taking a proactive approach to managing risks and challenges in Privacy, Security and Trust. Risk will become an even more important part of doing business when adopting Cloud concepts.

Risk can thus present both opportunity and peril: poorly managed, it allows a security breach by a hacker or a disgruntled employee, exposing an organisation to potential loss and liability. Effectively addressed, it enables management to exploit e-channels, mobile offices and process efficiency gains to positive effect. The Risk Intelligent C-suite should manage information security from the perspective of making money by taking intelligent risks, and of avoiding losing money by failing to manage risk intelligently.



Global Cloud Exchange and Markets

Enterprises currently employ Cloud services in order to improve the scalability of their services and to deal with bursts in resource demands. However, at present, service providers have inflexible pricing, generally limited to flat rates or tariffs based on usage thresholds, and consumers are restricted to offerings from a single provider at a time. Also, many providers have proprietary interfaces to their services thus restricting the ability of consumers to swap one provider for another.

For Cloud computing to mature, it is required that the services follow standard interfaces. This would enable services to be commoditised and thus would pave the way for the creation of a market infrastructure for trading in services. An example of such a market system, modeled on real-world exchanges, is shown in Figure 1. The market directory allows participants to locate providers or consumers with the right offers. Auctioneers periodically clear bids and asks received from market participants. The banking system ensures that financial transactions pertaining to agreements between participants are carried out. Brokers perform the same function in such a market as they do in real-world markets: they mediate between consumers and providers by buying capacity from the provider and sub-leasing it to the consumers. A broker can accept requests from many users, who in turn have a choice of submitting their requirements to different brokers. Consumers, brokers and providers are bound to their requirements and related compensations through SLAs. An SLA specifies the details of the service to be provided in terms of metrics agreed upon by all parties, together with rewards and penalties for meeting and violating the expectations, respectively.
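The auctioneer's role of periodically clearing bids and asks can be sketched as a simple call-market round. This is an illustrative toy, not a description of any actual exchange: participant names, the midpoint pricing rule and the tuple format are all our own assumptions.

```python
def clear_market(bids, asks):
    """Clear one round of a call market for cloud capacity.

    `bids` and `asks` are lists of (participant, price) tuples. The
    highest remaining bid is matched against the lowest remaining ask
    for as long as the bid meets or exceeds the ask; each trade is
    priced at the midpoint. Returns (buyer, seller, price) trades.
    """
    bids = sorted(bids, key=lambda b: -b[1])   # best (highest) bid first
    asks = sorted(asks, key=lambda a: a[1])    # best (lowest) ask first
    trades = []
    while bids and asks and bids[0][1] >= asks[0][1]:
        buyer, bid = bids.pop(0)
        seller, ask = asks.pop(0)
        trades.append((buyer, seller, (bid + ask) / 2))
    return trades
```

Unmatched participants simply carry over to the next clearing round, which mirrors how periodic auctioneers operate in the market infrastructure described above.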

Figure 1: Global Cloud exchange and market infrastructure for trading services.

Such markets can bridge disparate Clouds, allowing consumers to choose a provider that suits their requirements by either executing SLAs in advance or by buying capacity on the spot. Providers can use the markets in order to perform effective capacity planning. A provider is equipped with a price-setting mechanism which sets the current price for the resource based on market conditions, user demand, and current level of utilization of the resource. Pricing can be either fixed or variable depending on the market conditions. An admission-control mechanism at a provider’s end selects the auctions to participate in or the brokers to negotiate with, based on an initial estimate of the utility. The negotiation process proceeds until an SLA is formed or the participants decide to break off. These mechanisms interface with the resource management systems of the provider in order to guarantee that the allocation being offered or negotiated can be reclaimed, so that SLA violations do not occur. The resource management system also provides functionalities such as advance reservations that enable guaranteed provisioning of resource capacity.

Brokers gain their utility through the difference between the price paid by the consumers for gaining resource shares and that paid to the providers for leasing their resources. Therefore, a broker has to choose those users whose applications can provide it maximum utility. A broker interacts with resource providers and other brokers to gain or to trade resource shares. A broker is equipped with a negotiation module that is informed by the current conditions of the resources and the current demand to make its decisions.
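The broker's choice of "users whose applications provide it maximum utility" can be illustrated with a greedy sketch under stated assumptions: requests are (consumer, units, price-per-unit) tuples, the broker pays a flat per-unit provider price, and it fills the highest-margin requests first. All names and the pricing model are hypothetical simplifications.

```python
def select_requests(requests, capacity, provider_price):
    """Greedy broker: accept consumer requests in decreasing order of
    per-unit margin until the leased capacity is exhausted.

    Returns the chosen consumers and the broker's total utility, i.e.
    the sum over accepted units of (consumer price - provider price).
    """
    chosen, utility = [], 0.0
    for consumer, units, price in sorted(
            requests, key=lambda r: r[2], reverse=True):
        take = min(units, capacity)
        margin = price - provider_price
        if take > 0 and margin > 0:   # skip loss-making or unservable requests
            chosen.append(consumer)
            utility += take * margin
            capacity -= take
    return chosen, utility
```

A real broker would feed current resource conditions and demand forecasts into this decision, as the negotiation module described above does.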

Consumers have their own utility functions that cover factors such as deadlines, fidelity of results, and turnaround time of applications. They are also constrained in the amount of resources that they can request at any time, usually by a limited budget. Consumers also have their own limited IT infrastructure that is generally not completely exposed to the Internet. Therefore, a consumer participates in the utility market through a resource management proxy that selects a set of brokers based on their offerings. It then forms SLAs with the brokers that bind the latter to provide the guaranteed resources. The enterprise consumer then deploys its own environment on the leased resources or uses the provider’s interfaces in order to scale its applications.

However, significant challenges persist in the universal application of such markets. Enterprises currently employ conservative IT strategies and are unwilling to shift from traditional controlled environments. Cloud computing uptake has only recently begun and many systems are in the proof-of-concept stage. Regulatory pressures also mean that enterprises have to be careful about where their data gets processed, and therefore are not able to employ Cloud services from an open market. This could be mitigated through SLAs that specify strict constraints on the location of the resources. However, another open issue is how the participants in such a market can obtain restitution in case an SLA is violated. This motivates the need for a legal framework for agreements in such markets.


Public Cloud Outsourcing

Although cloud computing is a new computing paradigm, outsourcing information technology services is not. The steps that organizations take remain basically the same for public clouds as with other, more traditional, information technology services, and existing guidelines for outsourcing generally apply as well. What does change with public cloud computing, however, is the potential for increased complexity and difficulty in providing adequate oversight to maintain accountability and control over deployed applications and systems throughout their life cycle. This can be especially daunting when non-negotiable SLAs are involved, since responsibilities normally held by the organization are given over to the cloud provider, with little recourse for the organization to address, to its satisfaction, problems and issues that may arise.

Reaching agreement on the terms of service of a negotiated SLA for public cloud services can be a complicated process fraught with technical and legal issues. Migrating organizational data and functions into the cloud is accompanied by a host of security and privacy issues to be addressed, many of which concern the adequacy of the cloud provider’s technical controls for an organization’s needs. Service arrangements defined in the terms of service must also meet existing privacy policies for information protection, dissemination and disclosure. Each cloud provider and service arrangement has distinct costs and risks associated with it. A decision based on any one issue can have major implications for the organization in other areas.

Considering the growing number of cloud providers and range of services offered, organizations must exercise due diligence when moving functions to the cloud. Decision making about new services and service arrangements entails striking a balance between benefits in cost and productivity versus drawbacks in risk and liability.


Secular Cloud Challenges

Server spending the most negatively affected by the move to the cloud: In our survey, 54% of respondents cited server hardware as a top-three area of cost savings from the move to cloud computing. The average 8.6% expected reduction in server spending over the next three years due to the move to the cloud dwarfs the 1.0% and 0.4% expected savings for storage and networking, respectively.

On-premise server growth goes negative: A shift of workloads to more-efficient cloud environments and increased utilization of current server resources in private cloud environments push on-premise new server shipments to a -1% CAGR over the next three years in our model.
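The arithmetic behind such projections is plain compounding. A minimal sketch, with an illustrative base of 100 units of shipments (the base figure is our assumption, not from the survey):

```python
def project(base, cagr, years):
    """Compound a base figure forward at an annual growth rate.

    E.g. a -1% CAGR applied to on-premise server shipments for three
    years multiplies the base by (1 - 0.01) ** 3.
    """
    return base * (1 + cagr) ** years

# Three years of -1% CAGR shrink 100 units to roughly 97:
shipments_year3 = project(100.0, -0.01, 3)
```

The same function with the 8.6% server-spending reduction quoted above would show why server vendors feel the shift far more than storage or networking vendors do.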

Challenges for vendors tied to the growth of on-premise data centers: Growth drivers are shifting as vendors try to incorporate public cloud strategies; those slow to move will see significant headwinds to growth.

Models in Flux

1. Brocade: Brocade offers a competitive fabric-based strategy, but its ability to execute and penetrate large accounts remains a concern.

2. Cisco: Lacking a flat architecture for large-scale cloud deployments in its portfolio, we believe Cisco remains in a defensive position.

3. Hewlett-Packard: Hewlett-Packard lacks a clear strategy to attack cloud data centers with traditional server and networking products. However, its converged portfolio is taking share in on-premise data centers.

4. Microsoft: Microsoft’s dominant share in server operating systems is almost solely in on-premise environments. However, its public cloud offerings polled the strongest of any vendor in our survey.

5. Red Hat: While well positioned for the cloud build-out, Red Hat’s current subscription base is largely tied to on-premise deployments, and its virtualization, PaaS, and IaaS offerings are nascent.

6. SAP AG: A ramp in the Business ByDesign (BBD) reseller network is likely to drive higher top-line growth and meaningful revenue contribution for the group. We estimate BBD revenues at €83 million in 2012e (less than 1% of group software and software-related service revenues, SSRS), reaching about €900 million in 2015e, about 10% of group SSRS.
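Growing from €83 million to roughly €900 million over the three years from 2012e to 2015e implies an annual growth rate north of 100%; a quick sketch of the calculation (the formula is standard CAGR, the figures are the estimates quoted above):

```python
def implied_cagr(start, end, years):
    """Annual growth rate that compounds `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

# EUR 83m (2012e) -> ~EUR 900m (2015e) over three years:
bbd_growth = implied_cagr(83.0, 900.0, 3)   # roughly 1.21, i.e. ~121% per year
```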

Potentially Secularly Challenged


Designing and implementing security & privacy controls

We apply generally accepted frameworks – such as ISO 27001, COBIT and ISF – to enable secure businesses through Cloud computing. Our risk-based approach deals with these challenges:

Combining a sound integrated Security and Privacy Management with a clear view on the “to-be” architecture puts us in a unique position to define and execute a comprehensive security and privacy strategy. The design and implementation of security and privacy controls in an integrated manner cover a wide variety of methods that enable new business models while controlling the risks:

  1. Re-perimeterisation by segregating networks, storage and application environments and by implementing data-centric control mechanisms
  2. Disaster recovery and Business Continuity optimisation
  3. Identity and Access management beyond the perimeter
  4. Role-based access controls extended outside the Enterprise
  5. Data encryption and confidentiality for data at rest, in transit and in use
  6. Implementing appropriate risk-based privacy processes, controls and procedures
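Point 4 above, extending role-based access controls outside the enterprise, can be sketched minimally. The role names, permission sets and the "external callers are read-only" rule are illustrative assumptions, not drawn from any particular framework:

```python
# Hypothetical role-to-permission mapping; real deployments would pull
# this from a directory or policy service.
ROLE_PERMISSIONS = {
    "employee": {"read", "write"},
    "partner":  {"read"},            # external parties get a reduced role
    "auditor":  {"read", "audit"},
}

def is_allowed(role, action, inside_perimeter):
    """Grant `action` if the role permits it; callers outside the
    enterprise perimeter are additionally restricted to read-only
    access, regardless of role."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    if not inside_perimeter:
        allowed = allowed and action == "read"
    return allowed
```

The point of the sketch is the second check: once roles extend beyond the perimeter, the decision depends on context (where the call originates) as well as identity.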


The Market for Next-generation Optical Discs

The storage market has undergone remarkable changes over the past several years. These include the emergence of tablet PCs without ODDs (optical disc drives) and PCs without HDDs, as well as the market debut of SSDs (solid-state drives). These trends have caused the overall ratio of ODDs installed in PCs to decrease. Looking at the entire market, we expect that the share of semiconductor-based storage will increase in the PC and consumer-oriented system segments.

On the other hand, the amount of data handled by users is generally expected to continue growing exponentially. As for where this data will be stored, we expect a move from storage on personally owned systems to network storage. In addition, the storage capacity of enterprise systems, as represented by cloud computing systems, is expected to grow rapidly. For enterprise systems, effective management, realized by specializing data handling based on accessibility, importance and other such factors, is required.

Therefore, selecting the storage device most suitable for each purpose is the key to successful system management. For example, it is not cost-effective to store data that is rarely accessed but requires long-term retention on an SSD, which allows fast access but is relatively expensive. Rather than SSD or HDD storage, cost-effective management of this type of data can be achieved through offline management disconnected from the network, such as storage on an optical disc, which is excellent for long-term storage and has superior environmental resistance. In addition, removing the disc from the system and storing it in an offline environment reduces energy consumption for long-term storage, thereby contributing to more environmentally friendly systems.
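The tiering decision described above can be sketched as a simple rule. The access-frequency and retention thresholds here are arbitrary illustrative assumptions; real systems tune them to actual cost and latency figures:

```python
def choose_tier(accesses_per_month, retention_years):
    """Pick a storage medium from access frequency and retention need.

    Hot data goes to SSD; warm or short-retention data to HDD; rarely
    accessed data with a long retention requirement goes to optical
    disc, which can then be taken offline entirely.
    """
    if accesses_per_month >= 100:
        return "SSD"
    if accesses_per_month >= 1 or retention_years < 10:
        return "HDD"
    return "optical"
```

The third branch is the archiving case the text argues for: data that is almost never read but must survive for decades is exactly where offline optical storage beats SSD and HDD on cost and energy.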

Thus we expect that the next-generation optical disc market will evolve significantly, from primarily serving as a means of distributing media to consumers (e.g., music on CDs and movies on DVDs and Blu-ray discs) to serving as media for enterprise data archiving (long-term storage and recording). In addition to the above-mentioned technical changes, we can surmise that next-generation optical discs will significantly change their role in the market, and they must therefore be built with specifications prepared for these market changes.


Abuse and Nefarious Use of Cloud Computing

IaaS providers offer their customers the illusion of unlimited compute, network, and storage capacity — often coupled with a ‘frictionless’ registration process where anyone with a valid credit card can register and immediately begin using cloud services. Some providers even offer free limited trial periods. By abusing the relative anonymity behind these registration and usage models, spammers, malicious code authors, and other criminals have been able to conduct their activities with relative impunity. PaaS providers have traditionally suffered most from this kind of attack; however, recent evidence shows that hackers have begun to target IaaS vendors as well. Future areas of concern include password and key cracking, DDoS, launching dynamic attack points, hosting malicious data, botnet command and control, building rainbow tables, and CAPTCHA-solving farms.


IaaS offerings have hosted the Zeus botnet, InfoStealer trojan horses, and downloads for Microsoft Office and Adobe PDF exploits. Additionally, botnets have used IaaS servers for command and control functions. Spam continues to be a problem — as a defensive measure, entire blocks of IaaS network addresses have been publicly blacklisted.



Criminals continue to leverage new technologies to improve their reach, avoid detection, and improve the effectiveness of their activities. Cloud Computing providers are actively being targeted, partially because their relatively weak registration systems facilitate anonymity, and providers’ fraud detection capabilities are limited.



Privacy challenges of Cloud computing

In the Cloud computing environment, Cloud providers, being by definition third parties, can host or store important data, files and records of Cloud users. In certain forms of Cloud computing, use of the service per se entails that personally identifiable information, or content related to individuals’ privacy sphere, is communicated through the platform to a sometimes unrestricted number of users (see the social networking paradigm). Given the volume or location of Cloud computing providers, it is difficult for companies and private users to keep the information or data they entrust to Cloud suppliers under control at all times. Some key privacy or data protection challenges that can be characterised as particular to the Cloud computing context are, in our view, the following:

Sensitivity of entrusted information – It appears that any type of information can be hosted on, or managed by, the Cloud. No doubt some or all of this information may be business sensitive (e.g., bank account records), legally sensitive (e.g., health records), highly confidential or extremely valuable as a company asset (e.g., business secrets). Entrusting this information to a Cloud increases the risk of uncontrolled dissemination of that information to competitors (who may well share the same Cloud platform), to the individuals concerned by this information, or to any other third party with an interest in it.

Localisation of information and applicable law - The relation of certain data to a geographic location has never been more blurred than with the advent of Cloud computing. In the EU, as in other jurisdictions, the physical “location” plays a key role in determining which privacy rules apply. Thus, data collected and “located” within European territory can benefit from the protection of the European privacy rules.

Users’ access rights to information – Given that the users of the same Cloud share the premises of data processing and the data storage facilities, they are by nature exposed to the risk of information leakage and of accidental or intentional disclosure of information.

Data transfers – If the data used by, or hosted on, the Cloud may change location regularly or may reside in multiple locations at the same time, it becomes complicated to watch over the data flows and, consequently, to determine the conditions that would legitimize such data transfers. While data movements are geographically unlimited under some local laws, in Europe data transfers to third countries often require contractual or other arrangements to be in place (e.g., EU “model contracts” or US Safe Harbor registration for data transfers to the US). It may become complicated to fulfill these arrangements if data locations are not stable.
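A pre-transfer compliance check of the kind implied here can be sketched in a few lines. All of the rule data below is hypothetical (real adequacy lists and contract registries are maintained by regulators and legal teams, not hard-coded); the sketch only shows the shape of the decision:

```python
# Hypothetical rule data: destinations considered adequate per se, and
# (provider, destination) pairs covered by a contractual arrangement
# such as EU model contracts.
ADEQUATE_JURISDICTIONS = {"EU", "EEA"}
MODEL_CONTRACTS = {("acme-cloud", "US")}   # illustrative provider name

def transfer_permitted(provider, destination):
    """Allow a data transfer if the destination jurisdiction is
    adequate on its own, or if a registered contractual arrangement
    covers this provider/destination pair."""
    return (destination in ADEQUATE_JURISDICTIONS
            or (provider, destination) in MODEL_CONTRACTS)
```

The difficulty the text identifies is that when data locations shift constantly, `destination` is not a stable input, so even a simple rule like this becomes hard to evaluate.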

Externalization of privacy – Companies engaging in Cloud computing expect that the privacy commitments they have made towards their customers, employees or other third parties will continue to be honoured by the Cloud computing provider. This becomes particularly relevant if the Cloud provider operates in many jurisdictions in which the exercise of individual rights may be subject to different conditions.

Contractual rules with privacy implications - It is common for a Cloud provider to offer its facilities to users without individual contracts. Yet certain Cloud providers may agree to negotiate their agreements with clients, thus offering the possibility of tailored contracts. Whichever contractual model is chosen, certain contractual clauses can have direct implications for the privacy and protection of the entrusted information (e.g., defining who actually “controls” the data and who merely “processes” it).
