Cloud Computing
Cloud computing describes computation, software, data access, and storage
services that do not require end-user knowledge of the physical location and
configuration of the system that delivers the services.
A parallel can be drawn with the electricity grid, where end-users consume power without any necessary understanding of the component devices in the grid that provide the service.
Cloud computing is a natural evolution
of the widespread adoption of virtualization, service-oriented architecture, autonomic and utility computing.
Details are abstracted from end-users,
who no longer have need for expertise in, or control over, the technology
infrastructure "in the cloud" that supports them.
Cloud computing describes a new
supplement, consumption, and delivery model for IT services based on Internet
protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources.
It is a byproduct and consequence of the
ease-of-access to remote computing sites provided by the Internet.
This frequently takes the form of web-based tools or applications that users can access through a web browser as if they were programs installed locally on their own computers.
The
National Institute of Standards and Technology (NIST) provides a somewhat more objective and specific
definition:
"Cloud computing is a model for enabling convenient,
on-demand network access to a shared pool of configurable computing resources
(e.g., networks, servers, storage, applications, and services) that can be
rapidly provisioned and released with minimal management effort or service
provider interaction."[6]
The term
"cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the
past to represent the telephone network,[7] and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.[8] Typical cloud computing providers deliver common business applications online
that are accessed from another Web service or software like a Web browser, while the software and data are stored on servers.
Most cloud
computing infrastructures consist of services delivered through common centers
and built on servers. Clouds often appear as single points of access for
consumers' computing needs. Commercial offerings are generally expected to meet
quality of service (QoS)
requirements of customers, and typically include service level agreements (SLAs).[9]
Overview
Comparisons
Cloud
computing derives characteristics from, but should not be confused with:
1. Client–server model – client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients)[11]
2. Grid computing – "a form of distributed computing and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks"
3. Mainframe computer – powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.[12]
4. Utility computing – the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity"[13]
5. Peer-to-peer – a distributed architecture without the need for central coordination, in which participants are simultaneously both suppliers and consumers of resources (in contrast to the traditional client–server model)
6. Service-oriented computing – cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing consists of the computing techniques that operate on software as a service.[14]
Characteristics
The key characteristic of cloud computing is that the computing is "in the cloud", i.e., the processing (and the related data) is not in a specified, known or static place. This contrasts with a model in which the processing takes place in one or more specific, known servers. All the other concepts mentioned are supplementary or complementary to this one.
Architecture
Cloud computing sample architecture
Cloud
architecture,[15] the systems architecture of the software systems involved in the delivery of cloud computing, typically
involves multiple cloud components communicating with each other over application programming interfaces, usually web services and 3-tier architecture. This
resembles the Unix philosophy of having multiple programs each
doing one thing well and working together over universal interfaces. Complexity
is controlled and the resulting systems are more manageable than their monolithic counterparts.
The two
most significant components of cloud computing architecture are known as the
front end and the back end. The front end is the part seen by the client, i.e.
the computer user. This includes the client’s network (or computer) and the
applications used to access the cloud via a user interface such as a web
browser. The back end of the cloud computing architecture is the ‘cloud’
itself, comprising various computers, servers and data storage devices.
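This front-end/back-end split can be sketched in a few lines of Python (all class and server names here are hypothetical, not any vendor's design): the front end talks to the cloud only through a narrow interface, and never learns which back-end server actually handled a request.

```python
class BackEnd:
    """The 'cloud' itself: a pool of servers behind one interface."""
    def __init__(self, servers):
        self.servers = servers
        self._next = 0

    def handle(self, request):
        # The caller never learns which server served the request.
        server = self.servers[self._next % len(self.servers)]
        self._next += 1
        return f"{request} handled by {server}"


class FrontEnd:
    """What the user sees, e.g. a web browser talking to one endpoint."""
    def __init__(self, backend):
        self.backend = backend

    def request(self, action):
        return self.backend.handle(action)


cloud = BackEnd(servers=["server-a", "server-b"])
client = FrontEnd(cloud)
print(client.request("render page"))   # handled by server-a
print(client.request("render page"))   # handled by server-b
```

The point of the sketch is only the separation of concerns: the front end sees one access point, while the back end is free to spread work across any number of machines.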
History
The
underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a
public utility." Almost all the modern-day
characteristics of cloud computing (elastic provision, provided as a utility, online,
illusion of infinite supply), the comparison to the electricity industry and
the use of public, private, government and community forms were thoroughly
explored in Douglas Parkhill's 1966 book, The Challenge of the Computer
Utility.
The actual
term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s
primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network
(VPN) services with comparable quality of service but at a much lower cost. By
switching traffic to balance utilization as they saw fit, they were able to
utilize their overall network bandwidth more effectively. The cloud symbol was
used to denote the demarcation point between that which was the responsibility
of the provider from that of the user. Cloud computing extends this boundary to
cover servers as well as the network infrastructure.[16] The first scholarly use of the term “cloud computing” was
in a 1997 lecture by Ramnath Chellappa.
Amazon played a key role in the development of cloud computing by modernizing its data centers after the dot-com bubble; like most computer networks, these were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud
architecture resulted in significant internal efficiency improvements whereby
small, fast-moving "two-pizza teams" could add new features faster
and more easily, Amazon initiated a new product development effort to provide
cloud computing to external customers, and launched Amazon Web Services (AWS)
on a utility computing basis in
2006.[17][18]
In 2007, Google, IBM and a number of universities
embarked on a large scale cloud computing research project.[19] In early 2008, Eucalyptus became the first open source AWS
API compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission funded
project, became the first open source software for deploying private and hybrid
clouds and for the federation of clouds.[20] In the same year, efforts were focused on providing QoS
guarantees (as required by real-time interactive applications) to Cloud-based
infrastructures, in the framework of the IRMOS European Commission funded
project.[21] By mid-2008, Gartner saw an opportunity for cloud computing
"to shape the relationship among consumers of IT services, those who use
IT services and those who sell them"[22] and observed that "[o]rganisations are switching from
company-owned hardware and software assets to per-use service-based
models" so that the "projected shift to cloud computing ... will
result in dramatic growth in IT products in some areas and significant
reductions in other areas."[23]
Key Characteristics
·Agility improves with users' ability to rapidly and inexpensively
re-provision technological infrastructure resources.
·Application Programming Interface (API) accessibility to software that enables machines to
interact with cloud software in the same way the user interface facilitates interaction
between humans and computers. Cloud Computing systems typically use REST-based APIs.
·Cost is claimed to be greatly reduced; in a public cloud delivery model, capital expenditure is converted to operational expenditure.[25] This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for in-house implementation.[26]
·Device and location independence[27] enable users to access systems using a web browser
regardless of their location or what device they are using (e.g., PC, mobile).
As infrastructure is off-site (typically provided by a third-party) and
accessed via the Internet, users can connect from anywhere.[26]
·Multi-tenancy enables sharing of resources and costs across a large pool of users, thus allowing for:
o Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
o Peak-load capacity increases (users need not engineer for highest possible load-levels)
·Reliability is improved if multiple redundant sites are used, which
makes well designed cloud computing suitable for business continuity and disaster recovery.[28] Nonetheless, many major cloud computing services have
suffered outages, and IT and business managers can at times do little when they
are affected.[29][30]
·Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads. Performance is monitored, and consistent and
loosely coupled architectures are constructed using web services as the system interface.[26] One of the most important new methods for overcoming
performance bottlenecks for a large class of applications is data parallel
programming on a distributed data grid.[31]
·Security could improve due to centralization of data,[32] increased security-focused resources, etc., but concerns
can persist about loss of control over certain sensitive data, and the lack of
security for stored kernels.[33] Security is often as good as or better than under
traditional systems, in part because providers are able to devote resources to
solving security issues that many customers cannot afford.[34] Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible. Furthermore, the
complexity of security is greatly increased when data is distributed over a
wider area and/or number of devices.
·Maintenance of cloud computing applications is easier, since they don't
have to be installed on each user's computer. They are easier to support and to
improve since the changes reach the clients instantly.
·Metering means that cloud computing resource usage should be measurable and metered per client and application on a daily, weekly, monthly, and yearly basis.
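Two of the characteristics above, metering and on-demand scalability, can be illustrated with a toy sketch (all names and rates are hypothetical): a meter accumulates per-client usage for utility-style billing, and a simple autoscaler computes how many instances are needed to keep load near a target, so users need not engineer for peak load.

```python
import math
from collections import defaultdict


class Meter:
    """Records resource usage per client so it can be billed per use."""
    def __init__(self):
        self.usage = defaultdict(float)

    def record(self, client, hours):
        self.usage[client] += hours

    def bill(self, client, rate_per_hour):
        return self.usage[client] * rate_per_hour


def autoscale(current_instances, load_per_instance, target_load=0.7):
    """Return the instance count needed to keep per-instance load near target."""
    total_load = current_instances * load_per_instance
    return max(1, math.ceil(total_load / target_load))


meter = Meter()
meter.record("client-a", hours=10)
meter.record("client-a", hours=5)
print(meter.bill("client-a", rate_per_hour=2))   # 30.0

# 4 instances each at 90% load: scale out to bring load near 70% each.
print(autoscale(current_instances=4, load_per_instance=0.9))
```

Real providers meter many resource types and scale on richer signals, but the principle is the same: usage is measured per client, and capacity follows demand rather than a fixed peak estimate.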
Layers
The Internet functions through a series of network protocols that form a stack of layers (described in more detail in the OSI model). Once an Internet Protocol connection is established among
several computers, it is possible to share services within any one of the
following layers.
Client
A cloud
client consists of computer hardware and/or computer software that relies on cloud computing for application delivery, or
that is specifically designed for delivery of cloud services and that, in
either case, is essentially useless without it. Examples include some computers, phones and other devices, operating systems and browsers.[35][36][37][38][39]
Application
Cloud
application services or "Software as a Service (SaaS)"
deliver software as a service over the Internet, eliminating the need to install and run the application on
the customer's own computers and simplifying maintenance and support. People
tend to use the terms ‘SaaS’ and ‘cloud’ interchangeably, when in fact they are
two different things. Key
characteristics include:
·Network-based access to, and management of, commercially
available (i.e., not custom) software
·Activities that are managed from central locations rather
than at each customer's site, enabling customers to access applications
remotely via the Web
·Application delivery that typically is closer to a
one-to-many model (single instance, multi-tenant architecture) than to a
one-to-one model, including architecture, pricing, partnering, and management
characteristics
·Centralized feature updating, which obviates the need for
downloadable patches and upgrades.
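The one-to-many, single-instance multi-tenant model described above can be sketched as follows (a hypothetical toy, not any vendor's implementation): one running application serves every tenant, partitions their data by a tenant identifier, and a single centralized upgrade reaches all of them at once.

```python
class SaaSApp:
    """One shared application instance serving many tenants."""
    def __init__(self):
        self.version = "1.0"   # one centrally updated version for everyone
        self._data = {}        # tenant_id -> that tenant's records

    def store(self, tenant_id, record):
        self._data.setdefault(tenant_id, []).append(record)

    def records(self, tenant_id):
        # Each tenant sees only its own partition of the shared instance.
        return list(self._data.get(tenant_id, []))

    def upgrade(self, new_version):
        # Centralized feature updating: one change reaches every tenant,
        # with no per-customer patches or downloads.
        self.version = new_version


app = SaaSApp()
app.store("acme", "invoice-1")
app.store("globex", "invoice-7")
print(app.records("acme"))    # ['invoice-1'] -- tenants are isolated
app.upgrade("1.1")
print(app.version)            # 1.1 -- every tenant now runs it
```

Contrast this with the one-to-one model, where each customer would run and patch a separate installation.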
Platform
Cloud
platform services or "Platform as a Service (PaaS)"
deliver a computing platform and/or solution stack as a service, often consuming cloud infrastructure and
sustaining cloud applications.[41] It facilitates deployment of applications without the cost
and complexity of buying and managing the underlying hardware and software
layers.[42][43]
Infrastructure
Cloud
infrastructure services, also known as "Infrastructure as a Service (IaaS)", deliver computer infrastructure - typically a platform virtualization
environment - as a service. Rather than purchasing servers, software,
data-center space or network equipment, clients instead buy those resources as
a fully outsourced service. Suppliers typically bill such services on a utility computing basis and amount of resources consumed (and therefore the
cost) will typically reflect the level of activity. IaaS evolved from virtual private server
offerings.[44]
Cloud
infrastructure often takes the form of a tier 3 data center with many
tier 4 attributes, assembled from hundreds
of virtual machines.
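The utility-billing model behind IaaS can be sketched with a toy calculation (resource names and rates here are invented for illustration): the bill is simply the sum of metered consumption times a per-unit rate, so cost tracks the level of activity rather than the hardware owned.

```python
# Hypothetical per-unit rates, in cents.
RATES = {
    "instance_hours": 10,      # per virtual-machine hour
    "storage_gb_months": 15,   # per GB-month stored
    "bandwidth_gb": 12,        # per GB transferred
}


def utility_bill(usage):
    """Sum metered usage times rate over every resource consumed."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())


# A month with one always-on small VM, some storage and traffic:
usage = {"instance_hours": 720, "storage_gb_months": 50, "bandwidth_gb": 100}
print(utility_bill(usage) / 100, "dollars")  # 720*10 + 50*15 + 100*12 cents
```

An idle customer pays little, a busy one pays more, and neither buys servers, data-center space or network equipment up front.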
Server
The servers
layer consists of computer hardware and/or computer software products that are specifically designed for the delivery of
cloud services, including multi-core processors, cloud-specific operating
systems and combined offerings.[35][45][46][47]
Deployment models
Cloud computing types
Public cloud
Public
cloud or external cloud describes
cloud computing in the traditional mainstream sense, whereby resources are
dynamically provisioned on a fine-grained, self-service basis over the
Internet, via web applications/web services, from an off-site third-party provider who bills on a
fine-grained utility computing basis.[26]
Community cloud
A community
cloud may be established where several organizations have similar
requirements and seek to share infrastructure so as to realize some of the
benefits of cloud computing. With the costs spread over fewer users than a public
cloud (but more than a single tenant) this option is more expensive but may
offer a higher level of privacy, security and/or policy compliance. An example of a community cloud is Google's "Gov Cloud".
Hybrid cloud
There is
some confusion over the term "hybrid" when applied to the cloud - a
standard definition of the term "Hybrid Cloud" has not yet emerged.
The term "hybrid cloud" has been used to mean either two separate
clouds joined together (public, private, internal or external), or a
combination of virtualized cloud server instances used together with real
physical hardware. The most correct definition of the term "hybrid
cloud" is probably the use of physical hardware and virtualized cloud
server instances together to provide a single common service. Two clouds that
have been joined together are more correctly called a "combined
cloud".
A combined
cloud environment consisting of multiple internal and/or external providers
"will be typical for most enterprises". By
integrating multiple cloud services users may be able to ease the transition to
public cloud services while avoiding issues such as PCI compliance.
Another
perspective on deploying a web application in the cloud is using Hybrid Web
Hosting, where the hosting infrastructure is a mix between cloud hosting and managed dedicated servers - this is most commonly achieved as part of a web cluster
in which some of the nodes are running on real physical hardware and some are
running on cloud server instances.
A hybrid
storage cloud uses a combination of public and private storage clouds. Hybrid
storage clouds are often useful for archiving and backup functions, allowing
local data to be replicated to a public cloud.
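The archiving-and-backup pattern described above can be sketched as follows (hypothetical classes, not a real product): every write lands in the private store and is replicated to a public cloud, so a lost local copy can be restored from the replica.

```python
class Store:
    """A bare key-value store standing in for one storage cloud."""
    def __init__(self, name):
        self.name = name
        self.objects = {}

    def put(self, key, data):
        self.objects[key] = data


class HybridStorage:
    """A hybrid storage cloud: private primary, public replica."""
    def __init__(self, private, public):
        self.private = private
        self.public = public

    def put(self, key, data):
        self.private.put(key, data)   # primary copy stays local
        self.public.put(key, data)    # replica goes to the public cloud

    def restore(self, key):
        # Disaster recovery: fall back to the public replica if needed.
        return self.private.objects.get(key) or self.public.objects.get(key)


hybrid = HybridStorage(Store("on-premises"), Store("public-cloud"))
hybrid.put("report.doc", b"quarterly figures")
del hybrid.private.objects["report.doc"]   # simulate local data loss
print(hybrid.restore("report.doc"))        # recovered from the public replica
```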
Private cloud
Douglas
Parkhill first described the concept of a "private computer utility"
in his 1966 book The Challenge of the Computer Utility. The idea was
based upon direct comparison with other industries (e.g. the electricity
industry) and the extensive use of hybrid supply models to balance and mitigate
risks.
Private
cloud and internal cloud have been
described as neologisms; however, the concepts themselves
pre-date the term cloud by 40 years. Even within modern utility
industries, hybrid models still exist despite the formation of reasonably
well-functioning markets and the ability to combine multiple providers.
Some
vendors have used the terms to describe offerings that emulate cloud computing
on private networks. These (typically virtualization automation) products offer the ability to host applications or virtual
machines in a company's own set of hosts. These provide the benefits of utility
computing - shared hardware costs, the ability to recover from failure, and the
ability to scale up or down depending upon demand.
Private
clouds have attracted criticism because users "still have to buy, build,
and manage them" and thus do not benefit from lower up-front capital costs
and less hands-on management,[51] essentially "[lacking] the economic model that makes cloud
computing such an intriguing concept".[54][55] Enterprise IT organizations use their own private cloud(s)
for mission critical and other operational systems to protect critical infrastructures.
Cloud engineering
Cloud
engineering is the application of a systematic, disciplined, quantifiable, and
interdisciplinary approach to the ideation, conceptualization, development,
operation, and maintenance of cloud computing, as well as the study and applied
research of the approach, i.e., the application of engineering to cloud. It is
a maturing and evolving discipline to facilitate the adoption, strategization,
operationalization, industrialization, standardization, productization,
commoditization, and governance of cloud solutions, leading towards a cloud
ecosystem. Cloud engineering is also known as cloud service engineering.
Cloud storage
Cloud
storage is a model of networked computer data storage where
data is stored on multiple virtual servers, generally hosted by third parties,
rather than being hosted on dedicated servers. Hosting companies operate large data centers, and people who
require their data to be hosted buy or lease storage capacity from them and use
it for their storage needs. The data center operators, in the background, virtualize the resources according to the
requirements of the customer and expose them as virtual servers, which the
customers can themselves manage. Physically, the resource may span across
multiple servers.
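The virtualization step described above can be sketched with a toy example (hypothetical and greatly simplified): the customer writes to what looks like a single volume, while the data is actually striped in fixed-size chunks across several physical servers.

```python
class VirtualVolume:
    """One logical volume whose data physically spans several servers."""
    def __init__(self, physical_servers, chunk_size=4):
        self.servers = {s: {} for s in physical_servers}
        self.names = list(physical_servers)
        self.chunk_size = chunk_size

    def write(self, key, data):
        # Stripe fixed-size chunks round-robin across the physical servers.
        for i in range(0, len(data), self.chunk_size):
            server = self.names[(i // self.chunk_size) % len(self.names)]
            self.servers[server][(key, i)] = data[i:i + self.chunk_size]

    def read(self, key):
        # Gather the chunks from every server and reassemble by offset,
        # so the caller sees one contiguous object.
        chunks = []
        for store in self.servers.values():
            for (k, offset), chunk in store.items():
                if k == key:
                    chunks.append((offset, chunk))
        return b"".join(c for _, c in sorted(chunks))


vol = VirtualVolume(["node-1", "node-2", "node-3"])
vol.write("photo", b"0123456789abcdef")
print(vol.read("photo") == b"0123456789abcdef")  # True: one logical object
```

Real systems add replication, metadata services and failure handling, but the customer-facing abstraction is the same: a virtual server they manage, backed by resources they never see.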
The Intercloud
The Intercloud is an interconnected global "cloud of clouds" and
an extension of the Internet "network of networks" on
which it is based.[60] The term was first used in the context of cloud computing
in 2007 when Kevin Kelly stated that "eventually we'll
have the intercloud, the cloud of clouds. This Intercloud will have the
dimensions of one machine comprising all servers and attendant cloudbooks on the planet." It became popular in 2009 and has also been used to describe the datacenter of the
future.
The
Intercloud scenario is based on the key concept that each single cloud does not
have infinite physical resources. If a cloud saturates the computational and storage resources of its virtualization infrastructure, it would be unable to satisfy further requests for service allocations sent from its clients. The Intercloud scenario aims to address such situations: in theory, each cloud can use the computational and storage resources of the virtualization infrastructures of other clouds. Such a form of pay-for-use may introduce new business opportunities among cloud providers, if they manage to move beyond the theoretical framework. Nevertheless, the Intercloud raises many more challenges than solutions concerning cloud federation, security, interoperability, quality of service, vendor lock-in, trust, legal issues, monitoring and billing.
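The delegation idea can be sketched as follows (a hypothetical toy, ignoring the unsolved federation, security and billing issues just listed): a cloud that cannot satisfy an allocation from its own finite capacity forwards the request to a peer.

```python
class Cloud:
    """A cloud with finite capacity and federated peers to fall back on."""
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.peers = []

    def allocate(self, units):
        if units <= self.capacity:
            self.capacity -= units
            return self.name
        # Physical resources are finite: try the federated peers instead
        # of rejecting the client's request outright.
        for peer in self.peers:
            if units <= peer.capacity:
                peer.capacity -= units
                return peer.name
        raise RuntimeError("no cloud in the federation can serve this request")


a = Cloud("cloud-a", capacity=10)
b = Cloud("cloud-b", capacity=100)
a.peers.append(b)
print(a.allocate(8))    # cloud-a serves it locally
print(a.allocate(50))   # cloud-a is saturated; cloud-b serves it
```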
The
concept of a competitive utility computing market which combined many computer
utilities together was originally described by Douglas Parkhill in his 1966
book, The Challenge of the Computer Utility. This concept has been
subsequently used many times over the last 40 years and is identical to the
Intercloud.
Issues
Privacy
The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program working with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, cause uncertainty among privacy advocates over the greater powers such arrangements give telecommunication companies to monitor user activity.[63] While there have been efforts (such as US-EU Safe Harbor) to "harmonize" the legal environment, providers
such as Amazon still cater to major markets
(typically the United States and the European Union) by deploying local infrastructure and allowing customers
to select "availability zones."[64]
Compliance
To comply with regulations including FISMA, HIPAA and SOX
in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid
deployment modes which are typically more expensive and may offer restricted
benefits. This is how Google is able to "manage and meet additional
government policy requirements beyond FISMA"[65][66] and Rackspace Cloud are able to claim PCI compliance.[67] Customers in the EU contracting with cloud providers
established outside the EU/EEA have to adhere to the EU regulations on export
of personal data.[68]
Many
providers also obtain SAS 70 Type II certification (e.g. Amazon, Salesforce.com, Google and Microsoft), but this has been criticised on the grounds that the hand-picked set
of goals and standards determined by the auditor and the auditee are often not
disclosed and can vary widely. Providers typically make this information
available on request, under non-disclosure agreement.
Legal
In March
2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark
77,139,082) in the United States. The
"Notice of Allowance" the company received in July 2008 was canceled
in August, resulting in a formal rejection of the trademark application less
than a week later. Since 2007, the number of trademark filings covering cloud
computing brands, goods and services has increased at an almost exponential
rate. As companies sought to better position themselves for cloud computing
branding and marketing efforts, cloud computing trademark filings increased by
483% between 2008 and 2009. In 2009, 116 cloud computing trademarks were filed,
and trademark analysts predict that over 500 such marks could be filed during
2010.
Other
legal cases may shape the use of cloud computing by the public sector. On
October 29, 2010, Google filed a lawsuit against the U.S. Department of
Interior, which opened up a bid for software that required that bidders use
Microsoft's Business Productivity Online Suite. Google sued, calling the
requirement "unduly restrictive of competition." Scholars have
pointed out that, beginning in 2005, the prevalence of open standards and open source may have an impact on the way that public entities choose
to select vendors.
Open source
Open source software has
provided the foundation for many cloud computing implementations.[78] In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3
intended to close a perceived legal loophole associated with free software designed to be run over a network.[79]
Open standards
Most cloud
providers expose APIs which are typically well-documented (often under a Creative Commons license[80]) but also unique to their implementation and thus not
interoperable. Some vendors have adopted others' APIs[81] and there are a number of open standards under development,
including the OGF's Open Cloud Computing Interface. The Open Cloud Consortium (OCC)[82] is working to develop consensus on early cloud computing
standards and practices.
Security
The
relative security of cloud computing services is a contentious issue that may be delaying their adoption.[83] Resistance stems in large part from unease in both the private and public sectors about the external management of security-based services: by their very nature, cloud computing services, private or public, promote external management of the services provided. This gives cloud computing service providers a strong incentive to make building and maintaining strong management of secure services a priority.[84]
Organizations have been formed to provide standards for cloud computing services. One such organization, the Cloud Security Alliance, is a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing.
Availability and performance
In
addition to concerns about security, businesses are also worried about
acceptable levels of availability and performance of applications hosted in the
cloud.
There are
also concerns about a cloud provider shutting down for financial or legal
reasons, which has happened in a number of cases.
Sustainability and siting
Although
cloud computing is often assumed to be a form of "green computing", there is as yet no published study to
substantiate this assumption. Siting the servers affects the environmental
effects of cloud computing. In areas where climate favors natural cooling and
renewable electricity is readily available, the environmental effects will be
more moderate. Thus countries with favorable conditions, such as Finland, Sweden
and Switzerland, are trying to attract cloud computing data centers.
SmartBay, a marine research infrastructure of sensors and computational technology, is
being developed using cloud computing, an emerging approach to shared
infrastructure in which large pools of systems are linked together to provide
IT services.
Research
A number
of universities, vendors and government organizations are investing in research
around the topic of cloud computing.[92] Academic institutions include University of Melbourne
(Australia), Georgia Tech, Yale, Wayne State, Virginia Tech, University of
Wisconsin–Madison, Carnegie Mellon, MIT, Indiana University, University of
Massachusetts, University of Maryland, IIT Bombay, North Carolina State
University, Purdue University, University of California, University of
Washington, University of Virginia, University of Utah, University of
Minnesota, among others.
Joint
government, academic and vendor collaborative research projects include the
IBM/Google Academic Cloud Computing Initiative (ACCI). In October 2007 IBM
and Google announced the multi-university project designed to enhance
students' technical knowledge to address the challenges of cloud computing. In
April 2009, the National Science Foundation joined the ACCI and awarded approximately $5 million in
grants to 14 academic institutions.
In July
2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data center,
open source test bed, called Open Cirrus, designed to encourage research into all aspects of cloud
computing, service and data center management. Open Cirrus partners include the
NSF, the University of Illinois (UIUC), Karlsruhe Institute of Technology, the
Infocomm Development Authority (IDA) of Singapore, the Electronics and
Telecommunications Research Institute (ETRI) in Korea, the Malaysian Institute
for Microelectronic Systems (MIMOS), and the Institute for System Programming at
the Russian Academy of Sciences (ISPRAS). In Sept. 2010, more researchers
joined the HP/Intel/Yahoo Open Cirrus project for cloud computing research. The
new researchers are China Mobile Research Institute (CMRI), Spain's
Supercomputing Center of Galicia (CESGA by its Spanish acronym), Georgia Tech's
Center for Experimental Research in Computer Systems (CERCS) and China Telecom.
In July
2010, HP Labs India announced a new cloud-based technology designed to simplify
taking content and making it mobile-enabled, even from low-end devices.[101] Called SiteonMobile, the new technology is designed for
emerging markets where people are more likely to access the internet via mobile
phones rather than computers.[102] In Nov. 2010, HP formally opened its Government Cloud
Theatre, located at the HP Labs site in Bristol, England.[103] The demonstration facility highlights high-security, highly
flexible cloud computing based on intellectual property developed at HP Labs.
The aim of the facility is to lessen fears about the security of the cloud. HP
Labs Bristol is HP’s second-largest central research location and currently is
responsible for researching cloud computing and security.[104]
The IEEE
Technical Committee on Services Computing in the IEEE Computer Society sponsors the IEEE International Conference on Cloud Computing (CLOUD). CLOUD 2010 was held on July 5–10, 2010, in Miami, Florida.
Criticism of the term
Some have
come to criticize the term as being either too unspecific or even misleading.
CEO Larry Ellison of Oracle Corporation asserts
that cloud computing is "everything that we already do", claiming
that the company could simply "change the wording on some of our ads"
to deploy their cloud-based services.[107][108][109][110][111] Forrester Research VP Frank
Gillett questions the very nature of and motivation behind the push for cloud
computing, describing what he calls "cloud washing" in the industry
whereby companies relabel their products as cloud computing, resulting in a lot of marketing innovation on top of real innovation. GNU's Richard Stallman insists
that the industry will only use the model to deliver services at ever
increasing rates over proprietary systems, otherwise likening it to a
"marketing hype campaign".