Utility computing, or computer utility, is a service provisioning model in which a service provider makes computing resources and infrastructure management available to the customer as needed, and charges them for specific usage rather than a flat rate. Like other types of on-demand computing (such as grid computing), the utility model seeks to maximize the efficient use of resources and/or minimize associated costs. Utility computing is the packaging of system resources, such as computation, storage and services, as a metered service. This model has the advantage of a low or no initial cost to acquire computer resources; instead, resources are essentially rented.
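As a rough illustration of the metered-service idea, the following sketch (with hypothetical resource names and prices, not any provider's actual billing scheme) contrasts a usage-based charge with a flat rate:

```python
# Minimal sketch of metered ("pay per use") billing versus a flat rate.
# The resource names and prices below are hypothetical, for illustration only.

RATES = {
    "cpu_hours": 0.05,          # price per CPU-hour
    "storage_gb_month": 0.02,   # price per GB-month of storage
    "requests": 0.000001,       # price per service request
}

def metered_bill(usage: dict) -> float:
    """Charge only for what was actually consumed."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

if __name__ == "__main__":
    usage = {"cpu_hours": 120, "storage_gb_month": 500, "requests": 2_000_000}
    print(f"Metered bill: ${metered_bill(usage):.2f}")  # scales with usage
    print("Flat rate:     $50.00")                      # same regardless of usage
```

The point of the sketch is simply that the customer's cost tracks measured consumption rather than a fixed subscription fee.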
This repackaging of computing services became the foundation of the shift to "on demand" computing, software as a service and cloud computing models that further propagated the idea of computing, application and network as a service.
There was some initial skepticism about such a significant shift.
However, the new model of computing caught on and eventually became mainstream.
IBM, HP and Microsoft were early leaders in the new field of utility computing, with their business units and researchers working on the architecture, payment and development challenges of the new computing model. Google, Amazon and others started to take the lead in 2008, as they established their own utility services for computing, storage and applications.
Utility computing can support grid computing, which is characterized by very large computations or sudden peaks in demand that are handled by a large number of computers.
"Utility computing" has usually envisioned some form of
virtualization, so that the amount of storage or computing power available is considerably larger than that of a single time-sharing computer. Multiple servers are used on the "back end" to make this possible. These might be a dedicated computer cluster specifically built for the purpose of being rented out, or even an under-utilized supercomputer. The technique of running a single calculation on multiple computers is known as distributed computing.
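As a toy sketch of running a single calculation on multiple computers, the example below splits one computation into chunks and farms them out to a pool of workers; local processes stand in for the many back-end servers of a utility provider, and all names are illustrative only:

```python
# Toy distributed computing sketch: one calculation (summing squares) is split
# into chunks handed to parallel workers, which simulate the back-end servers.
from multiprocessing import Pool

def sum_of_squares(chunk: range) -> int:
    """Work done by one node: sum the squares of its assigned chunk."""
    return sum(i * i for i in chunk)

def distribute(n: int, workers: int = 4) -> int:
    """Split the range 0..n-1 into roughly equal chunks, one per worker."""
    step = (n + workers - 1) // workers
    chunks = [range(start, min(start + step, n)) for start in range(0, n, step)]
    with Pool(workers) as pool:
        partial_results = pool.map(sum_of_squares, chunks)
    return sum(partial_results)

if __name__ == "__main__":
    # Same answer as on a single machine, computed by several workers in parallel.
    print(distribute(1_000_000))
```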
The term "grid computing" is often used to describe a particular form of distributed computing in which the supporting nodes are geographically distributed or cross administrative domains. To provide utility computing services, a company can "bundle" the resources of members of the public for sale; contributors might be paid with a portion of the revenue from clients.
One model, common among volunteer computing applications, is for a central server to dispense tasks to participating nodes, at the behest of approved end-users (in the commercial case, the paying customers). Another model, sometimes called the virtual organization (VO), is more decentralized, with organizations buying and selling computing resources as needed or as they go idle. A minimal sketch of the first, dispatcher-based model appears below.
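The sketch below illustrates the central-dispatcher model: an approved end-user submits work, the server hands tasks to participating nodes as they go idle, and results flow back. All class and method names are hypothetical; no particular middleware is implied.

```python
# Sketch of the central-dispatcher model used by many volunteer computing systems.
from collections import deque

class Dispatcher:
    def __init__(self, approved_users):
        self.approved_users = set(approved_users)
        self.pending = deque()   # tasks waiting for a node
        self.results = {}        # task id -> result

    def submit(self, user, task_id, payload):
        """An approved end-user (or paying customer) queues a task."""
        if user not in self.approved_users:
            raise PermissionError(f"{user} is not an approved end-user")
        self.pending.append((task_id, payload))

    def request_work(self):
        """A participating node calls this when it goes idle."""
        return self.pending.popleft() if self.pending else None

    def report(self, task_id, result):
        """A node returns the result of a finished task."""
        self.results[task_id] = result

if __name__ == "__main__":
    d = Dispatcher(approved_users={"alice"})
    d.submit("alice", "t1", payload=42)
    task = d.request_work()          # a node pulls a task
    d.report(task[0], task[1] ** 2)  # ...and reports its result
    print(d.results)                 # {'t1': 1764}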
The definition of "utility computing" is sometimes extended to specialized tasks, such as web services.
History
Utility computing simply means "pay and use" with regard to computing power. It is not a new concept, but has quite a long history. Among the earliest references is John McCarthy's suggestion, made at MIT's centennial celebration in 1961, that computing might someday be organized as a public utility, just as the telephone system is.
IBM and other mainframe providers conducted this kind of business in the following two decades, often referred to as time-sharing, offering computing power and database storage to banks and other large organizations from their worldwide data centers. To facilitate this business model, mainframe operating systems evolved to include process control facilities, security, and user metering. The advent of minicomputers changed this business model by making computers affordable to almost all companies. As Intel and AMD increased the power of PC-architecture servers with each new generation of processor, data centers became filled with thousands of servers.
In the late 1990s, utility computing resurfaced. InsynQ, Inc. launched on-demand applications and desktop hosting services in 1997 using HP equipment. In 1998, HP set up the Utility Computing Division in Mountain View, California, assigning former Bell Labs computer scientists to begin work on a computing power plant, incorporating multiple utilities to form a software stack. Services such as "IP billing-on-tap" were marketed. HP introduced the
Utility Data Center in 2001. Sun announced the
Sun Cloud service to consumers in 2000. In December 2005, Alexa launched the Alexa Web Search Platform, a Web search building tool for which the underlying power is utility computing. Alexa charges users for storage, utilization, etc. There is space in the market for specific industries and applications as well as other niche applications powered by utility computing. For example, PolyServe Inc. offers a
clustered file system based on commodity server and storage hardware that creates highly available utility computing environments for mission-critical applications, including Oracle and Microsoft SQL Server databases, as well as workload-optimized solutions specifically tuned for bulk storage, high-performance computing, and vertical industries such as financial services, seismic processing, and content serving. The Database Utility and File Serving Utility enable IT organizations to independently add servers or storage as needed, retask workloads to different hardware, and maintain the environment without disruption.
In spring 2006, 3tera announced its AppLogic service and later that summer Amazon launched Amazon EC2 (Elastic Compute Cloud). These services allow the operation of general-purpose computing applications. Both are based on Xen virtualization software, and the most commonly used operating system on the virtual computers is Linux, though Windows and Solaris are supported. Common uses include web applications, SaaS, image rendering and processing, but also general-purpose business applications.
See also
* Cloud computing
* Edge computing
* Computer bureau
External links
* How Utility Computing Works
* Utility computing definition