
Many people believe that cloud computing is a fairly recent development, one that has gained traction since 2006 thanks to Amazon's EC2, but they may not recognize that it is headed toward becoming a utility much like water and electricity. The idea of computing as a utility, however, is not new.

John McCarthy, an American computer scientist, is said to have first publicly suggested that computer time-sharing might be sold as though it were a utility, much like the telephone system, while speaking at the MIT centennial in 1961. In the years that followed, many companies, including IBM®, HP, and Sun, participated in early renditions of this concept by offering computing power and storage in various forms.

The popularity of the concept is inevitable. Good IT practice dictates that technology be put in place to strategically serve business needs rather than create distraction, even as that technology becomes more complex. The need to focus on the business therefore drives the simplification of computing resource management wherever practicable.

Enter cloud computing, with its automated virtualization of computing resources. On demand, IT staff can use the network to remotely assemble computing resources at just about any scale. Because the business simply pays for the resources it needs, as it needs them, there is no longer a requirement to order equipment, set it up, provide constant power, keep it cool, back it up, and deal with depreciation.

Even though the mainframes of old and the increasingly powerful servers of today have been significant enablers when coupled with reliable networking, building a cloud is still difficult and proprietary. Because of these complications and the lack of standards, clouds are usually not interoperable. That is a significant hindrance to the use of clouds as a utility in any sense similar to other utilities such as electricity and phone service.

While businesses can build applications and deploy them rapidly and economically in clouds, portability to other clouds can be a risk factor. Apple®, Microsoft®, and Google cloud applications all currently work only in their own clouds, no doubt by design at this point. Porting them to another provider's cloud would certainly take some measure of concerted effort.

The global Intercloud has been conceived with this in mind. With a standard layered model for building cloud infrastructures, the likelihood that several cloud providers around the world could interchangeably host an application would increase many times over. Ubiquitous, interoperable clouds would also enable big data and other applications that were previously too costly to run because of the massive computing resources they demand.

The IEEE and the IEEE Computer Society have been working toward standards through IEEE P2301™ for cloud portability and commonality and IEEE P2302™ for cloud-to-cloud interoperability. The vision of P2302 states it well:

“The vision is an analogy with the Internet itself: in a world of TCP/IP and the WWW, data is ubiquitous and interoperable in a network of networks known as the ‘Internet’; in a world of Cloud Computing, content, storage and computing is ubiquitous and interoperable in a network of Clouds known as the ‘Intercloud.’”

The standards are meant to address everything required to allow clouds to become utilities, perhaps within a decade or more. For example, they envision Intercloud Root providers that serve as brokers hosting Cloud Resource Catalogs, much as DNS and Top Level Domains are governed now. These roots would also serve as Trust Authorities.
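
To make the DNS analogy concrete, here is a minimal sketch of how a root broker might resolve a resource name against a Cloud Resource Catalog, much the way a DNS resolver looks up a host name. Every name, field, and address below is invented for illustration and is not taken from the IEEE drafts.

```python
# Hypothetical sketch: an Intercloud Root resolving a request against a
# Cloud Resource Catalog, much as a DNS resolver looks up a host name.
# All identifiers and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    provider: str         # cloud provider identifier
    gateway: str          # address of the provider's Intercloud gateway
    resources: dict       # advertised capabilities
    trust_authority: str  # root that vouches for this provider

# The root broker hosts the catalog, much as a registry hosts a TLD zone.
CATALOG = {
    "compute.cloud-a.example": CatalogEntry(
        provider="cloud-a",
        gateway="xmpp://gw.cloud-a.example",
        resources={"vcpus": 512, "storage_tb": 100},
        trust_authority="root.intercloud.example",
    ),
}

def resolve(name: str) -> CatalogEntry:
    """Return the catalog entry for a resource name, or fail loudly."""
    try:
        return CATALOG[name]
    except KeyError:
        raise LookupError(f"no catalog entry for {name!r}")

print(resolve("compute.cloud-a.example").gateway)
```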

The standards also provide for Intercloud Exchanges that facilitate the negotiation dialog and collaboration among clouds. These exchanges would broker what is described as “Domain-Based Trust,” matching cloud providers with the desired cloud resources at whatever level of trust may be required.
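
As a rough sketch of that brokering role, the example below reduces Domain-Based Trust to a single numeric level and filters provider offers against both a resource need and a trust floor. The levels, offers, and matching rule are assumptions made for illustration; P2302 does not specify them this way.

```python
# Hypothetical sketch: an Intercloud Exchange matching a request to
# providers that meet both the resource need and a minimum trust level.
from dataclasses import dataclass

@dataclass
class ProviderOffer:
    provider: str
    vcpus: int
    trust_level: int  # invented scale: 1 = self-asserted, 3 = audited

OFFERS = [
    ProviderOffer("cloud-a", vcpus=256, trust_level=3),
    ProviderOffer("cloud-b", vcpus=1024, trust_level=1),
]

def broker(required_vcpus: int, min_trust: int) -> list[ProviderOffer]:
    """Return the offers that satisfy the need at the required trust."""
    return [o for o in OFFERS
            if o.vcpus >= required_vcpus and o.trust_level >= min_trust]

# Only cloud-a qualifies: cloud-b has the capacity but not the trust.
print(broker(required_vcpus=200, min_trust=2))
```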

Intercloud-capable clouds would communicate with each other via gateways that are analogous to Internet routers, using established protocols for Intercloud interoperability. The Extensible Messaging and Presence Protocol (XMPP) is described as a potential transport protocol; it can authenticate peers using the Simple Authentication and Security Layer (SASL) and encrypt traffic using Transport Layer Security (TLS).
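
To give a feel for what gateway-to-gateway messaging over XMPP could look like, here is a minimal sketch using the third-party slixmpp library, which negotiates TLS and then SASL automatically during stream setup. The JIDs, password, and JSON payload are invented for the example; the actual negotiation stanzas would be defined by the standard.

```python
# Minimal sketch of one Intercloud gateway messaging another over XMPP.
# slixmpp handles STARTTLS and SASL authentication during connection.
from slixmpp import ClientXMPP

class GatewaySketch(ClientXMPP):
    def __init__(self, jid, password, peer):
        super().__init__(jid, password)
        self.peer = peer
        self.add_event_handler("session_start", self.start)

    async def start(self, event):
        self.send_presence()
        await self.get_roster()
        # A real exchange would carry standardized negotiation stanzas;
        # a JSON body stands in for one here.
        self.send_message(mto=self.peer,
                          mbody='{"request": "vcpus", "count": 256}',
                          mtype="chat")
        self.disconnect()

xmpp = GatewaySketch("gw@cloud-a.example", "secret", "gw@cloud-b.example")
xmpp.connect()               # TLS, then SASL, are negotiated here
xmpp.process(forever=False)  # run until the gateway disconnects
```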

The developing standards describe additional security measures and protocols comparable to extensions found in IMAP and POP3. Sample communication exchanges in the draft clarify how this would all work and how automation among clouds would be achieved.

These standards, and the work that will follow in building out infrastructure, mean that the data centers of the future are very likely to be built on cloud computing technologies. Not only that, but they should no longer be proprietary and individually designed, but rather built according to accepted standards, much as the ISO OSI network model of the 1980s so successfully underpinned the growth of networking as we now know it. By then, cloud computing should be well on its way to becoming a utility we will barely give a second thought.

Let us know in the comments below: Would you add cloud computing to your list of utility bills?

Photo by Steve Snodgrass, taken from Flickr Creative Commons