Cloud computing has become a common fixture in the business community. Most companies have seen the benefits of the cloud and adopted it for their own operations. To say the cloud has grown in popularity in just a few years would be an understatement: according to a recent survey from RightScale, 88 percent of businesses use some form of public cloud, while 63 percent have adopted a form of private cloud. There’s little question that the cloud is here to stay, but even though cloud computing is widely used, many still don’t understand the nuances of the technology or the history of how it came to be. Learning more about the cloud is a fruitful exercise that helps organizations of all types and sizes make better use of the technology. And it’s this understanding of the cloud’s history that can help businesses prepare for what the future holds.
So what is cloud computing, and where did it originate? While sources differ on an exact starting point, many experts agree that the concept of the cloud emerged not long after computers were first invented. In the 1950s, people were still adjusting to the new computing technology, and many technical experts were experimenting with new ideas. One of those ideas was mainframe computing, in which multiple users accessed a single computing resource from separate terminals or endpoints. Those terminals acted essentially as entry points to the mainframe and nothing else. While this concept is credited with resembling what cloud computing would become, the term was not used at the time. What kept mainframe computing from taking off was its high cost, which made it impractical for most businesses. The idea was simply too far ahead of its time to be of much use to the world when it was first conceived.
Development toward what would be known as cloud computing continued unabated over the next few decades. One of the key components of the cloud — virtualization — advanced significantly during this time period. Around 1970, virtual machines were first created. This technology allowed multiple operating systems to run at the same time inside the same physical environment. The growth and proliferation of this technology would prove crucial in the development of cloud computing later on.
By now it’s easy to see where some of the technology that led to the cloud came from, but what about the term itself? That can be traced to the 1990s, and more specifically to the computer programmers, scientists, and engineers who needed a concise way to represent a network. The network in this case referred to the computers, servers, storage devices, and other hardware that were simply “out there.” In early patent diagrams, the network was usually drawn as an amorphous, bubble-like blob, indicating an area that was outside the concern of the programmers and scientists. Over time, that blob slowly took on a shape that more closely resembled a cloud. This led many computer experts to refer to networks, data centers, and servers as “the cloud” or “cloud computing.” The first official use of the term was likely in 1996 by Compaq.
So now we can pinpoint where the cloud came from, both in its technology and its name. During the 90s, though, it remained mainly a niche concept, one that hadn’t yet entered the public consciousness. It wasn’t until one monumental event that organizations began to look into new and alternative technologies: the bursting of the dot-com bubble in 2000. Suddenly, internet businesses were struggling, if they survived at all. The economy plunged into a recession, and things looked bleak for everyone in the tech community. These dire circumstances forced internet companies to look elsewhere for business solutions, ones that would enable them to grow their operations while saving money in the long run and promoting stability.
That’s when many surviving internet businesses turned to the cloud. The movement was small at first and mostly experimental, requiring companies to modernize their IT architecture while adopting more efficient technologies. Cloud computing seemed to be the best solution to the problems many organizations were encountering, but it still wasn’t widely available. One of the great shifts in the market came in 2006, when Amazon officially entered the cloud market and became a pioneer in the technology. The company launched Amazon Web Services (AWS), which provided other companies with rented servers they could use for their own operations. Many other businesses quickly caught on, choosing to use this Infrastructure-as-a-Service (IaaS) at an affordable price. This early entry into the cloud market helped Amazon shape cloud computing and turned the company into a major player with a sizeable lead that it still holds to this day.
Though Amazon was successful in turning the cloud into a more identifiable technology, many industry insiders, experts, and businesses were still unsure about it. Disagreements were common, with some saying the cloud represented the future of how businesses would operate, while others dismissed it as a buzzword with little substance that would soon go the way of other flash-in-the-pan fads. Even those who took the technology seriously worried about the consequences of upending the dynamics of IT and the impact on budgets. They recognized the cloud’s revolutionary potential but wondered whether the disruption it caused would be worth it in the end. The debate has remained intense over the years and extends even to today.
For all the concerns over the cloud at the time, many large tech companies saw the potential as well and wanted to get in on the action. Tech giants like Google, Microsoft, and Cisco Systems quickly entered the cloud market with offerings and services of their own. As the number of vendors increased, so did the competition. Many companies were still wary of fully adopting the cloud, but they were at least considering the possibility as the number of options multiplied every year.
By the time 2009 rolled around, other cloud solutions had come to the forefront. This became known as the Open Source Cloud Movement, in which open source cloud platforms were developed to let companies build their own private clouds. The movement also contributed to growing demand for hybrid clouds, which combine aspects of both private and public clouds. At the same time, cloud computing reached new heights of popularity, with more companies using it than ever before. Now, numerous services of varied types are offered, and startup cloud companies are a common sight.
Now that we know where the cloud came from and where it is right now, what does the future have in store? While much of predicting the future is speculation, experts have still made some good guesses. For one thing, cloud growth is expected to continue for at least the next few years. Surveys show that cloud customers have been largely satisfied with the services and products they’ve received, making them likely to keep using the cloud. Cloud computing is also expected to spread internationally as the internet becomes available to parts of the world that don’t currently have it.
At the same time, a number of issues surrounding the cloud will likely crop up. One of the biggest concerns is bandwidth and latency: as internet usage continues to rise, so will the amount of bandwidth consumed, making it difficult to keep up with demand. That will slow the transmission of information over the web, threatening the efficacy of the cloud. This, along with other factors, will likely spur the development of newer technologies, ones that might replace or at least modify cloud computing as we know it. Among them are fog computing and edge computing (sometimes known as machine-to-machine computing), which bypass the transmission of data to distant servers and keep much of the computing at the endpoints. This is especially important as the Internet of Things continues to grow and more devices connect to the web.
Cloud computing has a long and fascinating history, but it’s only in the last decade that the technology has really become mainstream. Right now, it’s at the height of its popularity, but growth is not only possible, it’s likely. The way the cloud has transformed businesses in just a few years has been impressive, and more change is on the way. The future also holds exciting possibilities for the cloud, and whether that means modifications to the technology or the growth of alternatives, we’ll likely be better off because of it. The cloud has a lot to offer still, and for now it feels like we’ve only reached a turning point in its historical journey.