January 3, 2016
The world is rapidly moving toward the cloud and away from traditional IT systems. Browser-based security using HTTPS has matured and is now accepted by customers as “secure.” In recent years, many browser security flaws have been identified and fixed. A major industry shift occurred when some browsers were created and maintained as open-source projects, which improved security by putting a community of engineers to work reviewing and updating the code on an ongoing basis at no cost. Because some current browsers are not controlled by private corporations, this also creates a key element of trust.
Getting to this point has been a long journey, with a number of important industry and technology events helping to make cloud computing possible and secure. For example, public network bandwidth was overbuilt for about a decade leading up to 2001, and all of that capacity helped pave the way for lower prices and higher bandwidth. Beginning around 1975 and extending for roughly four decades, chip performance increases followed “Moore’s Law,” which delivered enormous amounts of processing power at very low prices. A great deal of capital was also allocated to the security industry, forming new companies that launched security products such as firewalls, antivirus solutions and penetration testing services.
Starting around 2000, software vendors began writing web applications architected from the ground up to run over the Internet. Netscape had gone public in 1995, and Exodus Communications was valued at $37 billion in 2000. These events converged to fuel the rise of hosting providers offering dedicated servers for website hosting, along with co-location services that let companies rent space, power, network bandwidth and more to manage their own servers.
This was all a precursor to managed and shared hosting, which started the cloud revolution. When service providers combined low-cost, high-performance hardware, inexpensive bandwidth, firewalls and new virtualization software like VMware, the “cloud” was born. The ability to slice a physical server into any number of virtual servers was a sea change and an inflection point for the industry. Amazon Web Services (AWS) launched its first cloud service in 2004.
Today, economics are driving decisions to move to the cloud. If you are the CIO of a $500 million company, you may have a $15 million annual IT budget. You probably spend 50 percent of that budget on people, while a significant portion of the remainder goes to buying hardware and software, in essence trying to replicate a data center as part of your IT infrastructure. It would be nearly impossible for your team to achieve the same level of security, redundancy and uptime that you could purchase from a cloud hosting provider and from commercially developed cloud applications.
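As a back-of-the-envelope sketch, the budget split described above works out as follows. All figures are the hypothetical numbers from the text, not real company data:

```python
# Illustrative split of the hypothetical CIO budget from the article.
annual_budget = 15_000_000   # $15M annual IT budget for a $500M company
people_share = 0.50          # roughly half spent on staff

people_cost = annual_budget * people_share
remaining = annual_budget - people_cost   # hardware, software, data-center spend

print(f"People: ${people_cost:,.0f}")     # $7,500,000
print(f"Everything else: ${remaining:,.0f}")  # $7,500,000
```

Even in this simple sketch, roughly $7.5 million a year is left to cover everything a cloud provider amortizes across thousands of customers.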
Two decades ago, if offices within a company were connected, it was done via leased lines. Today, it is most likely done via the Internet. Data is flowing over the unsecured public data highway, so security is critical, particularly as more workers switch to remote and mobile work. Infrastructure and applications are exposed to the outside world.
At this point in the cloud evolution, most new cloud 2.0 applications are architected specifically for the cloud. This means performance is better and response times are shorter than with the first generation of cloud applications, which were just old client/server applications retrofitted with web interfaces.
Around 2013, Moore’s Law started to run into the constraints of the laws of physics. As transistors shrink to very small sizes, they become less reliable. Consumers of computing power have enjoyed riding a wave of inexpensive computing power in increasingly smaller devices. The cost to research, test and manufacture each new generation of chips, however, has moved in the opposite direction: those costs have steadily increased and are anticipated to keep rising exponentially over time. This trend is better known as “Rock’s Law.”
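Rock’s Law is commonly stated as the cost of a semiconductor fabrication plant doubling roughly every four years. The exponential growth it describes can be sketched like this; the base year, base cost and doubling period below are illustrative assumptions, not sourced figures:

```python
def fab_cost(year, base_year=2000, base_cost_billion=2.0, doubling_years=4):
    """Projected fab cost (in billions of dollars) under Rock's Law:
    cost doubles every `doubling_years` years from an assumed base."""
    return base_cost_billion * 2 ** ((year - base_year) / doubling_years)

for year in (2000, 2008, 2016):
    print(year, round(fab_cost(year), 1))
# 2000 -> 2.0, 2008 -> 8.0, 2016 -> 32.0
```

The point of the sketch is the shape of the curve, not the specific dollar amounts: each generation of chips costs multiples of the last to bring to market, which is the opposite of the price trend consumers see.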
Strangely enough, companies like Facebook, Google and Amazon are creating a tectonic shift in the chip industry as they demand different chips to power the next generation of cloud computing. In essence, the chips they demand are required to analyze massive amounts of data rather than just compute. The chip companies are investing in this new generation of R&D and manufacturing and leaving behind the legacy of Moore’s Law. Given this significant industry sea change, it makes even less sense for a CIO to invest in creating a standalone data center. It makes economic sense to ride the new tsunami wave of the cloud.