Why it's different this time

Moving to the Cloud is, at this point, an inevitability for any serious enterprise application providing mission-critical services.

I tend to ask why...a lot. As part of my current work to bring a SaaS offering to market, I wanted to figure out whether the current Cloud computing hype could be substantiated as more than just marketing by Cloud infrastructure companies. In my pursuit of answers I found a very interesting paper, written by D.A. Peak and M.H. Azadmanesh in 1997, that analyzed market evidence for the centralization and decentralization of computing. It appears to be one of the only scholarly attempts at identifying these trends and cycles. The work was likely spurred by the rise of the World Wide Web, but I find its research just as applicable to analyzing the growing market for Cloud computing.

Computing began as a centralized activity: "...one machine would suffice to solve all the problems that are demanded of it from the whole country" (Copeland). This 1946 quote from Sir Charles Darwin, grandson of the famous naturalist, summarizes the then-unfathomable power of the earliest machines. For the next 18 years, centralization of computing resources was simply a fact.

From 1946 to 1964, a large, centralized, and expensive mainframe was the only choice if you needed computing power. This created an emerging need for a smaller-footprint computing device that would be, above all, cheaper. The first cycle of decentralization had begun, and over the next 10 years these smaller "mini-computers" would improve enough that in 1974 mainframe sales stalled and began a fall that would last four years before hitting bottom. Interestingly, this growth was precipitated by a breakthrough technology that enabled mini-computers to be networked together. With this advancement it became possible to distribute computing needs across less expensive machines, which lowered the entry cost of computer automation.

In 1970, E.F. Codd, a researcher at IBM's San Jose Research Laboratory, published a paper entitled "A Relational Model of Data for Large Shared Data Banks", which described the relational database for the first time. Oracle, then Relational Software Inc. (RSI), would be the first to commercialize the technology in 1979. The promise of the relational database was tantalizing, and the two things companies were willing to pay for in order to exploit it were speed and storage. Mainframes could provide both where no other computing platform could. Over the next 10 years, peaking in 1988, corporate IT again focused on centralization.

Alongside the second major wave of centralization onto mainframes, another critical bit of research was being conducted by a man named Robert Metcalfe. Metcalfe was working at Xerox's Palo Alto Research Center, and his work included training military personnel on ARPANET, the world's first packet-switching network. Metcalfe would go on to found 3Com in 1979, and in 1983 he would succeed in gaining IEEE approval for the 802.3 standard, also known as Ethernet. This new networking technology set the stage for an explosive period of decentralization, spurred heavily, and in some ways made possible, by an Internet built on this standard. It also ushered in the era of hybrid computing, in which mainframes and other powerful centralized servers connected with clients to offer both centralized and decentralized computing solutions.

As previous cycles have shown, it is the emergence of critical technologies that creates the shift in the market from centralized to decentralized and vice-versa. Since the 1990s we have witnessed multiple new technologies that attempt to make hybrid computing more feasible, seeking a balance between centralized and decentralized. These technologies have grown in standardization and adoption, and they have formed the foundation of the Cloud computing platform.

Manifesting itself as the descendant of several other computing research areas such as Service-Oriented Architecture, distributed and grid computing, and virtualization, cloud computing inherits their advancements and limitations. [Youseff, 2008]

Google has spent more than $20B on its infrastructure since 2005, Microsoft has spent nearly $18B, and Amazon comes in at $12B. This spending marks the beginning of a new centralization cycle in computing, a trajectory that history has shown to be vitally important to achieving technology goals across many industries. Given that the Cloud has now become a formally layered architecture, supporting massively scalable processing power along with the benefits of centralization and outsourcing, moving to the Cloud looks more and more like the kind of sound hybrid computing decision we have come back to time and time again. With each cycle it is different, and compelling, driving the entire computing industry at once towards a common and accepted architecture. If history is a teacher, this level of investment towards a common industry goal will create a pattern that will be our foundation of computing for the next decade.

Peak, D.A., and M.H. Azadmanesh. "Centralization/Decentralization Cycles in Computing: Market Evidence." Information & Management 31, no. 6 (1997): 303-17.

Copeland, B. Jack. Colossus: The Secrets of Bletchley Park's Codebreaking Computers. Oxford: Oxford University Press, 2006.

Youseff, L., M. Butrico, and D. Da Silva. "Toward a Unified Ontology of Cloud Computing." Grid Computing Environments Workshop (GCE '08), November 2008: 1-10. doi:10.1109/GCE.2008.4738443.
