The Inevitability of the Cloud
Overview
After working on Microsoft Azure for over three years, I think I understand the cloud pretty well (as well as I can with a constantly changing technology!). I understand the benefits (cost, scalability, etc.), the migration drivers, and the goal of a digital transformation, but I hadn’t really thought much about how the cloud came to be. However, something clicked while reading one paragraph in Satya Nadella’s book Hit Refresh (Employee Edition), and I suddenly saw the cloud as an inevitable step in the evolution of computing. The cloud is more than a new technology; it is the platform that will enable the future of computing.
What exactly is the cloud?
Let’s start by defining what the cloud is. The cloud is a concept that has been implemented independently by multiple providers, including Microsoft, Amazon, and Google. Each cloud is separate, but you can often span them with VPNs (virtual private networks). To quote Microsoft, “Simply put, cloud computing is the delivery of computing services—servers, storage, databases, networking, software, analytics, and more—over the Internet (“the cloud”). Companies offering these computing services, cloud providers, typically charge for cloud computing services based on usage, similar to how you’re billed for water or electricity at home.” (see https://azure.microsoft.com/en-us/overview/what-is-cloud-computing/)
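To make the “services over the internet, billed by usage” idea concrete, here is a minimal sketch of consuming a cloud storage service from code. It assumes the Azure Storage Blobs SDK for Python (azure-storage-blob); the account URL and key are placeholders you would replace with values from your own storage account.

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

# Placeholder account URL and key -- substitute your own storage account values.
service = BlobServiceClient(
    account_url="https://<your-account>.blob.core.windows.net",
    credential="<your-account-key>",
)

# Create a container and upload a small file. The provider bills for the
# storage and transactions actually used, not for hardware you own.
container = service.get_container_client("example-data")
container.create_container()
container.upload_blob(name="hello.txt", data=b"Hello from the cloud!")
```

The point is that the “server” never enters the picture: you ask for a service over the internet, use as much or as little as you need, and pay accordingly.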
Below I break down the major phases in the history of computing and hope to show why the cloud was the logical next step after on-premises data centers. Note that these phases are simply my own organizational scheme; they often overlap and aren’t strictly consecutive.
Evolution of Digital Computing
Before Digital Computing
Computation, mathematical calculation, has been around since the discovery of numbers. Early analog computers include the abacus, astrolabe, and slide rule, followed by mechanical computers such as Pascal’s calculator (the Pascaline), the Thomas Arithmometer, and the landmark programmable looms of the early 1700s. While generally single-purpose, these devices were huge achievements in engineering and laid the foundation for modern computing.
The Rise of the Modern Computer
In 1936 Alan Turing wrote a ground-breaking paper describing what we now know as the modern computer, and shortly thereafter the first such computers were built. These computers were programmable – able to perform tasks detailed in specific algorithms, or programs – giving them the flexibility to do anything that could be expressed as a computer program. This opened the door for computers to be used in nearly every industry, but the extreme cost and extensive training requirements limited market adoption. Only the “big players” had computers, so programmers – highly trained computer specialists – often shared a machine by reserving time or by submitting jobs in batches and waiting for the results.
The Democratization of Computing
Multiple advancements in computer hardware, software, and human-computer interaction (HCI) combined to produce the personal computer (PC). Computers simultaneously became smaller, cheaper, and more user-friendly. Things we take for granted today – like an operating system, monitor, and mouse – made computers more accessible to non-techies. With the release of standard operating systems came a flood of applications aimed at both businesses (word processing, accounting, etc.) and consumers (games). As the use of computers expanded, the global volume of data began to grow rapidly.
Connecting Devices
No man is an island, and that goes for computers too. The solution to this isolation was computer networking. At first these were small, private, physical networks connecting devices within a single location (the client-server model). Over time networks grew to span geographic locations, added support for secure connections, shed the requirement for a physical connection, and culminated in the internet, which lets us share data instantly on a global scale. This mesh of omnipresent communication networks changed how we share, store, and access data. This is where the cloud began…
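For readers who have never seen the client-server model in code, here is a minimal sketch in Python: a tiny server listens on a socket and a client connects to it and exchanges a message. The host and port are arbitrary placeholders for a local demo; real networks layer naming, routing, and security on top of this basic pattern.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5050  # placeholder address for a local demo

def serve_once():
    """A tiny server: accept one client connection and echo its message back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"Server received: " + data)

# Start the server in the background, then act as the client.
threading.Thread(target=serve_once, daemon=True).start()
time.sleep(0.5)  # give the server a moment to start listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"Hello from the client")
    print(cli.recv(1024).decode())
```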
The Internet of Things (IoT)
Devices have been talking to computers for decades (SCADA-style supervisory control systems have been around since the 1950s), but the release of the modern smartphone in 2007 accelerated the path to near-constant connection of both people and devices. Beds, light bulbs, cars, and more now regularly gather, transmit, and receive information with the goal of improving our lives. This unprecedented surge in data made it impractical for most organizations to keep up with on-premises storage. This is where the cloud becomes a necessity – letting you store vast amounts of data, access it from anywhere, analyze it quickly, and do it all on demand.
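To picture what one of these connected devices actually does, here is a minimal sketch of a device reading a sensor and sending telemetry to a cloud ingestion endpoint. The endpoint URL, device ID, and payload shape are hypothetical placeholders, not a specific Azure API; a real deployment would typically use a platform such as Azure IoT Hub with authenticated connections.

```python
# pip install requests
import random
import time
import requests

# Hypothetical ingestion endpoint -- a real device would talk to an IoT
# platform (for example, Azure IoT Hub) over an authenticated connection.
INGEST_URL = "https://example-ingest.contoso.com/telemetry"

def read_temperature() -> float:
    """Stand-in for a real sensor reading."""
    return 20.0 + random.uniform(-2.0, 2.0)

# Send a few readings; a real device would loop forever on a schedule.
for _ in range(3):
    payload = {
        "deviceId": "bedroom-thermostat-01",   # hypothetical device name
        "temperatureC": read_temperature(),
        "timestamp": time.time(),
    }
    # Ship the reading to the cloud, where it can be stored and analyzed
    # on demand alongside data from millions of other devices.
    requests.post(INGEST_URL, json=payload, timeout=5)
    time.sleep(60)  # one reading per minute
```

Multiply that one-reading-per-minute trickle by billions of devices and the storage and compute demands quickly outgrow what most on-premises data centers can handle.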
Leveraging our Data
The scientific method involves careful observation, data gathering, and analysis, and in this way scientists have reached conclusions that revolutionized how we live – for example, that disinfecting hands and medical instruments reduces the spread of disease. The cloud provides data, compute, and analytics resources at an unprecedented scale. Since massive data sets can now be composed from a variety of sources not typically combined, we should see more insights, right? Unfortunately, I see two big blockers.
1. Trust – For example, do we trust that our employer, health insurance provider, and medical team can share data in a way that won’t impact our career or health care premiums? I believe Microsoft Azure itself is secure, as demonstrated by its countless certifications, attestations, and compliance offerings, and blockchain technologies like Microsoft Coco could be the answer for trusted exchanges. Combined, these technologies enable data sharing in new and exciting ways.
2. Data mining – This is the difficult process of finding patterns in large data sets – the insights that can be used for prediction and more (see the sketch after this list). Finding these patterns is so resource-intensive that we are increasingly turning to Artificial Intelligence (AI) to assist.
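As a small illustration of what “finding patterns in large data sets” can look like, here is a sketch that clusters synthetic data with scikit-learn. It is a toy example on made-up data, not any organization’s real pipeline, but the same idea scales up when cloud compute is pointed at real data sets.

```python
# pip install numpy scikit-learn
import numpy as np
from sklearn.cluster import KMeans

# Synthetic "customer" records: two made-up features per record, drawn from
# two hidden groups the algorithm is never told about.
rng = np.random.default_rng(42)
group_a = rng.normal(loc=[20, 200], scale=5, size=(500, 2))
group_b = rng.normal(loc=[60, 50], scale=5, size=(500, 2))
data = np.vstack([group_a, group_b])

# Ask the algorithm to discover two clusters on its own.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)

print("Cluster centers (patterns discovered in the data):")
print(model.cluster_centers_)
```

With a thousand points this runs in a blink; with billions of records from many combined sources, the on-demand compute of the cloud is what makes the same exercise practical.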
Artificial Intelligence (AI)
AI allows computing to move into new roles that require adaptability and creativity – previously the purview of humans, who have limited lifespans and workdays. I believe new AI technologies like machine learning will provide the breakthroughs needed to harness the power of our data. The Microsoft Azure Cloud already offers AI services (Cognitive Services and Machine Learning) along with examples that show just how easy they are to use. This will be a major shift in how humanity functions and will be disruptive in the short term. However, I don’t foresee a dystopian future, but rather one in which humans use AI to solve “the big problems.” Humanity has always depended on the tools we invent – AI is simply a little fancier than the wheel or the alphabet.
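As one hedged example of how approachable these services have become, here is a sketch that scores the sentiment of a sentence using Azure’s text analytics capability (part of Cognitive Services), assuming the azure-ai-textanalytics package; the endpoint and key are placeholders for your own resource.

```python
# pip install azure-ai-textanalytics
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

# Placeholder endpoint and key -- use the values from your own
# Cognitive Services / Language resource in the Azure portal.
client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

documents = ["The cloud made this analysis effortless!"]
results = client.analyze_sentiment(documents)

# Each result carries an overall sentiment label plus confidence scores.
for doc in results:
    if not doc.is_error:
        print(doc.sentiment, doc.confidence_scores)
```

A few lines of code stand in for what would otherwise be a dedicated data science project, which is exactly the kind of leverage I expect AI services in the cloud to keep providing.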
And then?
If tech neck, poor eyesight, and carpal tunnel syndrome are any indication, humans still conform to the computer rather than the computer conforming to us. I expect advancements that make interacting with computers more natural and less physically demanding, with a special focus on assisting those with disabilities. Turning to hardware, quantum computing is poised to push past the limits of Moore’s Law, and we can now encode massive volumes of data in DNA. Walls we perceive today as technical limits will disappear as humans prove yet again that we are ingenious and can innovate our way past any blocker.
Closing
In short, I see an exciting future of possibilities powered by the cloud: vehicles that drive themselves, education tailored to each student, and medical breakthroughs that extend the human lifespan. Our lives thirty years from now will be just as different from today as our lives were thirty years ago, and likely in ways we can’t even imagine. Better? Hopefully. Different? Definitely!
Special thanks go to my father, Joel Luedeman, for his help with this article. After nearly 50 years in computing, he was the perfect partner to help me put the past in perspective.