Cloud 2.0: Let’s start the dialogue
In today’s digital era, the question is no longer whether you will move to the cloud, but whether you are on it yet.
By the end of 2030, the number of connected devices is projected to reach 50 billion, generating over 80 zettabytes of data. Imagine all the chatter between your car, refrigerator, oven, smartwatch, security cameras, smart bulbs, even your trash cans (and that is just your house alone!), constantly emitting data that must be processed with extremely low latency, perhaps in milliseconds or even microseconds.
As more sensors and machine automation, such as self-driving cars, emerge, these new devices will feed off each other’s data. The problem is that this sharing will often need to happen quickly -- very quickly -- to get the best results. Imagine two planes needing each other’s location, or the speed at which two driverless cars would need to exchange data to avoid a collision. There is simply not enough time to send data to the cloud and back. It is clear that the traditional monolithic cloud has not kept up with this growing volume of end-device data.
Processing this sensor data in the current monolithic cloud model is too resource-intensive, time-consuming, and inefficient. This signals the need for a major shift in the way data is collected, stored, and processed.
Next-generation applications focus on machine-to-machine interaction. The internet of things (IoT), machine learning, and artificial intelligence (AI) all involve gathering and processing incredible amounts of data. But that data is not generated in the cloud at all; it is generated outside the cloud, by the sensors in devices producing huge volumes of data.
It is imperative to discuss a key term in computing, since it is the reason for this very dialogue I started today: latency. Latency is the period of time it takes for a certain action to occur, and it is often a more important speed factor than bandwidth. In an IoT transaction, most of the transaction time is consumed not by processing the data, but by the latency of moving the data to and from the cloud.
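A quick back-of-the-envelope sketch makes the point concrete. All the numbers below (round-trip times, processing time) are illustrative assumptions, not measurements:

```python
# Sketch: why latency, not processing time, dominates an IoT round trip.
# All numbers are illustrative assumptions, not measurements.

CLOUD_RTT_MS = 120.0    # assumed round trip to a distant cloud region
CLOUDLET_RTT_MS = 5.0   # assumed round trip to a nearby edge node
PROCESSING_MS = 2.0     # assumed server-side compute time per request

def transaction_time(rtt_ms: float, processing_ms: float) -> float:
    """Total time the device waits: network round trip plus compute."""
    return rtt_ms + processing_ms

cloud_total = transaction_time(CLOUD_RTT_MS, PROCESSING_MS)
edge_total = transaction_time(CLOUDLET_RTT_MS, PROCESSING_MS)

print(f"distant cloud: {cloud_total:.1f} ms "
      f"({CLOUD_RTT_MS / cloud_total:.0%} of it is latency)")
print(f"nearby node:   {edge_total:.1f} ms "
      f"({CLOUDLET_RTT_MS / edge_total:.0%} of it is latency)")
```

With these assumed numbers, shrinking the network distance cuts the total transaction time far more than any speedup in processing ever could.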
One promising solution is to create mini-clouds, also known as "cloudlets". A cloudlet can be viewed as a data center in a box: a mini data center whose goal is to bring the cloud closer to the device by giving it the ability to process at least some data locally. For aggregation purposes, the data would likely still be forwarded to cloud-based systems, but that forwarding is no longer performance-critical because the transaction is already complete.
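The pattern can be sketched in a few lines. This is a minimal illustration of the idea, not a real cloudlet API; the class and method names (`Cloudlet`, `handle_reading`, `flush_to_cloud`) and the alert-threshold logic are assumptions made up for the example:

```python
# Sketch of the cloudlet pattern: answer the device from local compute,
# then queue the raw reading for a non-critical, batched upload to the cloud.
from collections import deque

class Cloudlet:
    def __init__(self):
        self.upload_queue = deque()  # readings awaiting cloud aggregation

    def handle_reading(self, reading: dict) -> dict:
        # 1) Latency-critical path: decide locally and respond immediately.
        result = {"device": reading["device"],
                  "alert": reading["value"] > reading["threshold"]}
        # 2) Non-critical path: defer the raw data for cloud-side aggregation.
        self.upload_queue.append(reading)
        return result

    def flush_to_cloud(self) -> list:
        # Batched, best-effort upload; the device transaction is already done.
        batch = list(self.upload_queue)
        self.upload_queue.clear()
        return batch

cloudlet = Cloudlet()
response = cloudlet.handle_reading(
    {"device": "cam-1", "value": 9.5, "threshold": 7.0})
print(response)                         # immediate local decision
print(len(cloudlet.flush_to_cloud()))   # deferred batch heading to the cloud
```

The design choice is the whole point: the device only ever waits on the fast local path, while the slow path to the cloud runs after the fact.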
By far the biggest challenge that cloudlets present, however, is security. One of the major driving forces behind the move to cloud infrastructure has been that these systems are significantly more secure than their distributed counterparts: they allow all data to be brought together and managed under a centralized system for access and control. This has made cloud storage extremely popular among security-conscious organizations and individuals.
The most effective way to mitigate the security risk in cloudlets would be to encrypt data as it moves between device, cloudlet, and cloud. At the moment, however, neither devices nor cloudlets possess the computational power to deploy strong encryption while retaining respectable performance.
Based on what we discussed, how well does your cloud strategy hold up? Are you prepared to handle the explosion in the number of connected devices?
Let us keep the dialogue moving.