Edge Computing: The Cloud Nemesis?
Google Drive, iCloud, Box: who doesn't use at least one of these popular cloud services? Yet for some applications, businesses are shying away from the cloud, following a counter-movement and buzzword in the making: edge computing.
Why wouldn't someone use the cloud?
Clouds are hosted in server farms: warehouses the size of football fields, filled with computers. Countries like Iceland offer low energy costs and natural cooling, and fiber-optic cables laid across the Atlantic connect them to the rest of the world, enabling providers to host clouds at nearly zero marginal cost. While this makes hosting cheap, data has to travel a long way to its destination. Long relatively speaking, that is, as the trip still only takes a few tens of milliseconds.
Introducing edge computing
Following the real estate wisdom that "location is everything", edge computing brings data storage and computational power physically close to the device where they are needed, keeping latency to a minimum. With a computer (e.g. a mini computer or even a phone) sitting next to a data-gathering device (e.g. a sensor), actions can be taken more quickly.
What is the benefit?
For instance, an autonomous vehicle gathers data about its surroundings; say the front camera captures that there is a man on the road. You don't want the camera to send that picture to Iceland, let a server in Iceland "compute" that there is a man on the road, and send a response (brake!) back across the Atlantic. Here, milliseconds are a matter of life or death, so some form of edge computing is necessary. But that doesn't mean each car needs a large server: you can probably wait a few milliseconds for your GPS to tell you to turn left in 200 meters.
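To put rough numbers on the latency argument, here is a minimal back-of-the-envelope sketch. The figures are illustrative assumptions, not measured values: light in optical fiber travels at roughly two-thirds the speed of light, and a one-way fiber route from a car in central Europe to an Icelandic data center might be on the order of 3,000 km.

```python
# Back-of-the-envelope: round-trip propagation delay to a server.
# Assumptions (illustrative): fiber speed ~2/3 of c; route lengths are guesses.
SPEED_OF_LIGHT_KM_S = 300_000                    # speed of light in vacuum, km/s
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # ~200,000 km/s in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (ignores processing and queuing)."""
    return 2 * distance_km * 1000 / FIBER_SPEED_KM_S

print(round_trip_ms(3000))   # cloud ~3,000 km away: ~30 ms of pure light travel
print(round_trip_ms(0.01))   # edge computer 10 m away: effectively zero
```

Even before any server-side processing, the physics alone costs tens of milliseconds per round trip to a distant data center, while an on-board or roadside computer removes that term entirely.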
Why are you hearing this now?
Two trends deserve mention: the Internet of Things and 5G networks.
Firstly, as the Internet of Things matures, more and more devices are gathering data in the most remote locations. For those deemed critical, where data is stored and where computations run needs careful consideration.
Secondly, 5G can be roughly 10x faster than 4G. This means that data from "new" IoT devices can be processed in real time; at slower speeds that was simply not possible for some applications (think larger files such as photos, compared to "smaller" payloads such as a temperature reading). Especially in rural and remote areas, edge computing can now play a role thanks to the possibility of a fast wireless connection. The opportunity for automation (edge or cloud aside) is bigger than ever before.
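The photo-versus-temperature-reading contrast can also be sketched with illustrative numbers. The throughput rates below are rough assumptions for the sake of the comparison; real 4G and 5G throughput varies widely with conditions.

```python
# Illustrative: time to upload a payload at assumed 4G vs. 5G throughput.
# Rates are rough assumptions, not benchmarks.
RATE_4G_MBPS = 50     # assumed 4G rate, megabits per second
RATE_5G_MBPS = 500    # assumed 5G rate, ~10x faster

def transfer_seconds(size_megabytes: float, rate_mbps: float) -> float:
    """Transfer time in seconds for a payload of the given size (1 byte = 8 bits)."""
    return size_megabytes * 8 / rate_mbps

photo_mb = 5          # a single camera photo
reading_mb = 0.0001   # a temperature reading (~100 bytes)

print(transfer_seconds(photo_mb, RATE_4G_MBPS))    # ~0.8 s on 4G
print(transfer_seconds(photo_mb, RATE_5G_MBPS))    # ~0.08 s on 5G
print(transfer_seconds(reading_mb, RATE_4G_MBPS))  # negligible either way
```

A tiny sensor reading is instant on any network, but for camera-sized payloads the jump from 4G to 5G is what makes real-time processing plausible at all.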
So what?
Edge computing is the cloud's nemesis in the sense that it cuts out cloud services in latency-dependent applications. The IT infrastructure of the future will likely have both cloud and edge computing elements. In addition to latency, cyber security plays a major role in these considerations (more on this at a later point): while edge computing avoids transmitting data outside your four walls, each edge network needs its own security setup, exposing an organization on multiple fronts. An obvious trade-off.
As companies transform and leverage more and more technology, edge computing will play a significant role in choosing the right IT infrastructure.