"Edge Computing", that's a deceptively catchy term, ain't it? Computing on the edge, that the phrase literally translates to, invokes interesting and novel interpretations. Nevertheless, the concept of Edge Computing is at least a few years old and was born out of the technological advancements that enabled more powerful devices at the edge of the network - devices close to where data is generated.
History
Let's look at the march of progress in computing and how its nature has shifted over time.
The dawn of commercial computing arrived with mainframes around 1960, and the nature of computing was centralized. The client-server model was the next big shift in how computing was done and was predominantly adopted from the 1980s through the early 2000s. Then came the cloud era from 2006 onwards, which brought computing back to a centralized model, putting virtually infinite computing capacity up for sale. In the last few years, internet-connected devices have entered the stage. These IoT devices are capable of doing more than their core function because they have excess computing capacity that can be put to use. Consider mobile phones, smart cars, surveillance cameras, and all kinds of sensor devices. It is these devices that are enabling edge computing.
Why Edge Computing?
With the proliferation of IoT devices, huge amounts of data are generated by billions of devices, and that data needs to be sent to centralized servers/the cloud for processing. This is pushing the limits of network infrastructure. It is projected that by 2025 there will be over 41 billion IoT devices in the field, generating over 79 zettabytes of data (1 zettabyte = 10^12 GB, so that is roughly 7.9 × 10^13 GB). That's a massive amount of data, and it is only going to increase with time.
"Sending massive amount of data generated by billions of IOT devices on the edge to centralized servers/cloud is prohibitive in terms of network bandwidth, latency and cost."
The devices on the edge need to be equipped with computing power so the data being generated can be analyzed at the device itself. In a way, the data center is moving to the edge. Consider the following cases:
Self-driving cars
A self-driving car has an assortment of sensors that produce a lot of data every second. The car's brain needs to make decisions based on that sensor data in real time; there is simply not enough time to send the data to the cloud and act on the response.
It is estimated that a self-driving car's onboard real-time operating system will have to process approximately 1 GB of data per second.
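To see why shipping that data to the cloud is not an option, here is a rough back-of-envelope calculation. The 1 GB/s figure is the estimate above; the 100 Mbps uplink is an assumed number, purely for illustration:

```python
# Back-of-envelope check: how long would it take to upload one second's
# worth of sensor data to the cloud? (Illustrative numbers, not measurements.)

SENSOR_RATE_GB_PER_S = 1.0   # ~1 GB/s, the estimate quoted above
UPLINK_MBPS = 100            # assumed uplink speed, for illustration only

data_bits = SENSOR_RATE_GB_PER_S * 8e9            # 1 GB ≈ 8 × 10^9 bits
upload_seconds = data_bits / (UPLINK_MBPS * 1e6)  # bits / (bits per second)

print(f"~{upload_seconds:.0f} s to upload 1 second of sensor data "
      f"over a {UPLINK_MBPS} Mbps link")
# => ~80 s per second of data generated
```

Even ignoring latency and cost, the car would fall roughly 80 seconds behind for every second of driving, which is why the decisions have to be made on board.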
Surveillance cameras
The way surveillance cameras have traditionally worked is that they record footage and upload it to a central server, where it is then analyzed. With edge computing, real-time video analytics can be performed on the camera itself, enabled by building compute and storage capacity into the camera.
Today's IP cameras increasingly have the processing power to run AI- or deep learning-based analytics and algorithms such as facial recognition. Processing data on the camera has the following benefits (a minimal sketch of the pattern follows the list):
- Low bandwidth consumption
- Faster/real-time response
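Here is a minimal sketch of that pattern, assuming a hypothetical on-camera API: `read_frame`, `run_face_recognition` and `upload_event` are stand-ins, not a real SDK. The point is that raw frames stay on the device and only a tiny event record goes upstream:

```python
import json
import time

def read_frame():
    """Stand-in for grabbing a frame from the camera sensor (megabytes of pixels)."""
    return b"raw-frame-bytes"

def run_face_recognition(frame):
    """Stand-in for an on-device deep-learning model; returns detected identities."""
    return ["person_A"] if frame else []

def upload_event(event):
    """Stand-in for sending a few hundred bytes of metadata to the cloud."""
    print("uploading:", json.dumps(event))

def process_one_frame():
    frame = read_frame()                   # raw video never leaves the device
    matches = run_face_recognition(frame)  # inference runs on the camera itself
    if matches:                            # only small event records are uploaded
        upload_event({"ts": time.time(), "matches": matches})

process_one_frame()
```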
The above are just two examples of how edge computing is transforming how and where data is processed. There are numerous other use cases, such as smart cities, smart factories, smart airplanes, etc.
Will this diminish the cloud?
While it may seem that edge computing is going to take away the need for cloud computing, it really only takes away part of the work done in the cloud. The cloud will still provide device management, non-time-sensitive analytics and durable storage of data, while real-time processing is done at the edge. Edge devices will only send a subset of all the data they generate - the data that is actually needed in the cloud. The emergence of 5G networks is a critical piece in this equation.
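As a rough illustration of "send only the subset the cloud needs", an edge device might aggregate raw readings locally and ship just a summary upstream. This is a sketch, not a specific product's API: `publish_to_cloud` is a hypothetical placeholder for whatever upload mechanism (MQTT, HTTPS, or a provider SDK) the device actually uses.

```python
from statistics import mean

def publish_to_cloud(payload):
    """Placeholder for the actual cloud upload call (MQTT, HTTPS, provider SDK, ...)."""
    print("to cloud:", payload)

def summarize(readings, threshold=80.0):
    """Reduce many raw readings to the few numbers the cloud actually needs."""
    return {
        "count": len(readings),
        "avg": round(mean(readings), 2),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > threshold),  # real-time rule evaluated at the edge
    }

raw_readings = [72.1, 73.4, 95.2, 71.8, 70.0]  # e.g. one minute of temperature samples, kept local
publish_to_cloud(summarize(raw_readings))       # a handful of bytes instead of every sample
```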
All three major cloud providers have already released services catering to the needs of edge computing devices:
- Azure: Azure IoT Edge
- AWS: AWS IoT Greengrass
- GCP: Cloud IoT Edge and the Edge TPU