From rhythmic evolution to disruptive change, computing has seen a sea of transformation since it came into being; that's history! Chasing the next generation has always been the modus operandi of the IT industry, and that "next generation" was once the ENIAC, then took shape after shape through cloud computing and, now, edge computing.
Edge Computing!
Edge Computing is a networking philosophy focused on bringing computation as close as possible to the source of the data, in order to reduce latency and bandwidth use. In basic terms, edge computing means running fewer processes in the cloud and shifting those processes to places like a user's computer, an IoT device, or an edge server. Bringing computation to the network's edge reduces the amount of long-distance communication that has to happen between a client and a server.
What is the network edge?
Understanding the term "edge" is critical to understanding how edge computing works. As far as internet-connected devices are concerned, the network edge is the point where the device, or the local network that contains it, communicates with the wider Internet. Here is an example to make the term concrete: a user's computer or the processor inside an IoT camera can be considered the network edge, but it doesn't end there; the user's router, Internet Service Provider, or a local edge server also count as the edge. The significant takeaway is that the edge of the network is geographically close to the device, in contrast to origin servers and cloud servers, which may communicate with the device from a remote location.
An Example of Edge Computing
Visualize a building protected by lots of high-definition IoT video cameras. These cameras simply output a raw video signal and continuously stream it to a cloud server. There, the footage from all the cameras is fed through a motion-detection application so that only clips featuring activity are saved to the server's database. This arrangement puts a persistent, significant strain on the building's internet infrastructure, as a heavy volume of footage is constantly being transferred, consuming a great deal of bandwidth. The load on the cloud server is also extremely heavy, since it must process the video footage from all the cameras concurrently.
The key part of the stressful process is the motion-detection computation, which happens on the cloud server. The strained computation and the bandwidth consumption both exist because the device that collects the data does not also process it. Now think of moving the motion-detection computation to the network edge: each camera gets its own internal computer that runs the motion-detection application and sends only the clips featuring activity to the cloud server. This change would dramatically cut bandwidth consumption, since the full footage never needs to be transmitted to the cloud server. In addition, the cloud server could communicate with a far larger number of cameras without any risk of becoming overloaded.
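To make the division of labor concrete, here is a minimal sketch of what such an on-camera motion detector could look like, using simple frame differencing with OpenCV in Python. The camera index, the pixel-change threshold, and the upload endpoint are illustrative assumptions, not the workings of any specific camera product.

```python
# Minimal sketch of edge-side motion detection. The camera index,
# threshold values, and upload endpoint below are assumptions.
import cv2
import requests

UPLOAD_URL = "https://cloud.example.com/clips"  # hypothetical endpoint
MOTION_PIXELS = 5000                            # assumed tuning threshold

def frame_has_motion(prev_gray, gray):
    """Flag motion by counting pixels that changed between frames."""
    delta = cv2.absdiff(prev_gray, gray)
    _, thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(thresh) > MOTION_PIXELS

def main():
    cap = cv2.VideoCapture(0)  # the camera's own sensor
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("camera not available")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if frame_has_motion(prev_gray, gray):
            # Only frames showing activity ever leave the device;
            # the raw stream never crosses the network.
            _, jpeg = cv2.imencode(".jpg", frame)
            requests.post(UPLOAD_URL, data=jpeg.tobytes(),
                          headers={"Content-Type": "image/jpeg"})
        prev_gray = gray
    cap.release()

if __name__ == "__main__":
    main()
```

A real deployment would buffer a few seconds of video around each event rather than posting single frames, but the division of labor is the same: detection happens on the device, and only the interesting footage ever crosses the network.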
Edge Computing - The Benefits
Edge Computing helps minimize bandwidth use and server resources. Moving processes to the edge brings several benefits:
- Reduced latency, since requests no longer have to travel to a distant server and back
- Lower bandwidth use and the associated cost (a sketch follows this list)
- Fewer server resources consumed, and the associated cost savings
- Added functionality: because data is processed and analyzed at the edge, applications can run in real time
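As a rough illustration of the bandwidth point, the sketch below aggregates raw sensor readings at the edge and uploads only a small summary, so one message replaces sixty. The read_sensor function, the endpoint URL, and the 60-sample window are illustrative assumptions, not any particular product's API.

```python
# Sketch: aggregate readings at the edge, upload only the summary.
# read_sensor() and the endpoint URL are illustrative assumptions.
import time
import random
import statistics
import requests

UPLOAD_URL = "https://cloud.example.com/telemetry"  # hypothetical endpoint

def read_sensor() -> float:
    """Stand-in for a real sensor read (e.g. temperature in C)."""
    return 20.0 + random.random() * 5.0

def main():
    while True:
        # Collect one minute of raw samples locally...
        samples = []
        for _ in range(60):
            samples.append(read_sensor())
            time.sleep(1)
        # ...then ship a single summary instead of 60 raw readings.
        summary = {
            "mean": statistics.mean(samples),
            "min": min(samples),
            "max": max(samples),
        }
        requests.post(UPLOAD_URL, json=summary)

if __name__ == "__main__":
    main()
```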
The Prediction and the Need
With every household and office getting increasingly equipped with smart devices, toasters included, Statista, a statistics portal, projects that there will be more than 75 billion IoT devices in place globally by 2025. To support all those devices, significant amounts of computation will have to be moved to the edge.
Computation Latency and Bandwidth Consumption
People tend to use workplace resources for personal purposes, and bandwidth is no exception. For example, when colleagues at the same workplace chat over an Instant Messaging (IM) platform, they may experience a substantial delay in receiving messages, because each message is routed out of the building, touches a server at some corner of the globe, and only then appears on the recipient's screen. If this process were brought to the "edge", with the company's internal router in charge of transferring intra-office chats, that noticeable delay would not exist.
Correspondingly, delays occur whenever web applications run processes that require communicating with an external server. The duration of such delays varies with the available bandwidth and the server's location, and they can be avoided by bringing more processes to the network edge. These round trips also drain a great deal of bandwidth, which is why responsible use of the Internet at the workplace has become the need of the hour.
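To put a rough number on the latency point, here is a minimal sketch that times a TCP handshake to a nearby address versus a distant one. The gateway address 192.168.1.1 and the host example.com are placeholder assumptions; substitute your own router and a remote server of your choice.

```python
# Crude latency probe: time a TCP handshake to a nearby hop vs a
# remote one. Both addresses are placeholders for your own network.
import socket
import time

TARGETS = [
    ("192.168.1.1", 80),   # assumed local router ("the edge")
    ("example.com", 80),   # a distant origin server
]

def connect_time(host: str, port: int) -> float:
    """Return seconds taken to open (and close) a TCP connection."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    return time.perf_counter() - start

for host, port in TARGETS:
    try:
        print(f"{host}:{port} -> {connect_time(host, port) * 1000:.1f} ms")
    except OSError as exc:
        print(f"{host}:{port} -> unreachable ({exc})")
```

On a typical office network, the nearby target usually answers in a millisecond or two, while the remote one takes tens of milliseconds or more; that gap is exactly what bringing processes to the edge eliminates.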
Edge Computing - The Drawbacks
- It can increase the attack surface of the network.
- There is a higher risk of malicious actors compromising the edge devices themselves.
- It requires more local hardware.