Over the last few years, edge computing has been gaining considerable attention, although it is not as well known as cloud computing. In this article, we highlight the basics of this technology and why it is likely to become more prominent in the foreseeable future.

Over the past two years, and thanks in large part to the need to facilitate remote engagement, such as connecting individuals with their workplaces, children with schools, and patients with their doctors' offices, there has been a marked increase in the adoption of cloud computing by organisations. Generally, cloud computing refers to a system of computing resources, such as storage, network and processing/computation services, that are shared and can be accessed on demand. Typically, these services are centralised in large data centres, and so are located remotely, often at considerable distances from their clients.

As much as a key advantage of cloud computing is the economies of scale and scope that can be realised, which can benefit consumers through the rates payable for the services offered, there are also disadvantages. First, in addition to paying for the cloud services secured, the customer must also pay for the connectivity between itself and the cloud. Where large volumes of data must be processed in real time, such as for video surveillance or manufacturing quality assurance, the bandwidth charges can be expensive. Second, there is the associated latency: the time taken for data to travel between the originating source and the destination, which can be significant. Third, due to the centralised nature of cloud computing, the risks arising from cloud security, data security and downtime, along with the potential loss of data and productivity, tend to be much higher than for non-centralised systems.
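
To put the latency point in rough numbers, consider the propagation delay alone. The sketch below is a back-of-the-envelope estimate in Python, assuming signals travel through optical fibre at roughly 200,000 km/s (about two-thirds of the speed of light in a vacuum) and ignoring queuing, routing and processing delays; the distances are purely illustrative.

    # Back-of-the-envelope propagation delay: distant cloud vs nearby edge.
    # Assumes ~200,000 km/s signal speed in fibre (an approximation); real
    # round trips add queuing, routing and processing time on top of this.
    FIBRE_SPEED_KM_PER_S = 200_000

    def round_trip_ms(distance_km: float) -> float:
        """Round-trip propagation delay in milliseconds."""
        return (2 * distance_km / FIBRE_SPEED_KM_PER_S) * 1000

    print(round_trip_ms(3000))  # distant cloud region: ~30 ms
    print(round_trip_ms(50))    # nearby edge node: ~0.5 ms

Even before bandwidth and processing are considered, moving the compute closer shrinks this floor on response time, which is precisely the gap edge computing targets.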

In light of the above, there has been a growing shift to edge computing, which resolves some of the key challenges associated with cloud computing.

Defining edge computing

In a nutshell, and unlike cloud computing, which centralises resources that users access remotely, edge computing places the computing resources within the system where they are needed. In other words, it localises the computing resources and services. Below are two definitions that further explain what edge computing is, along with a few of its benefits.

Edge computing: A distributed computing model that takes advantage of compute available outside of traditional and cloud data centers. An edge computing model places a workload closer to where the associated data is created and where actions are taken in response to analysis of that data. Placing data and workload on edge devices reduces latencies, lowers demands on network bandwidth, increases privacy of sensitive information, and enables operations during network disruptions.

Source: IBM

Edge Computing

The delivery of computing capabilities to the logical extremes of a network in order to improve the performance, operating cost and reliability of applications and services. By shortening the distance between devices and the cloud resources that serve them, and also reducing network hops, edge computing mitigates the latency and bandwidth constraints of today’s Internet, ushering in new classes of applications. In practical terms, this means distributing new resources and software stacks along the path between today’s centralized data centers and the increasingly large number of devices in the field, concentrated, in particular, but not exclusively, in close proximity to the last mile network, on both the infrastructure and device sides.

Source: The Linux Foundation

Why is edge computing important?

Although the concept of edge computing has been around for at least two decades, it has increasingly been receiving attention due to how data-centric and data-intensive the world has become. Big data, the proliferation of video, the growing integration of the Internet of Things (IoT) and machine-to-machine (M2M) communication, and the prolific deployment of sensors are just a few of the phenomena we will be grappling with well into the future. Moreover, these technologies require considerable storage and processing power, on an ongoing and growing basis, to manage the data they generate.

By localising the cloud resources needed, edge computing can deliver increased efficiency and productivity. For example, because the computing resources and the data are closer to each other, bottlenecks caused by the processing and transmission of data are reduced compared with traditional cloud computing, resulting in faster turnarounds and improved system performance.
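
As a rough illustration of that saving, the Python sketch below shows a common edge pattern: aggregate raw readings locally and forward only a compact summary upstream. The readings, the window and the send_to_cloud stub are hypothetical, assumed purely for illustration.

    # Minimal sketch of edge-side aggregation: rather than streaming every
    # raw reading to the cloud, summarise a local window and upload that.
    from statistics import mean

    def send_to_cloud(payload: dict) -> None:
        # Stand-in for a real upload; assumed for illustration.
        print("uploading:", payload)

    def summarise_window(readings: list[float]) -> dict:
        return {
            "count": len(readings),
            "mean": round(mean(readings), 2),
            "max": max(readings),
        }

    raw_window = [21.1, 21.3, 27.9, 21.2, 21.4]  # e.g. one window of sensor data
    send_to_cloud(summarise_window(raw_window))  # a few fields, not every reading

Whether summarising at the edge is acceptable depends on the application; where every raw value matters, the trade-off shifts back towards transmitting more data.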

The increased efficiency of edge computing is even more critical when large volumes of data are time-sensitive and must be processed in real time. As previously mentioned, IoT and M2M communications are among the areas driving edge computing adoption. Consider also the following real-world examples:

  • Smart homes: In real time, devices and appliances need to process data from sensors, data from other connected devices, and commands from multiple users. Much of this processing needs to happen within the device, but may need to be supplemented by a nearby local server (a minimal sketch of this device-local pattern follows this list).
  • Virtual assistants: The conversational interfaces of Apple Siri, Amazon Echo Dot, Google Assistant and others use voice recognition and process automation algorithms, and require data to be processed instantly.
  • Geolocation beacons: Used widely in retail settings, these beacons collect data on customers' movements throughout a store or other location, which not only provides the organisation with intelligence, but can also facilitate on-the-spot engagement with customers, such as providing product information, suggestions and recommendations, thereby improving the customer experience.
  • Smart watches: In relation to health and wellness, these watches, which are part of the wearable IoT segment, are proving invaluable, as they are able to: continuously monitor a user's vital signs; advise the user of emerging concerns; and even contact help should the user be unable to do so. A well-known example is the Apple Watch.
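
To make the smart-home example concrete, here is a minimal Python sketch of device-local processing: each reading is evaluated on the device itself, and the network is only used when something noteworthy happens. The threshold, the readings and the notify_local_server stub are assumptions for illustration, not any particular product's API.

    # Hypothetical device-local loop: decide on the device, call out rarely.
    TEMP_ALERT_C = 60.0  # illustrative threshold for an overheat alert

    def notify_local_server(message: str) -> None:
        # Stand-in for a call to a nearby edge server; assumed for illustration.
        print("alert:", message)

    def handle_reading(temp_c: float) -> None:
        # The common case is handled entirely on the device: no network hop.
        if temp_c >= TEMP_ALERT_C:
            notify_local_server(f"overheat detected: {temp_c} C")

    for reading in [22.5, 23.1, 61.7, 24.0]:  # hypothetical sensor stream
        handle_reading(reading)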

In summary, there is some irony in the fact that, just as cloud computing has finally been gaining widespread acceptance, support is growing for edge computing, which for all intents and purposes is its opposite. Having said this, edge computing may not necessarily replace cloud computing in all instances. However, it has become a viable option when the speedy processing of large volumes of data is crucial.
