What is Edge Computing?

Edge computing is the practice of processing data near the edge of your network, where the data is being generated, instead of in a centralized data-processing warehouse.

Edge computing definition

Edge computing is a distributed, open IT architecture that features decentralized processing power, enabling mobile computing and Internet of Things (IoT) technologies. In edge computing, data is processed by the device itself or by a local computer or server, rather than being transmitted to a data center.

Edge computing is a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet,” according to research firm IDC.

Edge computing is typically discussed in IoT use cases, where edge devices collect data – sometimes massive amounts of it – that would otherwise all be sent to a data center or cloud for processing. Edge computing triages that data locally, processing some of it on-site and reducing the backhaul traffic to the central repository.

Typically, this is done by the IoT devices transferring the data to a local device that includes compute, storage and network connectivity in a small form factor. Data is processed at the edge, and all or a portion of it is sent to the central processing or storage repository in a corporate data center, co-location facility or IaaS cloud.
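The triage pattern described above can be sketched in a few lines of Python. This is a minimal, hypothetical example – the threshold, field names and split logic are illustrative, not part of any particular product: routine readings stay at the edge and are summarized, while only out-of-range readings are forwarded individually to the central repository.

```python
# Hypothetical local-triage sketch: keep routine readings at the edge,
# forward only anomalies, and summarize the rest for periodic upload.
NORMAL_RANGE = (10.0, 90.0)  # assumed acceptable sensor range (illustrative)

def triage(readings):
    """Split raw readings into data to forward and a local summary."""
    forward, local = [], []
    for r in readings:
        if NORMAL_RANGE[0] <= r <= NORMAL_RANGE[1]:
            local.append(r)       # routine reading: keep at the edge
        else:
            forward.append(r)     # anomaly: send to the data center/cloud
    summary = {
        "count": len(local),
        "mean": sum(local) / len(local) if local else None,
    }
    return forward, summary

readings = [42.0, 55.3, 97.8, 12.1, 3.4]
forward, summary = triage(readings)
# Only the two anomalies (97.8 and 3.4) cross the network; the three
# in-range readings are reduced to a small summary.
```

The design choice here is the essence of edge computing: the full data set never leaves the edge device, only the anomalies and a compact summary do.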


Why does edge computing matter?

Edge computing deployments are ideal in a variety of circumstances. One is when IoT devices have poor connectivity and it’s not efficient for them to be constantly connected to a central cloud.

Other use cases have to do with latency-sensitive processing of information. Edge computing reduces latency because data does not have to traverse a network to reach a data center or cloud for processing. This is ideal for situations where latencies of even milliseconds can be untenable, such as in financial services or manufacturing.

Here’s an example of an edge computing deployment: an oil rig in the ocean with thousands of sensors producing large amounts of data, most of which could be inconsequential – perhaps data that merely confirms systems are working properly.

That data doesn’t necessarily need to be sent over a network as soon as it’s produced, so instead the local edge computing system compiles the data and sends daily reports to a central data center or cloud for long-term storage. By only sending important data over the network, the edge computing system reduces the data traversing the network.
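The oil-rig pattern – compile locally, ship a compact daily report – can be sketched as follows. The sensor names and report fields are assumptions for illustration; the point is that only a few summary statistics per sensor cross the network instead of every raw reading.

```python
# Hedged sketch of local compilation at the edge: raw readings accumulate
# on the rig, and only a small per-sensor daily summary is uploaded.
from statistics import mean

def daily_report(sensor_samples):
    """sensor_samples: dict mapping sensor_id -> list of the day's readings."""
    return {
        sensor_id: {
            "samples": len(vals),
            "min": min(vals),
            "max": max(vals),
            "avg": round(mean(vals), 2),
        }
        for sensor_id, vals in sensor_samples.items()
    }

report = daily_report({
    "pressure_01": [101.2, 100.9, 101.5],  # illustrative sensor IDs
    "temp_07": [65.0, 66.2],
})
# Only this small report is sent to the central data center or cloud.
```

However many readings each sensor produces per day, the uploaded report stays a fixed, small size per sensor – which is exactly the backhaul reduction the article describes.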

Another use case for edge computing has been the buildout of next-gen 5G cellular networks by telecommunication companies. Kelly Quinn, research manager at IDC who studies edge computing, predicts that as telecom providers build 5G into their wireless networks they will increasingly add micro-data centers that are either integrated into or located adjacent to 5G towers. Business customers would be able to own or rent space in these micro-data centers to do edge computing, then have direct access to a gateway into the telecom provider’s broader network, which could connect to a public IaaS cloud provider.

Edge computing security

There are two sides of the edge computing security coin. Some argue that security is theoretically better in an edge computing environment because data is not traveling over a network, and it’s staying closer to where it was created. The less data in a corporate data center or cloud environment, the less data there is to be vulnerable if one of those environments is compromised.

The flip side is that some believe edge computing is inherently less secure because the edge devices themselves can be more vulnerable. In designing any edge or fog computing deployment, therefore, security must be a paramount concern. Data encryption, access control and use of virtual private network tunneling are important elements in protecting edge computing systems.
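One small element of protecting edge traffic – letting the central repository verify that a payload really came from a provisioned device and wasn’t tampered with in transit – can be sketched with Python’s standard `hmac` module. This is a simplified illustration, not a complete security design: real deployments would also encrypt the payload (e.g. via TLS or a VPN tunnel) and manage per-device keys properly rather than hard-coding one.

```python
# Illustrative sketch: sign edge payloads with HMAC-SHA256 so the central
# repository can reject tampered or spoofed data. Key handling is simplified.
import hashlib
import hmac
import json

SHARED_KEY = b"example-key-provisioned-per-device"  # illustrative only

def sign_payload(payload: dict) -> dict:
    """Attach an HMAC tag computed over a canonical JSON encoding."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "tag": tag}

def verify_payload(message: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    body = json.dumps(message["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_payload({"sensor": "temp_07", "value": 66.2})
assert verify_payload(msg)        # untampered message verifies
msg["body"]["value"] = 99.9
assert not verify_payload(msg)    # tampering is detected
```

Using `hmac.compare_digest` rather than `==` avoids leaking information through timing differences when the receiver checks the tag.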

Edge computing terms and definitions

Like most technology areas, edge computing has its own lexicon. Here are brief definitions of some of the more commonly used terms:

  • Edge devices: Any device that produces or collects data, such as a sensor, an industrial machine or other connected equipment.
  • Edge: What the edge is depends on the use case. In a telecommunications field, perhaps the edge is a cell phone or maybe it’s a cell tower. In an automotive scenario, the edge of the network could be a car. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop.
  • Edge gateway: A gateway is the buffer between where edge computing processing is done and the broader fog network. The gateway is the window into the larger environment beyond the edge of the network.
  • Fat client: Software that can do some data processing in edge devices. This is opposed to a thin client, which would merely transfer data.
  • Edge computing equipment: Edge computing uses a range of existing and new equipment. Many devices, sensors and machines can be outfitted to work in an edge computing environment by simply making them Internet-accessible. Cisco and other hardware vendors have a line of ruggedized network equipment that has hardened exteriors meant to be used in field environments. A range of compute servers, converged systems and even storage-based hardware systems like Amazon Web Service’s Snowball can be used in edge computing deployments.
  • Mobile edge computing: This refers to the buildout of edge computing systems in telecommunications systems, particularly 5G scenarios.

Edge vs. Fog computing

As the edge computing market takes shape, there’s an important term related to edge that is catching on: fog computing.

Fog refers to the network connections between edge devices and the cloud. Edge, on the other hand, refers more specifically to the computational processes being done close to the edge devices. So, fog includes edge computing, but fog would also incorporate the network needed to get processed data to its final destination.

Backers of the OpenFog Consortium, an organization headed by Cisco, Intel, Microsoft, Dell EMC and academic institutions like Princeton and Purdue universities, are developing reference architectures for fog and edge computing deployments.

Some have predicted that edge computing could displace the cloud. But Mung Chiang, dean of Purdue University’s School of Engineering and co-chair of the OpenFog Consortium, believes that no single computing domain will dominate; rather, there will be a continuum. Edge and fog computing are useful when real-time analysis of field data is required.
