

Edge computing lets IoT systems tap into cloud efficiency

Enterprises want their IoT systems to have ultra-fast response times, while also reaping the cost and efficiency benefits of cloud. Find out how edge computing bridges these two technologies.

When an internet of things system runs on a public cloud, a set of sensors frequently sends data to a database that resides in that public cloud. And if you think this sounds problematic -- it is.

The time it takes for data to transfer from the sensor or device to the cloud is often too long to meet the latency requirements of an internet of things (IoT) system, many of which depend upon an immediate response. To work around this, some device manufacturers avoid public clouds, but that means IoT systems can't take advantage of the cost and resource efficiency of cloud-based computing.

Edge computing offers an alternative to transmitting every piece of IoT data back to a centralized cloud for processing. In an edge computing model, data storage and processing capabilities are pushed to the "edge" of a network, typically residing within the device or sensor that collects the data. However, that data and processing are often still coupled with public cloud storage systems, acting as a single, virtual unit. This helps reduce latency and improve response times for IoT systems.

The emergence of edge computing


The idea of edge computing isn't new. We've been doing it for years to solve network or machine latency issues. Specifically, a few edge computing concepts have emerged. One is cloudlet, a new architectural element from Carnegie Mellon University that blends mobile and cloud computing. There is also fog computing, a decentralized computing architecture pushed by Cisco.

The problem is that IoT applications need to react almost instantly to the data a sensor or device generates; this allows those applications to perform tasks, such as shutting down a smelting machine that's about to overheat. There are hundreds of use cases where reaction time is a key component of IoT systems, which is why latency is such an important concept. Reliability and data processing are critical, as well, including the ability to process data without depending upon communications with a remote, cloud-based app.
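To make the latency argument concrete, consider a minimal sketch of an edge-side control loop. Everything here is illustrative: `read_temperature` and `shut_down` are hypothetical stand-ins for a real sensor driver and actuator call, and the threshold is invented. The point is that the decision happens locally, with no cloud round trip.

```python
OVERHEAT_THRESHOLD_C = 900.0  # illustrative limit for a smelting machine


def read_temperature():
    """Stand-in for a real sensor driver; returns degrees Celsius."""
    return 880.0


def shut_down():
    """Stand-in for the actuator call that halts the machine."""
    print("Machine halted")


def control_loop_step(temp_c):
    """Decide locally, at the edge, whether to halt -- no cloud round trip."""
    if temp_c >= OVERHEAT_THRESHOLD_C:
        shut_down()
        return True
    return False
```

A real deployment would run this in a loop on the device itself, forwarding readings to the cloud asynchronously rather than waiting on it for safety-critical decisions.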

Where the 'edge' and the cloud collide

As a result, edge computing -- specifically as it relates to cloud computing -- is becoming a best practice. It frees cloud application architects from having to send all data back to the public cloud. The core idea of edge computing is that the edge and the public cloud are physically distributed, but virtually coupled.

As edge components move around, their data should sync automatically with a centralized data store in the cloud. While that data may be stored and processed temporarily at the edge, the copy stored in the cloud becomes the single source of truth.
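This buffer-then-sync pattern can be sketched in a few lines. The sketch below is a simplification under stated assumptions: `upload` stands in for a real cloud SDK call (an object-store or database write), and real systems would add retries and partial-failure handling.

```python
import time


class EdgeBuffer:
    """Holds readings temporarily at the edge until they sync to the cloud."""

    def __init__(self):
        self.pending = []

    def record(self, reading):
        """Store a reading locally with a timestamp."""
        self.pending.append({"value": reading, "ts": time.time()})

    def sync(self, upload):
        """Push buffered readings to the central store; clear on success.

        Once uploaded, the cloud copy becomes the single source of truth.
        """
        synced = 0
        for item in self.pending:
            upload(item)
            synced += 1
        self.pending.clear()
        return synced
```

For example, `buffer.sync(cloud_store.append)` would drain the local buffer into an in-memory stand-in for the cloud store; a production version would sync on a schedule or whenever connectivity is available.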

The need for computing at the edge becomes more critical as IoT systems and services find their way to public clouds. And while edge computing can introduce new challenges, especially around management and security, it's something that fills a need, as enterprises deploy IoT systems that require the efficiency of cloud computing.
