Fog computing is in the forecast for IoT deployments

IoT devices produce large amounts of data, which can burden a network. A distributed model, such as fog computing or edge IT, lessens the load.

The internet of things will demand a cloud computing architecture without well-defined boundaries. Tomorrow's cloud will need to extend beyond the walls of a service provider's data center, seeping into the business -- becoming almost pervasive via edge devices and local connection hubs.

Even today, the cloud is tough to define. Take, for instance, Amazon Web Services' (AWS) Snowball Edge, an appliance that stores data locally and transfers it to the cloud, designed to bridge the gap between locally created data and the public cloud.

According to Cisco Systems, which is widely credited with coining the term, fog computing extends the cloud closer to the devices that produce and act on internet of things (IoT) data. Cisco calls these devices fog nodes and defines them as any device with compute resources, storage and network connectivity -- including traditional industrial controllers, switches, routers, embedded servers and even surveillance cameras.

If that sounds a lot like edge computing, well, it should, at least in the view of Kelly Quinn, an analyst at IDC.

Fog computing is a slightly older term for what she and others generally call edge IT. "The use cases for fog are essentially identical to the ones for edge computing," Quinn said. "Businesses that are likely to benefit the most from fog or edge IT are those that need computational capabilities for data either close to or at the point at which the data is generated," she added.

Sophia Vargas, an analyst at Forrester Research, offers a similar but not identical definition. "From my perspective, fog is a broader, integrated architecture across various distributed components -- meshing the tiny pieces on the edge to operate as one dynamic system," Vargas said.

"I see fog as similar to a cloud environment. But instead of having everything colocated in a single environment, it is distributed over multiple local elements, and cloud is just a piece of it," she said. Depending on the use cases and on exactly how IoT develops, cloud will be a component of fog computing, helping stitch together distributed systems.

Fog computing recognizes that IoT will embrace far more devices than ever contemplated in any network. Cisco estimated organizations will need to connect 50 billion IoT devices within the next several years. This will demand a flexible and scalable approach.

Just as importantly, some of those IoT devices will generate vast amounts of data, only some of which will have essential value. For instance, sensors on a large turbine could generate terabytes of data in the course of a day, and most of that data is mundane or repetitive. It is the anomalous data and its relation to the whole that's crucial. Thus, rather than overtaxing network and compute capacity in the cloud or in a data center by shipping all of that raw data somewhere else, why not parse the data at or near its origin and share only the conclusions it supports?
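
To make that filter-at-the-source idea concrete, here is a minimal Python sketch of the pattern. It is an illustration only: read_sensor() and send_to_cloud() are hypothetical stand-ins for a real sensor interface and a real upstream publish call, and the anomaly threshold is a placeholder for whatever analysis a given deployment actually needs.

```python
# Minimal sketch: summarize raw sensor data at the edge and forward only the
# conclusions, not the raw stream. All names here are hypothetical stand-ins.

import random
import statistics
from typing import List


def read_sensor() -> float:
    """Stand-in for a real sensor read; returns a simulated vibration value."""
    return random.gauss(mu=10.0, sigma=0.5)


def send_to_cloud(summary: dict) -> None:
    """Stand-in for an upstream call; a real node might publish over MQTT or HTTPS."""
    print("forwarding to cloud:", summary)


def summarize_window(window: List[float], threshold: float = 3.0) -> dict:
    """Reduce a window of raw readings to a compact summary plus any anomalies."""
    mean = statistics.mean(window)
    stdev = statistics.pstdev(window) or 1e-9
    anomalies = [x for x in window if abs(x - mean) / stdev > threshold]
    return {"count": len(window), "mean": mean, "stdev": stdev, "anomalies": anomalies}


if __name__ == "__main__":
    window = [read_sensor() for _ in range(1000)]
    summary = summarize_window(window)
    # Only the compact summary crosses the network; the raw samples stay local.
    send_to_cloud(summary)
```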

When looking at the phenomenon of edge IT, Forrester analyst Jeffrey Hammond said about one-third of the companies that have deployed production IoT systems find that they need to push more compute functions to the edge of the network to perform real-time analysis and data filtering.

"This is particularly the case in the industrial internet and in automotive, healthcare and manufacturing," Hammond said. "In cases where sensors generate a lot of telemetry, but only sporadic data that's actionable, you want to discern the signal from the noise without overwhelming the ingestion processes at the core," he said. Likewise, for certain key functions such as collision avoidance, "you don't want a 100 millisecond loop to the internet and back."

IDC's research has shown that manufacturing leads the adoption of fog computing, along with information technology, professional services and distribution, Quinn said.

Fog computing pioneers

Vargas sees different paths to a fog model, depending on the infrastructure in use. In general, since IoT devices will be deployed outside the data center, out of the reach of IT professionals, they will likely be software-defined so that administrators can manage them remotely. "In other words, we are talking about specialized infrastructure, even chips in devices that maybe aren't real computing devices. So, in that case, a whole new set of protocols may be needed. And there is not yet a lot of standardization in those realms," she said.

Indeed, vendors in the gateway market are trying to address that challenge by bridging operational technology protocols with the technologies behind various connection types, Vargas said. That work also aims to create protocols that can be embedded into traditional infrastructure environments.

"You can set up and operate Snowball as an extension of [AWS]," Vargas said.

However, Snowball isn't targeted specifically at fog-type challenges. Instead, Amazon defines its primary use cases as cloud migration, disaster recovery, data center decommissioning and content distribution. But AWS has crafted something oriented toward fog computing: AWS Greengrass. The technology is specifically for connected devices, according to AWS.

Hammond considers the AWS Greengrass runtime environment a more significant and relevant technology than Snowball. It lets you run local compute, messaging and data caching on various kinds of devices, including Snowball. AWS Greengrass-connected devices can then communicate with each other or other devices -- even locally, without internet access. They can also run AWS Lambda, which provides on-demand serverless compute services that operate automatically in response to specific events, whether on a shop floor or in an agricultural environment.
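
As a rough illustration of that pattern, the sketch below shows a Lambda-style handler of the kind that might run locally on a Greengrass core, based on the AWS Greengrass Core SDK for Python (greengrasssdk). The topic names, payload fields and temperature threshold are illustrative assumptions, not drawn from any published example.

```python
# Sketch of a Lambda-style function running locally on a Greengrass core,
# reacting to device messages without a round trip to the internet.
# Topic names and payload fields are illustrative assumptions.

import json

import greengrasssdk

# Client for publishing to the local/cloud MQTT message broker.
client = greengrasssdk.client("iot-data")


def function_handler(event, context):
    # 'event' is the message that triggered this function, assumed here to be
    # a JSON sensor reading published on a local topic by another device.
    reading = float(event.get("temperature_c", 0.0))

    # Act immediately at the edge when the reading crosses a limit; only the
    # alert, not the raw telemetry stream, is published upstream.
    if reading > 85.0:
        client.publish(
            topic="plant/alerts",  # illustrative topic name
            payload=json.dumps({"alert": "overtemp", "temperature_c": reading}),
        )
    return
```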

"I have already seen a lot of experimentation with Greengrass," Hammond said.

It isn't just established players like AWS that are making a fog computing play. David King, CEO of FogHorn Systems, said his startup is pursuing a fog opportunity in parallel with Greengrass. He, too, sees fog as distinct from edge in offering a more ample and flexible architecture. To that end, he noted, FogHorn focuses on bringing powerful, compact software to existing programmable logic controllers and supervisory control and data acquisition control systems. This enables them to take on more of a gateway function.

"The big idea is that of doing analytics and machine learning on streaming OT [operational technology] data," he said. By processing closer to the data, you can execute more advanced rules to supplement OT and save on bandwidth costs compared to processing at a central site or even in the cloud, he said.

A complicated outlook

At the moment, vendors and product teams talk the most about fog computing. Enterprise IT isn't talking much about it at all, and that's a potential problem, Vargas said.

"A lot of these kinds of deployments are being led by line of business or the product sides of the organization, because they are trying to modernize the manufacturing floor or change the in-store retail experience," Vargas said. Because the process is driven by outsiders rather than data center operations professionals, organizations often need consulting and integration services.

Further complicating the outlook is that fog computing seems destined to require design and implementation on a case-by-case basis. "Every IoT deployment I have encountered seems to be using a slightly different slice of infrastructure resources, but all are ultimately relying on some kind of aggregation services to bring together the different pieces," she said.

But the lack of simple implementation templates isn't necessarily enough to derail a project. Businesses that plan their fog IT strategy thoughtfully and carefully tend to have greater success with initial deployments. They are also more likely to overcome unpredicted obstacles, Quinn said. And, regardless of the challenges, she noted, "in our research, we haven't yet found any scenarios in which fog or edge IT should be avoided."  
