Edge computing governs how sensors, monitors, and other simple devices interact with the rest of the world via the cloud. Not everyone is clear on its terminology, even as its implications come to dwarf computing as we've traditionally thought of it.
By John Ghrist
The confluence of cloud computing and the Internet of Things (IoT) is growing in importance, to the point where in less than a handful of years this intersection may become even more vital than standard computer processing and cloud data storage have already been. The point of contact is variously referred to as fog computing and edge computing, and some even consider the terms interchangeable. However, they're actually not. Let's start by differentiating between the two.
On Little Cat Feet
Fog computing is the older term, coined in 2014 by a group of technologists at Cisco Systems. It was dubbed "fog" to liken its activity to the usually less-dense periphery of a cloud. Fog computing deals with the challenge of handling high volumes of data coming in from peripherals (e.g., sensors, alarms, security cameras, health-monitoring equipment) without necessarily transmitting that data across the cloud. By handling some basic processing closer to the sensing devices, fog computing aims to call on cloud resources only for high-level or long-term analysis of large amounts of such data, thereby quietly reserving more cloud bandwidth for other uses.
As fog computing has moved from theory to practice, it has come to rely on three general service components. Fog devices are generally the sensors and monitors that gather data. Fog gateways channel this data to various fog servers, deciding which servers should receive which data (usually based on the kind of data coming in) to maximize data-flow efficiency. Fog servers store the data and, at a minimum, analyze it enough to decide whether further action is needed.
A good example is medical monitoring. Sensors take readings of a patient's heartbeat, oxygen levels, and other vital signs. This constant monitoring produces a high volume of data every second, particularly if the system is observing a sizeable number of patients concurrently. Fog gateways parcel the data out to (typically multiple) fog servers that either are located nearby or have a dedicated physical connection. The fog servers decide whether the incoming data warrants some kind of escalation. If a patient's heartbeat is steady, for example, the data might simply be stored. If the server detects a heartbeat stoppage, or even an irregularity (depending on what attention threshold has been set), it may send an alarm. Only then might the system send a message non-locally through the cloud, rather than letting the sensor data streams constantly compete with other cloud traffic. (Almost incidentally, the response time to the situation is likely to be reduced as well.)
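The store-locally, escalate-only-on-anomaly logic described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the class name, thresholds, and escalation method are all hypothetical stand-ins.

```python
# Hypothetical sketch of a fog server's escalation logic for heart-rate
# data. Names and thresholds are illustrative only.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FogServer:
    low_bpm: int = 40        # assumed lower alarm threshold
    high_bpm: int = 150      # assumed upper alarm threshold
    stored: List[Tuple[str, int]] = field(default_factory=list)
    alarms: List[str] = field(default_factory=list)

    def ingest(self, patient_id: str, bpm: int) -> None:
        # Every reading is stored locally -- no cloud traffic involved.
        self.stored.append((patient_id, bpm))
        # Only out-of-range readings trigger a cloud-bound escalation.
        if bpm == 0:
            self.escalate(f"Cardiac arrest: patient {patient_id}")
        elif not (self.low_bpm <= bpm <= self.high_bpm):
            self.escalate(f"Irregular heartbeat ({bpm} bpm): patient {patient_id}")

    def escalate(self, message: str) -> None:
        # Stand-in for the one message that actually crosses the cloud.
        self.alarms.append(message)

server = FogServer()
server.ingest("A-17", 72)   # steady: stored locally, no alarm
server.ingest("A-17", 0)    # stoppage: escalated
print(len(server.stored), len(server.alarms))  # prints: 2 1
```

The point of the sketch is the asymmetry: two readings arrive, both are kept at the fog layer, but only the anomalous one generates outbound traffic.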
Typically, a fog computing setup also uses monitoring services that check whether all fog devices, gateways, and servers are operating and provide traffic control to make sure the system isn't overwhelmed by too many messages.
Close to the Edge
Edge computing is a term IBM introduced the following year to refer, similarly, to computing activity on the periphery of the cloud, but of a kind that analyzes more of the data locally. Despite this contrast, because fog and edge serve the same overall function from a thousand-foot view, the two terms are sometimes erroneously treated as the same concept.
The crucial difference lies essentially in the amount of processing the fog servers, or other machines local to the fog devices, are called on to do. In edge computing, that processing is more comprehensive than deciding whether to send an alarm or message. Because fog servers are gradually being designed to do more analytical data processing locally, it becomes an increasingly murky proposition to decide exactly where to draw a hard line between fog and edge, and therein lies the confusion.
Down at the End, Close by a River
Edge computing carries with it the implication that analysis of fog device data is carried out on servers proximate to the sensors generating the data stream. In this scenario, fog servers, or more standard server types in an attached network, actually run applications that turn the incoming data into some kind of information on the spot.
This could simply mean helping network administrators decide what information is time-sensitive or in need of escalation. Alternatively, it could provide other end users with statistics or results needed for making overall judgments about the edge network itself or a larger environment under analysis.
In contrast, fog computing means data is warehoused and perhaps evaluated in a rudimentary way, such as for purposes of categorization, but generally the data is held for later optional analysis by a cloud-connected server as the need arises.
Another way to explain the difference is that edge computing largely confines itself to analyzing the actual data coming in from a circumscribed number of peripherals. Fog computing, while doing less hard-data analysis, must also concern itself with administrative tasks such as processing speed across a whole network of fog nodes, fog network traffic control, and general storage of incoming fog data.
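The contrast between the two roles can be made concrete. In this sketch (function names and the categorization rule are hypothetical), the fog handler merely tags and warehouses raw readings for possible later cloud analysis, while the edge handler turns the same readings into finished information on the spot.

```python
# Illustrative fog-vs-edge contrast; names and thresholds are made up.
from statistics import mean

def fog_handle(readings):
    # Fog: categorize and store raw data; defer real analysis to the cloud.
    return [{"category": "high" if r > 100 else "normal", "raw": r}
            for r in readings]

def edge_handle(readings):
    # Edge: produce the answer locally -- no round trip to the cloud.
    return {"count": len(readings),
            "average": mean(readings),
            "peak": max(readings)}

samples = [72, 75, 110, 68]
print(fog_handle(samples)[2])  # prints: {'category': 'high', 'raw': 110}
print(edge_handle(samples))    # prints: {'count': 4, 'average': 81.25, 'peak': 110}
```

Notice that the fog output is still raw material awaiting interpretation, whereas the edge output is already a usable result, which is exactly the distinction drawn above.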
Out of the Fog and Into the Mist
As if to give us all the vapors, there's still another metaphoric layer to the general haziness of fog vocabulary, and that's mist computing. Located, as you might suspect, at the edge of the cloud's edge (and therefore referred to as "the extreme edge" in the latest technese), mist refers to the use of microcontrollers and microcomputers that interface with fog nodes and provide cheaper alternatives to fog devices for receiving raw data inputs, perhaps even monitoring and managing how fog devices are handling the data. This means mist nodes can range from simpler sensors in "smart" homes that check temperature for heating and cooling adjustments and daylight for turning on interior home illumination, to smartphones users can carry with them to keep tabs on all kinds of things.
Nearly always physically present within the environments they are monitoring, mist nodes can be smart enough to help automate their own provisioning and deployment, as well as to process some inputs for uploading. Such usefulness will likely have a greater impact on our lives over the next few years, even as it further blurs any effort to draw neat lines between the aqueous-monikered computing types, no matter what we may, um…"dew."
The Cutting Edge of Edge Taxonomy
Or say, for that matter. The Linux Foundation's LF Edge, an organization established in 2020 to "budget and spend funds in support of various open source projects relating to development of an edge computing software stack, including infrastructure and support initiatives," recast edge computing as a continuum in one of its foundational white papers. This was done at least in part to help nail down a vocabulary for this branch of computing technology.
In a nutshell, the foundation's taxonomy white paper separates edge into three major groupings, moving outward from the user:

- The "User Edge" is made up of a "Constrained Device Edge" (i.e., microcontroller-based devices), a "Smart Device Edge" (headless IoT devices and end-user client computing resources), and an "On Premise Data Center Edge" (server-based computers in secure locations).
- The "Service Provider Edge," separated from the User Edge by "Last Mile Networks," is made up of an "Access Edge" (server-based computing at telco networks and edge exchange sites) and a "Regional Edge" (server-based computing at regional telco and direct peering sites).
- "Centralized Data Centers" (server-based computing in traditional cloud data centers) sit beyond the Service Provider Edge, separated from it by an "Internet Edge."

("Headless IoT," by the way, means devices without a UI; "last mile" refers to the final leg of any telecommunications link; and "direct peering sites" are independent Internet networks that voluntarily enable communications between their individual users.) This continuum is also designed to support LF Edge's advocacy of a system of applications for interconnecting edge ecosystems, a concept it refers to as "B2B2X."
Whether LF Edge's entire nomenclature system will catch on is beyond this writer's predictive powers, but it does give a sense of how much the whole field of fog computing is expanding like a roiling thunderhead on a summer's day.
Ice Cream Castles in the Air
Be that as it may, edge computing appears to be heading us toward a Jetsons-like future in which fog capabilities will eventually be woven into the very fabric of our lives. LF Edge defines the major edge "ecosystems" as Industrial, Enterprise, Web Mobile, Brick & Mortar Retail, Home, Auto, and Cities.
There are already edge-enabled applications popping up all around us. Large-scale patient monitoring has expanded during the COVID-19 pandemic and may become crucial in coming years as Baby Boomers and their kids reach a nursing-home stage of life. Autonomous cars, although they still have some important safety bugs to work out, will eventually come to pass, and edge processors will be front and center there. The oil and gas industry already makes extensive use of industrial sensors via well-site monitoring devices, a situation that will likely last until someone figures out how to build a fusion reactor.
"Smart" homes are still a novelty, but many people alive today will probably live in one at some point in their lives, turning on the lights, the stove, and heating or cooling just in time for getting home from work. Automated and robotic processes and devices will provide better means of remote control of industrial and manufacturing activities. Wearable cognitive-assistance apps are already showing up in TV commercials for implanted blood-glucose-level testers that can be read with a smartphone. Immersive experiences will get even more realistic via virtual reality, immersive reality, and 360 video. Artificial intelligence will get even more intelligent when an AI app can have access to feedback from sensors reading user reactions. "Smart Cities" are cities that, for example, use feedback from traffic cameras to time signal lights to regulate automobile traffic flows during peak use times.
One doesn't need to be a prophet to predict these changes. Early applications and prototypes for all of these examples already exist. Cloud computing has led us to the edge. It just remains to take the plunge…so long as we all understand what we are talking about.
MC Press Online