This desire to minimize bandwidth and RF power expenditures has resulted in the emergence of Edge Computing, where the intelligence is placed in the sensor or other remote subsystem involved. This significantly reduces communication with the Cloud, as messages can be limited to scheduled or situation-related reporting. For example, instead of streaming massive amounts of data to a server to recognize a face, the device can perform the recognition locally and send only a brief "face found" message.
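The pattern described above can be sketched in a few lines. This is an illustrative example, not a real API: `detect_face` stands in for an actual on-device inference model, and the event format is an assumption.

```python
from typing import Optional

def detect_face(frame: bytes) -> bool:
    """Placeholder for on-device inference (hypothetical stand-in model)."""
    return b"face" in frame  # a real device would run a trained classifier here

def process_frame(frame: bytes) -> Optional[dict]:
    """Run inference at the edge; uplink only a compact event on a hit."""
    if detect_face(frame):
        # A few dozen bytes of event data replace the entire raw frame.
        return {"event": "face_found", "frame_bytes": len(frame)}
    return None  # nothing worth transmitting; no Cloud traffic at all

print(process_frame(b"...face..."))   # small event dict instead of the raw frame
print(process_frame(b"empty scene"))  # None: frame is discarded locally
```

The bandwidth saving comes from the asymmetry: frames arrive continuously, but events are rare and tiny.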
When you have tens of billions of IoT devices, if all they're doing is capturing data and sending it back, there isn't much need for local processing. But if a device is working on something that inherently produces a lot of extraneous, useless data, shipping it all through the Cloud to the data center isn't very cost-effective. Add to that the fact that a smart node can be a more secure node, and the argument for Edge Computing becomes all the more compelling.
In reality it will be a mix, as no one solution fits all. A similar argument was made in the old-timey days, when the server resided in the basement and dumb terminals sat on the desktop. When IBM was introducing the PC to the workplace, there was a debate over the wisdom of abandoning "thin client" architectures for smart desktops. Then as now, putting computing power as close to the point of application as possible is a real force multiplier.
Another significant aspect of Edge Computing is that it largely eliminates latency issues. This matters most in safety and security situations, where immediate response as well as robust and reliable operation are critical. Having to wait for data to be sent to, processed by, and returned from the Cloud can result in an unacceptable time lag, exacerbated by network latency. For anything that inherently requires low latency, Edge processing has an obvious advantage over Core processing.
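A simple latency budget makes the point concrete. The figures below are illustrative assumptions, not measurements: a hard 100 ms response deadline, an on-device path of roughly 20 ms, and a Cloud round trip of uplink plus server compute plus downlink.

```python
def within_deadline(path_latency_ms: float, deadline_ms: float = 100.0) -> bool:
    """Can a given response path meet a hard safety deadline?"""
    return path_latency_ms <= deadline_ms

edge_ms = 20            # assumed on-device inference time
cloud_ms = 50 + 20 + 50  # assumed uplink + server compute + downlink

print(within_deadline(edge_ms))   # edge path fits the budget
print(within_deadline(cloud_ms))  # cloud round trip misses it
```

Note that the Cloud path fails even with generous numbers; congestion or retransmissions only widen the gap.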
The RF Environment
However, while Edge Computing is a powerful tool for addressing the burgeoning demands of the IoT, the wireless ecosystem also needs attention. Another way to address the growing demands of a Cloud-based IoT is to use multiple bands to serve various application spaces. Zigbee, Bluetooth, LoRa, 5G, and WiFi are all jockeying for position to serve as many application spaces as their solutions can support.
This creates a need to organize, integrate, and harmonize these various solutions to create a seamless operating environment for IoT devices. Separate proprietary solutions without an integration path can create areas of incompatibility in the Cloud that reduce its reliability and utility. The best set of solutions gives users choice within the system, with the ability to switch between bands based on need and accessibility.
A praiseworthy effort in this direction is the collaboration between the WiFi and LoRa camps, which enables connection flexibility between the two solutions. WiFi 6 and LoRa radios can now coexist in a single IoT device, which can switch between the two based on the requirements of the moment. For example, a smart sensor could use LoRa for routine reporting and WiFi for over-the-air updates, engaging the high-bandwidth medium only when needed.
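The selection logic for such a dual-radio sensor might look like the following sketch. The radio names, the latency flag, and the payload threshold are illustrative assumptions (222 bytes is a typical LoRaWAN payload limit at higher data rates, but the exact figure varies by region and data rate); this is not a real driver API.

```python
LORA_MAX_PAYLOAD = 222  # bytes; typical LoRaWAN limit, varies by region/data rate

def pick_radio(payload_size: int, latency_sensitive: bool = False) -> str:
    """Choose the cheapest radio that can carry the message."""
    if payload_size <= LORA_MAX_PAYLOAD and not latency_sensitive:
        return "lora"  # low-power, narrow pipe: routine telemetry
    return "wifi"      # high-bandwidth medium: firmware images, bulk data

print(pick_radio(48))                            # small sensor report
print(pick_radio(500_000))                       # OTA firmware update
print(pick_radio(48, latency_sensitive=True))    # alarm that must arrive fast
```

The design choice is to treat WiFi as the fallback rather than the default, so the power budget is dominated by the cheaper LoRa link.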
The growing importance and scope of the IoT has placed significant demands on the design and engineering community to create new solutions that address this myriad of needs. Proper integration of Edge Computing, leveraged by responsible coordination among service providers within the Cloud, will enable a strong and robust IoT ecosystem that benefits both users and providers.