Hardware Design for Edge Computing vs. Cloud Computing
The next great developments in IoT products are in edge computing.
The term “cloud computing” is now a regular piece of technical vocabulary, and for some time it was viewed as the solution to every data-intensive computing problem. While cloud computing certainly gives many developers a simple way to create value and scale quickly, edge computing is quickly becoming a viable alternative to deploying software applications in the cloud. In fact, the two approaches are not mutually exclusive; both can be used as part of a solution to real technical problems.
If you’re a hardware designer, then there are some important points to consider when choosing to deploy a solution on the cloud or on the edge. Deploying on the edge means your application runs on devices at the “edge of the network” as part of the Internet of Things. Software and hardware developers both have a role to play in enabling new edge computing applications, but none of them will be possible without the right hardware design tools.
Comparing Edge Computing vs. Cloud Computing
The primary environment for cloud computing is in data centers, where large clusters of computers are centralized in a single location. Computing power and storage at data centers are accessible from anywhere through a web or mobile application, providing users access to computing resources without having to purchase the equipment themselves. While cloud computing is economically efficient, it quickly becomes computationally inefficient; all data must travel to the cloud for processing and back to the end user, which adds round-trip latency, uses excessive power and bandwidth, creates privacy and security vulnerabilities, and carries scaling problems.
Edge computing is different in that computing is decentralized and distributed across a large number of devices in different locations, i.e., at the edge of the network. Edge computing is rather inefficient from a data storage perspective due to potential downtime, the inability to back up large amounts of data in multiple locations, and limited data storage on individual mobile and IoT devices. Cloud storage will likely continue to dominate decentralized storage over time.
However, edge devices quickly become much more efficient than the cloud from a data processing perspective, in that data does not need to be transferred back and forth between a computing node on the network edge and a data center. With edge computing, data can be processed in the device that collected it and the results can be used by the device immediately, rather than sending the data to a data center and waiting for a response. The ability to process data on the device that collected it enables data-intensive, storage-light applications in artificial intelligence, remote sensing, smart cities, digital manufacturing, and many other areas.
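To make the distinction concrete, here is a minimal Python sketch of the edge-side pattern: raw samples are processed locally and only a small summary leaves the device. The sensor read and the cloud uplink are simulated stand-ins for illustration, not any specific device’s API.

```python
import json
import random
import statistics
import time

def read_sensor() -> float:
    """Simulated temperature reading; replace with real sensor I/O."""
    return 20.0 + random.random() * 5.0

def publish_to_cloud(payload: dict) -> None:
    """Stand-in for the device's uplink (MQTT, HTTP, LoRaWAN, etc.); here it just prints."""
    print(json.dumps(payload))

def edge_loop(window_seconds: int = 60, sample_hz: int = 10) -> None:
    """Process raw samples on the device and transmit only a small summary."""
    # A real device would pace these reads at sample_hz; the sketch gathers them at once.
    samples = [read_sensor() for _ in range(window_seconds * sample_hz)]
    summary = {
        "timestamp": int(time.time()),
        "mean": round(statistics.mean(samples), 2),
        "min": round(min(samples), 2),
        "max": round(max(samples), 2),
        "n_samples": len(samples),
    }
    # Only a handful of values leave the device instead of hundreds of raw samples.
    publish_to_cloud(summary)

if __name__ == "__main__":
    edge_loop()
```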
Edge computing is related to another term called “fog computing,” where a small cloud network is created from a collection of edge computing devices, and this edge network can still take advantage of cloud storage. The edge computing devices are responsible for gathering and processing data at remote sites, exchanging data directly with each other or through a nearby gateway, and sending their results back to a cloud data center. This removes much of the initial computational burden from the data center and allows edge devices to immediately act on the data they collect.
Edge computing takes the processing power concentrated in a data center and distributes it across multiple devices in different locations.
As an example, The Things Network, which operates using LoRaWAN nodes and dedicated gateways, allows a point-to-point network to be created from LoRaWAN nodes that pass data to each other and cooperate in processing. The data can then be sent to a centralized repository for storage and further processing through a LoRaWAN gateway. There has been significant research into using IoT devices in LoRaWAN networks for fog computing.
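As a hedged illustration of how an application might consume data collected through such a network, the sketch below subscribes to uplink messages from The Things Network over MQTT using the paho-mqtt package. The application ID, API key, broker address, and topic format are placeholders; check the current TTN documentation for the exact values for your region and stack version.

```python
import json

import paho.mqtt.client as mqtt

APP_ID = "my-edge-app"                   # hypothetical application ID
API_KEY = "NNSXS.EXAMPLE-KEY"            # hypothetical API key
BROKER = "eu1.cloud.thethings.network"   # region-specific; verify for your deployment

def on_message(client, userdata, message):
    """Called for every uplink forwarded by a LoRaWAN gateway to the network."""
    uplink = json.loads(message.payload)
    print("Uplink received from:", uplink.get("end_device_ids", {}).get("device_id"))

client = mqtt.Client()  # paho-mqtt 2.x additionally requires a CallbackAPIVersion argument
client.username_pw_set(f"{APP_ID}@ttn", API_KEY)
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(f"v3/{APP_ID}@ttn/devices/+/up")
client.loop_forever()
```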
Designing for Edge Computing
You should keep the following points in mind if you are considering designing a custom edge computing device:
- Processing power. One of the greatest challenges facing edge computing devices is processing the large quantities of data gathered at remote locations in near real time. Any edge device should include a sufficiently fast processor to quickly process and extract value from the data it gathers.
- Power source. Edge computing devices may be deployed as mobile devices, requiring a stable, rechargeable power source and on-board power regulation; this ensures uptime and prevents the power integrity problems that often arise at the PCB level. You may also want to support Power over Ethernet (PoE) if your device will be connected directly to a LAN.
- Memory. Depending on the connectivity at your remote site, an edge computing device may need to store some data for a significant period of time before it can be transferred back to a base station, gateway, or the cloud. You’ll need to estimate how much data accumulates during a connectivity gap and include the appropriate amount of memory in your system (see the sizing sketch after this list).
- Wireless connectivity. Not every edge device will connect over a wireless network; those that don’t will need an Ethernet connection to a LAN or broadband internet. Otherwise, you’ll need to include a wireless module on your board so that data can be sent to a base station, or so that one edge device can communicate directly with other edge devices as part of a larger network.
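As a quick illustration of the memory point above, a back-of-envelope calculation like the following (with purely illustrative sample rates, record sizes, and outage durations) shows how to estimate the local storage a device needs to ride out a connectivity gap.

```python
# Back-of-envelope sizing for on-device storage during a connectivity outage.
# All numbers below are illustrative assumptions; substitute your own.

SAMPLE_RATE_HZ = 100     # sensor samples per second
BYTES_PER_SAMPLE = 16    # timestamp + value + metadata
OUTAGE_HOURS = 24        # longest gap before data can be uploaded
OVERHEAD_FACTOR = 1.5    # headroom for file-system and framing overhead

samples = SAMPLE_RATE_HZ * 3600 * OUTAGE_HOURS
required_bytes = samples * BYTES_PER_SAMPLE * OVERHEAD_FACTOR

print(f"Samples to buffer: {samples:,}")
print(f"Local storage needed: {required_bytes / 1e6:.1f} MB")
```

With these example numbers, the device accumulates about 8.6 million samples and needs roughly 200 MB of local storage, which helps decide between on-chip flash, an eMMC part, or removable media.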
Building Edge Computing Devices with Modular Design Tools
Modular design tools are taking a cue from modular and graphical software development tools, allowing hardware designers to quickly create powerful systems from standard computers-on-module (COMs). Once you add wireless or Ethernet connectivity to your board, you have a system that can act as an edge computing device as part of a larger IoT network.
Consider the case of miniNodes, who created lightweight dedicated servers that run on Linux and ARM processors. Using the modular design tools in Upverter, they quickly built a prototype carrying five ARM-based Raspberry Pi modules; miniNodes later moved to full-scale production. This type of system is ideal for use as a gateway in an edge computing network, and it contains enough processing power to easily handle computationally intensive tasks in a variety of applications.
This miniNodes board was created with the modular hardware design tools in Upverter
The modular design tools in Upverter allow any hardware designer to quickly build an edge computing node by arranging standard COMs on a board. You don’t need to manually define connections between modules on your board; these interconnects are already well-defined and are automatically generated by Upverter. This allows an engineer who may not be a professional PCB designer to quickly create a customized, cutting-edge system without becoming mired in the finer points of PCB design.
If you’ve decided to build a network of devices for edge computing vs. cloud computing, the modular design tools in Upverter can help you quickly create cutting-edge, fully functional modular hardware systems in a browser-based design interface. You’ll have access to industry-standard COMs and popular modules, allowing you to create production-ready hardware for nearly any application.
Take a look at some Gumstix customer success stories or contact us today to learn more about our products, design tools, and services.