“Edge computing is not a cloud killer, nor will it ever be. It is, however, driven by the emerging needs of the IoT industry, and it may well see massive adoption in the coming years, powering next-generation technologies from connected cars and smart drones to manufacturing.” The NZ IoT Alliance views IoT and Edge Computing as synergistic exponential technology environments and welcomes further discussion on the intersection of the two.
The Internet of Things (IoT) is taking almost every industry – and every household – by storm. Gartner expects the number of connected IoT devices to almost double between 2018 and 2020, reaching 20 billion. This figure is likely to rise further with the expected arrival of 5G, which could accelerate IoT adoption dramatically. This growth will be accompanied by a drastic increase in the amount of structured and unstructured data.
Figure 1: Cambrian explosion of Data [Courtesy of EE Times]
This large amount of data also brings challenges, chief among them the ability to process it. Cloud computing might not be the ideal solution, for several reasons:
a. Latency – Sending data across long routes slows down the system's response. Low response times are especially important in mission-critical applications.
b. Cost – Ingesting, storing and processing data in the cloud is expensive. A self-driving car can generate around 1 GB of data per second; pushing that volume through a cloud architecture could be prohibitively expensive for many firms.
c. Data privacy – As IoT devices collect ever more data, privacy becomes a growing concern. According to the EIU report, “What the Internet of Things means for consumer privacy”, 92 percent of people want to control what personal information is automatically collected, while 74 percent are concerned that small privacy invasions may lead to a loss of civil rights.
However, there is a solution that can tackle these challenges – Edge computing.
What is Edge Computing?
In cloud computing, processing power is centralized: all data must travel from the device to cloud servers to be processed. Edge computing, on the other hand, pushes data collection and analysis out to the point of origin – the IoT devices and sensors themselves – as opposed to a data centre or the cloud. Edge computing does not mean the end of the cloud, but rather a more disciplined approach to data storage and processing.
Edge computing is ideal for the expanding IoT arena for five reasons:
1. Increased Availability/Reliability: Decentralizing computing power ensures that other nodes and associated IoT assets remain operational even if one edge IoT device fails.
2. Lower Data Management Costs: Storing and analyzing most of your data at the edge can cut cloud storage and computation costs; only the data that needs to be analysed in aggregate is transferred to the cloud.
3. Improved Privacy and Security: Especially in the case of sensitive data such as health metrics or defence equipment readings, there are legal, ethical and privacy restrictions on transmitting data over networks or storing it in the cloud. Edge computing can reduce this risk while providing benefits similar to cloud computing.
4. Reduced Latency: With most data processed at the edge, responses are faster and more bandwidth is freed for other processes. Edge computing is also the ideal solution where millisecond lags can cause serious damage.
5. Intelligence for Remote Sites: Edge computing can provide intelligence in places where cloud connectivity is unstable or uneconomical – forests, remote oil fields and so on.
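To make the cost point concrete, here is a minimal Python sketch of an edge node that buffers raw sensor readings locally and forwards only periodic aggregates. `EdgeNode` and its `uploads` list are illustrative stand-ins for a real device and cloud client, not any particular SDK.

```python
import statistics

class EdgeNode:
    """Buffers raw sensor readings locally; only aggregates leave the device."""

    def __init__(self, window_size=60):
        self.window_size = window_size
        self.buffer = []
        self.uploads = []  # stand-in for a real cloud client

    def ingest(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            self.flush()

    def flush(self):
        # Summarize the window and upload only the summary.
        summary = {
            "count": len(self.buffer),
            "mean": statistics.mean(self.buffer),
            "max": max(self.buffer),
        }
        self.uploads.append(summary)
        self.buffer = []

node = EdgeNode(window_size=60)
for t in range(600):                    # 600 raw readings...
    node.ingest(20.0 + (t % 60) * 0.1)  # e.g. a temperature sensor
print(len(node.uploads))                # -> 10 cloud uploads instead of 600
```

Sixty readings collapse into one small summary record, illustrating how edge-side aggregation cuts both bandwidth and cloud ingestion costs.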
Figure 2: How Edge computing works [Courtesy of Network World]
Where is the “Edge” in Edge IoT?
The “Edge” is a notional location where a data processing resource can be accessed in the minimum amount of time. That resource could be an on-premise server, or it could be the sensor/IoT device itself. Examples of the server edge are AWS Greengrass and Azure IoT Edge.
However, “edge-device” processing is a challenge. Most AI algorithms need huge computing power to extract results from huge amounts of data. For this reason, they rely on cloud servers to perform their computations and cannot accomplish much at the edge – the mobile phones, computers and other devices where the applications that use them run. This is especially true of advanced deep learning models, for which regular chips lack the necessary processing power.
Movidius Neural Compute Stick (NCS)
Enter the Movidius Neural Compute Stick, described by Intel as the world's first USB-based deep learning inference kit and self-contained AI accelerator for the edge. The Movidius stick enables a wide range of AI applications to be deployed at the edge – think of it as a supercomputer at the edge. With the tiny, fan-less Movidius Neural Compute Stick, we can run advanced deep learning models for tasks such as image recognition or drone control, optimise existing workloads for AI, and deploy virtual and augmented reality applications. The stick currently supports two deep learning frameworks: TensorFlow and Caffe. It lets us run complex deep learning models such as SqueezeNet, GoogLeNet and AlexNet on a computer with low processing capability.
As shown in Figure 3, training of the model can be done using superior computing resources, while inference during deployment can be done on the Movidius Neural Compute Stick itself – one of the few edge AI accelerators commercially available at the time of writing.
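The training/inference split in Figure 3 can be illustrated with a deliberately tiny model. This pure-Python sketch fits a toy linear model “in the cloud” by gradient descent, then ships only the learned weights to a cheap forward pass at the edge; the function names are hypothetical, and a real deployment would export a compiled deep learning graph rather than two floats.

```python
# Toy illustration of the train-in-cloud / infer-at-edge split (not Movidius-specific).

def train_in_cloud(data, epochs=2000, lr=0.01):
    """Fit y = w*x + b by gradient descent on a powerful machine."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return {"w": w, "b": b}  # export: only the weights ship to the edge

def infer_at_edge(model, x):
    """Cheap forward pass: all the edge device needs is the trained weights."""
    return model["w"] * x + model["b"]

model = train_in_cloud([(1, 3), (2, 5), (3, 7)])  # learns y = 2x + 1
print(round(infer_at_edge(model, 10)))            # -> 21
```

The expensive loop runs once on big hardware; the edge device only ever executes the two-multiply forward pass, which is exactly the asymmetry the NCS exploits.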
Figure 3: Movidius NCS: Training vs Inference
Who else is developing Edge Computing AI chips?
Amazon has started designing a custom artificial intelligence chip to power future Echo devices and improve the quality and response time of its Alexa voice assistant. Currently, the Amazon Echo, like other voice assistants such as Google Home, has to process speech by sending a compressed version of it to the cloud for inference. Edge computing with AI chips will provide lower latency, faster responses and an improved user experience. This might give Amazon an edge in the smart-home hardware market.
ARM recently announced its new Cortex-A76 architecture, which is claimed to boost the processing of AI and ML algorithms on edge computing devices by a factor of four. This does not include the ML performance gains promised by the new Mali-G76 GPU. These developments are part of the first phase of ARM's Project Trillium initiative for AI.
Baidu has announced the Kunlun series of processors, designed specifically for AI training and inference. The Kunlun 818-300 will be used for training, and the 818-100 for inference.
Google is investing heavily in its Edge Tensor Processing Unit (TPU). These chips are aimed at enterprise tasks in factories and manufacturing units, where they will be used for super-fast image recognition and quality-control checks.
Similar AI accelerator initiatives are under development at other chip manufacturers such as Qualcomm, MediaTek and CEVA.
In the manufacturing arena, AI and edge applications show great potential for the development of the smart factory, or Industry 4.0 (shown in Figure 4). The smart factory promises greater autonomy for machines and the capability for machines to communicate with each other, allowing them to make decisions without human intervention. Smart factories would thus enable manufacturers to develop new, higher-quality goods faster.
Imagine thousands of products passing along an assembly-line conveyor belt every minute, when suddenly one of the products falls flat and risks damaging the machine. An image recognition system could identify the situation and stop the production line to prevent extensive damage to the machinery. It could also notify the other machines in the operation about the failure and pause production to prevent overflow.
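That detect-and-halt loop can be sketched in a few lines of Python. Here `classify_frame` is a hypothetical stand-in for a real on-device image recognition model, and `LineController` is an illustrative name, not a real PLC API.

```python
def classify_frame(frame):
    """Stand-in for an edge image-recognition model inspecting the belt."""
    return "fault" if frame.get("fallen") else "ok"

class LineController:
    """Edge controller: stops the belt locally, with no cloud round-trip."""

    def __init__(self):
        self.running = True
        self.alerts = []

    def on_frame(self, frame):
        if classify_frame(frame) == "fault":
            self.running = False  # halt the conveyor immediately
            # Notify neighbouring machines so upstream production pauses too.
            self.alerts.append("halt: product fault detected")

controller = LineController()
for frame in [{"fallen": False}, {"fallen": False}, {"fallen": True}]:
    controller.on_frame(frame)
print(controller.running)  # -> False
```

Because the decision is made on the device, the belt stops within one frame rather than after a cloud inference round-trip.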
Figure 4: Industry 4.0 (Courtesy of Christoph Roser at AllAboutLean.com)
The other area where edge computing can shine is remote monitoring. For locations without proper internet connectivity, or even without a cellular signal, edge computing can provide on-site computing resources at low cost. On a remote oil field, for example, edge computing can be used to make real-time predictions about machinery failure, rather than waiting up to 24 hours for a satellite to pass over the site, collect the data, process it and return the results.
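A hedged sketch of that on-site prediction: the risk score below is a deliberately crude heuristic (fraction of recent vibration readings above a safe limit), standing in for whatever trained model a real oil-field deployment would run at the edge.

```python
def failure_risk(vibrations, threshold=0.8):
    """Crude on-site risk score: fraction of the last 10 readings above the safe limit."""
    recent = vibrations[-10:]
    return sum(v > threshold for v in recent) / len(recent)

# Vibration readings trending upward on a remote pump.
readings = [0.3, 0.4, 0.5, 0.7, 0.9, 0.95, 1.0, 1.1, 0.85, 0.9]
risk = failure_risk(readings)
if risk > 0.5:
    # Raised immediately on-site, not after the next satellite pass.
    print("maintenance alert raised on-site")
```

The alert fires the moment the trend crosses the threshold, instead of a day later when the satellite uplink finally delivers the data.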
Edge computing can also play an important role in protecting critical infrastructure that requires split-second decisions. Substations are an important example: a few seconds of delay can cause huge damage. With edge computing, substations can be given the smarts to make decisions on-site in milliseconds.
Edge computing, still in its infancy, is not a cloud killer, and it never will be. It is, however, driven by the emerging needs of the IoT industry, and it may see massive adoption in the coming years, powering next-generation technologies from connected cars and smart drones to manufacturing.