#4 – IoT drives AI to the edge
Although cloud-based AI has taken centre stage in media attention, another, simpler and distributed, type of AI is emerging – edge AI, driven by demand from various IoT applications. Running simpler AI on the device itself brings many advantages, such as reducing data consumption and providing the low latency needed for real-time decision making. There are challenges in terms of device cost and energy consumption, but with advances in AI processors and new AI algorithms suited to edge deployment, the solutions and application areas are expanding. We foresee that in 2018 edge AI will take an important role in AI development.
Driven by advances in processing hardware and AI algorithms, machine learning (especially deep learning) is addressing many complex problems and has even outperformed humans on some tasks. So far, however, the training and deployment of this type of AI has been limited to expensive data centres, providing services that we take for granted, such as Google Translate or facial recognition. This complex, cloud-based AI still captures the imagination and dominates the media headlines. However, another type of simpler, distributed AI is quickly emerging, driven by demand from various IoT applications – edge AI.
Granted, the massive amounts of data collected by IoT devices still need cloud-based AI to process and analyze them, but there are also many unique advantages to running simpler AI algorithms directly at the edge – on the device itself. For instance, processing data locally reduces data consumption and ensures continuity in case of network downtime. It can improve security and privacy, since data is processed and anonymized locally. Finally, edge AI eliminates communication latency and thus enables real-time decision making.
There are many IoT applications that can benefit from simple edge AI, ranging from motion trackers for elderly people that detect when they have a fall, to preventive maintenance that spots anomalies in a machine's performance and shuts down the equipment before an incident occurs. But edge AI is not limited to simple algorithms: even trained neural networks can be deployed on edge devices, such as an algorithm running on a USB stick-sized device to identify particular objects in live video streams, or a self-driving algorithm in an autonomous car.
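To give a feel for how lightweight such edge algorithms can be, here is a minimal sketch of a rolling z-score anomaly detector of the kind a preventive-maintenance sensor might run. The class name, window size and threshold are illustrative assumptions, not taken from any particular product:

```python
from collections import deque
import math

class EdgeAnomalyDetector:
    """Rolling z-score detector small enough for a microcontroller-class device.

    Illustrative sketch: the window size and threshold are assumed values,
    not parameters from any real product.
    """

    def __init__(self, window=50, threshold=3.0):
        self.samples = deque(maxlen=window)  # recent "normal" readings
        self.threshold = threshold           # how many std deviations count as anomalous

    def update(self, value):
        """Feed one sensor reading; return True if it deviates strongly from recent history."""
        anomalous = False
        if len(self.samples) == self.samples.maxlen:
            mean = sum(self.samples) / len(self.samples)
            var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomalous = True
        if not anomalous:
            self.samples.append(value)  # only learn from readings judged normal
        return anomalous
```

A device running this loop only needs to transmit the rare anomalous readings upstream (or trigger a local shutdown), which is exactly how edge processing cuts both latency and data consumption.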
Most IoT applications need cheap and energy-efficient devices, the complete opposite of the environment where deep learning is typically deployed. To address these challenges, AI algorithms with completely different approaches are being developed for edge deployment by companies ranging from start-ups to technology giants like Microsoft. On the device side, chipset manufacturers such as ARM, Intel and Qualcomm are pushing strongly into AI processors, with each new product launch bringing better processing power, a smaller footprint and lower energy consumption.
Compared to a traditional "dumb" IoT device that relies on cloud-based AI for its data analysis, a device with edge AI can be more expensive. Over time, however, edge AI can still be more cost-efficient, since processing data locally reduces the cost of connectivity. In the end, the business case decides where to deploy the intelligence, but the options exist and the solutions and applications keep growing.
Considering the above, we predict that in 2018 edge AI will assume an important role in wider AI development, driven both by the maturity of product offerings and by the demand for IoT devices with built-in intelligence.