Bringing AI to Sensors in the Age of Edge AI


2Q 2019 | IN-5474

As edge devices, such as smartphones, surveillance cameras, and smart home devices, become more powerful, they start to feature embedded Artificial Intelligence (AI) inference capabilities. This characteristic has started to move into industrial and environmental sensors, which did not previously have processing capabilities sufficient to perform AI inference. This insight looks at the commercial solutions available to make sensors smarter as AI moves to the edge.


Bragi nanoAI is Designed to Bring Embedded AI to Sensors

NEWS


At Mobile World Congress (MWC) 2019, Bragi, a hearable vendor, announced its AI stack, which consists of nanoSYSTEMS, a sensor-rich hardware platform designed for low-power edge computing devices; nanoOS, Bragi’s proprietary, hardware-agnostic, modular OS; and nanoAI, Bragi’s machine learning-based sensor fusion technology for embedded systems. One of the key markets Bragi intends to target is sensors deployed in industrial manufacturing and smart cities. By processing data acquired by these sensors with nanoAI, Bragi gives its clients edge-based AI inference with near-zero latency for behavioral tracking, predictive maintenance, and anomaly detection.

Bringing AI to the Sensors

IMPACT


Bragi’s solution is a good indication that AI is moving from the cloud to the edge. Because most edge sensors and devices lack the processing capabilities and cloud connectivity that conventional AI requires, a new set of AI systems must be designed specifically for AI applications at the edge. At ABI Research, we break these systems down into four major categories:

  • Sensor-based system – Refers to a system where a device has no connectivity to any other platform and is a standalone platform in itself. In this circumstance, for AI to take place, both training and inference must be performed on the device, as there is nowhere else for the compute to occur.
  • Device-connected system – Refers to a system where the device has some means of connectivity – for instance, through Wi-Fi, Bluetooth, or LTE. This means that inference and training can take place either on the device or at a different point in the technology stack, such as the cloud.
  • Gateway-connected system – Refers to a system where a device relies on a gateway for some or all of its compute. The gateway may in turn be connected to a cloud or local server, in which case training and inference can be distributed across these tiers.
  • On-premises server system – Refers to a system where devices are connected directly to a company’s local server infrastructure. In this instance, training and inference are likely to be distributed across the device and the local server.
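As a rough illustrative sketch (not a formal ABI Research model), the four categories above can be summarized by mapping each system type to the tiers where training and inference may run. The tier names and the mapping below are our own shorthand:

```python
from enum import Enum, auto

class Tier(Enum):
    """Places in the technology stack where AI compute can run."""
    DEVICE = auto()
    GATEWAY = auto()
    ON_PREM_SERVER = auto()
    CLOUD = auto()

# Illustrative mapping: for each system category, the tiers where
# training and inference can plausibly take place.
EDGE_AI_SYSTEMS = {
    "sensor-based":      {Tier.DEVICE},  # standalone: everything on-device
    "device-connected":  {Tier.DEVICE, Tier.CLOUD},  # Wi-Fi/Bluetooth/LTE link
    "gateway-connected": {Tier.DEVICE, Tier.GATEWAY,
                          Tier.ON_PREM_SERVER, Tier.CLOUD},
    "on-premises":       {Tier.DEVICE, Tier.ON_PREM_SERVER},
}

def can_run_on(system: str, tier: Tier) -> bool:
    """Check whether a given tier is available for compute in a system category."""
    return tier in EDGE_AI_SYSTEMS[system]
```

The key distinction the mapping captures is that only the sensor-based category forces all compute onto the device itself.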

All of these systems are designed to satisfy several key requirements for edge AI. Edge sensors and devices often acquire large amounts of data. If AI training and inference occur in the cloud, information has to travel back and forth between the cloud and edge devices. This raises various concerns, including connectivity cost, data security, data privacy, edge storage, and power consumption. Chief among them is latency. Given that the more mature AI applications currently come from smart manufacturing and public safety, where AI makes critical decisions in areas such as quality inspection, surveillance, and alarm management, any latency in the system can result in significant damage or loss. Moving AI to the edge mitigates vulnerabilities and risks such as unreliable connectivity, data loss, and delayed responses.

Bragi’s nanoAI solution is well suited to the sensor-based systems mentioned above. As sensor-based systems are, more often than not, standalone systems, smart sensors require an AI inference engine that is lightweight enough to fit in their limited storage and highly accurate with little to no resource overhead. This allows smart sensors to perform AI inference fully onboard, without relying on external resources such as gateways, on-premises servers, or the cloud, and ensures data privacy, security, and reliable performance regardless of connectivity status.
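To make fully onboard inference concrete, the following is a toy sketch of an anomaly detector small enough to run on a constrained sensor. It is not Bragi’s actual nanoAI, whose internals are proprietary; it simply illustrates that all state and compute can stay on the device, with no raw data leaving it:

```python
# Toy on-device anomaly detector: flags readings that deviate from a
# running mean by more than k standard deviations. All state fits in
# three numbers, so no raw samples are stored or uploaded.

class OnboardAnomalyDetector:
    def __init__(self, k: float = 3.0):
        self.k = k          # sensitivity threshold, in standard deviations
        self.n = 0          # number of samples seen so far
        self.mean = 0.0     # running mean
        self.m2 = 0.0       # running sum of squared deviations (Welford)

    def update(self, x: float) -> bool:
        """Ingest one sensor reading; return True if it looks anomalous."""
        if self.n >= 2:
            std = (self.m2 / (self.n - 1)) ** 0.5
            anomalous = std > 0 and abs(x - self.mean) > self.k * std
        else:
            anomalous = False  # not enough history to judge yet
        # Welford's online update keeps the statistics without raw history
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```

A production inference engine would typically run a compact, quantized neural network instead of simple statistics, but the design principle is the same: every byte of state and every cycle of compute remains on the sensor.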

Advancements in the Wider Ecosystem

RECOMMENDATIONS


At the same time, the AI inference engine needs hardware accelerators dedicated to AI processing. For example, Bragi partners with Syntiant, an Application-Specific Integrated Circuit (ASIC) manufacturer focused on Natural Language Processing (NLP). Aside from Syntiant, other ultra-low-power edge AI silicon is available on the market, including GreenWaves’ GAP8 processor and Lattice’s sensAI stack. All of these chipset companies are targeting edge sensors and devices that are deployed at large scale with low installation and operating costs. Moving forward, the field is getting crowded as cloud players such as Google and Huawei introduce their own edge AI chipsets to address low-power AI applications. As cloud-based AI applications become more commoditized, companies have identified the edge as a key differentiator for their solutions.

At the moment, most of these solutions, including Bragi’s AI stack, are still in the early stages of commercial deployment in smart cities and smart manufacturing, used mainly for asset tracking and anomaly sensing, and have yet to achieve large-scale adoption. While they offer better processing capabilities, sensors with embedded AI are often much more expensive. End users will also need to design and introduce a new set of procedures and protocols to leverage the information and insights derived from these sensors. Nonetheless, these obstacles will slowly disappear as the cost of sensors and computing power continues to fall.
