TinyML Is A Perfect Match for Environmental Data

Market Overview

  • By the end of 2022, more than 1 billion Tiny Machine Learning (TinyML) devices will ship globally, with personal and work devices accounting for around 60% of all shipments. Growing at a Compound Annual Growth Rate (CAGR) of 28% between 2022 and 2027, annual TinyML device shipments will catapult to 3.5 billion in 2027 (2 billion for personal and work devices); the arithmetic behind this projection is sketched just after this list.
  • When it comes to shipments by use case, audio signal processing and ambient sensing are pretty even—470 million and 435 million devices shipped in 2022, respectively. Those numbers will climb to 1.8 billion and 1.2 billion shipments by 2027. Meanwhile, always-on vision use cases will see 453 million devices shipped in 2027.
  • Out of the roughly 3.5 billion total shipments of edge Artificial Intelligence (AI) inference chipsets for TinyML expected in 2027, a little more than 3 billion will be for Central Processing Unit (CPU) architectures. Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs) will account for the other 500 million shipments.
  • For 2022, nearly half (45.7%) of all TinyML chipset shipments for sensors will be for sound. The next three most common sensor types are vision (11.5%), temperature (10%), and humidity (6.8%).
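
As a quick sanity check on the headline projection, the sketch below compounds the roughly 1 billion shipments estimated for 2022 at the cited 28% CAGR over five years and lands close to the 3.5 billion figure quoted for 2027. This is illustrative arithmetic on the numbers above, not ABI Research's forecasting model.

```python
# Illustrative check of the 28% CAGR projection quoted in the Market Overview.
# Assumes roughly 1.0 billion total TinyML device shipments in 2022 (per the bullets above).
base_2022 = 1.0e9   # estimated total shipments in 2022
cagr = 0.28         # compound annual growth rate, 2022-2027
years = 5           # 2022 -> 2027

projection_2027 = base_2022 * (1 + cagr) ** years
print(f"Projected 2027 shipments: {projection_2027 / 1e9:.2f} billion")
# Prints roughly 3.44 billion, consistent with the ~3.5 billion cited for 2027.
```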

“Some of the use cases are, in a sense, rather low key in the greater scheme of things of Artificial Intelligence (AI), but in the more rarefied context of the market, these can be very productive indeed. Among others, keyword spotting, object recognition, people counting, and audio detection are examples of such use cases.”  – David Lobina, Analyst at ABI Research

 


Key Decision Items

IoT and Cloud Popularity Make TinyML Possible

Neural Networks (NNs) with Deep Learning (DL) algorithms are not a new concept. Indeed, these solutions have been used for pattern detection and predictive analysis for many years. However, only with the broad adoption of Internet of Things (IoT) devices and cloud infrastructure have these compute-intensive models become feasible at the edge in a low-cost manner. This has paved the way for a plethora of new ML use cases.

Vendors Must Focus on the Strengths of TinyML

TinyML excels in several application scenarios, such as facial recognition, industrial operations, agriculture, and traffic management. For example, by sorting through a dataset of millions of faces, a facial recognition algorithm can identify a specific face, even when it is partially obscured. This capability would be highly attractive to security teams, for instance.

In the industrial space, TinyML learning models can detect correlations between the numerous operational parameters involved, correlations that would be impossible for a human to find. It is easy to imagine, for example, that a chemical manufacturing plant would find incredible value in TinyML because the learning model can alert operators to safety hazards.
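
As a concrete illustration of what "detecting correlations between operational parameters" might look like in practice, the minimal sketch below fits a simple multivariate baseline to normal sensor readings and flags readings that deviate strongly from it. The sensor names, threshold, and model choice are hypothetical and kept deliberately small so the idea could plausibly run on a constrained device; real industrial deployments would use models trained offline and compiled for the target hardware.

```python
import numpy as np

# Hypothetical operational parameters for a chemical process:
# temperature (deg C), pressure (bar), flow rate (L/min).
rng = np.random.default_rng(0)
normal_readings = rng.normal(loc=[80.0, 5.0, 120.0],
                             scale=[2.0, 0.2, 5.0],
                             size=(500, 3))

# "Training": learn the mean and covariance of normal operation offline.
mean = normal_readings.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal_readings, rowvar=False))

def anomaly_score(reading):
    """Mahalanobis distance of a reading from normal operation."""
    diff = reading - mean
    return float(np.sqrt(diff @ cov_inv @ diff))

# At the edge, each new reading is scored against the learned baseline.
suspect = np.array([85.0, 6.1, 95.0])   # correlated drift a human might miss
THRESHOLD = 4.0                          # hypothetical alert threshold
if anomaly_score(suspect) > THRESHOLD:
    print("Alert: operating parameters deviate from the learned baseline")
```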

For the agriculture industry, TinyML helps monitor crop health, automate water distribution, track the health of farm animals (with biometric sensors), and observe environmental data to prevent crop damage. With a more transparent and automated approach to agriculture, farmers can ensure greater crop yields. It has also been demonstrated that computer vision can be used to train agricultural robots to carry out automated tasks like crop maintenance and monitoring.

Additionally, ML can serve as the main conduit for adaptive traffic control to reduce vehicle congestion and provide right-of-way to emergency vehicles. With the support of sensors, the ML algorithm can factor in the number, and in some cases the type, of incoming vehicles approaching each side of the intersection to automatically adapt traffic light signals.
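
To make the adaptive traffic control idea concrete, here is a minimal sketch of the decision step described above: given per-approach vehicle counts from sensors, with extra weight for detected emergency vehicles, the controller picks which approach receives the next green phase. The approach names, weights, and phase logic are illustrative assumptions, not a description of any deployed system, which would typically combine a trained model with signal-timing constraints.

```python
# Hypothetical per-approach sensor output for one intersection.
# Counts would come from a TinyML vision or audio model running on-device.
approaches = {
    "north": {"vehicles": 12, "emergency": 0},
    "south": {"vehicles": 4,  "emergency": 0},
    "east":  {"vehicles": 7,  "emergency": 1},  # e.g., an ambulance detected
    "west":  {"vehicles": 9,  "emergency": 0},
}

EMERGENCY_WEIGHT = 100  # illustrative: emergency vehicles dominate the score

def next_green(approaches):
    """Pick the approach with the highest demand score for the next green phase."""
    def score(stats):
        return stats["vehicles"] + EMERGENCY_WEIGHT * stats["emergency"]
    return max(approaches, key=lambda name: score(approaches[name]))

print(next_green(approaches))  # -> "east": the emergency vehicle gets right-of-way
```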

These are only four of the many real-world use cases of TinyML, and the possibilities are extensive. As can be inferred, environmental data are where TinyML really shines: real-time environmental condition data are the fuel that drives a TinyML algorithm to work its magic.

TinyML Is Best Suited for Environmental Sensors

As highlighted in ABI Research's 2023 technology trends, TinyML can be applied to essentially any sensor that collects environmental data, including smart building use cases like climate control and predictive maintenance. The reason for this is that environmental sensors are an ideal match for TinyML in terms of size and compute capabilities. Vendors should most definitely work environmental use cases into their product portfolios as the market attempts to widen the reach of ML processes and modeling. Out of all the environmental sensors, sound sensors account for almost half of the total market share in 2022—highlighting the importance of meeting audio/voice recognition demand.

Software Must Be Built with Hardware in Mind

On the software side, some market players train their TinyML learning models for specific use cases and industries, which often require dedicated hardware. For example, Seoul-based technology provider Nota AI offers software solutions that are tailor-made for Intelligent Transportation Systems (ITSs) and low-powered Driver Monitoring Systems (DMSs). By focusing on the automotive sector, Nota AI carves out an important niche that builds confidence among prospective customers.

Still, numerous platform and technology-based vendors, such as Edge Impulse, SensiML, Microbic, and Neuton, create hardware-agnostic environments. Doing so allows them to cast a wider net on the potential customer base. For example, hardware-agnosticism enables Edge Impulse to home in on several industries like agriculture, healthcare, infrastructure, and wearables, along with other verticals.

Other companies, like the London-based vendor Plumerai, have created technology from the ground up for low-memory devices. In fact, Plumerai’s TinyML solution works on US$1 microcontrollers that use less than 10 Milliwatts (mW). To make this happen, Plumerai leverages Binarized Neural Networks (BNNs) that use only 1 bit for every parameter, as opposed to the typical 32, 16, or 8 bits used in most models. This approach allows memory-constrained devices to effectively execute speech recognition and people detection applications.
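
The memory impact of 1-bit parameters is easy to quantify. The sketch below compares the weight storage needed for a hypothetical model of 250,000 parameters at 32-, 16-, 8-, and 1-bit precision; the model size is an assumption for illustration, not a figure for Plumerai's products.

```python
# Illustrative weight-storage comparison for a hypothetical 250k-parameter model.
# Binarized Neural Networks (BNNs) store 1 bit per weight instead of 32/16/8 bits.
NUM_PARAMS = 250_000  # assumed model size, for illustration only

for bits in (32, 16, 8, 1):
    size_kib = NUM_PARAMS * bits / 8 / 1024
    print(f"{bits:>2}-bit weights: {size_kib:7.1f} KiB")

# At 32 bits this is ~977 KiB; at 1 bit it is ~31 KiB, small enough to fit
# within a typical low-cost microcontroller's RAM/flash budget.
```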

Developing Software Tools for Automating TinyML Is Critical

There is a growing need to automate TinyML processes, such as data collection, modeling, and compilation. Research on this topic has so far been carried out by the TinyML Foundation, as well as by the On-Device Learning working group, which are two crucial steps in the right direction. Neuton has been a shining example of automating TinyML processes, with its Automated Machine Learning (AutoML) solution removing the need for manual model search. Automation like this also breaks down technical barriers: users do not need any coding skills, and the model's neuron-by-neuron structure growth proceeds without manual tuning.
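
The "neuron-by-neuron structure growth" idea can be illustrated with a toy loop that adds capacity only while a validation metric keeps improving. This is a generic sketch of automated, incremental model construction, not a description of Neuton's proprietary algorithm; the dataset, layer sizes, and stopping rule are all assumptions.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Toy regression data standing in for real sensor telemetry (assumption).
X, y = make_regression(n_samples=600, n_features=8, noise=5.0, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

best_score, best_model = -float("inf"), None
neurons = 1
while neurons <= 32:
    # Grow the single hidden layer one neuron at a time and re-evaluate.
    model = MLPRegressor(hidden_layer_sizes=(neurons,), max_iter=2000,
                         random_state=0).fit(X_train, y_train)
    score = model.score(X_val, y_val)          # R^2 on held-out data
    if score <= best_score + 1e-3:             # stop when growth stops paying off
        break
    best_score, best_model = score, model
    neurons += 1

print(f"Selected {best_model.hidden_layer_sizes[0]} hidden neurons, "
      f"validation R^2 = {best_score:.3f}")
```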

To make NNs run at the edge, they need to be more compact. Deeplite’s Neutrino platform is one answer to this problem, as it uses ML processes to automatically downsize other ML models into smaller, quicker, and more energy-efficient models designed for edge devices (e.g., vehicles, cameras, sensors, and drones). Leveraging years of research and recent developments in TinyML, Deeplite is working on a fully automated, proprietary, and patented AI engine that will automatically optimize Deep Neural Networks (DNNs). This development is in response to the challenges of deploying DL on smaller hardware. Companies that can create automated solutions will be highly attractive in the eyes of development teams, as they reduce the time spent searching for suitable designs. Finally, quantization is another helping hand in automating TinyML, as companies like Latent AI and Aizip use it to streamline model optimization.
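
Quantization itself is straightforward to sketch: weights stored as 32-bit floats are mapped onto 8-bit integers with a scale factor, shrinking storage by roughly 4x at the cost of a small rounding error. The snippet below shows a minimal, generic post-training quantization of a weight tensor; it is illustrative only and does not reflect any vendor's proprietary tooling.

```python
import numpy as np

# A hypothetical float32 weight tensor from a trained layer.
rng = np.random.default_rng(1)
weights_fp32 = rng.normal(scale=0.2, size=(64, 32)).astype(np.float32)

# Symmetric 8-bit quantization: map [-max_abs, +max_abs] onto [-127, 127].
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# On-device inference would use the int8 weights; dequantize here to inspect error.
weights_restored = weights_int8.astype(np.float32) * scale
print("Storage:", weights_fp32.nbytes, "->", weights_int8.nbytes, "bytes")
print("Max rounding error:", float(np.abs(weights_fp32 - weights_restored).max()))
```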

Hardware-Specificity Brings Opportunity to Semiconductor Companies

Compared to general ML models, TinyML models entail a closer software-hardware relationship, as noted in a recent ABI Research report. While the majority of IoT devices operate on 16- or 32-bit Microcontroller Units (MCUs), some TinyML applications run on 8-bit MCUs or use models quantized down to a single bit. Needless to say, TinyML architectures require very specific chipsets. Semiconductor companies can capitalize on this need, as GreenWaves has done. The company’s chipsets target very energy-constrained devices that necessitate low-latency responses. That’s why GreenWaves’ chipsets regularly find their way into occupancy sensors, infrared sensors, security cameras, and other smart building sensors. These types of devices cannot consume a lot of energy, but they still benefit from ML modeling.
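
One practical consequence of this hardware specificity is that a model must be checked against the target chipset's memory and power budget before deployment. The sketch below shows a simple feasibility check of that kind; the budget figures and function names are hypothetical placeholders, not GreenWaves specifications.

```python
# Hypothetical pre-deployment feasibility check for a TinyML target.
# Budget figures are illustrative placeholders, not vendor specifications.
TARGET = {"flash_kib": 512, "ram_kib": 128, "power_budget_mw": 10}

def fits_target(model_flash_kib, peak_ram_kib, est_power_mw, target=TARGET):
    """Return a list of budget violations (an empty list means the model fits)."""
    issues = []
    if model_flash_kib > target["flash_kib"]:
        issues.append("model too large for flash")
    if peak_ram_kib > target["ram_kib"]:
        issues.append("activation memory exceeds RAM")
    if est_power_mw > target["power_budget_mw"]:
        issues.append("estimated power draw over budget")
    return issues

problems = fits_target(model_flash_kib=300, peak_ram_kib=96, est_power_mw=8)
print("Deployable" if not problems else f"Blocked: {', '.join(problems)}")
```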


Dig Deeper for the Full Picture

Get an even more detailed analysis of the TinyML market and how ML modeling is being applied everywhere by downloading ABI Research’s TinyML: A Market Update research report.

Not ready for the report yet? Check out our Edge Machine Learning for Computer Vision in the Age of Industry 4.0 Research Highlight. This content is part of the company’s AI & Machine Learning Research Service.
