Inside the Battle for Dominance in Wearable User Interface

1Q 2018 | IN-5049


The War Horn Has Been Sounded

NEWS


Due to the proliferation of wearables and the advancement of enabling technologies, a convoluted war is being fought on the front lines of user interface (UI). It is essentially a proxy war, fought not by UI component manufacturers but by device manufacturers seeking the right UI to accelerate customer adoption and enhance the user experience. New UIs such as voice, eye tracking, gesture, proximity sensing, and augmented reality are becoming critical to new user experiences (UX). Case in point: ABI Research has just released a teardown report on Motiv’s fitness tracker ring, which relies exclusively on gesture control.

According to ABI Research’s recent report (PT-2080), the touchscreen continues to be the dominant interface for wearable devices. More than 134 million wearable devices with touch displays were shipped in 2017, and sports, fitness, and activity trackers, the largest market for touch displays, are expected to grow at a CAGR of 7% over the next five years. Touch UIs present data points so that users can read information at a glance. However, the touchscreen’s dominance will erode over the next few years as more sensors are integrated into wearables and as the devices themselves shrink. (For more information, please refer to the ABI Research Market Data Wearable Device Market Share and Forecasts [MD-WADT].)
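To make the growth figure concrete, a 7% CAGR compounds as base × 1.07^years. Below is a minimal Python sketch of that arithmetic; note that it applies the rate to the 134 million-unit 2017 figure purely for illustration, since the report states the CAGR for the tracker segment rather than for all touch-display wearables.

```python
# Illustrative compound-annual-growth-rate (CAGR) arithmetic.
def project(base_units_m: float, cagr: float, years: int) -> float:
    """Project shipments forward at a constant CAGR."""
    return base_units_m * (1 + cagr) ** years

base = 134.0  # million touch-display wearables shipped in 2017 (figure from the Insight)
for year in range(1, 6):
    # Applying the tracker-segment CAGR to the full base purely for illustration.
    print(f"2017+{year}: {project(base, 0.07, year):.1f}M units")
```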

Different Wearables and Use Cases Require Different User Interfaces

IMPACT


Wearable devices range from simple tools to powerful machines fully integrated with users’ needs. Some are vital-sign monitors, while others feature interactive visual displays. Interactions will increasingly be initiated by contextual data from a device’s sensors and by end-user behavior. As wearable devices become more capable and powerful, new form factors such as smart glasses and VR headsets will demand novel interface development. Furthermore, wider technology adoption, especially in enterprises, will grow as more use cases are developed for wearable devices.

Types of Input (UI)

Wearable Type                          | GUIs & Touch | Voice | Eye Tracking | Gesture | Proximity Sensor | Augmented Reality
Smartwatches                           | X            | X     |              | X       | X                |
Healthcare Devices                     | X            |       | X            |         | X                |
Sports, Fitness, and Wellness Trackers | X            |       |              | X       | X                |
Hearables                              | X            | X     |              | X       | X                |
Wearable Scanners                      |              |       |              | X       | X                |
Smart Glasses                          | X            | X     |              | X       |                  | X
Head-Mounted Displays (HMDs)           | X            | X     | X            | X       | X                | X

 

Voice recognition continues to improve as an input method and is now standard in most consumer wearables. ABI Research recently forecast that voice will play a more prominent role in the wearable user experience, serving either as the sole input method or as one of several input options. Smartwatches will lead voice UI adoption in the wearable category, as voice helps resolve the problems of small screens (e.g., difficult navigation and interaction) and reduces friction. Nonetheless, voice recognition still faces barriers in language support, especially in non-English-speaking regions. This is where Google and Apple have a clear competitive edge over their rivals: Google Assistant supports 8 languages, with support planned to grow to 30, while Apple’s Siri supports 24 languages. Mobvoi, a Google-backed Chinese AI startup, has started to develop devices with native Chinese language support.
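As a rough illustration of how a voice UI slots in as one input option among several, the following Python sketch maps transcribed utterances to watch actions. The transcription step is stubbed out (on a real device it would be handled by an embedded or cloud recognizer such as the assistant stacks discussed above), and all command names and functions are hypothetical, not drawn from the Insight or any named assistant’s API.

```python
# Minimal sketch of a voice-UI command dispatcher for a wearable.
# Command names and actions are illustrative assumptions.
COMMANDS = {
    "start workout": lambda: print("Workout tracking started"),
    "show heart rate": lambda: print("Heart rate: 72 bpm (sample value)"),
    "set timer": lambda: print("Timer set"),
}

def transcribe(audio_frames: bytes) -> str:
    """Placeholder for an on-device or cloud speech-to-text engine."""
    return "show heart rate"  # stubbed transcription for the sketch

def handle_utterance(audio_frames: bytes) -> None:
    text = transcribe(audio_frames).lower().strip()
    action = COMMANDS.get(text)
    if action:
        action()
    else:
        print(f"Unrecognized command: {text!r}")

handle_utterance(b"")  # demo invocation with empty audio
```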

Gesture UIs, on the other hand, are more likely to be built into stand-alone products that include software tools for pairing with other devices and connected systems. Gesture UI is popular among head-mounted displays, whose form factor limits the input types available; gesture recognition removes the need for buttons, touch pads, and controllers in some applications. Accuracy and fast response are critical for most gesture use cases, whether in safety-conscious verticals or the consumer-focused gaming industry. Poor battery life continues to hinder smartwatches and other high-capability wearables from adopting more intuitive UIs such as always-on voice and gesture control, which limits the potential of these input methods.
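The following Python sketch shows the kind of lightweight gesture detection described above: an accelerometer-magnitude threshold with a refractory period to suppress duplicate detections. Real gesture engines typically use trained classifiers; the sample rate, threshold, and gesture name here are illustrative assumptions.

```python
# Sketch: detecting a "wrist flick" gesture from accelerometer samples.
import math

SAMPLE_HZ = 50           # assumed accelerometer sampling rate
FLICK_THRESHOLD_G = 2.5  # assumed acceleration-magnitude threshold
REFRACTORY_S = 0.5       # suppress repeat detections for half a second

def detect_flicks(samples):
    """samples: iterable of (x, y, z) accelerations in g. Yields detection indices."""
    cooldown = 0
    for i, (x, y, z) in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
            continue
        if math.sqrt(x * x + y * y + z * z) > FLICK_THRESHOLD_G:
            cooldown = int(REFRACTORY_S * SAMPLE_HZ)
            yield i

# Synthetic stream: resting at 1 g, with one sharp spike at index 10.
stream = [(0.0, 0.0, 1.0)] * 10 + [(2.0, 1.5, 1.0)] + [(0.0, 0.0, 1.0)] * 10
print(list(detect_flicks(stream)))  # -> [10]
```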

Eye tracking has been adopted for gaming (e.g., VR headwear), and there is market potential for eye-tracking capabilities in the automotive and healthcare verticals. For instance, eye tracking will find a growing market in healthcare devices for studying patient behavior, monitoring health conditions, and communicating with patients. While eye tracking will not be prevalent in most wearable form factors outside of head-mounted displays, augmented reality has the potential to become the standard for digital information display in the future, replacing most traditional screens. AR presents new opportunities for wearable user interfaces, enabling hands-free device usage while maintaining a graphical interface. This means eye-tracking technology can have a great impact as AR becomes more widely adopted. (For more information, please see ABI Research’s AR & Mixed Reality research website.)
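Dwell-time selection, where an element is activated once the user’s gaze rests on it long enough, is one common eye-tracking interaction on head-mounted displays. The Python sketch below illustrates the idea; the dwell threshold, sample rate, and element names are assumptions for illustration, not from the Insight.

```python
# Sketch: dwell-time selection for a gaze-driven UI.
DWELL_S = 0.8  # assumed dwell threshold in seconds

def dwell_select(gaze_samples, dt: float):
    """gaze_samples: sequence of gazed-at element IDs (or None), sampled every dt seconds.
    Returns the first element selected by dwell, or None."""
    current, held = None, 0.0
    for target in gaze_samples:
        if target == current and target is not None:
            held += dt
            if held >= DWELL_S:
                return target
        else:
            current, held = target, 0.0  # gaze moved; restart the dwell timer
    return None

# 60 Hz gaze stream: 50 consecutive samples (~0.83 s) resting on "reply_button".
print(dwell_select([None] * 5 + ["reply_button"] * 50, dt=1 / 60))  # -> reply_button
```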

Voice Control to Become the Leading User Interface

RECOMMENDATIONS


As wearables become smaller and smaller, our interaction with them is restricted to a few gestures and voice recognition. Gesture control is intuitive and light on battery life, but it remains fairly limited and requires users to learn, remember, and accurately execute gestures. Voice recognition, by contrast, can input far more content on a small device, and ABI Research foresees voice control taking hold across the wearable industry.

Moving forward, ABI Research believes that device manufacturers, and especially the OEMs for key components, must focus more on voice. Advances in speech recognition have seen the technology adopted in smartphones and, to even greater success, in the smart home. Once battery life improves and edge machine learning becomes more widespread with advances in processing technology, voice control will become the main user interface for wearables. As such, digital signal processing chipset suppliers (CEVA, Qualcomm, Texas Instruments), acoustic sensor suppliers (TDK, SenseOR, Teledyne), and cloud AI players (Google, Apple, Amazon) are expected to benefit from this transition.
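The always-on voice pattern anticipated here typically splits work into two stages: a cheap detector that runs continuously and a heavier recognizer that wakes only when speech is likely, which is how edge processing keeps voice within a wearable’s power budget. The Python sketch below illustrates the two-stage structure only; the energy gate is a crude stand-in for a trained wake-word model, and all thresholds are assumptions.

```python
# Sketch: two-stage "always-on" voice pipeline for a power-constrained wearable.
ENERGY_GATE = 0.02  # assumed normalized RMS threshold for the cheap stage

def rms(frame):
    """Root-mean-square energy of one audio frame (list of normalized samples)."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

def run_full_recognizer(frame):
    """Placeholder for the expensive stage (wake-word model / full ASR)."""
    print("Wake stage triggered; full recognizer would run here")

def always_on_loop(frames):
    for frame in frames:
        if rms(frame) > ENERGY_GATE:   # cheap stage, runs continuously
            run_full_recognizer(frame)  # expensive stage, runs rarely

quiet = [0.001] * 160
loud = [0.05] * 160
always_on_loop([quiet, quiet, loud])  # triggers once, on the loud frame
```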