Optimizing UX for Smart Glasses

Often, when a new technology is introduced, hardware and software development efforts focus on experimentation and pushing the envelope rather than the actual experience for new users. Augmented Reality (AR) is no different: efforts over the past few years have mostly centered on piloting and experimenting with head-mounted and mobile-based AR.

Content and use cases are flourishing in the current market, and AR smart glasses and mobile devices are technically capable of supporting AR experiences. However, hardware manufacturers and content/software providers need to prioritize enhancing both User Interface (UI) and User Experience (UX) to simplify interaction between the user, the AR device, and virtual content.

User Interface and User Experience

User Interface (UI) is a critical component to designing a product that will exceed user expectations, achieve optimal User Experience (UX), and stand out from the competition. At the current stage of hardware maturity, AR smart glasses and Software Development Kits (SDKs) support a greater range of interaction methods than ever before.

Voice control and novel head-mounted display-based options like gesture recognition and eye tracking have entered the mix alongside traditional controllers and smartphones. Pairing the proper input method with a well-designed UI allows users to perform tasks more efficiently, intuitively, and swiftly, leading to an enhanced user experience and, ultimately, more fully realized value. ABI Research has forecasted the AR smart glasses market to exceed US$100 billion by 2024.

The choice of the most efficient input method and UI design depends strongly on the nature of the target task or content, the user's potential environment, and the type of AR device. For instance, traditional buttons and touchpads are not always suitable in scenarios where a user must wear safety gloves or handle equipment; hands-free voice control is the optimal interaction method in these and similar situations. With regard to AR device type, monocular devices, typically used in less complex enterprise and consumer applications, can perform tasks efficiently with traditional, user-friendly interaction methods via buttons or smartphones, with the option to add value and flexibility through more advanced input methods.

Emerging Interaction Methods in AR

Gaze and gesture control are among the most promising emerging interaction methods for AR smart glasses, significantly enhancing UX and immersion by allowing users to perform hands-free tasks intuitively and swiftly. While highly capable, gaze and gesture control are not suited to every user or use case because they require high accuracy and low latency to be efficient and meet user expectations. Advanced AR headsets such as the HoloLens 2 and Magic Leap 1 support a wider range of UI opportunities and input methods thanks to enhanced features such as eye tracking. While these are currently the most capable AR devices available, there is still significant value in simpler devices that maximize usability and streamline the user experience with other input paradigms.

Keeping the Momentum

Simple, intuitive UI and streamlined UX have largely been ignored in the AR market thus far, but both are essential to enhancing and maximizing the value that will propel AR smart glasses toward the mainstream. Removing the need for device training and keeping users engaged with the device will maximize value for consumers and enterprises. At the same time, device features such as lower weight, support for spatial mapping and spatial sound, improved display quality, and better user feedback through UI and haptics also play an important role in maximizing this value.

As part of ABI Research's Augmented and Virtual Reality research service, our User Interface and User Experience in AR technology analysis report identifies and closely examines emerging interaction methods for AR smart glasses to optimize the user experience. Subscribe to the service here and get the full report below: