Qualcomm Snapdragon XR2+ Gen 2 Expands XR Chipset Market in Anticipation of Strong 2024


By Eric Abbruzzese | 1Q 2024 | IN-7211

Ahead of CES 2024, Qualcomm announced the latest chipset in its Extended Reality (XR)-specific product line, the Snapdragon XR2+ Gen 2. The company also announced a reference design using either the XR2+ Gen 2 or the previous XR2 Gen 2 to accelerate partnership opportunities in the Mixed Reality (MR) space.



A New Chipset for a Promising 2024


Expanding its already market-defining XR chipset platform, Qualcomm recently unveiled its latest chipset targeting Extended Reality (XR) devices, the XR2+ Gen 2. The main upgrades over the previous XR2 version include higher supported resolution (4.3K at 90 Hz), higher maximum processor frequencies for the Central Processing Unit (CPU) and Graphics Processing Unit (GPU), and support for 12+ concurrent cameras. The XR2+ Gen 2 joins Qualcomm's other XR line chipsets as a flagship product, with earlier XR chipset offerings still available. Several Original Equipment Manufacturers (OEMs), including Samsung and HTC, have confirmed they will leverage the XR2+ Gen 2. Existing XR2 chipset benefits carry over, including improved Artificial Intelligence (AI) performance, low-latency full-color camera passthrough support, and Wi-Fi 7 and 6E support. Qualcomm also announced a reference design developed in partnership with Goertek and featuring eye tracking from Tobii. Both the XR2+ Gen 2 and the earlier XR2 Gen 2 are on offer for the reference design.

As the XR hardware space grows and matures, more choice is always better. A flagship chipset that pushes capability helps OEMs plan and manufacture their own near-term flagship devices. A modern reference design serves as an example for these OEMs and can educate the market on best practices for a novel device—as spatial compute and AI grow in capability and popularity, this can be especially valuable.

Resolution Wars, Spatial Compute, and AI


Going into 2024, there will be a few key battlegrounds for XR hardware: display resolution, spatial computing, and AI. Resolution is the most straightforward at the component level, with displays seeing a gradual increase in average resolution Year-over-Year. Resolution, combined with other display elements like contrast and brightness, is one of the most powerful “wow” factors for a headset. This is, at least in part, why devices continue to push resolution higher. Apple is also at least partially responsible for what will be an overall increase in resolution demand in 2024 for XR headsets. High-resolution displays paired with (theoretically) best-in-class passthrough capabilities are the main draws for Apple’s Vision Pro headset. Higher resolution also means more compute required to meet that resolution demand, so increases in graphical compute capability go hand in hand.
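The relationship between resolution and compute is straightforward to quantify. The sketch below compares the raw pixel throughput a GPU must sustain at the XR2+ Gen 2's headline 4.3K 90 Hz figure against a Quest 2-class panel; the 4,300 × 4,300 panel dimensions are an illustrative assumption (the announcement cites "4.3K" without exact dimensions), not an official spec.

```python
# Rough pixel-throughput comparison, illustrating why higher display
# resolution demands proportionally more graphical compute.
# Panel dimensions below are illustrative assumptions, not official specs.

def pixels_per_second(width, height, refresh_hz, eyes=2):
    """Total pixels the GPU must shade per second for a stereo headset."""
    return width * height * refresh_hz * eyes

# Assumed "4.3K per eye" panel at 90 Hz (XR2+ Gen 2 headline figure)
xr2_plus = pixels_per_second(4300, 4300, 90)

# Quest 2-class panel (1,832 x 1,920 per eye at 90 Hz) for comparison
quest2 = pixels_per_second(1832, 1920, 90)

print(f"XR2+ Gen 2 class: {xr2_plus / 1e9:.1f} Gpix/s")  # ~3.3 Gpix/s
print(f"Quest 2 class:    {quest2 / 1e9:.2f} Gpix/s")    # ~0.63 Gpix/s
print(f"Ratio: {xr2_plus / quest2:.1f}x")                # ~5.3x
```

Under these assumptions, the jump is roughly 5x the shaded pixels per second—which is why resolution gains and GPU frequency gains arrive together in the same chipset generation.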

Spatial compute is a term still finding its footing, with Apple somewhat co-opting it to push Vision Pro and its broader XR ecosystem. At its core, spatial compute is not unique to Apple, but instead includes technologies that leverage spatial data in some way—device tracking in space, Mixed Reality (MR) experiences, etc. Nor is it locked into the XR market—there is discussion of spatial compute in automotive, for instance—but XR is the most directly user-facing. Spatial compute improves with more data and more processing power: additional cameras supply the data, and AI accelerates the processing.

Finally, AI remains a differentiator for so many technology markets, XR included. At the chipset level, increasing dedicated silicon space for AI compute will be a trend in 2024. The end use for this AI and dedicated silicon will vary, but spatial compute will be a major element. Improving tracking accuracy while reducing the power draw for enabling that tracking is a win-win use case. AI is already being used for super resolution and foveated rendering, software techniques that aim to improve the visual user experience. The XR2 chipset lineup touts both as differentiators, and for good reason. Latency is also of paramount concern for XR, and AI is being used to enable low-latency camera tracking and accurate, concurrent tracking for everything from hand and controller tracking to facial expression estimation and Three-Dimensional (3D) environment reconstruction. Object and environment recognition is a significant portion of this pipeline, and one that requires AI to ensure accuracy and speed of recognition.
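Foveated rendering's appeal can be illustrated with back-of-the-envelope arithmetic: shade only a small eye-tracked central region at full density and the periphery at reduced density. All parameters below (fovea fraction, peripheral density) are illustrative assumptions, not figures from Qualcomm or any specific implementation.

```python
# Back-of-the-envelope estimate of the shading work foveated rendering
# can save. Parameters are illustrative assumptions: a small central
# "fovea" region rendered at full resolution, with the periphery
# shaded at a quarter of the pixel density.

def foveated_cost(total_pixels, fovea_fraction=0.1, periphery_scale=0.25):
    """Effective shading cost relative to naive full-resolution rendering."""
    fovea = total_pixels * fovea_fraction                     # full density
    periphery = total_pixels * (1 - fovea_fraction) * periphery_scale
    return (fovea + periphery) / total_pixels

saving = 1 - foveated_cost(4300 * 4300)
print(f"Approximate shading work saved: {saving:.0%}")
```

Even with these rough numbers, roughly two-thirds of the shading work disappears—savings that translate directly into battery life or can be reinvested in higher resolution, which is why eye tracking (as in the Tobii-equipped reference design) and foveated rendering are paired so often.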

Improvements in battery life and MR capability/quality will also be talking points in 2024. Meta's Quest 3 leaned into the MR play, while the majority of Vision Pro's go-to-market centers on the quality of its passthrough and interactions. Battery and MR capability both tie heavily into resolution, spatial compute, and AI as well, with symbiotic relationships between the elements: AI improving GPU efficiency and enabling foveated rendering improves battery life, for instance.

A reference design can aid in highlighting these relationships. Design is one thing and manufacture another, and a reference design can add visibility to supply chain elements—with resolution and spatial compute being focus areas for 2024, a handful of new components will be required, possibly from new suppliers. A reference design also helps lay a foundation for the market to better understand spatial compute at the hardware and use case level, serving as an example of how to build and develop a product before an OEM scales up its own.

A Newly Competitive Landscape


The XR competitive landscape will change dramatically in 2024, with Apple joining the fray with Vision Pro and greater Augmented Reality (AR) smart glasses hardware competition expected as well. For Qualcomm, which essentially represents the entire XR market outside of Apple, these three battlegrounds are important and are rightfully core to the XR2 chipset go-to-market strategy. As pure AR smart glasses become more commonplace—with some activity in 2024, but more impactfully in 2025—having a cohesive ecosystem across device types can help build a brand image beyond single devices.

While hardware will be a major talking point in 2024, user experience trumps all, so content is equally as or more important than pure hardware specifications. Making use of spatial data and improved hardware specifications with improved and expanded use cases and content for XR is paramount. This can start at the chipset level, where ensuring support for cameras, AI processing, resolution, etc. will help build out a potential partner ecosystem and ultimately appeal to the widest user base possible. Building this out to software and services is a logical next step, and one that Qualcomm and others are taking. Snapdragon Spaces adds developer support to the Qualcomm go-to-market strategy and helps bridge hardware to software. Apple has an advantage here with its walled garden, by default entering the XR hardware market with its software and service strengths in tow, so any effort to build a cohesive ecosystem while avoiding the walled garden's weaknesses is a best-case scenario for competitors.

Expect 2024 to hold more interest and discussion on every aspect of the XR market. Hardware is often the easiest to market, so headlines will default to hardware capabilities, but increasing hardware maturity will gradually introduce more software and use case conversation. Building a product story around content can be difficult, usually requiring hands-on experiences to truly make an impact. Apple promises to make Vision Pro available at Apple Stores, which is a massive advantage over players without that direct end-user touchpoint. Therefore, ensuring hands-on availability, however possible, will be beneficial for competitors. In the end, like any tech product, success lies in the combination of hardware, software, and service to deliver a valuable experience at an appropriate price—ensuring capability at the hardware level, tying that to developer tools and content creation communities, and then getting that experience into users' hands sounds simple, but has proven challenging for XR for years.

