TL;DR: Snapdragon Spaces™ XR Developer Platform is Qualcomm Technologies’ end-to-end platform for building XR experiences on compatible headsets powered by Android. The platform’s Dual Render Fusion feature enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality.

Snapdragon Spaces™ XR Developer Platform is Qualcomm Technologies’ end-to-end platform for building XR experiences on compatible headsets powered by Android. Recently, Snapdragon Spaces introduced Dual Render Fusion — a new feature in the SDK designed to help developers transform their 2D mobile Android applications into spatial 3D experiences with little prior knowledge required.

What is Dual Render Fusion?

Snapdragon Spaces Dual Render Fusion enables smartphone screens to become a primary input for AR applications, while AR glasses act as a secondary augmented display. The dual display capability allows developers and users to run new or existing apps in 2D on the smartphone screen while showing additional content in 3D in augmented reality. In practical terms, the smartphone acts as a controller for AR experiences, letting users select what they want to see in AR using familiar mobile UI and gestures. Imagine you are using your go-to maps app for sightseeing. With Dual Render Fusion, you can use the phone as usual to browse the map and, at the same time, see a 3D reconstruction of historical places in AR.
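
If you are curious what the two render targets look like inside a Unity scene, here is a minimal sketch of the idea using only stock Unity APIs: one camera keeps drawing the familiar 2D UI on the handset, while the XR camera renders the spatial content to the glasses. The real camera rig and components come from the Snapdragon Spaces Unity package, so the camera references and layer names below are assumptions for illustration only.

```csharp
using UnityEngine;

// Minimal sketch of the dual-display idea, using only stock Unity APIs.
// The actual Dual Render Fusion setup is provided by the Snapdragon Spaces
// Unity package; the camera references and layer names here are assumptions.
public class DualDisplaySketch : MonoBehaviour
{
    [SerializeField] private Camera phoneUiCamera;   // renders the 2D UI to the handset screen
    [SerializeField] private Camera arContentCamera; // part of the XR rig, renders to the glasses

    private void Awake()
    {
        // The phone camera skips stereo rendering and only draws the UI layer.
        phoneUiCamera.stereoTargetEye = StereoTargetEyeMask.None;
        phoneUiCamera.cullingMask = LayerMask.GetMask("UI");

        // The XR camera renders the spatial content to both eyes of the glasses,
        // ignoring the phone-screen UI layer.
        arContentCamera.stereoTargetEye = StereoTargetEyeMask.Both;
        arContentCamera.cullingMask = ~LayerMask.GetMask("UI");
    }
}
```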

Watch the accompanying video in the original post to see examples of end-user experiences that Dual Render Fusion can provide.

Why use Dual Render Fusion in your AR experiences?

The feature makes it easier for you to extend your 2D mobile Android apps into 3D spatial experiences without creating a new spatial UI. It also marks the first time in the XR industry that AR development tools combine multi-modal input with simultaneous rendering to smartphones and AR glasses. Using Dual Render Fusion, developers with Unity experience can easily add an AR layer to their existing app using just a few extra lines of code. The feature gives more control over app behavior in the 3D space, significantly lowering the entry barrier to AR. And that's not all: you can also use every input enabled by the Snapdragon Spaces SDK, including Hand Tracking, Spatial Mapping and Meshing, Plane Detection, and Image Tracking, or go all in on the convenience of the mobile touch screen for all input.
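
To give a flavour of the "phone as controller" pattern, the hedged sketch below wires a standard Unity UI button on the phone screen to spawn a 3D model in front of the AR camera. The prefab, camera reference, and spawn distance are assumptions made for the example; in a real app, input and anchoring could instead come from the Snapdragon Spaces SDK features mentioned above.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: a familiar 2D button on the phone screen drives AR content shown
// through the glasses. All component names and values here are assumptions.
public class SpawnFromPhoneUi : MonoBehaviour
{
    [SerializeField] private Button spawnButton;     // lives on the phone-screen canvas
    [SerializeField] private Camera arCamera;        // the XR rig camera rendering to the glasses
    [SerializeField] private GameObject modelPrefab; // e.g. a 3D reconstruction of a landmark
    [SerializeField] private float spawnDistance = 1.5f; // metres in front of the viewer

    private void OnEnable()
    {
        spawnButton.onClick.AddListener(SpawnInFrontOfViewer);
    }

    private void OnDisable()
    {
        spawnButton.onClick.RemoveListener(SpawnInFrontOfViewer);
    }

    private void SpawnInFrontOfViewer()
    {
        // Place the model a short distance ahead of wherever the user is looking.
        Vector3 position = arCamera.transform.position +
                           arCamera.transform.forward * spawnDistance;
        Quaternion rotation = Quaternion.LookRotation(
            position - arCamera.transform.position);
        Instantiate(modelPrefab, position, rotation);
    }
}
```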

Why is Dual Render Fusion significant? Why are we at an inflection point?

It takes a lot of learning to break into mobile AR, and even more to rethink what you already know so you can apply spatial design principles to headworn AR. The same applies to your end users, who need to get familiar with new spatial UX/UI and input. Enabling the majority of developers to create applications that are accessible to smartphone users will unlock the great potential of smartphone-based AR. The Snapdragon Spaces team has been working hard to reimagine smartphone AR's status quo and take a leap forward by fusing the phone with AR glasses. Dual Render Fusion allows just that: blending the simplicity and familiarity of the smartphone touch screen for input while leveraging the best of augmented reality. It unlocks smartphone-powered AR to its full potential and opens it up to millions of mobile developers like you.

Download and Build your AR app now

Dual Render Fusion is now available in beta as an optional add-on package for Snapdragon Spaces SDK for Unity version 0.13.0 and above, so be sure to browse the Documentation to learn more.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

This article was previously published on proandroiddev.com
