
TL;DR: Snapdragon Spaces™ XR Developer Platform is a game-changing XR development platform from Qualcomm Technologies, Inc. comprising a hardware development kit (HDK) and SDKs for Unity and Unreal Engine. Developers can use it to build cross-platform immersive XR experiences with features like hand tracking, plane detection, anchors, and more.

Lightweight headworn AR and VR headsets are poised to become the next evolution of the smartphone. They offer more immersive experiences than 2D screens and will transition users from looking down at their phones back to looking around at their surroundings, while retaining that digital view.

Snapdragon Spaces™ XR Developer Platform was designed to facilitate the development of these headworn XR experiences. But it’s not until you get into the small details that you realize why Snapdragon Spaces is a BIG deal for developers.

To start, Snapdragon Spaces is an end-to-end platform with a rich HDK and SDK for building XR experiences on compatible headworn displays powered by Android. It works with the Snapdragon Spaces Services APK, which provides the XR runtime on Android devices.

The SDK is available for Unity and Unreal Engine, arguably two of today’s most popular game engines. These tools combine a content-driven workflow with developer-oriented programming capabilities. If you’re coming from game development or another multimedia background, you might already have Unity or Unreal experience. If not, you can probably get up to speed quickly. Either way, you’ll be working in modern, proven frameworks, with the confidence that they can deliver high-performance AR apps.

The tools are also OpenXR compliant, which means that API calls should work across compatible devices with minimal porting. OpenXR plugins from Unity and Unreal establish base functionality like rendering and head tracking, and also provide interfaces to communicate with OpenXR runtimes including the subsystems provided by Snapdragon Spaces. This means you can build experiences that target OpenXR-compliant devices, while hardware manufacturers can build compatible devices with capabilities that fulfill the OpenXR specification.

Key Features

Here are some of the key XR features that Snapdragon Spaces supports and why they are important for building XR experiences.

Positional Tracking
The most fundamental feature for any AR application is positional tracking. It precisely maps the environment and estimates the position and orientation of the user’s viewing device within 3D space. Snapdragon Spaces captures this as six-degrees-of-freedom (6DoF) information via input from headworn AR devices. Developers use it to track the end user’s position relative to the world and to render AR content in the scene relative to the user’s head position and orientation.
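
To make that concrete, here’s a minimal Unity sketch, assuming the AR Foundation-style setup the Snapdragon Spaces Unity package provides. It reads the tracked AR camera’s pose each frame and keeps a piece of content floating in front of the user. The class and field names are illustrative, not SDK API:

```csharp
using UnityEngine;

// Minimal sketch (illustrative names, not SDK API): keep a piece of AR
// content positioned relative to the user's head pose. With Snapdragon
// Spaces via AR Foundation/OpenXR, the AR camera's transform is driven by
// the headset's 6DoF tracking, so reading it each frame yields the user's
// position and orientation.
public class FollowHeadPose : MonoBehaviour
{
    [SerializeField] private Camera arCamera;       // tracked AR camera, e.g., under the XR Origin
    [SerializeField] private float distance = 1.5f; // meters in front of the user

    void LateUpdate()
    {
        // Place this object in front of the user's current head pose.
        Vector3 headPosition = arCamera.transform.position;
        transform.position = headPosition + arCamera.transform.forward * distance;

        // Billboard toward the user so the content reads correctly.
        transform.rotation = Quaternion.LookRotation(transform.position - headPosition);
    }
}
```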

Hand Tracking
Hand Tracking tracks the position and orientation of a user’s hands and finger joints in 3D space. It can be used as input data to manipulate digital objects, interact with 3D GUIs, or animate digital representations of the user (e.g., realistic avatars or on-screen hands).

This provides a whole new level of input and feedback, not possible with traditional devices that rely on screen gestures or physical controls. Users can now see virtual representations of their limbs that update and behave like their real-world counterparts as they interact with objects. Capturing users’ movements and mannerisms can also bring a sense of presence to your XR experiences.

The Snapdragon Spaces Unity package provides an AR Foundation-like interface. Hand tracking is performed using ML-based models and the positional tracking cameras.

Additional hand tracking features for Unity (e.g., rigged hand meshing and interaction components) are available in the SDK’s QCHT Unity Core package, which ships in the Hand Tracking folder inside the Unity Package folder of the Unity SDK download. The SDK also includes interaction methods, interaction components, and an extended hand tracking sample demonstrating distal and proximal interactions.
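
As one way to consume hand tracking data in Unity, the sketch below polls the right hand’s index fingertip pose through Unity’s XR Hands package (com.unity.xr.hands), a common route for OpenXR hand tracking. The exact types exposed by the Snapdragon Spaces and QCHT packages may differ by version, so treat this as an assumption-laden example rather than the SDK’s own API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: poll the right hand's index fingertip pose each frame through
// Unity's XR Hands subsystem. Joint poses are reported in the XR Origin's
// space, so they can drive object manipulation or a rigged hand mesh.
public class IndexTipReader : MonoBehaviour
{
    private XRHandSubsystem handSubsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem, if one is available.
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            handSubsystem = subsystems[0];
        }

        XRHand rightHand = handSubsystem.rightHand;
        if (!rightHand.isTracked) return;

        XRHandJoint indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
        {
            Debug.Log($"Index tip at {pose.position}");
        }
    }
}
```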

Image Recognition and Tracking
Image Recognition and Tracking identifies visual features or markers captured by a camera. It is often used to trigger and display digital content in relation to real-world objects (e.g., displaying an information popup when the user looks at a product label). On headworn AR displays, this can provide a world of information as the user scans their surroundings.

Image Targets are images stored in a database as markers that the tracking system can recognize. Recognizing an Image Target identifies a flat region in the world that the app can augment or otherwise act upon.
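
In Unity, this maps naturally onto AR Foundation’s ARTrackedImageManager (shown here with the AR Foundation 4/5-style event API), the kind of interface the Snapdragon Spaces Unity package plugs into. The sketch below spawns an info popup on each recognized image target; the prefab and handler names are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: react to image targets via AR Foundation's ARTrackedImageManager.
// The reference image library (the image-target database) is configured on
// the manager in the Inspector; here we just attach content to each
// recognized image.
public class ImageTargetHandler : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager trackedImageManager;
    [SerializeField] private GameObject infoPopupPrefab; // illustrative content prefab

    void OnEnable() => trackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    void OnDisable() => trackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    private void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // Parent the popup to the tracked image so it follows the target.
            Instantiate(infoPopupPrefab, image.transform);
            Debug.Log($"Recognized image target: {image.referenceImage.name}");
        }

        foreach (ARTrackedImage image in args.updated)
        {
            // Hide content while tracking is lost; show it when tracking resumes.
            bool tracked = image.trackingState == TrackingState.Tracking;
            foreach (Transform child in image.transform)
                child.gameObject.SetActive(tracked);
        }
    }
}
```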

Plane Detection
Plane Detection is a form of spatial mapping that detects flat surface regions to define boundaries (e.g., walls, tabletops, etc.). This foundational feature paves the way for advanced constructs such as digital twins (3D mesh representations of the physical environment or objects).
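
In Unity terms, plane detection typically surfaces through AR Foundation’s ARPlaneManager. The sketch below, assuming the AR Foundation 4/5-style event API, logs each newly detected plane’s alignment and size; the class name is illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: listen for detected planes via AR Foundation's ARPlaneManager
// and log each new surface's alignment (horizontal/vertical) and extents.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] private ARPlaneManager planeManager;

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
        {
            Debug.Log($"New plane {plane.trackableId}: " +
                      $"alignment={plane.alignment}, size={plane.size}");
        }
    }
}
```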

The Snapdragon Spaces SDK’s Hit Testing feature also supports ray casting against geometry found in the scanned area, returning the points where rays intersect detected surfaces.
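
A minimal hit-testing sketch, again assuming AR Foundation’s ARRaycastManager as the interface: cast a ray from the center of the user’s view and read back the pose where it first intersects a detected plane:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: hit-test from the center of the user's view. The ray is tested
// against detected planes; the first hit yields a pose on the real-world
// surface where content could be placed.
public class GazeHitTester : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private Camera arCamera;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        var gazeRay = new Ray(arCamera.transform.position, arCamera.transform.forward);
        if (raycastManager.Raycast(gazeRay, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose; // closest intersection with a detected plane
            Debug.Log($"Gaze hit surface at {hitPose.position}");
        }
    }
}
```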

Local Anchors
An anchor is metadata used to position, track, and persist digital content. Anchors give you the ability to attach (i.e., lock or pin) digital assets to a point in space, associating them with real-world clusters of recognized geometry.

Local anchors are currently restricted to on-device local storage within an instance of an app and, by default, are cleared as soon as you close the app.

Local anchor information can now be stored in a local save file, allowing anchors to be recalled when the scene is opened again. This creates the illusion that objects live in the environment across time. Not surprisingly, anchors depend on accurate positional tracking to enable their placement and orientation.
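
As a simplified illustration of that save-and-recall flow, the sketch below writes an anchor’s pose to a local JSON file and restores content there on the next launch. This is only a stand-in for the concept: the Snapdragon Spaces SDK provides its own calls for saving and loading persistent anchors, and persisting a raw pose is not equivalent to true anchor persistence:

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Simplified stand-in for local anchor persistence: write an anchor's pose
// to a local file, then re-place content there on the next launch. (The
// Snapdragon Spaces SDK has its own save/load calls for true persistent
// anchors; storing a raw pose only approximates the idea.)
public class AnchorSaver : MonoBehaviour
{
    [System.Serializable]
    private struct SavedPose { public Vector3 position; public Quaternion rotation; }

    private string SavePath => Path.Combine(Application.persistentDataPath, "anchor.json");

    public void SaveAnchor(ARAnchor anchor)
    {
        var saved = new SavedPose
        {
            position = anchor.transform.position,
            rotation = anchor.transform.rotation
        };
        File.WriteAllText(SavePath, JsonUtility.ToJson(saved));
    }

    public ARAnchor RestoreAnchor(GameObject content)
    {
        if (!File.Exists(SavePath)) return null;
        var saved = JsonUtility.FromJson<SavedPose>(File.ReadAllText(SavePath));
        content.transform.SetPositionAndRotation(saved.position, saved.rotation);
        // Adding an ARAnchor component asks the XR runtime to track this pose.
        return content.AddComponent<ARAnchor>();
    }
}
```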

Learn More About Snapdragon Spaces and Start Building Your Headworn AR App
Head to the Snapdragon Spaces Developer Portal to check out the documentation, find code samples, register to download the Snapdragon Spaces SDK for Unity or Unreal, and learn about compatible devices.

Author: Brian Vogelsang
Brian Vogelsang is Senior Director, Product Management for Snapdragon Spaces. He leads strategy and partnerships for the XR (Augmented & Virtual Reality) business at Qualcomm Technologies, Inc.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

This article was previously published on proandroiddev.com
