Earlier this year we unveiled one of the most advanced XR devices ever. Lightweight, powerful, versatile, and packed full of features and flexibility – VIVE XR Elite.
We always want to push the boundaries of what’s possible, and we love seeing others in the industry do the same. We recently talked about VIVE XR Elite and OpenXR, and we wanted to share more about the new features we’re unlocking for Developers with VIVE XR Elite.
Let’s dive in.
Our Mixed Reality takes full advantage of the full-colour RGB camera in VIVE XR Elite. We designed it to be high quality so that people can do things like read the text on screens such as phones and monitors. That’s great for working in XR, or for taking immersive experiences to the next level. To help Developers create these experiences, we’re enabling MR over PCVR streaming. Launching in Beta for Developers and Enterprise users initially, this is available from today through VIVE Business Streaming after opting into the beta branch and updating to v1.11.3. MR Mode support in this update starts with the MR passthrough underlay, and we’ll have more MR features to come in future releases of VIVE Streaming, including OpenXR support, mesh, and anchors.
Model: Turbine | Turbofan Engine, by blenderbirb / CC BY 4.0
In the short clip above, you can see it working with Blender in XR with the passthrough underlay.
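For Developers curious how a passthrough underlay is expressed on the OpenXR side (which, as noted above, is still to come for MR Mode in VIVE Streaming), here’s a minimal C++ sketch. It assumes a runtime that reports OpenXR’s standard XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND, which composites anything your app leaves transparent over the camera feed:

```cpp
#include <openxr/openxr.h>
#include <vector>

// Ask the runtime which environment blend modes it supports, and prefer
// ALPHA_BLEND: app pixels with alpha < 1 are composited over the camera
// passthrough, giving the "underlay" effect.
XrEnvironmentBlendMode ChooseBlendMode(XrInstance instance, XrSystemId systemId) {
    uint32_t count = 0;
    xrEnumerateEnvironmentBlendModes(instance, systemId,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, 0, &count, nullptr);
    std::vector<XrEnvironmentBlendMode> modes(count);
    xrEnumerateEnvironmentBlendModes(instance, systemId,
        XR_VIEW_CONFIGURATION_TYPE_PRIMARY_STEREO, count, &count, modes.data());
    for (XrEnvironmentBlendMode mode : modes)
        if (mode == XR_ENVIRONMENT_BLEND_MODE_ALPHA_BLEND)
            return mode;                       // passthrough underlay available
    return XR_ENVIRONMENT_BLEND_MODE_OPAQUE;   // fall back to fully virtual
}

// Per frame, pass the chosen mode to xrEndFrame via
// XrFrameEndInfo::environmentBlendMode; anything your app renders
// with zero alpha will show the real world behind it.
```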
Oh, and with our latest Wave SDK (v5.2 or higher) we now support DirectPreview in both the Unity and Unreal editors, so you can preview your application directly while you're developing, without needing to build, sign, and deploy the APK each time!
There are different ways to design co-location experiences for users of VIVE XR Elite, so that people can play the same experience in the same place at the same time. They can even play different experiences in the same place at the same time while staying aware of each other, like in a traditional LBE venue.
Next week, via our FOTA 4.0 system update, we’re rolling out support for marker-based co-location, specifically using ArUco code markers. These trackable markers let Developers identify a specific, predefined pattern and resolve it into the marker’s location and orientation. That pose can then be applied as an anchor, or used to align coordinates for multiple users in the same room.
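To give a feel for the principle behind this, here’s a short sketch of ArUco detection and pose estimation, assuming an OpenCV 4.x build with the contrib aruco module. It’s illustrative only: VIVE XR Elite performs marker tracking on-device, and these OpenCV function names are not part of the Wave SDK:

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>
#include <vector>

// Detect ArUco markers in a camera frame and recover each marker's 6-DoF pose
// (rotation + translation) relative to the camera. cameraMatrix and distCoeffs
// are the camera's calibrated intrinsics.
void DetectMarkerPoses(const cv::Mat& frame,
                       const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs) {
    auto dictionary = cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);
    std::vector<int> ids;
    std::vector<std::vector<cv::Point2f>> corners;
    cv::aruco::detectMarkers(frame, dictionary, corners, ids);
    if (ids.empty()) return;

    const float markerSideMeters = 0.10f;  // physical size of the printed marker
    std::vector<cv::Vec3d> rvecs, tvecs;
    cv::aruco::estimatePoseSingleMarkers(corners, markerSideMeters,
                                         cameraMatrix, distCoeffs, rvecs, tvecs);
    // rvecs[i]/tvecs[i] is marker ids[i]'s pose in the camera frame; a runtime
    // would promote this to an anchor or a shared coordinate origin.
}
```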
Some of you may be familiar with ArUco codes from our work with VIVE Focus 3 and Location Based Experiences.
The demo you see above is our internal experiment called Shared Space Experience, which uses ArUco markers to align two headsets in the same space. We have detailed instructions on how to generate, detect, track, and even recycle trackable markers on the Developer Forum. You can find more details linked below.
Soon, we will be open-sourcing Shared Space Experience for developers to try out and to use as sample code for our marker implementation.
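In the meantime, the alignment step itself is easy to reason about: if two headsets each observe the same marker in their own world coordinates, composing one observation with the inverse of the other yields the rigid transform between the two worlds. A minimal sketch using OpenCV’s Affine3d type (the function and variable names are illustrative, not taken from the Shared Space Experience sample):

```cpp
#include <opencv2/core/affine.hpp>

// Headsets A and B each track the same physical marker, but express its pose
// in their own world coordinates. Composing A's observation with the inverse
// of B's gives the rigid transform that maps B's world into A's world:
//   T_A_from_B = T_marker_in_A * T_marker_in_B^-1
cv::Affine3d AlignWorlds(const cv::Affine3d& markerInA,
                         const cv::Affine3d& markerInB) {
    return markerInA * markerInB.inv();
}

// Usage: take a point (or a head pose) reported by headset B and express it
// in headset A's coordinates, so both users share one frame of reference.
// cv::Vec3d pointInA = AlignWorlds(markerInA, markerInB) * pointInB;
```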
Last month we enabled our depth sensor for developers through our Beta program. Our depth sensor makes it even easier to achieve accurate scene scanning by building a virtual mesh from the physical environment, like room mapping in MR. Those meshes can then be used in the experience you're building. If you haven’t seen it in action before, you can see it as part of our Jelbee mixed reality demo.
We’ve also open-sourced our Jelbee demo application for developers to use as sample material. If you want to learn more, head over to our Developer Forum.
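If you want a concrete sense of how an experience can consume those scene meshes, here’s a small, self-contained sketch that ray-casts against a mesh triangle, for example to snap a virtual object onto a real table. The mesh itself would come from the device’s scene APIs, which aren’t shown here; the intersection test is the standard Möller–Trumbore algorithm:

```cpp
#include <opencv2/core.hpp>
#include <cmath>
#include <optional>

// A reconstructed scene mesh is a set of world-space triangles. A common use
// is ray casting against it. Returns the hit distance along the ray, if any.
struct Triangle { cv::Vec3f a, b, c; };

std::optional<float> RayHitsTriangle(const cv::Vec3f& origin,
                                     const cv::Vec3f& dir,  // normalized
                                     const Triangle& tri) {
    const cv::Vec3f e1 = tri.b - tri.a, e2 = tri.c - tri.a;
    const cv::Vec3f p = dir.cross(e2);
    const float det = e1.dot(p);
    if (std::abs(det) < 1e-7f) return std::nullopt;   // ray parallel to triangle
    const float inv = 1.0f / det;
    const cv::Vec3f s = origin - tri.a;
    const float u = s.dot(p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;    // outside the triangle
    const cv::Vec3f q = s.cross(e1);
    const float v = dir.dot(q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float dist = e2.dot(q) * inv;
    return dist > 0.0f ? std::optional<float>(dist) : std::nullopt;
}
```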
Hand tracking is critical for natural user interaction in a lot of use cases, and the industry has seen more and more hand tracking content come to life. VIVE has made huge strides in making our hand tracking more robust, especially when it comes to better handling of occlusion in the latest generation of our tracking engine.
We’re always listening to feedback and making changes based on it. In our next FOTA update for VIVE XR Elite, we’ll unveil more improvements to hand tracking, particularly in environments with reflections, and across more lighting conditions.
We recently shared more information about how Developers can leverage the flexibility of VIVE XR Elite’s inputs, using controllers, hand-tracking, or a combination of the two.
And of course, don’t forget to check out how we continue to support OpenXR. We believe in open ecosystems and making it easy for Developers to create outstanding XR content. Read more here.
We really want to hear what you think about all these new features and VIVE XR Elite. You can reach out to us over on the VIVE Forums or on our official Discord server.
Shen