How would you like the feature to work?
Implement support for the OpenXR standard via a LEAP plugin.
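To make the shape of such a plugin concrete: LEAP plugins talk to the viewer over stdin/stdout using length-prefixed serialized LLSD. The sketch below shows only that framing step; the payload contents are a purely hypothetical illustration, not the actual Puppetry schema, and the exact wire format should be checked against the LEAP documentation.

```python
import sys


def frame_leap_message(llsd_notation: str) -> bytes:
    """Frame a message for the LEAP wire format: a decimal byte count,
    a colon, then the serialized LLSD payload.
    (Framing as I understand it from the LEAP docs; verify before use.)"""
    payload = llsd_notation.encode("utf-8")
    return str(len(payload)).encode("ascii") + b":" + payload


# Hypothetical puppetry update; real Puppetry field names may differ.
update = "{'command':'move','joint':'mWristLeft'}"
framed = frame_leap_message(update)
print(framed)  # b"39:{'command':'move','joint':'mWristLeft'}"

# A real plugin would write the framed bytes to the viewer:
# sys.stdout.buffer.write(framed)
```

An official plugin would sit between an OpenXR runtime and this framing layer, translating OpenXR pose data into whatever LLSD schema Puppetry expects.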
Why is this feature important to you? How would it benefit the community?
The functionality that Puppetry introduces has incredible potential, especially in a world where AR and VR devices are becoming available at ever-lower prices, giving a wider user base access to hardware capable of some degree of reliable body and/or face tracking.
However, implementing support for individual devices is not practical and is far from future-proof; and leaving the creation of such a plugin to the development community outside of Linden Lab opens the door to many problems, including, but not limited to: privacy concerns, malware concerns, difficulty of setup, and incompatibilities between viewers. Having an official plugin would solve many of these problems.
An OpenXR plugin could also work alongside the existing webcam Puppetry plugin: OpenXR could provide body-tracking data while the camera provides face tracking, essentially allowing complete control of a Second Life avatar via real-world actions.
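As a rough illustration of that division of labor, the two sources could be merged into a single avatar update before being sent to the viewer. The joint names and data shapes here are hypothetical placeholders, not the actual Puppetry schema:

```python
def merge_tracking(body_joints: dict, face_joints: dict) -> dict:
    """Combine body-tracking data (e.g. from an OpenXR source) with
    face-tracking data (e.g. from the webcam plugin) into one update.
    Body data takes priority where the two sources overlap."""
    merged = dict(face_joints)
    merged.update(body_joints)
    return merged


# Hypothetical joint names and position tuples for illustration only.
body = {"mWristLeft": (0.1, 0.2, 0.3), "mElbowLeft": (0.0, 0.4, 0.2)}
face = {"mFaceJaw": (0.0, 0.05, 0.0)}
print(sorted(merge_tracking(body, face)))
# ['mElbowLeft', 'mFaceJaw', 'mWristLeft']
```

In practice, conflict resolution between sources (which plugin "owns" which joints) would need to be part of the plugin's design.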
To avoid these issues, I'd like to propose that Linden Lab create an official plugin to interface between OpenXR and LEAP, which removes the burden of supporting devices individually and instead allows any OpenXR-compatible device to interface with the Puppetry system.
Other residents have created a proof of concept using tracking data from a variety of hardware. So far, the system has been tested with the Meta Quest headset and trackers, Valve Index trackers, HTC Vive trackers, and Kinect cameras, with tracking data retrieved via OpenXR and live-streamed into SL via the Puppetry system. See the attached videos, graciously provided by WindowsCE and Darksider Alex.
Given the push towards open standards in other current projects at Linden Lab (notably the PBR project and its adoption of Khronos glTF, the general positivity that approach has earned from users, and the intention to transition Second Life onto a Vulkan-based renderer), implementing another Khronos standard within the Puppetry project makes perfect sense. It massively increases the scope of potential out-of-the-box uses and plans ahead for Second Life's future. For more in-depth information about OpenXR, its history, development, and future, see this video: https://youtu.be/cMyUqDeGH6A
One example of a business use case for such an implementation is the ability to hold a live-production concert in the virtual world, as seen on some competing platforms during the pandemic. One example of this being done with full-body tracking is a Fortnite live music concert, see here: https://youtu.be/wYeFAlVC8qU?t=76
This is far from the only use case, however. The functionality could also be used to attract educators (and, in turn, educational institutions) to host virtual seminars and presentations on the Second Life platform, leveraging the enhanced user interactivity and gameplay possibilities that a link between the real world and the virtual world would provide.
There may come a time when Second Life considers adding VR functionality. When it does, having OpenXR support already integrated into Second Life would be a massive benefit, forming a solid backbone to build on and lowering the overall complexity of that project. For now, OpenXR can do a lot of the Puppetry heavy lifting, but, as mentioned, it could become much more in the future. An official OpenXR plugin from LL may also encourage TPVs to innovate on the concept and possibly develop their own VR-enabled viewers. As it stands, the majority of the industry making AR/VR headsets and tracking technology has adopted the OpenXR standard for compatibility across a wide range of software and devices.
Last but not least, some honorable mentions of other platforms already integrating OpenXR: Meta Horizon Worlds, VRChat, NeosVR, GarrysMod, Sandbox, RecRoom, SteamHangOuts, Mocap Fusion
Additionally, there already seems to be some degree of integration between Python and OpenXR, as shown in this GitHub repo: https://github.com/cmbruns/pyopenxr