
Cyborgs Who Access The ENTIRE Metaverse

Give a cyborg one virtual world and you feed him for a day. Make his avatar interoperable and you feed him for a lifetime.


One essential characteristic is absent from all existing VR applications for Motor BCIs: the Avatar’s ability to interact with virtual worlds beyond those developed for a given study or tool.


Paradoxically, to meet the needs of Users with VR (Success Criteria satisfied with VR), Avatars must be able to interact with as many of the available VR applications — and switch between them — as any other VR user can.


To bridge this gap, I examine the demand for interoperable Avatars across the VR industry, the steps any new tracking hardware must take to penetrate the cross-compatible VR market, and how Motor BCIs fit into this narrative.


Movements of Daily Living


Owing to the revival of VR, arguably traceable to Palmer Luckey’s highly immersive 2011 Oculus Rift prototype (Carmack, 2012), dozens of HMD models and even more trackers circulate in today’s market of early VR adopters (Fig. 41).


More importantly, while some VR systems are beginning to target a wider demographic, treating VR as anything but an early-adopter market does the field more harm than good.


Rogers (1962) brilliantly characterizes the early-adopter archetype who, to this day, controls the adoption of VR:


“The early adopter is considered by many as ‘the individual to check with’ before using a new idea. This adopter category is generally sought by change agents to be a local missionary for speeding the diffusion process. […] So the role of the early adopter is to decrease uncertainty about a new idea by adopting it, and then conveying a subjective evaluation of the innovation to near-peers by means of interpersonal networks.”


In many regards, the blossoming of the digital economy (also known as “Web2.0”) in the early 2000s is analogous to VR’s present stage of maturity.


Roughly 900M worldwide shipments of personal computers (“PCs”) from 2001 to 2006 generated roughly $1.5T in value (~$2.2T in present purchasing power). During this period, PC shipments and the value they generated grew at an average rate of roughly 10% year-over-year. However, the number of hosts supplying the internet with information increased disproportionately, by roughly 30% year-over-year (Britton and McGonegal, 2007). This disproportion reflects the rise of internet access among the PC consumer base alongside the growth of the networked host computers supporting those users (serving content and maintaining transactions) and their interactions (user-generated content and communication).


The evolution of the value added by the digital economy to the GDP of the US further supports the content-driven argument, as computer hardware’s value-add has progressively yielded to software, e-commerce, and digital media (Fig. 38).


Figure 38; Barefoot et al., 2019: Components of the digital economy: current-dollar value-added share of the total in 1997, 2007, and 2017. Reprinted and condensed from Chart 5.



As in 1997, when the takeoff of Web2.0 was on the horizon and content’s value-add roughly equaled that of PC hardware, the same ratio can be observed in the VR market of 2023 (Fig. 39).



Figure 39; The Economist, 2022: Reprinted and condensed.



A newfound master of its movements, a child whines when it cannot interact with its environment. We, too, grow flustered when limitless virtual possibilities are reduced to a handful of immersive activities.


Yet as demonstrated in Success Criteria satisfied with VR and propelled by market demand, VR content is poised to break free from its niche use cases and match the breadth of human experience, even surpassing it with fantastical qualia.


While much content already circulates in the VR market, its incompatibility across hardware critically hampers this momentum and prevents users from embracing it.


🖼️In 2024, it is the lack of cross-compatible content, not technological capability or price, that primarily hinders early-majority adoption of VR.


Access to VR Content


Presently, Steam offers the most extensive VR content library, with over eight thousand applications and games. To access them, users typically run them on a PC and connect their HMD and trackers to it.


Some HMDs, like the Meta Quests, Pico 4, or Vive XR Elite, offer a standalone experience with built-in stores (the Quest Store, Pico Store, and Viveport Infinity, respectively) for accessing VR content without a PC. While these devices dominate HMD sales (Omdia, 2023), their stores offer limited content coverage and quality due to the headsets’ inherently poor computational capacity.


Much as Microsoft PC users accessed the internet through the rival Netscape browser in the early digital economy, tethered Meta Quest 2 users now dominate the Steam library (Fig. 40).


Figure 40; Steam, 2024: Steam Hardware & Software Survey: December 2023 — Steam conducts a monthly survey to collect data about what kinds of computer hardware and software our customers are using. Participation in the survey is optional and anonymous.



However, the landscape remains diverse: to reach a broad audience of potential users, a VR application must support many of the devices on the market (Fig. 41).


Figure 41: Some of the commercially available VR systems, including their proprietary tracking and input devices.


The browsers of the early digital economy embraced standards such as HTML5, CSS3, Ajax, DOM, JSON, XML 5th edition, HTTP/1, REST, and so on, so that web developers could create cross-browser dynamic content that looks stunning and behaves like a desktop application.


In a network architecture devoid of these standards, a website must depend on proprietary or non-standardized alternative protocols to retrieve, submit, assemble, and render the content hosted on the server.


Similarly, a VR application targeting multiple devices depends on each device’s proprietary way of retrieving tracking and controller input information, assembling the virtual environment, characters, and models, and rendering them to stereoscopic displays.


More sophisticated VR applications may even host much of their content on a server and transfer it to the device at the request of the application client (e.g., virtual worlds, avatars, or even real-time tracking information of users in social experiences). For example, Kim et al. (2022) published a guide to prototyping such an application using Amazon Web Services.


The Motor BCI, therefore, acts as yet another VR input device. Applications wishing to use it would need to implement its proprietary data transfer method, just as the Motor BCI researchers did.


For the User to interact effectively with virtual worlds using their Motor BCI input, that input must be interpretable by an immense array of available applications.


Standards to the Rescue


Luckily, the same challenge faced by a Motor BCI is shared by any new input modality wishing to join the Spatial Computing market, and by any existing runtime (the software handling everything from tracking and device inputs through the efficient rendering of 3D graphics and spatial audio) that aims to support a wider range of VR applications.

Under this significant pressure, the Khronos Group released the first widely accepted VR standard in 2019: OpenXR 1.0 (rpavlik, 2019).


Rather than a runtime itself, OpenXR is a specification that defines many of the inputs, processes, and outputs a runtime (e.g., SteamVR, Quest, Monado) should comply with, so that a single OpenXR application implementation can function reliably across the runtimes’ many devices.


In the words of the Khronos Group (OpenXR Working Group, 2024):

“OpenXR is an API for XR applications. […] OpenXR is the interface between an application and an in-process or out-of-process ‘XR runtime system,’ or just ‘runtime’ hereafter. The runtime may handle such functionality as frame composition, peripheral management, and raw tracking information. Optionally, a runtime may support device layer plugins which allow access to a variety of various hardware across a commonly defined interface.”
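
To make this interface concrete, here is a minimal sketch (assuming the OpenXR loader and headers from the Khronos OpenXR SDK are installed) that asks whichever runtime is active to list the extensions it supports, such as XR_EXT_hand_tracking:

```cpp
// Minimal sketch: ask the active OpenXR runtime which extensions it
// supports (e.g., XR_EXT_hand_tracking). Assumes the Khronos OpenXR
// loader and headers are installed; error handling kept minimal.
#include <openxr/openxr.h>
#include <cstdio>
#include <vector>

int main() {
    // First call retrieves the number of extensions the runtime exposes.
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);

    // Second call fills the array; each entry must declare its struct type.
    std::vector<XrExtensionProperties> props(count, {XR_TYPE_EXTENSION_PROPERTIES});
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());

    for (const auto& p : props)
        std::printf("%s (version %u)\n", p.extensionName, p.extensionVersion);
    return 0;
}
```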


Figure 42; The Khronos Group Inc., 2019: Tackling XR fragmentation with OpenXR. Reprinted.


Rather than a runtime, the Motor BCI should be thought of as a custom input device plugin for a runtime (Fig. 42), one which replaces the inputs otherwise provided by controllers (i.e., obtained from IMUs, infrared LEDs, and button ammeters), cameras (tracking objects with infrared light), or trackers (of either the outside-in or inside-out modality).


💃To summarize, the Motor BCI produces decoded movement variables, while the HMD has access to numerous VR applications on a PC running the SteamVR runtime. The task at hand is to transform the decoded movement variables into a format the VR applications in SteamVR can seamlessly recognize, and to submit them accordingly.


To this end, OpenXR also includes ‘API Layers,’ which can programmatically modify the runtime’s functionality exposed through the OpenXR interface. Importantly for the Motor BCI, one such piece of functionality is hand tracking.


The widely supported XR_EXT_hand_tracking OpenXR extension provides a VR application with the individual joint kinematics it can use to render hands and interact with virtual objects.


Overriding these kinematics with the Motor BCI’s decoded representation via an implicit OpenXR API Layer allows Users to automatically interact with all applications that implement this extension (for instance, any application built in Unity with the popular XR Hands package), as sketched below.
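
The sketch below illustrates the interception at the heart of such a layer. It assumes the layer plumbing (loader negotiation and dispatch-table capture) is already in place, and bci_decode_joint is a hypothetical stand-in for the Motor BCI decoder, not part of any real API:

```cpp
// Sketch of the function an implicit OpenXR API Layer would interpose:
// the runtime's hand-joint estimate is replaced with Motor-BCI-decoded
// kinematics before the application sees it. Loader negotiation and
// dispatch-table setup are omitted for brevity.
#include <openxr/openxr.h>

// Hypothetical: returns the decoded pose of one hand joint.
extern XrPosef bci_decode_joint(uint32_t jointIndex);

// The runtime's real implementation, captured during layer initialization.
static PFN_xrLocateHandJointsEXT next_xrLocateHandJointsEXT = nullptr;

static XrResult XRAPI_CALL layer_xrLocateHandJointsEXT(
    XrHandTrackerEXT handTracker,
    const XrHandJointsLocateInfoEXT* locateInfo,
    XrHandJointLocationsEXT* locations) {
    // Let the runtime run first so spaces, radii, and flags are populated.
    XrResult res = next_xrLocateHandJointsEXT(handTracker, locateInfo, locations);
    if (XR_FAILED(res)) return res;

    // Overwrite every joint (26 per hand in XR_EXT_hand_tracking) with the
    // Motor BCI's decoded kinematics and mark the poses as valid.
    locations->isActive = XR_TRUE;
    for (uint32_t i = 0; i < locations->jointCount; ++i) {
        locations->jointLocations[i].pose = bci_decode_joint(i);
        locations->jointLocations[i].locationFlags =
            XR_SPACE_LOCATION_ORIENTATION_VALID_BIT |
            XR_SPACE_LOCATION_POSITION_VALID_BIT;
    }
    return res;
}
```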


Figure 43; The Khronos Group Inc., 2018: Motor BCI acts as a kind of OpenXR Device Plugin. Reprinted.



Companies developing motion-capture and haptic glove VR hardware have used this exact approach to feed their custom hand poses to OpenXR runtimes (Ultraleap, n.d.; Manus, n.d.).


Alternatively, approaches like the OpenGloves Driver (2021) use Valve’s proprietary OpenVR API, namely the Skeletal Input System of IVRDriverInput, to feed finger tracking from a custom glove into the SteamVR runtime. OpenVR is, in many respects, the precursor to OpenXR, exposing some SteamVR functionality (accessible by a variety of HMDs compatible with SteamVR content; Fig. 41) to application developers and input device manufacturers. In 2020, Valve abandoned OpenVR in favor of OpenXR, although OpenVR solutions will continue to work (Valve, 2020).
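
As a rough sketch of that Skeletal Input path, the fragment below shows the two IVRDriverInput calls involved, assuming it runs inside a SteamVR device driver built against openvr_driver.h; SetupSkeleton, SubmitDecodedHand, and the 31 decoded bone transforms are illustrative stand-ins for whatever the Motor BCI actually produces:

```cpp
// Sketch of the OpenVR Skeletal Input path the OpenGloves driver uses,
// here fed by hypothetical Motor-BCI bone transforms. This would live
// inside a SteamVR device driver's Activate()/update loop.
#include <openvr_driver.h>

static vr::VRInputComponentHandle_t g_skeleton = vr::k_ulInvalidInputComponentHandle;

// Called once when SteamVR activates the driver's device.
void SetupSkeleton(vr::PropertyContainerHandle_t container) {
    vr::VRDriverInput()->CreateSkeletonComponent(
        container,
        "/input/skeleton/right",      // component name
        "/skeleton/hand/right",       // skeleton path
        "/pose/raw",                  // base pose path
        vr::VRSkeletalTracking_Full,  // full per-finger tracking
        nullptr, 0,                   // no grip-limit override
        &g_skeleton);
}

// Called each frame with 31 decoded bone transforms (hypothetical source).
void SubmitDecodedHand(const vr::VRBoneTransform_t* bones) {
    vr::VRDriverInput()->UpdateSkeletonComponent(
        g_skeleton, vr::VRSkeletalMotionRange_WithoutController, bones, 31);
}
```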


To quickly validate a prototype, the Motor BCI could leverage the open-source OpenGloves driver, acting as a custom hardware input. The benefit of using the deprecated OpenVR API lies in its long-standing validation by other developers; OpenXR extensions, in contrast, are still in the early stages of adoption.


The Cortical Bionics group also implements OpenVR, although for a different purpose. While their code is not yet available, in general they project images rendered by MuJoCo’s OpenGL renderer to SteamVR HMDs via the IVRCompositor, and they query changes in the HMD’s position and orientation (as the User looks around) from SteamVR via the IVRSystem interface. These interfaces are also essential for a User who has lost the ability to move their head but can provide such motor commands via the Motor BCI.
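
A sketch of those two interfaces in use might look as follows, assuming the OpenVR session has been initialized with VR_Init and the eye images have already been rendered into OpenGL textures (the MuJoCo side is omitted):

```cpp
// Sketch: per-frame OpenVR client loop mirroring the description above.
// WaitGetPoses (IVRCompositor) paces the frame and returns the HMD pose;
// Submit hands the rendered eye textures to SteamVR.
#include <openvr.h>
#include <cstdint>

void SubmitFrame(uint32_t leftGlTex, uint32_t rightGlTex) {
    // Blocks until the compositor is ready; fills in all device poses.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    vr::VRCompositor()->WaitGetPoses(poses, vr::k_unMaxTrackedDeviceCount,
                                     nullptr, 0);

    // The head transform the renderer should use for the next frame; for a
    // User who cannot move their head, a Motor BCI could drive this instead.
    const vr::HmdMatrix34_t& head =
        poses[vr::k_unTrackedDeviceIndex_Hmd].mDeviceToAbsoluteTracking;
    (void)head;  // would be passed to the renderer's camera here

    // Hand the rendered OpenGL eye textures to the SteamVR compositor.
    vr::Texture_t left  = {reinterpret_cast<void*>(static_cast<uintptr_t>(leftGlTex)),
                           vr::TextureType_OpenGL, vr::ColorSpace_Gamma};
    vr::Texture_t right = {reinterpret_cast<void*>(static_cast<uintptr_t>(rightGlTex)),
                           vr::TextureType_OpenGL, vr::ColorSpace_Gamma};
    vr::VRCompositor()->Submit(vr::Eye_Left, &left);
    vr::VRCompositor()->Submit(vr::Eye_Right, &right);
}
```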


Beyond supplying VR applications with the arm and hand (XR_EXT_hand_tracking, XR_EXT_palm_pose, XR_EXT_hand_joints_motion_range, XR_EXT_hand_interaction, and other vendor-specific OpenXR hand extensions), other OpenXR extensions the Motor BCI may override include:

  • Face tracking: XR_FB_face_tracking, which can trigger 63 facial expressions (blend shapes), and XR_HTC_facial_tracking, with a combined 49 expressions;

  • Upper-body tracking: XR_FB_body_tracking, with 70 joints (18 core body joints plus 52 hand joints).


Notably, the OpenXR API Layer and the OpenVR Input Driver can be activated by a specific application (explicitly) or by other preinstalled programs on runtime startup (implicitly). Unfortunately, popular standalone HMD runtimes based on Android still do not support OpenXR API Layers, as they use custom loaders (PICO, n.d.; Meta, n.d.).
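
On the desktop runtimes that do support them, ‘implicit’ activation simply means the OpenXR loader discovers the layer through a JSON manifest placed in a well-known directory, rather than the application requesting it. A hypothetical manifest for the hand-tracking layer sketched earlier might look like this (the names and paths are illustrative; the disable_environment entry is mandatory for implicit layers):

```json
{
    "file_format_version": "1.0.0",
    "api_layer": {
        "name": "XR_APILAYER_UCAT_motor_bci",
        "library_path": "./libmotor_bci_layer.so",
        "api_version": "1.0",
        "implementation_version": "1",
        "description": "Replaces hand-joint kinematics with Motor BCI output",
        "disable_environment": "DISABLE_XR_APILAYER_UCAT_motor_bci"
    }
}
```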


To supply the interoperable Avatar with speech, the User needs to venture outside the OpenXR specification. The program that implicitly plugs the Motor BCI into the VR runtime may include a simulated audio device that forwards the synthesized voice into the SteamVR runtime as it is decoded. Similar implementations include the Advanced Settings OpenVR application, which lets the user choose the audio device, and the modern voice-modification programs that adjust the incoming audio stream ahead of the SteamVR renderer.


While the Khronos Group is making strides in standardizing the fragmented VR ecosystem, many more pressing issues persist. For example, any rooms, characters, or objects the user creates in one application do not persist between games; at best, they can be exported.


The Metaverse Standards Forum, a non-profit XR collective of over three thousand members, has identified over 200 topics in dire need of standardization in the age of Spatial Computing. It has further collapsed them into 21 prioritized domains to be systematically standardized by its working groups (Fig. 44).


Figure 44; Metaverse Standards Forum, 2023: The Metaverse Standards Forum Domain Group Process Pipeline. Reprinted.


Nevertheless, with either the OpenXR or the OpenVR implementation, the User may use their Motor BCI to interact with a plethora of VR applications (SteamDB OpenVR, 2024; SteamDB OpenXR, 2024; Finger Tracking Index Compatibility, 2023), including those native to the compatible runtime, such as settings, content libraries, galleries, et cetera. In other words, once the User puts on the HMD, they can control VR just like any other user.


 

Part 15 of a series of unedited excerpts from uCat: Transcend the Limits of Body, Time, and Space by Sam Hosovsky*, Oliver Shetler, Luke Turner, and Cai Kinnaird. First published on Feb 29th, 2024, and licensed under CC BY-NC-SA 4.0.



uCat is a community of entrepreneurs, transhumanists, techno-optimists, and many others who recognize the alignment of the technological frontiers described in this work. Join us!


*Sam was the primary author of this excerpt.
