[DAO:d953d6e] WebXR - The future of VR in Decentraland

by 0x8b257b97c0e07e527b073b6513ba8ea659279b61 (Morph)

Should the problem/opportunity outlined be refined and taken to the next level?

Problem Statement

It is going to be increasingly difficult to retain feature parity and user parity across three different verticals (Browser/Desktop/VR), especially with dozens of VR devices and multiple interfaces.

We need a solution that brings these clients closer together instead of siloing them further apart. What's more, we need a solution that leverages the current client and technology we have, instead of creating more clients for the foundation/DAO to maintain.

Proposed Solution

The browser is one of the greatest interoperable technologies ever developed; the majority of browsers run on the same open-source technology (Chromium), and browsers are compatible with nearly every device in the world.

WebXR is an open interface that lets VR devices connect to 3D content in the browser. All major VR headsets, including Quest and the latest Apple headset, support WebXR. By simply opening a URL/browser window in VR, users can immediately connect to VR platforms, with all the modern security and functionality of the browser.

You can see this technology for yourself; it is already in use at Hyperfy, another open metaverse with ideals similar to Decentraland's.

You can read more about the WebXR spec here:

https://www.w3.org/TR/webxr/
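For reference, the browser-side entry point is tiny. Here is a minimal sketch using the standard WebXR API (plain browser code, not anything from the Decentraland client):

```ts
// Minimal WebXR bootstrap: check for immersive VR support, then start a session
// that renders through an existing WebGL context. Standard browser API only.
async function enterVR(gl: WebGL2RenderingContext): Promise<XRSession | undefined> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-vr"))) {
    console.warn("WebXR immersive VR is not available on this device/browser");
    return;
  }
  const session = await navigator.xr.requestSession("immersive-vr", {
    optionalFeatures: ["local-floor"], // room-scale reference space, when the headset exposes it
  });
  await gl.makeXRCompatible();                // share the page's WebGL context with the headset
  await session.updateRenderState({
    baseLayer: new XRWebGLLayer(session, gl), // headset swapchain backed by that context
  });
  return session;
}
```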

Target Audience/Customer Base

The original premise of Decentraland was a VR platform and an open-source SDK to enable the future open metaverse.

I believe this is a large driving force behind MANA's continued valuation, and much of the user/developer interest in the space.

Unfortunately, it seems this has been removed from the current foundation roadmap.

Why is this relevant now?

You may be quick to dismiss the browser as an older, outdated technology compared to the higher graphics fidelity and frame rates available in the desktop client. However, WebGPU is a new technology that lets the browser tap graphics card resources for rendering. That capability, alongside our metaverse 3D engine already available in-browser, would be incredibly powerful, secure, and easier to jump into than a desktop download.
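To make the WebGPU point concrete, here is what the entry point looks like today (standard browser API; the adapter/device handling is the part a renderer would build on):

```ts
// Minimal WebGPU bootstrap: ask the browser for a GPU adapter and logical device,
// then configure a canvas to present with it. Standard browser API only.
async function initWebGPU(canvas: HTMLCanvasElement) {
  if (!("gpu" in navigator)) {
    throw new Error("WebGPU is not available in this browser");
  }
  const adapter = await navigator.gpu.requestAdapter();   // physical GPU chosen by the browser
  if (!adapter) throw new Error("No suitable GPU adapter found");
  const device = await adapter.requestDevice();           // logical device used to create resources
  const context = canvas.getContext("webgpu") as GPUCanvasContext;
  context.configure({
    device,
    format: navigator.gpu.getPreferredCanvasFormat(),     // preferred swapchain texture format
  });
  return { device, context };
}
```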

We should begin testing WebXR integrations now, ahead of WebGPU being ready; we may find that WebXR already provides a decent in-browser VR experience on higher-end devices today.

To begin, we do not need full body tracking like VRChat, we simply need a user to be able to walk around as their avatar, see other users in browser/desktop mode, interact with world objects, and use voice chat.

A proof-of-concept WebXR integration would enable this without requiring additional backend changes to the current character interfaces. It also seems .VRM support was discussed as being on the roadmap at recent avatar Q&A sessions, which could be leveraged together with this concept.

Long term, full-body tracking will be a necessity. However, I believe this kind of functionality is perfectly suited to our distributed node system thanks to local latency options, and it could perhaps even be a paid premium feature used to incentivize more nodes.

Vote on this proposal on the Decentraland DAO

View this proposal on Snapshot

As someone who is forced to keep up to date with browser technology day to day, I truly believe people are underestimating how powerful it currently is, and how much more powerful it will soon be.

With the standardization that has happened across browsers over the last 5-10 years, alongside security improvements and new hardware-facing APIs for GPU access and better memory handling, it is very possible we will see browsers replace much of the role of OS middleware in the years to come.

We should seize this opportunity to make the browser experience not only the ultimate interoperable and accessible portal, but a complete experience with native VR right out of the box and no downloads.

I truly believe this, alongside the .VRM expansion (see Enable .vrm support for Decentraland models), provides the missing pieces of the puzzle for a complete metaverse engine.


It seems reasonable, but I'm going to need some of the more technically inclined, VR-enthusiast members of the community to tell me if you are way off base or not.

Pretty sure this would require a different client from the Unity one.
Perhaps the BabylonJS client from the protocol squad could be used for this?

I believe it's compatible, as it should simply be an interface layered on top of the browser code. Camera movement can be converted between headset tracking and mouse look, while locomotion remains joystick-controlled; a separate point-and-click interface is likely needed that casts from the controller's position and direction instead of the viewpoint projection.
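To illustrate what I mean by that conversion, here is a rough sketch of mapping the WebXR viewer pose onto a mouse-look style camera. The `LookCamera` shape is a placeholder I made up; the real client will expose something different:

```ts
// Hypothetical mapping from the WebXR headset pose to a yaw/pitch mouse-look camera.
// `LookCamera` is an illustrative placeholder, not an existing Decentraland interface.
interface LookCamera { yaw: number; pitch: number; }

function applyHeadsetPose(frame: XRFrame, refSpace: XRReferenceSpace, camera: LookCamera): void {
  const pose = frame.getViewerPose(refSpace);
  if (!pose) return; // tracking lost this frame
  const { x, y, z, w } = pose.transform.orientation; // headset orientation as a quaternion
  // Rotate the -Z axis by the quaternion to get the gaze direction in world space...
  const fx = -2 * (x * z + w * y);
  const fy = 2 * (w * x - y * z);
  const fz = 2 * (x * x + y * y) - 1;
  // ...then express it as the yaw/pitch pair a mouse-look camera already understands.
  camera.yaw = Math.atan2(-fx, -fz);
  camera.pitch = Math.asin(Math.max(-1, Math.min(1, fy)));
}
```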

EDIT: I’m not a Unity dev so I’m genuinely asking for feasibility discussion here if possible

Demo for feasibility check: Unity WebGL Player | Unity WebXR Export

Repo:

I think there are two larger discussions to have here:
1.) Is it feasible to implement WebXR on the current platform?
2.) If not, is it worthwhile to focus on the browser to prepare for the incoming wave of WebGPU/WebXR?

I would argue that even if 1 is not possible, 2 seems like a viable path to Decentraland’s original vision and much more realistic than running multiple separate clients. Given this is something that has not previously been viable, but is now thanks to both software and hardware innovations, I think it warrants a discussion on both counts.

That seems very much like a workaround; it might work, but chances are it will create many problems.

This is using a third-party tool.
With the complexity of the Decentraland client, I have my doubts it would work, but we can always try; it would be a free way to support WebXR.


WebXR - The future of VR in Decentraland

This proposal is now in status: PASSED.

Voting Results:

  • Yes 87% 5,684,316 VP (79 votes)
  • No 12% 830,145 VP (5 votes)
  • Abstain 1% 12 VP (1 vote)

Hello!

Thank you to everyone who voted and showed interest in this idea; we reached even higher interest than the previous .VRM proposal!

HP's suggestions are valid; WebXR is natively supported in Babylon.js, and the integration is heavily simplified compared to what it would take on top of the current browser implementation.

The Babylon.js client is also pretty great; I think we can consider it a viable option for WebXR/VR in the browser, with the potential for WebGPU support in the future as well: WebGPU Support | Babylon.js Documentation
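As a rough sketch of how little glue Babylon's WebXR helper needs (assuming a plain Babylon scene with a ground mesh, not the actual protocol client):

```ts
import { Engine, FreeCamera, HemisphericLight, MeshBuilder, Scene, Vector3 } from "@babylonjs/core";

// Minimal Babylon.js WebXR setup: one helper call adds the enter-VR button,
// controller input, and teleportation onto the listed floor meshes.
async function createXRScene(canvas: HTMLCanvasElement) {
  const engine = new Engine(canvas, true);
  const scene = new Scene(engine);
  new FreeCamera("camera", new Vector3(0, 1.7, -5), scene);   // fallback non-VR camera
  new HemisphericLight("light", new Vector3(0, 1, 0), scene);
  const ground = MeshBuilder.CreateGround("ground", { width: 32, height: 32 }, scene);

  const xr = await scene.createDefaultXRExperienceAsync({ floorMeshes: [ground] });

  engine.runRenderLoop(() => scene.render());
  return xr;
}
```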

Overall, I am extremely bullish on the Babylon.js application; it's a simpler and more intuitive interface that many can likely build on top of.

For this concept to be viable, we will also need a networking module for player full-body tracking and, eventually, the ability to add shape keys/jiggle bones to wearables. As we already have a Quest proof of concept, it's my hope that this is something we can start working on horizontally in the near future.
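To make the networking piece less abstract, here is the kind of per-frame pose message I imagine such a module broadcasting. Every name here is hypothetical; nothing like this exists in the current protocol:

```ts
// Hypothetical pose-sync payload for VR avatars; purely illustrative field names.
interface Vec3 { x: number; y: number; z: number; }
interface Quat { x: number; y: number; z: number; w: number; }

interface TrackedJoint {
  position: Vec3;          // metres, relative to the avatar root
  rotation: Quat;
}

interface AvatarPoseUpdate {
  avatarId: string;        // which remote avatar this pose belongs to
  timestamp: number;       // ms, so receivers can interpolate between updates
  head: TrackedJoint;      // always available on a headset
  leftHand?: TrackedJoint; // controllers/hands can lose tracking, so these are optional
  rightHand?: TrackedJoint;
}
```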


Latest update:

Based on the protocol squad update: Discord

It seems the Godot client may be our best bet for a WebXR-capable and interoperable (mobile!) future of DCL!

Godot also seems to have WebXR support, documented here: WebXRInterface — Godot Engine (stable) documentation in English