I think ultimately this is a good idea, but I'd rather focus on getting direct 1:1 apis implemented first. Then we can identify commonality + start building abstractions (which should definitely go through the RFC process). Godot built a high level XR interface, so we can probably learn from that.
Hmm, this is a tough call. I don't have an opinion off the top of my head. We should probably build a separate path first, then try to unify if possible. Ideally users don't have to build entirely new render logic when adding VR.
We'll probably need to change the graph structure according to whether or not XR is enabled (ex: replace the window swapchain node with the xr swapchain node). This will definitely require some thought / design work and like (1), ideally users don't need to build entirely new render graphs to account for this.
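As a rough sketch of that idea (stand-in types only, this is not the actual bevy render-graph API), the graph builder could pick which node fills the primary swap-chain slot based on an XR flag. The `SwapChainNode` trait and `primary_swap_chain_node` function below are made up for illustration; only the `WindowSwapChainNode` / `XrSwapChainNode` names come from this discussion:

```rust
// Stand-in types only: NOT the real bevy render-graph API, just an
// illustration of swapping the primary swap-chain node when XR is enabled.
trait SwapChainNode {
    fn acquire_target(&mut self) -> String;
}

struct WindowSwapChainNode;
impl SwapChainNode for WindowSwapChainNode {
    fn acquire_target(&mut self) -> String {
        "window surface texture".to_string()
    }
}

struct XrSwapChainNode;
impl SwapChainNode for XrSwapChainNode {
    fn acquire_target(&mut self) -> String {
        "OpenXR swapchain image".to_string()
    }
}

// The graph builder decides once which node fills the primary slot, so user
// render graphs downstream don't need to know whether XR is enabled.
fn primary_swap_chain_node(xr_enabled: bool) -> Box<dyn SwapChainNode> {
    if xr_enabled {
        Box::new(XrSwapChainNode)
    } else {
        Box::new(WindowSwapChainNode)
    }
}

fn main() {
    let mut node = primary_swap_chain_node(true);
    println!("rendering into: {}", node.acquire_target());
}
```

The point is that downstream render graphs keep depending on a single "primary swap chain" slot; only the node that produces it changes.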
I've already removed Window from …
I'd personally like to see `bevy_arcore`.
Hi, I wanted to help with XR integration in bevy. @blaind is currently working on it but I wanted to take a step back and discuss some details about how this integration should be implemented (I currently can't actively work on it but I know this can help any contributor).
First of all, we may want a unified system for handling VR, AR and XR devices. A single crate `bevy_xr` can be written as a unified interface for OpenXR, WebXR, ARCore and ARKit backends, but much work needs to be done in other bevy crates for a proper integration. For now I will discuss only the rendering code paths.
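To make the unified-interface idea a bit more concrete, here is a hypothetical sketch of the backend-facing trait such a crate could expose. None of these names (`XrBackend`, `ViewPose`, the methods) exist in bevy or in the listed runtimes; they only illustrate a renderer-agnostic surface that OpenXR, WebXR, ARCore and ARKit backends could each implement:

```rust
// Hypothetical sketch only: none of these names exist in bevy. They illustrate
// what a renderer-agnostic `bevy_xr` backend interface could look like.
pub struct ViewPose {
    pub position: [f32; 3],
    pub orientation: [f32; 4], // quaternion (x, y, z, w)
}

pub trait XrBackend {
    /// Block until the runtime says the next frame should begin.
    fn wait_frame(&mut self);
    /// Predicted per-view (per-eye) poses for the upcoming frame.
    fn view_poses(&self) -> Vec<ViewPose>;
    /// Hand the rendered swapchain image(s) back to the runtime's compositor.
    fn end_frame(&mut self);
}

// Each platform integration would live behind the same trait, e.g.:
// struct OpenXrBackend { /* openxr::Session, swapchains, ... */ }
// impl XrBackend for OpenXrBackend { ... }
```

The rendering-side pieces discussed below would then talk to this trait rather than to a specific runtime.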
Some open questions about the rendering integration:

1. Should the camera projection and pose be set by `bevy_xr`? If we want full control in the backend we may need a new `XR_CAMERA` node.
2. Currently the primary swap chain is created from the window through `WindowSwapChainNode`, but OpenXR uses an offscreen ad-hoc swapchain. If XR mode is enabled, should the primary swapchain be a `XrSwapChainNode`, or should we add a new `XR_SWAP_CHAIN` node? A point in favor of `XR_SWAP_CHAIN` is that `bevy_ui` uses the `PRIMARY_SWAP_CHAIN` and expects it to be linked to a window.
3. `RenderResourceContext` requires a `Window` handle. Should we add `create_swap_chain_xr()` and `next_swap_chain_texture_xr()` methods, or should we abstract `Window` and a (new) `XrScreen` into a new (enum) type?

To house most of the XR logic I thought of adding a `bevy_xr` crate, which should handle input and should be renderer agnostic. The `bevy_xr` plugin struct can be used by `bevy_wgpu` to construct wgpu context objects and swapchains (`bevy_wgpu` will receive just the backend-specific handles).

About wgpu and gfx-hal: @blaind is working on wgpu and it seems on the right path. I already submitted a PR on gfx-hal for constructing objects from raw handles, but it still needs some tweaks to allow gfx-hal to take ownership of these handles (which should simplify the implementation of `bevy_xr`). Currently there is no support for multiview in gfx-hal and wgpu; this can be added later, and the bevy integration can take it into account for future support.
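For question 3 above, the "(enum) type" option could look roughly like the following sketch. `RenderTargetRef`, `XrScreenId`, `SwapChains` and the `u64` texture ids are made-up stand-ins; only the `Window` / `XrScreen` split comes from the question itself:

```rust
use std::collections::HashMap;

// Made-up identifiers for illustration; only the idea of a single enum target
// comes from the discussion above.
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
pub struct WindowId(pub u32);

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
pub struct XrScreenId(pub u32);

/// One type that refers either to an OS window or to an XR "screen"
/// (the offscreen OpenXR swapchain).
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
pub enum RenderTargetRef {
    Window(WindowId),
    XrScreen(XrScreenId),
}

/// With a single target type, one create/next pair serves both paths instead
/// of parallel `*_xr()` methods.
#[derive(Default)]
pub struct SwapChains {
    textures: HashMap<RenderTargetRef, u64>, // stand-in for real texture handles
}

impl SwapChains {
    pub fn create_swap_chain(&mut self, target: RenderTargetRef) {
        self.textures.insert(target, 0);
    }

    pub fn next_swap_chain_texture(&mut self, target: RenderTargetRef) -> Option<u64> {
        self.textures.get_mut(&target).map(|t| {
            *t += 1;
            *t
        })
    }
}
```

With a single target type, `bevy_ui` and other consumers of the primary swap chain would not need XR-specific entry points.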