OpenXR: Virtual Reality Rendering #115

Open
johannesvollmer opened this issue Aug 11, 2020 · 18 comments
Labels
A-Rendering Drawing game state to the screen C-Feature A new feature, making something new possible O-XR Specific to virtual and augmented reality platforms

Comments

@johannesvollmer
Contributor

Hi! This project sounds awesome. I just discovered it and wanted to talk about XR support.

One could assume that this could be implemented as an external module. However, the OpenXR library creates one or more render targets (for Vulkan/GL/...) based on the physical XR hardware, so the XR module needs to talk to the rendering module. As a consequence, a few generic changes to the rendering module will probably be required, changes that are not specific to OpenXR. That's why it is worth discussing this early on.
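To make that constraint concrete, here is a rough sketch of the relevant part of the openxr crate (my own illustration, not from this thread; exact signatures vary between crate versions). The point is that the XR runtime, not the application, owns the images that get rendered into:

```rust
// Rough sketch only; signatures differ between openxr crate versions.
use openxr as xr;

fn print_xr_system() {
    // Requires the crate's "linked" feature so the OpenXR loader is linked in.
    let entry = xr::Entry::linked();
    let instance = entry
        .create_instance(
            &xr::ApplicationInfo {
                application_name: "bevy-xr-sketch",
                ..Default::default()
            },
            &xr::ExtensionSet::default(),
            &[], // no API layers
        )
        .expect("failed to create OpenXR instance");
    let system = instance
        .system(xr::FormFactor::HEAD_MOUNTED_DISPLAY)
        .expect("no head-mounted display available");

    // From here a session is created with a graphics binding (e.g. Vulkan);
    // `session.create_swapchain(..)` / `swapchain.enumerate_images()` then
    // hand back images owned by the XR runtime. The engine's renderer has to
    // draw into those images instead of into a surface it created itself.
    println!("OpenXR system id: {:?}", system);
}
```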

I'd like to look into OpenXR, which already has Rust bindings. However, the architecture should probably be prototyped in collaboration, as I don't know all the bevy stuff yet. I'll read all the bevy documentation and then come back to this issue.

I'd be happy to hear your thoughts on this! Cheers

@cart
Member

cart commented Aug 12, 2020

I would love to support XR. I've never personally written code in that space, but I have a general idea of a few ways it could integrate with bevy. Does OpenXR interact directly with render APIs (e.g. Vulkan / OpenGL), or is it abstract (in a way that would let us use the bevy render API directly)?

If it needs direct vulkan access, then we can either:

  1. build a wgpu extension that exposes the relevant apis
  2. build a second bevy render backend that uses an api (like vulkan) directly

My preference would be to do this through wgpu to reduce complexity on our end (and also improve the wgpu ecosystem).

@erlend-sh

Probably should be done upstream: gfx-rs/gfx#3219

@johannesvollmer
Contributor Author

johannesvollmer commented Aug 12, 2020

Yeah, it needs direct Vulkan access. It should really be done upstream.

This means that bevy should only expose the gfx XR API after gfx adds support.

@StarArawn
Contributor

I want to point out that once gfx-hal supports OpenXR, you'd have to create a wgpu native extension as well (the web won't get this). Keep in mind we'd need some way of turning wgpu native extensions on/off in bevy. Current extensions can be found here:
https://wgpu.rs/doc/wgpu/struct.Features.html
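For context, here is a hedged sketch at the raw wgpu level (not Bevy API; `DeviceDescriptor` field names differ slightly across wgpu versions, and `MULTIVIEW` is just an example of a native-only feature relevant to stereo rendering). Native features are opted into when the device is requested, which is the switch bevy would need to expose:

```rust
// Sketch: requesting a device with an optional native-only feature enabled.
async fn request_device_with_xr_features(
    adapter: &wgpu::Adapter,
) -> Result<(wgpu::Device, wgpu::Queue), wgpu::RequestDeviceError> {
    // MULTIVIEW (single-pass stereo) is native-only; the web backend won't have it.
    let wanted = wgpu::Features::MULTIVIEW;
    // Only request what the adapter actually supports.
    let features = adapter.features() & wanted;

    adapter
        .request_device(
            &wgpu::DeviceDescriptor {
                label: Some("xr-capable device"),
                features,
                limits: wgpu::Limits::default(),
            },
            None, // no trace path
        )
        .await
}
```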

@karroffel added C-Feature A new feature, making something new possible A-Rendering Drawing game state to the screen labels Aug 12, 2020
@IceSentry
Contributor

gfx-rs/gfx#3219 mentions that the API could potentially be based on WebXR, which means it could also target the web without needing separate native and web extensions.

@Moxinilian mentioned this issue Dec 9, 2020
@ambeeeeee
Contributor

The wgpu API will be based on WebXR; the gfx one is going to be based on OpenXR.

@jpryne

jpryne commented Apr 6, 2021

Is this a work in progress? Can I study the code, perhaps even make a pull request?

@ndarilek
Contributor

ndarilek commented Apr 7, 2021 via email

@agrande

agrande commented Apr 7, 2021

I've been working on this for a while, continuing what @amberkowalski started on the gfx side of things, but I'm new to Rust, Vulkan and OpenXR. Here's what I have so far: https://github.com/agrande/gfx/tree/gfx-xr. Not quite there yet, but I'm actively working on it.
I don't feel experienced enough to lead the effort, but since no one seemed to be doing it, and I really want to have VR in bevy, I decided to give it a go. Maybe we should open a channel on the discord to coordinate the efforts and make sure we all have the same level of information, as several people seem interested in the topic...

@SafariMonkey

I am not experienced in OpenXR, but I have some OpenVR experience. I would like to see a channel or other place for discussion if that's possible, as I am also in the position of wanting to contribute, but not having enough experience that I feel able to lead the effort. If we can coordinate, we can hopefully make some progress without too much duplication of effort.

@agrande

agrande commented Apr 8, 2021

Ok, so now we have an XR channel on discord to coordinate the effort 🙌

@johannesvollmer
Contributor Author

Nice. I'm afraid I can't help, as I currently do not have access to a VR headset anymore.

@blaind
Contributor

blaind commented May 14, 2021

I've created a proof-of-concept (but functional) Bevy + OpenXR integration that works on the Oculus Quest 2 and most probably on PC with Monado as well. See https://github.com/blaind/xrbevy.

@palodequeso

I'd love to see this integration. As someone who watched Amethyst closely and is now here, I really love VR dev. I'd be happy to test on SteamVR and contribute as needed.

@IceSentry
Contributor

For the record, there's some work being done in PR #2319 to support XR in bevy.

@alice-i-cecile added the O-XR Specific to virtual and augmented reality platforms label Apr 26, 2022
@dkhovanskii

I am wondering if there is any progress on WebXR?

@mrchantey
Contributor

mrchantey commented Feb 20, 2023

Edit: here's a summary of the bevy xr forks I've come across, with my understanding of the differences in features:

alice-i-cecile pushed a commit that referenced this issue Jun 19, 2023
# Objective

We can currently set `camera.target` to either an `Image` or `Window`.
For OpenXR & WebXR we need to be able to render to a `TextureView`.

This partially addresses #115, as with this addition we can create
internal and external xr crates.

## Solution

A `TextureView` item is added to the `RenderTarget` enum. It holds an id
which is looked up by a `ManualTextureViews` resource, much like how
`Assets<Image>` works.
I believe this approach was first used by @kcking in their [xr
fork](https://github.com/kcking/bevy/blob/eb39afd51bcbab38de6efbeeb0646e01e2ce4766/crates/bevy_render/src/camera/camera.rs#L322).
The only change is that a `u32` is used to index the textures as
`FromReflect` does not support `uuid` and I don't know how to implement
that.
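As a rough usage sketch (my illustration, not part of this PR; the handle type name and module path are assumptions that may differ from the merged code), a camera would target a view registered under a `u32` id like this:

```rust
// Illustrative only; `ManualTextureViewHandle` and its path are assumptions
// based on the description above.
use bevy::prelude::*;
use bevy::render::camera::{ManualTextureViewHandle, RenderTarget};

// Id under which an XR swapchain view would be registered in the
// `ManualTextureViews` resource (a u32-keyed map, as described above).
const XR_VIEW_ID: u32 = 0;

fn spawn_xr_camera(mut commands: Commands) {
    commands.spawn(Camera3dBundle {
        camera: Camera {
            // Render into the manually registered texture view instead of a window.
            target: RenderTarget::TextureView(ManualTextureViewHandle(XR_VIEW_ID)),
            ..default()
        },
        ..default()
    });
}
```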

---

## Changelog

### Added
Render: Added `RenderTarget::TextureView` as a `camera.target` option,
enabling rendering directly to a `TextureView`.

## Migration Guide

References to the `RenderTarget` enum will need to handle the additional
variant, e.g. in `match` statements.
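An illustrative example (not from the PR text):

```rust
use bevy::render::camera::RenderTarget;

// Existing exhaustive matches on RenderTarget need an arm for the new variant.
fn describe_target(target: &RenderTarget) -> &'static str {
    match target {
        RenderTarget::Window(..) => "window surface",
        RenderTarget::Image(..) => "Image asset",
        // New arm required after this change:
        RenderTarget::TextureView(..) => "manually registered texture view",
    }
}
```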

---

## Comments
- The [wgpu
work](gfx-rs/wgpu@c039a74)
done by @expenses allows us to create framebuffer texture views from
`wgpu v0.15, bevy 0.10`.
- I got the WebXR techniques from the [xr
fork](https://github.com/dekuraan/xr-bevy) by @dekuraan.
- I have tested this with a wip [external webxr
crate](https://github.com/mrchantey/forky/blob/018e22bb06b7542419db95f5332c7684931c9c95/crates/bevy_webxr/src/bevy_utils/xr_render.rs#L50)
on an Oculus Quest 2.

![Screenshot 2023-03-11
230651](https://user-images.githubusercontent.com/25616826/224483696-c176c06f-a806-4abe-a494-b2e096ac96b7.png)

---------

Co-authored-by: Carter Anderson <[email protected]>
Co-authored-by: Paul Hansen <[email protected]>
james7132 pushed a commit to james7132/bevy that referenced this issue Jun 19, 2023
@alice-i-cecile
Member

Much more recent, actively developed XR plugin for Bevy: https://github.com/awtterpip/bevy_openxr

Largely built off the work listed above <3

Currently targeting main, but it should be usable with Bevy 0.12 in a couple of weeks.
