
Rendering to <canvas> element? #3

Open
floe opened this issue Jan 16, 2023 · 4 comments

floe commented Jan 16, 2023

Just tested your component with my A-Frame 1.4.1 scene and it works like a charm, kudos!

However, for my somewhat esoteric use case, I'd like to render the output of the second camera to a canvas element (just like in https://jgbarah.github.io/aframe-playground/camrender-01/). Unfortunately, that doesn't seem to do anything. I verified that aframe-multi-camera itself works, using an extra plane, but I need something I can create a MediaStream object from, and that has to be a canvas.

My setup:

<script src="https://cdn.jsdelivr.net/gh/diarmidmackenzie/aframe-multi-camera@latest/src/multi-camera.min.js"></script>

...

<a-scene cursor="rayOrigin: mouse">
      <a-assets>
        ...
        <canvas id="canvas3"></canvas>
      </a-assets>

...

      <a-entity id="second-cam" secondary-camera="output:screen; outputElement:#canvas3; sequence:before" position="0 1.6 -1" rotation="0 180 0"></a-entity>

When I change the second camera options to "output:plane; outputElement:#testplane; sequence:before", I get the expected result rendered to the plane, but with the code above, the canvas stays unchanged. Any ideas about how to fix this?

Thanks!

diarmidmackenzie (Owner) commented Jan 16, 2023

Hi, rendering to a separate canvas would require a second WebGL context (i.e. a second THREE.WebGLRenderer).

One of the things I deliberately tried to do with this set of components was to avoid the need for multiple WebGL contexts, as described here:
https://diarmidmackenzie.github.io/aframe-multi-camera/#single-vs-multiple-webgl-contexts

In your case, it sounds as though you actively want an additional canvas, and hence you'll need an additional WebGL context, since each THREE.WebGLRenderer targets exactly one canvas element.

In that case, I think you would be better off using the code from jgbarah's camrender.js, rather than trying to adapt these components?

Is there a reason that doesn't work for you?
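For reference, the core of the second-context approach is a dedicated THREE.WebGLRenderer bound to the extra canvas, driven on each tick. This is a minimal sketch along the lines of jgbarah's camrender.js, not code from aframe-multi-camera; the component name, canvas id, and sizing are illustrative assumptions.

```javascript
// Sketch only: assumes a <canvas id="canvas3"> somewhere in the page
// (outside <a-assets>) and an entity that has a THREE camera attached.
const canvas3 = document.querySelector('#canvas3');

// Second WebGL context: one renderer per canvas.
const secondRenderer = new THREE.WebGLRenderer({ canvas: canvas3 });
secondRenderer.setSize(canvas3.clientWidth, canvas3.clientHeight);

// Illustrative component name; render the scene from this entity's
// camera into the extra canvas every frame.
AFRAME.registerComponent('canvas-render', {
  tick: function () {
    const camera = this.el.getObject3D('camera');
    if (camera) {
      secondRenderer.render(this.el.sceneEl.object3D, camera);
    }
  }
});
```

Because this renderer owns `#canvas3` outright, anything it draws is really in that canvas's backing store, which is what `captureStream()` reads from.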

floe (Author) commented Jan 16, 2023

I've fiddled around a bit more and found that the canvas element actually gets rendered to (if I move it out of a-assets and make it visible as a standalone element, it shows the second camera view). But the MediaStream I get from canvas.captureStream() still shows a blank element. So I'll try the approach from camrender.js next, thanks for your quick response!

floe (Author) commented Jan 16, 2023

Update: yes, it works with camrender.js, with the caveat that the canvas needs to be initialized before captureStream() works (either through THREE.WebGLRenderer, or through canvas.getContext("webgl")).
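To illustrate that caveat, here is a hedged sketch of the ordering that worked for me: the canvas id and frame rate are illustrative, and the key point is simply that the canvas must have a rendering context before `captureStream()` is called.

```javascript
// Sketch only: captureStream() delivers frames from the canvas's
// backing store, which doesn't exist until a context is created.
const canvas = document.getElementById('canvas3');

// Initialize the context first, either directly...
canvas.getContext('webgl');
// ...or implicitly, via: new THREE.WebGLRenderer({ canvas });

// Only now will captureStream() produce non-blank frames.
const stream = canvas.captureStream(30); // request up to 30 fps
// stream can then feed e.g. a WebRTC peer connection or MediaRecorder.
```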

diarmidmackenzie (Owner) commented:

> (if I move it out of a-assets and make it visible as a standalone element, it shows the second camera view). But the MediaStream I get from canvas.captureStream() still shows a blank element

When you do this, I think it is not actually rendering to the 2nd canvas.

Rather, it is rendering to a section of the original canvas that is defined by the boundary of the 2nd canvas.
