
a question #3

xvde110 opened this issue Jul 11, 2018 · 1 comment

xvde110 commented Jul 11, 2018

Every frame the script calls startevent and endevent, which means every frame we store the GPU time in the buffer, but we only read the time once per second, so we may lose many frames' timings and only get the latest frame.

If this project only wants the GPU time of a single frame, I think you should use a bigger buffer to store the samples. I suggest reporting one second's worth of GPU time: on every startevent, add the previous frame's time, and when gettimeEvent is called, return the total and reset it.
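A minimal C# sketch of the accumulate-and-reset scheme being proposed; the class and method names are hypothetical, the units are assumed to be seconds, and `RenderTiming.instance.deltaTime` (this project's per-frame value, mentioned below) stands in for the per-frame GPU sample:

```csharp
using UnityEngine;

// Hypothetical sketch: accumulate every frame's GPU time and hand back
// the running total when it is read, so no frame's sample is dropped.
public class GpuTimeAccumulator : MonoBehaviour
{
    float accumulatedGpuTime;  // assumed seconds of GPU time since last read

    void Update()
    {
        // Assumed per-frame GPU time sample from this plugin.
        accumulatedGpuTime += RenderTiming.instance.deltaTime;
    }

    // Return the total accumulated since the last call, then reset it.
    public float TakeAccumulatedGpuTime()
    {
        float total = accumulatedGpuTime;
        accumulatedGpuTime = 0f;
        return total;
    }
}
```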

belm0 (Contributor) commented Jul 11, 2018

RenderTiming.instance.deltaTime is updated every frame. It's your application's decision how to use it.

Logging once per second is meant as an example and can be disabled.
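For instance, a minimal sketch of the application reading the value every frame and averaging it over a one-second window; the window length, the averaging, and the logging are illustrative application choices, not part of the plugin:

```csharp
using UnityEngine;

// Sketch: read RenderTiming.instance.deltaTime every frame and report a
// one-second average, instead of relying on the plugin's example logging.
public class GpuTimeReporter : MonoBehaviour
{
    float windowGpuTime;   // summed GPU time in the current window (assumed seconds)
    int windowFrames;      // frames seen in the current window
    float windowStart;     // wall-clock start of the current window

    void Start()
    {
        windowStart = Time.unscaledTime;
    }

    void Update()
    {
        windowGpuTime += RenderTiming.instance.deltaTime;
        windowFrames++;

        if (Time.unscaledTime - windowStart >= 1f)
        {
            float avgMs = windowGpuTime / windowFrames * 1000f;
            Debug.Log($"avg GPU time over last second: {avgMs:F2} ms");
            windowGpuTime = 0f;
            windowFrames = 0;
            windowStart = Time.unscaledTime;
        }
    }
}
```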

usernamealreadyis added a commit to usernamealreadyis/render-timing-for-unity that referenced this issue Sep 1, 2018