Every frame the script calls startEvent and endEvent, which means every frame we store that frame's GPU time in the buffer. But we only read the result about once per second, so most frames' timings are lost and what we actually get back is just the latest frame's time.

If the goal of this project is to report a single frame's GPU time, I think a bigger buffer is needed to store the results. My suggestion: accumulate the GPU time over one second instead. On every startEvent, add the previous frame's time to a running total, and when getTimeEvent is called, return the total and reset it (see the sketch below).
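A minimal C++ sketch of that accumulation idea, under stated assumptions: `GpuTimeAccumulator`, `addFrameTime`, and `takeTotalMs` are hypothetical names, not the project's actual API; startEvent/endEvent/getTimeEvent above are the real entry points being discussed, and the GPU timestamp query itself is left out.

```cpp
#include <cstdint>

// Hypothetical helper: the plugin would feed it once per frame with the
// resolved GPU time of the frame that just finished, instead of keeping
// only the latest value in the buffer.
class GpuTimeAccumulator {
public:
    // Called once per frame (e.g. from endEvent, or wherever the query
    // result is read back) with that frame's GPU time in milliseconds.
    void addFrameTime(double gpuTimeMs) {
        totalMs_ += gpuTimeMs;
        frames_ += 1;
    }

    // Called by the consumer roughly once per second (getTimeEvent):
    // returns the GPU time accumulated since the last call and resets it.
    double takeTotalMs() {
        double t = totalMs_;
        totalMs_ = 0.0;
        frames_ = 0;
        return t;
    }

    // Number of frames accumulated so far; dividing the total by this
    // would give an average per-frame GPU time instead of a sum.
    uint32_t frameCount() const { return frames_; }

private:
    double totalMs_ = 0.0;
    uint32_t frames_ = 0;
};
```

With this approach the once-per-second read returns the sum (or average) over every frame in that interval, so no frame's timing is silently dropped between reads.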