Purpose of buffered video source? Guaranteed video-audio delay, may be very big. #11142
Replies: 3 comments 1 reply
-
In the past I've had to turn off buffering with V4L2 inputs, too, for this same reason. I never investigated why, because I hadn't looked under the hood of OBS. I definitely think this should be looked at or explained! The documentation on async video sources is rather sparse - there are no docs for …
-
I added a debug string to show a situation where the V4L2 input grabs a few frames between calls to …. I did this by switching scenes, which caused a small stutter (see the line of debug output …). What I'm seeing is that the async buffer never recovers to the 1-2 frames it would normally hold; it eventually balloons out to 20-21 frames.
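For reference, a debug line of that kind might look something like the following sketch - assuming it is placed inside libobs/obs-source.c where the internal struct obs_source fields (async_frames, last_frame_ts) are visible; it is not the actual debug string used above:

```c
/* Hypothetical debug output inside the async-frame handling in
 * libobs/obs-source.c (e.g. ready_async_frame()); illustration only,
 * not the patch actually used above. */
blog(LOG_DEBUG,
     "async source '%s': %lu frame(s) queued, last_frame_ts=%llu",
     obs_source_get_name(source),
     (unsigned long)source->async_frames.num,
     (unsigned long long)source->last_frame_ts);
```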
-
This is definitely bugged 🐛

Firstly, if any timestamp sent with a frame is zero, then the test … Secondly, there is something incorrect in the logic within ….

Reproducible example:
The optional test-input plugin has a Sync Test (Async Video/Audio) source. This source is supposed to show a black box for 1 second, followed by a white box for 1 second. When the box is white, a 'tone' is played. This source, however, sends a timestamp of zero, so we need to address that first. I made the following changes to …:

@@ -80,8 +80,8 @@ static void *video_thread(void *data)
 	while (os_event_try(ast->stop_signal) == EAGAIN) {
 		fill_texture(pixels, whitelist ? 0xFFFFFFFF : 0xFF000000);
-		frame.timestamp = cur_time - start_time;
-		audio.timestamp = cur_time - start_time;
+		frame.timestamp = cur_time; //- start_time;
+		audio.timestamp = cur_time; //- start_time;

I then put the (workaround) source into a scene and found that the tone was being played while the box is black - the opposite of what should be happening. I'll work on a fix for the logic, but given how 'deep' this is within the way OBS (async) sources operate, it will need some good eyes to check over it. I have a sneaking suspicion that async (video) sources are one frame 'behind' where they should be at all times - but most people are playing videos at frame rates where this doesn't matter.
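To make the timestamp point concrete, here is a minimal sketch (not the test-input plugin code; it assumes an ordinary OBS plugin build and a BGRA pixel buffer) of an async source submitting a frame with a non-zero, monotonic timestamp via obs_source_output_video():

```c
/* Minimal illustration of pushing one async video frame with a
 * non-zero, monotonic timestamp.  The helper name and the BGRA buffer
 * are assumptions for the example. */
#include <obs-module.h>
#include <util/platform.h>

static void push_one_frame(obs_source_t *src, uint8_t *pixels,
			   uint32_t width, uint32_t height)
{
	struct obs_source_frame frame = {0};

	frame.data[0] = pixels;
	frame.linesize[0] = width * 4;
	frame.width = width;
	frame.height = height;
	frame.format = VIDEO_FORMAT_BGRA;

	/* os_gettime_ns() is effectively never zero, so the zero-timestamp
	 * special case discussed above is never triggered. */
	frame.timestamp = os_gettime_ns();

	obs_source_output_video(src, &frame);
}
```

Relative timestamps that start at zero (which is what the Sync Test source sends) are exactly what trips the special case described above.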
-
The buffered video source keeps a queue of frames, and the source then picks the best next frame based on the frames' timestamps. But that makes the offset between input timestamps and output timestamps grow whenever the video source temporarily stops having frames ready (e.g. the CPU is too busy, or video acquisition lags a bit).
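As a rough illustration of that selection step (a generic sketch, not the actual OBS code), the idea is to pick the newest queued frame whose timestamp has already been passed by the playback position:

```c
/* Generic sketch of timestamp-based frame selection from a queue;
 * not the libobs implementation. */
#include <stddef.h>
#include <stdint.h>

struct frame {
	uint64_t timestamp; /* capture time in nanoseconds */
	/* ... pixel data ... */
};

/* Returns the index of the frame to display, or -1 if no queued frame
 * is due yet.  The queue is assumed ordered by timestamp; everything
 * before the chosen frame is considered consumed. */
static long pick_frame(const struct frame *queue, size_t count,
		       uint64_t playback_ts)
{
	long best = -1;

	for (size_t i = 0; i < count; i++) {
		if (queue[i].timestamp <= playback_ts)
			best = (long)i; /* newest frame already due */
		else
			break;
	}

	return best;
}
```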
I have a situation where the camera returns between 30 and 60fps, and my output is at 30fps.
Now imagine that there is a lag in receiving frames (say the CPU is too busy), so there is a stretch with no new frames. Nevertheless, last_frame_ts still advances by sys_offset (1/30 of a second at 30 fps). Here it falls into this case:
obs-studio/libobs/obs-source.c, line 4299 (commit 1451554)
and finishes the function with no frame:
obs-studio/libobs/obs-source.c, line 4346 (commit 1451554)
Even though we end up with no frame, last_frame_ts has advanced a lot. With frames arriving normally I get a ~20 ms frame_offset, but when there are no frames because of a temporary CPU lag, last_frame_ts still advances by 33 ms per tick!
So when the CPU becomes free again, the video source's last_frame_ts may be as much as one second into the future, making the video images run 1 s behind the audio (up to 2 s, or as much as the queue length allows), and it will never recover back to a small delay.
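As a back-of-the-envelope check of those numbers (standalone arithmetic only, encoding just the behaviour described above, i.e. last_frame_ts gaining sys_offset on every tick that delivers no frame):

```c
/* How far last_frame_ts runs ahead during a stall with no new frames,
 * per the description above.  Standalone arithmetic; not OBS code. */
#include <stdio.h>

int main(void)
{
	const double sys_offset_ms = 1000.0 / 30.0; /* ~33 ms per 30 fps output tick */
	const int stalled_ticks = 30;               /* one second without new frames */

	double drift_ms = sys_offset_ms * stalled_ticks;

	printf("drift after %d stalled ticks: %.0f ms\n", stalled_ticks, drift_ms);
	return 0;
}
```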
It's fair to say that, with a buffered video source, video will be delayed by a random amount of time that increases every time some frame lags a bit behind, and it is never resynchronized.
This looks clearly wrong, and I don't see when anyone would want this behaviour. So is the ready_async_frame algorithm wrong, or do people really just have to use their video sources unbuffered when recording live video?
One hard limit I thought of was something like: …
That limits the delay to the acquisition latency at least, and that's too little to make any use of the buffer. Maybe "= sys_time + 50ms"?
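For illustration only, a hypothetical sketch of that kind of hard limit (the MAX_DRIFT_NS constant, the helper name and its placement are all assumptions, not actual OBS code): keep last_frame_ts within a fixed margin of the current system time, in either direction:

```c
/* Hypothetical clamp of last_frame_ts against the system clock;
 * names and margin are made up for illustration. */
#include <stdint.h>

#define MAX_DRIFT_NS (50ULL * 1000000ULL) /* e.g. 50 ms */

static inline uint64_t limit_last_frame_ts(uint64_t last_frame_ts,
					   uint64_t sys_time)
{
	if (last_frame_ts + MAX_DRIFT_NS < sys_time)
		return sys_time - MAX_DRIFT_NS;
	if (last_frame_ts > sys_time + MAX_DRIFT_NS)
		return sys_time + MAX_DRIFT_NS;
	return last_frame_ts;
}
```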
I don't know what would be desirable to improve this: try to keep frame->timestamp within some bounds? Use a smaller queue than 30 frames (which is 1 s of delay at 30 fps input)?