
Support setting targetLatency #6473

Open · wants to merge 3 commits into master

Conversation


@vk342 vk342 commented Jun 2, 2024

This PR will...

Allow setting hls.targetLatency.

Setting targetLatency will:

  • set stallCount to 0
  • set liveSyncDuration to the new value

https://github.com/vk342/hls.js/blob/f8c8c42c2968e5d79661b5f0b0c904b76a9de375/src/controller/latency-controller.ts#L75-L79
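For illustration, here is a minimal sketch of that setter behaviour (the permalink above has the actual change; the class below is a simplified stand-in, not the real LatencyController):

// Simplified sketch, not the actual diff: mirrors the two effects
// listed above (reset stallCount, update liveSyncDuration).
class LatencyControllerSketch {
  private stallCount = 0;
  private config: { liveSyncDuration?: number } = {};

  set targetLatency(latency: number) {
    this.stallCount = 0; // drop the accumulated stall penalty
    this.config.liveSyncDuration = latency; // latency math now uses the new value
  }
}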

Why is this Pull Request needed?

Currently, targetLatency is calculated as follows (see the sketch after this list):

  • when liveSyncDuration is specified in the config,
    targetLatency is calculated as liveSyncDuration plus liveSyncOnStallIncrease multiplied by the number of stalls.
  • otherwise, targetLatency is calculated as liveSyncDurationCount multiplied by EXT-X-TARGETDURATION,
    plus liveSyncOnStallIncrease multiplied by the number of stalls.
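
A sketch of that calculation, assuming the standard hls.js config option names (the real computation lives in LatencyController):

// Illustrative only: derives targetLatency from the config, the
// playlist's EXT-X-TARGETDURATION, and the number of stalls so far.
function computeTargetLatency(
  config: {
    liveSyncDuration?: number;
    liveSyncDurationCount: number;
    liveSyncOnStallIncrease: number;
  },
  targetduration: number, // EXT-X-TARGETDURATION, in seconds
  stallCount: number,
): number {
  const syncDuration =
    config.liveSyncDuration ?? config.liveSyncDurationCount * targetduration;
  return syncDuration + stallCount * config.liveSyncOnStallIncrease;
}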

As discussed in #6350, incrementing targetLatency each time the player encounters a stall may become problematic in low-latency scenarios, especially when the live stream needs to play at a consistently low latency for an extended period of time.

Allowing targetLatency to be set gives developers who use hls.js a simple way to implement custom logic that adjusts targetLatency, making it possible to control latency without performing a hard reset on the player.
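
For example, with this PR an embedder could do something along these lines (the thresholds and timings below are illustrative, not part of hls.js):

import Hls from 'hls.js';

const hls = new Hls({ liveSyncDuration: 1.5 });

// Back off after a stall: raise the target, capped at 6 seconds.
hls.on(Hls.Events.ERROR, (_event, data) => {
  if (data.details === Hls.ErrorDetails.BUFFER_STALLED_ERROR) {
    const current = hls.targetLatency;
    if (current !== null) {
      hls.targetLatency = Math.min(current + 0.5, 6);
    }
  }
});

// Creep back toward the low-latency target during smooth playback.
setInterval(() => {
  const current = hls.targetLatency;
  if (current !== null && current > 1.5) {
    hls.targetLatency = Math.max(current - 0.25, 1.5);
  }
}, 30_000);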

Are there any points in the code the reviewer needs to double check?

The code does not update the liveSyncDuration value of the config that was supplied on player start-up (a.k.a. userConfig), which is used, for example, here:

const userConfig = this.hls.userConfig;

Is this expected behaviour?

Resolves issues:

Checklist

  • changes have been done against master branch, and PR does not conflict
  • new unit / functional tests have been added (whenever applicable)
  • API or design changes are documented in API.md

@iamboorrito
Collaborator

Perhaps we should consider decaying the stall penalty or stall count over time if no stalls are encountered? For example, if we do not encounter a stall within 3 target durations, reduce the stall count by 1 until it hits 0 again. This would make the safety margin dynamic and automatic for people who just want the player to work without interacting with its APIs.
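
A rough sketch of what that could look like (the names and the 3-target-duration threshold are just for illustration):

// Decays the stall penalty after a quiet period with no new stalls.
class StallCountDecay {
  private stallCount = 0;
  private lastStallTime = 0;

  onStall(now: number): void {
    this.stallCount++;
    this.lastStallTime = now;
  }

  // Call periodically, e.g. once per target duration.
  tick(now: number, targetduration: number): void {
    const quietPeriod = 3 * targetduration;
    if (this.stallCount > 0 && now - this.lastStallTime >= quietPeriod) {
      this.stallCount--; // shrink the safety margin one step
      this.lastStallTime = now; // require another full quiet period
    }
  }
}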

What do you think?

@vk342
Author

vk342 commented Jun 3, 2024

I agree, the idea of decaying the stall count makes sense.

If we want to go down this path, we need to make it configurable, i.e. how aggressively we reduce the stall count. We would also need a strategy for handling stalls that happen after the stall count (and latency) go down. Do we increase the period we wait before reducing the stall count again?

This could lead to a complex solution and potentially increase the overall number of stalls if the initial targetLatency is too low for a given stream (and network), thus defeating the purpose of the stall count.

This is why I opted for a quick and basic solution where these decisions are made outside of the player, so the heuristics for arriving at the optimal latency for a given end user are handed over to the developer who embeds hls.js...

Perhaps we should have a separate discussion on what building this logic into the player itself should look like...

@robwalch
Collaborator

robwalch commented Jun 3, 2024

The code does not update the liveSyncDuration value of the config that was supplied on player start-up (a.k.a. userConfig), which is used, for example, here: hls.userConfig. Is this expected behaviour?

That is expected. userConfig is preserved so that it can be compared to the composed player config (a new object extended with the default and user config options).
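
Roughly, the relationship looks like this (using a plain object spread as a stand-in for hls.js's actual config merge):

const defaultConfig = { liveSyncDurationCount: 3, liveSyncOnStallIncrease: 1 };
const userConfig = { liveSyncDuration: 2 };               // kept untouched
const playerConfig = { ...defaultConfig, ...userConfig }; // composed, mutable

// Setting targetLatency would update playerConfig.liveSyncDuration;
// userConfig still shows what the embedder originally passed in.
playerConfig.liveSyncDuration = 4;
console.log(userConfig.liveSyncDuration);   // 2
console.log(playerConfig.liveSyncDuration); // 4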
