
Lazily allocate context for chunk compression #1249

Open · wants to merge 9 commits into main
Conversation

anacrolix

Changelog

Reduce memory overhead of many concurrent writers.

Docs

The CustomCompressor interface now returns a new compressor on each call, rather than being assumed to return the same one every time.

Description

The compression context created for each Writer can have significant memory usage. It is now allocated only when a chunk is actually compressed. When operating a large number of concurrent writers, this significantly reduces memory overhead.

https://linear.app/foxglove/issue/FG-8734/inbox-listener-ooms-when-file-has-too-many-input-channels

@jtbandes
Member

Thanks for the PR. It looks like there are still some unrelated Go module changes; would you be able to remove those so it can pass CI?

@jtbandes jtbandes requested a review from james-rms October 28, 2024 22:09
@anacrolix
Author

Feel free to build on top of this.

In particular, to minimise any extra changes required, I believe:

- You can retain the original custom compressor interface for now, as the compressor is simply reused.
- The Go module changes are just residual Go workspace artifacts; they can all be reverted.

@anacrolix anacrolix marked this pull request as ready for review November 3, 2024 03:41
@anacrolix
Author

I can no longer see the related Data Platform PR, but it can be modified to not bother with chunking topic-split uploads. This is still a significant performance fix, and is more reliable than the original code. The chunking can be addressed in some other, more complete PR with tests, or not at all.
