Test reading from a large compressed file
Previously, only the uncompressed data was "large", but issue #160 is only
triggered if the *compressed* data is large (in this case, larger than
128 kB), which apparently exceeds some input buffer.

See #160
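
To make the reasoning above concrete, here is a rough sketch (not part of the commit) comparing how well the old repeated text and the new non-repeating random text compress. The 128 kB threshold is the figure quoted in the message, and gzip is used only as a stand-in for whatever compressor the test exercises:

# Rough illustration (assumption, not taken from the repository): compare how
# well the old repeated text and the new non-repeating random text compress.
import gzip
import random
import string

random.seed(0)

# Old fixture text: 1 kB of random letters repeated 2048 times. The repetition
# compresses extremely well, so the compressed data stays far below 128 kB.
old_text = "".join(random.choices(string.ascii_lowercase, k=1024)) * 2048

# New fixture text: 1 MiB of non-repeating random letters plus newlines, which
# stays large (hundreds of kilobytes) even after compression.
chars = string.ascii_lowercase + "\n"
new_text = "".join(random.choices(chars, k=1024 * 1024))

print(len(gzip.compress(old_text.encode())))  # well below 128 * 1024
print(len(gzip.compress(new_text.encode())))  # well above 128 * 1024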
marcelm committed Jun 4, 2024
1 parent 9cd7987 commit a3be946
Showing 1 changed file with 6 additions and 4 deletions.
tests/conftest.py (6 additions, 4 deletions)
@@ -10,10 +10,12 @@
 def create_large_file(tmp_path):
     def _create_large_file(extension):
         path = tmp_path / f"large{extension}"
-        random_text = "".join(random.choices(string.ascii_lowercase, k=1024))
-        # Make the text a lot bigger in order to ensure that it is larger than the
-        # pipe buffer size.
-        random_text *= 2048
+        random.seed(0)
+        chars = string.ascii_lowercase + "\n"
+        # Do not decrease this length. The generated file needs to have
+        # a certain length after compression to trigger some bugs
+        # (in particular, 512 kB is not sufficient).
+        random_text = "".join(random.choices(chars, k=1024 * 1024))
         with xopen(path, "w") as f:
             f.write(random_text)
         return path
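
For completeness, a hypothetical usage sketch of the fixture. The test name, the .gz extension, and the assumption that create_large_file is registered as a pytest fixture are illustrative, not taken from the repository:

# Hypothetical test (illustrative only): read the generated file back through
# xopen and check that the full content round-trips.
from xopen import xopen


def test_read_large_compressed_file(create_large_file):
    path = create_large_file(".gz")  # assumed extension; the fixture writes 1 MiB of text
    with xopen(path, "r") as f:
        data = f.read()
    assert len(data) == 1024 * 1024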