read_delim_chunked takes much more memory than expected? #1410
Comments
After a bit of searching through the issues on this repo, I noticed that at least one other person seems to be encountering this issue as well: #1120 (comment).
Additional note: this seems to be a more pervasive issue than I had realized. I tried loading a sequence of files via …
I repeated this experiment with …
As far as I can tell, the current version of …
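The code snippets that accompanied this comment appear to have been lost in the thread. A minimal sketch of that kind of experiment, assuming readr, placeholder file names, and base R's gc() as the memory probe, might look like:

```r
library(readr)

# Hypothetical reconstruction: read a sequence of files chunk-by-chunk,
# discarding every chunk, and record apparent heap use after each file.
# File names and chunk size are placeholders.
files <- sprintf("part-%02d.tsv", 1:10)
usage <- vapply(files, function(f) {
  read_delim_chunked(
    f, delim = "\t", chunk_size = 1e5,
    callback = SideEffectChunkCallback$new(function(chunk, pos) NULL)
  )
  sum(gc()[, 2])  # total Mb in use after this file
}, numeric(1))
usage  # if chunks were released promptly, this should stay roughly flat
```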
I am having the same issue. The memory use increases almost monotonically even though the individual chunks are small.
Any updates or workarounds? Can I use edition 1 (via …)?
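For context (not a confirmed workaround): readr 2.x can fall back to the first-edition parser with with_edition(). A sketch of what that would look like for a chunked read, with a placeholder file name:

```r
library(readr)

# Wrap the call in with_edition(1, ...) to run it with the edition 1
# parser. Whether that sidesteps the memory growth is exactly the
# open question in this thread.
with_edition(1,
  read_delim_chunked(
    "big_file.tsv", delim = "\t", chunk_size = 1e5,
    callback = SideEffectChunkCallback$new(function(chunk, pos) NULL)
  )
)
```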
Having the same problem here.
To investigate this issue we'll need a reprex, and some indication of how you're measuring R's memory consumption.
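One way to provide that (a sketch, not necessarily how the reporters measured) is a self-contained reprex using readr's bundled example file and gc() snapshots:

```r
library(readr)

# Self-contained reprex sketch: chunk through a small bundled file and
# compare heap use before and after, following a full garbage collection.
gc(reset = TRUE)
before <- sum(gc()[, 2])  # Mb in use before the read

read_csv_chunked(
  readr_example("mtcars.csv"),
  chunk_size = 10,
  callback = SideEffectChunkCallback$new(function(chunk, pos) NULL)
)

after <- sum(gc()[, 2])
after - before  # Mb retained after the chunked read
```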
I am using the read_delim_chunked function to process large text files chunk-by-chunk. My expectation is that memory is cleared after each chunk is read. However, this does not seem to be the case: the amount of memory required to read the text file in chunks is the same as the amount required to read it without chunking. I assume that this is a bug, but maybe my understanding of read_delim_chunked is incorrect. The purpose of reading by chunk is to conserve memory, right? Thanks!