Hey, I was traveling and hence couldn't write to you earlier... Have you resolved the issue yet?
On most modern machines with more than 2 GB of RAM, you should have no issues running the script on a 209 MB JSON file. I'm just curious to know...
What are the details of your system? (RAM, OS, etc.)
How are you running the script? (from the command line, through an IDE, etc.)
The script has room for memory/performance improvement. Here are a couple of approaches you can take if you want to process really large files:
1. Split the large JSON file into multiple smaller files before processing. This is the easiest option.
2. Modify the header-determination logic and use a buffer object to incrementally process the JSON file. This is slightly more difficult than 1, since you have to handle JSON objects that do not all have the same keys. (See the sketch after this list.)
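For approach 2, one way to prototype the incremental pass, instead of a hand-rolled buffer, is the third-party `ijson` package (`pip install ijson`). This is only a minimal sketch, not the actual code in json_to_csv.py: it assumes the top-level JSON value is an array of flat objects, and the file names are placeholders. The first pass streams the file once to build the union of all keys (so objects with differing keys still share one header), and the second pass streams again to write the rows.

```python
import csv
import ijson  # third-party incremental JSON parser

INPUT_PATH = "large.json"   # placeholder paths, not the script's real arguments
OUTPUT_PATH = "large.csv"

# Pass 1: collect the union of all keys without loading the whole file.
header, seen = [], set()
with open(INPUT_PATH, "rb") as f:
    for item in ijson.items(f, "item"):  # yields each element of the top-level array
        for key in item:
            if key not in seen:
                seen.add(key)
                header.append(key)

# Pass 2: stream again and write one CSV row per object;
# DictWriter fills in keys missing from a given object with empty strings.
with open(INPUT_PATH, "rb") as f, open(OUTPUT_PATH, "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=header, restval="")
    writer.writeheader()
    for item in ijson.items(f, "item"):
        writer.writerow(item)
```

This keeps peak memory at roughly one JSON object at a time, at the cost of reading the file twice.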
I'm going to leave this issue open and try to do some optimization when I get some time. Let me know what you do in the meantime...
I'm getting this when trying to process a 209 MB JSON file:
Traceback (most recent call last):
  File "json_to_csv.py", line 85, in <module>
    header += reduced_item.keys()
MemoryError
What can I do?