"Argument list too long" for too many log files #163
Comments
Eeh, yeah that would probably solve it, but why would you want to parse 62000 log files?
We should support wildcards for this use case. I don't think we currently do.
This is still an issue. To answer "why would you want to parse 62000 log files?": a single active S3 bucket serving static content can create 10,000 log files per day. I have examples of over 15,000 per day. I don't know why AWS does this; it's just the default behavior. I'll have to concatenate the logs, or loop over them, to work around it.
Hey Z-matth, thank you for your update. This project was abandoned by @wvanbergen and me about 2 years ago. Pull requests are always welcome, but the chances of either of us fixing it on our end are very small.
I have more than 62000 log files in a directory, and get an "Argument list too long" error when trying to analyze these with your tool.
This seems to be a shell restriction, but it could be solved by not invoking shell commands with the full file list directly, and instead looping over the files in smaller batches.
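The batching idea suggested above could look roughly like this. It is a sketch, not the project's actual code: it splits a list of file paths into chunks whose combined length stays under a byte budget, so each chunk would fit on one command line. The `max_bytes` value is an illustrative assumption; the real limit (`ARG_MAX`) varies by system.

```python
def chunk_args(paths, max_bytes=128 * 1024):
    """Split file paths into batches whose combined length
    (including separating spaces) stays under max_bytes,
    so each batch can safely be passed on one command line.

    max_bytes is an assumed budget; the real per-system limit
    is the kernel's ARG_MAX.
    """
    batches, current, size = [], [], 0
    for p in paths:
        extra = len(p) + 1  # +1 for the separating space
        if current and size + extra > max_bytes:
            batches.append(current)
            current, size = [], 0
        current.append(p)
        size += extra
    if current:
        batches.append(current)
    return batches
```

Each returned batch could then be processed in a separate invocation of the analyzer, which is essentially what `xargs` does for shell pipelines.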