Support for hitting the S3 SlowDown message #90

Open
adamchesterton opened this issue Jun 26, 2017 · 2 comments

@adamchesterton

Hi @cobbzilla - first, this is a really awesome tool. I have a bit of a unique use case: the bucket I am trying to copy from one account to another has around 80 million files, and after tweaking the max connections, number of threads, and retry parameters, I managed to hit the S3 request-rate threshold; its protection kicked in and returned this error:

```
com.amazonaws.services.s3.model.AmazonS3Exception: Please reduce your request rate. (Service: Amazon S3; Status Code: 503; Error Code: SlowDown;
```

At this point it was reading 270k items a minute, which is truly impressive, and it still took 50 minutes to hit the limit :) Any chance of adding support to hold back when the API starts throttling? As soon as I stopped making requests by canceling the script, I could restart almost straight away, so a pause for a minute or two should do the trick.
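To illustrate, something like this minimal sketch is all I mean (hypothetical names, not this project's actual code; it assumes the copy job can funnel each S3 call through one wrapper):

```java
import java.util.concurrent.Callable;

import com.amazonaws.services.s3.model.AmazonS3Exception;

// Hypothetical helper: retry an S3 call, pausing when S3 answers 503 SlowDown.
public final class SlowDownRetry {

    private static final long PAUSE_MILLIS = 60_000; // "a minute or two"
    private static final int MAX_ATTEMPTS = 5;

    public static <T> T withRetry(Callable<T> s3Call) throws Exception {
        for (int attempt = 1; ; attempt++) {
            try {
                return s3Call.call();
            } catch (AmazonS3Exception e) {
                // Rethrow anything that isn't throttling, or once retries are spent.
                if (!"SlowDown".equals(e.getErrorCode()) || attempt >= MAX_ATTEMPTS) throw e;
                Thread.sleep(PAUSE_MILLIS); // hold back, then retry the same request
            }
        }
    }
}
```

Thanks!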

@adambarthelson

@adamchesterton what settings did you use? Curious

@cobbzilla
Owner

@adamchesterton I love the idea of an adaptive backoff to try to maximize the request rate while staying under rate limits. I don't have time to write this myself, but I would welcome a pull request.
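To make the idea concrete for anyone who picks this up, here's a rough sketch (all names hypothetical, not code from this repo) of an AIMD-style throttle: creep back toward full speed while requests succeed, back off hard when S3 returns SlowDown.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical shared throttle: every copy thread calls acquire() before an
// S3 request, onSuccess() after it works, and onSlowDown() on a 503 SlowDown.
// The shared delay shrinks slowly on success and doubles on throttling, so
// the aggregate request rate hunts for the limit without camping on it.
public final class AdaptiveThrottle {

    private static final long MAX_DELAY_MILLIS = 120_000;

    private final AtomicLong delayMillis = new AtomicLong(0);

    public void acquire() throws InterruptedException {
        final long delay = delayMillis.get();
        if (delay > 0) Thread.sleep(delay);
    }

    public void onSuccess() {
        // Additive decrease of the delay: speed up gradually.
        delayMillis.updateAndGet(d -> Math.max(0, d - 1));
    }

    public void onSlowDown() {
        // Multiplicative increase of the delay: back off sharply when throttled.
        delayMillis.updateAndGet(d -> Math.min(MAX_DELAY_MILLIS, Math.max(100, d * 2)));
    }
}
```

Tuning the step sizes (and adding jitter so threads don't retry in lockstep) would be the interesting part of a real patch.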
