
Amazon.S3.AmazonS3Exception: Please reduce your request rate.



  • An error occurred processing a GET request to <snip>: Please reduce your request rate.
    
    Amazon.S3.AmazonS3Exception: Please reduce your request rate.
     ---> Amazon.Runtime.Internal.HttpErrorResponseException: Exception of type 'Amazon.Runtime.Internal.HttpErrorResponseException' was thrown.
       at Amazon.Runtime.HttpWebRequestMessage.ProcessHttpResponseMessage(HttpResponseMessage responseMessage)
       at Amazon.Runtime.HttpWebRequestMessage.GetResponseAsync(CancellationToken cancellationToken)
       at Amazon.Runtime.Internal.HttpHandler`1.InvokeAsync[T](IExecutionContext executionContext)
       at Amazon.Runtime.Internal.RedirectHandler.InvokeAsync[T](IExecutionContext executionContext)
       at Amazon.Runtime.Internal.Unmarshaller.InvokeAsync[T](IExecutionContext executionContext)
       at Amazon.S3.Internal.AmazonS3ResponseHandler.InvokeAsync[T](IExecutionContext executionContext)
       at Amazon.Runtime.Internal.ErrorHandler.InvokeAsync[T](IExecutionContext executionContext)
    

    Are there any options to enforce retry/backoff in ProGet itself? I'm working to reduce the caller frequency with some backoffs on our side, but it's a complicated web, isn't it?

    Alternatively, has anyone found a good proxy solution to cache packages and ensure the S3 storage is fast and globally available (DR concerns)? We run ProGet directly in k8s against S3, with no dedicated servers in the mix. Our storage is currently around 10 TB of data spread across dozens of feeds, and some of them are hit frequently, unfortunately, given the developer and tooling mix we have in house.
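As a stopgap while reducing caller frequency, client-side retries with exponential backoff and jitter are what AWS generally recommends for throttling errors like this. A minimal, library-agnostic sketch in Python (the function name and parameters are illustrative, not a ProGet or AWS SDK API):

```python
import random
import time

def call_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Retry fn() with exponential backoff and full jitter.

    For simplicity this retries on any exception; in practice you would
    retry only on throttling responses (e.g. S3's 503 "SlowDown").
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            # Exponential delay capped at max_delay, with full jitter
            # so retrying clients don't all hammer S3 in lockstep.
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(random.uniform(0, delay))
```

The "full jitter" variant (sleeping a random fraction of the exponential delay) is the one AWS's own backoff guidance favors, because it spreads retries out instead of synchronizing them.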


  • inedo-engineer

    Hi @cole-brand_2889 ,

    Wow, I didn't realize that S3 rate-limited like that!! That's good to know.

    Unless this is something that can be configured as an advanced SDK switch in the S3FileSystem, I don't think there's much that could or should be done in the ProGet or extension code.

    After searching the error (and seeing this very post on the first page of Google 😂), this just seems to be endemic to S3; there's no published rate limit, and even in AWS's official blog articles the only solution seems to be "follow the error message and reduce your request rate".

    There's probably something you can do on the load-balancer side of things... reducing concurrent requests, etc.

    Let us know what you find!

    Thanks,
    Alana

