Amazon.S3.AmazonS3Exception: Please reduce your request rate.
-
An error occurred processing a GET request to <snip>: Please reduce your request rate.

Amazon.S3.AmazonS3Exception: Please reduce your request rate. ---> Amazon.Runtime.Internal.HttpErrorResponseException: Exception of type 'Amazon.Runtime.Internal.HttpErrorResponseException' was thrown.
   at Amazon.Runtime.HttpWebRequestMessage.ProcessHttpResponseMessage(HttpResponseMessage responseMessage)
   at Amazon.Runtime.HttpWebRequestMessage.GetResponseAsync(CancellationToken cancellationToken)
   at Amazon.Runtime.Internal.HttpHandler`1.InvokeAsync[T](IExecutionContext executionContext)
   at Amazon.Runtime.Internal.RedirectHandler.InvokeAsync[T](IExecutionContext executionContext)
   at Amazon.Runtime.Internal.Unmarshaller.InvokeAsync[T](IExecutionContext executionContext)
   at Amazon.S3.Internal.AmazonS3ResponseHandler.InvokeAsync[T](IExecutionContext executionContext)
   at Amazon.Runtime.Internal.ErrorHandler.InvokeAsync[T](IExecutionContext executionContext)

Are there any options to enforce retry with backoff in ProGet itself? I'm working to reduce the caller frequency with some backoff on our side, but it's a complicated web, isn't it?
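For context, here is the kind of caller-side mitigation I mean: exponential backoff with full jitter on S3's "Please reduce your request rate" (503 SlowDown) throttling responses. This is just a sketch of the pattern, not a ProGet feature; the function names and thresholds are my own illustration.

```python
import random
import time

def backoff_delay(attempt, base=0.5, cap=20.0):
    """Full-jitter exponential backoff: a random delay in
    [0, min(cap, base * 2^attempt)] seconds."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))

def with_backoff(fn, max_attempts=6,
                 is_throttle=lambda e: "reduce your request rate" in str(e)):
    """Call fn(); on a throttling error, sleep a jittered delay and retry.
    Re-raises immediately on non-throttle errors or when attempts run out."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as e:
            if attempt == max_attempts - 1 or not is_throttle(e):
                raise
            time.sleep(backoff_delay(attempt))
```

The jitter matters: if many callers retry on the same fixed schedule, they re-synchronize and hit the bucket in waves, which keeps the throttling going.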
Alternatively, has anyone found a good proxy solution to cache packages and keep the S3-backed storage fast and global (we also have DR concerns)? We run ProGet directly in k8s against S3, with no dedicated servers in the mix. Our storage is currently around 10 TB of data, spread across dozens of feeds; some of them are hit frequently, unfortunately, given the mix of developers and tooling we have in house.