> @mathieu-belanger_6065 said in Connection reset while downloading npm packages:
>
> I am curious, would there be an impact on performance when "piping" connectors together? For example, internal feed A has a connector to internal feed B, which has a connector to internal feed C, which has a connector to npmjs.org?

Connectors are accessed over HTTP. So assuming you have a chain like A --> B --> C --> npmjs.org (i.e. 3 different feeds and 3 different connectors), each request may yield 3 additional requests.

When your browser asks feed A for the package typescript@3.7.4, the following will happen:

1. If the package is cached or local, the file is streamed to the browser
2. Otherwise, each connector (just B, in this case) is queried over HTTP for typescript@3.7.4
3. The first connector that returns a response has its response body streamed to the browser

Each connector follows the same logic. When ProGet (via the request to feed A) asks feed B for that package:

1. If the package is cached or local, the file is streamed back
2. Otherwise, each connector (just C, in this case) is queried over HTTP for typescript@3.7.4
3. The first connector that returns a response has its response body streamed back

Continuing the pipe, when ProGet (via a request to feed B, via a request to feed A) asks feed C for that package:

1. If the package is cached or local, the file is streamed back
2. Otherwise, each connector (just npmjs.org, in this case) is queried over HTTP for typescript@3.7.4
3. The first connector that returns a response has its response body streamed back

This is why caching is important, but also why chaining may not be a good solution for heavily trafficked npm developer libraries like typescript. The npm client essentially performs a DoS by requesting hundreds of packages at once; the same is true of nuget.exe.
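To make the request amplification concrete, here is a minimal sketch of that resolution logic. The class and method names are illustrative only, not ProGet's actual implementation; it just models "serve from cache/local, else query each connector in turn and cache the first response":

```python
# Hypothetical model of chained feeds (names are illustrative, not ProGet's API).
class Feed:
    def __init__(self, name, connectors=()):
        self.name = name
        self.connectors = list(connectors)  # downstream feeds reached over HTTP
        self.cache = {}                     # package id -> cached/local content

    def request(self, package, counter):
        """Resolve a package: cache/local first, then each connector in order."""
        if package in self.cache:
            return self.cache[package]      # cached or local: stream immediately
        for connector in self.connectors:
            counter[0] += 1                 # one additional HTTP request per hop
            body = connector.request(package, counter)
            if body is not None:
                self.cache[package] = body  # cache so later requests stop here
                return body                 # first connector to respond wins
        return None

# Chain: A --> B --> C --> npmjs.org
npmjs = Feed("npmjs.org")
npmjs.cache["typescript@3.7.4"] = b"tarball-bytes"
feed_c = Feed("C", [npmjs])
feed_b = Feed("B", [feed_c])
feed_a = Feed("A", [feed_b])

hits = [0]
feed_a.request("typescript@3.7.4", hits)
print(hits[0])  # first fetch walks the whole chain: 3 additional requests

hits = [0]
feed_a.request("typescript@3.7.4", hits)
print(hits[0])  # now cached at A: 0 additional requests
```

The second lookup shows why caching matters: once feed A has cached the package, the whole chain behind it stays idle, even when a client requests hundreds of packages at once.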