    Inedo Community Forums


    dean-houston

    @dean-houston

    inedo-engineer

    10 Reputation
    175 Posts
    11 Profile views
    0 Followers
    0 Following


    Best posts made by dean-houston

    • RE: API expects null instead of 0 for integer values

      @atripp @joshuagilman_1054

      TIL that PowerShell can use internal CLR generic reference type names like that! But really, please don't do that...

      • ⛔ [System.Nullable``1[[System.Int32]]]
      • 👍 [Nullable[int]]

      ... much easier to read 🤠

      posted in Support
      dean-houston
    • RE: Proget: error Response Content-Length mismatch: too few bytes written

      You can ignore this error; for whatever reason, the NuGet client unexpectedly terminated the connection, and as a result ProGet stopped writing bytes. It's nothing to worry about.

      The Diagnostic Center isn't for proactive monitoring; it's more for diagnostic purposes. So unless users are reporting a problem, you don't need to check it.

      posted in Support
      dean-houston
    • RE: "Log scope Execution has already been completed" exception after OSCall

      Hi @jimbobmcgee ,

      Thanks for all the details; we plan to review/investigate this via OT-518 in an upcoming maintenance release, likely in the next few two-week cycles.

      -- Dean

      posted in Support
      dean-houston
    • RE: npm install slow on proxy feed

      Hi @andreas-unverdorben_1551 ,

      npmjs.org primarily serves static content from massive server farms in Microsoft's datacenters.

      Your ProGet server is much less powerful and does not serve static content. Not only is every request dynamic (authentication, authorization, vulnerability checking, license checking, etc.), but most requests (such as "what is the latest version of package X") need to be forwarded to npmjs.org and aggregated with local data.

      So, a much less powerful server doing a lot more processing is going to be a little slower ;)

      Running ProGet in a server cluster will certainly help.

      Cheers,
      Dean

      posted in Support
      dean-houston
    • RE: pgutil uploads error for large assets. HTTPS uploads show “operation cancelled” whilst HTTP is fine

      Hi @mmaharjan_0067 ,

      It sounds like you're on the right track with researching this; your reverse proxy is definitely "breaking things" somehow.

      Based on what you wrote, it sounds like your reverse proxy is terminating the request because there's no output from the server after a while. The "no output" is expected, since assembling the upload takes quite some time, and that's likely where the "operation cancelled" would be coming from.

      I would look there and see if you can adjust timeouts. As for pgutil, here's the code used to perform the multi-part upload:
      https://github.com/Inedo/pgutil/blob/thousand/Inedo.ProGet/AssetDirectories/AssetDirectoryClient.cs#L197
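      For illustration only, if the reverse proxy happens to be nginx, these are the timeout-related directives worth reviewing; the upstream name and values below are placeholders to adapt, not recommended settings:

```nginx
# illustrative nginx reverse-proxy settings for long, quiet uploads;
# "proget-server" and all values are placeholders
location / {
    proxy_pass http://proget-server;
    proxy_request_buffering off;  # stream large uploads instead of buffering to disk
    proxy_read_timeout  600s;     # max time with no response bytes from upstream
    proxy_send_timeout  600s;     # max time between writes to upstream
    client_max_body_size 0;       # don't cap asset upload size
}
```

      Other proxies (IIS ARR, HAProxy, Traefik) have equivalent idle/response timeout settings.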

      -- Dean

      posted in Support
      dean-houston
    • RE: Standards for Feed Setup with Connectors

      Hi @kichikawa_2913,

      We see multiple connectors pretty often, and it rarely presents a problem.

      The main downside is the overhead of aggregation; for some queries, like "list all package versions", each connector needs to be queried and the results aggregated. So it could cause performance issues for very high-traffic feeds - at least that's what we see on the support side of things.

      However, if you plan on using a package-approval workflow, then it won't be a problem, as your approved-npm feed wouldn't have any connectors.

      Hope that gives some insight,

      Dean

      posted in Support
      dean-houston
    • RE: Docs on Github, CONTRIBUTING.md

      @joel-shuman_8427 thanks for the heads up!

      I just updated it:
      https://github.com/Inedo/inedo-docs/blob/master/CONTRIBUTING.md

      posted in Support
      dean-houston
    • RE: Adding an ad-hoc deployment process to an existing application

      Hi mwatt_5816,

      BuildMaster does support "release-less" builds, though you may need to enable it under the application's Settings > Configure Build & Release Features > Set Release Usage to optional. That will allow you to create a build that's not associated with a release.

      It's also possible to do "ad-hoc" builds (i.e. builds with no pipeline), but we don't make it easy to do in the UI because it's almost always a mistake (once you already have pipelines configured). So in your case, I think you should create a secondary pipeline for this purpose.

      -- Dean

      posted in Support
      dean-houston
    • RE: Published timestamp resets after pulling remote npm packages

      Hi @d-kimmich_0782 ,

      This behavior is by design; the "publish date" in ProGet 2025 and earlier is whenever a package is added to a feed. This means that, even if a package was published to NuGet.org 3 years ago, the "publish date" will be whenever it was first cached.

      However, in ProGet 2025.14 and later, you can change this behavior under "Admin > Advanced Settings > Use Connector Publish Date". This will be the default behavior in ProGet 2026.

      This change supports a related set of rules we call Recently Published & Aged Rules, which are worth investigating:
      https://docs.inedo.com/docs/proget/sca/policies#recently-published-aged-rules-proget-2026-preview

      -- Dean

      posted in Support
      dean-houston
    • RE: User can't view [Usage & Statistics] for packages when 'Manage Feed' is scoped to Feed Group

      Hi @Nils-Nilsson,

      Thanks for the report; this was a trivial fix, and I just committed the change to PG-3138, which will be in the next maintenance release (Oct 24).

      As an FYI, if you uncheck "Restrict viewing download statistics to Feed Administrators" on the Feed Permissions page, then the error shouldn't occur.

      -- Dean

      posted in Support
      dean-houston

    Latest posts made by dean-houston

    • RE: ProGet: Is it possible to create universal virtual packages with sources from multiple feeds?

      Hi @m-lee_3921,

      When specifying a package source, the package must be in the same feed; however, you can specify it using a URL:

         {
          "virtualPath": "common/logo/logo.png",
          "source": {
             "url": "http://proget/endpoints/customer-assets/content/ast-logo.png"
          }
         }


      Good point on the documentation; download-vpack is for the .vpack file only (i.e. manifest). I'll update it

      -- Dean

      posted in Support
      dean-houston
    • RE: migrating from Octopus Deploy

      @uel_2013 (I deleted my previous reply since I learned a few more details from a team member who talked with you already)

      Given your team size, you'd definitely be better off upgrading (rethinking) your CI/CD processes when switching over to BuildMaster. As some users have told us, the Octopus Deploy way is like "trying to apply the SVN mindset in a Git world".

      The main benefit to a small team is that it's a simplification/consolidation of build- and deployment tools, while also giving you a powerful platform and process. We're working on "codifying" this in an upcoming guide called Lean Platforms: Engineering & Orchestration.

      You could likely get BuildMaster to work in a similar way (i.e. as a "deployment script runner"), but you'll be "fighting against the current" and missing out on nearly all of the benefits. For example, we have different ways of handling multi-tenancy (e.g. depending on whether you do quasi-custom software), and the Git and issue-tracking integration will make a huge difference in your internal processes.

      I'd suggest taking a quick tutorial of the software (you can freely download it) and seeing how far you can get with setting up a basic application from scratch. That should help you see the differences and how the concepts map. There are a lot of similar ideas, but like Git and SVN, there are differences that don't translate very well.

      -- Dean

      posted in Support
      dean-houston
    • RE: Debian feed broken after upgrade to 2025.14

      Hi @thomas_3037 and @felfert ,

      We heard from another user that creating a new signing key in the feed worked, so please try that.

      We didn't experience that in our testing, but I suppose it's not all that unexpected given the scope of the changes. We will update the documentation / blog post and also try to put some guidance in the software via PG-3157 to help other users who encounter an error like this.

      -- Dean

      posted in Support
      dean-houston
    • RE: Reporting and SCA

      Hi @rick-kramer_9238 ,

      It looks like you're using ProGet 2023? That functionality was relatively new in that version, and there may well be a bug in how the two are linked together.

      We've since made some big improvements to SCA/compliance, so I would recommend upgrading. Many of the changes were in ProGet 2024:
      https://docs.inedo.com/docs/proget-upgrade-2024#new-features-in-proget-2024

      -- Dean

      posted in Support
      dean-houston
    • RE: Debian feed broken after upgrade to 2025.14

      Hi @felfert ,

      There were major changes to Debian feeds in ProGet 2025.14:
      https://blog.inedo.com/inedo/proget-2025-14-major-updates-to-debian-feeds

      They were thoroughly tested, and we did not encounter issues when upgrading and switching back and forth between versions. As you might imagine, we don't have enough information to help you troubleshoot this - so we'd appreciate it if you could investigate further.

      It is most certainly a client-related configuration issue and you may need to reconfigure the keys, clear a cache, or something to that effect. Signing keys are typically not changed for already-configured repositories, so this isn't common client behavior.

      Please let us know what you find, so we can document appropriately.

      -- Dean

      posted in Support
      dean-houston
    • RE: Published timestamp resets after pulling remote npm packages

      Hi @d-kimmich_0782 ,

      This behavior is by design; the "publish date" in ProGet 2025 and earlier is whenever a package is added to a feed. This means that, even if a package was published to NuGet.org 3 years ago, the "publish date" will be whenever it was first cached.

      However, in ProGet 2025.14 and later, you can change this behavior under "Admin > Advanced Settings > Use Connector Publish Date". This will be the default behavior in ProGet 2026.

      This change supports a related set of rules we call Recently Published & Aged Rules, which are worth investigating:
      https://docs.inedo.com/docs/proget/sca/policies#recently-published-aged-rules-proget-2026-preview

      -- Dean

      posted in Support
      dean-houston
    • RE: Remote packages that isnt cached does not format correctly

      Hi @gisleso,

      > But I'm starting this topic to ask if its expected behavior that formatting of uncached remote packages is a litte off? See screenshots.

      Good catch! In this case, the ProGet implementation of the warehouse API (which connectors use) was using the "description" field instead of the "summary" field. It's an easy fix, and it'll be handled via PG-3151 in the next maintenance release.

      > And if anyone have experience with promoting packages including their dependencies I am open to suggestions. (Right now I'm thinking a script that first pull the packages, and the loop to pull/promote the depedencies)

      This is frequently requested, but it's simply not possible to automate. The only way to solve this is to restore from your unapproved feed, then promote the pulled dependencies into your approved feed.

      The reason is that this would require ProGet (or your script) to perform "dependency resolution", which is impossible without environmental context (i.e. the packages already installed, the operating system, and the client configuration).

      This is because dependencies are not only specified as ranges (e.g. hypothesis requires sortedcontainers<3.0.0,>=2.1.0), but they often include usage constraints (e.g. hypothesis specifies redis>=3.0.0; when extra=="redis" is specified) and environmental constraints (e.g. hypothesis specifies tzdata>=2025.2 but only when (sys_platform == "win32" or sys_platform == "emscripten") and extra == "zoneinfo").

      Only a package manager tool (e.g. pip) can perform dependency resolution, and even then it's not deterministic. This is why lock files are so important too.
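      To make the range/marker point concrete, here's a minimal sketch (not ProGet code) using the packaging library that pip itself builds on; the hypothesis dependencies mentioned above are the examples being checked:

```python
# Sketch of why dependency resolution needs environmental context,
# using the PyPA "packaging" library (third-party, but what pip builds on).
from packaging.specifiers import SpecifierSet
from packaging.markers import Marker

# Version ranges: many different versions can satisfy the same specifier.
spec = SpecifierSet("<3.0.0,>=2.1.0")   # e.g. hypothesis -> sortedcontainers
print(spec.contains("2.4.0"))  # True
print(spec.contains("3.0.0"))  # False

# Environment markers: the *same* dependency is or isn't required
# depending on the installing machine and requested extras.
marker = Marker('sys_platform == "win32" and extra == "zoneinfo"')
print(marker.evaluate({"sys_platform": "win32", "extra": "zoneinfo"}))  # True
print(marker.evaluate({"sys_platform": "linux", "extra": "zoneinfo"}))  # False
```

      A registry only sees the specifier strings; without knowing the target platform and extras, there is no single "correct" set of dependency files to promote.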

      -- Dean

      posted in Support
      dean-houston
    • RE: Incompatibility between Gitlab Package Registry and Proget Nuget Connector

      Hi @mayorovp_3701,

      Based on the message, it appears to be a bug/problem with the GitLab API: it's indicating that the required PackageBaseAddress endpoint is not present in the service index.

      Can you share the "service index" (i.e. the index.json) file? It should be present at the root of the API and look something like this: https://api.nuget.org/v3/index.json
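      The check itself is straightforward, since the service index is just JSON; here's a minimal sketch (the resource entries below are illustrative, not GitLab's actual index):

```python
import json

# Minimal illustration of a NuGet v3 service index and the resource lookup
# a client performs; the @id URLs here are made-up placeholders.
service_index = json.loads("""
{
  "version": "3.0.0",
  "resources": [
    { "@id": "https://example/v3/registration/",  "@type": "RegistrationsBaseUrl/3.6.0" },
    { "@id": "https://example/v3/flatcontainer/", "@type": "PackageBaseAddress/3.0.0" }
  ]
}
""")

# The connector needs a resource whose @type is PackageBaseAddress/3.0.0
# in order to download package content (.nupkg files).
has_base_address = any(
    r["@type"].startswith("PackageBaseAddress")
    for r in service_index["resources"]
)
print(has_base_address)  # True
```

      If that same check against GitLab's index.json comes up empty, the connector has no way to download packages from it.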

      -- Dean

      posted in Support
      dean-houston
    • RE: [ProGet] Questions about configuring and behavior of self-connectors

      Hi @koksime-yap_5909 ,

      Data deduplication is an operating-system-level function. On Windows, there's the Data Deduplication feature; on Linux there are more options, with ZFS deduplication being pretty popular.

      -- Dean

      posted in Support
      dean-houston